Following up on my query about how folks are making the most of their flow data, I’m also curious about what automated tasks, if any, people are performing in reaction to flow data. This question is rooted in the reading and writing I’ve been doing about software defined networking.
For example, let’s say a monitoring station is receiving flow data from a 10Mbps WAN link we use for both voice and data. In a statically defined network (i.e. the way we do things today), we’ve probably built a QoS policy for that link based on expected peak call volume, but the key point is that the interface definition is static. We make our best guess at how to provision the interface based on historical traffic mix and expected future trends.
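To make the static approach concrete, here's a minimal back-of-the-napkin sketch of how that priority queue might get sized. The numbers are my own assumptions, not from any particular deployment: G.711 at roughly 80 kbps per call once IP/UDP/RTP overhead is included, and a guessed peak of 25 concurrent calls.

```python
# Rough static LLQ sizing for the hypothetical 10Mbps link.
# Assumed numbers: G.711 at ~80 kbps per call including IP/UDP/RTP
# overhead, and an expected peak of 25 concurrent calls.

LINK_KBPS = 10_000        # the 10Mbps WAN link
PER_CALL_KBPS = 80        # approx. G.711 + header overhead
EXPECTED_PEAK_CALLS = 25  # best guess from call history

llq_kbps = EXPECTED_PEAK_CALLS * PER_CALL_KBPS
llq_percent = 100 * llq_kbps / LINK_KBPS

print(f"Priority queue: {llq_kbps} kbps ({llq_percent:.0f}% of link)")
# Whatever actually happens on the wire, the queue stays this size
# until a human comes back and re-provisions the interface.
```

The point is that the 20% carve-out is baked in at design time; if real call volume runs at half that, the reserved bandwidth just sits there.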
But what if the QoS policy flexed based on the current demand the flow data is telling us about? The idea is that the flow data gets exported to a controller that can reprovision the QoS policy of the WAN interface on the fly. More calls coming through? Beef up the low-latency queue. Fewer calls? Re-allocate queues to other flows. I recognize that there are products that can do this, although I think of them more as service provider tools than enterprise tools for the most part.
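The controller idea above can be sketched in a few lines. Everything here is a hypothetical illustration of the logic, not a real product API: the per-call bandwidth, the 33% cap on the priority queue, and the `on_flow_export` hook are all my assumptions.

```python
# Sketch of the flow-aware controller idea: recompute the low-latency
# queue from the call count the flow data reports. All names and
# numbers are hypothetical, not a real controller API.

PER_CALL_KBPS = 80       # assumed G.711 bandwidth per call, incl. overhead
LINK_KBPS = 10_000       # the 10Mbps WAN link from the example
MAX_LLQ_FRACTION = 0.33  # cap the priority queue so voice can't starve data

def recompute_llq(active_calls: int) -> int:
    """Return the new low-latency queue size in kbps for the observed load."""
    wanted = active_calls * PER_CALL_KBPS
    cap = int(LINK_KBPS * MAX_LLQ_FRACTION)
    return min(wanted, cap)

def on_flow_export(active_calls: int) -> None:
    # In a real deployment this step would push config to the WAN
    # router (NETCONF, a controller's southbound API, etc.);
    # here it just reports the decision.
    llq = recompute_llq(active_calls)
    print(f"{active_calls} calls -> LLQ {llq} kbps, "
          f"{LINK_KBPS - llq} kbps left for everything else")

# More calls? Beef up the queue. Fewer? Give bandwidth back.
on_flow_export(10)   # light load
on_flow_export(50)   # heavy load, hits the 33% cap
```

Even this toy version hints at the troubleshooting question: the queue a packet hits now depends on what the flow exporter saw a moment ago, not on anything written down in the config.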
The question, then, is this: does running a network this way, where real-time flow data could change the overall traffic policy of a network, sound like the best thing ever, or more like a miserable pain in the backside you’d get sick of troubleshooting? I know networking vendors want to take our networks to this software defined panacea in a thousand different ways, so it’s an interesting point to discuss.