
Intelligence in the network

Networking equipment manufacturers are introducing more intelligence into the devices that connect applications, computers and people together, but what does this actually mean for businesses?
Andy Brauer
By Andy Brauer, Chief Technology Officer at Business Connexion
Johannesburg, 30 Aug 2005

With the announcement of strategies by the makers of networking equipment to introduce more intelligence into the devices that connect applications, computers and people together, one needs to take a step back to look at what this actually means, both for the network and for the businesses using them.

There are two scenarios for putting intelligence into the network: the first is to make the application aware of the network, and the second is to make the network aware of the application.

Essentially, the concept is to give the application that relies on connectivity the ability to examine network conditions and respond accordingly. Hence, if the application detects congestion, for example, it can take action to address the problem. Such action could include the introduction of compression or an adjustment of the quality of service mechanism.
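As a minimal sketch of that idea, the snippet below shows an application that compresses its payload only when a measured round-trip time suggests congestion. The threshold and compression level are invented for illustration; a real application would derive them from actual network measurement.

```python
import zlib

# Hypothetical thresholds -- real values would come from measurement.
CONGESTION_RTT_MS = 200      # round-trip time above which we assume congestion
COMPRESSION_LEVEL = 6        # zlib level used when the link looks congested

def prepare_payload(data: bytes, measured_rtt_ms: float) -> tuple[bytes, bool]:
    """Return (payload, was_compressed) -- compress only when the network
    appears congested, trading CPU time for bandwidth."""
    if measured_rtt_ms > CONGESTION_RTT_MS:
        return zlib.compress(data, COMPRESSION_LEVEL), True
    return data, False

# On a fast link the data passes through untouched; on a congested one
# it is compressed before sending.
payload, compressed = prepare_payload(b"x" * 1000, measured_rtt_ms=350)
```

The same pattern extends to the quality-of-service case: instead of toggling compression, the application would re-mark or re-prioritise its traffic when conditions degrade.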

If the application is aware of the network and knows how to drive it, it has to be programmed to identify various scenarios and take action. The ability to do this draws to some extent on the complexity and probability theories of the Russian mathematician Andrey Kolmogorov. Initially applied in fields such as genetics, these ideas have evolved into the so-called "neural networks" that can identify traffic types and network conditions.
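To make the traffic-identification idea concrete, here is a deliberately tiny sketch: not a real neural network, just a nearest-centroid classifier that maps two flow features (mean packet size and packet rate) to a traffic class. The feature values and class names are invented for illustration.

```python
# Toy nearest-centroid classifier over invented flow features.
# A production traffic classifier would learn these from labelled data.
CENTROIDS = {
    "interactive": (80.0, 50.0),     # small packets, moderate rate
    "bulk":        (1400.0, 800.0),  # large packets, high rate
}

def classify(mean_pkt_bytes: float, pkts_per_sec: float) -> str:
    """Return the class whose centroid is closest to the observed flow."""
    def sq_dist(c):
        dx = mean_pkt_bytes - c[0]
        dy = pkts_per_sec - c[1]
        return dx * dx + dy * dy
    return min(CENTROIDS, key=lambda name: sq_dist(CENTROIDS[name]))

print(classify(90, 40))     # an SSH-like flow -> "interactive"
print(classify(1300, 700))  # a file transfer  -> "bulk"
```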

As implied, for the intelligence in such networks to function, the way programmers develop applications will have to change, enabling a far closer relationship between the software and the network that connects it.

The second method of creating an intelligent network is to make the network itself intelligent through embedded software. This approach looks at the architecture of the data centre and examines methods of getting the various components to communicate more effectively with one another. It comes in three flavours: Stream Control Transmission Protocol (SCTP), Datagram Congestion Control Protocol (DCCP) and Application Aware Network Protocol (AONP).


These methods of data transfer differ from vanilla TCP/IP in that they introduce the level of intelligence necessary to adjust data transfer according to changing conditions. AONP is a proprietary Cisco protocol, which seems to indicate that open standards do not always provide the industry's solution.
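The core adaptation idea these congestion-aware protocols build on can be sketched as additive-increase/multiplicative-decrease (AIMD): grow the sending window steadily while the path is clean, and halve it when loss signals congestion. The numbers below are illustrative only.

```python
# AIMD sketch: the classic congestion-adaptation rule.
def aimd(window: float, loss_seen: bool) -> float:
    """One adjustment step of the sending window."""
    if loss_seen:
        return max(1.0, window / 2)  # back off sharply on loss
    return window + 1.0              # otherwise probe for more bandwidth

w = 10.0
for loss in [False, False, True, False]:
    w = aimd(w, loss)
# 10 -> 11 -> 12 -> 6 -> 7
```

Protocols like DCCP expose this kind of congestion control to datagram traffic, while SCTP adds it alongside multi-streaming and multi-homing.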

Without going into the technicalities, let's look at what these developments mean for business. For the typical network-accessed applications that characterise the modern business, it means far greater reliability and predictable performance. Application-aware networks can also intelligently identify rogue traffic and throttle it back, possibly reducing virus and other attacks, and also posing a more serious barrier to hackers. Within the data centre, where resources are connected by the network, it means much improved resource utilisation, taking advantage of more memory, CPU capability and storage resources, delaying the need for upgrades.
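The throttling mentioned above is commonly built on a token bucket: a flow flagged as rogue is limited to a trickle while ordinary traffic passes. The rates below are invented for the sketch.

```python
# Minimal token-bucket rate limiter: a sketch of how flagged traffic
# might be throttled back. Rates and sizes are illustrative only.
class TokenBucket:
    def __init__(self, rate: float, burst: float):
        self.rate = rate      # tokens replenished per second
        self.burst = burst    # maximum bucket size
        self.tokens = burst   # start with a full burst allowance

    def allow(self, elapsed_sec: float, size: float) -> bool:
        """Refill for elapsed time, then admit the packet if tokens remain."""
        self.tokens = min(self.burst, self.tokens + self.rate * elapsed_sec)
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False

rogue = TokenBucket(rate=1.0, burst=2.0)  # ~1 unit/sec for suspect flows
print(rogue.allow(0.0, 1.0))  # True  -- within the burst allowance
print(rogue.allow(0.0, 1.0))  # True
print(rogue.allow(0.0, 1.0))  # False -- bucket drained, traffic throttled
```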

The benefits are significant. However, those looking to take advantage of application-aware networks do have to temper their enthusiasm: adoption depends on a willingness to spend large amounts of money on such networks, and right now they remain the domain of academics and technical specialists. They should be ready for prime time in perhaps 18 months.
