Beating Bandwidth Bottlenecks At The Network Edge

The ability to communicate data over low-bandwidth connections can be a game-changing capability for organisations with a distributed workforce. Access to the latest information can have a critical impact on remote users’ ability to perform their roles, and those located in areas with limited bandwidth must be supported by tightly integrated information systems that ensure uninterrupted, real-time access to operational data.

Global web portals are a common means of providing access to corporate or mission-critical data. Often deployed across distributed networks, they give firms a single, consistent view of information and enable them to store, search, categorise and archive critical information assets. The chemical company Celanese, for example, deploys SharePoint as its main strategic document management and business collaboration platform, using it to give its sales people 24/7 access to up-to-date content on their laptops, wherever they are in the world.

While web portals put information at an operative’s fingertips wherever they are located, they are typically designed to operate over a local area network (LAN) and often struggle when deployed over an extended wide area network (WAN). Accessing any web-based application over an extended WAN can degrade performance and the user experience of remote workers – and the more remote the location, the more constrained bandwidth resources are likely to be.

The dangers of workarounds

Remote and mobile users can quickly become frustrated with the performance and availability of enterprise web applications over limited-bandwidth connections, or where localised issues such as latency and periodic disconnection occur. They either stop using the applications altogether or, once they have accessed information from the central source, copy it manually to local servers or hard drives. This creates multiple problems.

If remote users have to employ workarounds to overcome poor performance on the WAN, the return on investment in web applications is severely diminished by the inefficiency with which information is accessed. Furthermore, any changes or updates to information held locally are not reflected in the central database – the master database or ‘single source of truth’.

This can have disastrous consequences, given that web applications are often used to keep employees up to date with the ever-evolving regulatory frameworks that corporations must comply with in order to manage risk. The maritime industry, for example, has to comply with rigorous safety standards such as the International Safety Management (ISM) Code, which undergoes constant review and revision. It is of paramount importance that key personnel on ships that are at sea for months at a time are able to access the latest versions of these regulations.

Potentially even more serious is that the workarounds employed to copy confidential enterprise data locally can threaten the security and integrity of the data itself. An extreme example is the military, which needs to communicate highly sensitive information to personnel in the field but must at all costs prevent that data from being accessed or saved on a device that could fall into enemy hands.

Satellite networks are commonly used by organisations whose employees work in extremely remote locations such as oil rigs or battlefields. These networks provide a unique opportunity for linking globally distributed assets, but their limited bandwidth, high latency and intermittent availability make them a highly restrictive and often costly information channel.

At the same time, the sheer volume of application data that needs to be replicated can easily consume all the available satellite bandwidth. And should poor weather conditions such as heavy rainfall bring a satellite connection down, employees still need a way to work offline until the connection is restored.

The challenge for companies is therefore to reconcile the need for universally available and globally consistent information with the fact that many users of that information operate at the very edge of the network where available bandwidth and network coverage can be inconsistent at best.

Super-charging network connectivity

There are four broad options available to organisations facing the connectivity conundrum at the network edge. The first is network acceleration. Most accelerator devices are installed as hardware appliances at each end of a network link and, as the name suggests, speed up communication between those two points on the network.

Generally speaking, these devices intelligently cache repeated network calls issued by client machines, effectively reducing the amount of data that has to be sent over the network and speeding up traffic by a factor of between six and ten.
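
As a rough illustration of that caching principle – a minimal sketch rather than any vendor’s actual implementation, with invented names such as AcceleratorCache and fetch_over_wan – the following shows how repeated requests can be served from a local store instead of re-crossing the WAN:

```python
import hashlib
import time

class AcceleratorCache:
    """Hypothetical sketch of the caching idea behind WAN accelerators."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # request fingerprint -> (timestamp, response bytes)

    def _key(self, request: bytes) -> str:
        # Fingerprint the request so identical calls map to the same entry.
        return hashlib.sha256(request).hexdigest()

    def get_or_fetch(self, request: bytes, fetch_over_wan):
        key = self._key(request)
        if key in self.store:
            ts, response = self.store[key]
            if time.time() - ts < self.ttl:
                return response  # served locally: no WAN round trip
        response = fetch_over_wan(request)  # slow path over the WAN
        self.store[key] = (time.time(), response)
        return response
```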

The second option is compression. By reducing the quantity of data that needs to be sent over the network, compression techniques reduce the amount of bandwidth required and consequently the cost of delivery. Various compression tools can shrink the data footprint of any updates sent over the network so that better use is made of the available capacity.
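
As a simple, hedged example – using Python’s standard zlib library as a stand-in for whichever compression tool an organisation chooses – an update can be compressed before transmission and restored on receipt:

```python
import zlib

def compress_update(payload: bytes, level: int = 9) -> bytes:
    """Shrink an update before it crosses the constrained link."""
    return zlib.compress(payload, level)

def decompress_update(blob: bytes) -> bytes:
    """Restore the original payload at the receiving end."""
    return zlib.decompress(blob)

# Repetitive content (common in documents and web pages) compresses well.
update = b"status: compliant\n" * 1000
wire_bytes = compress_update(update)
assert decompress_update(wire_bytes) == update
print(f"{len(update)} bytes reduced to {len(wire_bytes)} on the wire")
```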

Thirdly, content distribution enables firms to proactively deploy key data closer to the end user and thus reduce reliance on external network connections. When a remote worker needs that information, it is available in a local store and does not require reaching back to shore over a fragile or costly satellite connection.
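
A minimal sketch of this pattern – with hypothetical names such as prefetch and read_document, and an illustrative store location – pushes content to the edge while connectivity is available, so that reads never touch the satellite link:

```python
import pathlib

LOCAL_STORE = pathlib.Path("/var/cache/edge-content")  # illustrative path

def prefetch(doc_id: str, fetch_from_hq) -> None:
    """Run while connectivity is available: pull content down to the edge."""
    LOCAL_STORE.mkdir(parents=True, exist_ok=True)
    (LOCAL_STORE / doc_id).write_bytes(fetch_from_hq(doc_id))

def read_document(doc_id: str) -> bytes:
    """Serve from the local store; no WAN call at read time."""
    return (LOCAL_STORE / doc_id).read_bytes()
```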

Finally, least-cost routing enables companies to actively switch between providers of bandwidth, and is a smart way to reduce bandwidth costs. In a typical scenario, communications will switch from satellite-based delivery when operating remotely and out of range of the network, to more cost-effective VHF (radio) delivery when in range.
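
The routing decision itself can be as simple as picking the cheapest link that is currently reachable, as in this hedged sketch (link names and per-megabyte costs are illustrative only):

```python
# Illustrative link table; sorted by cost at selection time.
LINKS = [
    {"name": "satellite", "cost_per_mb": 5.00},  # always-on fallback
    {"name": "cellular",  "cost_per_mb": 0.50},
    {"name": "vhf",       "cost_per_mb": 0.05},  # cheapest, short range
]

def choose_link(is_reachable) -> str:
    """Return the cheapest reachable link; is_reachable(name) probes each one."""
    for link in sorted(LINKS, key=lambda l: l["cost_per_mb"]):
        if is_reachable(link["name"]):
            return link["name"]
    raise ConnectionError("no links available; queue traffic for later")
```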

The most practical approach is to adopt a hybrid of all four solutions, whereby essential content is automatically distributed over an accelerated network, giving end users guaranteed LAN-speed access to data that originated over the corporate WAN. However, hybrids must be designed according to the needs of the organisation, the network topology and the types of devices being served at the network edge.

For example, server-to-server replication technology combined with compression allows incremental updates to be passed between a master and a replica server over connections as slow as 1-100kbps, enabling organisations to ensure 24/7 business continuity. Meanwhile, server-to-virtual-server solutions create a virtual copy of content on remote servers or devices and can support lightweight read-only portals for remote offices.
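
To make the replication idea concrete – again a hedged sketch, not any particular product’s implementation – only the records changed since the last sync need to cross the link, and the delta itself can be compressed to suit a 1-100kbps connection:

```python
import json
import zlib

def build_delta(master: dict, last_synced: dict) -> bytes:
    """Collect records added or modified since the last sync, compressed."""
    changed = {k: v for k, v in master.items() if last_synced.get(k) != v}
    return zlib.compress(json.dumps(changed).encode("utf-8"))

def apply_delta(replica: dict, delta: bytes) -> None:
    """Merge the changed records into the replica's copy."""
    replica.update(json.loads(zlib.decompress(delta).decode("utf-8")))

master = {"doc1": "rev 2", "doc2": "rev 1"}
replica = {"doc1": "rev 1", "doc2": "rev 1"}
delta = build_delta(master, last_synced=replica)
apply_delta(replica, delta)
assert replica == master  # replica now matches the master copy
```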

Similarly, server-to-laptop solutions can ensure that mobile and field workers who depend on laptop connectivity have exactly the same experience as they would at head office, by creating a virtualised copy of the master server application.

In all of these scenarios, web page links still work, documents can be updated, and database search and access remain available despite low-bandwidth connectivity – even if the user is offline. Crucially, solutions combining web virtualisation with innovative forms of compression technology are extremely cost-effective. This matters given both the high cost of acquiring capacity on specialist networks powered by satellite and VHF, and the potential for security and compliance breaches should employees resort to workarounds for poor performance and degradation on the WAN.

And with the wide choice of connectivity solutions now available, organisations have many more ways of delivering business- or mission-critical information to the network edge – but with the type of user experience expected within the corporate LAN.

Lawrence Poynter is Product Director at iOra, with responsibility for product management and quality. He obtained a BSc in Statistics and an MSc in Intelligent Systems from Plymouth University, and his career has spanned a variety of roles in a number of small venture-funded companies. Before working at iOra he was Director of Product Management at BeFree, based in Massachusetts; prior to BeFree he worked for TriVida in California, where he was responsible for product management and development. Before working in America, Lawrence worked primarily in consulting and project management roles throughout Europe, based in the UK.