Windows 8 And The Great Bandwidth Backup Issue

The arrival of Windows 8 will be an exciting and challenging time for IT managers. Soon Windows 8 tablets will join the mix of laptops and desktops holding corporate information that requires backing up. However, the IT manager is unlikely to have any additional bandwidth for their users.

Now that Windows 8 is available, many IT managers will be asking themselves if they are ready for it. What they also need to ask is whether the network and backup systems are ready for it too. With Windows 8 available on laptops, PCs, tablets and smartphones, there will be many more devices connecting to the network, and you will need to accommodate them while still running an effective network.

These extra devices will also need to back up over the network, whose capacity is finite, and they will consume bandwidth both in everyday use and while being backed up. So what do you do with these extra devices, especially if they are being used in a remote office where the connection is slow?

This is one of the major headaches for companies with satellite offices all over the world. These offices produce and access important data that has to be backed up, but over a slow connection. So instead of everyone in that office fighting to push their backup to a centralised vault somewhere else in the world, eating into already crowded bandwidth, why not back up to a local solution that connects back to the main vault when the network is quiet at night or over the weekend?

This is what some new backup services now provide. Using locally placed NAS devices installed on the office LAN, staff back up at network speed without consuming huge amounts of WAN bandwidth. Devices running Windows Server or Windows Storage Server start at as little as £400. All the local staff back up to the NAS, which holds all the backups for that office.

The network manager has control over when the connections out to the main vault happen – for instance when the network is quieter. Instead of lots of different connections happening haphazardly, you can optimise for the speed and capacity of the specific network.

The initial backup has the largest footprint, but you can schedule this single consolidated backup to upload to the main vault only when the office is quietest – at the weekend, perhaps. This sort of solution is ideal where bandwidth is expensive, and it has been thoroughly proven in less developed parts of the world where connections run at speeds akin to old 14.4k modems.
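As a rough illustration of the scheduling idea, the off-peak check behind such a policy can be sketched in a few lines of Python – the window boundaries and function name here are hypothetical, not taken from any particular backup product:

```python
from datetime import datetime, time

# Hypothetical off-peak window; real products would make this configurable.
OFF_PEAK_START = time(22, 0)   # 10 pm
OFF_PEAK_END = time(6, 0)      # 6 am

def in_off_peak_window(now=None):
    """Return True overnight on weekdays, or at any time over the weekend."""
    now = now or datetime.now()
    if now.weekday() >= 5:      # Saturday (5) or Sunday (6)
        return True
    t = now.time()
    # The window wraps past midnight, so it is "after start OR before end".
    return t >= OFF_PEAK_START or t <= OFF_PEAK_END
```

The replication job simply polls this check and only opens a connection to the vault when it returns True, which is how the "haphazard" daytime uploads are kept off the busy link.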

Another benefit of this approach is that these tools often come with deduplication features, so every bit and byte sent over your precious bandwidth is unique and has not already been sent by another device in an earlier backup.

The system checks every block of data before sending it, to ensure it does not already exist in the vault. Many backup solutions swamp the network by sending the same document or PowerPoint off-site from every user who has a copy. The first backup of a system is always the largest, but subsequent backups run faster and more efficiently because only changed blocks are sent.
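A minimal sketch of that block-level check, assuming SHA-256 digests as block fingerprints and an in-memory set standing in for the vault's hash index (real products use persistent indexes and more sophisticated chunking):

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks; real products choose their own granularity

def blocks_to_send(path, vault_hashes):
    """Yield only the (digest, block) pairs the vault has not seen before.

    vault_hashes is a set of hex digests standing in for the vault's index.
    """
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            if digest not in vault_hashes:   # duplicate blocks are skipped
                vault_hashes.add(digest)
                yield digest, block
```

A second laptop backing up an identical copy of the same PowerPoint would yield nothing new, because every one of its blocks already has a digest in the index – which is exactly why only the first copy ever crosses the WAN.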

The flipside to backup is the time it takes to restore. By keeping the backups on the local NAS, systems can be restored quickly, rather than pulled from a centralised vault that could be thousands of miles away. This dramatically improves recovery times for users and saves bandwidth too. For example, if someone loses their laptop en route from one office to another, a replacement laptop can be restored with that person's data quickly and easily at LAN speed.

In addition, some services have a feature that lets users back up even when they are out of the office and have no internet connection. The software runs in the background, and when a connection becomes available it starts to synchronise the backup to the vault.

Device theft or loss, and the consequent loss of data, is probably one of the biggest headaches for organisations. With the EU looking at fines even larger than those the Information Commissioner's Office (ICO) can currently impose (the maximum penalty being £500,000), companies must make sure they minimise the risk of a data breach.

With this in mind, you need to make sure that the backups are not only encrypted, but that the software provides a detailed audit of what was on the device at the time of loss. Many companies don't know precisely what data was lost and often admit to losing more than they did, simply because the user wasn't sure what data was involved.

This may seem a little irrelevant given that the data is encrypted on the device and throughout the backup process, and is therefore relatively safe. However, knowing what data was lost lets you examine your processes and, potentially, reduce future risk.

The software can also be used to block ports, giving you control over the ways in which data leaves your company. Added to this, many companies have found that as few as 25% of devices actually use encryption, because end-users disable it in the belief that it slows down their productivity – this system allows centralised management of the entire encryption process.

Backing up is an essential part of everyday life, but you don't want it to slow your business down when staff are already working over slow network connections. Over the years the cost of storage has fallen and fallen, but the cost of bandwidth hasn't – and it is unlikely to, as businesses and individuals demand more and faster networks.

Companies need a way to back up data without affecting productivity, while remaining secure and able to call on those backups when data needs to be restored. Couple this with the influx of devices running new software such as Windows 8 and your problems get a lot more complicated – unless you keep a local cache that holds the data close before copying it over to a centralised backup vault.

So the next time you think about your backup, think about your network and the impact all those backups have on it. If your network is slow when people are working on it, how much slower is it when they are all backing up over it at the same time?


Phil Evans is VP at Datacastle. He has over 20 years of combined sales and business development experience in both North America and Europe. Phil was responsible for setting up the EMEA sales operations for EVault prior to Seagate's acquisition and the creation of i365 and held numerous positions at i365 including Director of Business Development (EMEA) and as VP of Sales for Northern Europe. Phil also served as a Director at a UK storage management company and established the European operations of Professional Services in EMEA for Legato Systems.