When you hear someone refer to the speed of an Internet connection, it is typically measured in megabits per second (Mbps). In reality, the number of megabits you can move per second is only a small contributor to the perceived “speed” of a connection and to the remote user experience.
When it comes to the perception of speed, latency, the time it takes for a packet of data to get from point A to point B, can be either your best friend or your worst enemy, and it can make or break any number of cloud-based solutions.
In a cloud-based solution of any kind, how much data you can move each second matters less than how quickly each piece of it arrives.
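One back-of-the-envelope way to see why (the 64KB window and the round-trip times below are illustrative assumptions, not measurements from any particular circuit): a single TCP connection can move at most one receive window of data per round trip, so latency alone puts a ceiling on usable throughput no matter how large the pipe is.

```python
# Throughput ceiling of a single TCP connection: one receive window per round trip.
# Window size and RTTs are illustrative assumptions.
WINDOW_BYTES = 64 * 1024  # classic 64KB receive window, no window scaling

for rtt_ms in (1, 20, 80):
    ceiling_mbps = (WINDOW_BYTES * 8) / (rtt_ms / 1000) / 1_000_000
    print(f"RTT {rtt_ms:>2} ms -> at most ~{ceiling_mbps:.1f} Mbps on this connection")

# RTT  1 ms -> at most ~524.3 Mbps on this connection
# RTT 20 ms -> at most ~26.2 Mbps on this connection
# RTT 80 ms -> at most ~6.6 Mbps on this connection
```

At 80ms of round-trip latency, that one connection cannot even fill a 45Mbps pipe, let alone a bigger one.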
Think about the user experience, or more importantly, where the user’s experience takes place. Let’s use a basic example of a virtual desktop in a co-located hosting facility. The file, email, application and database servers commonly accessed by that virtual desktop are local to the host on which that virtual instance resides. The datacenter fabric connecting those resources inside that environment provides 10+Gbps connectivity with less than 1ms of latency, giving near-instantaneous access. The only data traversing your connection to that facility is interaction and display data, consuming on average a modest 128Kbps, give or take 100Kbps.
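A rough sketch of what the remote user actually feels in that scenario (the payload size, link speeds and latencies below are illustrative assumptions): because the interaction and display traffic is so small, the time to push it through the pipe is negligible and the round trip dominates.

```python
# Perceived delay for one interactive update (e.g., a keypress echoing back):
#   round-trip latency + time to serialize the small display payload.
# All figures are illustrative assumptions.
PAYLOAD_BYTES = 8 * 1024  # an 8KB screen update

def perceived_delay_ms(bandwidth_mbps: float, rtt_ms: float) -> float:
    serialization_ms = PAYLOAD_BYTES * 8 / (bandwidth_mbps * 1_000_000) * 1000
    return rtt_ms + serialization_ms

print(perceived_delay_ms(bandwidth_mbps=1.5, rtt_ms=10))   # ~53.7 ms
print(perceived_delay_ms(bandwidth_mbps=45.0, rtt_ms=80))  # ~81.5 ms
```

In this sketch the low-bandwidth, low-latency link feels snappier despite having a fraction of the capacity, which is exactly the shift described next.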
The focus quickly shifts from bits per second to milliseconds of latency, and that shift illustrates why a low-bandwidth, low-latency, circuit-based connection with a dedicated access rate can, in most cases, provide a better user experience than a high-bandwidth shared connection.
“So what’s the difference? I mean, a 45Mbps business-class broadband connection could cost $100 to $300 per month while a 45Mbps DS3 could cost $6,000 to $10,000 per month. 45Mbps is 45Mbps, right?” No. Latency is the difference. It’s dedicated access versus shared access.
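Chatty, interactive workloads make the gap obvious. As an illustration (the round-trip count, latencies and payload size are assumptions, not measured values for any specific circuit), consider an operation that requires many small request/response exchanges plus a modest amount of data:

```python
# Same 45Mbps of bandwidth, very different experience: a chatty operation
# (many request/response round trips) is dominated by latency, not capacity.
# Round-trip count, latencies and payload size are illustrative assumptions.
BANDWIDTH_MBPS = 45.0
ROUND_TRIPS = 50          # e.g., opening a document over a chatty protocol
PAYLOAD_MB = 2.0          # total data actually moved

def operation_seconds(rtt_ms: float) -> float:
    latency_cost = ROUND_TRIPS * rtt_ms / 1000
    transfer_cost = PAYLOAD_MB * 8 / BANDWIDTH_MBPS
    return latency_cost + transfer_cost

print(operation_seconds(rtt_ms=10))   # dedicated, low latency: ~0.86 s
print(operation_seconds(rtt_ms=100))  # shared, congested:      ~5.36 s
```

The transfer portion is identical on both circuits; the difference comes entirely from the round trips.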
Given that oversubscription ratios of 100:1 are not uncommon among business-class broadband ISPs, and ratios of up to 500:1 exist in residential service, it stands to reason that a guaranteed 2:1 or better subscription ratio is something you should expect to pay for.
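The arithmetic behind that guarantee is simple: in the worst case, when every subscriber sharing the capacity transmits at once, your share of an advertised rate shrinks by the oversubscription ratio. Real contention varies by time of day and usage, so the figures below are a floor, not a forecast.

```python
# Worst-case share of an advertised 45Mbps rate if every subscriber on the
# shared capacity transmits at once. Ratios mirror the ones discussed above;
# actual contention is usually better, but the ratio sets the floor.
ADVERTISED_MBPS = 45.0

for label, ratio in (("residential broadband", 500),
                     ("business-class broadband", 100),
                     ("dedicated-style circuit", 2)):
    worst_case = ADVERTISED_MBPS / ratio
    print(f"{label:>25} ({ratio:>3}:1): {worst_case:.2f} Mbps worst case")
```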
There’s good news though. Just as pulling down large amounts of data drove the availability of high-bandwidth shared Internet circuits, the trend of remotely accessing hosted solutions will likely drive demand for low-latency offerings, and, as usual, competition will bring more options and better pricing.
Until then, it’s a good idea to understand how to get the most out of what you have and how to improve performance without simply adding bandwidth. Explore WAN optimization, de-duplication and link management solutions. Talk to the experts about ways to rein in the exponential growth in bandwidth consumption while improving the most important thing of all: the user experience.
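To give a flavor of one of those techniques, here is a minimal sketch of block-level de-duplication, assuming fixed-size chunks fingerprinted with SHA-256. Real WAN optimizers use variable-size chunking, persistent caches on both ends of the link and compression on top of this; the point is only that data the far end has already seen does not need to cross the wire again.

```python
import hashlib

# Minimal de-duplication sketch: split a stream into fixed-size chunks and
# "send" each unique chunk only once; repeats are replaced by a short hash.
CHUNK_SIZE = 4096

def deduplicated_wire_bytes(data: bytes) -> int:
    seen = set()       # fingerprints of chunks already sent
    wire_bytes = 0
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fingerprint = hashlib.sha256(chunk).digest()
        if fingerprint in seen:
            wire_bytes += len(fingerprint)   # send the 32-byte reference only
        else:
            seen.add(fingerprint)
            wire_bytes += len(chunk)         # send the full chunk once
    return wire_bytes

payload = b"A" * 4096 * 100 + b"B" * 4096 * 100   # highly repetitive data
print(deduplicated_wire_bytes(payload), "bytes on the wire instead of", len(payload))
```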