With numerous cloud storage applications, VPNs, and remote servers all referring to latency, it’s important to identify just what latency is.
In the simplest terms, latency refers to the delay between initiating an action on a web application or server and that action actually taking place. At its core, it’s the total time it takes for data to travel, usually measured in milliseconds. While latency can be reduced, it can never be instantaneous because the data must travel a physical distance.
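As a rough sketch of how this delay is observed in practice, the snippet below times a single round trip and reports it in milliseconds; the `time.sleep` call is a stand-in for real network travel time, not an actual request:

```python
import time

def measure_latency_ms(action):
    """Time one request/response round trip and return the delay in milliseconds."""
    start = time.perf_counter()
    action()  # in real use, this would send a request and wait for the reply
    return (time.perf_counter() - start) * 1000

# Simulated action with a 50 ms delay standing in for data traveling a physical distance
latency_ms = measure_latency_ms(lambda: time.sleep(0.05))
print(f"round trip took {latency_ms:.1f} ms")
```

Tools like `ping` do essentially the same thing: they timestamp a packet on the way out and measure how long the echo takes to come back.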
If you’ve ever uploaded files to a cloud storage application like OneDrive, Google Drive, or Dropbox, the time it takes for the data to become available in the cloud is your latency. Large files can take several minutes or more depending on their size. That might not seem too long for a single upload, but if a team of remote users is collaborating on a large file, high latency means a lot of time will be spent waiting for file uploads and downloads.
VPNs are a common way for businesses to secure connections to private networks for remote users. With a single sign-on, a user can access their in-office desktop or on-premises server. In terms of latency, the most obvious variable is the physical distance between the remote user and the server they’re connecting to. Transferring data from Australia to Canada, for example, takes quite a bit longer than from New York to Boston. While decreasing the distance to the server is not always an option, there are other ways to reduce latency.
As mentioned above, latency can never be instantaneous, and each additional node in a VPN path adds to it. For example, if a remote user connects to their in-office desktop and then accesses the on-premises server from that desktop, latency increases at each step. Removing unnecessary steps between the user and the data is a surefire way to lower latency.
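The additive effect of each hop can be sketched with some hypothetical per-hop delays (the millisecond figures below are illustrative assumptions, not measurements):

```python
# Hypothetical one-way delays in milliseconds for each hop in the path
via_desktop = [30, 5, 2]   # user -> VPN gateway -> office desktop -> server
direct      = [30, 6]      # user -> VPN gateway -> server (desktop hop removed)

# Total latency is simply the sum of the hops, so fewer hops means less delay
print(sum(via_desktop))  # 37
print(sum(direct))       # 36
```

The exact savings depend on how slow the removed hop is; relaying through a loaded desktop, for instance, can add far more delay than the few milliseconds assumed here.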
Another way to shrink latency and achieve faster download and upload speeds is to decrease the server load. Although this is easy enough with a personal VPN that can choose among servers all over the world, it becomes more difficult for small to medium-sized businesses. Because Network Attached Storage (NAS) is usually located on a business’s premises, all remote users end up accessing the same servers and storage, so decreasing server load often isn’t an option.
One of the easiest, though not the most cost-effective, ways to improve transfer times is to upgrade a user’s internet speed. While latency and bandwidth are distinct measures, a faster connection moves data more quickly once the transfer begins, so downloading or uploading files to cloud storage applications can vary widely with different internet speeds. If a user is in a remote location with a less-than-ideal internet connection, these delays can become a larger issue.
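To make the bandwidth effect concrete, here is a minimal back-of-the-envelope calculation; the file size and connection speeds are hypothetical examples:

```python
def transfer_seconds(file_mb, mbps):
    """Approximate transfer time: megabytes * 8 bits per byte / megabits per second."""
    return file_mb * 8 / mbps

# A hypothetical 500 MB file at two different connection speeds
print(transfer_seconds(500, 10))   # 400.0 seconds on a 10 Mbps link
print(transfer_seconds(500, 100))  # 40.0 seconds on a 100 Mbps link
```

This ignores protocol overhead and per-packet latency, but it shows why the same upload can feel ten times slower on a remote connection.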
Fortunately, there are alternatives to VPNs and cloud storage applications. Operating with the same functionality as a traditional NAS, a Cloud NAS offers users a low-latency solution. Instead of routing through various nodes to reach the desired server, and increasing latency along the way, the user has direct access to their files in the Cloud NAS. In the same vein, users won’t need to upload to or download from a separate storage application, as everything is immediately available. And for those remote users with unreliable internet connections? Morro Data’s CacheDrive allows large files to be instantly accessible.
Learn more about Morro Data and Cloud NAS today!