Cloud computing is the on-demand delivery of computing services over the Internet, including servers, storage, databases, networking, software, and analytics. With pay-as-you-go pricing, cloud services reduce organizations’ capital costs and provide flexible resources.
Despite the overwhelming advantages of moving computing to the cloud, many businesses are hesitant to use it for storage because of significant performance issues. The primary problem is distance: the farther data has to travel, the longer it takes to arrive. Because the cloud is, by its very nature, far from your physical infrastructure, both upload and download speeds can suffer tremendously.
And that’s just primary storage for a single location. Consider what happens when you have multiple geographically dispersed offices and need to continuously sync data and files between them, or when you intend to use the cloud as your disaster recovery solution. Because these activities rely on existing WAN connections, which are plagued with speed and performance issues, they can grind your network’s performance to a halt.
But what if you could dramatically improve that performance by blending the benefits of the cloud with a cache that keeps critical and frequently used data close? Caching is well understood and universally employed at just about every level of computing, from the network perimeter to individual endpoints, to dramatically speed up delivery of critical or frequently used data to the requestor. But currently there is no way to cache cloud-stored files locally, so moving data long distances across legacy WAN connections remains the Achilles’ heel of cloud storage. Applying the principle of local caching to the cloud would solve these crippling performance and latency issues, letting you finally reap all the benefits of the cloud without compromising network performance.
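The caching principle itself is easy to demonstrate. As a generic illustration (unrelated to any particular product), Python’s built-in `functools.lru_cache` keeps recently requested results close at hand, so repeated requests skip the slow remote fetch entirely:

```python
import functools

REMOTE_FETCHES = []  # records which keys actually hit the slow backing store

@functools.lru_cache(maxsize=128)
def fetch(key):
    """Simulate a slow remote fetch; cached results return instantly."""
    REMOTE_FETCHES.append(key)
    return f"data-for-{key}"

fetch("report.xlsx")          # slow path: goes to the remote store
fetch("report.xlsx")          # fast path: served from the local cache
print(len(REMOTE_FETCHES))    # → 1 (only one remote round trip)
```

The second call never touches the remote store, which is exactly the latency win local caching promises.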
That’s exactly what’s possible with Morro Data Hybrid Cloud NAS. Employing Cache & Sync technology and the Morro Data CacheDrive, Morro Data enables organizations to apply the caching principle to a cloud/local edge device setup. The CacheDrive is a small form factor hardware device that resides in the local office, attaches directly to the LAN, and offers 1TB to 16TB of internal storage. By saving data directly to it, businesses get robust NAS-like features, including a regular filesystem interface via a drive letter from any machine on the LAN.
It’s also important to note that cache capacity is not the same as cloud capacity; in fact, it’s common to cache only about 10 percent of all available cloud storage, enough to reap the performance benefits of a legacy NAS while keeping costs to a minimum. The CacheDrive delivers local cache capabilities that avoid the inherent latency of retrieving remotely stored files, letting you upload and access files at gigabit speed. In the background, the CacheDrive also negotiates its own WAN protocols to keep all files continuously synchronized with the cloud, rather than relying on existing cloud WAN connections. As a result, every file is continuously stored and protected in the cloud, with a local cache available for immediate access to frequently used files.
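To make the cache-plus-sync idea concrete, here is a minimal, hypothetical sketch (not Morro Data’s actual implementation, and the class and method names are invented for illustration): a fixed-size local cache with least-recently-used eviction that writes through to a cloud backing store, so every file always lives in the cloud while only the hot files occupy local capacity.

```python
from collections import OrderedDict

class CacheSyncStore:
    """Illustrative write-through cache: every file lives in 'cloud';
    only the most recently used files also live in the small local cache."""

    def __init__(self, cache_slots):
        self.cloud = {}              # authoritative copy of every file
        self.cache = OrderedDict()   # local cache, maintained in LRU order
        self.cache_slots = cache_slots

    def write(self, name, data):
        self.cloud[name] = data      # write-through: cloud is always current
        self._cache_put(name, data)

    def read(self, name):
        if name in self.cache:       # cache hit: fast local read
            self.cache.move_to_end(name)
            return self.cache[name]
        data = self.cloud[name]      # cache miss: slow cloud fetch
        self._cache_put(name, data)
        return data

    def _cache_put(self, name, data):
        self.cache[name] = data
        self.cache.move_to_end(name)
        if len(self.cache) > self.cache_slots:
            self.cache.popitem(last=False)  # evict least recently used file

store = CacheSyncStore(cache_slots=2)
store.write("a.doc", b"alpha")
store.write("b.doc", b"beta")
store.write("c.doc", b"gamma")      # local cache full: evicts a.doc
print("a.doc" in store.cache)       # → False (evicted from local cache)
print("a.doc" in store.cloud)       # → True  (still protected in the cloud)
```

The key property mirrors the article’s point: eviction from the local cache never loses data, because the cloud holds the authoritative copy and a miss simply falls back to the slower remote read.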
Cache & Sync technology enables Morro Data to deliver primary file storage with a cloud interface, multi-office sync across up to 500 locations, and backup and disaster recovery, at speeds similar to what you’re accustomed to from a NAS device, but without a NAS’s cost, complexity, and instability. To learn more about how to apply the caching principle to a cloud/local edge device deployment, visit https://www.morrodata.com/.