
Cloud adoption pushes storage virtualization

About three years ago Host.net, a colocation and managed service provider, decided to fully embrace the cloud as a new suite of offerings for customers. When migrating to the new service, the company wanted to use its legacy hardware infrastructure - mostly Dell and EMC storage servers - alongside new hardware that had been purchased for the upgrade. But it didn't want to be tied to any one vendor moving forward, in case application needs or markets changed.

Host.net officials wanted to manage these storage components in a unified control panel, with the flexibility to add additional hardware in the future, if need be. The answer they found was storage virtualization.

MORE CLOUD: Gartner: 1/3 of consumer data will be stored in the cloud by '16 

OPEN SOURCE CLOUDS: Open source cloud vendors air their differences

In a cloud world, data storage is like the foundation of a house, says Host.net CTO Jeffrey Slapp. "If it's weak from an availability or resiliency standpoint, everything else above it can fail," he says. "Storage has to be bulletproof for everything else to work."

Host.net has powered its cloud offering with a storage hypervisor from DataCore, one of a handful of companies in the storage virtualization market. Later this month, on July 28, Host.net hopes to celebrate 1,000 consecutive days of cloud storage service without any downtime, which Slapp attributes to the high availability storage virtualization has provided the company.

The idea of storage virtualization is to allow central management of disparate underlying storage hardware components. David Hill, a storage analyst at the Mesabi Group, says the technology is not new, but offerings are becoming more advanced and users are increasingly demanding the functionality as they embrace the cloud. Storage hypervisors, he says, need to be able to control storage hardware both horizontally and vertically. Horizontal means controlling different types of storage components, such as solid state drives, external drives, in-memory storage and server storage. Vertical management means managing heterogeneous arrays, both cloud-based and on-premises, along with all of their functionality, from data management to virtual machine snapshots and replication. True storage hypervisors, he says, work across multiple storage providers.
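The pooling idea Hill describes can be sketched in a few lines of code. This is a purely illustrative model, not any vendor's actual API: hypothetical `Backend` and `StoragePool` classes stand in for the hypervisor layer that presents heterogeneous hardware (Dell servers, EMC arrays, SSD shelves) as one logical pool.

```python
from dataclasses import dataclass

# Hypothetical sketch: a storage hypervisor presents heterogeneous
# backends as a single logical pool with one management surface.

@dataclass
class Backend:
    name: str         # e.g. "emc-array-1" (illustrative names)
    kind: str         # "ssd", "disk", "cloud"
    capacity_gb: int
    used_gb: int = 0

    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb

class StoragePool:
    """Unified view over disparate backends (illustrative only)."""
    def __init__(self, backends):
        self.backends = backends

    def total_free_gb(self) -> int:
        # One number for the whole pool, regardless of vendor.
        return sum(b.free_gb() for b in self.backends)

    def allocate(self, size_gb: int) -> str:
        # Naive placement: pick the backend with the most free space.
        target = max(self.backends, key=lambda b: b.free_gb())
        if target.free_gb() < size_gb:
            raise RuntimeError("pool exhausted")
        target.used_gb += size_gb
        return target.name

pool = StoragePool([
    Backend("dell-server-1", "disk", 2000),
    Backend("emc-array-1", "disk", 4000),
    Backend("ssd-shelf-1", "ssd", 500),
])
vol_location = pool.allocate(300)  # lands on the emptiest backend
```

The point of the abstraction is that the caller never names a vendor or a device: it asks the pool for capacity, and placement policy lives in one layer.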

There are a variety of products on the market, including those from DataCore, and others from hardware vendors, such as IBM, HP and Dell, which Hill says are optimized to work on their own infrastructure. SolarWinds and FalconStor each have storage hypervisors as well, along with Hitachi Data Systems and NetApp. Hill says he likes DataCore's offering for its breadth of functionality. "They're like the overnight sensation that's been around for 15 years," Hill says about DataCore.

This week DataCore released version 9.0 of its SAN Symphony software, an update that includes an offering aimed specifically at cloud service providers that can be purchased on a per-use basis. Enterprise licenses of the software are priced based on the amount of storage being managed, the company says, with prices starting below $10,000. "We are basically a software that virtualizes the disks that you already have," says George Teixeira, president and CEO of DataCore. "We run on standard servers or virtual machines and take whatever storage is available and present that up to the application servers."

Storage hypervisors, Teixeira says, should eliminate the need to rip and replace existing legacy storage infrastructure while creating a more efficient storage management platform. DataCore offers automated tiering, for example, which allocates storage resources based on application demands. By reserving high-performance storage, such as SSDs, for only high-performance use cases, it allows for more efficient utilization of storage resources and better performance. Maintaining redundant copies of all stored data provides high availability, he says.
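The tiering behavior Teixeira describes amounts to a placement policy. Here is a minimal sketch of that idea, with assumed capacities, a made-up IOPS threshold, and a hypothetical `place_volume` function; it is not DataCore's implementation:

```python
# Hypothetical sketch of automated storage tiering: hot (high-IOPS)
# workloads land on SSD; everything else goes to cheaper disk.

TIERS = {"ssd": 500, "disk": 4000}  # capacity in GB (assumed figures)
used = {"ssd": 0, "disk": 0}

def place_volume(size_gb: int, iops_demand: int, hot_threshold: int = 5000) -> str:
    """Pick a tier by demand; spill to disk when SSD is full."""
    tier = "ssd" if iops_demand >= hot_threshold else "disk"
    if used[tier] + size_gb > TIERS[tier]:
        tier = "disk"  # fall back to the capacity tier
    used[tier] += size_gb
    return tier

place_volume(100, iops_demand=8000)   # hot database volume -> "ssd"
place_volume(1000, iops_demand=200)   # cold archive -> "disk"
```

Because the policy runs in the virtualization layer, the scarce SSD tier is spent only on volumes that need it, which is the efficiency gain the article describes.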

Hill says such features give cloud service providers a way to build a highly available system, and give enterprises a way to centrally manage disparate storage components through a single pane of glass.

Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at [email protected] and found on Twitter at @BButlerNWW.

Read more about data center in Network World's Data Center section.
