In an effort to diversify its cloud offerings, NaviSite, a Time Warner Cable cloud computing and managed hosting company, announced plans to offer a storage service for customers in what one analyst says could be the beginning of service providers offering big data analysis from the cloud.
At Interop, NaviSite announced NaviCloud Intelligent Storage, a service based on the EMC Atmos cloud architecture that lets enterprise customers store, back up and share files. NaviSite has traditionally offered managed application services, which in recent years have expanded into collaboration tools and virtual desktops. Object storage is the next logical progression for the company, says Chris Patterson, product manager for NaviSite's cloud offerings.
"This is an extension of things we've already been doing," Patterson says. "We've had a nice cloud offering, but we've been looking to add complementary services on top of that." The company offers a pricing model that will start at $0.20 per GB; the service will be available beginning in June.
Richard Villars, an analyst with IDC Research, says NaviSite is solidifying its cloud offering to provide a more complete package for customers. Looking ahead, he says the next play for infrastructure companies will be continual advances in analyzing the data stored in the cloud. "We predict that if you want to be a serious high-end cloud service provider, big data services will be prominent within a few years," he says. The first step is having an object storage service. "What's become clear is that anyone who's looking to do [data analytics in the cloud] needs an object store," he says. "You have to give people a place to park that data that's going to be analyzed."
NaviSite has not said it plans to move into the big data analytics arena, but Patterson says the company is looking to continually advance its services in the coming months.
Villars says big data analytics in the cloud will become increasingly popular from an enterprise end-user perspective. Many big data analytics jobs require a large amount of compute resources for a short, temporary period, a pattern known as bursting. "One of the best use cases is not continuous analysis, but doing a job and reducing the time to do that job from four days to four hours," Villars says.
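A back-of-the-envelope look at that "four days to four hours" example shows why bursting favors the cloud. Under ideal, perfectly parallel scaling the job would need roughly 24x the compute capacity; in practice, any serial fraction of the work caps the achievable speedup (Amdahl's law). The serial-fraction figure below is an illustrative assumption, not from the article.

```python
# Speedup implied by compressing a four-day job into four hours.

baseline_hours = 4 * 24   # four days
burst_hours = 4

speedup = baseline_hours / burst_hours
print(f"Required speedup: {speedup:.0f}x")  # 24x under ideal scaling

# Amdahl's law: with a serial fraction s of the work, n workers give
# speedup 1 / (s + (1 - s) / n), no matter how many workers are added.
def amdahl_speedup(n_workers: int, serial_fraction: float) -> float:
    return 1 / (serial_fraction + (1 - serial_fraction) / n_workers)

# Hypothetical 5% serial work: 24 workers fall well short of 24x.
print(f"24 workers, 5% serial work: {amdahl_speedup(24, 0.05):.1f}x")
```

This is the economic appeal of bursting: renting 24x the capacity for 1/24th the time costs about the same as the slow run, but returns the answer days earlier.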
The more companies put their data into a public cloud, the more interested they will likely be in running some sort of analysis on that data, he says. Service providers, meanwhile, have an opportunity to leverage analytics tools such as Hadoop to offer such a service.
There have already been some moves in this direction. This month, for example, Google released BigQuery, a cloud-based big data analytics tool. Amazon Web Services already offers Elastic MapReduce, a cloud-based Hadoop service. "We're not saying that you're doomed if you don't do this, but it's where the market is going," Villars says.
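For readers unfamiliar with what a Hadoop job on a service like Elastic MapReduce actually does, here is a minimal local sketch of the map/reduce pattern, using word counting as the classic example. The data and function names are illustrative; a real job would read its input from object storage and run across many machines.

```python
# A minimal, local sketch of the map/reduce pattern that Hadoop-based
# services run at scale. Illustrative only.

from collections import defaultdict
from typing import Iterable, Iterator, Tuple

def map_phase(records: Iterable[str]) -> Iterator[Tuple[str, int]]:
    """Emit (word, 1) pairs, one per word -- the 'map' step."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs: Iterable[Tuple[str, int]]) -> dict:
    """Sum counts per key -- the 'reduce' step."""
    totals: dict = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = ["cloud storage cloud", "big data storage"]
print(reduce_phase(map_phase(logs)))
# {'cloud': 2, 'storage': 2, 'big': 1, 'data': 1}
```

The value a hosted service adds is everything outside these two functions: provisioning the cluster, shuffling intermediate pairs between machines, and tearing the cluster down when the burst is done.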
Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.