World's servers process 9.57ZB of data a year

UC San Diego research report estimates that the average worker processes about 3TB of data annually

In 2008, the world's 27 million business servers processed 9.57 zettabytes, or 9,570,000,000,000,000,000,000 bytes, of information.

Researchers at the School of International Relations and Pacific Studies and the San Diego Supercomputer Center at the University of California, San Diego, estimate that the total is equivalent to a 5.6-billion-mile-high stack of books stretching from Earth to Neptune and back to Earth, repeated about 20 times.

By 2024, business servers worldwide will annually process the digital equivalent of a stack of books extending more than 4.37 light-years to Alpha Centauri, according to a report compiled by the scientists.

The report, titled "How Much Information? 2010 Report on Enterprise Server Information," was released at the SNW conference last month.

Roger Bohn, one of the report's co-authors, compared the world's business servers to the underwater portion of an iceberg "that runs the world that we see."

"Most of this information is incredibly transient: it is created, used and discarded in a few seconds without ever being seen by a person," said Bohn, a professor of technology management at UC San Diego.

The study included estimates of the amount of data processed as input and delivered by servers as output. For example, one email message may flow through multiple servers and would thus be counted multiple times, he said.
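
As a rough illustration of that counting convention (all figures here are hypothetical, not drawn from the report), the same message adds to the tally at every server it touches:

```python
# Hypothetical illustration of the counting convention described above,
# not the report's actual methodology: a message that flows through several
# servers shows up in the workload tally once per server.
message_size_bytes = 75_000   # assumed size of a single email, roughly 75KB
servers_in_path = 4           # assumed hops: sender's mail server, relay, filter, recipient's mail server

bytes_counted = message_size_bytes * servers_in_path
print(f"One {message_size_bytes:,}-byte message is tallied as {bytes_counted:,} bytes of server workload")
```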

The workload of the 27 million or so enterprise servers in use worldwide in 2008 was estimated by using cost and performance benchmarks for online transaction processing, Web services and virtual machine processing tasks.

The scientists estimate there were 3.18 billion workers in the world's labor force at the time, each of whom received an average of 3TB of information per year.
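
The per-worker figure follows directly from the report's two headline numbers; a quick back-of-envelope check, using decimal (SI) terabytes:

```python
# Back-of-envelope check of the per-worker figure, using the article's own numbers.
total_bytes = 9.57e21   # 9.57 zettabytes processed by business servers in 2008
workers = 3.18e9        # estimated size of the global labor force

terabytes_per_worker = total_bytes / workers / 1e12   # decimal (SI) terabytes
print(f"~{terabytes_per_worker:.2f} TB of server-processed information per worker per year")  # ~3.01 TB
```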

The analysis relied heavily on data and estimates from researchers at IDC and Gartner, which compile regular reports on server sales.

As large as the numbers may seem, the three scientists who worked on the report stated that their server workload figures may be low, because server industry sales figures don't fully account for the millions of servers that Google, Microsoft, Yahoo and other companies build in-house from individual components.

The report estimates that Google runs the world's largest installed base of servers -- more than a million. It estimates that Microsoft has between 500,000 and 750,000 servers running worldwide.

"The exploding growth in stored collections of numbers, images and other data is well known, but mere data becomes more important when it is actively processed by servers as representing meaningful information delivered for an ever-increasing number of uses," said James Short, who served as research director of the project.

Short, a research scientist at UC San Diego, said that as the capacity of servers used to process the explosion of data increases, there are "unprecedented challenges and opportunities for corporate information officers."

For example, the study pointed to a sharp increase in the use of server virtualization technology beginning in 2006, as well as the more recent rise of cloud computing, where server processing power is provided as a centrally administered commodity doled out on a pay-as-needed basis.

The scientists focused their analysis on server performance per dollar rather than raw processing power. They said this measure offered "a more consistent yardstick" given the wide variety of servers used by enterprises.

For example, during the five years prior to 2008, new-server performance went up five- to eight-fold.

"While midrange servers doubled their Web processing and business application workloads every two years, they doubled their performance per dollar every 1.5 years," Bohn said.

In 2008, not surprisingly, entry-level servers -- those that cost less than $25,000 -- processed about 65% of the world's information, while midrange servers processed 30%, and high-end servers costing $500,000 or more processed just 5%, according to the 36-page report.
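
Applying those shares to the 9.57ZB total gives a rough sense of scale for each server class (a simple arithmetic breakdown, not figures quoted in the report):

```python
# Rough breakdown of the 9.57ZB total by server class, using the article's percentages.
total_zb = 9.57
shares = {
    "entry-level (under $25,000)": 0.65,
    "midrange": 0.30,
    "high-end ($500,000 or more)": 0.05,
}
for tier, share in shares.items():
    print(f"{tier}: ~{total_zb * share:.2f} ZB")   # ~6.22, ~2.87 and ~0.48 ZB respectively
```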

The report also stated that total worldwide sales of all servers remained stable at about $50 billion to $55 billion per year over the five years ending in 2008.

Lucas Mearian covers storage, disaster recovery and business continuity, financial services infrastructure and health care IT for Computerworld. Follow Lucas on Twitter at @lucasmearian, or subscribe to Lucas's RSS feed. His e-mail address is [email protected].

Read more about storage in Computerworld's Storage Topic Center.

