Facebook growth 'explosive', requires Terabit Ethernet

Data centre apps already demand 100-Gigabit version

The site has grown so fast that Facebook's data centres already need 100-Gigabit Ethernet and could ideally use 1-Terabit Ethernet, according to a senior Facebook network engineer.

The popular social-networking service's growing bandwidth needs reflect the explosion in overall network traffic that enterprises and technology companies are trying to address, according to speakers at the Ethernet Alliance's Technology Exploration Forum, held Tuesday in Santa Clara.

Facebook builds its own data centres out of many identical low-cost servers, linked via standard Ethernet, and pools their processing power to run its core applications. The network fabric that links those systems is a critical piece of the infrastructure, Facebook engineer Donn Lee told the Ethernet gathering.

"Already, there is a need for 100-Gigabit Ethernet, and where we're going for our upgrades, there is already a need for 1 terabit," Lee said. Facebook has so many servers, and those servers can process data so fast, that they could fill 64 Terabit Ethernet pipes in the backbone of one data centre, Lee said.

Ethernet is moving toward 100Gbps (bits per second), with the option of 40Gbps, but those long-awaited specifications aren't expected to be complete until later this year.

Though carrier backbone networks get a lot of attention, the escalating speeds of Ethernet may be needed first in data centres. The servers in Facebook's centres use a huge amount of network capacity as they collectively solve problems such as backing up databases, according to Lee.

"Just one of our tasks could eclipse the bandwidth of all of our users combined on any given day," Lee said.

Facebook is different from many enterprises in that it throws many servers at a single application rather than dividing up each server into multiple virtual machines. That means it faces a special challenge of knitting the many servers together. But its bandwidth challenge is rooted in fundamental advances in technology.

All server motherboards come with Gigabit Ethernet built in, and today's multicore processors can easily fill those pipes, Lee said. Meanwhile, 10-Gigabit Ethernet is the fattest standard pipe Facebook can use to tie the whole data centre together, and the company can't find switches that pack in enough 10-Gigabit ports to run the network at optimal speed, he said.
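To make that mismatch concrete, here is a rough back-of-envelope sketch in Python. The rack density and uplink counts are purely illustrative assumptions, not figures from Facebook or from Lee's talk:

# Purely illustrative oversubscription example (assumed rack layout, not
# Facebook's actual numbers): servers with built-in Gigabit Ethernet
# behind a top-of-rack switch with a limited number of 10-GbE uplinks.

servers_per_rack = 40      # assumed rack density
server_nic_gbps = 1        # Gigabit Ethernet on every motherboard
uplinks_per_rack = 2       # assumed 10-GbE uplinks toward the fabric
uplink_gbps = 10

demand_gbps = servers_per_rack * server_nic_gbps   # 40 Gbps if all NICs are busy
supply_gbps = uplinks_per_rack * uplink_gbps       # 20 Gbps of uplink capacity

# With these assumptions the rack is 2:1 oversubscribed - the kind of
# bottleneck that widens as processors outpace the network.
print(f"{demand_gbps / supply_gbps:.0f}:1 oversubscription")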

Ultimately, network bottlenecks slow innovation at Facebook, Lee said. Faster network connections would mean faster development, because new internal or customer-facing applications could be run through their paces and refined more quickly.

Ideally, each server could send a full 1Gbps to any other server in the data centre, but that would require the 64 Terabit Ethernet pipes Facebook can't buy today - or 6,400 of today's 10-Gigabit Ethernet connections, which isn't really feasible, he said. A fabric of that size would require 160 of the largest switches available, Lee said.
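The arithmetic behind those figures is easy to check. The short Python sketch below uses only the numbers quoted above; the server count at the end is an inference from them, not a figure stated in the article, and it ignores topology details such as bisection versus aggregate bandwidth:

# Rough sanity check of the figures quoted above.
backbone_tbps = 64                     # "64 Terabit Ethernet pipes"
ten_gbe_link_gbps = 10                 # one 10-Gigabit Ethernet connection
server_nic_gbps = 1                    # "a full 1Gbps" per server

backbone_gbps = backbone_tbps * 1000   # 64,000 Gbps of fabric capacity

# Expressed as 10-GbE links, this matches the article's 6,400 figure.
ten_gbe_links = backbone_gbps // ten_gbe_link_gbps
print(ten_gbe_links)                   # 6400

# Roughly how many 1-Gbps servers that capacity could feed at full rate.
servers = backbone_gbps // server_nic_gbps
print(servers)                         # 64000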

"Just imagine trying to manage these," Lee said. "There are a lot of things that are missing from this data centre fabric. All of these things are essential to running an Ethernet fabric, yet none of them exist at this scale." Lee said his job is to piece together the elements that are available.
