Behold the server farm! Glorious temple of the information age!
They're ugly. They require a small city's worth of electricity. And
they're where the web happens. Microsoft, Google, Yahoo, and others are
spending billions to build them as fast as they can.

By Stephanie N. Mehta, Fortune Magazine
August 1 2006: 12:22 PM EDT

(FORTUNE Magazine) -- Margie Backaus is standing on a 17-acre plot
outside Secaucus, N.J., shaking her head in disbelief. Garbage bags and
broken glass litter the ground, and weeds have taken over the parts of
the land that aren't bald. At least the view is nice, if you like power
lines, shipping containers, and the New Jersey Turnpike. Backaus, a
chatty, petite blonde, doesn't say a word, but you can tell what she's
thinking: This looks like a place where Tony Soprano would dump a body.

Only Backaus isn't scouting locations for a TV shoot. She's chief
business officer for Equinix, a company that operates huge
data centers for corporations and Internet companies, and she's looking
for someplace to build another facility, Equinix's third new center in
the past several years.

A colleague tries to explain why the site they're visiting on this hot
May afternoon might work: It's big enough to accommodate a
750,000-square-foot complex--equivalent to seven Costco stores
and three times the size of the megacenter Equinix is building in
phases in the Chicago suburbs. And those unsightly wires overhead are
actually a plus: The company could tap them to build its own electrical
substation to power the facility.

But Backaus just doesn't like it. More troubling to her than the
Superfund-site vibe is the amount of time it would take to construct a
new building and get it up and running. "I have to wait two years till
it's done?" she says, surveying the detritus. "I'm out."

Backaus's impatience is understandable. Equinix, a small but
fast-growing publicly traded company whose clients include Google,
Yahoo, MySpace, and other Internet powers, is bursting at the seams,
as are data centers operated by the likes of AT&T and IBM.

Competition for real estate, even ugly scraps of land such as the
Secaucus acreage, is so fierce that Equinix's brokers began
cold-calling landlords in northern Jersey when it became apparent the
company would need to expand in the New York area.

Most people don't think of it this way, but the Information Age is
being built on an infrastructure as imposing as the factories and mills
of yore. Think of all the things people have been using the Internet
for--all the e-mails, blogs, photos, videogames, movies, TV shows.

None of those bits and bytes simply float off into the ether, magically
arriving at their assigned destinations. Storing, processing, and
moving them all is heavy, heavy lifting. And the work is performed by
tens of millions of computers known as servers, all packed into data
centers around the world.

The industry term for the vast rooms full of humming, blinking
computers inside each of these complexes is "server farms," but "work
camps" would be more accurate. Consider that every time you conduct a
web search on one of Yahoo's sites, for example, you activate roughly
7,000 or more computers--and that doesn't count at least 15,000 others
that support every query by constantly poking around the Net for fresh
pages to index.
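
The article doesn't spell out the mechanics, but the fan-out it
describes is the classic scatter-gather pattern: a front-end machine
splits the search index across thousands of servers, queries them all
in parallel, and merges the partial answers. A minimal Python sketch of
the idea (the shard count is the article's figure; everything else,
including the function names, is illustrative, not Yahoo's actual
system):

    import concurrent.futures

    # Hypothetical scatter-gather fan-out: the search index is split
    # into thousands of partitions ("shards"), each served by its own
    # machine, and a single user query touches all of them in parallel.
    NUM_SHARDS = 7000  # rough per-query machine count cited above

    def search_shard(shard_id, query):
        # Stand-in for a network call to one index server; returns
        # (score, document) pairs from this shard's slice of the web.
        return [(1.0 / (shard_id + 1), f"doc-from-shard-{shard_id}")]

    def search(query, top_k=10):
        results = []
        with concurrent.futures.ThreadPoolExecutor(max_workers=64) as pool:
            futures = [pool.submit(search_shard, s, query)
                       for s in range(NUM_SHARDS)]
            for f in concurrent.futures.as_completed(futures):
                results.extend(f.result())
        # Merge the partial results and keep only the best matches.
        results.sort(key=lambda pair: pair[0], reverse=True)
        return results[:top_k]

    print(search("server farms"))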

"When you go to certain parts of a data center, it looks much more like
a factory than something high-tech," says Urs Hölzle, a senior vice
president of operations at Google.

The Great Planting of these server farms has only begun, thanks to a
revolution currently taking place in the $120 billion software
industry. Software is becoming webified: Computer programs that
traditionally have been installed on personal computers--from simple
word processing and e-mail to heavy-duty applications that help
companies manage payroll--are going online. (Bye-bye to CD-ROMs and
300-page installation manuals.)

Google in June released an online spreadsheet and earlier this year
acquired the maker of a web-based word-processing program called
Writely. The true sign of the times: Microsoft, a company that has
become synonymous with desktop software, has pledged to move a big
swath of its applications to the online world.

To handle this change, Internet companies are building their own
centers, though none of them is all that eager to talk about it.
Microsoft has been the most open--it recently broke ground on a
1.4-million-square-foot campus in Quincy, Wash., close to hydroelectric
power. Company officials acknowledge that centers in the South and
Europe will come afterward.

Yahoo, meanwhile, also has purchased 50 acres in Quincy for a server
farm. Google, which enjoys discussing its data centers about as much as
the NSA enjoys discussing its code-breaking techniques, hasn't been
able to conceal a two-building complex under construction on 30 acres
of former farmland in Oregon.

The Google facility will contain some of the estimated half-million to
one million (only Google knows for sure) servers that the company
operates to handle 2.7 billion online searches a month, its Gmail
service, and other applications. Experts figure such a project easily
could run north of $150 million; Google, of course, isn't saying.
Analysts expect that the three companies combined will devote roughly
$4.7 billion to capital expenditures this year, double 2005 levels.
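
Those raw figures imply a relentless steady-state load. A quick
back-of-the-envelope calculation (an average only; real traffic is
bursty and peaks much higher):

    # Average load implied by the article's figure of 2.7 billion
    # searches a month (a rough mean, not a peak).
    searches_per_month = 2_700_000_000
    seconds_per_month = 30 * 24 * 3600          # ~2.6 million seconds
    qps = searches_per_month / seconds_per_month
    print(f"~{qps:,.0f} searches per second, around the clock")  # ~1,042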

Then there's the enormous cost of operating these things. New and
improved microchips that can process more data mean that standard-sized
servers can do a lot more than their ancestors did, but the newest gear
also throws off more heat. And that means cranking up the air
conditioning to make sure the computers don't literally melt themselves
into slag.

Vericenter, an operator of data centers, says a rack of "blade" servers
can get as hot as a seven-foot tower of toaster ovens. It gets hot
enough that for every dollar a company spends to power a typical
server, it spends another dollar on a/c to keep it cool. No wonder
Yahoo, Google, and Microsoft all are building their server farms in the
Pacific Northwest, near hydroelectric power plants selling cheap power.
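
The dollar-for-dollar rule makes the economics easy to sketch. Assuming
a typical mid-2000s server draws on the order of 300 watts (an
assumption, not a figure from the article) and plugging in the
electricity rates quoted later in this story, the per-server bill looks
roughly like this:

    # Rough annual electricity bill for one server, applying the
    # dollar-for-dollar cooling rule quoted above. The 300 W draw is
    # an assumed figure for a mid-2000s server, not one from the story.
    SERVER_WATTS = 300
    KWH_PER_YEAR = SERVER_WATTS / 1000 * 24 * 365    # ~2,628 kWh

    def annual_cost(rate_per_kwh):
        power = KWH_PER_YEAR * rate_per_kwh
        return power * 2            # cooling doubles the power bill

    print(f"At 2 cents/kWh:  ${annual_cost(0.02):,.0f} per server-year")  # ~$105
    print(f"At 11 cents/kWh: ${annual_cost(0.11):,.0f} per server-year")  # ~$578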

"If I saved just $10 in the operation of each of those servers, that's
$10 million per year," says Greg Papadopoulos, chief technology officer
of Sun Microsystems. "So how much would you be willing to invest in
order to save $10 per server? This is exactly the discussion companies
had around the time of the Industrial Revolution."
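
The arithmetic behind that quote is worth making explicit. A hedged
sketch (the fleet size is inferred from the quote itself, not stated by
Sun):

    # Papadopoulos's jump from $10 per server to $10 million a year
    # implies a fleet of about a million machines -- the high end of
    # the Google estimates above (an inference, not a stated figure).
    servers = 1_000_000
    saving_per_server = 10          # dollars per server per year
    annual_saving = servers * saving_per_server
    print(f"Annual saving: ${annual_saving:,}")       # $10,000,000
    # The saving recurs every year, so even a multiple of that figure
    # can be a sensible one-time investment.
    for years in (1, 3, 5):
        print(f"{years}-year payback justifies ~${annual_saving * years:,} up front")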

All this talk of plants and equipment must come as a shock to some
Internet investors, who surely thought they were buying shares in
elegant, money-printing intellectual-property companies, not some
dirty, noisy, capital-intensive industry.

When Microsoft in April signaled that it would need to pour at least $2
billion more than analysts expected into cap ex in the coming year, the
stock sank 11% in a day. No, Internet companies aren't matching Ma Bell
levels of capital spending (yet): North American phone companies will
spend about $60 billion on cap ex this year, ten times what the five
largest Internet companies will cough up. But there's no question that
we're seeing a change in web economics.

No company feels this more acutely than Microsoft, which is making a
shift from a traditional software model, with its low capital costs and
robust margins, to the software-as-services model embraced by smaller,
web-native companies.

The transition is fraught with challenges: Instead of collecting big
upfront payments for software, for example, services companies subsist
on subscription revenue that trickles in slowly over a longer period of
time. But the biggest challenge lies in building and maintaining the
kind of physical infrastructure needed to distribute software, games,
and other content--plus storage--via the Net.

Despite the financial unknowns, Microsoft is forging ahead. To build
and operate its share of the new web, it has turned to Debra Chrapaty,
a feisty technologist who, in all earnestness, says things like, "I'm a
woman who loves data centers. I love how you walk near a rack of
servers and it's really hot, and how it's cool in other spots." (She's
actually part of a lively sorority: As I traveled the country checking
out data centers, I found a surprising number of women in charge--and
they all seem to keep an extra sweater in their office.)

Chrapaty's team handles the infrastructure for anything at Microsoft
that a customer might access over the Internet--from Xbox Live
multiplayer games to Hotmail--a huge job that is only going to get
bigger.

Today the company operates a handful of facilities worldwide that
occupy as much space as 12 Madison Square Gardens. Quincy, Wash., which
will be home to Microsoft's newest data center, showed up on the
company's radar eight or nine months ago partly because of its cheap
power, which Microsoft reportedly will be able to purchase for as
little as two cents a kilowatt-hour. (In comparison, engineers say
utilities in California routinely charge 11 cents a kilowatt-hour.)
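
That rate spread is easier to feel at facility scale. A rough sketch of
what a single megawatt of continuous draw costs per year at each quoted
price (my arithmetic, not the article's; it ignores demand charges and
cooling overhead):

    # Yearly cost of one megawatt of round-the-clock draw at the two
    # electricity rates quoted above.
    HOURS_PER_YEAR = 24 * 365
    for label, rate in [("Quincy hydro (2 cents/kWh)", 0.02),
                        ("California (11 cents/kWh)", 0.11)]:
        annual = 1_000 * HOURS_PER_YEAR * rate    # 1 MW = 1,000 kW
        print(f"{label}: ${annual:,.0f} per megawatt-year")
    # ~$175,200 vs. ~$963,600 -- the spread Microsoft is chasing.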

"A couple of years ago I would measure a data center in square
footage," Chrapaty says. "Now I look at megawatts of power. It is a new
way of measuring technology."

So what do these data centers look like? They're a weird mash-up of
high tech (state-of-the-art three-quarter-inch blade servers) and heavy
industry (row after row of diesel generators) wrapped in the
architectural charm of a maximum-security prison.

To enter AT&T's data center in Ashburn, Va., for example, you have to
pass through a "mantrap," a revolving-doorlike contraption connected to
a special scanner that literally reads your palm before releasing you
into the heart of the facility.

Once inside, it's hard to see what all the fuss is about. The main
computing area looks a little like a mainframe storage room circa 1960.
Everything is painted dull beige, adding to the Cold War-era feel.

I almost expect to look up at the balcony-like observation area AT&T
maintains for customers and see Ernst Blofeld or some other Bond
villain surveying the blinking lights of those magical machines called
servers.

But there's nothing retro (or, as best I can tell, overtly villainous)
about the transactions taking place on those computers. The machines
humming along in concert are processing your urgent e-mail messages or
your company's website or--critically--your kid's online game activity.

And that dull roar in the background isn't the sound of servers serving
but of air-conditioning units cranking away to keep the temperature in
the computer room wine-cellar cool. (AT&T operates four enormous
"chillers" in Ashburn that continuously pump 13.5 million gallons of
water a day.)

It may come as a surprise, but municipalities aren't necessarily
prostrating themselves to host data centers. It isn't as if these
facilities generate tons of jobs--a center such as Google's megaplex in
Oregon is expected to add 50 to 100 jobs to the local economy,
according to press reports--and most communities simply can't cope with
the infrastructure demands of a massive server farm.

"You go to a local utility and tell them that you want 30 megawatts of
power, and they're just catatonic," Janice Fetzer, who heads up
data-center construction for Equinix, says drily. "They don't have the
resources to build a city within their city." (This is part of what is
driving Microsoft and others to resource-rich areas like eastern
Washington; remote locations aren't an option for Equinix and AT&T,
which have multiple customers using their facilities.)

Fetzer is a bit like a general contractor: Part of her job is helping
scope out real estate for future data centers, but she also coordinates
a small group of mechanical and electrical engineers and architects who
design Equinix's huge facilities. And Fetzer and her team are always
looking at cheaper ways to keep the server farms cool.

One approach involves special racks that use chilled water at the
source of the heat--the computers themselves--to keep the racks from
spewing hot air into the room. This is more radical than it sounds.
"Computers and water don't really mix," Fetzer remarks. "It's a very
emotional subject."

Remember, we're talking serious cooling here. In fact, the
underappreciated science of air conditioning has become a hot topic in
the technology world, with everyone from server makers to disk-drive
designers keenly focused on ways to build cooler computers.

"The duality of heating and cooling is the greatest challenge to
electromechanical engineering in the last 30 years," says Kfir Godrich,
director of technology development for EYP Mission Critical Facilities,
a Manhattan-based engineering firm that helps design fail-safe
facilities. Even the companies whose microchips go into servers are
thinking cool.

Advanced Micro Devices, for example, is pushing its Opteron chips as an
energy-efficient solution for data centers; the company has even
launched a major marketing campaign with billboards in Times Square and
Silicon Valley trumpeting itself as an environmentally friendly
chipmaker.

Opteron puts two lower-power processors on a single chip, reducing the
electricity needs of servers that use it. Opteron's edge, claims Marty
Seyer, an AMD senior vice president, is that it helps data centers
expand their computing power without having to add real estate or draw
a lot more energy.

For Equinix, though, needing to add real estate is a good problem to
have: It means the Internet is growing and demand for its
state-of-the-art data centers is robust. In a few weeks the company
expects to disclose the location of its newest facility in New Jersey.
The details, like so many things in a data center, are top secret, but
if you're ever driving around Secaucus and see a bunch of two-megawatt
generators and seven-foot-high electrical switches sitting around
waiting to be installed, you'll know what they're for.  
