Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.
A few companies say they are using extensively re-engineered software and cooling systems to decrease wasted power. Among them are Facebook and Google, which also have redesigned their hardware. Still, according to recent disclosures, Google’s data centers consume nearly 300 million watts and Facebook’s about 60 million watts.
Now I get very confused by sciencey things. But a watt is a rate, isn't it?
So a watt isn't what you use, it's the rate at which you use it? What you use is some number of watt-hours, isn't it?
Help me out here, I’m confused. Have the NYT got their units wrong or not?
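To pin the rate/quantity distinction down, here's a quick back-of-envelope in Python (just a sketch; the 30 billion watts figure is the article's, the variable names are mine):

```python
# A watt is a rate (joules per second); a watt-hour is a quantity of energy.
# "Uses 30 billion watts" is a statement about continuous draw, like saying
# a bulb "uses 100 watts" -- run it for a while and you get energy.

power_w = 30e9              # the NYT's 30 billion watts (a rate)
hours_per_year = 365 * 24   # 8760 hours

energy_wh = power_w * hours_per_year                   # energy = power * time
print(f"{energy_wh / 1e12:.0f} billion kWh per year")  # -> 263 billion kWh
```

So quoting the figure in watts isn't wrong as such; you just have to read it as an average draw, not an amount consumed.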
OK, thanks for the explanations. It does make sense now. They say that the US uses 1/3 or so of this 30 billion, then later they say that total usage in the US is something like 76 billion kWh over a year.
Which, working back, comes to an average draw of about 8.7 GW, i.e. roughly 87% of a one-third share (10 GW) of that 30 GW. Which seems to make some sort of sense to me at least.
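For anyone who wants the working-back step spelled out, here's the arithmetic (a sketch using the article's figures; with these numbers the fraction comes out at about 87%):

```python
# Work back from annual energy (kWh) to average power (GW), then compare
# with a one-third share of the 30 GW worldwide figure.

annual_kwh = 76e9            # NYT: US data center usage per year
hours_per_year = 365 * 24    # 8760 hours

avg_power_gw = annual_kwh / hours_per_year / 1e6   # kW -> GW
us_share_gw = 30 / 3                               # one-third of 30 GW

print(f"average US draw: {avg_power_gw:.1f} GW")                         # ~8.7 GW
print(f"fraction of the 10 GW share: {avg_power_gw / us_share_gw:.0%}")  # ~87%
```

In other words the annual-energy figure and the instantaneous-watts figure are broadly consistent with each other, which is reassuring.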