Some people just don’t understand that data centers are not teakettles.

When discussing sustainability, energy conservation, and other “green” topics, one of the favorite targets is the lowly teakettle. Untold gallons of water and millions of dollars are wasted because people thoughtlessly boil an entire kettle full of water for their tea.

Indeed, a study by the U.K. Energy Saving Trust found that three-quarters of households boil more water than they need, costing them £68m (more than $100 million) a year in energy bills. Or, as design consultant Leyla Acaroglu put it in a TED talk, “One day of extra energy use from these kettles is enough to light all the streetlights in London for a night.”

Eek.

Of course, there are all sorts of other aspects. Let’s say you efficiently boil just a single cup of water for tea, but then decide you want another. Doesn’t it take more energy to boil that second cup from room temperature rather than reheating the still-warm leftover water? Or what if you don’t properly calculate the amount of water needed and end up with only a partial cup? And consider the risk avoidance cost of ensuring that you don’t boil the teakettle dry and have to replace it.
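The second-cup question is really just specific-heat arithmetic. Here's a rough sketch, assuming a 250 ml cup, a 20°C room, leftover water still at 70°C, and a lossless kettle (real kettles are less efficient, so actual numbers would be higher):

```python
# Back-of-the-envelope: energy to bring a cup of water to a boil.
# Uses the specific heat of water; ignores kettle inefficiency and
# heat lost while the water sits. All quantities are illustrative.

SPECIFIC_HEAT_WATER = 4186  # joules per kg per degree C
CUP_MASS_KG = 0.25          # ~250 ml cup

def joules_to_boil(mass_kg, start_temp_c, end_temp_c=100):
    """Energy (J) to raise mass_kg of water from start to boiling."""
    return mass_kg * SPECIFIC_HEAT_WATER * (end_temp_c - start_temp_c)

# Second cup from room temperature (20 C)...
cold_start = joules_to_boil(CUP_MASS_KG, 20)   # 83,720 J
# ...versus reheating leftover water that is still at 70 C.
warm_start = joules_to_boil(CUP_MASS_KG, 70)   # 31,395 J

print(round(cold_start), round(warm_start))
```

So yes: reheating the still-warm leftovers takes well under half the energy of starting cold, at least on paper.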

It’s obviously a lot more complicated than most people realize.

So what do teakettles have to do with data centers, anyway? A frequent lament about data centers is that they, too, waste energy (even though some data centers are moving toward renewable energy).

One of the worst offenders was a New York Times article railing about the wastefulness of the computer industry, due to data centers. “A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness,” wrote reporter James Glanz. “Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid.”

That’s a lot of teakettles.

Glanz went on to complain about diesel-based generators and other ways of backing up power, as well as criticizing administrators for overprovisioning network and server hardware. “Low efficiencies made sense only in the obscure logic of the digital infrastructure,” he noted.

The article was criticized at the time, but also had its share of supporters. “Many of the companies the Times cites are providing what amounts to an inessential service to everyday consumers,” writes Will Oremus in Slate. “Yahoo, for instance, spends huge amounts of energy storing data from people’s old fantasy football leagues.”

Fantasy football? Inessential? As if.

Such criticisms are ongoing. A more recent report from the Natural Resources Defense Council claimed that some servers were operating at 18 percent capacity or less—with up to one-third of servers no longer needed at all, which it called “zombie servers.”

The problem is, there’s no universal definition for “inessential” or “unneeded,” and there aren’t many companies that can afford to be cavalier about uptime. As one expert quoted by Glanz notes, “They don’t get a bonus for saving on the electric bill. They get a bonus for having the data center available 99.999 percent of the time.”

Well, yes. When an hour of network downtime can cost $300,000 or more, as estimated by Gartner, it makes sense to focus on availability. Even Oremus concedes, “You could also make the case that 2 percent of our energy supply is not a huge price to pay for all the services the Internet provides.”
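Those “five nines” translate into a surprisingly small downtime budget. A quick sketch, assuming a 365-day year and the $300,000-per-hour downtime cost attributed to Gartner above:

```python
# The "nines" of availability, expressed as downtime per year and
# priced at the article's cited $300,000/hour. Figures are rough.

MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600
COST_PER_HOUR = 300_000            # dollars, per the Gartner estimate

def downtime_minutes(availability):
    """Minutes of allowed downtime per year at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability)

for target in (0.99, 0.999, 0.9999, 0.99999):
    minutes = downtime_minutes(target)
    cost = minutes / 60 * COST_PER_HOUR
    print(f"{target:.3%} uptime -> {minutes:8.2f} min/yr (~${cost:,.0f})")
```

At 99.999 percent, a data center is allowed only about five minutes of downtime a year; even that sliver represents tens of thousands of dollars at the cited rate.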

Certainly there are ways to look into reducing the amount of non-renewable energy that data centers use, leveraging equipment more efficiently, and having a better idea of how much cooling a server room requires. But at the same time, data centers require redundancy and spare capacity to keep doing what they do. “Think of it this way,” writes Dan Woods in Forbes. “Roads aren’t 100 percent utilized. The telephone system isn’t 100 percent utilized. They are there when they are needed.”


Simplicity 2.0 is where we examine the intricate and transitory world of technology—through a Laserfiche lens. By keeping an eye on larger trends, we aim to make software that’s relevant to modern day workers, rather than build technology for technology’s sake.
