Data centres in America are wasting massive amounts of energy by running computer servers that do little or no work most of the time, according to research.

The Natural Resources Defense Council (NRDC) analysed energy use in data centres and found that, on average, servers were putting only 12% to 18% of the electricity powering them to productive use.

A further 30% of servers remain powered on even though they are no longer needed, while 80% of organisations said the department responsible for data centre management has no contact with the department that pays the electricity bills.

In total, data centres consumed an estimated 91 billion kilowatt-hours of electricity in 2013, which the report says is enough to power every household in New York City twice over.

The report also predicted that energy consumption in data centres would reach 140 billion kilowatt-hours a year by 2020. That would cost businesses $13bn annually in electricity bills and require the equivalent annual output of 50 large coal-fired power plants, emitting nearly 150 million tons of carbon pollution.

The report added that adopting more efficient practices across the industry could cut electricity use by 40%, saving businesses $3.8bn and 39 billion kilowatt-hours of electricity.

Pierre Delforge, director of high-tech energy efficiency at the NRDC, said: "New practices and policies are needed to accelerate the pace and scale of the adoption of energy efficiency best practices throughout the industry.

"Nearly one-third of all leased data centre space will come up for renewal over the next year, so the time to act is now."