Last month, researchers at OpenAI in San Francisco revealed an algorithm capable of learning, through trial and error, how to manipulate the pieces of a Rubik’s Cube using a robotic hand. It was a remarkable research feat, but it required more than 1,000 desktop computers plus a dozen machines running specialized graphics chips crunching intensive calculations for several months.

The effort may have consumed about 2.8 gigawatt-hours of electricity, estimates Evan Sparks, CEO of Determined AI, a startup that provides software to help companies manage AI projects. That’s roughly equal to the output of three nuclear power plants for an hour. A spokesperson for OpenAI questioned the calculation, noting that it makes several assumptions. But OpenAI declined to disclose further details of the project or offer an estimate of the electricity it consumed.
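
For a rough sense of the arithmetic behind that comparison, here is a minimal sketch in Python; the 1-gigawatt output assumed for a single plant is not a figure from the article.

```python
# Back-of-envelope check of the comparison above, assuming a typical large
# reactor's output is about 1 gigawatt (a figure not given in the article).
PLANT_GW = 1.0       # assumed output of one nuclear plant, in gigawatts
ESTIMATE_GWH = 2.8   # Sparks's estimate for the Rubik's Cube project

hours = ESTIMATE_GWH / (3 * PLANT_GW)
print(f"{ESTIMATE_GWH} GWh is what three {PLANT_GW:.0f}-GW plants "
      f"produce in {hours:.2f} hours")
# -> 2.8 GWh is what three 1-GW plants produce in 0.93 hours
```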

Artificial intelligence routinely produces startling achievements, as computers learn to recognize images, converse, beat humans at sophisticated games, and drive vehicles. But all those advances require staggering amounts of computing power—and electricity—to devise and train algorithms. And as the damage caused by climate change becomes more apparent, AI experts are increasingly troubled by those energy demands.

“The concern is that machine-learning algorithms in general are consuming more and more energy, using more data, training for longer and longer,” says Sasha Luccioni, a postdoctoral researcher at Mila, an AI research institute in Canada.

It’s not just a worry for academics. As more companies across more industries begin to use AI, there’s growing fear that the technology will only deepen the climate crisis. Sparks says Determined AI is working with a pharmaceutical firm that’s already using huge AI models. “As an industry, it’s worth thinking about how we want to combat this,” he adds.

Some AI researchers are thinking about it. They’re using tools to track the energy demands of their algorithms, or taking steps to offset their emissions. A growing number are touting the energy efficiency of their algorithms in research papers and at conferences. As the costs of AI rise, the industry is developing a new appetite for algorithms that burn fewer kilowatt-hours.

Luccioni recently helped launch a website that lets AI researchers roughly calculate the carbon footprint of their algorithms. She is also testing a more sophisticated approach—code that can be added to an AI program to track the energy use of individual computer chips. Luccioni and others are also trying to persuade companies that offer tools for tracking the performance of code to include some measure of energy or carbon footprint. “Hopefully this will go toward full transparency,” she says. “So that people will include in the footnotes ‘we emitted X tons of carbon, which we offset.’”
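
The article doesn’t describe how that chip-level tracking works, but a common approach is to poll the hardware’s onboard power sensors. Below is a minimal sketch using NVIDIA’s NVML bindings (pynvml); the one-minute sampling window and one-second polling interval are illustrative assumptions, not Luccioni’s actual design.

```python
# Minimal sketch of tracking one GPU's energy draw, assuming an NVIDIA card
# and the pynvml bindings; the fixed sampling window and polling interval
# are illustrative choices, not the design of any tool named in the article.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU on the machine

INTERVAL_S = 1.0   # how often to poll the power sensor
DURATION_S = 60.0  # sample for one minute; a real tracker would wrap the training loop

energy_joules = 0.0
start = time.monotonic()
while time.monotonic() - start < DURATION_S:
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # NVML reports milliwatts
    energy_joules += (milliwatts / 1000.0) * INTERVAL_S  # integrate power over time
    time.sleep(INTERVAL_S)

pynvml.nvmlShutdown()
kwh = energy_joules / 3.6e6  # 1 kWh = 3.6 million joules
print(f"GPU 0 drew roughly {kwh:.4f} kWh over {DURATION_S:.0f} seconds")
```

In practice a tracker like this would run alongside the training job itself, and a full accounting would also need CPU and memory power, which NVML doesn’t report.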

The energy required to power cutting-edge AI has been on a steep upward curve for some time. Data published by OpenAI shows that the computing power required for key AI landmarks over the past few years, such as DeepMind’s Go-playing program AlphaZero, has doubled roughly every 3.4 months, increasing 300,000-fold between 2012 and 2018. That far outpaces the rate at which computing power historically increased under Moore’s Law (named after Gordon Moore, cofounder of Intel), which saw performance double roughly every two years.
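
Those two figures hang together on a back-of-envelope check; the span of roughly five years between OpenAI’s measurement points is an assumption, since the article gives only the calendar years.

```python
# Check that a 3.4-month doubling time and a 300,000x increase describe
# the same trend; the exact span between the 2012 and 2018 data points
# is assumed, since the article gives only calendar years.
import math

doublings = math.log2(300_000)  # ~18.2 doublings to grow 300,000-fold
months = doublings * 3.4        # ~62 months, a bit over five years
print(f"{doublings:.1f} doublings x 3.4 months = {months:.0f} months")
```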

Recent advances in natural language processing—an AI technique that helps machines parse, interpret, and generate text—have proven especially power-hungry. A research paper from a team at UMass Amherst found that training a single large NLP model may consume as much energy as a car over its entire lifetime—including the energy needed to build it.

Training a powerful machine-learning algorithm often means running huge banks of computers for days, if not weeks. The fine-tuning required to perfect an algorithm, such as searching through different neural network architectures to find the best one, can be especially computationally intensive. For all the hand-wringing, though, it remains difficult to measure how much energy AI actually consumes, and even harder to predict how much of a problem it could become.
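
To see why that kind of search multiplies the cost, consider a minimal sketch of an exhaustive architecture sweep; the candidate grid and the 12-hour cost per run are hypothetical numbers for illustration only.

```python
# Illustrative sketch of why architecture search is computationally intensive:
# each candidate configuration demands its own full training run, so total
# compute grows with the size of the search grid. All names and numbers here
# are hypothetical, not drawn from any system described in the article.
from itertools import product

layer_counts = [2, 4, 8]
hidden_sizes = [256, 512, 1024]
learning_rates = [1e-3, 1e-4]

HOURS_PER_RUN = 12  # assumed cost of training one candidate to completion

candidates = list(product(layer_counts, hidden_sizes, learning_rates))
total_hours = len(candidates) * HOURS_PER_RUN
print(f"{len(candidates)} candidates x {HOURS_PER_RUN} h = {total_hours} GPU-hours")
# -> 18 candidates x 12 h = 216 GPU-hours, before a single model is chosen
```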
