Data centers are the foundation of our digital world, but their rapid proliferation poses a major environmental challenge amid concerns about the technology sector's carbon footprint. Could artificial intelligence itself provide the solution?
The challenge is enormous: by 2025 the sector is expected to consume 20% of the world's electricity and account for 5.5% of total carbon emissions.
With new uses and applications consuming ever more electricity, the pace could accelerate even further.
Aaron Iyengar, president of Anthere AI, a chipmaker working to develop lower-power semiconductors for artificial intelligence, acknowledged, "We have opened Pandora's box."
“We can use artificial intelligence in a way that suits climate requirements, or we can ignore these requirements and face the consequences,” he said.
The servers must be adapted now, however, just as they are being overhauled to meet the needs of artificial intelligence, in what an official at Google described as "a turning point we witness only once in a generation in the world of information."
Race for energy efficiency
Generative artificial intelligence tools, such as GPT-4, which powers the ChatGPT chatbot, and PaLM 2, developed by Google for its Bard bot, go through two phases, training and deployment, both of which consume energy.
Researchers at the University of Massachusetts found in 2019 that training one of these models can emit as much carbon as five cars over their entire life cycles.
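Estimates of this kind are typically built from the number of accelerators used, their power draw, the length of the training run and the carbon intensity of the local grid. The sketch below is a minimal illustration of that arithmetic; every figure in it (GPU count, power draw, duration, data-center overhead, grid intensity) is an assumption chosen for the example, not a number reported in the studies cited here.

```python
# Illustrative back-of-envelope estimate of training energy and carbon.
# All numbers below are assumptions for the sake of the example.

gpu_count = 1000          # assumed number of GPUs used for training
gpu_power_kw = 0.4        # assumed average draw per GPU, in kilowatts
training_hours = 30 * 24  # assumed 30 days of continuous training
pue = 1.2                 # assumed data-center power usage effectiveness
grid_kgco2_per_kwh = 0.4  # assumed carbon intensity of the grid

energy_kwh = gpu_count * gpu_power_kw * training_hours * pue
emissions_tonnes = energy_kwh * grid_kgco2_per_kwh / 1000

print(f"Energy: {energy_kwh:,.0f} kWh")            # ~345,600 kWh
print(f"Emissions: {emissions_tonnes:,.1f} t CO2") # ~138 t CO2
```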
A joint study by Google and the University of California, Berkeley recently concluded that training the generative model GPT-3 produced 552 tonnes of carbon emissions, equivalent to a car traveling two million kilometers.
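As a rough sanity check of that comparison, dividing 552 tonnes of CO2 by a typical per-kilometer figure for a petrol car lands in the same range; the per-kilometer value below is an assumption, not a figure taken from the study.

```python
# Rough sanity check of the "552 tonnes ~ 2 million km by car" comparison.

emissions_kg = 552 * 1000   # 552 tonnes of CO2, in kilograms
car_kgco2_per_km = 0.25     # assumed ~250 g CO2 per km for an average car

equivalent_km = emissions_kg / car_kgco2_per_km
print(f"{equivalent_km:,.0f} km")  # ~2.2 million km
```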
Its successor, GPT-4, was trained on 570 times as many parameters as the previous version, and that number will keep growing as artificial intelligence develops, spreads and gains capabilities.
This development was made possible by microprocessors produced by the chip giant Nvidia, known as graphics processing units (GPUs). Although these processors are more energy-efficient than conventional chips for such workloads, they still consume large amounts of energy.
Once training is complete, serving generative AI tools from the cloud consumes energy with every request they handle, a phase that ends up requiring far more energy than training itself.
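Why the serving phase can outweigh training is easiest to see by multiplying a small per-request energy cost by a very large request volume, as in the sketch below; both the per-query figure and the traffic volume are assumptions made up for the illustration.

```python
# Illustrative estimate of energy consumed by the serving (inference) phase.
# Per-query energy and query volume are assumptions for the example only.

energy_per_query_wh = 3.0     # assumed energy per chatbot request, in watt-hours
queries_per_day = 10_000_000  # assumed daily request volume
days = 365

annual_kwh = energy_per_query_wh * queries_per_day * days / 1000
print(f"{annual_kwh:,.0f} kWh per year")  # ~11 million kWh
```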
Companies can, however, choose more environmentally friendly options for this stage, since serving from the cloud does not require the most powerful processors.
Amazon Web Services, Microsoft and Google, the major cloud computing providers, say they are committed to the highest possible energy efficiency.
Amazon Web Services even pledged to become carbon neutral by 2040, while Microsoft committed to being a “zero waste company” by 2030.
Early indications suggest these groups are serious about meeting their goals. Between 2010 and 2018, the energy consumption of data centers worldwide rose by only 6%, while their use grew by 550%, according to figures from the International Energy Agency.
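Taken together, those two figures imply a large efficiency gain per unit of data-center workload. The short calculation below makes that implication explicit, using only the percentages quoted above.

```python
# What the IEA figures imply about energy per unit of data-center workload.

energy_growth = 1.06    # +6% energy consumption, 2010-2018
workload_growth = 6.50  # +550% usage over the same period

energy_per_unit = energy_growth / workload_growth
print(f"Energy per unit of workload fell to ~{energy_per_unit:.0%} "
      f"of its 2010 level")  # ~16%
```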
The major artificial intelligence companies, for their part, argue that the sector's carbon footprint is not the real issue, and that the controversy around it obscures its potential to bring about a genuine revolution.
Sam Altman, co-founder of OpenAI, the developer of ChatGPT, said, "Addressing the problem of climate change will not be difficult after we have super-powerful artificial intelligence."
“It shows how big our dreams have to be… Imagine a system that you can ask to figure out how to produce a large amount of clean energy cheaply, how to capture carbon efficiently and how to build a plant that can do that globally.”
For his part, Nvidia President Jensen Huang said that the widespread use of artificial intelligence, and the speed at which it can carry out operations, will reduce demand for cloud services and therefore the sector's consumption.
Thanks to artificial intelligence, laptops, mobile phones and cars may become energy-efficient supercomputers that do not need to use data from the cloud.
Huang told reporters, "In the future, you will have a high-precision processor in your phone, and 90% of the pixels will be generated instead of the 100% that are rendered today; only 10% will be retrieved, which means your consumption will decline."
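Huang's 90/10 split is roughly what you get when a GPU renders at a reduced resolution and an AI model upscales the result and generates intermediate frames. The sketch below shows that arithmetic for one assumed configuration (quarter-resolution rendering plus every other frame generated); the ratios are illustrative assumptions, not Nvidia's published figures.

```python
# Fraction of displayed pixels actually rendered when AI upscaling and
# frame generation are used. The ratios are assumptions for illustration.

upscale_ratio = 0.25        # assumed: render at 1/4 of the output resolution
rendered_frame_share = 0.5  # assumed: every other displayed frame is AI-generated

rendered_fraction = upscale_ratio * rendered_frame_share
print(f"Rendered pixels: {rendered_fraction:.1%}, "
      f"generated pixels: {1 - rendered_fraction:.1%}")
# -> Rendered pixels: 12.5%, generated pixels: 87.5%
```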
But some experts believe, on the contrary, that the race to develop artificial intelligence diverts attention from risks to the environment.
“Large groups are currently spending huge amounts of money to deploy artificial intelligence. I don’t think they are worried about the environmental impact at this stage, but I think that will happen later,” said Aaron Iyengar.