The GenerIA Blog

AI and environment (3/3): the systemic risks


Overloaded power grids, a return to fossil fuels, unrecycled electronic waste, skyrocketing social costs... Conventional AI's systemic and societal indicators are all flashing red.

The year is 2024. The world's 8,000 data centers already consume 0.5% of global electricity production. Now, let's look ahead five years. On a like-for-like basis, this consumption will have tripled, reaching up to 8% in North America. In the USA alone, data centers will gobble up around 88 terawatt-hours (TWh) per year, or 1.6 times the electricity consumption of the entire New York metropolitan area. For the record, Canada's annual electricity production was around 690 TWh in 2023.
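To put these orders of magnitude side by side, here is a quick back-of-the-envelope check in Python, using only the figures quoted above (the NYC value is derived from the stated 1.6x multiple, not an independent measurement):

```python
# Illustrative cross-check of the figures quoted in the text.
us_dc_twh = 88          # projected annual US data-center consumption, TWh
nyc_multiple = 1.6      # stated multiple of New York metro consumption
canada_twh = 690        # Canada's annual electricity production, TWh (2023)

# Implied consumption of the New York metropolitan area, in TWh
nyc_twh = us_dc_twh / nyc_multiple

# US data-center demand expressed as a share of Canada's entire output
share_of_canada = us_dc_twh / canada_twh

print(f"Implied NYC metro consumption: {nyc_twh:.0f} TWh/year")
print(f"Share of Canada's annual production: {share_of_canada:.0%}")
```

In other words, US data centers alone would draw the equivalent of roughly an eighth of everything Canada generates in a year.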

While figures for China and India are unfortunately lacking, it is clear that Europe is following an equally detrimental trajectory. By 2030, the needs of data centers on the Old Continent will exceed the combined current consumption of Portugal, Greece and the Netherlands. In Ireland alone, there were 82 data centers in 2022. Today, despite increasingly stringent regulatory measures, 14 more have been built or are under construction, and a further 40 have been given the go-ahead by the authorities. The culprit? Conventional, i.e. non-frugal, AI.

GPUs guilty of gluttony

No need to graduate from Stanford cum laude to understand that the energy demand linked to the multiplication of compute processors follows a near-exponential curve. That's why it is so difficult to buy professional GPUs or rent CUDA instances in the cloud nowadays. In concrete terms, the International Energy Agency estimates that, by 2027, GPUs alone could absorb 27% of the increase in electricity production in the USA, and up to 14% of all commercial energy. 14%, that's right! Goldman Sachs Research makes this projection even more alarming: according to the investment bank, the 20% mark will be passed as early as 2030.

Gartner, for its part, sees things from a complementary but equally striking perspective. For its analysts, a data center today consumes around 10 to 50 times more energy per square meter than any other business. However hard the best AI producers may try to refine the quantization of their models to make them executable on CPUs, nothing seems likely to change: as long as inference is faster on a GPU than on a CPU, especially at scale, NVIDIA's share price will continue to rise.

Power grids under tension

The explosion of this demand obviously puts pressure on the world's often aging power grids (in North America, for example, the average grid is 40 years old). What's more, AI's insatiability is compounded by various aggravating factors, such as the transition to electric mobility. Hence a legitimate question: can infrastructures handle this surge?

Already, operators are reporting signs of strain arising from the increase in renewable capacity and the re-commissioning of gas- and coal-fired power plants to meet immediate needs, particularly for AI. Despite the interconnection of grids on a continental scale, congestion and stability incidents have been rising steadily since 2021. Server farms, after all, enter into "supply competition" with local households and businesses.

Experts therefore do not rule out regional tipping points where demand exceeds supply, with their corollaries of rising prices and declining power quality. These anticipations bring a new focus to the notion of resilience, and lead the various stakeholders to refine their strategies.

For operators, in addition to sheer capacity increases, major investments are being made across the board to analyze more precisely the origin and nature of consumption peaks, in order to shed load more intelligently. Public authorities must resolve a complex equation between the environmental aspirations of the electorate, the encouragement of liberalization to bring costs down, and the orderly planning of initiatives over more or less long time frames. Last but not least, for all the major players, the location of computing and storage centers is taking priority over all other technical and political considerations. More than ever, they are competing for sites close to robust power plants that can supply them directly, i.e. without sharing.

An obstacle to the energy transition?

These trends raise questions about our society's ability to achieve its clean-energy transition goals. For example, in addition to the direct damage caused by the increased use of fossil fuels, new-energy development projects are being postponed because they are less profitable. Voices are being raised, blaming OpenAI, Google, Meta et al. for the growing delay in the widespread introduction of "zero-carbon" electricity, and even for making it impossible for certain regions to achieve it within a reasonable timeframe. Google, for example, refers in its communications to a "worst-case scenario" in which its AIs alone could consume as much energy as the whole of Ireland (29.3 TWh per year).

Depending on the local context, the problem is more or less serious. In Canada, almost 83% of electricity is fully or partly decarbonized, thanks to our hydroelectric power generation. In the U.S., the states that host the majority of computing centers (California, Virginia, Texas) benefit from additional financial resources to increase the proportion of non-fossil fuels in their energy mix. In Europe, a large fleet of nuclear power plants produces CO2-free electricity. Although unevenly distributed (France is the gold medalist in this competition, while Germany comes last), this fleet is currently being renewed, with the arrival of smaller plants and reactors that are less costly to build, simpler to commission and easier to distribute across the territory.

That being said, we're still a long way off the mark. If nothing changes, many studies consider a year-on-year increase in AI-related CO2 emissions inevitable. The most pessimistic even see these emissions doubling, overall, by the end of the decade. This is the case for Goldman Sachs Research, which also estimates the social cost of this increase, i.e. health and environmental remediation expenses, at around $125 to $140 billion in present-value terms. The energy sector is, after all, the biggest producer of greenhouse gases, ahead of transport and heavy industry.

Finally, we must mention the indirect risks resulting from the rapid obsolescence of electronic equipment. From large servers to small cell phones, the entire chain is affected. On the one hand, manufacturing this equipment has significant impacts (energy and CO2, but also mineral attrition). On the other hand, as computing power requirements rise with the ever-growing size of neural models, the equipment's remaining useful life (RUL) keeps shrinking. And on top of ever-shorter renewal cycles come the challenges of reprocessing. According to UNITAR (the UN Institute for Training and Research), barely 23% of "devices" in the broadest sense are reprocessed with their most harmful elements (lead, mercury, etc.) adequately recovered. The rest decompose day after day in unauthorized landfills, mainly in underprivileged countries that already suffer from degraded environmental conditions.

Conclusion

We can choose to do nothing, let conventional AI continue on its course and see what happens. Or we can choose reason and prefer frugal AI, the only way to reconcile legitimate productivity needs with respect for the great ecological balances. In this case, GenerIA has sustainable solutions to offer.

References

Bank of America: Global Energy Weekly

Gouvernement du Canada: Environnement et ressources naturelles

Gartner: Keep AI From Doing More Climate Harm Than Good

Bloomberg Talks: IEA's Fatih Birol

Columbia University: Projecting the Electricity Demand Growth of Generative AI Large Language Models in the US

Goldman Sachs: AI is poised to drive 160% increase in data center power demand

UNITAR: The Global E-Waste Monitor

Google: 2024 Environmental Report

In the GenerIA blog:


Regulating Frugal AI: Between Progress and Challenges...

Frugality is a radical shift in the way businesses and governments think about AI. But how do we regulate a technology that promises both performance and a sustainable environmental footprint? Let's take a look at how three major regions - Canada, Europe and the United States - are approaching the problem...


AFNOR SPEC 2314: Best Practices in Frugal AI

From project design to end-user acculturation, frugal AI is above all a matter of best practices. Numerous and complementary, these BPs are detailed in AFNOR SPEC 2314. Here is a thematic summary.


Frugal AI: A Gentle Introduction to the AFNOR SPEC 2314 Framework

Fostering innovation without hastening the attrition of natural resources. This is the rationale behind frugal artificial intelligence, whose definition, contours and practices AFNOR intends to normalize.


Telemetry, an essential component of the best AIs

Extensive telemetry brings a great deal to enterprise artificial intelligence. Performance, behavior, response biases, prompt injections... Everything that can be observed contributes to continuous optimization, thereby guaranteeing the full success of AI projects.


AI and environment (2/3): water, critical issue!

Artificial intelligence: at what cost to our water resources? Just like its carbon footprint, conventional AI's consumption of cooling water is becoming a real ecological threat.


AI and environment (1/3): alarming numbers

Insatiable for energy and a major producer of CO2, conventional artificial intelligence looks more and more like an environmental dead end. Is there any hope of sustainability? Everywhere, the numbers suggest otherwise...