Insights

Is AI Sustainable?

26 Nov 24

In brief

As sustainable investors, we think of the wider costs and benefits of new technologies. Recently we redoubled our focus on the energy demands of artificial intelligence (AI), something that we have followed for years.1 In this Insights piece, we narrow the focus to data centres, which supply the computing power for AI.

We ask a crucial question: will generative AI cause global carbon emissions to rise?

The context

Data centres already use a lot of energy. In combination with data transmission networks, they now account for 1-2% of energy-related greenhouse-gas emissions.2 As AI has become more clever, it has become hungrier for power. According to one estimate, OpenAI’s GPT-4 required 50 gigawatt hours (GWh) of electricity to train -- roughly the annual consumption of a small city.3

Once AI models are trained, they then require power to be used in real-world applications (so-called ‘inference’). It is harder to measure the power demands of inference “largely due to its distributed nature, compared with the relatively time- and location-constrained nature of training.”4 But the academic evidence suggests the energy demands of inference increase with the size of the AI model.5 The most energy-intensive tasks are those which generate new content, such as text or images, as the graph below illustrates.

Figure 1: Carbon intensity of AI tasks

Source: Luccioni, Sasha, Yacine Jernite, and Emma Strubell. "Power hungry processing: Watts driving the cost of AI deployment?." In The 2024 ACM Conference on Fairness, Accountability, and Transparency, pp. 85-99. 2024.

Academic evidence suggests that, at present, training and inference account for roughly equal proportions of AI’s lifecycle carbon emissions, with the embodied carbon of AI hardware accounting for the rest.6

Figure 2: Lifecycle carbon impact of AI

Source: Wu, Carole-Jean, Bilge Acun, Ramya Raghavendra, and Kim Hazelwood. "Beyond Efficiency: Scaling AI Sustainably." IEEE Micro (2024).

What happens as AI grows?

As AI models improve, they will require bigger and bigger data centres. On a recent podcast Mark Zuckerberg of Meta suggested that before long the world would need a 1 GW data centre to train a single AI model.7 For comparison, Northern Virginia, the biggest data centre market in the world, currently has 3.6 GW of total capacity.8 A 1 GW data centre could consume approximately 8,760 GWh of electricity annually -- close to 200 times the energy used to train GPT-4.9,10 Zuckerberg suggested that an entire power plant might be needed to power a single data centre -- an ambitious prospect, to say the least.
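The arithmetic behind that comparison is simple to check. Here is a minimal sketch, assuming (as the footnotes do) that the facility runs at its full 1 GW rating around the clock and that GPT-4's training consumed roughly 50 GWh:

```python
# Back-of-the-envelope check of the data-centre energy figures above.
# Assumptions (from the article's footnotes): the facility draws its full
# 1 GW rating all year, and GPT-4's training run totalled ~50 GWh.

HOURS_PER_YEAR = 24 * 365          # 8,760 hours

datacentre_power_gw = 1.0          # proposed training facility
gpt4_training_gwh = 50.0           # estimated GPT-4 training energy

annual_energy_gwh = datacentre_power_gw * HOURS_PER_YEAR   # GW x hours = GWh
ratio = annual_energy_gwh / gpt4_training_gwh

print(f"Annual consumption: {annual_energy_gwh:,.0f} GWh")   # 8,760 GWh
print(f"Multiple of GPT-4 training energy: {ratio:.0f}x")    # ~175x
```

The multiple comes out at roughly 175, consistent with the article's "close to 200 times" once the uncertainty in the training figures is allowed for.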

Different organisations have estimated future energy demand from data centres. The International Energy Agency (IEA) thinks it will almost double from 2022 to 2026.11 There are, however, forecasts that are much more drastic.12 It is easy to imagine a world in which ever more sophisticated AI causes global energy demand to soar, which in turn raises carbon emissions. Will this happen? There are three issues to bear in mind.

First, cloud companies may find it harder than they expect to build data centres, limiting the increase in demand for energy. As we have noted in a previous Insights piece, many rich countries make it hard to build new infrastructure.13 Local residents often resist construction near where they live. Some governments are already clamping down on data centres. There is a moratorium on data centre construction in the greater Dublin area until 2028, for instance.14 So far, in America, the global home of AI, spending on data centres continues to rise, though there is no guarantee that such heavy spending will last.

Figure 3: US monthly spending on data centre construction

Source: US Census Bureau.


Second, researchers may exaggerate the likely future energy demand from more sophisticated AI. In the past, catastrophic predictions about future data-centre power use have proven illusory.15 Things have turned out better than expected partly because data centres have become more efficient at converting energy into brainpower. Over the past decade, internet traffic has increased 25-fold, yet power usage by data centres has risen only slightly.16 Improvements in efficiency have probably accelerated during the AI era. If this continues, AI may not need as much extra power as some fear.

The third issue is the trickiest: where the power for AI comes from. It matters a great deal whether AI is powered by fossil fuels or renewable energy. In theory, cloud companies could build vast numbers of renewable-energy power plants to power their AI data centres, meeting additional energy demands without increasing carbon emissions. But things are more complicated than that.

Unfortunately, it is usually too risky to try to power data centres solely with on-site renewables. Renewables tend not to be reliable enough to provide power at all times; the wind, after all, does not always blow. As a recent Bernstein paper argues, the 'intermittency problem' does not go away even with large-scale energy storage (such as batteries) on site.17

As a result, data centres usually need to be hooked up to the grid. This could massively increase demand on grids which are already overworked. More capacity will be required. And yet, just like with data centres, there are often large regulatory and political barriers to the construction of additional power plants. “If you’re talking about building large new power plants or large build-outs,” says Zuckerberg, “and then building transmission lines that cross other private or public land, that’s just a heavily regulated thing.” Power-plant construction in the US remains surprisingly weak, according to official data.

You also have to connect the data centres to the grid. It sounds trivial, but in fact these interconnections need regulatory approval, and possibly the construction of dedicated substations, power lines and backup power systems. This whole process can be slow. For years there have been lengthy interconnection queues -- and last year in the US they grew by 27%.18 We know that last year, production by US companies involved in the transmission, control and distribution of electric power actually declined, a surprising statistic given the excitement over AI.19 It is not much of an exaggeration to say that the main constraint for AI is not energy supply per se, but the grid. As someone put it to us: “renewables are cheap, copper is expensive.” (You need copper to build and upgrade grid infrastructure.)

This leads to a worrying possibility: that to meet the demands of generative AI without relying on a constrained grid, companies will turn to fossil fuels. Fossil fuels could provide a steady, 'behind-the-meter' (i.e., off-grid) supply of energy for data centres. Experts we have spoken to suggest that to meet the extra energy demand, we might have to keep old gas- or coal-power plants running longer, or even build new ones. Already there are anecdotes of this happening. That is deeply concerning. Just as the world needs to make progress on cutting carbon emissions, we risk going backwards.

The world therefore needs to think about workarounds, to avoid a situation in which fossil fuels power the growth of AI. One solution is to do a large amount of training and inference a long way from busy population centres. Here, supplies of renewable energy tend to be more reliable. Wind farms, for instance, are typically located in rural or coastal areas where wind speeds are higher and more consistent. Rather than carrying energy across large distances to power AI models in cities, you could instead carry the data from the AI models. This approach is fine when it comes to training models. It is trickier when it comes to inference, however. If data centres were located far away from demand, there would be more 'latency' -- a delay between when you ask a question of an AI model and when you get an answer. For many AI applications this small delay might not be a big problem. But for others, latency would be far from ideal.
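The latency penalty of distance can be bounded from below with simple physics. A rough sketch, assuming signals travel through optical fibre at about 200,000 km/s (a common rule of thumb, roughly two-thirds of the speed of light in vacuum); real routes add routing, switching and queueing delay on top of this floor:

```python
# Rough lower bound on network round-trip time over optical fibre.
# Assumption: ~200,000 km/s propagation speed in fibre; actual latency
# is higher once routing, switching and queueing delays are included.

FIBRE_SPEED_KM_PER_S = 200_000

def min_round_trip_ms(distance_km: float) -> float:
    """Idealised round-trip latency in milliseconds for a one-way distance."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_S * 1000

for km in (100, 1000, 5000):
    print(f"{km:>5} km away: >= {min_round_trip_ms(km):.0f} ms round trip")
```

Even in this idealised case, a data centre 5,000 km from its users adds tens of milliseconds per round trip -- tolerable for many AI applications, but not for the most interactive ones.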

The role of cloud companies

When it comes to AI and emissions, the role of the big cloud companies cannot be overstated. Their decisions could be the difference between sustainable and unsustainable AI. We have robust internal debates over how committed these companies truly are to implementing AI sustainably. To be frank, we think it is highly unlikely that any cloud player would slow its rollout of data centres in order to meet climate targets. After all, they believe they are in a race to invent a digital god -- and so, for them, the risk of not building fast enough vastly outweighs the risk of overbuilding. Emissions at many of the largest tech companies are rising as a result of the build-out. This is a concerning development.

The cloud companies point out that the deployment of AI at scale will help reduce energy consumption, both in data centres and elsewhere. In energy generation and across the various industries that use energy, AI can help optimise decision-making, improving efficiency and minimising waste. In the footnote we provide some examples for readers interested in learning more.20 At the same time, it is crucial to think about the time value of carbon, something we have discussed in previous Insights pieces.21 How sustainable are efforts to reduce carbon emissions in the future, if they result in higher carbon emissions today?

Without question, action is needed now. Some is taking place. The biggest cloud players have robust internal targets for emissions and renewable-energy use. They often back these words with action. Microsoft recently signed a $10 billion deal with Brookfield for renewable energy. The agreement focuses on developing over 10 GW of new renewable-energy capacity, primarily through wind and solar projects. Amazon, which aims to power its operations entirely with renewable energy by next year, recently bought a 960 MW data centre in Pennsylvania, powered by a nuclear plant. Tech firms have also signalled that they are unwilling to buy data centres in areas that are expanding the use of fossil fuels.

To return to the question posed at the start of this piece: there is great uncertainty about whether AI will prove to be sustainable or not. This gives the cloud companies a grave responsibility. They have real agency here -- not only to participate in the build-out of the grid, but also to reward, punish and lobby the governments which help or hinder the rollout of renewables and grid reform. They have a real opportunity to show that their climate ambitions are as important as the business considerations of AI. They must remain on track for their climate goals and be honest when they are missing them. As sustainable investors, we will rigorously hold them accountable.

  1. See, for instance, our work that was published long before the excitement around AI: https://medium.com/generation-investment-management/unlocking-the-promise-of-artificial-intelligence-with-green-data-infrastructure-ed534908f8f and https://medium.com/generation-investment-management/green-data-embedding-sustainability-in-it-infrastructure-a15a7b92957
  2. https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
  3. https://www.ri.se/en/news/blog/generative-ai-does-not-run-on-thin-air#:~:text=The%20Cost%20of%20Training%20GPT%2D4&text=OpenAI%20has%20revealed%20that%20it,of%20energy%20usage%20during%20training.
  4. Luccioni, Sasha, Yacine Jernite, and Emma Strubell. "Power hungry processing: Watts driving the cost of AI deployment?." In The 2024 ACM Conference on Fairness, Accountability, and Transparency, pp. 85-99. 2024.
  5. Luccioni, Sasha, Yacine Jernite, and Emma Strubell. "Power hungry processing: Watts driving the cost of AI deployment?." In The 2024 ACM Conference on Fairness, Accountability, and Transparency, pp. 85-99. 2024.
  6. Wu, Carole-Jean, Bilge Acun, Ramya Raghavendra, and Kim Hazelwood. "Beyond Efficiency: Scaling AI Sustainably." IEEE Micro (2024).
  7. https://www.dwarkeshpatel.com/p/mark-zuckerberg
  8. https://www.us.jll.com/content/dam/jll-com/documents/pdf/research/global/jll-data-center-outlook-global-2024.pdf
  9. 24 hours/day × 365 days/year = 8,760 hours/year
  10. Here we assume that GPT-4 took a year to train, though we don’t know the exact figure.
  11. https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
  12. Stiefel, Klaus M., and Jay S. Coggan. "The energy challenges of artificial superintelligence." Frontiers in Artificial Intelligence 6 (2023): 1240653.
  13. https://www.generationim.com/our-thinking/insights/how-climate-nimbyism-prevents-net-zero/
  14. https://www.datacenterdynamics.com/en/news/aws-restricts-data-center-access-in-ireland-amid-power-concerns-report/
  15. https://www.science.org/doi/abs/10.1126/science.aba3758
  16. https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
  17. Bernstein, “The Electric Butterfly Effect: AI Data Centers vs. The Electric Grid — Sizing the Supply/Demand Balance through 2030”
  18. https://emp.lbl.gov/sites/default/files/2024-04/Queued%20Up%202024%20Edition_1.pdf
  19. https://fred.stlouisfed.org/graph/?g=1sZa9
  20. https://azure.microsoft.com/en-us/blog/quantum/2024/01/09/unlocking-a-new-era-for-scientific-discovery-with-ai-how-microsofts-ai-screened-over-32-million-candidates-to-find-a-better-battery/ https://blog.google/technology/ai/ai-airlines-contrails-climate-change/ https://www.technologyreview.com/2023/11/22/1083792/ai-power-grid-improvement/
  21. https://www.generationim.com/our-thinking/insights/the-time-value-of-carbon/

IMPORTANT INFORMATION

The ‘Insights 20: Is AI Sustainable’ is a report prepared by Generation Investment Management LLP (“Generation”) for discussion purposes only. It reflects the views of Generation as at November 2024. It is not to be reproduced or copied or made available to others without the consent of Generation. The information presented herein is intended to reflect Generation’s present thoughts on sustainable investment and related topics and should not be construed as investment research, advice or the making of any recommendation in respect of any particular company. It is not marketing material or a financial promotion. Certain companies may be referenced as illustrative of a particular field of economic endeavour and will not have been subject to Generation’s investment process. References to any companies must not be construed as a recommendation to buy or sell securities of such companies. To the extent such companies are investments undertaken by Generation, they will form part of a broader portfolio of companies and are discussed solely to be illustrative of Generation’s broader investment thesis. There is no warranty that investment in these companies has been profitable or will be profitable. While the data is from sources Generation believes to be reliable, Generation makes no representation as to the completeness or accuracy of the data. We shall not be responsible for amending, correcting, or updating any information or opinions contained herein, and we accept no liability for loss arising from the use of the material.