AI Energy Consumption: Balancing AI Adoption with Sustainability Goals for Irish Businesses

Hi, I'm Karen, an AI consultant and trainer based in Dublin, helping business leaders cut through the noise and make informed decisions about AI.

Lately, I've been getting lots of questions from business leaders about the energy consumption of AI tools like ChatGPT. It might be alarming for some to hear that data centres account for around 21% of Ireland's metered electricity consumption [1], a huge share compared with the global average of just 1-2%.

And as the world grapples with climate change and puts more focus on achieving sustainability targets, it's only natural to feel conflicted about using AI.

This article aims to separate fact from fiction and critically consider the impact of AI's energy consumption today and in the future. 

How Much Energy Does AI Use?

A single ChatGPT query consumes approximately 2.9 watt-hours (Wh) of energy, nearly ten times that of a standard Google search, which uses about 0.3 Wh [2].

ChatGPT's yearly energy usage for processing queries is projected to be around 226.8 GWh. To put this into perspective (a rough arithmetic check follows the list below), that amount of energy could:

  • Power about 54,000 Irish homes for an entire year [3]

  • Make approximately 9.45 billion cups of tea, which would supply the nation's tea drinkers for 20 months [4]

  • Run the entire country of Ireland for nearly 3 days
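
For readers who like to sanity-check these figures, below is a rough back-of-envelope calculation in Python. The per-home, per-cup and national-demand inputs are assumptions inferred from the references (roughly 4,200 kWh per Irish home per year, about 24 Wh per cup of tea, and a national electricity demand of around 30 TWh per year), so treat the output as an illustration rather than a precise measurement.

# Rough back-of-envelope check of the comparisons above.
# All inputs are rounded assumptions, not measured values.
CHATGPT_ANNUAL_WH = 226.8e9              # 226.8 GWh, expressed in watt-hours
WH_PER_CHATGPT_QUERY = 2.9               # per-query estimate cited above
WH_PER_GOOGLE_SEARCH = 0.3
WH_PER_IRISH_HOME_YEAR = 4_200 * 1_000   # assumed ~4,200 kWh per home per year
WH_PER_CUP_OF_TEA = 24                   # assumed ~24 Wh to boil one 250 ml cup
IRELAND_ANNUAL_WH = 30e12                # assumed ~30 TWh national electricity demand per year

print(f"ChatGPT vs Google search, per query: {WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH:.1f}x")
print(f"Irish homes powered for a year: {CHATGPT_ANNUAL_WH / WH_PER_IRISH_HOME_YEAR:,.0f}")
print(f"Cups of tea: {CHATGPT_ANNUAL_WH / WH_PER_CUP_OF_TEA / 1e9:.2f} billion")
print(f"Days of Ireland's electricity demand: {CHATGPT_ANNUAL_WH / (IRELAND_ANNUAL_WH / 365):.1f}")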

While it may be natural to get caught up in these headlines, the more productive questions I believe we should be ruminating on are:

  1. How does AI stack up in global energy usage?

  2. Could the AI tools we use consume less energy in the future?

  3. Can our future energy supply meet this growing demand?

How Does AI Stack Up in Global Energy Usage?

AI is just one of many contributors to global electricity demand. Data centres, which host AI and cryptocurrency workloads among other things, account for about 1-2% of global electricity use [5]. While that's a notable share, it's much smaller than sectors like transportation and industry, which have significantly larger energy footprints.

By 2030, the increase in electricity demand from data centres is expected to be just a fraction of the total growth in global electricity demand, with other factors like the increased use of electric vehicles and air conditioning driving most of the growth [7].

IEA (2024), Global growth in final electricity demand by use in the Stated Policies Scenario, 2023-2030, IEA, Paris, https://www.iea.org/data-and-statistics/charts/global-growth-in-final-electricity-demand-by-use-in-the-stated-policies-scenario-2023-2030, Licence: CC BY 4.0

The International Energy Agency (IEA) projects that growth in data centre energy demand, including AI, will be manageable in the coming years [8].

But Ireland's journey to manage energy supply and demand could be more challenging than in other countries. The share of total metered electricity consumption used by data centres rose from 5% in 2015 to 21% in 2023 [1].

If growth continues at the same pace, it could pose risks to Ireland's energy security. The country relies heavily on energy imports, with 82% of its energy needs coming from abroad in 2022 [6].

This high dependence on imported energy makes Ireland critically exposed to global energy market fluctuations and supply disruptions. For the country to meet future energy demands, it is essential to increase domestic renewable energy production and invest in energy storage and grid infrastructure. Ireland’s Energy Security Strategy for 2030 paves the way for this - but the key will be in how well it is executed.

As you can see, the technology landscape is complex, and the picture is far from black and white.

Could AI Use Less Energy In The Future?

Even if we accept the IEA's forecast that current energy demand from AI is manageable, it's also true that as AI applications grow, so will their energy footprint. There are 80,000 AI companies globally in 2024, and we're adopting generative AI faster than we adopted the internet or personal computers.

IEA (2024), Household adoption rates of digital technologies in the United States, IEA, Paris https://www.iea.org/data-and-statistics/charts/household-adoption-rates-of-digital-technologies-in-the-united-states, Licence: CC BY 4.0

AI uses a lot of energy because running these tools requires an enormous number of calculations, often across vast datasets, to understand prompts and generate responses. The amount of energy used depends heavily on the efficiency of the computer chips involved, such as those Nvidia is making billions from today.

Some good news here is that the efficiency of AI-related computer chips has doubled roughly every two-and-a-half to three years. A modern AI-related computer chip uses 99% less power to perform the same calculations as a chip from 2008.

IEA (2024), Efficiency improvement of AI related computer chips, 2008-2023, IEA, Paris https://www.iea.org/data-and-statistics/charts/efficiency-improvement-of-ai-related-computer-chips-2008-2023, Licence: CC BY 4.0

So, will this rate of efficiency continue? It's hard to say for certain, but Koomey's Law suggests that computing efficiency has historically doubled roughly every 1.57 years. If that trend continues, we can expect ongoing improvements in how much energy AI requires to perform the same calculations and generate the same responses.
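
To see how quickly those doublings compound, here is a small illustrative calculation in Python. The doubling periods come from the figures above (roughly 2.5 years for AI-related chips and 1.57 years from Koomey's historical trend); the ten-year projection is purely hypothetical and simply assumes the trend continues.

# Illustrative compounding of efficiency doublings; a sketch, not a forecast.
def energy_remaining(years: float, doubling_period_years: float) -> float:
    """Fraction of the original energy needed per calculation after `years`,
    assuming efficiency doubles every `doubling_period_years`."""
    return 0.5 ** (years / doubling_period_years)

# AI-related chips, 2008-2023 (15 years), doubling roughly every 2.5 years:
print(f"{(1 - energy_remaining(15, 2.5)) * 100:.0f}% less energy per calculation")

# Koomey's historical trend (doubling every ~1.57 years), projected over the next decade:
print(f"{(1 - energy_remaining(10, 1.57)) * 100:.1f}% less energy per calculation")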

The Rise of Small Language Models (SLMs)

Overview of four SLMs Microsoft has developed as part of the Phi-3 family

The rise of smaller, more focused models, known as Small Language Models (SLMs), is a significant trend in AI development. Unlike large, general-purpose models such as GPT-4o, SLMs like Microsoft's Phi-3 or Google's Gemma are tailored for tasks that do not require complex reasoning.

Microsoft suggests that a business could use Phi-3 for tasks such as summarising the main points of a long document or creating content for marketing or sales teams [9].

These developments point to a future where businesses will likely use a combination of Large Language Models (LLMs) for more complex tasks and Small Language Models for simpler ones. Today, most businesses rely solely on LLMs, but diversifying the models we use could also change how much energy our AI usage consumes.

Can Our Future Energy Supply Meet Growing Demands?

Finally, there’s the question of how we generate the electricity needed to power AI. The energy needs to be either renewable or carbon-free to achieve net zero goals and reduce global warming. The problem with renewables like wind and solar is that they are not always reliable due to weather dependency. 

Energy companies like ESB are exploring alternative fuel options, and hydrogen, a clean-burning fuel that produces no carbon emissions, is one that holds much promise. In October 2024, Microsoft signed a deal with ESB to use hydrogen fuel cells to power the power control and administration building of its Dublin data centre. However, the road to using hydrogen at scale is likely to span more than ten years.

This is where nuclear power, which produces zero carbon emissions and is operational today, is already presenting itself as part of the solution.

In October 2024, Amazon signed an agreement with Oklo [10], a California-based nuclear energy company, to power its operations with small modular nuclear reactors (SMRs) as part of its broader commitment to achieving net-zero carbon emissions by 2040. Oklo’s advanced SMRs will provide carbon-free energy, contributing to Amazon’s goal of using 100% renewable energy by 2025.

And in September 2024, Microsoft announced it had signed a 20-year deal to buy power from the Three Mile Island plant in the United States, the site of the worst nuclear accident in US history. Again, this move comes as Microsoft wants to keep growing its AI business while also meeting net-zero targets.

At first, this shift towards nuclear sounded scary to me, in particular because of the past disasters I associate with nuclear energy. However, advances in reactor technology and stringent safety protocols are making it a feasible part of a diversified clean energy strategy.

Moving Forward: Balancing AI Adoption with Sustainability Goals

AI undeniably has environmental costs, and at the same time I believe it will also be part of the solution. AI can help us find ways to reduce the energy it consumes, through more efficient chips and through smarter choices about the LLMs and SLMs we use.

For business leaders who are conscious of AI's energy consumption, consider these practical steps:

  1. Improve your team's prompt-writing capability so you get the results you want the first time, rather than needing ten prompts to get there.

  2. Explore how you might start using SLMs for certain tasks 

  3. Ask your AI providers whether they use green or carbon-free energy

  4. Clearly communicate to your employees how you intend to achieve your sustainability goals, while also adopting AI. 

  5. Take time to critically understand AI's energy consumption in the context of technological innovations that create efficiencies. 

If you're ready to elevate your team's skills in generative AI or want to inspire your audience with a powerful keynote on the future of AI and business adoption, reach out today.

References

1. Central Statistics Office (CSO), 2024, Data centres: Metered electricity consumption 2023: Key findings. Available at: https://www.cso.ie/en/releasesandpublications/ep/p-dcmec/datacentresmeteredelectricityconsumption2023/keyfindings/ (Accessed: 26 November 2024).

2. Balkan Green Energy News, 2023, ChatGPT consumes enough power in one year to charge over three million electric cars. Available at: https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/ (Accessed: 26 November 2024).

3. Selectra, n.d., Average electricity usage in Ireland. Available at: https://selectra.ie/energy/guides/average-electricity-usage (Accessed: 26 November 2024).

4. To boil a kettle for one cup of tea requires approximately 23-25 watt-hours (Wh) of energy. This calculation is based on heating 250ml of water, which is a standard cup size, from room temperature (20°C) to boiling point (100°C).

5. Central Statistics Office (CSO), 2022, Metered electricity consumption 2022: Key findings. Available at: https://www.cso.ie/en/releasesandpublications/ep/p-mec/meteredelectricityconsumption2022/keyfindings/ (Accessed: 26 November 2024).

6. Department of the Environment, Climate and Communications (DECC), 2023, Energy security in Ireland to 2030: Energy security package. Available at: www.decc.gov.ie (Accessed: 28 November 2024).

7. International Energy Agency (IEA), 2024, Electricity 2024: Analysis and forecast to 2026. Available at: https://www.iea.org/reports/electricity-2024 (Accessed: 26 November 2024).

8. International Energy Agency (IEA), 2024, World energy outlook 2024. Available at: https://iea.blob.core.windows.net/assets/02b65de2-1939-47ee-8e8a-4f62c38c44b0/WorldEnergyOutlook2024.pdf (Accessed: 26 November 2024).

9. Microsoft, 2023, The Phi-3 small language models with big potential. Available at: https://news.microsoft.com/source/features/ai/the-phi-3-small-language-models-with-big-potential/ (Accessed: 26 November 2024).

10. Amazon, 2024, Amazon nuclear small modular reactor net carbon zero. Available at: https://www.aboutamazon.com/news/sustainability/amazon-nuclear-small-modular-reactor-net-carbon-zero (Accessed: 26 November 2024).
