
ChatGPT: climate catastrophe?

The tech community has been abuzz since OpenAI's ChatGPT exploded onto the scene with its general release in December 2022.

Fears of robots taking human jobs, students cheating on exams, and even the wrongful imprisonment of a supposedly sentient artificial intelligence all attest to the hype around the subject. Financial services firms are already experimenting with AI-enhanced financial advice, with some commentators fearing that the days of human financial advisers are numbered.

Google has kicked off the search engine wars again with the closed release of Bard, its LaMDA-powered competitor to Microsoft's OpenAI-powered Bing, with Baidu announcing Ernie shortly after.

Climate change was evident in 2022, with the hottest summer ever recorded in the UK, a series of violent storms in the Americas, unprecedented drought in China and extreme flooding in India, Bangladesh, and Pakistan.

Computing has a real climate footprint: the International Energy Agency estimates that data centres and networks currently consume up to 3% of energy globally and contribute around 0.6% of all greenhouse gas emissions. And the power required for machine learning is substantial: the total compute consumed by these models has increased exponentially, with OpenAI analysts predicting the trend will continue. This article takes a closer look at the chatbot revolution through the lens of sustainable development.

First, some definitions

ChatGPT, Bard and Ernie are all examples of chat-based large language models designed to generate conversational, human-like responses to prompts from a user. In this context, a large language model is a complex statistical model which attempts to predict the correct sequences of words in response to an input. The results are, usually, impressively coherent and have wowed millions of users across the world.
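To make "predicting the correct sequences of words" concrete, here is a deliberately toy sketch: a bigram model that counts which word follows which in a tiny made-up corpus, then samples a plausible next word. Real large language models use enormous neural networks trained on billions of words, but the core task — predict the next word from context — is the same. The corpus and function names below are illustrative inventions, not anything from ChatGPT itself.

```python
import random
from collections import defaultdict

# Tiny illustrative corpus (not real training data).
corpus = "the cat sat on the mat and the cat slept".split()

# Count which words follow which: the crudest possible "language model".
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def predict_next(word):
    """Sample a plausible next word given the previous one."""
    return random.choice(follows[word]) if word in follows else None

print(predict_next("the"))  # "cat" or "mat", weighted by observed counts
```

A model like ChatGPT does the same job with billions of learned parameters instead of a frequency table, which is precisely where the computational cost discussed below comes from.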

These models require huge sets of training data. ChatGPT trained on over 300 billion words of conversational web content – that's half a terabyte of pure text. Churning through that amount of data requires enormous amounts of computing power, both during training (when a model is being developed) and during inference (when the model is making predictions). This aspect of the technology has been largely overlooked when compared to its headline-grabbing chat capabilities.

Assessing the impact

The computational abilities of these models, and machine learning solutions in general, are staggering, but all that computation power requires serious amounts of energy.

Exactly how much is hard to gauge, since providers don't tend to publish the details. According to tweets from OpenAI founder Sam Altman, ChatGPT currently costs less than ten US cents per chat, but the overall compute charges are "eye-watering". And this is just a single pilot!

Conversational large language models like ChatGPT will need frequent retraining on current data to provide up-to-date results. Traditional search engines are estimated to use a fraction of the energy by comparison.

Reducing the impact

Since detailed information on language model energy use is hard to find, researchers have suggested generic advice to reduce climate impact, broadly grouped into two categories:

Computing efficiency

  • ChatGPT and its peers are among the first of their kind generally available to the public. Improving algorithmic efficiency by even a single percent could result in big savings across a model serving millions of users.
  • ChatGPT itself is said to run on thousands of Nvidia Graphics Processing Units (GPUs) geared towards AI tasks. Efficiency could be improved by exploiting the new wave of purpose-built Application-Specific Integrated Circuits (ASICs) instead of traditional hardware.

Operational efficiency

  • Providers like OpenAI use data centres across the world. Public information is understandably kept scarce, but the geographical location of data centres has a large effect on their climate footprint. Hosting power-hungry services in countries with greener energy infrastructure can lessen the impact.
  • Improving data centre efficiency – often measured using metrics like Power Usage Effectiveness (PUE) – can have an enormous impact, but this is an article in itself!
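For readers unfamiliar with PUE, the metric itself is simple arithmetic: total facility energy divided by the energy that actually reaches the IT equipment, so 1.0 is the unattainable ideal. The figures below are purely illustrative, not measurements from any real data centre.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy.

    A PUE of 1.0 would mean every watt drawn by the building goes to
    computing; anything above that is overhead (cooling, lighting, power
    distribution losses).
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1.5 GWh total draw, 1.0 GWh reaching the servers.
print(round(pue(1_500_000, 1_000_000), 2))  # 1.5
```

Driving that hypothetical 1.5 down towards the industry-leading figures near 1.1 would cut the overhead energy by roughly three quarters without touching the computing workload at all.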

So what?

There is no doubt that large language models are going to disrupt industries like financial services. However, without careful management, there is a risk that the proliferation of poorly optimised models could have an outsized climate impact compared to the benefits these services deliver. Firms which value sustainable development should keep this in mind when making their first moves with these exciting new technologies.
