As the QuitGPT movement gains traction, individuals concerned about the environmental consequences of artificial intelligence (AI) are questioning whether opting out is a viable choice.
Change by degrees offers sustainable living advice and life hacks every Saturday to help reduce household carbon footprints.
If you have questions or tips on reducing household emissions, please email changebydegrees@the.com.
Only a few years have passed since the release of AI tools, yet the rapid integration of AI into various sectors has triggered a surge in datacentres, accompanied by increasing environmental costs.
According to the International Energy Agency, global datacentre power demand is growing faster than all other sectors combined and is projected to surpass Japan’s electricity consumption by 2030.
In Australia, the energy market operator forecasts that datacentre energy demand will triple within five years, exceeding the electricity used by the nation’s electric vehicle fleet by 2030. Authorities also expect significant pressure on the power grid.
With the QuitGPT movement—a boycott of AI due to concerns over surveillance and weaponization—gaining momentum, the question arises: should those worried about AI’s environmental footprint also consider opting out?
How bad is AI for the environment?
Estimates vary, but most studies indicate that generative AI models—which produce text, images, and video—consume substantially more energy than traditional computing methods.
Some assessments put the energy consumption at several times higher; others suggest the gap is far larger still. How much depends on the specific model and the type of query.
Professor Jeannie Paterson, co-director of the Centre for AI and Digital Ethics at the University of Melbourne, highlights the issue of limited transparency from technology companies regarding the energy, water, and emissions impacts of AI and datacentres.
“But it’s clear that training models and running datacentres is an energy intensive task,” she says.
“Consumer software that generates text, images and videos are uniquely energy inefficient,” says Ketan Joshi, an Oslo-based climate analyst associated with the Australia Institute, because of the “vast datasets and computational strain of pattern-matching that happens underneath the hood”.
He notes that querying an AI chatbot consumes significantly more energy than obtaining the same information through a simple web search or calculator. He likens this extra demand to driving an SUV to the store instead of riding a bike.
“You might still get the shopping done, and that single trip alone may not even look all that bad in terms of cost or emissions, but what happens when that’s all of your trips, and when all of society starts doing this?”
One study estimates AI’s global carbon footprint to be between 32.6 and 79.7 million tonnes of CO2 emissions in 2025, with water usage ranging from 312.5 to 764.6 billion litres—comparable to global bottled water consumption.
In Australia, the expansion of datacentres for AI data processing and storage threatens to slow the energy transition, increase emissions, and raise power costs for consumers.
“That’s a lot of energy demand for unclear or small societal benefit,” Joshi says. “Compare that to the global benefit of video-calling technology, which has reduced flights and enabled communication during the pandemic.”
AI is everywhere. Is it possible to opt out?
AI tools are increasingly integrated into workplace and educational software, as well as chatbots used by banks and local governments. Generative AI is also being implemented in supermarket self-checkouts, facial recognition systems at hardware stores, and for transcribing medical notes.
“We’re becoming immersed in this technology,” Paterson says. “It’s really hard to avoid.
“But we still have a chance to express our views about what and how we want AI to be used.”
There are small actions individuals can take to limit AI use, much as they might conserve energy by switching off lights and appliances. People can unsubscribe from AI platforms, exclude AI-generated results from search queries (for example, by appending “-AI” to a search), or avoid energy-intensive tasks such as text-to-video prompts or AI-generated images for celebrations or presentations.
“Meta, Google and Microsoft have all baked [generative AI] deep into their systems,” Joshi says. “I see this all as very much part of the tactic of trying to embed these systems into society and instil dependency in a fashion similar to the growth of single-use plastics in the 1970s.”
He considers opting out a meaningful form of resistance.
“It’s partly about not creating that energy demand but mostly about being part of broad collective action against [a] corrosive, harmful industry.”
While consumer boycotts can be effective, Joshi expresses disappointment with QuitGPT’s approach of redirecting users from one AI platform to another rather than encouraging a complete withdrawal from AI. QuitGPT has urged users to cancel ChatGPT subscriptions while promoting Anthropic’s Claude, which Joshi views as a cynical exploitation of widespread opposition to AI.
What about the impacts of datacentres on local communities?
Datacentres, rapidly increasing in both number and size, are the physical face of the AI boom, and there is growing advocacy for holding the industry accountable for its environmental impacts.
A coalition of energy and environmental organizations—including the Clean Energy Council, Electrical Trades Union, Australian Conservation Foundation (ACF), and Climate Energy Finance—has proposed a set of "public interest principles for datacentres". These principles call for investment in renewable energy and responsible water use.
“If you want to build a datacentre, you should have to build the renewables and water recycling to power it,” the ACF chief executive, Adam Bandt, says. “Big tech corporations should be forced to do their fair share so they don’t drain our resources.”
Beyond energy, water, and emissions, datacentres can impact local communities and wildlife due to their large warehouse-like structures, continuous lighting, and constant operation of air conditioning systems.
Some communities have actively opposed the construction of large datacentres in their areas.
Dr Bronwyn Cumbo, a transdisciplinary social researcher at the University of Technology Sydney, explains that these facilities are often clustered in industrial hubs rather than isolated buildings.
“Of course, it’s in their interest to communicate, engage with the community, incorporate local knowledge, think about the local concerns, because they do want to be a good neighbour. But the incentive to be a good neighbour really depends on the company.”
Cumbo notes that discussions about AI’s relationship with the physical environment and its social, political, and economic implications are intensifying.
“There is an inevitability to AI being part of our lives but how it’s part of our lives is something we can definitely control.”