In the heart of Silicon Valley, a curious debate has emerged surrounding the environmental implications of artificial intelligence (AI) interaction. Some have proposed that forgoing polite phrases like “please” and “thank you” when querying AI models like ChatGPT could contribute to a more sustainable future. While this notion seems plausible at first glance, it conceals deeper complexities about the energy demands of AI systems and the broader environmental footprint that comes with them.
The Illusion of Impact
The idea that dropping courtesy from conversational prompts could lessen energy consumption has a kernel of truth: AI models process text token by token, so longer prompts do require slightly more computation, which translates to marginally higher energy use. Sam Altman, CEO of OpenAI, has acknowledged that even small per-query costs add up across billions of daily interactions. However, the energy expended on a few extra words pales in comparison to the substantial resources consumed by the data centres that house and run these models.
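The scale mismatch described above can be made concrete with a rough back-of-envelope calculation. Every figure in this sketch (per-token energy, token counts, query volumes, baseline data-centre demand) is an illustrative assumption chosen only to show orders of magnitude, not a measured value:

```python
# Back-of-envelope sketch: the marginal energy of a few polite tokens per query,
# compared against baseline data-centre electricity use.
# ALL numbers below are illustrative assumptions, not measurements.

ENERGY_PER_TOKEN_WH = 0.001      # assumed: ~1 mWh of electricity per extra token
POLITE_TOKENS = 4                # assumed: "please" + "thank you" ≈ 4 extra tokens
QUERIES_PER_DAY = 1_000_000_000  # assumed: one billion AI queries per day

# Daily energy attributable to politeness alone, in kWh
polite_kwh_per_day = ENERGY_PER_TOKEN_WH * POLITE_TOKENS * QUERIES_PER_DAY / 1000

# Assumed baseline: global data-centre electricity use of ~400 TWh per year
DATA_CENTRE_KWH_PER_DAY = 400e12 / 1000 / 365

share = polite_kwh_per_day / DATA_CENTRE_KWH_PER_DAY
print(f"Politeness overhead: {polite_kwh_per_day:,.0f} kWh/day, "
      f"a {share:.6%} share of assumed data-centre demand")
```

Even with these generous assumptions, the polite words account for a vanishingly small fraction of total demand, which is the article's point: the infrastructure, not the phrasing, dominates.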
This persistent misconception raises an important question: why do so many people still perceive AI as an ethereal technology, devoid of physical consequences? At the same time, the very existence of this debate suggests a growing awareness that the digital realm is not as immaterial as it seems, and that sentiment deserves further exploration.
The Heavy Footprint of AI Infrastructure
Artificial intelligence is fundamentally reliant on extensive data centres, equipped with high-density computing infrastructure. These facilities are energy-hungry, requiring significant electricity and continuous cooling, while also being intricately linked to broader systems of energy supply, water, and land use. As AI becomes more prevalent, its environmental footprint expands, leading to critical questions about how we assess its impact.
Unlike standard digital services, which often incur energy costs primarily during data retrieval, AI requires fresh computations for each query. This means that every interaction demands a new cycle of energy consumption—a characteristic that positions AI more as a resource-intensive infrastructure than as mere software. Recent research published in the journal *Science* highlights that data centres already account for a notable share of global electricity use, with projections indicating a potential doubling of this demand by the decade’s end if current trends continue.
The Broader Environmental Context
The energy consumption of data centres is just one facet of the environmental challenges posed by AI. The operation of these facilities also necessitates large volumes of water for cooling, and their establishment and maintenance involve land and material resources. This creates localised impacts, even when the services they provide are global in scope.
Take New Zealand as a case in point. Although its renewable energy resources make it an appealing destination for data centre operators, the influx of new demand can strain local energy grids. The perception that renewable energy sources mitigate these impacts does not always hold true, particularly when existing generation capacity is not expanded to accommodate new consumption.
Viewing AI through a systemic lens reveals that it introduces additional stress onto regions already grappling with climate change, population growth, and competing resource demands. The interconnectedness of energy, water, land, and infrastructure necessitates a comprehensive approach to understanding and managing these impacts.
Rethinking AI’s Role in Resource Management
As we navigate the complexities of climate adaptation and long-term urban planning, it is crucial to consider AI infrastructure as an integral part of our physical landscape. Decisions surrounding land use, water management, and energy supply must incorporate the demands of AI systems rather than treating them as isolated digital services.
Focusing on superficial behavioural changes, such as modifying the way we phrase prompts, distracts us from addressing the more pressing systemic challenges. We must engage in discussions about how AI infrastructure can be harmonised with energy planning, water resource management, and land-use priorities to ensure it does not compete unfavourably with other societal needs.
Why it Matters
The fascination with trimming polite language from AI queries says less about actual energy savings than about a growing public consciousness of AI's tangible environmental footprint. That awareness opens the door to a more nuanced dialogue about how AI interacts with our existing infrastructures and ecosystems. As AI becomes further integrated into our lives, recognising its physical presence and resource demands will be pivotal to shaping sustainable strategies that balance technological advancement with environmental stewardship.