The real environmental impact of Artificial Intelligence is still largely unknown to most people. A significant gap, considering that this technology, however revolutionary, entails a considerable ‘ecological cost’ that spans multiple areas. Let’s therefore provide some useful information on the subject, so that we can approach its use with greater awareness.

‘Data centers’ are, undoubtedly, the beating heart of Artificial Intelligence: a fascinating, almost poetic image, which, however, clashes with the problem posed by their disproportionate energy requirements. Their sheer scale makes this evident: these are enormous structures, often sprawling across areas equivalent to multiple football fields (*1), inside which thousands of servers (*2), neatly arranged in endless rows, ceaselessly perform billions of calculations, twenty-four hours a day. A system of this scale obviously needs vast amounts of electrical energy to operate, not to mention the energy needed to ‘keep in check’ the heat generated by so many machines. According to updated data from the International Energy Agency, data centers are estimated to consume approximately 2% of the world’s electricity, a percentage set to grow dramatically in the coming years: many projections point to a doubling of this demand by 2030.
*1: The largest data centers comfortably exceed half a million square meters in size;
*2: A ‘server’ is a computer (or a program) that provides data or services to other computers, called clients, over a network such as the Internet.

Although, at first glance, the cooling of the machines required to run Artificial Intelligence may seem an entirely negligible detail to non-experts, it actually represents one of the most significant items of energy expenditure in this field. Keeping the heat of the computers that ‘power’ the new technology in check in fact requires enormous refrigeration systems, from ventilation units to liquid-cooling heat dissipators.
To give a fairly precise idea of the actual figures involved, it is enough to consider that, currently, a 100-megawatt hyperscale data center requires approximately 2 million litres of water per day to operate correctly.
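The two figures above can be combined into a rough water-intensity estimate. This is a back-of-envelope sketch only: the assumption that the facility runs continuously at full load is ours, a simplification, not operator data.

```python
# Rough sanity check of the figures quoted above: a 100-megawatt data
# center, assumed (our simplification) to run flat out 24 hours a day,
# using roughly 2 million litres of water per day.
power_mw = 100                      # assumed continuous load, megawatts
energy_mwh_per_day = power_mw * 24  # 2,400 MWh of electricity per day
water_litres_per_day = 2_000_000    # figure quoted in the text

litres_per_mwh = water_litres_per_day / energy_mwh_per_day
print(f"{litres_per_mwh:.0f} litres of water per MWh")  # → 833 litres per MWh
```

In other words, under these simplified assumptions, every megawatt-hour of computation carries a water cost of several hundred litres.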
It goes without saying that the geographical location of these facilities can profoundly affect their environmental impact: for instance, one in Iceland, powered by geothermal energy, will have a considerably lower ‘ecological footprint’ than one dependent on the coal of a less ‘virtuous’ region.

Framing the environmental impact of Artificial Intelligence in global terms risks losing sight of the ecological cost of each individual interaction that every one of us has, every day, with it. A cost that is far from negligible, even when we perform apparently trivial tasks, such as asking a chatbot to proofread a text or suggest a recipe.
To put it in concrete terms: a single query to a large language model such as ChatGPT consumes roughly ten times the energy required by a standard Google search. It’s clear, then, that the hundreds of millions of such interactions occurring daily worldwide paint the picture of a technology that is not exactly ‘green’. It is estimated that ChatGPT alone consumes each day an amount of electricity comparable to that needed to power around 33,000 European households. And this considering only text-based queries, excluding image and video generation, which is far more energy-intensive.
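To see what the household comparison implies in absolute terms, one can invert it. The average household consumption figure used below (around 3,500 kWh per year) is our own illustrative assumption, not a number from the estimate itself.

```python
# What does "around 33,000 European households" imply in absolute terms?
# Illustrative only: the ~3,500 kWh/year average household consumption
# is an assumed figure, not one quoted in the article.
households = 33_000
kwh_per_household_per_year = 3_500              # assumed EU average
kwh_per_household_per_day = kwh_per_household_per_year / 365

chatgpt_kwh_per_day = households * kwh_per_household_per_day
print(f"≈ {chatgpt_kwh_per_day / 1000:.0f} MWh per day")  # → ≈ 316 MWh per day
```

Hundreds of megawatt-hours per day, in other words, for text queries alone.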
The point is not to do without AI, but to use it with the same awareness and care that many of us already apply to other everyday consumption habits.

Given its fundamental importance, it should come as no surprise that the environmental sustainability of Artificial Intelligence is increasingly protected by law. The European Union, often a forerunner in regulatory matters, has already included in its AI Act assessment criteria for high-risk systems and standards relating to energy efficiency and consumption transparency for high-impact models. Other jurisdictions too, including those of the United States, the United Kingdom, and several Asian states, are introducing, albeit more slowly, specific energy-efficiency requirements for data centers through dedicated regulations. Finally, some advocates of ‘environmental taxation’ are beginning to propose the introduction of specific levies on CO₂ emissions generated by the production of electricity used in data centers: true ‘carbon taxes’ that would target the fossil fuels employed in the sector, but which remain, for the time being, at the stage of theoretical proposals.

The enormous energy consumption of data centers is not the only cause of the high environmental impact associated with Artificial Intelligence: one must also consider the significant energy expenditure involved in producing the chips essential to the functioning of this new technology. According to a 2025 report by Greenpeace, electricity demand in the AI chip-manufacturing sector increased by 351% between 2023 and 2024, leading to a corresponding 357% rise in greenhouse gas emissions. This is largely because most factories are concentrated in East Asia, particularly in Taiwan, South Korea, and Japan, where power grids are largely fuelled by fossil fuels. The outlook for the future is not very encouraging: it is estimated that by 2030 the sector’s energy consumption could increase by up to 170 times compared to 2023 levels.

Although it may at first glance seem like a harmless activity, simply training an advanced AI model can require weeks, if not months, of uninterrupted computation by thousands of processors working simultaneously (in ‘parallel’). Such an activity, as is easy to understand, demands an enormous amount of electricity, not to mention the additional power needed to remove the heat generated by the operation of the machines. Among the many critical aspects that are often overlooked, it should also be highlighted that tech companies typically need hundreds of attempts before achieving a satisfactory result. A striking example concerns the training of the GPT-3 model, which consumed approximately 1,287 MWh of electricity, generating 552 metric tons of CO₂ emissions (*1).
*1: 2021 paper by Patterson et al., Google/UC Berkeley;
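Dividing the two GPT-3 figures quoted above gives the implied carbon intensity of the electricity used for training; no assumptions beyond those two numbers are needed.

```python
# Implied carbon intensity of the electricity used to train GPT-3,
# derived solely from the two figures quoted in the text.
energy_mwh = 1_287          # training energy consumption, MWh
emissions_t = 552           # resulting emissions, metric tons of CO2

kg_co2_per_kwh = (emissions_t * 1000) / (energy_mwh * 1000)
print(f"{kg_co2_per_kwh:.2f} kg CO2 per kWh")  # → 0.43 kg CO2 per kWh
```

A value in that range is typical of a grid mix with a substantial fossil-fuel share, which is consistent with the article’s broader point about where these workloads tend to run.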

The exponential rise in the use of Artificial Intelligence, and above all the proliferation of the machinery required for its operation, entails a type of ‘environmental cost’ that is often underestimated: the massive production of electronic waste. According to a study published in Nature Computational Science, the new technology could generate between 1.2 and 5 million metric tons of e-waste over the period from 2020 to 2030. None of this, on closer inspection, should come as a great surprise: much of the hardware used in data centers is subject to an incredibly rapid process of obsolescence that renders it ‘unusable’ within a surprisingly short period, generally between two and five years, thus requiring continuous replacement.
The recycling of this ‘electronic junk’ is a double-edged sword: while it is true that circuits contain precious metals such as copper, gold, and silver, it is equally true that they contain significant quantities of toxic materials such as lead, mercury, and chromium, which are notoriously difficult (and, as such, costly) to dispose of. This explains why the current e-waste treatment rate is so disappointing: only 22% of electronic waste completes the recovery cycle, while the fate of the remaining share is largely unknown, with an estimated portion ending up in landfills in developing countries.

Among the many strategies being planned to fight the environmental impact of Artificial Intelligence, one of the most promising is undoubtedly so-called ‘Edge Computing’. What does it entail? The answer is straightforward: it means progressively shifting the enormous volume of computation required by this new technology away from centralised data centers and onto the ‘peripheral’ devices that people use on a daily basis. Running AI algorithms on the millions of smartphones currently in circulation, for instance, can effectively optimise energy consumption (with estimates suggesting a reduction of between 14% and 25% compared to centralised architectures), thereby considerably relieving the burden on large-scale infrastructure.

It’s almost surprising to discover that Artificial Intelligence, currently a source of considerable concern from an environmental impact standpoint, could soon become an effective solution to the very problem it helps create. A case in point is the work of Orbital Materials, a company specialised in the development of innovative materials which, largely thanks to AI, has invented one capable of capturing carbon emissions, much in the same way a sponge absorbs water. Through its collaboration with Civo, a British cloud provider focused on sustainable infrastructure, Orbital is planning to integrate this technology into a data center in the United Kingdom, as part of a pilot project that, should the results live up to expectations, could transform these infrastructures from contributing factors in climate change into effective tools for fighting it.

Among the most revolutionary innovations that the (near) future may have in store for us, ‘space’ data centers are undoubtedly among the most fascinating. Positioned in Low Earth Orbit (LEO), these highly advanced structures will be powered by solar energy and cooled, effectively and at zero cost, by the icy void of space, thus eliminating the need for traditional heat dissipation systems. According to a study by Nanyang Technological University, published in the peer-reviewed scientific journal Nature Electronics, such an architecture could enable carbon emission savings of up to ten times those achieved by currently available solutions.
Although the project is held back by significant obstacles, such as vulnerability to cosmic radiation, which would require the use of specialised processors, projections suggest that a first operational test could take place by 2027: Google is indeed already planning the launch of two prototype satellites in partnership with Planet, to put this promising technology to the test in real-world conditions.

It’s interesting to discover that Artificial Intelligence, whose operation is the source of growing environmental concerns, could at the same time help to resolve them, or at least mitigate their effects. We are indeed witnessing the development of a large number of applications based on this technology and designed specifically to optimise energy consumption. A few examples: ‘smart’ power grids capable of managing distribution more efficiently, algorithms that minimise energy use in industrial production processes, and software that improves the performance of public transport. According to the International Energy Agency (IEA), the widespread adoption of AI applications could reduce CO₂ emissions by 1.4 gigatons by 2035, roughly five times the emissions produced by data centers in the same year, a result that would give this emerging technology a pivotal role in the fight against climate change.
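The IEA comparison above also implies an emissions figure for data centers themselves in 2035, derived purely from the two numbers quoted.

```python
# The ratio quoted above implies a 2035 emissions figure for data
# centers, using only the two numbers in the IEA comparison.
avoided_gt = 1.4        # potential CO2 savings from AI applications, Gt
ratio = 5               # savings ≈ five times data-center emissions

data_center_gt = avoided_gt / ratio
print(f"≈ {data_center_gt:.2f} Gt of CO2 from data centers in 2035")  # → ≈ 0.28 Gt
```

On this reading, the potential savings would comfortably outweigh the sector’s own projected footprint, which is precisely the article’s point.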

The images on this web page were created using generative Artificial Intelligence tools.