https://www.lifegate.it/impatto-ambientale-ia
The daily cost of ChatGpt, OpenAI's chatbot, would be around 700 thousand dollars, according to an estimate from last year, while other analyses put the figure as high as a million per day. What is certain is that maintaining and operating such products is very expensive, and the bills can only grow as the services spread. OpenAI, and other companies in the sector, would therefore be trapped in a vicious circle: reaching new users to increase revenue, knowing that each user can only increase their costs.
A technological breakthrough is needed to make AI more sustainable
This is also why Microsoft, an investor and commercial partner of OpenAI, is working on a new type of chip that should make these systems more efficient and less energy-intensive, and therefore more sustainable. Sam Altman, co-founder and head of OpenAI, himself recently declared that the only way to resolve the issue is a "technological breakthrough" capable of making the development of AI cheaper. Take Gpt-4, the large language model underlying ChatGpt, that is, the one that allows it to understand and generate text and images: in addition to the enormous initial cost of so-called "training", every request and interaction with Gpt entails a considerable consumption of energy. To date, however, there are no signs of such a breakthrough on the horizon.
The sustainable alternatives available on the market
Thus, while Microsoft invests in chips, Altman tries to circumvent the issue and dreams of very low-cost energy sources. For some time, in fact, the entrepreneur has been investing in Helion, an experimental nuclear fusion startup whose goal is to create "artificial suns", infinite and very powerful sources of clean energy. Nuclear fusion is now the holy grail of renewable energy but, thanks to Altman's pressure, Microsoft has signed an agreement to purchase energy from Helion starting in 2028. In the meantime, however, there are sustainable alternatives available today that we could focus on.
+775%
The problem is destined to grow with the further diffusion of AI in the coming years. According to research from the University of Washington, handling hundreds of millions of requests to ChatGpt would consume approximately one gigawatt-hour, "or the equivalent of the energy consumed by 33 thousand US homes". According to Arijit Sengupta, CEO of a company in the AI sector, current adoption is at around 1 percent of what these systems will reach in the coming years, a rosy prospect for the sector that nonetheless hides an enormous environmental risk and the possibility of "a rather serious energy crisis".
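As a rough sanity check on that comparison, a few lines of arithmetic suffice. The assumption here (ours, not stated explicitly in the article) is that both the ~1 gigawatt-hour figure and the "33 thousand US homes" benchmark refer to daily consumption:

```python
# Back-of-envelope check of the University of Washington comparison.
# Assumption: the ~1 GWh figure and the household benchmark are daily totals.
GWH_TO_KWH = 1_000_000           # 1 gigawatt-hour in kilowatt-hours
chatgpt_daily_kwh = 1 * GWH_TO_KWH
homes = 33_000

kwh_per_home = chatgpt_daily_kwh / homes
print(f"Implied consumption per home: {kwh_per_home:.1f} kWh")
# ~30.3 kWh, close to the roughly 29 kWh an average US household uses per day
```

The implied per-home figure landing near typical US daily household usage suggests the comparison is internally consistent.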
Artificial intelligence is only part of the energy consumption and CO2 emissions of the entire digital sector, whose environmental impact is destined to grow exponentially in the coming years. Between now and 2040, according to a study recently published in the journal Nature, CO2 emissions linked to digital technology are set to grow by 775 percent, going from 1.6 percent of total emissions recorded in 2017 to 14 percent. To put this figure in perspective, if digital technology were a nation, it would be the fifth-largest emitter of carbon dioxide globally, with 3.8 percent of total emissions. This is a value four times greater than that of an industrial power like France.
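The +775 percent figure can be verified directly from the two shares the study cites, assuming (as the article's framing suggests) that they are measured against a comparable total:

```python
# Checking the +775% growth figure from the cited shares of global CO2
# emissions attributed to the digital sector.
share_2017 = 1.6   # digital's share of total CO2 emissions in 2017 (%)
share_2040 = 14.0  # projected share by 2040 (%)

growth = (share_2040 / share_2017 - 1) * 100
print(f"Growth: +{growth:.0f}%")
# 14 / 1.6 = 8.75x the 2017 share, i.e. an increase of 775 percent
```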