The rise of AI has brought a host of problems that, while unexpected, we have seen before with the rise of new technology. The most pressing of these is the tremendous amount of energy drawn from the grid to power the computers that run it, and the impact that demand is having on the population at large.
Power outages are fairly common in the US, especially in hot climates during the summer months, when air conditioning runs without interruption and puts enormous pressure on the grid. A good example is the rolling blackouts the city of Los Angeles is known for implementing during heatwaves. But this is not the first time a surge in tech has caused energy use to spike without backup capacity in place, and the last time the situation did not end up being as dire as many thought.
Back in the early 2000s, data centers were becoming commonplace and nearly doubled power consumption almost overnight, sparking panic that they would soak up all available electricity, a valid concern given the state of the grid at the time. Yet between 2010 and 2018, computing power exploded by 500% while energy use went up only about 6%, because the grid was better prepared and the technology had become far more efficient: hardware drew less power overall, and energy use became smarter.
But now panic is rising again with the growth of AI, likely because we are focusing more on making it more powerful than on making it more efficient. Efficiency matters because it does not just save energy now; it sets up future systems to keep saving energy too. Since that principle does not seem to be guiding the new technology, many are scared of the consequences, especially as summer approaches.
The impact of AI on energy consumption
AI is growing very fast. McKinsey found that last year 65% of companies had already integrated generative AI into at least one part of their business, and this year the figure is up to 71%. The problem is that much of that early adoption has looked more like grabbing whatever was lying around to get started than building things smartly from the ground up.
In practice, that means a lot of AI systems running on dirty generators and massive models burning far more energy than needed for fairly simple tasks, which is not sustainable in the long term for the companies or for the grid.
Luckily, many companies have noticed the strain on their finances and their operations and have begun shifting toward more energy-efficient tools that actually match the workload. It is a push to make AI greener.
The chips at the heart of these computers are being redesigned to process more information, faster, while consuming less energy. One of the latest advances came from MIT, which recently showed off a photonic processor that could speed things up while using less power; another came from IBM, which is working on a brain-inspired prototype that is 25 times more energy efficient.
Connections are also being improved, since a lot of energy gets wasted on slow transmissions, so copper is being replaced by light-speed fiber optics almost as fast as it can be manufactured. While all of this might seem trivial and just a band-aid (it very well might be), AI is here to stay. Anything that can be done to improve how it works and the efficiency of the technology will help us step into the future a little more prepared than we were yesterday.
