Billionaire Elon Musk said this month that while the development of AI had been “chip constrained” last year, the latest bottleneck to the cutting-edge technology was “electricity supply.” Those comments followed a warning by Amazon chief Andy Jassy this year that there was “not enough energy right now” to run new generative AI services.
“One of the limitations of deploying [chips] in the new AI economy is going to be ... where do we build the data centers and how do we get the power,” said Daniel Golding, chief technology officer at Appleby Strategy Group and a former data center executive at Google. “At some point the reality of the [electricity] grid is going to get in the way of AI.”
Such growth would require huge amounts of electricity, even if systems become more efficient. According to the International Energy Agency, the electricity consumed by data centers globally will more than double by 2026 to more than 1,000 terawatt hours, an amount roughly equivalent to what Japan consumes annually.