- cross-posted to:
- [email protected]
- [email protected]
One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
Consider that an ultra-light laptop or a Raspberry Pi consumes far less power than a full-on gaming rig. The same holds between a data server used for e-commerce and a server running AI: the AI server has higher power requirements (the margin may not be as wide as in my first comparison, but it exists). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.
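The multiplication argument above can be sketched as a back-of-envelope calculation. All wattage and fleet-size figures here are illustrative assumptions, not measurements:

```python
# Rough sketch of the scaling argument: per-server power draw times
# fleet size. Numbers below are assumed for illustration only.

COMMERCE_SERVER_W = 500   # assumed draw of a typical e-commerce server
AI_SERVER_W = 5_000       # assumed draw of a GPU-dense AI server
FLEET_SIZE = 500          # "add hundreds more" of them

commerce_fleet_kw = COMMERCE_SERVER_W * FLEET_SIZE / 1_000
ai_fleet_kw = AI_SERVER_W * FLEET_SIZE / 1_000

print(f"e-commerce fleet: {commerce_fleet_kw:.0f} kW")
print(f"AI fleet:         {ai_fleet_kw:.0f} kW")
```

Even with a modest per-server gap, the difference compounds quickly once you scale to hundreds of machines.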
And if external costs are priced into electricity, that will be reflected in the cost of operating these machines.
Also, there are far more conventional data servers than servers running AI, which increases the total effect conventional servers have.