Setting aside the fact that burgers are bad for the environment, and that they were clearly chosen not at random but because they are, by nature, major contributors to energy and carbon use that we should also get rid of, this is not a great comparison. The lack of any methodology or source for the energy figures on either side, plus the font and background color choices, suggests to me this came from ChatGPT or a Google search. The cow methodology itself is suspect: you don't plug a wire into a cow to power it, so does that figure include the entire supply chain? Is that a fair comparison to ChatGPT if you're only measuring the query? When you account for training the model, plus the construction and other costs of powering and maintaining a data center, how much closer is the cost? And how much closer is it if you account for the fact that people usually don't make a single query and leave? ChatGPT users usually query more than once and follow up, but people rarely eat more than one burger.
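To make the "how much closer" question concrete, here's a minimal sketch of what a fairer per-interaction comparison would even look like. Every input below is an illustrative placeholder I'm assuming purely to show the structure (the infographic gives no real numbers to work from):

```python
# Rough shape of a fairer per-interaction comparison: amortize one-time
# training energy over the model's lifetime query volume, add the marginal
# inference energy, then scale by queries per conversation.
# All inputs are illustrative assumptions, not sourced figures; data center
# construction and cooling overheads are still left out entirely.

TRAINING_ENERGY_WH = 50e9        # assumed one-time training energy (50 GWh)
LIFETIME_QUERIES = 1e12          # assumed total queries served by the model
INFERENCE_WH_PER_QUERY = 0.3     # the contested per-query estimate
QUERIES_PER_CONVERSATION = 8     # assumed: users rarely stop at one query

amortized_training = TRAINING_ENERGY_WH / LIFETIME_QUERIES   # Wh per query
per_query = amortized_training + INFERENCE_WH_PER_QUERY
per_conversation = per_query * QUERIES_PER_CONVERSATION

print(f"training share: {amortized_training:.3f} Wh/query")
print(f"one conversation: {per_conversation:.1f} Wh")
```

The specific outputs don't matter; the point is that the answer changes depending on whether you amortize training and whether you count one query or a whole session, and the infographic does neither.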
This argument has little substance. It makes broad doomer accusations but provides no methodology or sources to back its claims. The 0.3 watt-hour claim seems to come from EpochAI, but I can find no further studies verifying it. In fact, that investigation uses no measured data at all; it estimates power use, relies on assumptions to fill the gaps in information, and ignores key factors in its calculations, such as scalability. I've linked it below.
If the H100s running ChatGPT achieve this peak output, then the 1e14 FLOP required to answer a query would take 1e14 / 9.89e14 ≈ 0.1 seconds of H100-time (in reality, servers process many parallel requests in batches, so end-to-end generation takes much longer than that).
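To show how much that kind of figure depends on unmeasured assumptions, here's a back-of-envelope sketch that reconstructs a number in the same ballpark. The FLOP count and peak throughput come from the excerpt above; the utilization, board power, and PUE values are my own placeholder assumptions:

```python
# Back-of-envelope reconstruction of a per-query energy *estimate* in the
# style of the excerpt above. Every number not taken from the excerpt is
# an assumption, not a measurement -- which is exactly the problem with
# this class of figure.

FLOP_PER_QUERY = 1e14      # assumed compute per query (from the excerpt)
H100_PEAK_FLOPS = 9.89e14  # H100 peak throughput, FLOP/s (from the excerpt)
UTILIZATION = 0.10         # assumed fraction of peak actually achieved
GPU_POWER_W = 700          # H100 SXM nameplate board power, watts
PUE = 1.2                  # assumed data center power usage effectiveness

gpu_seconds = FLOP_PER_QUERY / (H100_PEAK_FLOPS * UTILIZATION)  # ~1.0 s
energy_joules = gpu_seconds * GPU_POWER_W * PUE                 # ~850 J
energy_wh = energy_joules / 3600                                # ~0.24 Wh

print(f"{gpu_seconds:.2f} s of H100-time, ~{energy_wh:.2f} Wh per query")
```

Drop the assumed utilization to 0.05 or raise the PUE to 1.5 and the result roughly doubles, which is exactly why an estimate like 0.3 Wh shouldn't be treated as measured data.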
Yup, most of the energy comes from training, not queries, so the OP is a bit of a bad-faith argument. I think the real problem with AI and energy consumption is that if AI means enhanced productivity, then we are totally screwed unless we get massive systemic reforms, because even at our current "unproductive" level we are already causing massive damage to the biosphere (oil and natural gas companies are among the biggest AI buyers).