Calculating water/energy usage for "AI" per token is a bit problematic: a data center has a massive base load just by existing, even if nobody is using it. And since we have no actual data for any of the popular platforms, all the numbers floating around are questionable and not very useful.
For example, how much power does one of those servers with NVIDIA cards really save when its utilization drops to only 50%? And are the overhead costs actually counted?
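
Just to illustrate how much the assumptions matter, here's a rough back-of-the-envelope sketch in Python. Every number in it is made up (idle power, peak power, PUE, throughput), not a measurement from any real platform; the point is only that the per-token figure swings by a large factor depending on the utilization and overhead you assume.

```python
# Back-of-the-envelope sketch: all numbers are illustrative assumptions,
# not measured values for any real platform.

def joules_per_token(idle_watts, peak_watts, utilization, pue,
                     tokens_per_second_at_peak):
    """Amortize base load and facility overhead into a per-token estimate.

    idle_watts:  power the server draws even when serving nothing
    peak_watts:  power at full load
    utilization: fraction of peak throughput actually served (0..1)
    pue:         data-center overhead factor (cooling, power conversion, ...)
    tokens_per_second_at_peak: throughput the hardware could sustain at 100%
    """
    # Assume power scales (very roughly) linearly between idle and peak.
    server_watts = idle_watts + (peak_watts - idle_watts) * utilization
    facility_watts = server_watts * pue            # add facility overhead
    tokens_per_second = tokens_per_second_at_peak * utilization
    return facility_watts / tokens_per_second      # joules per token


if __name__ == "__main__":
    # Same hardware, same requests: only the assumed utilization changes.
    for util in (1.0, 0.5, 0.1):
        j = joules_per_token(idle_watts=400, peak_watts=1000,
                             utilization=util, pue=1.3,
                             tokens_per_second_at_peak=500)
        print(f"utilization {util:4.0%}: {j:6.2f} J/token")
```

With these made-up numbers, halving utilization only drops the server from 1000 W to 700 W because of the base load, so the energy per token goes up by roughly 40%, and at 10% utilization it's several times higher. Without real utilization and overhead figures from the operators, any per-token number is mostly a guess about these inputs.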



