cross-posted from: https://lemmy.dbzer0.com/post/32023985
Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 x 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
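The claim can be sanity-checked with quick arithmetic; the battery capacity below is an assumption (an iPhone Pro Max battery holds roughly 16-18 Wh depending on model), not a figure from the post.

```python
# Quick arithmetic check of the claim: 140 Wh per email,
# said to equal 7 full iPhone Pro Max charges.
email_energy_wh = 140   # figure from the post
charges_claimed = 7     # figure from the post

wh_per_charge = email_energy_wh / charges_claimed
print(f"Implied energy per full charge: {wh_per_charge:.0f} Wh")
# ~20 Wh per charge; with a ~16-18 Wh battery (assumed) plus charger
# losses, the two numbers in the claim are at least internally consistent.
```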
It’s not in a closed loop?
No, the water is evaporated as part of the cooling process.
What is this system called? I’d like to read up on it. Maybe there could be regulations requiring those kinds of systems to use grey or recycled water. Maybe some data centers already do.
Another article said they use drinking water because the water comes close to delicate components, and grey water can cause issues.
Well that’s a stupid design.
That’s physics. For a closed-loop system, the cooling has to be done with air, which is less efficient, and that doesn’t work so well in, say, Texas in summer.
Making it a stupid design, yes.
Edit: putting your massive heat generating data center (beyond what most DCs will do) in support of AI in Texas is stupid.
Closed-loop systems absolutely have other design options, which I’ve mentioned in another comment chain.
As terrible as they are as companies, Meta, Apple, and others have made much more appropriate decisions, like locating their big-load DCs in cold climates and partnering with the locale to make use of the heat being generated, removing the need for power to perform those tasks. That makes them not only efficient designs but, compared to putting a DC in Texas like a dipshit (or LA, or NV, or anywhere else with a hot climate), better for the environment overall.
Yes, it’s a stupid design.
It’s not, if your only concern is lowering cost.
That’s how most data center cooling works, afaik.
I am so not an engineer, so maybe this is super stupid, but would there be some way to make it a closed loop by capturing the evaporated water and then letting it travel enough of a distance that it cools off and ~~liquefies~~ condenses and ends up in a holding pool?

Edit: I told you I was stupid.
It’s actually how a liquid-cooled PC works. The warm liquid goes to a big radiator where fans blow air on the radiator to cool things. But lots of radiators become expensive and take space. You’re talking about a few hundred megawatts of heat.
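To put a rough number on why the heat load matters: evaporative cooling removes about 2.26 MJ per kg of water vaporized (the latent heat of vaporization), so the water consumed scales directly with the heat load. The 100 MW figure below is a hypothetical example, not a number from the thread.

```python
# Estimate the water an evaporative cooler must vaporize to reject
# a given heat load, using the latent heat of vaporization of water.
LATENT_HEAT_J_PER_KG = 2.26e6  # ~2.26 MJ/kg near boiling; a bit higher when cooler

def evaporation_rate_kg_per_s(heat_load_w: float) -> float:
    """kg of water that must evaporate each second to carry away heat_load_w."""
    return heat_load_w / LATENT_HEAT_J_PER_KG

rate = evaporation_rate_kg_per_s(100e6)  # hypothetical 100 MW data center
print(f"{rate:.0f} kg/s, roughly {rate * 3600 / 1000:.0f} m^3 of water per hour")
```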
The space would be a problem for sure, but couldn’t you just use natural condensation to do it with a long enough pipe?
Yes, but you have to account for pressure differences. Steam condensing to water shrinks and causes big pressure changes. It’s a lot easier to either vent it or use liquid everywhere.
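A back-of-the-envelope ideal-gas estimate shows how big that shrinkage is: at atmospheric pressure, steam occupies on the order of 1700 times the volume of the same mass of liquid water, which is why condensation inside a sealed pipe causes large pressure swings. This sketch is mine, not from the thread.

```python
# Ideal-gas estimate of the steam-to-liquid volume ratio at 100 degC, 1 atm.
R = 8.314                 # gas constant, J/(mol*K)
T = 373.15                # 100 degC in kelvin
P = 101_325               # 1 atm in pascals
MOLAR_MASS_KG = 0.018     # kg per mol of H2O
LIQUID_DENSITY = 1000.0   # kg/m^3, approximate for liquid water

steam_vol_per_mol = R * T / P                        # m^3 of steam per mol
liquid_vol_per_mol = MOLAR_MASS_KG / LIQUID_DENSITY  # m^3 of liquid per mol
ratio = steam_vol_per_mol / liquid_vol_per_mol
print(f"Steam occupies ~{ratio:.0f}x the volume of liquid water")
```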
That’s one option, part of the strategies for reusing liquid-cooling heat.
To mention, it’s more energy efficient than air cooling, so there is a benefit. Smart companies, though, will also look at reuse strategies, like using it for building heat. Larger companies will partner with the town/city to distribute the heat into town-wide systems, e.g. for power generation, district heating, warming greenhouses, or even drying out wood pellets for pellet stove systems.
Going long is effectively the same as using radiators, though; you’ll just need more pipe to do it without one.
Probably expensive, plus you’re going to lose mass no matter what, because physics doesn’t give a shit.
You’re describing rain.
Unless you are talking about the entire planet, I’m fairly certain rain is not part of a closed loop cooling system for a server farm.
It improves the efficiency of the data centre because you need less expensive cooling. You need to get rid of the heat somehow.
There tend to be two loops: a closed one to the servers, and an evaporative cooler on the cold side.