Is there any reason the water can’t be safely consumed later? It’s not toxic or nuclear is it? The cooling water didn’t just up and disappear did it?
Edit: Links provided in the comments…
- https://arxiv.org/pdf/2304.03271
- https://www.youtube.com/watch?v=H_c6MWk7PQc
- https://www.youtube.com/watch?v=H_c6MWk7PQc&t=1264
- https://pivot-to-ai.com/2026/03/06/how-much-water-do-the-data-centres-use-its-a-secret/
- https://ui.adsabs.harvard.edu/abs/2025EcInd.17012986J/abstract
- https://en.wikipedia.org/wiki/List_of_tallest_cooling_towers
- https://en.wikipedia.org/wiki/Niederaussem_Power_Station
Notable comments:
- https://lemmy.world/comment/23672269
- https://sh.itjust.works/comment/25288634
- https://lemmy.cafe/comment/16350045
- https://sh.itjust.works/comment/25294655
Edit addendum: I’d like to thank everyone that’s participated in this question thread, sorry if I missed any good relevant links in the comments.
To be clear, I still loathe the whole AI datacenter era, it really is heavily wasteful of resources, notably energy, but I wanted to better understand the water usage situation.
It evaporates, that’s how it cools. The water is sprayed over a heat exchanger and gets turned to essentially steam and then new water is pumped in and thus the water is “gone”. It will fall as rain somewhere but likely not near where it was taken from.
A closed loop system could be used but they are more expensive and require more maintenance so large data centers don’t usually use them unless required to.
I am still learning. Thank you for your educational comment.
I loathe AI anyways, I just wanna better understand why I loathe AI…
I loathe AI anyways, I just wanna better understand why I loathe AI…
What a perfect encapsulation of AI rage hate lmao
Further to this, as well as the source of the water often being the local city’s drinking water supply (as we’ve found, this puts a strain on that supply), evaporative cooling systems concentrate the minerals / contaminants in the water, meaning a smaller volume (relative to what is evaporated) of now highly-concentrated runoff water also has to be constantly disposed of. This likely is also going into the city’s wastewater systems.
Radiators for closed-loop systems do also occupy more space (for the same cooling capacity) versus evaporative cooling towers, and are more limited in the range of climates they can be deployed in.
On balance though, the closed-loop cooling should always be the first choice; if it works for the deployment it will never be the wrong choice on a long-term / total cost of ownership basis.
This was the most comprehensive list I’ve seen about the negative aspects of LLM/datacentres use. Very informative.
Every car in the world uses a closed loop cooling system that does not consume water.
Car cooling systems are stupidly expensive, run at temps that would damage computer CPUs, run outside, and have one really nice advantage over computers: at higher heat loads the car also tends to be going faster, which pushes more air through the radiator and cools it off faster.
Now imagine you redlined a dozen cars for days on end in a garage in the middle of the summer. Do you think you might damage some components?
It is still very possible to use closed-loop cooling on data centers, but any system you build needs to work in summer temps, which can be as high as 35-40C, without letting the computers exceed 60C. An air-cooled system handling that much heat is going to be very expensive and use a ton of power (and power generation also uses water).
While you’re effectively right in your comparison, you also must understand the difference between electronic data center cooling vs vehicle engine cooling.
Vehicle engines run best at a higher temperature range than electronics, so they install a thermostat, to literally bring the engine temperature up to a suitable range for ideal performance. But the thermostat is not necessary (unless you live near cold polar regions and want heat).
The thermostat can be safely removed from vehicles in more comfortable climates and the vehicle will run just fine, but just quite a bit cooler.
So, take the concept of a closed loop cooling system, remove the thermostat from the equation, and you’ve got a more viable closed loop system, better suited to keeping electronics cool.
If you remove the thermostat and redline it in a garage it still won’t be able to keep up because it doesn’t have the airflow that’s required
The concept of closed loop cooling for servers has always existed, and it works for home computers. What is conventionally called closed loop cooling just means that you transfer heat from the computer to a liquid and then from the liquid to air. Transferring 100MW of heat to the air is what makes this difficult, especially in a stationary computer.
You’re almost right, but there do exist air cooled engines with no conventional radiator or water/antifreeze pump…
https://en.wikipedia.org/wiki/Volkswagen_air-cooled_engine
Many motorcycles also use air cooling.
Some aircraft engines, too. The old single-engine Cessnas I trained on were air-cooled. Though that’s pretty easy when you’re pushing cool, atmospheric air over the engine at 100 knots.
This is comparing apples and oranges though. Automotive cooling systems are designed for a very different problem set than datacenter cooling systems. The temperature gradients are much larger in ICE systems, they need to be small, light, and portable, and they cool something that generates much more variable heat loads.
A data center creates a consistent heat load, is stationary, has access to a source of water that is functionally limitless to the operators, works across a much smaller temperature gradient, and needs to do all this in the most economical way possible to be as profitable as it can be for the owners. Evaporative coolers are dead simple, very effective, and scale very easily, which is why they are used.
Ok so what you’re telling me is power plants generate electricity by burning fossil fuels which power a turbine with steam, then the data center uses all that electricity to produce even more steam?
And all that is to… Produce steamy furry porn
If we’re lucky it’s furry porn. I’m more worried about the non-consensual porn of real people, including simulated CSAM.
The funniest thing is that, once trained, the model can run and make furry porn on your local machine. So they don’t even make any money on this.
I haven’t tried furry porn models in particular, but all the local image generation I’ve tried was really bad. That was on a 3070, though, so nothing really meant for this.
Mainline Stable Diffusion is pretty horrible. Try using SDXL community fine-tunes (if they fit into your VRAM). They’re worse than cutting-edge cloud models, but they punch way above their weight.
I have bad news for you; it’s all steam. EVERYTHING is steam. 🌍🧑🚀🔫🧑🚀🌚
Even you and I are just steam in liquid and solid phases.
You mean even my Steam games are actual steam? Neat.
They’re dispensed via valve, no?! Checkmate asteamists.
more expensive
That’s the real reason right there.
We already have heating plants that transfer heat to homes via water; couldn’t they just do that instead of wasting all the drinking water?
I think you’re describing district heating, which works great in places that planned ahead and buried the necessary plumbing so that the waste heat from nearby industrial processes can be beneficially used to heat nearby homes and offices.
The detail, however, is that those industrial processes are diverting the heat to the district plumbing, but if nobody needs heating (eg 40 C summer weather), then they will vent the heat using air cooling to the atmosphere. That is to say, the demand for heating will vary at times, and this is fine because the industrial process can just go back to dumping the heat into the air.
This doesn’t work for AI data centers because the amount of “waste” heat (eg 100+ megawatts) is well in excess of any nearby demand for heating. To quantify demand, I looked to the district heating system of Ulaanbaatar, the capital city of Mongolia, home to 1.67 million people, and the coldest capital city in the world by average annual temperature:
the Ulaanbaatar District Heating Company, encompassing 13,500 buildings with a total connected capacity of 3924 MW
The system serves 60% of the population, so about 1 million people. Where in the mostly-temperate USA could a 4 gigawatt AI data center be located so that it’s right next to 1 million people that need 24/7 heating as though they lived in Mongolia?
Scaling down to a 100 megawatt data center, the demand would be for a population of 25,000 living in essentially arctic conditions. Such places already have district heating, such as in Alaska. So if a smaller AI data center shows up, it just means the existing non-AI heat source would fall back to dumping heat into the air.
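The scaling in the last two paragraphs can be sketched in a few lines of Python. This is a rough sketch: it uses only the Ulaanbaatar figures quoted above and treats the connected capacity as a simple per-person average.

```python
# District heating demand, scaled from the Ulaanbaatar figures quoted above:
# ~3924 MW of connected capacity serving roughly 1 million people.
ULAANBAATAR_CAPACITY_MW = 3924
ULAANBAATAR_PEOPLE = 1_000_000

kw_per_person = ULAANBAATAR_CAPACITY_MW * 1000 / ULAANBAATAR_PEOPLE  # ~3.9 kW each

def people_needed(datacenter_mw: float) -> int:
    """How many people in arctic-style conditions it takes to absorb the waste heat."""
    return round(datacenter_mw * 1000 / kw_per_person)

print(people_needed(100))   # roughly 25,000, matching the estimate above
print(people_needed(4000))  # a 4 GW campus needs about a million such people
```

Even under this generous model (every watt of waste heat finds a radiator), a single hyperscale campus outstrips the heating demand of a whole arctic city.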
In the end, there are very few places that need heating all year round, but AI datacenters would be producing heat all year round. Even if the heat were used for something outlandish, like heating every square meter of public roadway, that still might not be enough demand to quench these behemoth AI datacenters. And that’s before the cost of building out the district heating system.
We should definitely build district heating systems where they make sense, but building them so AI data centers can exist would be doing the right thing for the most terrible of reasons
I think it would be a huge infrastructure project. But yeah, makes way more sense logically. Although knowing these AI dip shits they’d probably charge you a ton even though you’re basically using their “waste”. Almost like “I know what I got” Facebook marketplace post.
Other commenters correctly describe the cost analysis for using evaporative cooling, but I’ll add one more reason why it’s the preferred method when water is available: evaporating water can dissipate truly outlandish amounts of heat with very few moving parts.
Harkening back to high school physics class, water – like all other substances – has a certain thermal capacity, meaning the energy needed to increase the temperature of 1 kg of water by 1 degree C. The specific thermal capacity of water is already quite high, at 4184 J/(kg*C), besting all the common metals and only losing to lithium, hydrogen, and ammonia. In nature, this means that large bodies of water are natural moderators of temperature, because water can absorb an entire day’s worth of sunlight energy but not substantially change the water temperature.
But where water really trounces the competition is its “heat of vaporization”. This is the extra energy needed for liquid water to become vapor; simply bringing water to 100 C is not sufficient to make it airborne. Water has a value of 2146 kJ/kg. Simplifying to where 1 kg of water is 1 liter of water, we can convert this unit into something more familiar: 0.596 kWh/L.
What these two physical properties of water tell us is that if our city water comes out of the pipe at 20 C, then to get it to 100 C to boil, we need the difference (80) times the thermal capacity (4184 J/(kg*C)), which is 334,720 J/kg. Using the same simplification from earlier, that comes out to be 0.093 kWh/L. And then to actually make the boiling liquid become a vapor (so that it’ll float away), we need 0.596 kWh/L on top of that.
Let that sink in for a moment: the energy to turn water into vapor (0.596 kWh/L) is six times higher than the energy (0.093 kWh/L) to raise liquid water from 20 C to 100 C. That’s truly incredible, for a non-toxic, life-compatible substance that we can (but should we?) safely dump into the environment. If you total the two values, one liter of water can dissipate 0.69 kWh of energy per liter. Nice!
In the context of a 100 megawatt data center (which apparently is what the industry considers the smallest “hyperscale data center”), if that facility used only evaporative cooling, the water requirement would be 144,927 L/hour. That is an Olympic-size swimming pool every 6.9 ~~seconds~~ hours. Not nice!
And AI datacenters are only getting larger, with some reaching into the low single digits of gigawatts. But what is the alternative to cooling the more-modest data center from earlier? The reality is that the universe only provides for three forms of heat transfer: conduction, convection, and radiation. The heat from data centers cannot be concentrated into a laser and radiated into space, and we don’t have some sort of underground granite mountain that the data centers can conduct their heat into. Convection is precisely the idea of storing the heat in a substance (eg water, air) and then jettisoning the substance.
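The arithmetic above can be checked end to end with a short script, using the same inputs as the comment (cp = 4184 J/(kg*C), heat of vaporization = 2146 kJ/kg, and the 1 kg ≈ 1 L simplification):

```python
# Energy needed to evaporate city water, per liter, then the flow rate
# required to dissipate a 100 MW heat load by evaporation alone.
CP = 4184           # J/(kg*C), specific heat capacity of water
H_VAP = 2_146_000   # J/kg, heat of vaporization figure used above
J_PER_KWH = 3.6e6   # joules in one kilowatt-hour

heat_to_boil = CP * (100 - 20) / J_PER_KWH   # ~0.093 kWh/L, 20 C -> 100 C
heat_to_vaporize = H_VAP / J_PER_KWH         # ~0.596 kWh/L, liquid -> vapor
total = heat_to_boil + heat_to_vaporize      # ~0.69 kWh/L in all

datacenter_kw = 100_000                      # a 100 MW facility
liters_per_hour = datacenter_kw / total      # ~145,000 L/h; the 144,927 figure
                                             # above comes from rounding to 0.69
print(round(total, 3), round(liters_per_hour))
```

The factor-of-six gap between boiling and vaporizing falls out of the two constants directly, which is the whole reason evaporative cooling is so effective per liter.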
So if we don’t want to use water, then we have to use air. But for the two qualities of water that make it an excellent substance for evaporative cooling, air doesn’t come close – 1003 J/(kg*C) and no heat of vaporization, because air is already gaseous. That means we need to move ungodly amounts of air to dissipate 100 megawatts. But humanity has already invented the means to do this, by a clever structure that naturally encourages air to flow through it.
The only caveat is that the clever structure is a cooling tower, and is characteristic of nuclear power stations. It’s also used for non-nuclear power station cooling, but it’s most famous in the nuclear context, where generators are well into the gigawatt range. Should AI datacenters use nuclear-sized air cooling towers instead of water evaporation? It would work, but even as someone that’s not anti-nuclear, the optics of raising a cooling tower in rural America just to cool a datacenter would be untenable. And that’s probably why no AI datacenter has done that.
To be abundantly clear, I’d rather not have AI datacenters at all. But since the question was why water consumption is such a big deal, it might be best to say that it’s a physics problem: there isn’t any other readily-available way to provide cooling for 100+ megawatts, without building a 100+ meter tower. Water is always going to be cheaper and more on-hand than concrete.
if that facility used only evaporative cooling, the water requirement would be 144,927 L/hour. That is an Olympic-size swimming pool every 6.9 seconds. Not nice!
You mean 6.9 hours? You’re definitely off by a few orders of magnitude there.
Darn, you’re right, the hours fell off in my dimensional analysis. Corrected, although 6.9 hours for a pool isn’t much time for swimming at all.
Ok follow up question here. Is there cause to be concerned that releasing tons and tons of steam into the environment that was not there before will cause other environmental impact beyond just the reduced water supply? Like… If the ambient air is cooling all that water back into rain or something will that tangibly impact temperatures, or will average humidity change? Or is that part at least too small of an impact to be particularly material?
The atmosphere is very very very big and water is very very very normal thing to have there.
Not meaningfully, no. In the middle of a dry desert far from other bodies of water you could theoretically form cumulus clouds downwind of your site (I have heard of this happening), but it would be teeny tiny.
The amount of water evaporation is just orders of magnitude too small. The earth gets about 1kW of energy per square meter, so a 9GW data center is approximately the same amount of waste heat as 9 million square meters, which is 900 hectares.
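That comparison is easy to verify: at the ~1 kW per square meter of peak solar input quoted above, the equivalent land area is just a unit conversion.

```python
# Land area receiving the same power from the sun (~1 kW/m^2 at peak)
# as a data center emits in waste heat.
SOLAR_KW_PER_M2 = 1.0

def equivalent_hectares(datacenter_gw: float) -> float:
    m2 = datacenter_gw * 1e6 / SOLAR_KW_PER_M2  # GW -> kW, then kW / (kW/m^2)
    return m2 / 10_000                          # 1 hectare = 10,000 m^2

print(equivalent_hectares(9))  # 900.0 hectares, as stated above
```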
There is almost certainly an impact somewhere, but I don’t have the data to know where it is. My conjecture is that a localized mass of steam would cause convection currents and drive microweather phenomena, especially downwind of such an air cooled facility. I’m not sure rain is necessarily the result, unless there’s a sizable mountain downwind, since although hot air will rise, it might run out of steam (pun intended) before cooling down enough to fully condense out. So it might just be adding a layer of humidity that floats a few hundred meters above the surface.
But even that could be devastating, if said layer blocks natural convection currents over a downwind town or city. It could act as a thermal cap, making that town warmer at night, because heat rising from the city would meet that humid layer and get absorbed by the water. The thermal capacity of water comes into play again, but this time against the city.
Heat energy is a driver for cyclones, such as when the warm, moist water of the Caribbean accelerates air as it approaches the southern USA, and only once landborne does it start to slow down due to drag and losing its energy source. I doubt we’ll ever have an AI-induced hurricane, but in a situation where there’s already an energetic weather event, it cannot possibly help to be adding heat to that situation.
I defer to the meteorologists to say what happens to the local weather and climate, and biologists on what happens to humans and wildlife. But I can’t see it being good, no.
I doubt that it has meaningful impact on climate. Evaporation from plants and oceans is many orders of magnitude greater. The issue is pretty always about fresh water availability in the given region.
Followup: what are the impediments to using, say, seawater instead?
Salt water is a huge pain to work with. The salt would quickly corrode any cooling systems.
And even for fresh water, you have biofouling to worry about and what to do with the water after you’ve used it, can’t just dump it into the environment untreated.
There are already heat-exchanging systems that do this with brackish water; you don’t need to treat water if all you are doing to the water is making it hotter or colder.
While not strictly biofouling, the marine environment can definitely be affected by introducing hotter water where it didn’t exist prior, in and around the outflow pipe. Seaside nuclear power stations that use seawater cooling need to be mindful to diffuse the heated water over a large area, to minimize the ecological impact. Citation: https://ui.adsabs.harvard.edu/abs/2025EcInd.17012986J/abstract
I agree that pumping in water at a different temperature can affect the environment. It is just that a lot of people tend to conflate the effluent coming from plants like this as something which needs chemical or other treatment when the issue is thermal only.
People mentioned corrosion, which is true of all seawater systems, but in evaporative systems you also have salt forming on all the evaporative surfaces, which can increase corrosion drastically beyond normal seawater exposure and cause fouling.
So to do this properly you would want an RO (reverse osmosis) system making freshwater before the cooler, at which point it would make more sense to just have a separate company doing desalination.
Very similar problems arise with desalination plants, which I wrote about here: https://sh.itjust.works/comment/14613302
So is air cooling actually feasible but we don’t do it cause it would make data centers look like nuclear reactors? Or is it just not feasible?
Air cooling is feasible, as evidenced by existing power stations that use it. A lot of newer nuclear generation uses water cooling, being sited along the ocean and in the multi-gigawatt range. But we can also find examples of inland power stations that have no water connection and therefore need some massive cooling towers. Here is one in Germany with a 2.2 GW rating and a 200 meter tall tower: https://en.wikipedia.org/wiki/Niederaussem_Power_Station
This is, as you can imagine, rather expensive to build, but it’s doable. Cooling a coal fire is not substantially different than cooling compute loads in a data center, as it’s all just a matter of moving heat around. Will there be differences due to the base temperature of coal versus GPUs? Yes, since the ratio of input to ambient temperature matters. But on the flip side, this should make it easier to construct, as the plumbing for lower temperatures is simpler.
Mechanical engineers can chime in on feasibility for AI data centers, but seeing as it hasn’t been done, it’s probably still cost related.
AFAIK it’s feasible for most data centers except those where power density is so huge that you just can’t do it with air cooling. That issue is most common for large-scale AI data centers.
A modern CPU consumes ~150W; a modern AI chip can eat 700W, and they’re packed as densely as possible, with multiple cards slotted into every motherboard.
The rate that water returns to aquifers it was drawn from is very slow. Rainfall from the evaporation is only the first step of a long process. So it’s not contamination, just being used up faster than is reasonable.
I can’t touch on all of them, but a lot of them do actually just make it disappear.
A lot of the large data centers use evaporative cooling. The water basically boils off as vapor that they just pump into the sky. In many places this is cheaper than the electricity needed for condenser cooling or other methods, as it requires less power (and at the scale of these data centers, they literally can’t get enough electricity as it is). That water vapor can drift off as clouds and come down somewhere, but there’s no guarantee where or when.
Some data centers also introduce more runoff of pollutants from their methane generators and such, which can make the water unusable. If they do capture the vapor and reintroduce it into the water table, it isn’t always cooled down first, and the heat can cause major problems in the environment by raising temperatures. This can sometimes leave toxic algae as the only thing surviving around the data centers.
There are so many more ways they can be problematic. That’s just scratching the surface
Can this steam be used to turn turbines to make power? Or is it not hot enough to generate the required pressure?
Surely it could at least be fed into a power station that now only needs half the fuel to get it up to temperature?
It’s definitely not able to run a turbine as-is, but either way it doesn’t really solve the problem. My understanding is steam turbines don’t actually condense or cool the water all that much. You still have hot water; maybe not fully boiling, but still hot enough that you’ll have a not-insignificant amount of evaporation and environmental damage if you just dump it. There are condensing and non-condensing designs, but the condensing design requires massive cooling towers and more water draw from a heat exchanger.
I’m not a systems engineer so calculating potential cost savings of adding the remaining heat to capture power vs just letting it evaporate vs using a closed loop system is outside my wheelhouse.
It wouldn’t solve the evaporation problem, but you could get some extra electricity from the steam. Makes the creation of said steam slightly less wasteful
It doesn’t just disappear. It falls back to the ground.
The problem is that the rate it leaves the local system is faster than the rate it returns, so in the long term the local systems lose water. If you want to look at it even deeper, it’s actually so much worse: it also causes local droughts, which kill flora, which causes less water to be stored in the local system, which kills more flora, and so on, until you get desertification. And by pumping down aquifers, those aquifers can collapse and never hold the same water they used to.
So yes it does fall back to the ground but that’s an explanation for a 10 year old learning about the water cycle. An adult should be aware of how much more damaging it is
Clouds can move at up to 100 miles an hour if they’re high altitude, and average closer to 25 miles an hour for regular cumulus clouds. You pump that vapor into the air and it’s off to the races. If it forms a cumulonimbus cloud, those last, what, 5 days? (You can fact-check me here; I’m going off my memory of facts from a weather-obsessed kid I knew.)
5 days x 24 hours in a day x 25 miles per hour = 3000 miles.
So vapor from a datacenter in California might potentially come down in North Carolina or something. For the sake of those Californians, that’s gone.
Sure maybe not all hits high enough in the atmosphere, and some might travel in a circle and fall back around the datacenter, but not all. I guarantee not all.
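The back-of-envelope drift estimate above, spelled out (using the 5-day lifetime and the ~25 mph average speed quoted for cumulus clouds, both of which are the commenter’s recollections rather than hard figures):

```python
# Rough upper bound on how far vapor can drift before falling as rain.
DAYS = 5            # claimed lifetime of a cumulonimbus cloud
HOURS_PER_DAY = 24
AVG_SPEED_MPH = 25  # average cumulus cloud speed quoted above

distance_miles = DAYS * HOURS_PER_DAY * AVG_SPEED_MPH
print(distance_miles)  # 3000 miles: enough to cross the continental US
```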
I mean sure, but that’s an argument about where you locate data centres, not necessarily for stopping them entirely. I.e. evaporating that water is a problem in a region that’s already overpopulated and doesn’t have enough water.
Not really, because you also start depleting the readily available potable water and increasing toxin concentration in the remaining sources. See other comments for just how much these things suck up; it’s a truly staggering amount of water. There are very few areas that can handle that amount of draw with no issue for extended periods of time. For some reason (read: money), the politicians in charge of approving these things are turning a blind eye to those problems.
Low population areas with high amounts of water are usually nature preserves and things we don’t want these data centers anywhere near. If you do chance into finding a low population center with high water that isn’t a needed nature preserve, odds are you won’t have all the infrastructure you need to build the thing and you’ll run into other issues that might have an equally large but different impact. You can’t just plop these anywhere and run a power line and call it good.
That’s why in other sane countries outside the US you see a large number of proposed data centers get blocked during the environmental impact assessment stages.
Where, though? And does it require retreatment to make it safe for consumption?
The water is used to absorb heat and reject it outside. It will not be contaminated. It evaporates into the air, which depletes the local water supply. Could it come back? Sure. Can you guarantee it will? No.
Agriculture by and large still uses the most water, but in the year of our lord 2026 there’s no reason not to be building closed-loop data centers. Evaporative cooling is mostly done in places where water is cheaper than power. It’s still grossly irresponsible, but that doesn’t matter if there aren’t laws on the books.
There will undoubtedly be some level of contamination.
No more than any normal level of evaporated water in the area. It’s literally just water evaporating from heat.
Agriculture’s use has been eliminated. The farmers and ranchers who voted for this regime will be out of business by the time the data centers are up and running because ICE kidnapped all their laborers. The only jobs left will be guarding the centers from Antifa.
This is the sort of critical thinking that gets you in trouble with the hive mind. Yeah, it’s water. About 2-5% of the water evaporates; the rest is just some slightly warmer water.
Alright, so what do we do with that “slightly” (in fact quite a bit) warmer water?
Can’t just discharge it into a river. That hot water is gonna cause all kinds of havoc on the environment. Even if the temperature doesn’t outright kill things, warm water holds less oxygen so that’s going to harm fish, it’s probably gonna fuck up their spawning cycles because suddenly they have warm water in the middle of winter, it might cause algae blooms, etc.
So we have to cool that water down. How are we gonna do that? We can spend even more money and energy to refrigerate it I suppose, but of course that would be stupid since these data centers are already using ridiculous amounts of energy.
So most likely we’d just put it in some giant holding tanks and wait for it to cool off or maybe run it through a massive radiator to cool off. That’s even more land being taken up by these monstrosities, more maintenance needed, and at the end of the day, that’s still water sitting around somewhere besides in our aquifers and waterways where it’s needed, and we’re probably going to be losing even more to evaporation in the process.
And while it’s being pumped around in those data centers, I’ll bet you it’s being run through all kinds of plastic pipes, maybe coming into contact with lead solder and the like, because these aren’t potable water systems. Sounds like a great way to introduce more heavy metals and microplastics into the environment to me.
And that 2% or so that’s being lost to evaporation? Some of these large data centers are using well in excess of a million gallons a day, so that’s 20,000 gallons a day lost to evaporation, so roughly every month you’re losing an entire Olympic sized swimming pool to evaporation. Again, that’s water that’s supposed to be in rivers and aquifers that’s now not.
And what doesn’t evaporate? Well now any minerals, heavy metals, etc. that were in the water are now concentrated by that much. Hope your water treatment is prepared to handle that.
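The “Olympic pool a month” claim above checks out against this commenter’s own numbers (~1 million gallons/day of throughput with ~2% lost to evaporation; the pool volume is the standard minimum for an Olympic pool):

```python
# Monthly evaporation loss for a large data center, per the figures above.
GALLONS_PER_DAY = 1_000_000   # throughput claimed above
EVAP_FRACTION = 0.02          # ~2% lost to evaporation
LITERS_PER_GALLON = 3.785
OLYMPIC_POOL_L = 2_500_000    # 50 m x 25 m x 2 m minimum-depth pool

monthly_loss_l = GALLONS_PER_DAY * EVAP_FRACTION * 30 * LITERS_PER_GALLON
pools = monthly_loss_l / OLYMPIC_POOL_L
print(round(monthly_loss_l), round(pools, 2))  # ~2.27 million litres, ~0.91 pools
```

So “roughly every month” is fair: just over nine-tenths of an Olympic pool gone, every thirty days, from a single facility.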
The other question is: why do they have to use potable water, as opposed to, for example, filtered river water?
Google already uses so called “grey water” to cool their data centers.
Maybe some do, but the Google datacenters where I live are using potable water.
I spoke with our CEO, Mr. Dip Shit, and he said we can’t afford filters…
Because you cannot dump infinite amounts of heated water into a river, it will kill the species living downstream if the river gets warmer than maybe 25 degrees.
Could they not evaporate river water, instead of using drinking water?
Edit: the volumes involved (from the other post) seem to indicate only the largest rivers could support the operation. Doesn’t seem very viable at all. So let’s just use up our valuable supplies of drinking water instead! /s
This is most likely more expensive, because you need two water cycles: one with a clean cooling liquid that carries the heat from the CPUs to a heat transfer/cooling block, and another that passes river water over the cooling block. You cannot just use the river water directly because of all the sediment and other elements in untreated water that may clog or corrode your pipes.
It evaporates and they do nothing or very little to reclaim the vapor. So yeah, it just “disappears.”
The water gets used over and over and over in the data center. It’s in a loop. The reporting that data centers consume vast quantities of water completely misunderstands the core concept of a water loop.
That said, most data centers use the water for evaporative cooling. In that case, it comes back down as rain. But again, even in that case the reporting is still very overblown.
That is not the conclusion of the video you linked
From the video (timestamped):
Even under the maximalist goals of AI companies, the projected increase of water use is small compared to what cities and industries already use.
He even mentions how US corn uses 80x more water than worldwide AI use; with 40% of that corn burned as ethanol. And that power usage is the much larger concern.
It’s amazing that Hank can come to this conclusion since basically every genAI company is hiding their resource usage. (Well, actually not that amazing, as Hank has gone completely over to the side of (and gets sponsored by) these companies and is strongly biased.)
Have a look at eg https://pivot-to-ai.com/2026/03/06/how-much-water-do-the-data-centres-use-its-a-secret/

That link shows Altman saying current datacenters use closed loop systems and asking vague rhetorical questions. That’s not a source.
Corn ethanol being stupid doesn’t make AI less stupid.
There’s plenty of stupidity out there. That logic would make everything useless.
By that logic, no one outside the US would care about voting, since they already reached peak stupidity, so voting for a better president in Argentina will not get rid of trump.
It is useful if you’re trying to figure out what to focus on. In this case, the concern is wasteful water usage. If you point to a larger area of wasteful water consumption, it would make more sense to target that first.
Except we are already focusing on AI data centers. By bringing up corn you’re just unfocusing. Which is the opposite of what you want.
Yes, both are bad, and we should get rid of both. But both things can be done at the same time. You don’t need to steal the focus from another issue to try to redirect it.
Instead of “why are we caring about data centers? Corn is worse!”. Try “while we are fighting data centers, we should also look at corn, they’re bad too”.
I made a point to update my post, not only with your link, but also with your timestamped link.
This is why I’m here, to ask questions and seek answers…
I appreciate it. <3
Lots of other people are making solid points as well. Glad to see people engaging.
So, similar to a vehicle radiator, just larger scale?
Well, if that’s the case, yeah antifreeze isn’t good for anyone, but still a proper closed loop cooling system isn’t exactly wasting water is it?
still a proper closed loop cooling system isn’t exactly wasting water is it?
If you take good water from underground, it evaporates, and you’re in a drought-prone area, your area effectively just lost the water. Even if you’re not in a drought-prone area, you’re never going to have easy access to that clean, underground water again.
https://arxiv.org/pdf/2304.03271
I found this, and it has a cool overview of water towers and such.
This is exactly why I asked, hoping for more information like this.
👍
Welcome to Lemmy. “AI bad” downvoting will always happen at anything that symbolizes sympathy, even when you’re just providing an objective factual take. Thanks for your informative post.