
Pipe flow to datalakes: How AI can streamline its water and energy usage

The energy usage of datacentres, particularly for AI applications, has been covered extensively – and for good reason. AI consumes more power and runs hotter than standard computing loads. In 2022, the IEA reported that the total power used by datacentres, including for AI and cryptocurrency, was around 460TWh.  

Estimates suggest this power usage could grow to around 945TWh by 2030; to put that in context, electric vehicles are predicted to consume around 780TWh by 2030. Looking at AI specifically, Schneider Electric has estimated that AI’s share of this power consumption is currently around 8% and may grow to 15-20% by 2028.

These estimates may yet prove too high. Koomey’s Law tells us that computing becomes more efficient over time – specifically, that the number of calculations per unit of energy increases. For example, between 2010 and 2018, the amount of computing done in datacentres increased by over 500%, while the energy they used increased by only around 6%.
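
To see what those figures imply, here is a rough back-of-the-envelope calculation (a sketch using the approximate numbers above, so the result is indicative rather than measured):

```python
# Back-of-the-envelope check using the figures quoted above (indicative only).
compute_growth = 6.0    # computing output grew by over 500%, i.e. roughly 6x
energy_growth = 1.06    # energy use grew by only ~6%, i.e. roughly 1.06x

# Energy needed per unit of compute at the end of the period, relative to the start.
energy_per_unit_compute = energy_growth / compute_growth
print(f"Energy per unit of compute: {energy_per_unit_compute:.2f}x the 2010 level")
# -> about 0.18x, i.e. roughly an 80% reduction in energy per unit of work
```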

However, although the amount of energy used by AI is considerable, it can also return the favour.  

AI: Water and Chips with that?  

AI’s contribution to human endeavour is already significant. Perhaps the most high-profile example is AlphaFold, which helps us predict protein structures, improving drug discovery and our understanding of diseases.

But we’ve seen many other applications, including improving chili yields in India, reducing conflict between humans and snow leopards, or supporting better risk modelling for insurance companies.  

AI lives in the cloud, so the most logical place to use AI to reduce water usage is the datacentre. Datacentres have historically been cooled with air conditioning, but with AI’s workloads, cloud companies are rapidly realising that air alone is insufficient and that the future will revolve around liquid cooling.

The reason for this is simple: the thermal conductivity of water is about 23 times that of air, and its volumetric heat capacity is over 3,000 times greater – so for a given flow rate, water carries away vastly more heat than air in an industrial setting.
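
As a rough illustration (the property values below are approximate room-temperature figures, and the flow and temperature rise are made up for the example), the sensible-heat relation Q = ρ · V̇ · c_p · ΔT shows why the same volumetric flow of water removes thousands of times more heat than air:

```python
# Approximate coolant properties near room temperature (illustrative values).
RHO_WATER, CP_WATER = 998.0, 4182.0    # kg/m^3, J/(kg*K)
RHO_AIR,   CP_AIR   = 1.2,   1005.0    # kg/m^3, J/(kg*K)

def heat_removed_kw(rho: float, cp: float, flow_m3_s: float, delta_t_k: float) -> float:
    """Sensible heat carried away by a coolant stream: Q = rho * V_dot * cp * dT."""
    return rho * flow_m3_s * cp * delta_t_k / 1000.0

flow, delta_t = 0.001, 10.0  # 1 litre per second of coolant, warming by 10 K
q_water = heat_removed_kw(RHO_WATER, CP_WATER, flow, delta_t)
q_air = heat_removed_kw(RHO_AIR, CP_AIR, flow, delta_t)
print(f"Water: {q_water:.1f} kW, air: {q_air:.3f} kW, ratio ~{q_water / q_air:.0f}x")
# The ratio (~3,460x) is simply the ratio of volumetric heat capacities rho * cp.
```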

On this basis alone, it’s a no-brainer to use water to cool technology infrastructure. Better conductivity means more power efficiency, and ultimately, less power used to remove more heat.  

And we’re still seeing innovation in this field. Historically, cloud companies and gamers alike have attached cold plates to CPUs (and often GPUs) and circulated water through them to remove the heat. This is known as direct liquid-to-chip cooling.

We are now starting to see immersion cooling techniques emerge, where the entire server is submerged in fluid. Although this has a number of implications for unit maintenance, immersed servers are not only more power-efficient, but immersion also eliminates dust from the units, improving component lifespans.

So how do we use AI to further improve this efficiency?  

Air, water and changing priorities 

AI’s core strength lies in pattern recognition: analysing complex data sets and finding links. Most servers can measure their own workloads and temperatures, and this data can be fed back to data lakes, where AI systems can learn how to optimise cooling and power requirements.
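
As a minimal sketch of that feedback loop (the telemetry schema, field names and model below are illustrative assumptions, not any vendor’s API), server readings can be landed in the data lake and used to fit a simple model of how much heat a given workload will produce:

```python
import json
from dataclasses import dataclass, asdict

import numpy as np

@dataclass
class ServerTelemetry:
    """One sample a server might push to the data lake (illustrative schema)."""
    server_id: str
    cpu_utilisation: float   # 0..1
    inlet_temp_c: float      # coolant/air inlet temperature
    power_draw_w: float

def to_data_lake_record(sample: ServerTelemetry) -> str:
    """Serialise a sample as a JSON line, a common data-lake-friendly format."""
    return json.dumps(asdict(sample))

def fit_power_model(samples: list[ServerTelemetry]) -> np.ndarray:
    """Least-squares fit of power ~ a*utilisation + b*inlet_temp + c.
    A real system would use richer features and models; this just shows the idea."""
    X = np.array([[s.cpu_utilisation, s.inlet_temp_c, 1.0] for s in samples])
    y = np.array([s.power_draw_w for s in samples])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

history = [
    ServerTelemetry("srv-01", 0.2, 24.0, 180.0),
    ServerTelemetry("srv-01", 0.6, 25.0, 320.0),
    ServerTelemetry("srv-01", 0.9, 26.0, 430.0),
]
print(to_data_lake_record(history[0]))          # what lands in the data lake
a, b, c = fit_power_model(history)
predicted_w = a * 0.75 + b * 25.0 + c           # expected draw at 75% load
print(f"Predicted draw: {predicted_w:.0f} W of heat for the cooling loop to remove")
```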

In addition, sensors can be fitted to the servers themselves to measure water flow, giving even more detailed information about each server’s temperature and cooling requirements.
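
With flow and supply/return temperature sensors on the loop, you can infer how much heat each server is actually shedding, using the same sensible-heat relation as above (the sensor values here are made up for illustration):

```python
# Per-server heat load inferred from coolant flow and temperature sensors.
RHO_WATER, CP_WATER = 998.0, 4182.0  # kg/m^3, J/(kg*K), approximate

def server_heat_load_w(flow_l_per_min: float, supply_c: float, return_c: float) -> float:
    """Heat picked up by one server's coolant loop: Q = rho * V_dot * cp * dT."""
    flow_m3_s = flow_l_per_min / 1000.0 / 60.0
    return RHO_WATER * flow_m3_s * CP_WATER * (return_c - supply_c)

# e.g. 2 L/min through a cold plate, coolant warming from 30 degC to 38 degC
print(f"{server_heat_load_w(2.0, 30.0, 38.0):.0f} W")  # ~1,113 W
```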

It’s important to remember that cloud servers don’t exist in isolation. Local weather affects cooling: many datacentres use ‘free air cooling’, relying on the ambient temperature to cool the servers – more effective in Iceland than in Florida, for example. Many also rely on outdoor dry coolers or evaporative cooling, and evaporative cooling is less effective in areas of high humidity.
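
A simplified controller might pick a cooling mode from outdoor conditions along these lines (the thresholds and mode names are made-up illustrations, not recommendations for any real facility):

```python
def choose_cooling_mode(ambient_c: float, relative_humidity: float) -> str:
    """Pick a cooling strategy from outdoor conditions (illustrative thresholds only)."""
    if ambient_c < 18.0:
        return "free-air"            # cool outside air alone can carry the load
    if ambient_c < 30.0 and relative_humidity < 0.6:
        return "evaporative"         # dry enough for evaporation to work well
    return "mechanical-chiller"      # hot and humid: fall back to compressor cooling

print(choose_cooling_mode(8.0, 0.70))   # Iceland-like day -> free-air
print(choose_cooling_mode(33.0, 0.85))  # Florida-like day -> mechanical-chiller
```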

Balancing these equations is where AI excels. AI can analyse not only the temperature and power consumption of the servers, but also the environment around them, including data from weather stations. This helps operators not only react to local conditions but also predict them, streamlining water usage both now and in the future.
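
A minimal forecasting sketch might look like this – the history, the linear model and the numbers are all synthetic stand-ins; a production system would use real weather feeds and far richer models:

```python
import numpy as np

# Synthetic history: [forecast_high_c, it_load_mw] -> cooling water used (m^3/day)
X = np.array([[22.0, 8.0], [27.0, 9.5], [31.0, 10.0], [35.0, 11.0], [25.0, 9.0]])
y = np.array([310.0, 420.0, 510.0, 640.0, 370.0])

# Fit water_use ~ a*temp + b*load + c by least squares
A = np.hstack([X, np.ones((len(X), 1))])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

tomorrow = np.array([29.0, 10.5, 1.0])   # forecast 29 degC high, 10.5 MW of IT load
print(f"Predicted water demand: {tomorrow @ coeffs:.0f} m^3")
```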

Conversely, the datacentre may not be in an area of water scarcity, in which case AI can instead be tailored to optimise server performance or the power usage of the pumps and other equipment. Datacentres in urban areas may prioritise noise reduction to avoid disturbing local residents – something AI can also help with, optimising systems to reduce noise from mechanical operations.
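
One way to encode those shifting priorities is a weighted cost function for the optimiser to minimise; the weights and numbers below are illustrative and would be tuned per site:

```python
from dataclasses import dataclass

@dataclass
class SitePriorities:
    """Per-site weights on each resource; higher means 'care more' (illustrative)."""
    water: float
    power: float
    noise: float

def operating_cost(water_m3: float, power_kwh: float, noise_db: float,
                   w: SitePriorities) -> float:
    """Scalar cost an optimiser could minimise when choosing pump and fan setpoints."""
    return w.water * water_m3 + w.power * power_kwh + w.noise * noise_db

arid_site  = SitePriorities(water=10.0, power=1.0, noise=0.1)   # water-scarce region
urban_site = SitePriorities(water=1.0,  power=1.0, noise=5.0)   # residents nearby

# The same candidate setpoint scores very differently at the two sites:
setpoint = dict(water_m3=50.0, power_kwh=1200.0, noise_db=62.0)
print(operating_cost(**setpoint, w=arid_site))    # water dominates the score
print(operating_cost(**setpoint, w=urban_site))   # noise matters far more here
```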

Self-optimising technologies 

The technology industry is always moving forwards, and although the AI industry has attracted a considerable amount of backlash, it also has enormous potential to improve our lives and the world around us. However, we should always have sustainability in mind, considering how to provide for today’s needs while still safeguarding the world of tomorrow.

This does require a complex conjunction of worlds: AI needs data to operate, which means using a combination of IoT and industrial expertise alongside data analysis techniques. But with the right skills, vision and commitment, we can not only benefit from AI directly, but also use it to streamline its own resource consumption, driving a self-improving virtuous circle.  
