Researchers are coming up with other ideas, too.
In July, Renkun Chen at the University of California San Diego and colleagues published a paper detailing their idea for a pore-filled, membrane-based cooling technology that could help cool chips passively – without the need to actively pump fluids or blow air around.
“Essentially, you are using heat to provide the pumping power,” says Prof Chen. He compares it to the process by which water evaporates from a tree’s leaves, inducing a pumping effect that draws more water up through the trunk and along the branches to replenish them. Prof Chen says he hopes to commercialise the technology.
New ways of cooling down data centre tech are increasingly sought-after, says Sasha Luccioni, AI and climate lead at Hugging Face, a machine learning company.
This is partly due to demand for AI – including generative AI, or large language models (LLMs), the systems that power chatbots. In previous research, Dr Luccioni has shown that such technologies consume large amounts of energy.
“If you have models that are very energy-intensive, then the cooling has to be stepped up a notch,” she says.
Reasoning models, which explain their output in multiple steps, are even more demanding, she adds.
They use “hundreds or thousands of times more energy” than standard chatbots that simply answer questions. Dr Luccioni calls for greater transparency from AI companies about how much energy their various products consume.
For Mr Ballon, LLMs are just one form of AI – and he argues they have already “reached their limit” in terms of productivity.
