Best of LinkedIn: Sustainability & Green ICT CW 14/15
Show notes
We curate most relevant posts about Sustainability & Green ICT on LinkedIn and regularly share key takeaways. We at Frenus support ICT enterprises with precise market and pricing intelligence that goes beyond traditional analyst subscriptions and existing databases, delivering actionable insights for better decision-making. You can find more info here: https://www.frenus.com/usecases/filling-the-strategic-gaps-your-current-intelligence-sources-leave-open
This edition examines the critical intersection of artificial intelligence infrastructure and environmental sustainability, highlighting a global shift toward Green AI. Industry experts discuss how rising compute densities and AI inference are driving unprecedented demands for energy and water, necessitating a transition to liquid cooling, modular data centres, and on-site renewable power. Several reports detail innovative solutions such as underwater facilities, waste heat recovery for community heating, and the use of engineered timber to lower the carbon footprint of digital hubs. This edition also emphasises the importance of lifecycle extension, green coding, and standardised carbon metrics like the Software Carbon Intensity spec to ensure transparent reporting. Furthermore, leaders advocate for a strategic alignment between technology and sustainability, urging organisations to prioritise purpose-built, efficient models over mere scale. Collectively, the contributions present a roadmap for balancing technological innovation with environmental stewardship through smarter engineering and policy.
This podcast was created via Google NotebookLM.
Show transcript
00:00:00: This episode is provided by Thomas Allgaier and Frenus, based on the most relevant LinkedIn posts about Sustainability and Green ICT in CW 14 and 15.
00:00:09: Frenus supports ICT enterprises by delivering precise ICT market and pricing intelligence that analyst subscriptions and existing databases cannot provide.
00:00:18: You can find more info in the show notes.
00:00:21: You know, to really set the stage for this deep dive,
00:00:23: I like to think about boiling a kettle of water.
00:00:25: Okay?
00:00:26: A kettle?
00:00:26: Yeah, imagine boiling that kettle every single time you type a prompt into ChatGPT, or, you know, ask an AI to summarize an email.
00:00:37: That unseen physical reality of the AI revolution is exactly what we're unpacking today.
00:00:40: right?
00:00:41: And if you are a professional working in the tech or sustainability space right now, you know exactly how tense this is getting.
00:00:48: Yeah, like a massive tug of war.
00:00:49: Oh, absolutely!
00:00:50: Your organization is demanding explosive AI growth but at the exact same time you're being handed these incredibly strict sustainability targets.
00:00:59: So today we are gonna break down the actual strategies industry leaders are discussing on LinkedIn right now to bridge that gap.
00:01:06: Yeah, we're looking at green AI efficiency, sustainable infrastructure, and how we even measure this carbon footprint.
00:01:12: Let's jump into the energy diet first.
00:01:14: We have to. And we have to start by completely resetting how we think about it, because over the past few weeks the conversation among these experts has, well, radically shifted.
00:01:26: It is no longer just about the energy required to build and train these models.
00:01:30: The real crisis is the energy required to run them in everyday life.
00:01:34: Wait, I need you to clarify something right out of the gate here, because everything I've read over the last year pointed to the training phase as the big environmental disaster.
00:01:44: Yeah, that's the common narrative.
00:01:46: Right,
00:01:46: Like we've all seen the headlines about it taking gigawatt-hours of power just to teach GPT-4 how to speak.
00:01:52: So once it's trained, isn't it just a piece of software sitting on a server?
00:01:56: How is simply running it the real problem?
00:01:58: See, that's the exact misconception that's getting companies into trouble.
00:02:01: Training is a massive one-time event, but running the model, what the industry calls inference, happens billions of times a day, basically forever.
00:02:11: Yeah, Neil Sahota recently broke this down, pointing out that inference is now the dominant driver of global tech energy consumption.
00:02:19: He noted, a single AI query can use up to ten times the power of traditional search.
00:02:25: Ten Times?
00:02:26: Yeah, just for one query.
00:02:28: Why the massive difference?
00:02:30: Think about the mechanics of it.
00:02:31: A traditional search engine is essentially a highly organized library.
00:02:36: You type in a query and it quickly pulls the right index card to point you to a website.
00:02:40: It's just retrieval.
00:02:42: But generative AI doesn't just retrieve, it calculates.
00:02:45: It generates every single word or pixel on the fly by calculating complex probabilities across billions of parameters.
00:02:57: That sounds exhausting.
00:02:58: It is, and that requires immense continuous processing power from GPUs, those graphics processing units.
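To put that ten-times figure in scale terms, here is a quick back-of-the-envelope sketch. The per-query energy and daily query volume are illustrative assumptions, not numbers from the posts; only the ten-times multiplier comes from the episode.

```python
# Back-of-the-envelope: why inference dominates at scale.
# Per-query energy and query volume are illustrative assumptions;
# only the 10x multiplier comes from the episode.

SEARCH_WH_PER_QUERY = 0.3                    # assumed Wh per traditional search
AI_WH_PER_QUERY = SEARCH_WH_PER_QUERY * 10   # "up to ten times" per query
QUERIES_PER_DAY = 1_000_000_000              # assumed daily volume, large service

def daily_energy_kwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in kWh for a given per-query cost."""
    return wh_per_query * queries / 1000

search_kwh = daily_energy_kwh(SEARCH_WH_PER_QUERY, QUERIES_PER_DAY)
ai_kwh = daily_energy_kwh(AI_WH_PER_QUERY, QUERIES_PER_DAY)

print(f"search: ~{search_kwh:,.0f} kWh/day")  # roughly 300,000 kWh/day
print(f"AI:     ~{ai_kwh:,.0f} kWh/day")      # roughly 3,000,000 kWh/day
```

The point is not the absolute numbers, which vary wildly by model and hardware, but that a fixed per-query multiplier compounds across billions of daily calls.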
00:03:04: Okay!
00:03:04: That makes sense.
00:03:05: Retrieval versus constant heavy computation. Yeah.
00:03:08: But if inference is the real energy hog, why isn't it dominating all the corporate sustainability reports I see?
00:03:13: Because it's practically invisible.
00:03:15: Tyler Malin's recent analysis really brought this into focus.
00:03:19: He calls AI inference "the carbon footprint nobody is measuring."
00:03:23: Why does no one measure it?
00:03:24: Well, when a company trains the model, it's like a massive construction project.
00:03:28: You can measure the steel and concrete, and you put that in an ESG report!
00:03:33: But inference?
00:03:34: That happens continuously through millions of API calls…
00:03:38: Which is just software talking to software in the background.
00:03:40: Exactly!
00:03:41: That constant, invisible draw of electricity slips right through the cracks of standard corporate reporting.
00:03:47: So if we use this analogy, it's like a phantom drain... You measure the energy to build a massive factory, but you completely forget to measure the energy it takes to keep the lights, the machines, and the furnaces running twenty-four seven for every single customer request.
00:04:01: That is a perfect analogy, and Will Nordberg made a critical observation about how this is finally hitting the boardroom.
00:04:08: This is rapidly becoming a CXO-level issue because
00:04:11: it's getting too expensive to ignore.
00:04:13: Exactly. For a long time, companies were shielded from this reality because they bought AI through a software abstraction layer.
00:04:21: They paid for tokens, you know, a fraction of a cent per word, and treated AI like a cheap cloud subscription.
00:04:27: Energy was just the cloud provider's problem.
00:04:30: But as these organizations scale their AI deployment
00:04:33: from cute little experiments to core business functions, that abstraction completely breaks down.
00:04:40: Precisely.
00:04:40: The volume gets so high that companies are transitioning from paying for abstract tokens to paying for the raw compute, efficiency, and actual energy required.
00:04:50: Suddenly it's not a lightweight software experiment anymore.
00:04:53: It is a heavy physical infrastructure burden taking a massive bite out of their margins.
00:04:59: Let's talk about this physical burden, because all those mathematical calculations in the cloud don't just magically stay in the cloud.
00:05:07: No, they do not!
00:05:09: All that consumed electricity ultimately turns into heat.
00:05:13: And based on what we're seeing in the posts, that heat is having devastating local consequences.
00:05:17: It's entirely visceral.
00:05:19: We are moving from abstract data centers to physical realities hitting local communities.
00:05:26: Leslie Sheridan and Bjorn Soren Giggler shared some really alarming new research on this localized physical fallout.
00:05:33: What kind of fallout?
00:05:34: Data centers are literally generating local heat islands.
00:05:37: In extreme cases, they're seeing the land temperatures surrounding these massive server farms rise by up to sixteen degrees Fahrenheit.
00:05:45: Sixteen... wait, that isn't just an infrastructure problem.
00:05:49: If you live in a neighborhood next to one of these facilities, your own air conditioning bill is skyrocketing!
00:05:54: That's a public health issue for millions of people.
00:05:56: Oh, absolutely.
00:05:57: And think about the operators.
00:05:58: How do you cool down a building like that?
00:06:02: BJ Statler and Dev Karlakar highlighted that water scarcity is rapidly becoming a hard strategic constraint for data center operators.
00:06:11: Traditional cooling towers work through evaporation; they literally evaporate water to reject the heat. And massive server farms located in water-stressed regions are pulling millions of gallons from local aquifers just to keep the hardware from physically melting down.
00:06:31: So let me get this straight: we're depleting local water tables to cool the servers, and the servers are still heating up the surrounding land.
00:06:37: If water is becoming a bottleneck, what about the power grid itself?
00:06:41: Can the grid even handle these dense clusters of GPUs?
00:06:44: Short answer: no.
00:06:45: That is where the system is truly breaking.
00:06:47: The grid cannot handle it.
00:06:49: Nita Ladd and Michael Thomas shared some incredible insights on this.
00:06:53: Power operators are hitting a hard wall, to the point where they're just bypassing the traditional electrical grid entirely to feed AI's hunger.
00:06:59: Bypassing the grid?
00:07:00: How?!
00:07:01: Well, Michael Thomas released a report detailing how Google is exploring onsite natural gas power for its expanding AI campuses.
00:07:08: They're partnering with a company called Crusoe to build a massive nine-hundred-thirty-three-megawatt natural gas plant in Texas.
00:07:15: Wait!
00:07:15: I have to stop you there.
00:07:17: Just to give you, the listener, a sense of scale: nine hundred and thirty-three megawatts is enough to power hundreds of thousands of homes.
00:07:26: Here's where it gets really interesting... To power the AI that is theoretically supposed to help us solve climate change,
00:07:33: tech giants are building new dedicated natural gas plants, completely off grid!
00:07:39: That feels like a massive step backward.
00:07:40: It is the ultimate contradiction of this whole tech cycle, but it really comes down to timelines.
00:07:46: The visceral tension here is between the blistering speed of AI deployment, where a company can design and order chips in six months, and the painfully slow pace of grid modernization.
00:07:55: Right, getting a new connection can take forever.
00:07:57: Exactly. Getting a new connection into the main grid can take five to ten years.
00:08:01: In some markets, tech companies simply aren't willing to wait a decade.
00:08:05: They are prioritizing speed and reliability, taking matters into their own hands, even if it directly contradicts their immediate carbon goals.
00:08:12: Okay, so the traditional model, building a warehouse, plugging it into the public grid, and blowing cold air over the racks, is breaking down, which explains why we are seeing this radical push to fundamentally redesign how a data center even operates.
00:08:30: Exactly, legacy infrastructure simply cannot cool these new high-density chips with air.
00:08:35: Brian Lilly pointed out that liquid cooling has officially moved out of the experimental phase.
00:08:41: It is now a foundational requirement for AI infrastructure.
00:08:44: Let's explain why that is, mechanically.
00:08:44: Because air cooling a massive AI data center...
00:08:50: It's like trying to cool down an overheating car engine by blowing on it with a desk fan.
00:08:54: Yes, perfect!
00:08:55: It's wildly inefficient.
00:08:57: Water, on the other hand, absorbs heat over twenty times faster than air.
00:09:00: It's the difference between standing in a sixty-degree breeze and jumping into a sixty-degree swimming pool.
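For anyone curious about the physics behind that "over twenty times" line, standard textbook property values for air and water near room temperature bear it out. This sketch uses approximate reference figures (my numbers, not the episode's):

```python
# Approximate textbook properties of air and water near 20 degrees C.
AIR_DENSITY = 1.2            # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0   # J/(kg*K)
AIR_CONDUCTIVITY = 0.026     # W/(m*K)

WATER_DENSITY = 998.0        # kg/m^3
WATER_SPECIFIC_HEAT = 4186.0 # J/(kg*K)
WATER_CONDUCTIVITY = 0.6     # W/(m*K)

# How fast heat conducts into the coolant: roughly the episode's
# "over twenty times faster" figure.
conductivity_ratio = WATER_CONDUCTIVITY / AIR_CONDUCTIVITY

# How much heat one cubic metre of coolant stores per degree of warming:
air_vol_heat = AIR_DENSITY * AIR_SPECIFIC_HEAT        # ~1.2 kJ/(m^3*K)
water_vol_heat = WATER_DENSITY * WATER_SPECIFIC_HEAT  # ~4,178 kJ/(m^3*K)

print(f"conductivity: water ~{conductivity_ratio:.0f}x air")
print(f"heat stored per volume: water ~{water_vol_heat / air_vol_heat:.0f}x air")
```

In other words, the breeze-versus-swimming-pool analogy is conservative: water conducts heat roughly twenty-three times faster than air and stores thousands of times more heat per unit volume.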
00:09:04: A perfect analogy.
00:09:05: And because humans have struggled to optimize this efficiently, AI is actually taking over the design process itself.
00:09:11: Michael Lesniak shared a fascinating development: AI is now designing its own data centers.
00:09:16: Wait, really?
00:09:17: The AI is designing the buildings?
00:09:19: Yes, he pointed to Nvidia's new Vera Rubin reference design.
00:09:24: When they let AI analyze the physics of heat dissipation, the software completely eliminated mechanical chillers from the blueprint.
00:09:31: Instead, it opted for warm-water radiators.
00:09:35: Warm-water radiators?
00:09:36: To cool down servers?
00:09:37: How does that work?
00:09:38: That sounds completely counterintuitive.
00:09:41: It comes back to the physics you just mentioned.
00:09:43: Legacy human engineering spent decades trying to refrigerate data centers down to sixty degrees, which takes massive amounts of electricity, but these new AI chips run incredibly hot, sometimes over a hundred and eighty degrees.
00:09:56: Oh wow, right.
00:09:57: So the AI realized you don't need to chill water to cool a one-hundred-and-ninety-degree chip.
00:10:02: Pumping ninety-degree water over it will still extract massive amounts of heat without spending a single watt on mechanical refrigeration.
00:10:09: It essentially humiliated decades of human engineering that just relied on building bigger fans and digging deeper wells.
00:10:15: That is brilliant; it's just leveraging the natural temperature differential.
00:10:18: And speaking of natural temperature differentials, some operators are taking that swimming pool analogy we used earlier very literally, right?
00:10:25: They really are.
00:10:26: Jonathan Joseph and Lydia Sewell discussed a major push to move compute directly into the water.
00:10:32: Instead of bringing water to the servers, they're bringing the servers to the water.
00:10:36: Oh man.
00:10:37: Yeah.
00:10:38: China recently deployed a thirteen-hundred-ton underwater data center off the coast of Hainan, utilizing the natural, near-infinite heat sink of the deep ocean.
00:10:54: Meanwhile, over in Europe, the Netherlands is actively exploring floating data centers on their canals.
00:10:59: So what does this all mean?
00:11:00: We've been trying to cool down an elephant with a hand fan, and AI just looked at our blueprints and said, why don't we let it go for a swim?
00:11:07: Exactly!
00:11:08: We are literally submerging thousands of pounds of highly sensitive electronics into the ocean
00:11:12: just to keep them running.
00:11:13: It sounds like science fiction.
00:11:15: But there are also brilliant terrestrial solutions that do not require submarines,
00:11:20: right?
00:11:20: Yes, absolutely.
00:11:22: Petri Nikki offered a phenomenal practical solution from Finland that completely rethinks the data center's relationship with the community.
00:11:30: The Atnorth Data Center in Espoo is operating as a local energy hub.
00:11:35: A local energy hub?
00:11:36: Yeah!
00:11:36: This was one of my favorite developments.
00:11:38: Instead of viewing all that waste heat as a liability to be vented out into the atmosphere, like those heat islands we talked about earlier,
00:11:45: they capture it.
00:11:46: They route that thermal energy to a nearby grocery store, providing all its heating needs and saving two hundred tons of CO₂ annually.
00:11:55: That perfectly illustrates the shift toward the circular energy model.
00:11:58: Energy is never destroyed, it's only transferred,
00:12:01: so let's transfer it somewhere useful.
00:12:03: The future of sustainable digital infrastructure means integrating waste heat recovery from day one.
00:12:09: Right,
00:12:09: you turn a massive liability into a public utility.
00:12:12: But as crucial as these physical infrastructure redesigns are, the better cooling, underwater pods, and grocery store heating,
00:12:18: they aren't the whole solution, because better plumbing doesn't excuse sloppy code.
00:12:22: Precisely. We have to address the software running inside those buildings, how we measure it, and, most importantly, why we deploy it.
00:12:30: Right now, developers are spinning up massive models to perform incredibly trivial tasks simply because the tools are available.
00:12:38: To fix this we need rigorous standards and measurement.
00:12:41: Daniele Valko and Navin Bellani highlighted a major milestone on this front.
00:12:45: The Green Software Foundation has officially ratified the SCI for AI specification.
00:12:50: Let's unpack that jargon for a second.
00:12:52: SCI stands for software carbon intensity.
00:12:56: It is a way to score how green a piece of code actually is.
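Concretely, the ratified spec boils down to one rate. Per the Green Software Foundation, SCI = ((E × I) + M) per R, where E is energy consumed, I is the carbon intensity of that electricity, M is amortized embodied hardware emissions, and R is a functional unit such as one API call. A minimal sketch of that calculation, with made-up input numbers purely for illustration:

```python
def software_carbon_intensity(e_kwh: float, i_g_per_kwh: float,
                              m_g: float, r_units: int) -> float:
    """Green Software Foundation SCI: ((E * I) + M) per R.

    e_kwh:       energy the software consumed (kWh)
    i_g_per_kwh: carbon intensity of that electricity (gCO2e/kWh)
    m_g:         embodied hardware emissions amortized to this workload (gCO2e)
    r_units:     functional units delivered (e.g. inference requests)
    Returns gCO2e per functional unit.
    """
    return (e_kwh * i_g_per_kwh + m_g) / r_units

# Hypothetical day of inference traffic (all inputs invented for illustration):
score = software_carbon_intensity(
    e_kwh=500,          # metered energy for the service
    i_g_per_kwh=400,    # grid carbon intensity
    m_g=50_000,         # share of server manufacturing emissions
    r_units=1_000_000,  # requests served
)
print(f"{score:.3f} gCO2e per request")  # 0.250 gCO2e per request
```

Because R is a per-unit denominator, the score rewards exactly the levers discussed in this episode: greener grids lower I, efficient code lowers E, and longer hardware life lowers M.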
00:12:59: But why does this specific ratification matter so much right now?
00:13:02: Because it maps directly to CSRD requirements.
00:13:05: That's the Corporate Sustainability Reporting Directive, a sweeping new law in the EU.
00:13:09: Right!
00:13:09: The big compliance push
00:13:10: Exactly. For anyone not deep in the compliance weeds,
00:13:14: This means tracking carbon is no longer voluntary greenwashing.
00:13:18: If you do business in Europe You are legally required To report your emissions across the entire supply chain.
00:13:24: By ratifying the SCI for AI, organizations finally have a standardized universal ruler to measure those invisible inference emissions we talked about earlier.
00:13:33: And once you can measure it, you can finally optimize it, which brings us to the concept of frugal AI.
00:13:39: James Martin and Maria Gabriel pointed out that we don't always need a sledgehammer to crack a nut.
00:13:44: Right now, companies are using massive large language models trained on the entirety of the internet
00:13:50: just to parse an internal HR spreadsheet.
00:13:52: It's the equivalent of using an eighteen-wheeler to deliver a single pizza.
00:13:56: Martin and Gabriel note that smaller, highly efficient, purpose-built AI models can cut energy consumption by up to ninety percent without losing any performance for those specific, narrow tasks.
00:14:07: It is entirely about right-sizing the tool.
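As a sketch of what right-sizing buys in energy terms: the per-query costs below are hypothetical, and the small model's cost simply takes the posts' "up to ninety percent" savings at face value.

```python
# Routing sketch: send narrow tasks to a small purpose-built model.
# Per-query costs are hypothetical; the 90% saving is the posts' upper bound.

LARGE_MODEL_WH = 3.0                    # assumed Wh/query, general-purpose LLM
SMALL_MODEL_WH = LARGE_MODEL_WH * 0.1   # "up to ninety percent" less energy

def fleet_energy_kwh(narrow_share: float, queries: int) -> float:
    """Daily kWh when `narrow_share` of queries go to the small model."""
    narrow = queries * narrow_share
    general = queries - narrow
    return (narrow * SMALL_MODEL_WH + general * LARGE_MODEL_WH) / 1000

baseline = fleet_energy_kwh(0.0, 1_000_000)  # everything on the big model
routed = fleet_energy_kwh(0.8, 1_000_000)    # 80% of tasks are narrow

print(f"baseline: ~{baseline:,.0f} kWh/day")
print(f"routed:   ~{routed:,.0f} kWh/day")
```

Under these assumptions, routing just eighty percent of traffic to the small model cuts fleet energy by roughly seventy percent, without touching the hard cases that genuinely need the large model.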
00:14:22: But right-sizing isn't enough if the AI is being deployed for a fundamentally useless task.
00:14:27: Yes, it forces the question of purpose.
00:14:30: If you are deploying an AI model to optimize an industrial energy grid or discover new battery materials, that has high impact; the massive energy cost is justified by the return to society.
00:14:42: Exactly!
00:14:43: But if you're burning megawatts of power to generate endless AI selfies or automate spam emails, no amount of liquid cooling makes the carbon emitted worth it.
00:14:54: It's a question of governance, and it challenges
00:14:56: how we fundamentally view this technology. DeCosta pushed back incredibly hard against a narrative championed recently by Sam Altman.
00:15:04: Altman claimed that AI is essentially a utility, just like water or electricity, and therefore the massive infrastructure buildout is
00:15:11: justified.
00:15:12: Yeah!
00:15:12: And DeCosta points out how dangerous that framing really is... First, AI is largely controlled by private monopolies, not public entities.
00:15:20: But more importantly, the physical realities are completely different.
00:15:23: Let's look at traditional utilities.
00:15:26: We still have Roman aqueducts standing today.
00:15:29: The US electrical grid was built to last fifty to a hundred years.
00:15:34: An AI "utility" is built on GPUs.
00:15:37: Due to the intense heat and blistering pace of technological obsolescence, AI hardware effectively dies and must be completely replaced every three years.
00:15:45: Three years?
00:15:46: That's a staggering contrast!
00:15:48: So when we call it a utility, we are completely masking the brutal reality of its physical depreciation.
00:15:55: Every three years, millions of servers packed with cobalt, copper, silicon, and rare earth metals, mined across the globe, manufactured in Taiwan,
00:16:03: and shipped worldwide, become e-waste.
00:16:05: Exactly.
00:16:06: It requires a constant, vicious cycle of resource extraction.
00:16:09: So you cannot compare a three-year hardware life cycle to a hundred-year water reservoir.
00:16:13: No, you can't. True green AI isn't just about sourcing renewable power or sinking a data center into the ocean.
00:16:18: It means maximizing the business value per compute unit through incredibly disciplined software design and strict governance, choosing not to scale blindly just because you can.
00:16:28: This raises a huge question for you listening right now: just because your team can use a massive LLM for a basic administrative task, does the actual business value of that task justify the real-world carbon cost?
00:16:42: Are you deploying a supercomputer to write a calendar invite?
00:16:45: That is the exact question every tech leader needs to be asking, and it perfectly sets up our final thought for this deep dive.
00:16:53: Anna Lerner-Nesbitt offered an incredibly sharp insight on how corporate structures need to adapt.
00:16:58: Oh, this is a great point.
00:17:00: She argues that chief sustainability officers, the CSOs, must stop operating as siloed compliance officers.
00:17:06: They have to step up and become strategic allies to the CTO and the CFO.
00:17:10: The technical, the financial, and the sustainability wings of the company basically have to merge.
00:17:15: So as you head back to your desk today, we want to leave you with this thought to mull over.
00:17:19: Look at your own organization's structure right now.
00:17:21: Are your sustainability teams and software engineering teams actively collaborating?
00:17:26: Or are they living in totally different worlds?
00:17:28: Yeah, do they even speak the same language?
00:17:30: Right, because until those two departments share that exact same dashboard, tracking both code performance and carbon emissions, true green ICT is going to remain just a buzzword.
00:17:42: It is a complex, rapidly evolving landscape.
00:17:46: But understanding the physical mechanics of this technology, and having these hard conversations about governance, is where real progress starts.
00:18:06: Thank you so much for taking this deep dive with us today.
00:18:09: Absolutely, don't forget to hit subscribe.