Best of LinkedIn: Sustainability & Green ICT CW 36/37
Show notes
We curate the most relevant posts about Sustainability & Green ICT on LinkedIn and regularly share key takeaways.
This edition focuses extensively on the imperative for sustainability within the technology sector, particularly concerning Artificial Intelligence (AI) and data centres. Multiple authors stress the need for green software practices, noting the high energy cost of popular programming languages like Python and advocating for carbon-aware computing to align workloads with clean energy availability. A significant portion of the text addresses data centre infrastructure, covering the challenges of soaring energy and water consumption, the approval of new sites without Environmental Impact Assessments, and innovative solutions like offsite manufacturing, liquid cooling, and using renewable energy (solar, wind, and even underwater facilities) to achieve net-zero operations. Finally, several experts discuss the importance of transparency in environmental reporting for AI systems, highlighting the ethical debate between prioritizing patient safety over sustainability in healthcare AI and providing actionable strategies for responsible AI innovation.
This podcast was created via Google NotebookLM.
Show transcript
00:00:00: This episode is provided by Thomas Allgaier and Frennus, based on the most relevant LinkedIn posts about sustainability and green ICT in calendar weeks thirty-six and thirty-seven.
00:00:09: Frennus supports ICT enterprises with market and competitive intelligence, decoding green software developments, benchmarking emerging standards, tracking regulatory shifts, and analyzing competitor strategies.
00:00:21: Our mission today is a deep dive into the trends defining green ICT right now.
00:00:25: What we saw across the sources from calendar weeks thirty-six and thirty-seven wasn't just, you know, more talk about sustainability goals.
00:00:32: It felt like a real tactical shift toward measurable execution.
00:00:35: Exactly.
00:00:36: It's that transition from just having a vision to actually gaining velocity, I think.
00:00:40: The industry seems past the if we should be sustainable.
00:00:43: Now it's firmly focused on the how.
00:00:45: How do we actually become efficient?
00:00:46: And that focus is sharpening across every layer, isn't it?
00:00:49: From algorithms and code right up to the physical data centers.
00:00:52: Absolutely.
00:00:52: So we've clustered the core insights for you into three key areas we saw emerging really strongly.
00:00:59: First, optimizing AI efficiency and the whole disclosure piece that comes with it.
00:01:04: Second, the actionable playbooks for green coding, what developers can actually do.
00:01:08: And finally, the, well, the complex reality of data center development and its critical link to the power grid.
00:01:14: Okay, let's unpack that first theme then.
00:01:16: AI efficiency.
00:01:17: It feels like the conversation around AI is definitely maturing.
00:01:21: Moving past just the idea of green AI.
00:01:24: It's really demanding, measurable results and frankly some operational constraints.
00:01:29: Yeah, that mindset shift is, I think, foundational.
00:01:32: Navin Balani really emphasized that sustainable AI is being viewed now as a genuine competitive advantage.
00:01:37: It's not just ticking a regulatory box anymore.
00:01:39: It's about building better, more efficient products.
00:01:41: Right.
00:01:41: And Angela Sorte helped clarify the terms, which is useful.
00:01:44: They noted the difference between eco-AI, that's the full lifecycle thinking, versus green AI.
00:01:50: Green AI being more specifically focused on immediate efficiency, transparency, and using renewables in the actual compute process.
00:01:57: And that urgency makes sense.
00:01:59: When you see the environmental cost figures, there was a Capgemini report, Melissa Jarkin shared this, showing model training can generate up to one point six tons of CO2
00:02:08: per deployment.
00:02:11: That's staggering.
00:02:13: But the flip side is that organizations putting green AI practices into action, they're seeing measurable gains.
00:02:22: That's a huge incentive, both environmentally and economically.
00:02:26: But the integrity of those energy-saving numbers, well, it really hinges on standardized reporting.
00:02:31: And that's maybe the first big hurdle we're seeing.
00:02:33: Ah,
00:02:33: the reporting challenge.
00:02:34: Exactly.
00:02:35: Ryan Sholan did an interesting analysis of Google's recent Gemini prompt data disclosure.
00:02:40: It really highlighted a fundamental issue with the accounting methods used.
00:02:43: See, Google used market-based emissions factors in their main number.
00:02:47: OK, hold on.
00:02:48: For listeners maybe not
00:02:49: deep in carbon accounting.
00:02:50: What's the difference between market-based and location-based, and why does that distinction matter so much here?
00:02:55: It matters a lot because it can obscure the real, immediate environmental impact.
00:03:00: Market-based accounting lets companies use things like renewable energy certificates, RECs, or offsets, so they can claim low, even zero emissions on paper.
00:03:10: even if the local grid powering their data center at that exact moment is, you know, running heavily on fossil fuels.
00:03:16: Right.
00:03:16: So it's about where the electrons actually come from at that time.
00:03:19: Precisely.
00:03:20: And Sholan found that if you switch to location-based factors reflecting the actual carbon intensity of the local grid right then, then the carbon emissions per prompt suddenly jumped.
00:03:30: How much higher?
00:03:31: Three point six times higher.
00:03:33: Wow.
00:03:33: Yeah, that huge difference shows exactly why we need standardized reporting.
00:03:38: Things like the Green Software Foundation's SCI standard, perhaps adapted for AI, to force genuine transparency.
00:03:45: You need to know the real footprint.
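To make that accounting difference concrete, here is a minimal sketch in the spirit of the Green Software Foundation's SCI formula, SCI = (E × I + M) per functional unit. All numbers are illustrative placeholders, not Google's actual figures; only the grid-intensity factor changes between the two scenarios.

```python
# Illustrative sketch only: all numbers are placeholders, not Google's figures.
# SCI = (E * I + M) per R, with E = energy (kWh), I = grid intensity (gCO2e/kWh),
# M = embodied emissions, R = the functional unit (here: one prompt).

energy_per_prompt_kwh = 0.0003     # assumed energy drawn by a single prompt
embodied_per_prompt_g = 0.01       # assumed amortized hardware emissions per prompt

market_intensity = 30.0            # gCO2e/kWh after RECs/PPAs (placeholder)
location_intensity = 30.0 * 3.6    # gCO2e/kWh of the actual local grid (placeholder)

def grams_per_prompt(intensity: float) -> float:
    """CO2e attributed to one prompt under a given grid-intensity factor."""
    return energy_per_prompt_kwh * intensity + embodied_per_prompt_g

print(f"market-based:   {grams_per_prompt(market_intensity):.4f} gCO2e per prompt")
print(f"location-based: {grams_per_prompt(location_intensity):.4f} gCO2e per prompt")
```

The point is simply that the same workload reports a very different footprint depending on which intensity factor you plug in, which is why the standardization debate matters.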
00:03:47: Transparency is definitely critical.
00:03:49: But we also saw an important ethical counterpoint raised, especially for sensitive areas like health care.
00:03:54: Ah, yes, Dr.
00:03:55: Muhammad Tafik's point.
00:03:56: Exactly.
00:03:58: He argued,
00:03:59: quite compellingly, I thought, that for green AI in healthcare, you have to solve the foundational issues first.
00:04:06: Things like patient safety, tackling algorithmic bias, making sure the AI is explainable.
00:04:11: Those have to come before we purely prioritize shaving off grams of carbon from a diagnostic algorithm.
00:04:18: It's about getting the priorities right.
00:04:21: That's a really necessary note of caution, isn't it?
00:04:23: You can't sacrifice patient safety for efficiency gains.
00:04:26: But the pressure for efficiency is already building, which means developers need tools right now to start addressing this.
00:04:33: That's why the practical research Trudy Barrow shared is so important.
00:04:36: What did that show?
00:04:37: Well, they compared different image generation models and found, get this, up to a forty-six-x difference in energy consumption between them.
00:04:44: Forty six times just between different models doing similar tasks.
00:04:49: Incredible.
00:04:50: Right.
00:04:50: The immediate takeaway is clear.
00:04:51: Developers have to start optimizing their prompts.
00:04:54: Yes.
00:04:54: But also actively compare
00:04:55: models based on energy efficiency, not just the quality of the output, before they deploy anything.
00:05:00: That kind of performance gap is just impossible to ignore.
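For developers who want to compare candidate models on energy before deploying, a rough sketch along these lines is possible with the open-source codecarbon package. The DummyModel class and its generate() call are placeholders for whatever inference API is actually in use, and the readings are software estimates, not lab-grade measurements.

```python
# Sketch: compare candidate models on estimated energy, not just output quality.
from codecarbon import EmissionsTracker

class DummyModel:
    """Placeholder for a real model client; generate() is where inference would happen."""
    def __init__(self, name: str):
        self.name = name
    def generate(self, prompt: str) -> str:
        return f"[{self.name}] image for: {prompt}"   # real inference work goes here

def emissions_for(model, prompts) -> float:
    """Run a fixed prompt set and return codecarbon's estimated kg CO2e."""
    tracker = EmissionsTracker(log_level="error")
    tracker.start()
    for p in prompts:
        model.generate(p)
    return tracker.stop()

prompts = ["a bicycle in the rain", "a city skyline at dusk"]
candidates = [DummyModel("model-a"), DummyModel("model-b")]

results = {m.name: emissions_for(m, prompts) for m in candidates}
for name, kg in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name}: ~{kg:.6f} kg CO2e for the test prompts")
```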
00:05:04: And we also saw Tillman Hartwig sharing valuable research on energy efficient text classification inference, so the focus on finding these low impact methods is definitely ramping up.
00:05:14: And moving from the algorithms, sort of the brains, down to the code itself.
00:05:19: This brings us neatly into our second major theme.
00:05:22: green coding.
00:05:23: This is where a lot of those big efficiency gains can actually start.
00:05:26: Yeah, and the sources gave us a great starting point here from Gordon Unidicorn.
00:05:30: He made the simple but powerful point that green IT often starts right there at the code level.
00:05:36: He noted, for instance, that high level interpreted languages like Python can sometimes use ten times, even up to a hundred ten times, more energy than compiled languages like C or assembly.
00:05:46: Mostly because of that interpretation overhead, the constant translation step.
00:05:49: Exactly.
00:05:50: So that initial language choice, that's a massive operational decision with long-term energy consequences.
00:05:56: It really is.
00:05:58: But the insight there isn't necessarily that everyone must suddenly switch back to C or assembly for everything.
00:06:03: That wouldn't make sense.
00:06:05: It's more that when performance and energy efficiency are absolutely paramount, maybe in those foundational AI models we talked about, or large-scale infrastructure software, developers need to consciously weigh development speed against that potentially huge, long-term energy cost of interpretation.
00:06:21: It's about making an informed trade-off.
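As a toy illustration of that interpretation overhead, and not a claim about the exact ten-to-hundred-times figures quoted above, here is the same sum computed in a pure-Python loop and via NumPy, whose inner loop runs in compiled code. Wall-clock time is only a rough proxy for energy, and the ratio will vary by machine.

```python
# Toy illustration of interpretation overhead, not a rigorous energy benchmark.
import time
import numpy as np

N = 10_000_000
values = np.arange(N, dtype=np.float64)

t0 = time.perf_counter()
total_py = 0.0
for v in values:                 # interpreted loop: every iteration pays overhead
    total_py += v
t_py = time.perf_counter() - t0

t0 = time.perf_counter()
total_np = values.sum()          # one call; the loop itself runs in compiled code
t_np = time.perf_counter() - t0

print(f"pure Python: {t_py:.2f}s, NumPy: {t_np:.3f}s, ratio ~{t_py / t_np:.0f}x")
```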
00:06:22: Precisely.
00:06:23: And that idea of conscious choice is really what Wilco Bergraff's work helps to formalize.
00:06:28: Right.
00:06:29: His contribution felt really key, providing actual guardrails.
00:06:32: Yeah.
00:06:33: He introduced four fundamental laws for sustainable software.
00:06:37: And these are important because they move the practice beyond just vague intentions into like actual codified design principles.
00:06:45: Let's hear them.
00:06:46: Okay.
00:06:46: So the first three are: the minimal negative impact law,
00:06:49: basically do the least harm necessary to achieve the required function.
00:06:53: Makes sense.
00:06:54: Then the sustained process law focused on longevity, designing things to last, avoiding that cycle of planned obsolescence.
00:07:02: Yeah.
00:07:02: And the full resource utilization law, this one's powerful.
00:07:05: The most sustainable product is often the one you already own.
00:07:08: So reuse, repurpose, upgrade instead of just replacing.
00:07:11: I like that one.
00:07:12: And the fourth law, the holistic impact awareness law, that feels like the connective tissue, right?
00:07:16: It stops us from just optimizing one part while ignoring the wider impact.
00:07:20: Exactly.
00:07:21: Like you could optimize your code brilliantly, but if the data center needed to run it causes massive water consumption issues locally, you haven't achieved a true net win.
00:07:30: It forces that system wide view.
00:07:32: And we saw some great examples of this holistic thinking in practice.
00:07:36: Mathias Hendricks shared initiatives from Rabobank using automated systems.
00:07:40: Oh yeah.
00:07:40: What were they doing?
00:07:41: Things like automatically downscaling virtual machines during off-peak hours.
00:07:45: Simple but effective.
00:07:47: And using something called green analytics to schedule really heavy compute workloads based specifically on when renewable energy is most abundant on the grid.
00:07:55: Now that's proper carbon aware scheduling in action.
00:07:58: Tying operations directly to green energy availability.
00:08:01: Definitely.
00:08:02: And on the architecture side, Peter Filippovich introduced a new responsible software architecture framework.
00:08:09: And significantly, it includes green software explicitly as one of its four key pillars.
00:08:14: So sustainability is baked in right from the foundational design phase, not just bolted on later.
00:08:19: It feels like external validation and maybe some enforcement mechanisms are starting to catch up too.
00:08:24: Anita Schütler highlighted that the Blue Angel eco label, which is quite established in Germany for products, is now actually available for software.
00:08:32: Ah, interesting.
00:08:33: So providing measurable, verifiable sustainability criteria for software itself.
00:08:38: Exactly.
00:08:39: Moving beyond just marketing claims about being green to something potentially certifiable.
00:08:44: And beyond formal certification, there's also the idea of the subtle nudge.
00:08:49: Middle Carbadia came up with a concept for a green action AI agent.
00:08:53: What would that do?
00:08:54: The idea is it would send personalized feedback, maybe emails or notifications, to users based on their online behavior, gently nudging them toward more eco-friendly choices, like suggesting greener shipping options or more sustainable products.
00:09:09: Huh, interesting.
00:09:10: Using AI to encourage greener habits.
00:09:13: OK, so we've covered the algorithms, the AI, and the code itself.
00:09:16: Let's shift focus now to the physical backbone, the data centers, and all the infrastructure.
00:09:22: The consensus here, really articulated well by Max Loretta, seems to be that the traditional metric we've all used for years, PUE, power usage effectiveness, just isn't cutting it anymore.
00:09:32: Yeah,
00:09:32: PUE feels increasingly narrow, doesn't it?
00:09:35: It's basically just an energy in versus energy used by IT equipment ratio.
00:09:39: But the impact of the modern data center is so much broader now.
00:09:42: It involves power, sure, but also water usage, heat generation, economic effects on the community, societal impacts, regulatory hurdles.
00:09:49: It's a much more complex picture.
00:09:51: It is.
00:09:51: And that's why Loretta proposed this concept of the sustainable data center tetrahedron, trying to integrate all those factors, power, water, heat, along with the economic, societal, and regulatory dimensions into one holistic model.
00:10:05: We need that complex view, especially because of the rising friction we're seeing between data centers and the communities around them.
00:10:12: That friction is becoming very real.
00:10:14: Mark Butcher and Caroline Donnelly were both highlighting the urgent need for mandatory environmental impact assessments, EIAs, for any new large facility.
00:10:23: Why
00:10:23: mandatory?
00:10:24: Because without that upfront assessment and community consultation, you risk these huge facilities just consuming vast amounts of local power and water, potentially straining resources and leading to real conflict with residents.
00:10:37: We've seen headlines about that already in some places.
00:10:39: Exactly.
00:10:40: And the political dimension is heating up too.
00:10:42: One commentator pointed out something quite telling.
00:10:45: California lawmakers actually killed a bill recently, a bill that would have required data centers in the state to publicly disclose their energy and water usage.
00:10:53: Wow.
00:10:54: They killed it.
00:10:55: Why?
00:10:55: It exposes that core tension, doesn't it?
00:10:58: Corporate desire for data privacy versus the public's right or need to know about the consumption of shared resources like power and water.
00:11:06: It's a tricky balance.
00:11:07: Yeah,
00:11:08: definitely a space to watch.
00:11:09: But despite that friction, the innovation in building and locating these centers is accelerating like crazy.
00:11:15: It really
00:11:16: is.
00:11:16: We're seeing more push towards offsite manufacturing or OSM.
00:11:20: Rachel Johnson-Scheibaut was advocating for this.
00:11:22: How does that help sustainability?
00:11:24: It allows for faster, more standardized, almost industrialized buildouts.
00:11:29: Often greener because you can control materials and waste much better in a factory setting compared to a traditional construction site.
00:11:36: Makes sense.
00:11:37: And location itself is becoming a core strategy, right?
00:11:39: Absolutely.
00:11:40: Christian Goldsmith highlighted northern Germany's coastal cities as emerging hubs.
00:11:44: Why?
00:11:45: Access to natural cooling from the sea and often better access to wind power. Smart geography.
00:11:50: But then you get the really radical designs.
00:11:53: Enrique Gomishie has shared news about China's underwater data center.
00:11:56: Underwater, off the coast of Shanghai, yeah.
00:11:58: Apparently, it's powered almost entirely by offshore wind, like, ninety-seven percent.
00:12:02: And the big win is using seawater directly for cooling, which they say cuts power use for cooling by about thirty percent.
00:12:10: That's tackling the heat and power problems head-on, literally.
00:12:13: It is.
00:12:14: And then down in Tasmania, Paul Ma reported on Firmus and NVIDIA building a dedicated green AI factory campus.
00:12:21: What's interesting there is they're leveraging immersion cooling, literally sinking the hardware in cooling fluids, combined with Tasmania's abundant renewable power.
00:12:30: So high performance AI is actually driving the need for these ultra efficient localized cooling solutions.
00:12:37: Moving way beyond just traditional air conditioning.
00:12:39: It seems to be.
00:12:40: But the ultimate layer of complexity, maybe the most critical piece is grid integration.
00:12:45: How these massive power consumers interact with the electricity grid.
00:12:48: Ron Vokun had a great update on this.
00:12:50: Google partnering directly with utilities in places like Indiana and Tennessee.
00:12:54: Yeah, setting up demand response programs.
00:12:57: What does that mean in practice?
00:12:58: It means Google agrees to temporarily reduce the power consumption of its data centers during periods of really high grid demand like a heat wave when everyone turns on their AC.
00:13:08: This helps ease the overall strain on the grid for everyone.
00:13:12: So the data center becomes less of just a drain and more of a potentially flexible asset for the grid.
00:13:19: Exactly that.
00:13:20: It's a systems view.
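A minimal sketch of the demand-response idea, with the utility signal and the control hooks as hypothetical placeholders rather than any real operator's or utility's API:

```python
# Sketch of a demand-response hook: when a utility event signals grid stress,
# pause deferrable batch work and cap flexible compute; restore it afterwards.

def pause_batch_queues():            # placeholder: defer flexible workloads
    print("pausing deferrable batch queues")

def cap_power(limit_kw: float):      # placeholder: throttle flexible compute
    print(f"capping flexible load at {limit_kw} kW")

def restore_normal_operation():      # placeholder: resume deferred work
    print("restoring normal operation")

def handle_demand_response(utility_events):
    """React to utility grid-stress signals by shedding flexible load."""
    for event in utility_events:
        if event["type"] == "curtailment_request":
            pause_batch_queues()
            cap_power(event["target_kw"])
        elif event["type"] == "event_end":
            restore_normal_operation()

# Example: a heat-wave curtailment window followed by an all-clear.
handle_demand_response([
    {"type": "curtailment_request", "target_kw": 4000},
    {"type": "event_end"},
])
```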
00:13:21: And the potential impact is huge.
00:13:23: Asim Hussain noted that even a tiny annual load reduction across the board
00:13:29: could potentially unlock something like ninety-eight gigawatts of spare grid capacity globally.
00:13:34: Ninety-eight gigawatts?
00:13:36: That's enormous.
00:13:37: It shows the scale we're talking about.
00:13:39: Data centers aren't just passive consumers anymore.
00:13:41: They're massive controllable loads that could actually provide stability back to the grid.
00:13:45: That's a fundamental shift in thinking.
00:13:47: It is.
00:13:47: But, and this is important, we have to add the critical context provided by Anne Currie here.
00:13:52: She offered a necessary caution about how we implement this load shifting, this carbon-aware computing.
00:13:57: What was her concern?
00:13:58: The point
00:13:59: was that the primary trigger for shifting workloads shouldn't just be the carbon intensity of the electricity at that moment.
00:14:06: It really needs to involve signals from the grid itself, things like dynamic pricing or tariffs that reflect grid stress.
00:14:13: Why the distinction?
00:14:14: Because she argues that just shifting load based purely on carbon intensity data, without considering the real-time state and economics of the grid, could potentially destabilize local energy markets, or ironically, end up doing more harm than good to the overall energy system.
00:14:30: Ah, okay.
00:14:31: So it needs to be intelligently integrated with how the grid actually operates, not just based on a single carbon metric.
00:14:38: Precisely.
00:14:38: It's that kind of nuanced systems level thinking the industry really needs as we connect these huge energy consumers more dynamically to the grid, understanding all the trade-offs.
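A small sketch of that combined-signal decision, using an electricity price as a stand-in for grid stress; the thresholds and inputs are illustrative assumptions, not a published standard.

```python
# Sketch: decide whether to shift a flexible workload using BOTH the grid's
# carbon intensity AND a grid-side signal (here a dynamic price as a rough
# proxy for stress), instead of reacting to carbon intensity alone.

def should_run_now(carbon_gco2_per_kwh: float,
                   price_eur_per_mwh: float,
                   carbon_limit: float = 200.0,
                   price_limit: float = 120.0) -> bool:
    """Run only when the grid is both reasonably clean and not under stress."""
    grid_is_clean = carbon_gco2_per_kwh <= carbon_limit
    grid_is_calm = price_eur_per_mwh <= price_limit   # a high price suggests stress
    return grid_is_clean and grid_is_calm

# A low-carbon but very expensive hour: a carbon-only scheduler would pile load
# onto a stressed grid; the combined check defers instead.
print(should_run_now(carbon_gco2_per_kwh=90.0, price_eur_per_mwh=300.0))   # False
print(should_run_now(carbon_gco2_per_kwh=150.0, price_eur_per_mwh=60.0))   # True
```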
00:14:48: Absolutely.
00:14:49: Well, this whole deep dive, it really shows a clear, and I'd say necessary, movement.
00:14:54: It's towards measurable execution, towards holistic thinking across all these areas we discussed.
00:14:59: Yeah, from demanding real AI disclosure and transparency, to codifying software efficiency principles, and finally, mandating smarter grid integration for data centers, it's all connected.
00:15:10: If you enjoyed this deep dive, new episodes drop every two weeks.
00:15:15: Also check out our other editions on cloud, digital products and services, artificial intelligence, ICT and tech insights, health tech, and defense tech.
00:15:24: And before we wrap up, maybe just one final thought to leave you with. It draws on Wilco Bergraff's full resource utilization law that we mentioned earlier.
00:15:33: If the most sustainable product truly is the one you already have,
00:15:37: take a moment and reflect.
00:15:38: What's the most sustainable digital product or maybe piece of infrastructure that you currently own or manage?
00:15:44: And thinking about that, what's the best, most impactful way you could extend its value?
00:15:48: or maybe repurpose it right now instead of replacing
00:15:51: it?
00:15:51: That's a great challenge to take back to your teams.
00:15:53: Thank you for joining us for the deep dive.
00:15:55: Be sure to subscribe so you don't miss our next analysis of the most relevant trends defining your industry.