Best of LinkedIn: Sustainability & Green ICT CW 04/05

Show notes

We curate the most relevant posts about Sustainability & Green ICT on LinkedIn and regularly share key takeaways.

This edition explores the strategic convergence of green ICT and digital innovation, emphasizing the urgent need to manage the escalating energy and water demands of artificial intelligence and data centres. Industry leaders advocate for a shift from mere marketing to rigorous engineering, highlighting how efficient workload design, circular hardware procurement, and innovations like wooden data centres can significantly mitigate carbon and water footprints. While AI offers potential for optimizing complex systems like microgrids, experts warn that its rapid expansion risks overwhelming local utilities and perpetuating reliance on fossil fuels, with some regions seeing data centres compete for limited power. The collection outlines a transition toward measurable responsible AI, where frameworks like Software Carbon Intensity (SCI) and transparent reporting tools enable organizations to treat environmental impact as a core architectural constraint alongside performance. Ultimately, the insights call for a "proportional intelligence" approach, urging a resilient digital economy that balances the benefits of technological scaling with the physical limits of planetary resources.

This podcast was created via Google NotebookLM.

Show transcript

00:00:00: This episode is provided by Thomas Allgaier and Frennis, based on the most relevant LinkedIn posts about sustainability and green ICT in CW four and five.

00:00:09: Frennis supports ICT enterprises with market and competitive intelligence, decoding green software developments, benchmarking emerging standards, tracking regulatory shifts, and analyzing competitor strategies.

00:00:22: And, you know, if you've been thinking that sustainability in tech is still just about, say, planting trees or putting a recycling bin in the server room.

00:00:30: The last couple of weeks are going to be a serious reality check.

00:00:33: The whole vibe has changed.

00:00:34: It really has.

00:00:36: Welcome to the deep dive.

00:00:37: We've been sorting through this huge stack of insights from calendar weeks four and five.

00:00:41: And the main takeaway isn't really optimism.

00:00:43: It feels more like... I don't know, engineering gravity.

00:00:46: Engineering gravity is the perfect phrase for it.

00:00:48: For a while now, we've been in this kind of honeymoon phase with green AI.

00:00:51: You know, lots of glossy PDFs, lots of vague promises to be net zero by some far off date like twenty forty.

00:00:58: Right.

00:00:58: But looking at the sources from these two weeks, all that marketing fluff has just evaporated.

00:01:03: We're hitting hard physical walls.

00:01:05: We're talking about the actual nuts and bolts now, the grid, the water pipes, the actual architecture of the chips.

00:01:12: It feels like the industry suddenly woke up and remembered that the cloud actually runs on the ground and that ground has limits.

00:01:18: Exactly.

00:01:19: The conversation is no longer about saving the planet as a slogan.

00:01:23: It's about keeping the lights on.

00:01:25: It's about not crashing the entire electrical grid while we try to scale this massive AI boom.

00:01:31: It's shifted from a moral issue to an operational survival issue.

00:01:35: Okay, so we've got a lot to get through.

00:01:37: Everything from the ethics of casual AI use all the way to wooden data centers.

00:01:42: But I want to start with the mindset.

00:01:44: Because before we fix the machines, you usually have to fix how we think about them.

00:01:48: There was a post from Vinit Derya that really set the tone.

00:01:51: Right.

00:01:52: Derya's point was blunt and honestly pretty necessary.

00:01:55: He's making the case that green computing is no longer a choice or some nice to have feature.

00:02:00: It's a responsibility.

00:02:01: But he frames it as a business issue, right?

00:02:03: It's about efficiency.

00:02:04: Precisely.

00:02:05: If your code is bloated and inefficient, you're not just burning carbon, you're burning money.

00:02:11: It's about doing more with less so you don't hit a growth ceiling.

00:02:14: That makes sense.

00:02:16: But what I found so interesting was seeing how that high-level idea filters down to, well, us, the people actually using these tools.

00:02:24: Did you see that post by Monika B. on casual AI?

00:02:28: I did, and I have to admit, I felt a little called out by that one.

00:02:31: I think we all did.

00:02:32: It's a very relatable problem.

00:02:33: She's talking about this habit we've all picked up of treating AI like it's free, you know?

00:02:38: I'll write a simple email, paste it into ChatGPT and say, make this sound more professional.

00:02:44: And then you do it again?

00:02:45: No.

00:02:45: Make it friendlier.

00:02:46: Exactly.

00:02:47: Three or four times for a two-sentence email.

00:02:49: And I don't think twice about it.

00:02:51: And that's Monika's whole point.

00:02:52: We're totally blind to the supply chain behind that prompt.

00:02:56: You don't see the water that's cooling the servers.

00:02:58: You don't see the electricity spiking at a data center somewhere.

00:03:00: We just treat it as casual.

00:03:02: But multiply that casual query by millions of users doing it dozens of times a day, and suddenly you have a massive unstructured waste of resources.
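As an aside for readers who want to see how "casual" queries add up: the arithmetic below is a back-of-envelope sketch, not from the episode, and every figure in it (energy per query, user counts, queries per day) is an illustrative assumption.

```python
# Back-of-envelope sketch of how casual prompts scale up.
# All figures are illustrative assumptions, not measured values.

WH_PER_QUERY = 0.3        # assumed energy per LLM query, in watt-hours
USERS = 100_000_000       # assumed daily active users
QUERIES_PER_USER = 20     # assumed casual queries per user per day

daily_mwh = USERS * QUERIES_PER_USER * WH_PER_QUERY / 1_000_000
print(f"~{daily_mwh:,.0f} MWh per day under these assumptions")
```

Even with deliberately rough inputs, the point survives: individually negligible queries become grid-scale demand in aggregate.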

00:03:13: Her advice wasn't to stop using AI, though.

00:03:15: It was more about being intentional.

00:03:16: Yes, it's about better prompt crafting.

00:03:19: About asking yourself, do I really need a global supercomputer to rephrase the sentence?

00:03:24: Or can I just, you know, reread it myself?

00:03:27: It's a shift from... AI by default to AI on purpose.

00:03:32: Right.

00:03:33: But it's interesting because that puts the pressure on the user, and let's be realistic, we can change our habits.

00:03:38: But we're not going to solve a global energy crisis just by proofreading our own emails.

00:03:43: The heavy lifting has to happen under the hood.

00:03:45: Oh, absolutely.

00:03:45: We can't just hack our way out of this with user behavior.

00:03:49: And that brings us right to the engineering side.

00:03:51: Michael Eidman had this fantastic post that really dug into the architecture.

00:03:56: He basically called on the industry to stop bragging about trillion parameter models.

00:04:00: Which is just a mine-is-bigger-than-yours contest.

00:04:02: Really.

00:04:03: He's saying stop that and start fixing the plumbing.

00:04:05: He brought up something called the von Neumann bottleneck.

00:04:08: Okay, that sounds like a spy movie title.

00:04:10: For those of us who aren't computer scientists, can you unpack that a little?

00:04:13: What's the bottleneck?

00:04:14: Sure.

00:04:15: So in classic computer design, the von Neumann architecture, you have the processor, the brain, And it's separate from the memory.

00:04:23: So to do any calculation, you have to go fetch data from memory, move it over to the processor, do the math, and then send the result back.

00:04:29: And that trip costs energy.

00:04:31: A huge amount of energy.

00:04:32: In fact, just moving the data often costs more energy than actually doing the calculation.

00:04:38: That's the bottleneck.

00:04:39: Eidman is saying, if we want truly green AI, we have to break that cycle.

00:04:44: It's like living next to your office instead of commuting for two hours.

00:04:48: Exactly that.
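The data-movement point can be made concrete with a rough calculation. This is an editorial aside, not from the episode; the per-operation energies below are approximate 45 nm figures often cited in computer-architecture talks, and should be treated as order-of-magnitude assumptions.

```python
# Rough comparison of compute vs. data-movement energy.
# Figures are approximate, widely cited ballpark values (assumptions).

PJ_FP32_ADD = 0.9     # picojoules for one 32-bit floating-point add
PJ_DRAM_READ = 640.0  # picojoules to fetch 32 bits from off-chip DRAM

ratio = PJ_DRAM_READ / PJ_FP32_ADD
print(f"One DRAM fetch costs roughly {ratio:.0f}x a floating-point add")
```

That ratio is the von Neumann bottleneck in one number: the commute dwarfs the work done at the office.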

00:04:49: And he also talked about fractional model use.

00:04:52: I loved that term.

00:04:53: Me too.

00:04:53: It's the other part of the puzzle.

00:04:55: Right now, if you ask a huge language model a simple question, the whole thing sort of wakes up.

00:04:59: It's like waking up the entire faculty of a university, just to ask them what two plus two is.

00:05:04: Just don't wake up the whole brain for a simple task.

00:05:06: Yes.

00:05:07: Fractional use just means you only activate the specific little slice of the model you actually need.

00:05:13: It's just efficiency by design.
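For readers curious what "activating only a slice" looks like in practice, here is a deliberately toy sketch in the spirit of mixture-of-experts routing. The router rule and the two "experts" are invented for illustration; real systems use a learned gating network over many large sub-networks.

```python
# Toy sketch of "fractional" model use: only the expert relevant to a
# query runs, instead of the whole model. Everything here is invented
# for illustration.

EXPERTS = {
    "math": lambda q: f"math expert handled: {q}",
    "text": lambda q: f"text expert handled: {q}",
}

def route(query: str) -> str:
    # Naive keyword rule standing in for a learned gating network.
    key = "math" if any(c.isdigit() for c in query) else "text"
    return EXPERTS[key](query)

print(route("2 + 2"))          # only the math expert is activated
print(route("rephrase this"))  # only the text expert is activated
```

The energy win comes from what never runs: the unrouted experts draw no compute for this query.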

00:05:15: And speaking of brains, Alex Wilmez highlighted some work from Julie Grolier that takes this idea even further.

00:05:21: This is where we get into almost sci-fi territory with neuromorphic computing.

00:05:26: This is really exciting stuff.

00:05:27: Neuromorphic just means brain-shaped.

00:05:29: They're literally designing hardware that mimics the physical structure of the human brain's synapses.

00:05:36: And the claim is that this kind of architecture could use a hundred to a thousand times less energy.

00:05:42: A thousand times.

00:05:43: That's a game changer.

00:05:44: It really is.

00:05:46: Wilmez called it intelligence that is energy sober by construction.

00:05:50: Instead of forcing our current tech to be efficient, you're building sobriety in from the ground up.

00:05:54: I like energy sober.

00:05:55: It sounds a lot better than power hungry.

00:05:58: And this all connects to what Navin Balani was writing about with proportionality.

00:06:02: Balani just nailed it.

00:06:03: He says efficiency isn't about minimalism.

00:06:05: It's about using the right tool for the job.

00:06:08: You don't use a cannon to kill a fly.

00:06:10: If you have a task that a simple Python script can solve, you shouldn't be spinning up a massive generative AI model to do it.

00:06:17: So we've got the software people calling for smarter code, hardware people calling for smarter chips.

00:06:23: But even if we get a hundred times more efficient, the sheer scale of what's being built is running headfirst into a massive physical wall.

00:06:31: Let's talk about the grid.

00:06:33: This is the part that keeps infrastructure planners up at night.

00:06:37: We had Jacques Grove pointing out that the main bottleneck for AI isn't even chips anymore.

00:06:42: For the past two years, all you heard was, where can I get more GPUs?

00:06:45: Right, the supply chain crisis.

00:06:47: Yeah, they have the chips.

00:06:48: They just can't find a place to plug them in.

00:06:50: Grove highlighted a huge move by Google.

00:06:53: They spent nearly five billion dollars to buy an entire energy company.

00:06:57: An energy company.

00:06:58: Intersect Power.

00:06:59: Yeah.

00:07:00: Yeah.

00:07:00: Think about what that signals.

00:07:01: Google isn't an energy company, but they know the public grid is moving too slowly for their ambitions.

00:07:07: They can't wait five or ten years for the local utility to build a new substation.

00:07:11: So.

00:07:11: They're just doing it themselves.

00:07:13: The tech giants are becoming their own power plants.

00:07:15: And they have good reason to be worried.

00:07:18: Robert Barbieri shared some data from PJM.

00:07:20: That's the biggest grid operator in the US.

00:07:23: The numbers are, well, they're terrifying.

00:07:26: They really are.

00:07:27: PJM is warning that the projected demand from new AI data centers could overwhelm the power supply for sixty seven million people.

00:07:35: Hold on.

00:07:36: Let me just repeat that.

00:07:37: Sixty seven million people.

00:07:38: That's the entire population of the UK.

00:07:41: It's an astronomical amount of draw on the system.

00:07:45: Barbieri's point is that this isn't some far-off risk, it's an immediate reliability risk.

00:07:51: We could be looking at a scenario where you have to choose between keeping the data centers online or keeping the lights on in people's homes.

00:07:57: Rolling blackouts.

00:07:58: It's a real possibility if this isn't managed carefully.

00:08:01: That puts a pretty dark spin on innovation.

00:08:04: But Barbieri did suggest a solution, didn't he?

00:08:06: Something that doesn't involve building more power plants.

00:08:09: He did, and it comes back to efficiency.

00:08:11: He points out that something like seventy to eighty percent of the world's computing capacity is just sitting idle at any given moment.

00:08:17: It's dormant.

00:08:18: So like an Airbnb for computer power.

00:08:21: That's a great way to put it.

00:08:22: Instead of building another huge data center that stresses one part of the grid, why not just network all that idle capacity?

00:08:29: use the computers that are already plugged in and doing nothing.

00:08:32: It's a decentralized approach.

00:08:34: Makes sense on paper, at least.

00:08:36: Now, energy is one physical limit.

00:08:39: But there's another one that gets people even more emotional, and that's water.

00:08:43: And looking at the sources, we have a bit of a conflict here.

00:08:46: We really do.

00:08:46: A genuine data clash.

00:08:48: On one side, you have Paul Canty.

00:08:50: He shared a post trying to... bust the myth of AI water consumption.

00:08:55: What's his argument?

00:08:56: He's pointing to data that suggests AI data centers account for less than point two percent of public water usage in the US.

00:09:03: So his take is basically everyone calm down, agriculture uses way more water.

00:09:08: This is a drop in the bucket.

00:09:09: I mean, zero point two percent does sound tiny.

00:09:11: If I just saw that number, I'd probably move on.

00:09:13: It does.

00:09:14: But then you get Nola Goddard, who counters this argument perfectly.

00:09:18: She says that looking at global or national percentages is completely misleading because water is a strictly local resource.

00:09:25: Right.

00:09:26: You can't ship water across the country like you can electricity.

00:09:29: Exactly.

00:09:29: It doesn't matter one bit if the national average is tiny if, in your specific drought-stricken town, the new data center is sucking up forty percent of the local reservoir.

00:09:40: Goddard says being a tiny share globally is irrelevant to the community that might lose its drinking water.

00:09:46: That's a really crucial distinction.

00:09:47: It's about local stress, not global averages.

00:09:50: And the trend line is not good.

00:09:53: Anushka Kalra pointed out that by twenty twenty-eight, AI data center water consumption could increase by eleven times.

00:10:00: Eleven times.

00:10:01: Yes.

00:10:02: So even if it's a small number now, that growth curve is practically vertical.

00:10:06: Wow, okay, so it's a heavy topic, but we did see some genuinely creative solutions popping up.

00:10:10: I really like the example George West got shared about the deal between AWS and Rio Tinto.

00:10:15: Oh, this is a classic circular economy play and it's brilliant.

00:10:18: It's a trade-off.

00:10:19: AWS builds data centers which need a massive amount of copper wiring.

00:10:22: Rio Tinto mines copper.

00:10:24: So AWS agrees to buy low carbon copper from Rio Tinto.

00:10:29: And in return, Rio Tinto agrees to use AWS's AI tools to make their mining operations more efficient and less carbon-intensive.

00:10:38: It's, I'll give you the tech to mine better, you give me the metal to build the tech.

00:10:43: That's so smart.

00:10:44: It locks in the supply chain for both of them.

00:10:46: Then you have Justin Rice reporting on what's happening up in British Columbia.

00:10:50: BC Hydro, the utility there, has basically stopped giving away power connections.

00:10:55: They are making new data centers audition for grid access.

00:10:58: Audition.

00:10:59: What does that even mean?

00:11:00: It means you don't just get to plug in because you have the money.

00:11:03: You have to prove you're efficient.

00:11:05: You have to show them your plans for heat recovery like using the waste heat from your servers to warm up nearby buildings or swimming pools.

00:11:11: So if you can't prove you're a good citizen on the grid,

00:11:14: you don't get access.

00:11:16: It's forcing innovation through regulation.

00:11:18: That's fascinating.

00:11:20: And speaking of innovative building, we have to talk about Carl Rabe's post on wooden data centers.

00:11:24: This was one of my favorites from the past two weeks.

00:11:28: We always picture data centers as these concrete bunkers, right?

00:11:32: Or steel warehouses.

00:11:34: But Rabe is making the case for using engineered wood.

00:11:36: Wood!

00:11:37: For a high-tech building full of hot electronics?

00:11:40: I mean, isn't that a fire risk?

00:11:42: You'd think so, but engineered timber is surprisingly fire resistant.

00:11:46: But the main point is the carbon.

00:11:48: Concrete and steel have massive embodied carbon.

00:11:52: Just producing them releases tons of CO2.

00:11:55: Wood, on the other hand, stores carbon.

00:11:57: So instead of the building being a source of carbon, the building itself becomes a carbon sink.

00:12:01: Precisely.

00:12:02: You are sequestering carbon in the very frame of the facility.

00:12:05: It completely changes the equation.

00:12:07: I love that.

00:12:08: A high-tech wooden fortress storing the world's data.

00:12:11: It feels like this perfect blend of the past and the future.

00:12:14: Okay, so let's pull back for a second.

00:12:16: We've covered mindset, architecture, energy, water, and building materials, but there's a missing piece.

00:12:21: How do we know if any of this is actually working?

00:12:24: That brings us to our third thing.

00:12:27: Measurement.

00:12:28: Right.

00:12:28: You can't manage what you don't measure.

00:12:30: And for the longest time, green IT was the Wild West.

00:12:33: A company would say, we're green.

00:12:35: And you had no idea if that meant they bought renewable energy credits or if they just, you know, recycled their soda cans.

00:12:41: But these past two weeks saw some really big steps towards standardization.

00:12:45: Wilco Bergraf reported on something called the SCI for AI.

00:12:49: And this is huge.

00:12:51: SCI stands for software carbon intensity.

00:12:54: And the key thing is, it's a specification that has now been officially ratified.

00:12:59: It's a standard.

00:13:00: It gives you a formula to calculate the carbon emissions of a piece of software.

00:13:04: So it's a ruler.

00:13:05: It's a ruler, exactly.

00:13:06: It means we can finally stop guessing.

00:13:08: We can compare model A to model B and say, objectively, this one is more carbon intensive per task.
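As an editorial aside for readers who want to try the "ruler" themselves: the ratified SCI specification defines software carbon intensity as (E × I) + M per functional unit R, where E is energy consumed, I is the grid's carbon intensity, M is embodied emissions attributed to the software, and R is the functional unit (per API call, per user, and so on). A minimal sketch, with made-up example values:

```python
# Minimal sketch of the SCI formula: SCI = (E * I + M) per R.
# E: energy consumed (kWh), I: grid carbon intensity (gCO2e/kWh),
# M: embodied emissions attributed to the software (gCO2e),
# R: number of functional units. Example inputs are invented.

def sci(e_kwh: float, i_g_per_kwh: float, m_g: float, r_units: float) -> float:
    """Return grams of CO2e per functional unit."""
    return (e_kwh * i_g_per_kwh + m_g) / r_units

# e.g. 2 kWh at 400 gCO2e/kWh plus 100 g embodied, spread over 10,000 calls
print(f"{sci(2.0, 400.0, 100.0, 10_000):.2f} gCO2e per call")
```

Because R is a rate, SCI rewards genuine efficiency per task rather than offsetting, which is exactly what makes it usable as a comparison ruler.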

00:13:14: And once you have a ruler, you can build tools.

00:13:16: John Rid from DreamPixie released a cloud region scorecard that I found incredibly practical.

00:13:22: It's probably the single easiest win for any company listening to this.

00:13:26: Rid's data shows that the carbon intensity of the cloud varies wildly depending on where the physical servers are located.

00:13:31: You actually rank them, right?

00:13:33: Yeah.

00:13:34: A D-rated cloud region is ten to fifteen percent more carbon intensive than an A-plus region.

00:13:40: So if I'm a CTO, I can cut my carbon footprint by fifteen percent just by clicking a different option in a drop-down menu when I set up my server.

00:13:48: Zero engineering effort, zero extra cost, just... geography.

00:13:53: It allows companies to make these immediate low-effort decarbonization choices.
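The "pick a greener region" move really is this mechanical. The sketch below is illustrative only: the region names and intensity figures are invented, and real scorecards (like the one discussed, which grades regions from A+ down to D) publish their own numbers.

```python
# Sketch of choosing the lowest-carbon cloud region from a scorecard.
# Region names and gCO2e/kWh figures are hypothetical placeholders.

REGIONS = {
    "region-hydro-1": 30,   # mostly hydro-powered grid
    "region-mixed-2": 250,  # mixed generation
    "region-coal-3": 700,   # coal-heavy grid
}

greenest = min(REGIONS, key=REGIONS.get)
print(f"Deploy to {greenest} ({REGIONS[greenest]} gCO2e/kWh)")
```

One dictionary lookup is the whole decarbonization decision, which is why this counts as a zero-engineering-effort win.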

00:13:57: The transparency is spreading everywhere.

00:14:00: Even the legal world is getting in on this.

00:14:03: Christian Barch from the law firm Bird & Bird mentioned something really interesting.

00:14:07: The lawyers are coding now.

00:14:08: They actually built a Green Claims AI scanner.

00:14:11: Okay, what does that do?

00:14:12: It scans companies' public marketing materials and it checks them against regulations to spot potential greenwashing.

00:14:18: It's a tool to help their clients avoid getting sued for making false sustainability claims.

00:14:23: That tells you everything.

00:14:25: When the lawyers are building AI tools to police sustainability claims, you know the regulatory hammer is about to fall.

00:14:31: It's the sign that sustainability is moving from a nice to have PR move to a core legal compliance issue.

00:14:39: And that completely changes the role of the people in charge.

00:14:43: Anna Lerner-Nesbitt had this great insight about the chief sustainability officer, the CSO.

00:14:48: She said they need to stop being the moral compass.

00:14:50: Right.

00:14:51: The moral compass phase is over.

00:14:53: We don't just need a cheerleader for the planet and the boardroom.

00:14:57: She argues the CSO now needs to be... the systems architect.

00:15:01: Systems architect.

00:15:02: Yes, they need to be the CTO's best friend and the CFO's closest ally.

00:15:07: They need to be able to look at a server layout and explain why it's both a financial risk and a carbon risk.

00:15:12: They have to speak the language of engineering and money, not just morality.

00:15:16: It's about full integration.

00:15:18: You can't be the person in the corner shouting about trees anymore.

00:15:21: You have to be in the server room explaining why the von Neumann bottleneck is hurting quarterly earnings.

00:15:25: Exactly.

00:15:26: It's the only way this really moves forward.

00:15:28: So pulling all of this together, we've gone from casual AI use to massive grid constraints, from a standardized ruler to wooden data centers.

00:15:36: What's the big picture here?

00:15:38: I think the big picture is maturity.

00:15:40: The industry is finally growing up.

00:15:42: We're moving away from slogans and into hard engineering.

00:15:45: We're seeing standardized measurements like SCI, economic models like the cloud scorecards, and physical adaptations like tech companies becoming energy companies.

00:15:54: It's becoming a serious, quantified business discipline.

00:15:57: It feels like we're finally treating digital waste as real waste.

00:16:01: We are, but there's always a but.

00:16:04: Here we go.

00:16:05: This deep dive wouldn't be complete without one last provocative thought for you to chew on.

00:16:10: And this one comes from Will Nordberg.

00:16:12: He brings up the idea of agentic AI.

00:16:15: Agentic AI, so AI agents that can act on their own.

00:16:18: They don't wait for a prompt.

00:16:19: They just go out and do tasks.

00:16:20: Right.

00:16:21: We're heading towards a world where you might say, plan my vacation, and an army of AI agents just goes out and negotiates prices, books flights, checks calendars, and interacts with other agents to get it done.

00:16:32: OK.

00:16:33: Nordberg asks a really terrifying question.

00:16:36: If these agents increase our production speed by a hundred times, are we also about to multiply our digital waste and energy consumption by a hundred times?

00:16:44: That is a haunting thought.

00:16:46: If I have an army of AI agents working for me while I sleep, are they also burning down the grid while I sleep?

00:16:52: It's the classic paradox of efficiency.

00:16:55: When you make it cheaper and faster to do something, history shows we don't do the same amount for less.

00:17:01: We just end up doing a lot more of it.

00:17:03: Well, on that slightly terrifying but very important note, I think we'll wrap up this deep dive.

00:17:08: It's definitely something to think about before you hit enter on your next prompt.

00:17:11: If you enjoyed this episode, new episodes drop every two weeks.

00:17:14: Also check out our other editions on cloud, digital products and services, artificial intelligence, ICT and tech insights, health tech, and defense tech.

00:17:24: Thanks for listening and keep questioning the cost of your compute.

00:17:27: And don't forget to subscribe.

00:17:28: We'll see you next time.
