Best of LinkedIn: AWS re:Invent 2025

Show notes

We curate the most relevant posts about Digital Transformation & Tech on LinkedIn and regularly share key takeaways.

This edition provides a comprehensive review of trends identified following AWS re:Invent 2025, which focused heavily on the practical deployment of applied AI coupled with rigorous enterprise execution. This summary, derived from community posts, positions Bedrock as the standard production path for developing GenAI features, emphasising tight integration with SDLC practices to lift developer productivity. Foundational services like SageMaker govern MLOps, while EKS and Lambda remain preferred compute choices for modern event-driven architectures. A significant portion addresses the critical nature of identity and data guardrails, treating zero-trust security as mandatory for successful GenAI adoption. The document details how partners and core services like Redshift and DynamoDB align to support scalable, secure AI applications across various industries.

This podcast was created via Google NotebookLM.

Show transcript

00:00:00: provided by Thomas Allgaier and Frennis based on the most relevant posts on LinkedIn about AWS re:Invent twenty twenty-five.

00:00:07: Frennis enables enterprises with market, technology, and competitive intelligence for portfolio and strategy development.

00:00:14: Welcome to the deep dive.

00:00:16: Every year re-invent is a spectacle of massive scale.

00:00:21: But what makes twenty twenty five different is the shift in focus.

00:00:24: It wasn't about if a technology works anymore.

00:00:27: It was really about how you put it to work.

00:00:29: quickly and securely.

00:00:30: Exactly.

00:00:30: And for professionals, you know, the noise of Las Vegas can be deafening.

00:00:34: So our mission here is to distill the core strategy shift.

00:00:37: Right.

00:00:38: We've extracted the most crucial insights from all the high signal content shared on LinkedIn, and the takeaway is, well, it's undeniable.

00:00:45: This year was the reckoning where applied AI met disciplined execution.

00:00:49: That phrase, disciplined execution, that is key.

00:00:51: I think Alison Conrad noted the industry is officially past experimentation.

00:00:55: Yeah, way past

00:00:55: it.

00:00:56: We're deep into a period of true reinvention.

00:00:58: We are looking at measurable outcomes; that's becoming the new enterprise standard, not just a nice-to-have.

00:01:04: So we've organized the most important trends for you across three big interconnected themes.

00:01:09: that really defined re-invent.

00:01:11: twenty-twenty-five.

00:01:12: Okay.

00:01:13: First, the massive push toward generative AI in production, specifically agentic AI.

00:01:20: Second, the necessary complement, security and compliance in this new AI era.

00:01:26: Of course.

00:01:27: And finally, the fundamental services, the compute, serverless, and DevOps that enable all this speed and scale.

00:01:34: Let's dive right into that first theme then because it absolutely dominated the conversation.

00:01:38: The term agentic AI was, well, it was inescapable.

00:01:42: Everywhere.

00:01:42: When leaders like Carla Wong and Rian VanVeldhausen posted their takeaways, the focus was all on scaling impact.

00:01:49: And to clarify for you, the listener, we're talking about more than just a chatbot here.

00:01:53: Right.

00:01:53: It's not just answering questions.

00:01:54: No.

00:01:55: Agentic AI is about autonomous systems that execute complex, multi-step tasks.

00:02:00: Think of a personal assistant that doesn't just find an answer, but uses multiple tools to complete a whole business process.

00:02:06: So it's a true workflow executor, not just a static LLM.
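
To make that "multiple tools" idea concrete, here is a minimal, hedged sketch of a tool-use loop with the Bedrock Converse API in Python (boto3). The model ID, the lookup_order tool, and its canned return value are illustrative placeholders, not anything discussed at re:Invent:

```python
# A minimal sketch (not any vendor's actual implementation) of an agent-style
# tool-use loop with the Bedrock Converse API. The model ID, the lookup_order
# tool, and its canned return value are illustrative placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime")

tool_config = {
    "tools": [{
        "toolSpec": {
            "name": "lookup_order",
            "description": "Fetch the status of a customer order by ID.",
            "inputSchema": {"json": {
                "type": "object",
                "properties": {"order_id": {"type": "string"}},
                "required": ["order_id"],
            }},
        }
    }]
}

def lookup_order(order_id: str) -> dict:
    # Placeholder for a call into an internal order system.
    return {"order_id": order_id, "status": "shipped"}

messages = [{"role": "user", "content": [{"text": "Where is order 1234?"}]}]

while True:
    response = bedrock.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model
        messages=messages,
        toolConfig=tool_config,
    )
    message = response["output"]["message"]
    messages.append(message)
    if response["stopReason"] != "tool_use":
        break  # the model returned a final answer instead of a tool call
    # Run each requested tool and hand the results back to the model.
    results = []
    for block in message["content"]:
        if "toolUse" in block:
            call = block["toolUse"]
            results.append({"toolResult": {
                "toolUseId": call["toolUseId"],
                "content": [{"json": lookup_order(**call["input"])}],
            }})
    messages.append({"role": "user", "content": results})

print(message["content"][0]["text"])
```

The point is the loop: the model decides when to call a tool, the application executes it, and the result goes back until the model produces a final answer.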

00:02:10: And if that's the new standard, the question is, what's the default path to get there?

00:02:14: And the answer from Reinvent was overwhelmingly Bedrock.

00:02:18: No surprise there.

00:02:19: Absolutely.

00:02:20: Bedrock is emerging as that default path to production for these assistant and agent patterns.

00:02:26: And not just because it hosts models.

00:02:28: Why then?

00:02:29: Because of its deep emphasis on integrating the entire software development lifecycle, the SDLC tooling. It's about making it easy for developers to govern, deploy, and monitor these complex agents, which, you know, just dramatically boosts productivity.
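
As a rough, hedged illustration of what that production path can look like once an agent has been built, tested, and aliased in Amazon Bedrock Agents, here is a minimal boto3 sketch of invoking it; the agent ID, alias ID, and session ID below are placeholders:

```python
# Hedged sketch only: invoking an agent that has already been built and
# aliased in Amazon Bedrock Agents. Agent ID, alias ID, and session ID
# are placeholders.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")

response = agent_runtime.invoke_agent(
    agentId="AGENT_ID",              # placeholder
    agentAliasId="AGENT_ALIAS_ID",   # placeholder
    sessionId="session-001",
    inputText="Summarise the open support tickets for account 42.",
)

# The agent's answer arrives as an event stream of chunks.
answer = ""
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        answer += chunk["bytes"].decode("utf-8")
print(answer)
```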

00:02:43: And we saw fantastic concrete evidence of this impact in the partnership with Lyft.

00:02:48: That's the perfect example of real transformation.

00:02:51: Yes,

00:02:51: Shriella Prolu detailed that one.

00:02:53: They built an intent agent using Claude through Amazon Bedrock.

00:02:57: And this handles the really complicated support requests.

00:03:00: Exactly.

00:03:00: And the numbers are just staggering.

00:03:02: They reported an eighty seven percent reduction in average resolution time.

00:03:05: Eighty seven percent.

00:03:06: Wow.

00:03:06: And on top of that, a seventy percent growth in driver usage in twenty twenty five.

00:03:11: That kind of reduction is phenomenal.

00:03:13: It could completely shift the economics of customer support.

00:03:17: But if the models and platforms are this good, what's the new bottleneck for the average enterprise?

00:03:22: That is the critical question.

00:03:24: And the answer, according to the builders on the ground, is context.

00:03:28: Context.

00:03:29: Ginny Sahi had a great line.

00:03:31: She emphasized that AI fails because the context is weak, not necessarily because the model itself is

00:03:36: weak.

00:03:37: That makes so much sense.

00:03:38: The consensus is that the new frontier isn't really the model.

00:03:42: It's context engineering, that tricky process of reliably connecting the model to your own proprietary up-to-date data.

00:03:51: Jugal Ancolio was very clear on this.

00:03:53: Context is the whole game

00:03:54: now.
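
One concrete flavour of context engineering on AWS is retrieval over your own documents. Here is a minimal sketch, assuming a Bedrock Knowledge Base already exists; the knowledge base ID and model ARN are placeholders:

```python
# A minimal sketch of grounding answers in your own documents with a Bedrock
# Knowledge Base. The knowledge base ID and model ARN are placeholders; the
# knowledge base itself (data source, embeddings, sync) must already exist.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")

response = agent_runtime.retrieve_and_generate(
    input={"text": "What is our refund policy for enterprise customers?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB_ID",  # placeholder
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/"
                        "anthropic.claude-3-5-sonnet-20240620-v1:0",
        },
    },
)

print(response["output"]["text"])
# The retrieved source passages come back alongside the answer in 'citations'.
```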

00:03:54: So the competitive advantage moves from who has the biggest model to who has the cleanest, most accessible data layer for their agents, which naturally creates a huge demand for new talent.

00:04:05: How is AWS addressing that skills gap?

00:04:08: Well,

00:04:08: they're responding by updating how skills get validated.

00:04:11: Mabuya Magagala highlighted new micro-credentials like the AWS Agentic AI demonstrated badge.

00:04:17: Oh, what's different about that?

00:04:18: This

00:04:18: isn't a multiple-choice test.

00:04:20: It validates a builder's skills by focusing on solving real-world timed problems.

00:04:25: It's about demonstrated capability, which is what hiring managers actually want to see.

00:04:29: A

00:04:29: very smart, practical approach.

00:04:32: And speaking of practical tools, Amazon Q is everywhere, positioned as providing practical help for business users and engineers.

00:04:39: Right.

00:04:39: And we saw partners racing to smooth that path for enterprise deployment, too.

00:04:43: Like

00:04:44: who?

00:04:44: Trianz, for instance.

00:04:45: They showcased their Concierto platform.

00:04:47: I think Prashant Pavaraju mentioned their promise of a revolutionary, get this, zero migration path to enterprise agentic AI.

00:04:56: A zero migration path sounds ambitious, almost too good to be true, given how complex enterprise data is.

00:05:03: It is ambitious, but the fact that vendors are even attempting to eliminate that complexity just underscores how urgent the demand has become, which brings us very neatly to our second massive theme.

00:05:14: Security.

00:05:14: Right.

00:05:15: When you accelerate adoption at that kind of pace, security has to accelerate even faster.

00:05:19: Security first patterns, disciplined execution, they're no longer optional.

00:05:23: They're the prerequisite.

00:05:24: Exactly.

00:05:25: Identity and data guardrails for Gen AI are now, without a doubt,

00:05:30: table stakes.

00:05:31: It's a whole new threat vector to think about.

00:05:32: We're not just securing endpoints.

00:05:34: We're securing prompts.

00:05:35: We're securing the output.

00:05:37: And the major vendors are stepping up.

00:05:38: Fernando Cardoso detailed a huge announcement from Trend Micro.

00:05:42: Okay.

00:05:43: The launch of the Trend Vision One AI security package.

00:05:47: This is full stack AI risk management.

00:05:49: And crucially, it includes a continuous AI scanner.

00:05:53: Yeah, what does

00:05:54: that do?

00:05:54: It's designed to detect these new AI native threats, things like prompt injection, data poisoning, hallucination risks, all the way from build to runtime.

00:06:03: That

00:06:04: runtime focus is so important.

00:06:05: That's where the real world attacks happen.

00:06:07: For

00:06:08: sure.

00:06:08: And to get the community up to speed on this, we even heard about some, well, surprisingly engaging...

00:06:13: training methods.

00:06:13: I think I know what you're going

00:06:14: to say.

00:06:15: I love this example.

00:06:16: Alara Barancil ran a Bedrock Guardrails Builders' Fair session where they turned AI security into an interactive beer-pong-style game.

00:06:23: That's brilliant.

00:06:24: Isn't it?

00:06:25: Each cup represented a real guardrail, like content filters or sensitive information filters.

00:06:31: It takes this complex governance idea and makes it memorable.

00:06:35: It's genius.

00:06:36: Making compliance engaging is half the battle.
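
For readers who want to see the two guardrail types just mentioned in code, here is a hedged boto3 sketch of creating a guardrail with a content policy and a sensitive-information (PII) policy; the name, messages, and filter choices are placeholders, and a real deployment would version the guardrail and attach it to model or agent invocations:

```python
# Hedged sketch of the two guardrail types mentioned above: a content policy
# (including a prompt-attack filter) and a sensitive-information (PII) policy.
# Names and blocked-response messages are placeholders.
import boto3

bedrock = boto3.client("bedrock")

guardrail = bedrock.create_guardrail(
    name="support-agent-guardrail",
    blockedInputMessaging="Sorry, I can't help with that request.",
    blockedOutputsMessaging="Sorry, I can't share that information.",
    contentPolicyConfig={
        "filtersConfig": [
            # Prompt-attack filtering applies to inputs only.
            {"type": "PROMPT_ATTACK", "inputStrength": "HIGH", "outputStrength": "NONE"},
            {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"},
        ]
    },
    sensitiveInformationPolicyConfig={
        "piiEntitiesConfig": [
            {"type": "EMAIL", "action": "ANONYMIZE"},
            {"type": "US_SOCIAL_SECURITY_NUMBER", "action": "BLOCK"},
        ]
    },
)

print(guardrail["guardrailId"], guardrail["version"])
```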

00:06:39: But security isn't just about protection.

00:06:41: It's also about resilience and recovery when things go wrong.

00:06:46: Absolutely.

00:06:47: We saw a strong link there.

00:06:48: Kynicle announced that Cutover is an inaugural launch partner for the new AWS Resilience software competency.

00:06:55: And their focus is on runbooks.

00:06:56: AI-powered runbooks, yeah.

00:06:58: If you think about it, as systems get more complex with all these agents, you need really sophisticated AI-assisted tools to reduce downtime and confidently recover when there's a disruption.

00:07:07: So AI isn't just the thing we have to secure.

00:07:10: It's also the tool we use to become more resilient.

00:07:12: Exactly.

00:07:13: We also saw a lot of focus on the underlying infrastructure.

00:07:16: Zero trust principles are now being strongly connected to securing the serverless and container estates that actually host these agents.

00:07:23: And the data itself, I assume.

00:07:25: And the data itself, protecting those high-performance AI data sets is critical.

00:07:29: You see specialized solutions like file security for Amazon FSx for NetApp ONTAP.

00:07:36: That data is the secret sauce.

00:07:38: It has to be locked down.

00:07:40: That moves us perfectly into our third theme.

00:07:42: The foundational tech that makes all this scale possible.

00:07:46: The speed and complexity of agentic AI demand a whole new level of compute.

00:07:51: This

00:07:51: is where the rubber meets the road.

00:07:52: If theme one is the application and theme two is governance, theme three is the enablement layer.

00:07:57: And platforms like EKS and Lambda are doubling down as the preferred scalable paths for these microservice patterns.

00:08:04: Speaking of Lambda, there was an announcement that really addresses a long-standing trade-off in serverless.

00:08:09: A "why not both?"

00:08:11: moment, as Modi put it.

00:08:13: You're

00:08:13: talking about Lambda managed instances.

00:08:15: Exactly.

00:08:15: This is a huge deal for performance.

00:08:17: Pratamesh Rangdal detailed that it gives you the simplicity and serverless model of Lambda, but with an EC2-level performance profile.

00:08:24: And why now?

00:08:25: Because agentic AI workflows often involve really complex, low-latency, high-throughput event processing, you need both that serverless simplicity and robust performance.

00:08:35: This bridges the gap.

00:08:37: So no more choosing between performance and operational simplicity for those high-demand AI apps.
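
The event-driven shape itself doesn't change. Here is an illustrative plain Python Lambda handler for one agent step triggered from SQS; whether it runs on the default fleet or on the new managed instances option is a deployment-time configuration choice, and the queue wiring and model ID below are placeholders:

```python
# Illustrative only: a plain Python Lambda handler for one event-driven agent
# step, triggered here by SQS. Whether it runs on the default fleet or on the
# new managed instances option is a deployment choice, not a code change.
# The queue wiring and model ID are placeholders.
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def handler(event, context):
    for record in event["Records"]:                 # one SQS message per record
        task = json.loads(record["body"])
        response = bedrock.converse(
            modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder
            messages=[{"role": "user", "content": [{"text": task["prompt"]}]}],
        )
        print(response["output"]["message"]["content"][0]["text"])
    return {"statusCode": 200}
```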

00:08:42: What about developer experience?

00:08:44: We saw some improvements there, too.

00:08:45: Absolutely.

00:08:46: In the container space, new EKS capabilities are offloading the management headache for tools like Argo CD, ACK, and kro.

00:08:54: So developers can just focus on code.

00:08:56: That's the idea.

00:08:57: Vijay Kodon pointed out that by managing these tools for them, AWS lets developers genuinely focus on the business logic.

00:09:04: And for any engineer who's ever tried to debug, IAM permissions.

00:09:07: Oh,

00:09:07: don't get me started.

00:09:08: The launch of IAM policy autopilot is a game changer.

00:09:11: It

00:09:12: really is.

00:09:12: The tool just generates the IAM policies automatically by analyzing your code.

00:09:16: It figures out the minimum required permissions.

00:09:19: Which

00:09:19: cuts down on risk and developer friction.

00:09:21: A huge win.

00:09:22: A major win for simplifying that security posture we talked about earlier.
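
To be clear about what "minimum required permissions" means in practice, here is a hand-written illustration, not output from IAM Policy Autopilot, of the kind of least-privilege policy such tooling aims to produce for code that only reads one DynamoDB table; the table ARN, account ID, and policy name are placeholders:

```python
# Not output from IAM Policy Autopilot; just a hand-written illustration of the
# kind of least-privilege policy such tooling aims to produce for code that
# only reads one DynamoDB table. Table ARN and policy name are placeholders.
import json
import boto3

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query"],
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
    }],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="orders-read-only",
    PolicyDocument=json.dumps(policy_document),
)
```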

00:09:26: OK, let's pivot to data architecture.

00:09:30: Anglicous Glows highlighted the increasing importance of certain data formats.

00:09:35: Yes, Iceberg tables.

00:09:37: They are really cementing their place as a game changer in the lake house architecture.

00:09:41: For our listener, what does that mean in practical terms?

00:09:44: It brings governed transactional flexibility to your data lake.

00:09:49: It allows different analytical engines to access the same data reliably, which is crucial for modern AI apps that need that kind of flexible backbone, often still powered by proven services like Redshift and DynamoDB.
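
Here is a minimal sketch of that multi-engine idea: using Athena to create an Apache Iceberg table that other engines can then work with at the same S3 location. The database, table, and S3 paths are placeholders:

```python
# A minimal sketch of the multi-engine idea: creating an Apache Iceberg table
# through Athena so that other engines can work with the same governed table.
# The database, table name, and S3 locations are placeholders.
import boto3

athena = boto3.client("athena")

ddl = """
CREATE TABLE analytics.ride_events (
  event_id string,
  rider_id string,
  event_time timestamp)
LOCATION 's3://example-lakehouse/ride_events/'
TBLPROPERTIES ('table_type' = 'ICEBERG')
"""

athena.start_query_execution(
    QueryString=ddl,
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
```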

00:10:01: We also saw that modernization efforts are now explicitly tied to getting ready for AI, with the announcement of AWS Transform.

00:10:08: which was widely shared.

00:10:10: For

00:10:10: full-stack Windows modernization.

00:10:12: Right.

00:10:12: It's all about helping enterprises decouple their legacy apps and data.

00:10:16: Which, as we established, is the prerequisite for feeding that good, strong context into these new agentic use cases.

00:10:22: And finally, let's look outside the AWS bubble for a second.

00:10:25: There was a key networking announcement highlighted by Zim IaaS.

00:10:28: The preview of AWS Interconnect multi-cloud.

00:10:31: That is a huge strategic move.

00:10:33: It is.

00:10:34: It offers a managed connection between AWS and other major providers like Google Cloud.

00:10:39: It's a first-of-its-kind managed connection for high-speed private networking.

00:10:44: And that's just acknowledging the reality that most enterprises are multi-cloud.

00:10:48: Exactly.

00:10:48: It simplifies that interconnection, which is essential for complex distributed apps, including, you know, federated AI systems.

00:10:55: Which all feeds back into the health of the whole ecosystem.

00:10:58: Jay McBain cited comments from Matt Garman, noting that partners are more central than ever.

00:11:03: And the multiplier effect is staggering.

00:11:06: Partners are potentially driving an incredible seven dollars in revenue for every one dollar spent on AWS.

00:11:13: The seven to one ratio.

00:11:14: It just tells you the partner opportunity is truly as big as it's ever been.

00:11:19: All these new tools, AI agents, security guardrails, managed compute, are creating massive business opportunities for the whole ecosystem.

00:11:26: It's a really compelling picture of integration and scale.

00:11:29: And that brings us to our final thought for you to consider.

00:11:32: So with all these new tools, Bedrock AgentCore, IAM Policy Autopilot, Kiro, all focused on making developers, as Vaneswari Supramani put it, too productive.

00:11:44: too productive.

00:11:45: We have to ask, if the bottleneck for delivery shifts entirely away from writing code and toward defining context, setting guardrails and enforcing governance, how should organizations recalibrate their internal measures of success?

00:11:58: It's a complete shift in what we define as high value work.

00:12:01: A profound question worth wrestling with as these technologies mature.

00:12:05: If you enjoyed this deep dive, new episodes drop every two weeks.

00:12:09: Also check out our other editions on cloud insights, sustainability in green ICT, digital products and services, health tech, defense tech, ICT and tech insights and artificial intelligence.

00:12:20: Thank you for joining us and we'll catch you on the next deep dive.

00:12:23: Be sure to subscribe so you don't miss a thing.
