286: I Can Sum Up 2024 - AI AI AI AI and uhh… ML

Episode 286 January 02, 2025 00:59:39

Show Notes

Welcome to episode 286 of The Cloud Pod – where the forecast is always cloudy! Welcome to the final show of 2024! We thank you for joining us on our cloud journey over the past year. During this last show of the year, we look back on all the tech that changed our jobs and lives, and make predictions for an AI-filled 2025. Join Justin, Jonathan, Ryan, and Matthew as they look forward to even more discussions about undersea cables. Happy New Year!

Titles we almost went with this week:

A big thanks to this week’s sponsor:

We’re sponsorless! Want to get your brand, company, or service in front of a very enthusiastic group of cloud news seekers? You’ve come to the right place! Send us an email or hit us up on our Slack channel for more info.

General News

00:31   2024 Predictions Look Back

Matt

02:07 Matthew – “How is it simpler and easier? I think that there are more ways to run it. The general public has an easier way to access it. And they are simpler, as Justin said; they are becoming easier and more efficient and better to use for the average user. So I know that I talk to many people that I work with now, and just in general, people that are not in tech, which I feel like a year ago these things were targeted more at tech people.”

Jonathan

07:44 Jonathan – “Well, there is a religion called the First Church of Artificial Intelligence, but it’s been around for longer than this year. I think it’s like five, six years old at this point. So that’s kind of cheating.”

Ryan

10:28 Ryan – “I mean, agentic AI is something that’s been rolled out in a lot of companies. I know in my day job, it’s been rolled out. I hope to see this get even stronger and more obvious, just because I think that, you know, the days of searching through thousands of documents or the one, you know, unmaintained team page that someone built three years ago when they were new are over. And so I’d like to see this continue.”

Justin

  1. LLMs will hit the trough of disillusionment, either on cost, environmental impact, or people realizing how limited these models are
  2. Another AI model architecture other than Transformer-based
  3. We will see another large defector from Public Cloud (not 37 Signals or X/Twitter)

13:26 Justin – “I feel partially vindicated that I was sort of right; I just thought we’d be in the trough a little faster, but maybe it’s coming still. I don’t know. They’re innovating pretty quickly. I don’t think they’ll get there, but definitely environmental is going to become a big, big conversation around AI.”

17:02  Favorite Story of 2024

Do you remember that Gemini wasn’t a thing in 2023? It feels like it’s been around forever. 2024 saw some serious jumps forward in tech and innovation, as well as a lot of quality of life improvements overall. Here’s a quick rundown of our favorite articles from the past year:

Ryan

Introduction of RAG into the AI models

https://aws.amazon.com/blogs/aws/knowledge-bases-for-amazon-bedrock-now-supports-amazon-aurora-postgresql-and-cohere-embedding-models/
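For listeners curious what Ryan's pick actually looks like in practice, here is a minimal, self-contained sketch of the retrieval-augmented generation (RAG) pattern he describes: retrieve the most relevant chunks of your own data, then ground the model's prompt in them instead of baking everything into the model. This is purely illustrative — a real setup would use a proper embedding model and vector store (e.g. the Cohere embeddings and Aurora PostgreSQL support in the Bedrock announcement linked above); the term-count "embedding" and all names below are our own toy stand-ins.

```python
# Toy RAG sketch: term-count vectors stand in for real embeddings so it runs anywhere.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-count vector (stand-in for a real model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class TinyVectorStore:
    """Stand-in for a real vector database (pgvector, OpenSearch, etc.)."""
    def __init__(self):
        self.docs = []  # list of (text, vector) pairs

    def add(self, text: str):
        self.docs.append((text, embed(text)))

    def top_k(self, query: str, k: int = 2):
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

def build_prompt(question: str, store: TinyVectorStore) -> str:
    """The RAG step: retrieve relevant chunks, then ground the LLM prompt in them."""
    context = "\n".join(store.top_k(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

store = TinyVectorStore()
store.add("Humboldt is the first cable route between South America and Asia-Pacific.")
store.add("CloudFront VPC origins let CloudFront reach origins on private IPs.")
store.add("Karpenter automates node provisioning for Kubernetes clusters.")

# The prompt handed to the LLM now carries your local data as context.
print(build_prompt("What is the Humboldt cable?", store))
```

The point Ryan makes about "local and smaller context windows" is exactly what `top_k` buys you: only the few most relevant chunks go into the prompt, so the model doesn't need your whole corpus in its weights or its window.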

Matthew

https://aws.amazon.com/about-aws/whats-new/2024/11/amazon-cloudfront-vpc-origins/ 

Jonathan

OpenAI Sam Altman Drama

Justin

Announcing Humboldt, the first cable route between South America and Asia-Pacific

Other 2024 things of note:

32:11   2025 Predictions

Ryan

Matthew

Jonathan

Justin

45:01 Justin – “I just feel like their innovation curve has definitely slowed down where I still see Claude and Gemini and Alibaba you mentioned. They’re all innovating quite a bit and I would not be shocked to see the market shift.”

53:36 Jonathan – “That was kind of, that was going to be one of my predictions, but I couldn’t really quantify it in a way which would be measurable to win the point. I think there’s obviously a need for tons of data. I’m not going to say that we’re running out of data exactly, although the quality is a bit questionable, but I think access to data is going to be super important. And I didn’t know how to turn that into a prediction, but like when I, when I go to Safeway and buy my groceries, I want a way to get my, my like receipt electronically, so that I can plug that into an AI. So then I go to do my groceries, my AI will know what’s in my pantry. And if I say, what can I cook to eat today? And it can be like, well, you’ve got this stuff. Why don’t you make this? I just think there’s so many places where access to data would make life easier for a consumer. And right now, it’s very asymmetric. Safeway or Albertsons have access to all the data. They can market the shit out of me because they know exactly what I buy, when I buy – patterns of all kinds of stuff, but I have none of that. So I want to see some of that asymmetry go away and I want access to the data that other people have about me.”

50:27 And since we suck at predictions, here are other experts who may also suck:

 

Closing

And that is the week in the cloud! Visit our website, the home of The Cloud Pod, where you can join our newsletter, Slack team, send feedback, or ask questions at theCloudPod.net, or tweet at us with hashtag #theCloudPod.


Episode Transcript

[00:00:06] Speaker B: Welcome to the Cloud Pod, where the forecast is always cloudy. We talk weekly about all things AWS, GCP and Azure. [00:00:14] Speaker C: We are your hosts, Justin, Jonathan, Ryan and Matthew. [00:00:18] Speaker A: Episode 286, recorded for the week of December 23rd, 2024, and actually really just the rest of December, because we're not recording until January. I Can Sum Up 2024: AI, AI, AI, AI and ML. Yeah, it's that time of year where we don't look at any cloud news and we just try to predict the future, and we do a poor job at it, but people seem to enjoy it. So let's see how we did in 2024. Up first, 2024 predictions were a little bit hit or miss. So let's start with Matt here. Matt said that simpler and easier-to-access LLMs with new services would come out, which, I mean, I could say, yeah, that happened. Ish. You know, it's definitely easier to run models on your local machine with things like LM Studio or GPT4All. There's definitely been lots more open source ones. You know, Claude and ChatGPT and Gemini, they've all gotten easier to use and have memory now, because we're trying to go Westworld on, you know. So yeah, I think you got this one, Matt. [00:01:28] Speaker D: I think this wasn't originally where I was thinking of it. I was thinking like a very simplified API endpoint on AWS and other clouds. But you know, overall the LLM and the concept and the viability for everyone to use it definitely did increase, and the barrier to entry is much lower than it was one year ago, or whenever we actually made the predictions. So I'll take the point. [00:01:54] Speaker B: How is it simpler and easier to use than it was last year? [00:01:57] Speaker D: Well, don't ask questions. It's Friday after. [00:01:59] Speaker B: No, no, no. I'm gonna ask, after letting Ryan win points from me giving him things.
[00:02:07] Speaker C: I believe I was just. [00:02:11] Speaker D: How is it simpler and easier? I think that there are more ways to run it. The general public has an easier way to access it, and they are simpler. Like Justin said, they are becoming easier and more efficient and better to use for the average user. So I know that I talk to many people that I work with now, and just in general, people that are not in tech, which I feel like a year ago more of these things were targeted at tech people. Now more of the general population is using them a lot more for their day-to-day work. That's where I think it's easier and simpler. [00:02:49] Speaker B: I'm just checking that you didn't mention the thing that I'm going to mention later. Sneaky. [00:02:55] Speaker D: What was that? Thank you, Jonathan. You should really not record these after work on a Friday, guys. [00:03:02] Speaker C: No, it never gets. [00:03:04] Speaker A: I'm trying to give you guys the week off next week for Christmas. [00:03:07] Speaker C: Oh, no, we're not really complaining. [00:03:08] Speaker A: I know. All right, Matt, your second prediction was Kubernetes will become simpler for smaller companies to operate, and won't require highly paid DevOps slash scientists, which I still chuckle at. Scientists. Do you want to argue for that one? Because I'm not going to. [00:03:26] Speaker D: No, no, I don't. [00:03:30] Speaker A: Jonathan, Ryan, do you think anything has improved in this space? [00:03:32] Speaker B: I did not. [00:03:35] Speaker A: The container man has thoughts. I can see it on his face. [00:03:38] Speaker C: Yeah, well, I do have thoughts, but nothing's swaying me to where I can say yes. Like, yeah, little bits here and there. [00:03:46] Speaker A: Yeah.
The only thing that even kind of came to mind a little bit is there was a lot of development in making node management easier, but I don't really think that makes anything easier to develop on it. I think that's still complicated. [00:03:59] Speaker D: Yeah, that's the only thing I was thinking of, like Karpenter. Is that what it was called on AWS, with a K? Maybe? [00:04:07] Speaker A: Yeah, Karpenter. [00:04:10] Speaker D: But those are just quality of life improvements, and it's not like the massive difference that I was, I guess, hoping for, but I don't think we're ever gonna get there. [00:04:21] Speaker A: All right, well. And then your third and final choice was low employee churn rates and increased tenure. Quiet quitting. [00:04:28] Speaker D: See, if I was smart, I would have prepped for the podcast and researched a little bit more of this to get real data points. [00:04:33] Speaker A: But that would be very out of character for us here, I would say. I think there has been, I mean, from my experience as a leader, there definitely has been less churn in employees, because I don't think there's been as many opportunities for them to go elsewhere. But I haven't seen the quiet quitting part, at least in my teams. [00:04:55] Speaker D: Yeah, I don't think I have either. I mean, I think just in general, the tech market in the US at least has softened and jobs aren't as prevalent, and people are less likely to job hop in that way. But I agree. I was thinking about this more. It was the quiet quitting phenomenon people were really talking about at the end of last year, where I think that's. [00:05:18] Speaker A: Yeah, I think that disappeared pretty quickly. I think people got to the point where they were concerned they were gonna get laid off and not be able to find something, and so that kind of kicked the quiet quitting thing to the curb. [00:05:28] Speaker D: Yeah. [00:05:29] Speaker C: Yeah.
[00:05:30] Speaker A: All right, Jonathan, your first one was there will be mass layoffs in tech directly attributed to AI in Q1 of 2024. And you said 10,000 or more. So, I mean, there were a lot of layoffs in Q1. You know, if I go to layoffs.fyi, there were 34,000 in January, there were 15 to 16,000 in February, and in March there were 8,000. But I don't know that I would say any of them were directly saying it was caused by AI at that time. But I also don't remember that far back. So you're going to have to bring evidence on this one. [00:06:06] Speaker C: There were a couple, just to throw an interesting wrench in that. There were a couple announcements where they didn't lay off because AI took their jobs; they laid off swaths of people. Like, I think Cisco did this, and one other company I'm thinking about. They pivoted their business to take advantage of AI, so they needed different people, which I thought was really interesting. And it was like, well, is that the same or is it not? Right? [00:06:35] Speaker A: Yeah, but also that wasn't in Q1. That was in August that Cisco laid off, and they said they laid off 7% of their workforce as it shifts focus to AI and cybersecurity. Interestingly enough, Salesforce just announced they're hiring 3,000 people to help sell AI agents for salespeople, which I find funny: their whole pitch is get these AI agents to help your sales team get rid of people, but you need more people to sell it. [00:07:02] Speaker B: That was. That was a weird thing. [00:07:03] Speaker A: So I don't know. What do you think, Jonathan? Do you think you got this one? [00:07:07] Speaker B: I don't. I can't prove that I got that one. And I don't think the announcements were clear about the reasons for the layoffs, other than the economy in general. I think we know that a lot of those positions are going to be replaced by AIs.
But no, there weren't any announcements, so I think I don't get the point. [00:07:27] Speaker A: Okay. All right. Someone will start a cult that follows an AI LLM god, believing in sentience and a higher power, which I love. This one's my favorite. I'm not aware of this, but maybe, Jonathan, you have something going on that you need to share. [00:07:44] Speaker B: Well, there is a religion called the First Church of Artificial Intelligence, but it's been around for longer than just this year. I think it's like five, six years old at this point. So that's kind of cheating. That was just the throwaway one. No, I don't get a point for that. [00:08:00] Speaker A: Okay. [00:08:01] Speaker B: Next year though. [00:08:02] Speaker C: Next year for sure. Tempted to make that one again? [00:08:06] Speaker B: Yeah, yeah, right. Right after Slack gets whatever. [00:08:11] Speaker A: And then your third one was AI will find a new home in education: lesson plans, personalized learning plans by student, etc. [00:08:18] Speaker C: Yep. [00:08:19] Speaker A: How do you feel about this one? [00:08:20] Speaker B: I will take the point for that. I think I should. I pasted in a link that was announced either yesterday or today. AI has started to be used in schools. There was a charter school that announced they're having two hours per day of customized AI-driven tutoring for their students. [00:08:39] Speaker A: Thanks. [00:08:39] Speaker B: So I will take that one. [00:08:41] Speaker A: But I've also seen lots of TikToks for teachers, like, here's how you can create a lesson plan or start a lesson plan. I've seen, here's how you help your student with learning math and have AI explain math. So I think you 100% hit this one. If you have the article, you should post it in here; we'll put the one that you saw yesterday in the show notes. But I think you won it before even that. [00:09:01] Speaker B: I think so.
That was just a nail in the coffin kind of thing. [00:09:05] Speaker A: Exactly. All right, Ryan, you're on the hot seat. So you predicted that we'd start seeing the financial impact of AI, with better profitability by using AI. Wrong. [00:09:19] Speaker B: Well, it depends, doesn't it? You know, I mean, we. [00:09:23] Speaker C: You haven't really heard anyone saying, like, oh, we've restructured everything and this is finally profitable to do a thing. They've just started charging money for it kind of recently, really. And so I don't know. I don't think I get a point for this one. [00:09:41] Speaker A: I mean, if that Salesforce AI agent thing works, that would count, I would think. But you're just a year too early. [00:09:46] Speaker B: Yeah, I agree. For your point, though, I mean, it's not necessarily profitability of existing things, but there are plenty of companies, like the market cap of companies that now sell AI products to consumers is in the billions. [00:10:03] Speaker A: The many billions. And I wouldn't, I wouldn't give it to him. [00:10:07] Speaker C: Yeah, I don't think anyone would call it profitable. [00:10:11] Speaker A: I mean, he's got other ones, he's got other stuff that makes much better arguments for points. [00:10:18] Speaker C: I don't like my other two either. [00:10:19] Speaker B: Well, the second one you won. [00:10:21] Speaker A: So I give you that. Number two: AI solutions tied towards new employee onboarding, replacing wiki technology. [00:10:28] Speaker C: I mean, agentic AI is something that's been rolled out in a lot of companies. I know in my day job it's been rolled out. I hope to see this get even stronger and more obvious, just because I think that, you know, the days of searching through thousands of documents, or the one unmaintained team page that someone built three years ago when they were new, are over. And so I'd like to see this.
[00:10:54] Speaker A: But also, I think we saw companies like Glean with their AI intranets. We saw Confluence add AI directly into it. I think AI is popping up all over the place that does help new employee onboarding and being able to ask questions of your data set. So I would definitely give you that one. [00:11:13] Speaker B: Yep. [00:11:15] Speaker A: Your third one was removal of stateful firewalls as a traffic rule set. The next-gen, next-next-gen firewall. I don't really know what you meant by this, and I didn't bother to go listen to what you tried to argue this was, but this is how we summarized it. So it's not good for you. [00:11:29] Speaker C: No, no. And I don't think, like, even the intent. I was trying to forecast the next thing, which is a more sort of on-demand, traffic-based rule set and the ability to sort of adapt to traffic. And I don't think this exists at all. [00:11:43] Speaker A: So, I mean, we definitely have seen Palo Alto release some demos of things to help with writing rules and doing different things. I mean, I think there's some stuff happening in this, but yeah, I don't think it's been. [00:11:57] Speaker D: It's still stateful firewalls, where all traffic has to flow through a centralized point versus some sort of. [00:12:03] Speaker A: There's no smarts to the AI. Like, AI analyzing traffic and realizing this isn't valid, those types of things, that isn't happening yet, for sure. I mean, you'd have to have GPUs in firewalls, and that's not going to happen because you can't get them. [00:12:15] Speaker B: Yeah, yeah. Well, but the thing is, like, all the zero trust products. [00:12:20] Speaker A: Well, they're not doing AI either in zero trust products. [00:12:22] Speaker B: Well, they're not, but it eliminates the need for those firewall rules, because if you trust the provider, now you don't.
It's kind of moved security to the application layer much, much more than the network layer. [00:12:34] Speaker D: I was thinking of this as, like, egress firewalls in your VPC or VNet. Like, that's where I thought Ryan was going with this, like getting rid of those. [00:12:43] Speaker C: Yeah, I mean, kind of, you know, more of a software-defined model, but. [00:12:47] Speaker D: That's also because it's the bane of my existence at times too, so. [00:12:51] Speaker A: All right, well, let's go to my predictions, where I hit zero. The opposite. So the first one was LLMs will hit the trough of disillusionment, either on cost, environmental impact, or people realizing how limited these models are. So I don't think we hit the trough of disillusionment on any of that, but I do think people are raising environmental concerns about AI, and I do think people are realizing that the models have been somewhat limited or have limited applicability. But we're definitely not in the trough. So I feel partially vindicated that I was sort of right; I just thought we'd be in the trough a little faster, but maybe it's coming still. I don't know. They're innovating pretty quickly. I don't think they'll get there. But definitely environmental is going to become a big, big conversation around AI. [00:13:32] Speaker B: Well, I think we're in a bit of a bubble, though, because we're using it for tech things, we're using it for programming. I use it for some other stuff that's not tech things. But looking more broadly at the response to, you know, the OpenAI models, or even Claude, which I have a really good experience with, a lot of people are making a lot of complaints that they just won't answer the questions, they're too limited, they're too censored. And I'm like, well, that's not my experience. But I'm not trying to do things like aggregating some kind of science data from whatever, and, you know, I think it's going to be very use case specific.
So I think there is a lot of disillusionment around some of these things for some people. [00:14:12] Speaker A: Sure. I mean, again, when you talk about the hype cycle, the trough of disillusionment is typically a lot of people saying very negative things, and then a few people figuring out their way out of the trough with amazing solutions. And I don't think we're in either of those positions yet. And also, if you're talking about the broader market, the feedback on Apple Intelligence has been pretty lackluster as well. You know, customers, the people getting these things. I haven't actually asked my wife what she thinks of it, but I would imagine she's either not even touched those features or has been annoyed by them. And their summarization capabilities are somewhat limited. So. All right, number two was there'd be another AI model that would not be built on the transformer model. So as you know, all the LLM models today are built on the transformer paper, which came out in the 90s or maybe even late 80s. And I sort of felt like there have to be different ways, because LLMs will always be what they are today without coming up with different things. I think I'm just maybe 20 years too early. So I'm a visionary, but for 20 years from now. [00:15:11] Speaker C: 20 years, huh? [00:15:12] Speaker A: I don't know. I don't know if it'll be that long. I have no idea. You know, Moore's Law says it'll probably be like tomorrow, but. [00:15:17] Speaker C: Yeah, exactly. [00:15:18] Speaker D: Well, just assume the paper was written and take the point. [00:15:23] Speaker A: Well, there's probably a lot of papers out there, but just no one's been able to turn them into anything. I actually suspect that the next big leap in LLMs will probably come with a combination of LLMs and quantum, but that's still a couple years away. And then my last one was that there'll be another large defector.
[00:15:37] Speaker B: Hang on a second, hang on. Don't deny yourself another point for that just yet. Okay, so when you said model, did you mean a commercially available thing that you could log in and use that wasn't based on transformers? Because there are certainly papers that were published this year. I thought something that. [00:15:50] Speaker A: Would be showing up as commercially viable. [00:15:53] Speaker B: Okay, all right, that's fine. All right, no point for you then. [00:15:55] Speaker A: Yeah, yes, I'm sure there's a lot of papers, but the papers aren't valid until they have adoption or proof. [00:16:02] Speaker B: I guess I'm feeling the holiday spirit right now, just trying to give these points. [00:16:07] Speaker A: I won re:Invent predictions. You guys can win on everything else this week. And then my last one is we'll see another large defector from public cloud. There were a couple articles I went and researched. Someone thought maybe, because we had talked about it, and there were a couple articles, but there were no customers mentioned or companies that were actually doing it. It was all theoretical. And so there might be some out there. I don't know who you are, but you haven't done a press release, so I can't take the point. But also, I still don't believe that it's a massive thing. So that's our predictions. So, yeah, don't listen to me about what's going to happen in 2025 later. But, you know, we'll see where we end up next year around this time, once again, if we have better luck. All right, well, let's move on to favorite stories of 2024. And I went through 317 pages of notes from our weekly episodes to pick out things that I thought were interesting or I remembered. And I was really struck by a couple things. One, so much happened in 2024. Like, I was like, wait, Gemini wasn't a thing in 2023? No, Bard got renamed to Gemini in 2024. I was like.
I thought that was forever ago, for example. [00:17:21] Speaker D: Did you start at the top like I did, or did you start at the bottom? [00:17:22] Speaker A: I started at the bottom and worked my way to the future. [00:17:24] Speaker D: That was smarter. I had to, like, replay everything in reverse order. [00:17:28] Speaker A: Yeah. The Amazon CEO, you know, left. Like, I feel like Matt's been CEO now for five years already, but that just happened this last year. You know, the CrowdStrike outage was this year. Amazon started deprecating things. I mean, all these things, I feel like they've been happening for a while. But overall the trend was definitely a lot of AI in the announcements. I did try to get Llama to help me correlate some stuff, but when you have 317 pages, that's too big of a context window for any AI that I was able to find. And then I was starting to look into how to build it into a vector database, and I didn't have enough time. I needed more than a week to get there, with the amount of time I have with my travels. So, yeah, anyways, was there anything that struck you guys as you were reading through this? Just your general feelings, before we talk about our favorites. [00:18:22] Speaker B: I think throughout COVID, things really slumped, but 2024 just saw a massive acceleration of innovation and technology. [00:18:33] Speaker A: Yeah, it definitely feels like those people saying RTO is a must to get innovation back are maybe wrong, based on the results of 2024. [00:18:42] Speaker D: There was a lot of, like you guys said, innovation, but a lot of quality of life improvements that I feel like occurred in the past year too. [00:18:48] Speaker A: Yeah, a lot of long-standing feature requests that we've had for Amazon services. We talked about it many times. I was looking through the notes like, oh yeah, I can't believe they announced this.
It's commented in our notes many times where like finally or yes, quality of life, that's big. [00:19:05] Speaker C: I mean I had a very similar, you know, experience going through the show notes. So it's like, oh yeah, it was Bard. You know, that was something I wrote down. You know, I think that you know, you know how far like AI AI sort of done a full circle where there was like that sort of. I think it was Gemini, like the WOKE AI, you know, issue where they had to shut down certain features where, you know, wouldn't produce images of people for a little while. [00:19:31] Speaker A: Oh yeah, remember. Yeah, that was fun. [00:19:34] Speaker C: You know, just so, and it's, you know, so many GPU machine types announcements, so many models and new model announcements. It was really crazy to read through and it does really feel like just a fire hose of AI. [00:19:50] Speaker A: Yeah, it's interesting you mentioned the woke AI thing because I was looking through Vogels predicted in 2024 for, you know, for his stuff and he had generative AI becomes culturally aware, which he was referring more to like languages get more support in LLMs, which I think that did happen. But I was thinking about the woke thing. I was like, well maybe he was wrong on that one a little bit. Yeah. So it was just sort of funny. All right, well we rolled before the thing and Ryan scored highest. So you get to tell us your favorite story of 2024 first. So you get the pick of the litter. [00:20:26] Speaker C: Oh, it's my favorite story. Oh no, I didn't, I didn't. [00:20:30] Speaker A: Well, general news, whatever we talked about on the show. [00:20:33] Speaker C: Yeah, yeah, I sort of generalized themes. I mean my favorite thing that happened this year would be sort of the, the, the introduction of, of RAG into the AI models. 
I really do think that having local and smaller context windows, and also being able to leverage LLMs while having the data be customized and local so you don't have to build everything into the model, is going to continue. And that was my favorite thing to see this year. [00:21:09] Speaker A: Leave it to Ryan to not follow the instructions. [00:21:11] Speaker C: Nope. [00:21:13] Speaker D: I was like, wait, are you doing this wrong? [00:21:15] Speaker C: I can't follow recipes, I can't follow instructions. I never listened to my teachers. [00:21:20] Speaker A: Yeah, no, it's fine. I'll find a RAG story, because there are definitely a lot in the notes, and I will post it here and say that was your favorite. [00:21:27] Speaker C: So there we go. [00:21:28] Speaker A: All right, Matt, what was your favorite story of 2024? [00:21:33] Speaker D: Oh, hold on. I pulled like 10 out, so I'm trying to figure out the exact one. Honestly, mine's a little bit more boring. It's just more things that have bothered me in the past, and a lot of the ones I pulled out are all those quality of life improvements. But I really kind of just liked the CloudFront VPC origins release. I'll grab the link, but it's just a really good quality of life improvement. It was a really big pain point for years. It still surprises me that they took so long to release what doesn't feel like a major feature, but, you know, it's something that at least bothered me.
You know, there's been several improvements into, you know, auto scaling of both containers and at the node level and you know, a lot of different improvements in terms of like, you know, connectivity, like different sources, being able to read from private IPs instead of all being public internal IPv6. You know, there's been a load of those types of things that just make life easier day to day, like you said. And then also enables, you know, some more sophisticated application architectures where YARN is limited. [00:23:10] Speaker B: There are a lot of cloud cost management tools out there, but only Archera provides cloud commitment insurance. It sounds fancy, but it's really simple. Archera gives you the cost savings of a one or three year AWS savings plan with a commitment as short as 30 days. If you don't use all the cloud resources you've committed to, they will literally put the money back in your bank account to cover the difference. Other cost management tools may say they offer commitment insurance, but remember to ask, will you actually give me my money back? Achero will click the link in the Show Notes to check them out on the AWS marketplace. [00:23:50] Speaker A: Jonathan, your favorite announcement of 2024. [00:23:55] Speaker B: I can have two. [00:23:56] Speaker A: You can have two. You definitely can have two. [00:23:58] Speaker D: Oh, I would have had multiple. [00:24:00] Speaker A: Then you gotta pick the one. I mean, we'll have honorable mentions. You only pick one for your actual official, but then you can talk with other ones here. [00:24:05] Speaker B: My actual official. I. I kind of want one that spans the very end of 2023, but into 2024, which was the ongoing OpenAI drama around Sam Alban being kicked out in November and then reinstated early in 2024. 
And then all the back and forth about who was going to be on the board and who wasn't, and which companies get which deals and access to which models, and the chaos around the entire situation. I just found it very interesting and very indicative that this is a hugely impactful, culturally changing technology that we're working with and people are developing right now. Clearly everyone was on a hair trigger and everyone freaked out. People are still freaking out. I just think it was really indicative of how impactful AI is going to be for everybody. But I have some honorable mentions. [00:25:10] Speaker A: We'll talk about those in a second. Yeah. No, I actually looked at the Sam Altman stuff; I was looking at some of those articles too, and I was like, yeah, that was kind of a crazy time and definitely a little bit unexpected at the time. So when I was going through these, I stumbled across a set of stories that just made me smile and reminded me how much fun we had talking about them. And now I have a mission for 2025: I'm going to get somebody who's an expert in undersea cables to come talk to us, because we like to talk about them so much. So I picked the announcement of Humboldt, the first cable route between South America and Asia Pacific, just because it made me happy. At the end of this year, after the election and all the things happening in the world, the wars, happiness sounded good to me. So that's what I went with. I could have gone with, you know, Adam Selipsky leaving. Those are all interesting too. But, you know, Humboldt. There you go. [00:26:03] Speaker B: Awesome. That was very unexpected. [00:26:05] Speaker C: That's a good choice. Yeah, good choice. I'm jealous. [00:26:10] Speaker A: That's what happens when you go through the document from the beginning: you get the good ones, you know, like. 
Oh, yeah, we talked about that several times. It was so much fun, and that's what I remember. [00:26:20] Speaker B: I started in the middle, thinking that somebody would start from the top and somebody would start from the bottom and pick their stories before they got to the middle. [00:26:26] Speaker A: Well done. All right, Jonathan, now give us your honorable mention. [00:26:31] Speaker B: A couple of honorable mentions. You know, OpenAI just had their 12 Days of OpenAI Christmas thing, and a couple of days ago they announced 1-800-ChatGPT, where anybody can call for free and have a conversation in real time with ChatGPT over the phone. That's just amazing. Wow. Reminds me of, like, Moviefone from the 90s or something. I really wanted to have the Moviefone voice. [00:27:02] Speaker A: Yeah, when I heard that, I was like, I think it was Moviefone. That was exactly what came to my mind, and I was just sort of like, yeah, okay, Moviefone for ChatGPT. I'm in. [00:27:11] Speaker B: Yeah, I just thought that was simply amazing. Talk about democratizing access to the technology when anyone can call from anywhere for free and have a 15 minute conversation with the smartest person around, most likely. The other thing is, I was really pleased to see the Nobel Prizes for chemistry and physics being awarded for AI related things. Physics was for the original work on the perceptron, I think. And the chemistry one was Demis Hassabis for DeepMind's work on protein folding. [00:27:48] Speaker A: Perfect. If you'll put the links to those articles in the notes. [00:27:53] Speaker B: I have links. Oh, one more very honorable mention: OpenTofu. [00:27:57] Speaker A: Oh yes. [00:27:58] Speaker D: Oh, that was one of mine. [00:28:02] Speaker A: Yeah, it's one of my honorable mentions too. Yep, I'll put that one in the show notes. I have that one right here. 
Okay, so for my honorable mentions, I had Broadcom ditching VMware cloud service providers and the beginning of the Broadcom money grab. That was fun. The other one I had was Azure Elastic SAN, because who needs a SAN in the cloud? Which still makes me laugh every time I think about it. I chose "Hello GPT-4o" because I just like saying it all the time. I had a quality of life one here as well: introducing Amazon GuardDuty Malware Protection for S3, which was a nice quality of life improvement. I also had Amazon's decision to deprioritize seven cloud services, catching customers and even more salespeople by surprise, because that was fun. And then my last honorable mention was AWS acquiring a data center campus powered by nuclear energy, which we just talked about being prohibited. But nuclear is still going to be super fun and part of our future, I'm pretty sure. [00:29:02] Speaker C: Yeah, we haven't heard the last of that, for sure. [00:29:04] Speaker A: That's got quite a bit more. So, Matt, I think I saw you pasting in some other honorable mentions. [00:29:10] Speaker D: Yeah, there was just Valkey in general. I think after all the drama with Redis changing their licensing, it's just like Terraform. Azure immediately backed them, but Google and AWS pretty much were like, nope, we're going to fork the project. And just seeing how much Valkey has done and where it's going, and me not at all being mad that Valkey is not available on Azure, and my assumption is it won't be. But, you know, maybe it's a little bit of a pet peeve. So that was number one. [00:29:47] Speaker A: At the same. The same pet peeve as OpenSearch not being available on GCP. [00:29:51] Speaker D: Yeah. Elastic pools for Hyperscale. 
It was just one of those quality of life things that affected me a lot in my day job. Database Watcher was a pretty cool preview that was announced for Azure, which lets you look across your databases on Azure at a holistic level, so it gives you that global point of view for all your databases. I had conditional writes for S3, because that had been a pain point at one point in my life in the past. Flex Consumption, so Azure Function apps being like Lambda, where you pay for what you use and it scales horizontally and everything. Azure Bastion Premium: it's one of those services that's going to slowly take out its competitors, because now, for not a large sum of money, it will record everything that you do going through the bastion hosts and everything else. It was really good. I think there are a couple of competitors in the space. And then for AWS, cost allocation tags becoming retroactive. In the past, whenever you added tags, they only applied going forward, and you would have no idea what things were in the past. So the retroactiveness was really nice. And then my sad article. I went the other way from Justin. Instead of things that make him happy, things that make me sad: Microsoft spending so much time this year on all their security issues, and then they lost like three weeks' worth of logs for a bunch of their customers. It was just one of those facepalm moments. [00:31:30] Speaker A: See, ironically, that makes me happy, because, you know, Azure's not my favorite. [00:31:35] Speaker C: Yeah, it wasn't our logs. Yeah. [00:31:38] Speaker A: But yeah, I see why that makes you sad. Did you have any other honorable mentions, Ryan? [00:31:44] Speaker C: I have one that you guys haven't discussed already, which was that all of the cloud providers are now exporting their billing data in the FOCUS format. 
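The appeal of FOCUS (the FinOps Open Cost and Usage Specification) is that one set of column names works across providers. A minimal sketch of what that unlocks, assuming a hypothetical FOCUS-shaped export with the spec's BilledCost/ProviderName/ServiceName dimensions (the sample rows and amounts are invented for illustration; real exports carry many more columns):

```python
import csv
import io
from collections import defaultdict

# Hypothetical FOCUS-style CSV export; column names follow the FOCUS
# BilledCost/ProviderName/ServiceName dimensions, rows are made up.
focus_csv = """ProviderName,ServiceName,BilledCost
AWS,Amazon S3,12.50
AWS,Amazon EC2,40.00
Azure,Virtual Machines,33.25
Google Cloud,BigQuery,7.10
"""

def cost_by_provider(csv_text: str) -> dict:
    """Sum BilledCost per provider across a FOCUS-shaped CSV.

    Because every provider uses the same column names, one function
    aggregates all three clouds without provider-specific parsing.
    """
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["ProviderName"]] += float(row["BilledCost"])
    return dict(totals)

print(cost_by_provider(focus_csv))
# e.g. {'AWS': 52.5, 'Azure': 33.25, 'Google Cloud': 7.1}
```

That cross-provider uniformity is the whole point Ryan is making: before FOCUS, the same report needed a separate parser per cloud's billing schema.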
[00:31:54] Speaker A: Yes, I have the article for that. I will find it. [00:32:00] Speaker B: And it's spreading too. It's not just them, isn't it? [00:32:03] Speaker A: People like other people better. [00:32:05] Speaker B: Yeah. [00:32:06] Speaker A: So I've got to push on some of our vendors. But hey, you've got to start supporting FOCUS if you're going to do consumption. [00:32:11] Speaker D: You're ruining my 2025 prediction, guys. [00:32:16] Speaker A: Hey, you can still make the prediction if Ryan doesn't take it first, because you're number two in the list. All right, well, that's a good segue, actually. Let's move into 2025 predictions, where we get our crystal balls out and try to pick what we think's going to happen next year. I did say we couldn't put World War 3 on the board, or civil war in America, or anything crazy like that. It has to be cloud related, or at least technology related. So, Ryan, again, you rolled first and got the highest number, even though you rolled last, technically. Ryan, what was your first 2025 prediction? [00:32:52] Speaker C: Well, it's truly unfair, just because it's really hard to look at 2024 and then think ahead to 2025 and not just think that it's going to be a continuation of the massive onslaught of AI. And I still sort of feel like I'm on the outside looking in on so many of these things and the announcements and the differences between the models, and really getting into that nitty gritty. So my first prediction is that someone is going to come up with the ability to quickly provide a custom LLM model for individuals. [00:33:38] Speaker B: Okay, can we have an example? What kind of thing? [00:33:41] Speaker C: No. So the hope would be, like, we have the giant models that are produced by these large companies, and RAG has come a long way in order to sort of personalize those without having to train a giant model. 
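The RAG idea Ryan is describing, personalizing a big model by retrieving your own documents rather than retraining, boils down to a retrieval step in front of the prompt. A toy sketch of just that step, using bag-of-words cosine similarity instead of a real embedding model (the documents, question, and prompt template here are all invented for illustration):

```python
import math
from collections import Counter

# Toy local knowledge base standing in for "your" data (invented content).
docs = {
    "vpn": "Connect to the corporate VPN before accessing internal dashboards.",
    "oncall": "The on-call rotation is managed in PagerDuty and swaps weekly.",
    "expenses": "Submit expense reports within 30 days of the purchase date.",
}

def vectorize(text: str) -> Counter:
    # Bag-of-words term counts; a real RAG system would use embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    denom = math.sqrt(sum(v * v for v in a.values())) * \
            math.sqrt(sum(v * v for v in b.values()))
    return num / denom if denom else 0.0

def retrieve(question: str) -> str:
    """Return the key of the document most similar to the question."""
    q = vectorize(question)
    return max(docs, key=lambda k: cosine(q, vectorize(docs[k])))

question = "how do I submit an expense report?"
best = retrieve(question)  # best match: 'expenses'
# The retrieved text is prepended to the prompt sent to the LLM,
# so the model answers from local data it was never trained on.
prompt = f"Context: {docs[best]}\n\nQuestion: {question}"
print(best)
```

Swapping the word-count vectors for embeddings and the dict for a vector store gives the production version of the same pattern; the model itself never changes, which is why this personalizes without retraining.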
But it'd be more of a way to have a commodity that you could run in your house, or something that's specific to your tailored needs as a consumer, specifically individual consumers. [00:34:17] Speaker B: Okay. [00:34:19] Speaker A: All right, moving on to Matt. Your first prediction. [00:34:23] Speaker D: Mine was around FOCUS, actually, which was that we're going to see a lot of the cloud-tangential vendors support FOCUS and really embrace it as the framework for everything going forward. [00:34:42] Speaker B: You want to pick? [00:34:42] Speaker A: I'm going to simplify that down for you just a little bit: we're going to see FOCUS be adopted by vendors who sell consumption models outside of the hyperscalers. [00:34:51] Speaker B: Well, that's already happening. I think we should push them a little bit more on that. [00:34:54] Speaker A: All right. You want to call out somebody? [00:34:56] Speaker B: Yeah, pick somebody who you'd like to see. [00:35:02] Speaker A: I can give you suggestions. [00:35:06] Speaker C: Sounds like this might have been just one of Justin's predictions. [00:35:11] Speaker A: No, but I just know who's really big into consumption, who would really benefit from adopting FOCUS. [00:35:19] Speaker C: Snowflake. [00:35:20] Speaker A: Snowflake, yeah. [00:35:21] Speaker C: So that's the one. [00:35:21] Speaker B: I didn't want to say it. [00:35:23] Speaker D: I was thinking Snowflake, but I thought, yeah, that was my big one. Or, like, Databricks. Those big data and AI companies. [00:35:33] Speaker A: I'll give you those two. That's fine. All right, Jonathan, your first prediction for 2025. [00:35:38] Speaker B: My first one is that a company will claim that AGI has been achieved. [00:35:46] Speaker A: What's AGI, for those who don't know? [00:35:48] Speaker C: Artificial general intelligence, which is the new hotness for sentience. 
[00:35:55] Speaker B: Yeah, it's a lot of things, but I think agentic AI folds really nicely into it. Pretty much, you give it the tools to do a job and a challenge, and it will keep working at it until it's found a solution. [00:36:11] Speaker C: It's really difficult to actually describe. [00:36:13] Speaker B: It is. [00:36:13] Speaker C: Yeah, because this came up just in a group chat, and I was trying to describe what it was to a group of non-technical people, and it was really difficult, because some of the articles and some of the science are really looking at that sort of self-awareness, sentience aspect of it, and others are defining it much more like you did, which is sort of an unrestrained capability. [00:36:41] Speaker B: Yeah, I think it falls into a few different categories for me. I think human-like abilities, which include complex problem solving; the ability to learn new skills quickly, which means not a release cycle every three months for a 3.1, 3.2 revision. I think perhaps that it can understand and respond more to social cues, now that we have video and audio based inputs and outputs for some models. I think actually showing some kind of insight into what the person's thinking, rather than just literally the words they're saying, is important for me. Yeah, I think we're way past the whole Turing Test now. That's come and gone. But I think the problem is it's not well defined. But somebody will claim it. [00:37:34] Speaker A: All right, I'm looking at my list of options here, and I'm trying to decide which one I feel is the most likely. You guys went very AI heavy in your first section here, so I think I'm going to go. I'm going to go with. 
After Q1, when AT&T and Amazon have completed their RTO, we will see a large quantity, I'd say over 10 major enterprises, adopt return to office five days a week. [00:37:59] Speaker B: Ooh. [00:38:02] Speaker C: God, I hope you're wrong. [00:38:05] Speaker A: I hope so, too. [00:38:07] Speaker C: Is that why. Oh, that's a good idea. You make it a prediction and it won't come true. [00:38:10] Speaker B: Right. [00:38:10] Speaker A: That's my goal. We never get these right. Yeah. [00:38:14] Speaker D: Maybe add, of a certain size? [00:38:17] Speaker A: That's why I said over 10 major companies. I mean, to be picked up in the news, you've got to be a Ford or GM. [00:38:22] Speaker B: Yes. Really? [00:38:23] Speaker A: Like, if your company went back five days a week, no one would care. [00:38:26] Speaker D: Yeah, 100%. [00:38:28] Speaker A: All right, Ryan, your second 2025 prediction. [00:38:31] Speaker C: All right. Still AI, because I couldn't think of anything else. I think that AI will go to the edge compute layer. I think that a lot of the edge computing and AI capabilities will happen in some form or fashion without having to go all the way back to the cloud. [00:38:51] Speaker A: So you're saying. I think we need to clarify this a little bit, because I think there are some AI use cases at the edge today; particularly, Cloudflare has got some stuff that they're doing. But you're specifically referring to, like, I'm going to be able to get access to an H100 Nvidia chip in an edge device? [00:39:10] Speaker C: That wasn't really where I was going. I was thinking more about the inference layer. It sounds like maybe Cloudflare is. [00:39:18] Speaker A: I mean, inference. You could do inference on an H100. [00:39:22] Speaker C: Yeah, no, I was thinking more of just, like, when serverless came to edge compute. [00:39:27] Speaker A: Right. 
[00:39:28] Speaker C: Like, everything used to be process, process, server. For most of the inference, you're calling an API and using that, where I think this will be more native to the last mile of the network stack. [00:39:43] Speaker B: What do you think the use would be for that? Like a pre-processing thing on the way, or. [00:39:50] Speaker C: Yeah, that's sort of where I was going. That's what I was thinking, anyway. Lambda on the edge was where I started. You put code in there for different rules and doing different things. Maybe translation. AI integration at that level would be more free-form, and you wouldn't have to codify it directly. [00:40:11] Speaker D: Right. [00:40:11] Speaker A: I've written down "Lambda-on-the-edge-esque, but AI" to kind of try to summarize what you meant. [00:40:16] Speaker C: Cool. [00:40:17] Speaker D: Doesn't that already kind of exist with, like, the Intel NPUs, probably? [00:40:21] Speaker C: So I've already won. That's fine. [00:40:26] Speaker A: I would not consider the Intel one to be a service that I could use, though, because Intel is just. Jiminy. [00:40:30] Speaker D: It's not a service. But yeah. [00:40:33] Speaker A: This is why this is a good prediction, because there's technology that could do it. It just has to be integrated now into a solution. So that's not bad. All right, Matt. [00:40:42] Speaker D: I'm going to do one on AI, which is going to be. I think there are going to be a lot more security-focused and ethical AI frameworks, and security frameworks and features around AI that come out. I know they're starting now, but I think it's going to be a major focus, and kind of like there's SOC and ISO, I think there's going to be an AI specific framework that. 
For large enterprises to adopt an AI solution of sorts, you're going to have to say that you've done these types of things. [00:41:15] Speaker A: The first part, I don't know if I liked it, but the SOC or ISO specific standard for AI, I definitely think that's. [00:41:22] Speaker D: I know ISO, I think, has a draft for one, but I don't know that it's used yet. [00:41:27] Speaker A: And the question really is, because it takes years for most ISO standards, does it get done next year? You're already gambling on that. [00:41:34] Speaker D: Yeah. [00:41:35] Speaker B: Oh, that's a good one, actually. [00:41:37] Speaker D: I like. [00:41:37] Speaker B: I like that one. But I think it's more likely to focus on consumer privacy than it is on those other things to begin with. [00:41:44] Speaker A: Well, I mean, there is a privacy ISO standard. [00:41:46] Speaker B: Yeah, but. Well, maybe that will be extended then to cover AI use cases. [00:41:50] Speaker A: I mean, if that was the only thing that happened, I would take that as a win for Matt on the point, I think. Although I might forget this next year, so save the recording. [00:41:59] Speaker D: Listen to this recording right before the podcast next year. [00:42:03] Speaker A: Well, I'm going to have to figure out what Ryan meant by his second one. Jonathan. [00:42:07] Speaker C: Me too. [00:42:08] Speaker A: Your second. [00:42:10] Speaker B: My second. Let me try to explain what I mean. If you don't get it, let me know. I'm sure you will. Most people using AI assistants now, I think, go to the website that the assistant is on and use it for a particular task. They ask a question, they ask it to do coding, they ask for whatever, like a search engine kind of thing. But it's always very much deliberate. The relationship is between you and the assistant, mostly. 
I think in 2025, people will start using it more as a personal assistant, in that they'll have the AI use other apps for them. So you'll be able to delegate tasks. One of the examples would be dating apps. Let's say you could have an AI set up your profile on a bunch of dating apps, and you tell it what you like, and then it could manage those interactions on the app. Or it could be a different kind of app, but I think there'll be a lot of delegation of work that people will be able to do to their existing apps or services through assistants. So, like, personal assistants empowered to work in the real world, I think, will be a big hit in 2025. [00:43:21] Speaker A: So existing solutions they have today that get AI agent capability, similar to what Google does for booking reservations for restaurants, essentially. [00:43:31] Speaker B: Yes, but for tons of things. [00:43:36] Speaker A: Yeah, I just kind of paraphrased it down. [00:43:38] Speaker B: Yeah, no, that was a good example. I forgot about that one. And it's an interesting example, because really, that's been around for, what, two or three years now? [00:43:45] Speaker A: Yeah, I mean, I don't know if it ever got really popular, but yes, it's been around for a while. [00:43:49] Speaker B: Oh, I was famous when I walked into a restaurant here in town after using the Google Assistant to book it. Because the assistant called them and made the booking, and everyone was like, wow, this is fantastic. And I walked in there and I was like, oh, you're the guy. Yeah, I'm that guy. [00:44:06] Speaker A: All right, well, my second prediction is that OpenAI will lose the number one spot in LLMs and be seen as no longer the leader in AI that they are today. [00:44:18] Speaker C: Ooh. [00:44:19] Speaker B: Okay. I think it's a close race right now. I think you may be right. 
And I'm not sure if it's going to be Anthropic, or if it's going to be Alibaba or Gemini. Yeah, it could be any of those. [00:44:33] Speaker A: Yeah. I just feel like their innovation curve has definitely slowed down, where I still see Claude and Gemini and Alibaba, as you mentioned, all innovating quite a bit, and I would not be shocked to see the market shift. All right, Ryan, your third and final prediction for 2025. [00:44:54] Speaker C: So Jonathan very heavily stepped on what I was going to do. [00:44:58] Speaker A: Yeah, I had that one too. [00:44:59] Speaker C: Yeah. [00:44:59] Speaker A: Well, something similar to it, but. [00:45:01] Speaker C: So this isn't mine, but yeah, it was very similar. So my prediction is that there'll be more of a cloud native security mesh for applications in more complex multi-cloud, hybrid environments. The mesh will become a way. Just like you would make a call to a load balancer and that would route traffic, the mesh would be more of a. You define application rules and what an application can talk to at a very flat level, and the mesh would be routing all of that all the way through the application interaction. [00:45:42] Speaker A: Zero trust networking at the edge. [00:45:45] Speaker C: Zero trust networking from application to application and service to service. [00:45:53] Speaker A: Okay, again, you'll have to figure that one out next year, because I will not remember what you meant. [00:45:58] Speaker D: I can't even try to understand what he means now. [00:46:00] Speaker A: Yeah, I'm not sure I fully get it. I think only Ryan truly understands that, but maybe not. We'll see. [00:46:05] Speaker C: I'm just hoping that I throw off enough words to be like, yeah, that's what I meant. [00:46:09] Speaker A: Yeah. [00:46:10] Speaker B: So what would be different between. 
Between that and what we have now, for example? [00:46:15] Speaker C: Yeah, I mean, so service mesh now. [00:46:17] Speaker A: Right. [00:46:17] Speaker C: Registering applications happens in these small little islands. The biggest difference would be the default service mesh capabilities, where it's not registering specifically with the mesh, or relying on your Kubernetes cluster to proxy your traffic. It'd be much more like you'd be able to select: this is just how all the networking is within an environment, or between these two environments. [00:46:44] Speaker B: Okay. [00:46:45] Speaker A: All right, Matt, your third and final. [00:46:48] Speaker D: I'm thinking Amazon is going to keep deprecating services that are outdated. [00:46:54] Speaker A: How many? [00:46:55] Speaker D: I have at least five in my notes. [00:46:58] Speaker C: Okay. [00:46:58] Speaker B: For an extra point, pick one. [00:47:02] Speaker D: I was debating between WorkMail and WorkDocs. That was actually my virtual. [00:47:06] Speaker A: I know they already killed WorkDocs, so that one wouldn't count, I think. [00:47:08] Speaker D: Did they? [00:47:08] Speaker A: I thought they did. [00:47:10] Speaker D: See how much I used it. [00:47:12] Speaker A: I think WorkMail is still there. [00:47:14] Speaker D: No, WorkDocs still exists. [00:47:16] Speaker A: I think. I think if you try to create a new WorkDocs. [00:47:19] Speaker D: Oh, yeah, they EOL'd it. Effective April 25, 2025. [00:47:24] Speaker A: Yeah. [00:47:25] Speaker D: WorkMail then. [00:47:26] Speaker A: WorkMail. [00:47:26] Speaker D: It is, because Amazon internally moved over to O365, I thought, years ago. [00:47:33] Speaker A: They did. Forever ago. Yeah, we talked about it on the show. [00:47:36] Speaker D: Yeah. [00:47:39] Speaker A: All right, Jonathan, your third and final prediction for 2025. 
[00:47:43] Speaker B: I think we'll see models that can learn in real time. AI models that can learn in real time. [00:47:50] Speaker A: Now, I know the new Gemini 2.0 talks about being able to leverage Google Search as a RAG source. Is that what you're referring to, or are you referring to something different? [00:47:59] Speaker B: No, I'm not talking about referencing current information externally. I'm talking about incorporating information that they've learned through the context of conversations or interactions with other systems, and dynamically updating their internal models directly to incorporate that new information. So I'm not talking about, like, ChatGPT's memory, where they save a bit of information and it pretty much just sticks into your context window every time you have a chat. More fundamental than that: that you'll actually be able to teach a model new tricks. [00:48:36] Speaker C: I think that's kind of what I meant. [00:48:40] Speaker D: Crap. [00:48:41] Speaker A: Too late now. [00:48:42] Speaker C: I know. Yeah. [00:48:44] Speaker A: All right, well, since you guys took this one, I'm going to take it because I am pretty sure it's not going to happen in 2025. We will have a GPT-5, a Claude 4, and a Gemini 3. [00:48:52] Speaker C: So I wasn't going to do that. [00:48:55] Speaker B: Oh, all those three? One and one? [00:48:57] Speaker A: All three of them. I make it easy. All right. [00:48:59] Speaker C: All right. Yeah. [00:49:00] Speaker A: If. [00:49:01] Speaker C: If you were just calling one of those things, I was. [00:49:03] Speaker A: I would go for all three. All three of those will exist is my belief. [00:49:06] Speaker C: I still think you're going to win that point. [00:49:07] Speaker A: But I mean, none of you guys picked any of that. I expected someone to, so I was like, well, I'm not going to take that one first, so I'll save it for this, because I have integrity. 
[00:49:16] Speaker C: I didn't pick that. I wrote integrity. [00:49:18] Speaker A: Come on. Bragging rights. [00:49:21] Speaker D: I think you have bragging rights from. [00:49:23] Speaker A: The years of lightning round winning, and the re:Invent one. Yeah, I definitely have re:Invent for sure. All right, well, since we suck at predictions, I did go out and find some other sources of predictions to see if they're any better. Werner Vogels, of course, is my typical stop, and he's got a couple here that we'll talk about real quick. First up: the workforce of tomorrow is going to be mission driven. He basically says the world is facing urgent challenges around sustainability, social equity, food and economic security, and responsible AI usage, and a quiet revolution is unfolding in the job market: a move towards work that benefits humanity across industries and generations. So I guess I would say that's green-facing jobs for workers to improve the world. I think that's a pretty reasonable one. [00:50:14] Speaker C: Yeah, green and then humanitarian, like food aid. Kind of makes sense. [00:50:20] Speaker A: Yeah, I think that's very reasonable, and I hope that one works out. His next one is a new era of energy efficiency driving innovations. Basically, this is around how renewables such as wind and solar have become impressively scalable and reliable, marking a significant milestone in the transition towards cleaner energy production, but nuclear has reemerged as a promising solution. I think this is all about nuclear power for AI and addressing those types of challenges, either through innovative new technology or through the expansion of small modular reactors. That one, I think he's probably right on, too. [00:50:55] Speaker D: I had that one in my notes, but I felt like a lot of the things announcing it had already kind of been announced in the last couple months, and they all said, like, 2027, 2028. 
[00:51:06] Speaker C: Yeah, I was having a hard time defining something specific. Same thing. [00:51:10] Speaker A: I mean, he rambles for several paragraphs, so I think he struggled too. His third one was technology tips the scale in the discovery of truth. He says basically we're on the cusp of a movement that will put tools in the hands of consumers, fundamentally changing the current power dynamics. Innovations such as the TrustNet browser extension offer the promise of real time, crowdsourced fact checking for web content. GeoSpy extracts data from photos and quickly matches the features against geographic regions, countries, and cities. And similarly, generative AI systems that augment daily news with relevant academic knowledge are emerging, such as Proem, providing necessary aid to validate claims and stem the spread of inaccuracies, both accidental and intentional. [00:51:49] Speaker C: I hope that one comes true. [00:51:50] Speaker A: Right? [00:51:51] Speaker C: Especially with the amount of nonsense that's out there. [00:51:54] Speaker A: Yeah, well, especially in the next four years. How much nonsense we're going to get. [00:51:57] Speaker C: Indeed. [00:51:59] Speaker A: And then, open data drives decentralized disaster preparedness. Basically, he talks about all the massive amounts of natural disasters that are happening, and he says that these grassroots efforts to help in these types of situations are evolving into a decentralized resilience system, and advances in edge computing and satellite connectivity during disasters enable real time data capture and processing, even in the harshest of climates. Basically, he's saying that's going to continue to be a thing. I'm not sure, but I think that one's tied to money, where the money needs to go, and I don't know if that's going to be decentralized. But I like the idea that people are trying to get more data more quickly to help the aid get where it needs to go. 
[00:52:35] Speaker B: That was going to be one of my predictions, but I couldn't really quantify it in a way which would be measurable to win the point. I think there's obviously a need for tons of data. I'm not going to say that we're running out of data exactly, although the quality is a bit questionable. But I think access to data is going to be super important, and I didn't know how to turn that into a prediction. But like, when I go to Safeway and buy my groceries, I want a way to get my receipt electronically so that I can plug it into an AI. So when I do my groceries, my AI will know what's in my pantry, and if I say, what can I cook to eat today? It can be like, oh well, you've got this stuff, why don't you make this? I just think there are so many places where access to data would make life easier for a consumer, and right now it's very asymmetric. You know, Safeway or Albertsons have got access to all the data. They can market the shit out of me because they know exactly what I buy, when I buy it, patterns of all kinds of stuff. But I have none of that. So I want to see some of that asymmetry go away, and I want access to the data that other people have about me. [00:53:39] Speaker C: Yeah, I'm starting to see that come in the form of applications for very technical use cases. It's a way of consolidating all the data that exists in an environment from various sources, and rather than just dumping it in a data lake where you query it, really providing that data all the way to the end user. Now, instead of hiring a team for BI or to produce your reports, you're seeing applications that sort of do it automatically. So I agree. I think that's coming. 
[00:54:15] Speaker B: It would be awesome to have some kind of open framework for consumer data access that companies would adopt. [00:54:21] Speaker C: It'd be hard though, right? Because how do you identify people without breaking the trust? [00:54:26] Speaker B: Well, it's tough, but to get the discounts at Safeway, you've got to register a card and have an account. Same for pharmacies, same for everybody anymore. Everyone wants you to have an app, everyone wants you to have an account. [00:54:39] Speaker A: Well, loyalty has been a great way to track consumer habits and incentivize people, so that makes sense. All right, the last one from Werner was that intention-driven consumer technology is going to take hold. Basically, if you read through this, he talks about the amount of screen fatigue that's happening, and there have been growing movements, like the Offline Club in Amsterdam, or Grant High School in Portland, where at lunchtime students engage in face-to-face conversations again, not their phones. And so he says innovators are taking notice. A new wave of purpose-driven devices is emerging, designed to foster intentional use and encourage us to enter a state of flow, or the zone, rather than divert our attention. He mentions e-readers: Kindles have long offered immersive reading without interruptions. But now we're witnessing this trend on a much larger scale, with the emergence of minimalist phones that offer little more than call and text functions, cameras that emphasize the craft of taking a photograph rather than sharing it, and standalone music players that let you enjoy music without the constant barrage of messages and notifications. So yeah, purpose-driven devices have been a bit of a trend as well. I think he's probably right on that.
Commercial success for them, I think, is a question, but I think they'll exist. [00:55:51] Speaker D: No, I definitely think you're seeing some of them, especially with young kids. I see it with toys and things like that, where you don't want to give a two-year-old a phone or a screen, so you're seeing technology out there that's very specific: you put in a cartridge and it plays an audiobook. You're already starting to see things along those lines out there, so it would not surprise me if that continues to trend forward. [00:56:21] Speaker A: All right, before we wrap it up here real quickly, six enterprise technology trends for 2025 from CIO Dive, and some of these just make me cranky. Software vendors will double down on consumption-based pricing, which we definitely see as a trend, and which will make me mad when renewals come up. CIOs will be more selective about generative AI use cases next year, which I think is right, because there's been a lot of hype and not a lot of delivery quite yet. This may be the beginning of that trough of disillusionment, but again, I'm not going to make that prediction again this year. [00:56:48] Speaker D: I almost did. I thought about it. [00:56:50] Speaker A: Yeah. Enterprises will consolidate IT service vendor contracts with their largest providers. That's been kind of a movement that's been happening with AWS and GCP and Azure anyway, with marketplace and large commit spends, just moving all your spend into one place so you can consolidate it. I think that'll definitely continue. Hyperscalers will continue infrastructure buildouts, which, no duh. And agentic AI will double the workforce in 2025.
This is an interesting one, because I don't really see how agentic AI doesn't replace people, but apparently they're saying it's going to double the workforce. And then number six, the majority of enterprise generative AI strategies will accelerate, which I definitely think is realistic as well. So yeah, that's their prediction. Again, it's CIO Dive; I don't know if I believe everything they say, but it's worth seeing someone else's perspective other than just ours. [00:57:38] Speaker B: Now, I think the disillusionment thing is quite likely to come, because as these bigger, more expensive models arrive, it's going to be, well, actually, we told you that you might be able to use these models to replace people, but if you want to replace somebody who does these particular tasks, you're going to need to use this other expensive product, which is going to cost $20 a go and take half an hour to return an answer. And it may not even be right, but it's still going to be very expensive. I think cost is going to drive the disillusionment when people actually want to replace humans. [00:58:08] Speaker C: Yeah. And it'll take armies of people to get that set up for individual use cases and individual companies, right? So while it's eliminating one set of jobs, it's probably creating a bunch as well. [00:58:20] Speaker B: Yep. [00:58:21] Speaker A: All right, gentlemen, well, all of you have a happy holiday and a happy New Year, and I look forward to talking to you all in January with that fresh New Year glow on you. [00:58:32] Speaker C: All hungover, I think, is what you mean. [00:58:35] Speaker A: Yeah, that's the same thing. We'll see what '25 has to bring, but we've got Google Next coming up, we've got lots of cloud nonsense throughout the year, I'm sure, and we'll have another re:Invent next year too, I'm sure.
So it'll be another fantastic AI year in general, I'm pretty sure, but I look forward to seeing how our predictions turn out next year and where things go. [00:58:57] Speaker B: Awesome. See you later. [00:59:00] Speaker C: Bye, everybody. [00:59:01] Speaker D: Happy New Year, everyone. [00:59:05] Speaker B: And that's all for this week in Cloud. We'd like to thank our sponsor, Archera. Be sure to click the link in our show notes to learn more about their services. While you're at it, head over to our website, tcp.fm, where you can subscribe to our newsletter, join our Slack community, send us your feedback, and ask any questions you might have. Thanks for listening, and we'll catch you on the next episode.
