[00:00:06] Speaker B: Welcome to the Cloud Pod, where the forecast is always cloudy. We talk weekly about all things AWS, GCP, and Azure.
[00:00:14] Speaker C: We are your hosts, Justin, Jonathan, Ryan and Matthew.
[00:00:18] Speaker A: Episode 287, recorded for the week of January 8, 2025: The Cloud Pod rebrands to the Cloud AI so we can also get a billion dollar valuation. Good evening, Jonathan, Ryan and Matt. How are you doing? Happy New Year.
[00:00:31] Speaker B: Happy New Year.
[00:00:32] Speaker C: Happy New Year.
[00:00:33] Speaker D: Happy New Year.
[00:00:33] Speaker A: We all had great Christmases and presents and New Year's celebrations, and everyone has all their fingers and toes and didn't blow them off with fireworks.
[00:00:42] Speaker C: I feel like I lost time. So I might have been abducted by aliens for two weeks because I don't know what happened.
I feel like it was just yesterday and I'm still waiting for our vacation to start. Hasn't happened.
[00:00:54] Speaker A: It went. I took the week of Christmas off, and then the week after that I sort of worked, and I made the mistake of working the Thursday and Friday after New Year's, and I had zero motivation. That was a mistake.
[00:01:07] Speaker B: Yeah.
[00:01:08] Speaker A: So if this weird late-year timing for breaks and school breaks happens again, I might just take two weeks off.
[00:01:15] Speaker B: Yeah. With young kids, though.
[00:01:18] Speaker A: Young.
[00:01:18] Speaker B: You need a vacation after the school vacation.
[00:01:21] Speaker A: Right, Exactly.
[00:01:22] Speaker B: That's where I'm at.
[00:01:24] Speaker D: I'm not gonna lie. My wife and I were very happy when Sunday happened and we're like, oh my God, we have five days of daycare. Let's do this.
[00:01:34] Speaker C: I think I just took the model of, if you can't beat them, join them. So I just played video games for two weeks with my kids. I removed all the screen time restrictions. It was raining anyway, so we were stuck at home.
[00:01:48] Speaker B: Yep, I did the same thing. I'm diamond level in Fortnite Rocket Racing now.
[00:01:54] Speaker C: Nice.
[00:01:58] Speaker A: I can't get into Fortnite. I can still beat them at Mario Kart. They now beat me at Halo all the time. But yeah, Fortnite, I just can't get behind the whole, I have to build and I have to shoot, thing. I just want to point at things and shoot them in the video game. That's all I want to do. I don't want to think that hard.
Well, luckily the cloud providers also took most of December off, so we don't have a lot of news to catch up on after I gave you guys two weeks off to enjoy playing Fortnite, apparently. Without pay. None of us get paid, but do sponsor us. Hey, if you're looking to sponsor the Cloud Pod podcast, reach out. We'd love to get you on. Who makes Fortnite?
Hey, Fortnite can sponsor us. I'll take their money. You don't have to be in the cloud world. If you want to sell barbecues or you want to sell widgets, I don't care. You can come. We'll take your money.
It's all good.
All right, well, let's jump into the general news.
Oracle had their earnings back on December 9th, and we were in the middle of re:Invent show catch-up and trying to catch up for the end of the year, so we missed it. But now we're back to fix that sin. So basically, Wall Street didn't like it. I can summarize it down to that level.
Rapid cloud growth occurred, but was not sufficient to appease those Wall Street gods. They reported earnings of $1.47 a share, just shy of the $1.48 expected by those godlike analysts. Revenue was up 9% from a year before to $14.06 billion, just short of the Street's target of $14.1 billion by a couple million dollars and change.
Income was up 26% from the year before to $3.15 billion, so all their optimization efficiency is doing great for them. And revenue from cloud services and license, which we care about the most, was up 12% to $10.8 billion. If your revenue is 14 billion and cloud and license is 10 billion, that means that's 77% of it, so Oracle Cloud is working out for them. Oracle CEO Safra Catz said growth in the AI segment was nothing short of extraordinary, with 336% growth in GPU consumption from the prior year. And despite all this positive news, Oracle's guidance was soft for the year, which angered the Wall Street gods as well, punishing their stock in December.
[00:04:13] Speaker B: That's crazy. I think the Wall Street gods were just getting grumpy because it wasn't New Year yet.
[00:04:20] Speaker A: Yeah, I mean, now in January their stock is up today, but looking at the month, they haven't really recovered from earnings quite yet. So we'll see how they do as they continue through the year.
[00:04:33] Speaker C: But tech overall is slow, I think.
[00:04:35] Speaker A: Yeah, I mean, tech in general is down. Everything's down. Everyone's waiting for the election... well, not the election, but the swearing in and the new administration to come in, and for us to be past all that.
Well, HashiCorp had their recap of their busy 2024 and wrote up a blog post, so I thought I'd call out some of their highlights. First of all, they're very excited about HashiCorp signing an agreement to be acquired by IBM. They believe they can bring modern infrastructure and security practices to an even greater number of organizations around the world, and they're excited for the possibilities that IBM will unlock for HashiCorp.
Some of the things they released this year that should help innovate your environment: Terraform got Terraform Stacks, module lifecycle management to simplify your day-two operations, Terraform migrate for HCP Terraform so you can move all your money to SaaS, integrated module testing, ephemeral values and secrets, config-driven state updates for refactoring and importing resources, and pre-written Sentinel policy libraries co-developed with Amazon. Packer 1.11 got released with a new plugin loading process, Packer and plugin version tracking, and CI/CD pipeline metadata outputs to help your SDLC processes. Nomad got significant upgrades this year. I know people who use Nomad, including myself, and I didn't actually think it got that much investment, but apparently it did, with Nomad Bench, NVIDIA device driver support enhancements for GPU scheduling and resource quotas, a new task driver for exec2, and a libvirt task driver beta for improved virtual machine support, all in Nomad for those of you involved. For Vault, we talked about Secrets Sync, the new auto-rotation and dynamic secrets with integration to your Amazon, Google, or Azure console access, and the Vault Secrets Operator for Kubernetes, as well as the new HCP Vault Radar product for the SaaS solution, which scans your digital estate for unmanaged secrets and PII data. And Consul got transparent proxy for ECS, Consul DNS views for Kubernetes, and registration CRD operations for Consul on Kubernetes.
So yeah, that's a pretty busy year for a lot of different tools all from HashiCorp.
[00:06:37] Speaker C: I was as surprised as you were with the Nomad news. But then I was thinking about it, and there just aren't great options for managing infrastructure if you're not on a cloud hyperscaler. You can use OpenStack, which gets a little bit of support, but I don't think it's really...
[00:06:57] Speaker A: You know, there's Habitat, right?
[00:07:02] Speaker C: I still don't know what that is and I don't know if that still exists.
[00:07:07] Speaker A: To be honest, I don't know. I think that might have died, but it was always a joke back in the day.
Well, I'm glad they're excited about that IBM acquisition. Someone is not so excited about it, and that is the UK. Not so fast, they say. Britain's Competition and Markets Authority is apparently going to investigate IBM's acquisition of HashiCorp and has launched a merger inquiry. So official, Jonathan, you British people: merger inquiry. They have a deadline of February 25th to decide if they're going to go into phase two, which would further delay any possibility of a merger, and they're asking for parties to comment on the merger before January 16th. For IBM, of course, the big prize is Terraform, but I think the British government is seeing some potential conflicts with IBM-owned Red Hat Ansible and Terraform competing, or the deal taking competition out of the market. So we'll see if that ends up going anywhere. The last one of these the Competition and Markets Authority protested was Broadcom's acquisition of VMware, where they did a phase one and phase two investigation and only delayed it six months. So instead of closing in December like they wanted to, the HashiCorp deal will probably take until sometime in the summer, around August, if previous timelines hold up.
[00:08:24] Speaker D: I don't really think of Ansible as a competitor to Terraform.
[00:08:28] Speaker C: It's because it's not.
[00:08:29] Speaker A: It's not. But you're not the government, and you're not a Competition and Markets Authority person who would know the differences and the nuance of what you're saying.
[00:08:37] Speaker C: So that's the part that annoys me the most: all they did was basically publish a notice of investigation with no details on what they're possibly investigating. So it's all left up to speculation about what they're actually concerned with. I think it's probably a miss. I think they're going to dive into this and be like, oh, IBM is just getting too big.
[00:09:05] Speaker A: Yeah, I mean they've been too big since the 60s or 50s when they started. I mean like they've always been a massive conglomerate.
[00:09:11] Speaker C: Yeah. I'm just not a big fan of this move. And I'm also trying to figure out how much of it is just kind of a perfunctory thing they needed, paperwork-wise, to start an investigation for due diligence, and then it got into the press, so I can't tell. But I'm also super cynical about anything online these days. So it's like everything's a lie.
[00:09:33] Speaker B: Well, you know, just like any government agency, they are somewhat self...
what's the word?
They're an agency that does something, so they have to do it. They have to justify their own existence.
[00:09:46] Speaker A: Well, if you get the Department of Government Efficiency on the job, you can get rid of these departments.
[00:09:51] Speaker D: There you go. That's. That's the solution.
[00:09:53] Speaker A: Yeah. Then they don't have to be efficient because they can just get removed. At least that's what I'm told by the Internet. I don't believe it, but we'll see.
Well, OpenAI announced that they are looking at the structure of the organization. The board of directors is evaluating their corporate structure in order to best support the mission of ensuring AGI benefits all of humanity, with three objectives: choosing a nonprofit/for-profit structure that is best for the long-term success of the mission (or, how do we make money); making the nonprofit sustainable, meaning the for-profit entities probably feed the nonprofit, versus the way it is right now, where the nonprofit feeds the for-profit entity; and equipping each arm to do its part of the mission they have set out for AGI. Basically, this is the evolution from the fact that they were a research lab back when they started, then they were a startup, and now they're saying they want to become an enduring company, with revenue and profit and loss and Wall Street being angry at them for missing their estimates. The board is consulting with outside legal and financial advisors to make the best possible structure for OpenAI in the future. So we'll keep an eye on this one. I suspect it'll be a few months before we hear what their new structure is. And I'm sure people will sue, because that's what people seem to do these days, and they'll say that OpenAI has lost their way and their mission has failed.
[00:11:06] Speaker C: And blah, blah, blah, blah.
[00:11:07] Speaker A: It'll be all a bunch of stuff to look forward to later this year.
[00:11:11] Speaker C: I applaud them. They waited over a year to do exactly what everyone knew they were going to do after the whole like Sam Altman kerfuffle where they fired him and.
[00:11:18] Speaker A: Rehired him. But now they're bringing in external counsel. So, you know, it's more on the up and up this time, apparently.
[00:11:24] Speaker C: And they got rid of all the people who didn't like it anymore.
[00:11:27] Speaker A: Yeah, exactly.
[00:11:29] Speaker C: So this has been coming for a while, yeah, for sure.
[00:11:34] Speaker A: They also announced new tools for the OpenAI o1 model. I hate the o1 name. It's such a dumb name. It's really dumb. The OpenAI reasoning model o1 is an interesting, more capable model, and there are new tools to customize those models, and model flexibility and cost efficiency will all improve too. So first up is o1 in the API, with support for function calling, developer messages, structured outputs, and vision capabilities now available. You get Realtime API updates, including simple WebRTC integration, a 60% price reduction for GPT-4o audio, and support for GPT-4o mini at one-tenth of previous audio rates. They give you preference fine-tuning, a new model customization technique that makes it easier to tailor models based on user and developer preferences, as well as new Go and Java SDKs available in beta, which I'm kind of excited about. There's a Go SDK. It's kind of nice.
Nice for my agentic AI wishes that I'll never act on, but I dream of.
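[Editor's note: the o1 API features mentioned above, developer messages, function calling, and structured outputs, fit together roughly like the sketch below. The payload shapes follow OpenAI's published chat API conventions, but the tool name and schemas are hypothetical illustrations, not from the episode, and the request is built offline rather than sent to the service.]

```python
# Sketch of an o1-style chat request combining developer messages,
# function calling (tools), and structured outputs (response_format).
# "get_weather" and the schemas are made-up examples.
import json

def build_o1_request(question: str) -> dict:
    return {
        "model": "o1",
        "messages": [
            # o1 uses a "developer" role in place of the old "system" role
            {"role": "developer", "content": "Use the weather tool when asked."},
            {"role": "user", "content": question},
        ],
        # Function calling: declare tools the model may choose to invoke
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
        # Structured outputs: constrain the reply to a JSON schema
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "name": "answer",
                "schema": {
                    "type": "object",
                    "properties": {"summary": {"type": "string"}},
                    "required": ["summary"],
                },
            },
        },
    }

payload = build_o1_request("What's the weather in Seattle?")
print(payload["model"], len(payload["tools"]), "tool(s) declared")
```

In real use this dictionary would be the body of a POST to the chat completions endpoint via one of the SDKs (including the new Go and Java ones mentioned above).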
[00:12:36] Speaker B: Yeah, I think the branding is kind of messed up, because they were really the first to launch a decent consumer-facing service. ChatGPT is the brand name, you know, just like Google. And so the fact that they're not calling it ChatGPT o1 just boggles me. I understand why they want to separate the web service from the underlying models, but at the same time, who really cares? I don't know.
[00:13:04] Speaker C: And there's the whole naming of o1, right? The 4o versus o1 thing. It's just made me very confused about what they're trying to do and which character I'm supposed to use.
[00:13:17] Speaker B: Yeah, it's kind of weird, because they make a big deal about every little launch, and look, now we're on o3, and now this other thing is coming. And meanwhile Claude sits there in the background, and I'm using it every couple of days, and all of a sudden the behavior is different. Now, if I ask it to fix some code, instead of writing out just a little section that you've got to copy and paste in, they have something built in. It's almost like it's creating a diff, and then you can see the artifact being updated, just the individual lines that need to be changed each time. So it's reducing the number of tokens that get used, you can use it for longer now, and it's just much easier to integrate changes with existing code that you have. It's kind of like ChatGPT's... can't remember what they call it, the workspace thing, which does a similar thing where it creates a draft and then updates the draft. But Anthropic doesn't announce any of these changes. They're just built in. Okay, here's a new feature.
Very, very strange. Very different companies, which is surprising since they had basically the same origin, you know.
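[Editor's note: the behavior described above, emitting only the changed lines of an artifact instead of re-writing the whole file, can be illustrated with a plain line-level diff. This is just a sketch of why the approach saves tokens, not Anthropic's actual mechanism.]

```python
# Illustrate the token savings of sending a line diff instead of a
# full rewrite: diff two versions of a small file and compare sizes.
import difflib

old = '''def greet(name):
    print("hello " + name)

def main():
    greet("world")
'''.splitlines()

new = '''def greet(name):
    print(f"hello {name}")

def main():
    greet("world")
'''.splitlines()

# A unified diff carries only the changed lines plus headers/context
diff = list(difflib.unified_diff(old, new, lineterm=""))

full_rewrite_lines = len(new)  # lines sent if the model re-emits everything
diff_lines = sum(
    1 for d in diff
    if d.startswith(("+", "-")) and not d.startswith(("+++", "---"))
)

print(f"full rewrite: {full_rewrite_lines} lines, "
      f"diff payload: {diff_lines} changed lines")
```

Only one line actually changed, so the diff payload is a fraction of the full file, which is the same economics that make partial artifact updates cheaper than regenerating the whole thing.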
[00:14:20] Speaker A: Yeah, yeah.
[00:14:21] Speaker C: Well, I mean, it's interesting, because my fear is that Anthropic will go through a similar sort of identity crisis if their funding model changes a lot, like OpenAI did a year ago.
[00:14:36] Speaker A: I mean, didn't Anthropic start as a for-profit company? The whole OpenAI ChatGPT model started out from a different angle. Then they found success and were like, oh, we could make a lot of money, and now they've run into a problem: our structure doesn't actually support this model, and we're probably making too much money for the tax status and all the other things that are going to cause them grief. Whereas Anthropic, I don't think, started that way. Anthropic has very similar guiding principles. They're founded on the idea of safe AI for humanity, basically. I don't know what they call it, but they have very similar guiding principles, just with the fact that they were going to make money from day one.
[00:15:23] Speaker B: Yeah, yeah. Ensuring AGI benefits all of humanity. I absolutely hate that phrase. What do you mean by benefits humanity, exactly? Which humans in particular are you going to benefit?
[00:15:34] Speaker A: Are we talking about.
[00:15:35] Speaker C: Yeah, the shareholders, of course.
[00:15:38] Speaker B: Well, there'll be no more disease. You know why? Because there'll be no more humans.
[00:15:43] Speaker A: You've transferred your consciousness into the machine, and now you're being corralled by this AGI that's running your life. Yeah, I've seen this sci-fi movie.
[00:15:56] Speaker C: Right, yeah.
[00:15:58] Speaker A: Well, speaking of Anthropic, they are in talks to raise a new round of funding that could value the company at $60 billion, up from $16 billion less than a year ago. This is led by Lightspeed Venture Partners, and the new round could pump an additional $2 billion into the company. Since the company was founded in 2021, it has raised more than $11.3 billion from venture firms. This all comes on top of all the funding rounds happening in the AI world, with xAI raising $6 billion recently and $6.6 billion raised by OpenAI in Q4. So, you know, AI is great. It's also very expensive. It takes a lot of capital to make it work, and we are seeing lots of capital flowing into these AI companies.
[00:16:39] Speaker C: Yeah.
[00:16:40] Speaker B: And what's weird is, of course, the Chinese models like DeepSeek and Qwen, which are trained on the cheaper hardware they have access to, for much less money. I think DeepSeek was trained for like a tenth the price of any of the competing models. We've kind of forced the Chinese AI engineers to be really innovative because of the trade restrictions, and they're going to eat the lunch of these companies here.
[00:17:08] Speaker A: So we basically created NASA all over again. We're like, oh, if you want to make software for the shuttle, you have to fit in these very tight constraints. And so they got really good at writing really small programs that did exactly what was needed. And now you talk about how the space shuttle only ran on, like, 30K of memory, and everyone's like, what? That's impossible, because Java takes at least three gigs to start up nowadays.
[00:17:32] Speaker D: Just turn on.
[00:17:35] Speaker B: Yeah.
[00:17:36] Speaker A: Efficiency in software is something that we've sort of lost in interpreted languages. That does make me a little sad. But maybe, maybe AI can help us go back to some. Something more efficient. We'll see.
[00:17:49] Speaker D: I don't think so. I think it's going to learn off the same way that we all program right now, and you'll have the same problems. Yeah, it's going to learn off Ryan's code. It's going to be great.
[00:17:58] Speaker C: Yeah.
[00:18:01] Speaker A: Well, Amazon didn't announce a lot during December, but they did poke Microsoft in the eye.
[00:18:07] Speaker D: Polite way to say what happened.
[00:18:09] Speaker C: Yeah.
[00:18:09] Speaker A: Amazon CISO CJ Moses publicly shamed Microsoft security, halting his employer's deployment of Microsoft 365 for a full year as the vendor tries to fix a long list of security problems Amazon identified. Industry security executives are of two minds. Some applaud Amazon, saying that the online retail giant has the revenue and employees to push Microsoft to fix issues in a way that others don't. Others were cynical, saying that the move is less altruism to improve cybersecurity and more a thinly disguised sales pitch for AWS services. I sort of lean towards that one.
Moses says that they conducted their own analysis of the software and asked for changes to guard against unauthorized access and create more detailed accounting of user activity in the apps. He said they deep-dived Office 365 and all the controls around it, and held it to the same bar of service as any other services team within Amazon. Amazon's requests included modifying tools to verify the users accessing the apps are properly authorized, and, once in, that their actions are tracked in a manner that Amazon's automated systems can monitor for changes that might indicate a security risk.
[00:19:07] Speaker D: I don't think I agree with you there, Justin, that it's them trying to poke people and move people to Amazon services, because, I mean, AWS does have WorkDocs and WorkMail and whatever. They have them, but they're not real services.
[00:19:24] Speaker A: Oh, see, if you read through the article a little bit more, they talk about how Microsoft 365 is just a conglomeration of tools that all have different authorization models and different things, and so they don't have a standardized auditing format. It looks secure on the surface, but isn't. And so basically Amazon can say, look, we built security in from day one. It's part of our design, it's day zero. And anything that you're buying from Microsoft and anything from their cloud is just a bunch of stuff they merged together, just like Microsoft 365. And if that's bad, why would you assume Azure is going to be any better?
[00:19:58] Speaker C: You think so? That's a stretch. But I mean, that's a stretch because...
[00:20:01] Speaker D: Also looking at S3 and it doesn't really follow the same IAM model and you know, EC2 and VPC falls into the EC2 world, which doesn't really follow the same model. So like any legacy services that you try to fit into the box don't really work great. So I almost feel like, yes, I'm not disagreeing with them where it is a hodgepodge of technologies, they've merged together over the years into what it is now. And there definitely are things that they need to clean up. But I also think that AWS still has some things there too that need to be improved.
[00:20:40] Speaker A: Yeah, again, I think this is.
You didn't have to talk about this. The Amazon CISO didn't have to say it on a podcast, didn't have to talk about this issue. This could have been, hey, Charlie (Charlie Bell), we did this analysis, we think we have these issues, we can't move forward until you guys fix this. Will you do that? And then Charlie goes, yeah, we'll take care of it. It'll take us about six months to a year, and we'll get back to you. And then Amazon says, cool, cool. And then no one talks about it for a year, they fix the issue, they go live, and there's a big party because it went live. No. Instead it was, we threw you under the bus publicly in a way that wasn't necessary. And so that's why I say I think it is not a situation where they're trying to make Microsoft do better. It is 100% a case of, our security is better than yours.
[00:21:29] Speaker D: I'm not disagreeing with that. But I would like to see Microsoft security not be a paid premium model where you pay extra to get the security. And they did have a number of security incidents in the last year, like losing logs for customers and things like that that just shouldn't happen. So I almost feel like this is Amazon jumping on the bandwagon of Microsoft being down.
[00:21:58] Speaker B: There are a lot of cloud cost management tools out there, but only Archera provides cloud commitment insurance. It sounds fancy, but it's really simple. Archera gives you the cost savings of a one or three year AWS savings plan with a commitment as short as 30 days. If you don't use all the cloud resources you've committed to, they will literally put the money back in your bank account to cover the difference. Other cost management tools may say they offer commitment insurance, but remember to ask: will you actually give me my money back? Archera will. Click the link in the show notes to check them out on the AWS Marketplace.
[00:22:36] Speaker A: Well, if you want to get into other legal problems with Amazon, you can now use Stable Diffusion 3.5 on Bedrock.
This allows you to generate high-quality images from text descriptions in a wide range of styles, to accelerate the creation of concept art, visual effects, and detailed product imagery for customers in media, gaming, advertising, and retail. I did not have a chance to ask it to draw the founding fathers for me and see if they turn out with the right color of skin. But I assume if there are issues, we'll hear about it here at the Cloud Pod very quickly. If you're looking for Stable Diffusion 3.5, it's available to you in Bedrock.
[00:23:13] Speaker D: Yeah.
[00:23:16] Speaker A: I can sense your excitement about building images from AI.
I am starting to see more and more AI in advertising and in different forms of social media, and it sticks out like a sore thumb.
[00:23:28] Speaker C: It really does.
[00:23:29] Speaker A: I'm like, oh, that's AI. Like I can tell you just, just look at it. There's no doubt about it.
[00:23:33] Speaker D: It might be because there's six fingers but don't worry about that.
[00:23:35] Speaker A: I mean, there are those issues. But even stuff that's correct, you can tell it's AI.
[00:23:39] Speaker C: Yeah, that's become my kids' favorite game as they're watching commercials or whatever: pointing out the ones that are AI. That's AI, that's AI. Whoever's first to find the AI.
[00:23:53] Speaker B: One interesting use case I think is becoming more popular is people making AI-generated adult content and publishing it on things like OnlyFans. And there aren't even real people behind these accounts. They're literally just machines cranking out images.
[00:24:09] Speaker A: How do they even do that? Like, most of them have protections and things built into the training model.
[00:24:16] Speaker B: There are plenty of models which are NSFW, not-safe-for-work, enabled.
[00:24:20] Speaker A: I did not know that.
[00:24:22] Speaker B: Yeah.
[00:24:22] Speaker A: I apparently have not gone to the dark side of the LLM world.
[00:24:25] Speaker C: Yeah, you probably have to host it yourself. It's probably not going to be hosted for you.
[00:24:28] Speaker A: Yeah.
[00:24:28] Speaker C: And there are some caveats, I'm sure. But I thought OnlyFans was early into that, where they were talking about, I don't know about video content, but at least the interaction with fans being very heavily AI-based, and that was a service they offered.
[00:24:44] Speaker B: Yeah, I'm not aware of video content, but I know there are pages, the Reddit accounts, with just pictures on there.
[00:24:50] Speaker A: I mean, Meta just had the whole thing last week where they said, we're going to create all these AI people on our social platform.
Everyone freaked out, and they're like, we're not going to do that now. Like, you should have never done that.
[00:25:01] Speaker C: It's not that everyone freaked out. They did release one, and it was a little sketch.
It was bad. Like, I'm not one to be easily ruffled by these things, and I was like, oh, I wouldn't have made those persona choices. Like, it could have been prompted in a lot safer way.
[00:25:20] Speaker A: So you went further down that route than I did. I saw that article and I was like, I'm just gonna ignore that because I don't use Facebook.
Oh, yeah.
[00:25:27] Speaker C: I just read the.
[00:25:28] Speaker A: And I assumed it was terrible. And then it got pulled a week later, and I was like, it must have been real bad. The fact that you actually saw what the example was...
[00:25:35] Speaker C: Oh, yeah, no, they definitely went down the path of virtue signaling. Not only did they try to create these personas that you could interact with, but they tried to create personas from underrepresented groups, and then they would do things like combine multiples. Right. And so the one that I read, the one that was the highlighted example, the statement was, I'm a queer mom who's highly devoted to her children, and there was some other aspect, like artist or something like that.
[00:26:14] Speaker A: It was just the profile of the AI, like, and then.
[00:26:16] Speaker C: Correct.
[00:26:17] Speaker A: Oh, my God.
[00:26:17] Speaker C: Yeah. And it blew up, you know, in spectacular fashion through interaction.
Completely blew up. Right. Because you can interact with this model and have conversations. And so you can ask questions about whether or not having these persona attributes in your model is appropriation, or, you know, at what point is it a parody, kind of thing. And some of the answers that came out of the AI were just not great. Not a great thing at all. Not a great representation.
[00:26:57] Speaker A: So sort of like Microsoft's racist chatbot.
[00:27:00] Speaker C: It's very similar. Yeah.
[00:27:02] Speaker B: And I got pissed when I realized that the people I was playing with on Words with Friends were bots and not real people, and I wasn't even trying to have social interactions with them. But to run a social media site and then completely throw away the social aspect, which is people talking to other people, and replace it with machines, is just really, really weird. Not to mention the fact that you're surely diluting your advertising audience. I mean, you're not getting revenue for adverts being shown to bots.
[00:27:28] Speaker D: Or are they?
[00:27:30] Speaker C: Oh, wow. I mean, I think it's one of those things where they're probably trying to get people to look at the site, so they're trying to cause more interaction, which is probably the goal.
And so, yeah, the bots aren't going to consume advertisements, but they're probably losing eyeballs, because only old people use Facebook anymore, as my kids are very quick to tell me. So I imagine the viewership is declining.
[00:27:56] Speaker A: I mean, it works for X. That's how they keep their numbers up. The Russian bots hit the site all the time. Works out perfectly.
[00:28:02] Speaker B: I quit X at the end of December. My premium subscription ran out, and I lost access to Grok. I'm like, okay, see ya.
[00:28:10] Speaker A: Yeah, I've pretty much moved over to Bluesky and Mastodon. Those are the two that I'm on now. We still have automation from the Cloud Pod in my account and stuff that posts things about our new episodes, but that's all automated. It's not me.
[00:28:24] Speaker B: Yeah, we should, we should set up a Cloud Pod Bluesky account.
[00:28:27] Speaker A: I have Mastodon. I do not have a Bluesky Cloud Pod account. But I will do that before we publish this episode so someone doesn't yank it.
Well, can't.
[00:28:37] Speaker C: You do.
[00:28:38] Speaker B: Yeah, yeah, you can just use, you can just use our domain. Yeah, you use our domain as the, the account for sure.
[00:28:44] Speaker A: Like the Cloud Pod one on X. We've had that problem forever, because we can't reach the person who actually owns the cloud pod handle, who doesn't use it and hasn't used it for 15 years. You know, it's so annoying.
[00:28:54] Speaker C: That's frustrating. Yeah.
[00:28:56] Speaker A: But you know, it's sort of interesting now, in the context of this AI chat thing you explained to me, because then yesterday Zuckerberg said they're moving the content moderation team to Austin. And they're actually killing most of that team and moving to community notes, which is very similar to what X does as well, where the community can comment on something and vote up or down on the actual content of the messages.
So I wonder, I wonder how if that's all related in some way or not.
[00:29:27] Speaker C: I don't know. I will applaud Zuckerberg for one thing, which is that he was at least transparent, saying that the only reason they're doing this is because the political headwinds are changing. And it's obvious to me that it'd be a cost savings. You don't have to have teams of content moderators, or support the backend moderation systems, as well as any automation you have that's doing automated reviewing and stuff like that. You can turn all that stuff off and get savings as well. But yeah, it's just so there's not as much noise, right? Because the political headwinds have shifted the other direction. And, you know, the new overlords would gladly welcome Russian interference with our election system, so. Great.
[00:30:10] Speaker A: Can't wait.
Well, AWS is saying it's going to invest $11 billion to expand data centers in Georgia. This is after they said they were going to invest $11 billion in Indiana eight months ago. The new Georgia region will expand its infrastructure to support various cloud and AI technologies, and AWS estimates it will create roughly 550 jobs in the state. So we'll soon have a new Georgia Peaches region. I guess that would be pretty sweet.
[00:30:36] Speaker D: Is it that or is it US East 1? That's that far south now.
[00:30:40] Speaker A: Right. It's just expanded so far that it's just out of Georgia.
[00:30:44] Speaker D: It's like New York down to Georgia at this point with the local zone in Miami, I think.
[00:30:49] Speaker C: Yeah.
[00:30:50] Speaker A: I wonder if Georgia will be a local zone or if it'll actually be a full region. Same thing with Indiana, to be honest. Like, are these just going to be local zones or are they going to be full regions? Because $11 billion seems like a lot for a local zone.
[00:31:04] Speaker D: I thought US Central in Azure was in.
[00:31:07] Speaker A: Ohio, and so was.
So AWS US East 2 is in Ohio, and then Google's in Iowa. I think Azure is also in Iowa.
[00:31:19] Speaker D: Yeah.
[00:31:23] Speaker B: Pretty sure they're going where the electricity is the cheapest at this point.
[00:31:27] Speaker A: I assume that's a driver. Does Georgia. Is Georgia known for its power? I mean, I assume it's all nuclear like Florida, right?
[00:31:36] Speaker D: Yeah.
[00:31:37] Speaker C: I don't. I can't imagine there's a whole bunch.
[00:31:38] Speaker B: Of hydro.15 cents a kilowatt hour.
[00:31:42] Speaker D: There is a new Georgia Power plant in Georgia.
[00:31:45] Speaker A: I don't know how they're generating power in Georgia because I don't think they have the river system for damming.
They're definitely. I mean, maybe solar.
[00:31:52] Speaker D: I mean, maybe natural gas.
[00:31:55] Speaker A: I mean, that's not going to help you with your green initiatives.
[00:31:57] Speaker D: That's a different problem.
[00:31:59] Speaker A: Climate pledge arena says otherwise, Matt.
[00:32:02] Speaker C: Yeah.
[00:32:04] Speaker A: They drive by that building every day to remind themselves of their pledge to the climate that they're ignoring. So don't you forget it.
[00:32:13] Speaker D: Yeah. It looks like there's two nuclear facilities in Georgia.
[00:32:16] Speaker A: Yeah, I'm not surprised.
Well, speaking of other regions, they have opened the Thailand region. It's now generally available with three availability zones. This is the first region in Thailand and the 14th region in Asia Pacific. The adoption of cloud computing has gained significant momentum in Thailand, driven by evolving business needs and government initiatives such as Thailand 4.0.
I don't know what 1.0, 2.0, or 3.0 for Thailand were, but apparently this is the fourth.
[00:32:41] Speaker C: Yeah, well, this is better.
[00:32:43] Speaker A: Three times better.
[00:32:44] Speaker D: Four.
[00:32:45] Speaker A: Yeah, yeah.
[00:32:47] Speaker D: It's amazing how many regions there are now. Versus, like, I remember starting on AWS, it was like, okay, there's one in Europe, maybe two in Europe, and two in the US, and go.
[00:32:58] Speaker A: And Australia or Japan.
[00:33:00] Speaker D: Like that was it at the time.
[00:33:02] Speaker A: Yeah, yeah, I remember the days. And there were not a lot of regions. And the argument from Amazon was like, well, we don't need a lot of regions. You get more benefit of economies of scale if there's less of them, in areas where we can grow. And then all the governments were like, no, data sovereignty. And then, like, oh, screw that. We have to go build 14 regions.
[00:33:20] Speaker D: How many more are on the roadmap that they've publicly announced? And then how many more behind the scenes? Like, they gotta be getting up to running out of places to put regions too.
[00:33:29] Speaker C: I feel like, I don't know, Oracle seems to spin up a shoebox of a closet of whatever they're doing.
[00:33:35] Speaker A: I mean, Google does too. So based on the capacity that we have available to us.
All right, let's talk about Google. So Google CEO Sundar Pichai told employees that the stakes are high in 2025, as the company faces increased competition and regulatory hurdles and contends with rapid advances in AI. He addressed the need to move faster as a company, as this is a disruptive moment. I then expected to hear the "we're going back to the office five days a week," but they didn't say that, so that's not happening yet. Pichai says, "It's not lost on me that we're facing scrutiny across the world. It comes with our size and success. It's part of a broader trend where tech is now impacting society at scale. More than ever, through this moment, we have to make sure we don't get distracted." You know, between AI regulation and the search monopoly case, it can indeed be a busy year for Google. The other thing is, you become a bigger company, with all that size and scale, and you're doing bad things because you got rid of "don't be evil." Maybe that's also why you're under scrutiny. Just putting it out there.
[00:34:36] Speaker C: I thought he was CEO of Alphabet. Did that get rolled back?
[00:34:39] Speaker A: He is Alphabet. I mean, Google. Yeah.
How people at media.
[00:34:44] Speaker C: Oh, no, he's CEO of Google and Alphabet. Yeah, absolutely. I was wondering if. And so like that's according to his LinkedIn profile.
Yeah. I mean, I think that just goes to show you that you know why some of the stakes are high and why the antitrust is there. Right. Like it's. It's sort of a fallacy that there's multiple businesses within the Google ecosystem.
You know, they did all the separation, but that was mostly for financial. And I think maybe it had some sort of driving force behind antitrust, but it's going to be huge.
[00:35:16] Speaker B: Yeah. I struggle to sympathize with the people who think breaking up the company is the best idea.
[00:35:23] Speaker A: I mean, it might be the best thing that ever could happen to Google because Google is going to have this problem where if search is going to be monopoly, AI is going to eat search, then cool, we'll just take this search thing and we'll spin that off and then we'll have AI, which is going to replace Search anyways.
Might be the best divestment strategy they've ever had.
[00:35:42] Speaker B: I don't think even Google has the money for AI to replace search meaningfully.
[00:35:46] Speaker A: Yeah, probably not, but doesn't mean they're not going to try.
[00:35:51] Speaker C: No, they'll definitely get a try.
[00:35:54] Speaker A: They did announce a Google Cloud feature, though. Database center is now expanding to support BigTable, Memory Store and Firestore, which. Yeah, if you're going to call your product Database center, you should have the opinion that you're going to manage all the databases that you have on your platform. Not just Cloud SQL, but all the databases. Because you didn't call it Cloud SQL center, you call it Database Center. So glad to see that get resolved. You'll now be able to gain a comprehensive view of your entire database fleet, practically de risk your database fleet through that tool and optimize your database fleet with AI powered assistance, which, you know, if it starts writing indexes for me, I'm going to be super happy.
[00:36:30] Speaker D: I thought Memorystore was Redis.
[00:36:33] Speaker C: Well, it's. It can be Redis or memcache.
[00:36:37] Speaker A: Right.
This is where we talk about Redis. It can be used as a database. You shouldn't use it as a database, but it can be used as a database.
[00:36:45] Speaker D: You can also use it as a queue.
[00:36:46] Speaker A: Yeah. Or a key store. Or all Kinds of things that you shouldn't do.
[00:36:50] Speaker D: Yeah.
[00:36:50] Speaker A: So, yeah.
[00:36:51] Speaker D: I just don't think of cache always as a database. But I also know that Azure's billing rolls it up into database category too, for. For Redis.
[00:37:03] Speaker A: So, yeah. I've seen many people screw this up too, because they're thinking they're using a cache, so they set no expiration date on it. So the data just lives in the.
[00:37:13] Speaker C: Cache forever. Or the alternate, I've also seen.
[00:37:17] Speaker A: Yeah. Where the data they put.
[00:37:18] Speaker C: Why did I go away?
[00:37:18] Speaker A: Yeah. Why did the data disappear?
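The two failure modes being swapped here, keys that never expire and keys that vanish when someone expected persistence, both come down to TTL handling. A minimal sketch of the semantics, using a tiny in-memory stand-in for Redis's SET/EXPIRE behavior rather than a real client (the class and key names are hypothetical):

```python
import time

class TTLCache:
    """Toy in-memory cache mimicking Redis SET/EXPIRE semantics."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at or None)

    def set(self, key, value, ex=None):
        # ex=None reproduces the pitfall from the show: no expiry,
        # so the entry lives in the "cache" forever, eating memory.
        expires_at = time.monotonic() + ex if ex is not None else None
        self._store[key] = (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._store[key]  # lazy expiration, roughly like Redis
            return None
        return value

cache = TTLCache()
cache.set("session:alice", "profile-blob")          # no TTL: never expires
cache.set("session:bob", "profile-blob", ex=0.05)   # expires after 50 ms
time.sleep(0.1)
print(cache.get("session:alice"))  # profile-blob -- still there, forever
print(cache.get("session:bob"))    # None -- gone, as a cache entry should be
```

In real Redis the equivalent split is `SET key value EX 300` versus a bare `SET` with no expiry; the "why did my data disappear" half of the joke is the same mechanism seen from the other side.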
[00:37:22] Speaker D: I remember, I think it was the I instance types on aws. One of my friends at one point.
[00:37:29] Speaker B: With the local disk.
[00:37:30] Speaker D: Yeah. With the local drives, they were like, oh. They had one of their interns looking at it, and they were like, oh, let's set up our MongoDB on I instance types, because it's cheaper, because we get the storage. Not realizing that it's ephemeral storage. So luckily they had it clustered, so it wasn't that big of a deal. But one of those fun, yeah, let's put everything in ephemeral storage, and what could possibly go wrong.
[00:37:52] Speaker B: Yeah. We ran an elasticsearch on those instances.
[00:37:57] Speaker A: Yes, we did.
Yep.
[00:38:00] Speaker B: In a previous and gladly forgotten life.
[00:38:02] Speaker A: Yeah. And I'd like to never do that again.
[00:38:05] Speaker B: No.
[00:38:06] Speaker A: Every time someone says elastic to me, I'm just like, nope, I'm sorry, pass.
Even Ryan was like, I'm gonna use elasticsearch for its actual use case. And I'm like, I don't know if I believe you.
[00:38:18] Speaker B: I never want to see that again.
[00:38:20] Speaker C: Yeah. I am the writer and the consumer.
[00:38:21] Speaker A: It'll be okay. Is it, though? I don't know.
[00:38:24] Speaker C: Is it?
[00:38:24] Speaker A: Yeah.
[00:38:25] Speaker C: Yeah.
[00:38:28] Speaker A: I knew I was in trouble when Ryan was saying, I want to use Elasticsearch. I'm like, son of a. Fine.
[00:38:34] Speaker C: To be fair, some of that is because of your reaction.
[00:38:37] Speaker A: Oh, I know, I know. That's the thing. I'm like, is this really the right architectural choice or is this you trolling me? Like, I think it's you trolling me.
[00:38:44] Speaker C: Half of it, at least.
[00:38:49] Speaker A: Could he have used Spanner? Yes, he could have, but yes, I could have. It would have been much better to, you know, make it just.
[00:38:55] Speaker C: I would have had to write a lot more ETL infrastructure code. Right. Like, I'm doing amazing things with logs and dashboards. It's really easy.
[00:39:06] Speaker D: Well, you're gonna start using it, right? We know by the time you're done with it, you're gonna be like, yeah, I've abused it in bad ways.
[00:39:14] Speaker A: No, he won't abuse it. Someone else will abuse it. And then he's like, no, they used it in a way I didn't expect to then be like, oh no, I'm. It's now it's logging.
[00:39:22] Speaker C: It's my cluster. It's like a database. It's not using it.
[00:39:26] Speaker A: Like no one, no one can use this thing.
Microsoft has a weird announcement. They wrote a full blog post, a full feature announcement blog post, for availability of the o1 model on Azure OpenAI Service.
And I was reading through it, seeing what I wanted to talk about here on the show, and there are all the features: vision input, developer messages, reasoning effort, blah blah blah. And then I noticed, wait, they're pre-announcing this? It's not actually available. You can't get it today. They were pleased to announce the o1 model is coming soon to Microsoft Azure OpenAI Service. So if you're excited about reasoning and you want to use it on your Azure OpenAI Service, suck it. You have to wait. This blog post is making you think otherwise, so be wary of this reality, until probably next week when they'll announce it's actually available. But I find it funny that in December they decided to publish this pre-announcement, on December 17th, and it's coming soon.
[00:40:19] Speaker C: I mean, do you think they'll announce that they actually have it? Like maybe they're.
[00:40:22] Speaker A: That's true. It's Azure. It'll just live in non announced stage forever.
[00:40:29] Speaker D: Don't hurt me.
[00:40:35] Speaker A: Well, Microsoft sees your $11 billion, Amazon, and says screw you, we're spending $80 billion in fiscal 2025 to build data centers designed to handle AI workloads. These AI-enabled data centers will be designed to train AI models and deploy AI and cloud-based applications around the world. Brad Smith, Microsoft Vice Chair and President, wrote: as we look into the future, it's clear that artificial intelligence is poised to become a world-changing general-purpose technology. AI promises to drive innovation and boost productivity in every sector of the economy. The United States is poised to stand at the forefront of this new technology wave, especially if it doubles down on its strengths and effectively partners internationally.
So great. I'm sure Amazon's spending more than 80 billion. They just don't talk about it because people would go like how you're spending how much?
Because Amazon should definitely not be spending $80 billion on AI at this point with their current offerings.
[00:41:26] Speaker C: I'm terrified because either like either they're just throwing AI on here to get funding because it's the Only way anything gets funded these days is it has to have AI somewhere on it, written.
[00:41:34] Speaker A: On it, you know, or like they're building Skynet.
It's not going to be secure. It's going to get taken over by somebody. We already talked about Microsoft and they're like.
[00:41:46] Speaker D: Did you listen to the beginning part of the podcast?
[00:41:48] Speaker B: Yeah.
[00:41:49] Speaker A: Were you here earlier?
[00:41:50] Speaker C: Yeah. It's just crazy to me. We're no longer talking about applications and new versions of models. We're talking about whole data centers.
[00:42:00] Speaker A: OpenAI needs to accelerate their training, because o1 is reasoning, and o2 is going to be coming at some point, you assume, and they want to make sure they have enough capacity to train all these crazy models that OpenAI wants for AGI. You know, the AGI data center where Skynet's going to be.
[00:42:16] Speaker B: So, yeah, I'm just thinking about the government use cases, surveillance, correlations, and this.
[00:42:23] Speaker A: Thing happening in administration. We don't need to talk about that. Let's move on.
Good.
[00:42:27] Speaker D: Thank you for trying to get about this.
[00:42:31] Speaker A: This isn't the time, Jonathan.
[00:42:33] Speaker D: Yeah. Read the room.
[00:42:35] Speaker B: It's exciting.
[00:42:36] Speaker A: January 6th was just two days ago. It's too early.
[00:42:45] Speaker B: Okay, well, I guess. Well, I guess, you know, once we, once we capture Greenland, we'll have a whole bunch of extra space to. To build data centers.
[00:42:53] Speaker A: I mean, Canada has a lot of natural cooling, that's what I hear.
[00:42:56] Speaker B: And Hydroelectric, too. Yeah, that's a good reason. Yeah. Let's invite.
[00:43:03] Speaker A: Okay, for all of you who got those new shiny budgets in 2025, I have the perfect way for you to spend it: by buying the Oracle Exadata X11M. This is the new and improved Oracle Exadata, the latest generation of the Exadata platform. Apparently, compared to the X10M, it's 55% faster on vector searches, 25% faster on OLTP and concurrent transactions, and 25% faster on analytical query processing. It still costs an arm and a leg. The initial Exadata infrastructure, which includes two database servers and three storage servers with eight ECPUs, will run you $12,799 per month. But it gets crazy real, real fast. So I just upped this to 64 ECPUs because I was curious, and my price went to $26,800. And I learned that 64 ECPUs does not guarantee me memory.
So that's just whatever carved out of those two database servers. So those two database servers don't have enough memory. Then I need more database servers. And when you start adding more database servers on top of this to the 64 ECPU. This price multiplies real, real fast.
[00:44:05] Speaker D: I feel like this is more of like how many seconds can you burn your budget in?
[00:44:09] Speaker A: Yeah, I mean you could do it pretty darn fast.
[00:44:12] Speaker D: Couple 10, 20.
[00:44:14] Speaker C: You know, I'm still just very confused about these offerings. Like it's, it's, it's, it's structured offering, but it's not like hardware. It's still a service.
It is hardware.
[00:44:25] Speaker A: It's dedicated to you.
[00:44:27] Speaker C: Okay.
[00:44:28] Speaker A: And it is. Yeah.
[00:44:29] Speaker C: But it lives in.
[00:44:31] Speaker A: It lives in or on prem. You can buy it either way and so yeah.
[00:44:38] Speaker C: Okay, that helps. I didn't realize that it was hardware when they said it's hosted on the cloud. It was just like it a thing.
[00:44:48] Speaker D: Found your first problem trying to understand Oracle.
[00:44:51] Speaker C: Yeah.
[00:44:52] Speaker A: Yeah. So the largest configuration of an X11M you can get is 32 database servers and 64 storage servers. That'll cost you $207,000 a month. Then the largest Exadata database you can add on top of that is 24,320 ECPUs, and that'll cost you $6 million a month. So combined, it's $6.286 million a month for the maximum configuration of the dedicated Exadata cloud infrastructure.
So like I said, you can burn your money real fast.
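For the curious, the two quoted price points imply a roughly linear per-ECPU rate, and that assumption (mine, not Oracle's published rate card) reproduces the maximum-configuration number almost exactly:

```python
# Back-of-the-envelope from the two configurations quoted on the show.
base_ecpus, base_price = 8, 12_799    # entry config, $/month
more_ecpus, more_price = 64, 26_800   # same infrastructure, more ECPUs

# Implied incremental cost per ECPU per month (assumes linear pricing).
per_ecpu = (more_price - base_price) / (more_ecpus - base_ecpus)
print(f"~${per_ecpu:.0f} per ECPU per month")  # ~$250

# Maximum config: 32 DB servers + 64 storage servers at $207k/month,
# plus 24,320 ECPUs of database on top.
max_total = 207_000 + 24_320 * per_ecpu
print(f"~${max_total / 1e6:.3f}M per month")  # ~$6.287M, close to the $6.286M quoted
```

So the database ECPUs, not the dedicated hardware, are where the budget actually burns.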
[00:45:25] Speaker B: You're probably on the list for just looking that up now.
[00:45:27] Speaker D: Probably, yeah.
[00:45:28] Speaker A: I hope my Oracle rep is already calling me. You're like, I saw you on the pricing calculator.
Are you interested in some Exadata? No, I am not. Thank you.
[00:45:36] Speaker C: It pages them out of a dead sleep. Yeah, yeah.
[00:45:39] Speaker A: There's a disturbance in the force.
[00:45:41] Speaker D: It more like also notifies their lawyers to go audit you to see.
[00:45:46] Speaker A: Yeah, I mean that's another problem.
Well, that's it. Welcome back, guys. Looking forward to a fantastic year here at the Cloud Pod. We'll see what we're going to pull out of our sleeves this year as fun content. We've got Google Next, we'll have re:Invent again, and I'm sure we have some Azure events in here. Matt will remind me about two weeks after they happen, because I can't keep track of Microsoft's bajillion events. But yeah, you know, always glad to have you guys here at the show. Everyone came back this year, unlike Peter.
[00:46:20] Speaker D: So we'll get him to join us. One week.
[00:46:23] Speaker A: He should, he should. I think he's enjoying retirement, though, for the most part, from the Cloud Pod and from whatever else he was doing. I don't think he's doing much other than spending time with his family. It was well deserved.
All right, gentlemen, see you next week here in the Cloud.
[00:46:37] Speaker B: See you later.
[00:46:38] Speaker C: Bye, everybody.
[00:46:39] Speaker D: Bye, everyone.
[00:46:42] Speaker B: And that's all for this week in Cloud. We'd like to thank our sponsor, Archera. Be sure to click the link in our show notes to learn more about their services.
While you're at it, head over to our website at thecloudpod.net, where you can subscribe to our newsletter, join our Slack community, send us your feedback, and ask any questions you might have. Thanks for listening, and we'll catch you on the next episode.