246: The Cloud Pod Will Never Type localllm Correctly

Episode 246 | February 17, 2024 | 01:03:25

Show Notes

Welcome to episode 246 of The Cloud Pod podcast, where the forecast is always cloudy! This week we’re discussing localllm and just why they’ve saddled us all with that name, saying goodbye to Bard and hello to Gemini Pro, and weighing the pros and cons of helping Skynet eradicate us all. All that and more cloud and AI news, now available for your listening nightmares.

Titles we almost went with this week:

Oracle says hold my beer on Africa

The Cloud Pod Thinks the LLM Maturity Model has More Maturing To Do

There is a Finch Windows Canary in Fargate

New LLM Nightmares

The Cloud Pod Will Never Type localllm Correctly

A big thanks to this week’s sponsor:

We’re sponsorless this week! Interested in sponsoring us and having access to a very specialized and targeted market? We’d love to talk to you. Send us an email or hit us up on our Slack Channel. 

General News

It’s Earnings Time! 

01:42 Microsoft issues light guidance even as Azure growth drives earnings beat 

02:46 Justin- “I don’t think they count the OpenAI customers, do you? Because there’s way more people that have OpenAI usage than 53,000. So I think this is legitimately Azure AI – which is OpenAI under the hood – but specifically paying for that subscription.”

04:19 Alphabet shares slide on disappointing Google ad revenue  

04:51 Justin- “…which is interesting, because you would expect their growth, being tied to Bard and Gemini, to be close to what Microsoft is doing.”

12:02 Amazon reports better-than-expected results as revenue jumps 14% 

14:19 Jonathan – “I think AI is great for tinkering right now, but I think the cloud that’s going to win – and I suspect it’s going to be Amazon despite Google’s early lead – will be the cloud that provides the best tooling around SDLC.”

AI is Going Great (or how ML Makes all Its Money)

17:22 Building an early warning system for LLM-aided biological threat creation 

22:15 Justin- “We assumed Skynet takes us out with nuclear weapons, but we’re teaching it how to make biological weapons. That’ll work even better!”

AWS

22:44 Finch Container Development Tool: Now for Windows 

24:50 AWS Free Tier now includes 750 hours of free Public IPv4 addresses, as charges for Public IPv4 begin

24:58   Justin – “So, thank you for the free ones, but also, I just got a really big increase in my bill for all the IPv4 addresses I have that I can’t turn off because you don’t support IPv6 on those services yet…I really don’t appreciate it. And those 750 free hours? Amazon – you can shove them somewhere.”
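
For a sense of the bill impact Justin is grumbling about, here is a rough back-of-the-envelope sketch in Python. The $0.005 per hour figure is AWS’s published rate; the number of addresses is made up, and note the 750 free hours technically only cover in-use public IPv4 addresses on EC2 for accounts in their first 12 months.

    # Rough monthly cost of public IPv4 addresses under the new pricing.
    # Simplification: the 750 free-tier hours really only apply to EC2
    # in-use addresses on accounts in their first 12 months.
    HOURLY_RATE = 0.005      # USD per public IPv4 address per hour
    HOURS_PER_MONTH = 730    # AWS's usual billing-month approximation
    FREE_TIER_HOURS = 750

    addresses = 12           # hypothetical: load balancers, NAT gateways, EC2, etc.
    billable_hours = max(addresses * HOURS_PER_MONTH - FREE_TIER_HOURS, 0)
    print(f"~${billable_hours * HOURLY_RATE:.2f}/month for {addresses} public IPv4 addresses")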

27:40 Amazon FSx for OpenZFS now supports up to 400,000 IOPS  

29:00 Announcing CDK Migrate: A single command to migrate to the AWS CDK 

29:51   Ryan – “I like features like this, just because anything where you’re taking resources you’ve already deployed and turning them into a stateful representation is a neat tool. It’s super powerful for development.”
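
For anyone curious what you actually end up with: the migration itself is roughly one command (something along the lines of cdk migrate --stack-name MyStack --from-stack --language python, per the announcement), and the generated app contains low-level Cfn* constructs mapped from your template. A minimal sketch of what that can look like, with purely hypothetical stack and resource names:

    # A sketch of the kind of stack `cdk migrate` generates from an existing
    # CloudFormation template: low-level Cfn* constructs, one per resource.
    # Stack and resource names here are hypothetical.
    from aws_cdk import App, Stack, aws_s3 as s3
    from constructs import Construct

    class MigratedStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            # Corresponds to an AWS::S3::Bucket resource in the source template.
            s3.CfnBucket(
                self, "LogsBucket",
                versioning_configuration=s3.CfnBucket.VersioningConfigurationProperty(
                    status="Enabled"
                ),
            )

    app = App()
    MigratedStack(app, "MigratedStack")
    app.synth()

From there it’s ordinary CDK work: cdk diff and cdk deploy against the migrated stack, refactoring the Cfn* constructs into higher-level ones over time.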

40:14 AWS Fargate announces a price reduction for Windows containers on Amazon ECS 

40:44   Justin – “If you HAVE to run Windows containers, this is the *only* way I’d recommend…which, I guess having a price cut is pretty nice. But if this is your model of deployment – try something else. Please.” 

GCP

42:33 Firestore Multiple Databases is now generally available

44:14   Ryan – “We were laughing before the show because we all learned that this was a limitation, and it’s crazy… don’t get me started on how Google provides their managed services; I’m sure that’s what this is. The way they implemented it required these backend connections into your project through your network.”
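
In practice the new capability shows up as a database parameter on the client; the databases themselves are created out of band (gcloud has commands for that). A minimal sketch, assuming the google-cloud-firestore library, with project and database IDs that are purely illustrative:

    # Connecting to both the default and a named Firestore database in the
    # same project. Project and database IDs are illustrative only.
    from google.cloud import firestore

    default_db = firestore.Client(project="my-project")                        # "(default)"
    analytics_db = firestore.Client(project="my-project", database="analytics")

    analytics_db.collection("events").add({"type": "page_view", "path": "/pricing"})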

45:31 Heita South Africa! The new Google Cloud region is now open in Johannesburg   

46:12 Bard’s latest updates: Access Gemini Pro globally and generate images 

48:17 Jonathan – “I think this just confirms our suspicions that Bard was rushed out the door in response to ChatGPT.”

48:51 No GPU? No problem. localllm lets you develop gen AI apps on local CPUs

50:12   Jonathan – “I’m pretty sure they’ve chosen that name just for SEO. This is named purely for SEO, because everyone is searching for Local Llama right now, and that’s Meta’s tool, and you can already run those models locally with the same technology and techniques to quantize the model…this is totally a hack on people who are already running Local Llama.”
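
For the curious, the technique Jonathan describes (quantized models running on ordinary CPUs) is easy to try without localllm at all. This is not localllm’s own interface, just a minimal llama-cpp-python sketch; the model file, thread count, and prompt are illustrative:

    # Run a 4-bit quantized (GGUF) chat model entirely on CPU. Model path,
    # thread count, and prompt are illustrative; this is not the localllm CLI.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # quantized weights, a few GB
        n_ctx=2048,    # context window
        n_threads=8,   # CPU-only; no GPU needed
    )

    out = llm("Q: Why is it so hard to type 'localllm' correctly? A:", max_tokens=64)
    print(out["choices"][0]["text"])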

Azure

56:36 Achieve generative AI operational excellence with the LLMOps maturity model

58:05   Justin – “This is so junior at this moment in time. It’s just covering LLM usage; it’s not covering LLM development or any other LLM use cases. And I expect that in a year it’s just laughed at.”

Oracle

1:01:24 OCI announces plans to expand in Africa 

Closing

And that is the week in the cloud! Just a reminder – if you’re interested in joining us as a sponsor, let us know! Check out our website, the home of The Cloud Pod, where you can join our newsletter and Slack team, send feedback, or ask questions at theCloudPod.net, or tweet at us with the hashtag #theCloudPod.
