this post was submitted on 30 Jan 2026
328 points (99.1% liked)

The immediate catalyst, it seems, is an intensifying focus on capex, or capital expenditures. Microsoft revealed that its spending surged 66% to $37.5 billion in the latest quarter, even as growth in its Azure cloud business cooled slightly. Even more concerning to analysts, however, was a new disclosure, made after Wednesday afternoon’s earnings report, that approximately 45% of the company’s $625 billion in remaining performance obligations (RPO)—a key measure of future cloud contracts—is tied directly to OpenAI. (Microsoft is both a major investor in and a provider of cloud-computing services to OpenAI.)

[–] postscarce@lemmy.dbzer0.com 1 points 2 hours ago

LLMs could theoretically give a game a lot more flexibility by responding dynamically to player actions, creating custom dialogue, etc., but as you say, it would work best as a module within an existing framework.
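
To make that concrete, here's a rough sketch of how an LLM-backed dialogue module might slot into an existing dialogue framework. Everything here (the `DialogueProvider` interface, the class and method names) is made up for illustration; the point is just that the LLM piece sits behind the same interface as the scripted dialogue and can fall back to it.

```python
from abc import ABC, abstractmethod


class DialogueProvider(ABC):
    """Interface the game's existing dialogue system calls into (hypothetical)."""

    @abstractmethod
    def get_line(self, npc_id: str, player_utterance: str, context: dict) -> str:
        ...


class ScriptedDialogue(DialogueProvider):
    """Classic branching dialogue: fixed lines keyed by NPC and quest state."""

    def __init__(self, script: dict):
        self.script = script

    def get_line(self, npc_id: str, player_utterance: str, context: dict) -> str:
        return self.script.get(npc_id, {}).get(context.get("quest_stage"), "...")


class LLMDialogue(DialogueProvider):
    """LLM-backed dialogue that falls back to scripted lines on any failure."""

    def __init__(self, client, fallback: DialogueProvider):
        self.client = client      # some LLM client, local or remote (assumed)
        self.fallback = fallback

    def get_line(self, npc_id: str, player_utterance: str, context: dict) -> str:
        try:
            return self.client.generate(npc_id, player_utterance, context)
        except Exception:
            # If the model is slow, offline, or goes off the rails,
            # the existing framework still works as before.
            return self.fallback.get_line(npc_id, player_utterance, context)
```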

I bet some of the big game dev companies are already experimenting with this, and in a few years (maybe a decade, considering how long it takes to develop an AAA title these days) we will see RPGs with NPCs you can actually chat with, who stay in character and respond to what you do. Of course, that would probably mean API calls to the publisher’s server where the custom models are run, with all of the downsides that entails.
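
And a rough sketch of that last part: a character sheet baked into the prompt to keep the NPC in character, plus a call to a publisher-hosted endpoint. The URL, payload shape, and field names are all invented; a real setup would also need auth, caching, and latency handling.

```python
import requests

# Hypothetical character sheet; the "stays in character" behaviour would
# mostly come from constraints like these in the system prompt.
CHARACTER_SHEET = (
    "You are Mirella, a blacksmith in the town of Harrow. "
    "You know nothing of the world outside the game. "
    "Stay in character; never mention being an AI. "
    "The player currently has {reputation} reputation with you."
)


def npc_reply(player_utterance: str, game_state: dict) -> str:
    # Invented publisher-hosted endpoint and payload, for illustration only.
    resp = requests.post(
        "https://npc-api.example-publisher.com/v1/dialogue",
        json={
            "system": CHARACTER_SHEET.format(reputation=game_state["reputation"]),
            "messages": [{"role": "player", "content": player_utterance}],
        },
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["reply"]
```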