this post was submitted on 11 Apr 2025
102 points (97.2% liked)

[–] besselj@lemmy.ca 44 points 2 weeks ago* (last edited 2 weeks ago) (10 children)

Call it what you will, but all signs seem to indicate that generative AI is simply not as profitable as the evangelists want it to be.

[–] Voroxpete@sh.itjust.works 19 points 2 weeks ago (6 children)

Not even remotely. LLMs have failed to find any viable market fit.

The core problems continue to be hallucinations and limited utility, compounded by the fact that LLMs are very expensive to run. The cost wouldn't really matter if LLMs were truly capable of replacing a human employee, but they're not. They're too unreliable for any serious enterprise-grade application, and too expensive for any low-stakes one.

For example, a lot of people quite like them as coding assistants. But as a replacement for a human coder, they're a disaster. That means you still have to employ the expensive human, and you also have to pay an exorbitant monthly fee for what amounts to a very cool search engine.

There are tonnes of frivolous applications where they work really well. The AI girlfriend stuff, for example. A chatbot that sexts you is a very sellable product, regardless of how icky it might seem to some people. But no one is going to pay over $200/month for it (and, as an example, ChatGPT still doesn't turn a profit even at its $200/month tier).

LLMs are too unreliable to make anything better than toys, but too expensive to sell as toys.

[–] hubobes@sh.itjust.works 5 points 2 weeks ago

That is not true; we use vision/LLM models to extract and structure data from documents. It is amazing how easy it is compared to OCR pipelines, where you need to know where each piece of information is located.
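
For what it's worth, here's a minimal sketch of that kind of extraction, assuming the OpenAI Python SDK with a vision-capable model; the document type and field names (invoice_number, issue_date, total) are just hypothetical examples, not necessarily what any given shop actually runs:

```python
# Minimal sketch: schema-driven extraction from a document image.
# Assumes the OpenAI Python SDK and a vision-capable model; the
# InvoiceData fields are hypothetical examples.
import base64

from openai import OpenAI
from pydantic import BaseModel


class InvoiceData(BaseModel):
    invoice_number: str
    issue_date: str
    total: float


client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("invoice.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

# Ask the model to return data matching the schema, instead of locating
# fields by position the way a classic OCR-plus-templates pipeline would.
completion = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Extract the invoice fields from this document."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
            ],
        }
    ],
    response_format=InvoiceData,
)

print(completion.choices[0].message.parsed)
```

The point being that the schema replaces the per-layout template and coordinate logic an OCR pipeline would need.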
