[–] panda_abyss@lemmy.ca 15 points 1 day ago (1 children)

Yeah, but those are all using heaps of proprietary heuristics.

The beauty of LLMs, and one of their most useful tasks, is taking unstructured natural-language content and converting it into structured, machine-readable content.

The core transformer architecture was originally designed for translation, and this is basically just a subset of translation.

This is basically an optimal use case for LLMs.
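
To make that concrete, here's a rough sketch of the unstructured-to-structured step (Python, with a made-up `call_llm` stub and made-up example listing standing in for whatever model and data you'd actually use):

```python
import json

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call (OpenAI, llama.cpp, whatever).
    # Returns a canned reply here so the sketch runs end to end.
    return '{"item": "Thinkpad T480", "price_usd": 250, "pickup_only": true}'

def extract_listing(text: str) -> dict:
    """Turn an unstructured listing into a fixed JSON schema."""
    prompt = (
        "Extract the item name, price in USD, and whether it is pickup only "
        "from the text below. Reply with JSON only, using the keys "
        "item, price_usd, pickup_only.\n\n"
        f"Text: {text}"
    )
    return json.loads(call_llm(prompt))

if __name__ == "__main__":
    ad = "Used Thinkpad T480, 16GB RAM, $250 firm, pickup only, no shipping."
    print(extract_listing(ad))
```

In practice you'd swap the stub for a real API call and validate or retry when the model hands back malformed JSON, but the shape of the task is exactly this: free text in, fixed schema out.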

[–] MolochHorridus@lemmy.ml 6 points 1 day ago (2 children)

Quite obviously not the optimal use case. “The tensor outputs on the 16 show numerical values an order of magnitude wrong.”

[–] JPAKx4@piefed.blahaj.zone 8 points 23 hours ago

That's the hardware issue he was talking about; it has no relation to how effective the LLM is at the task. It sounded like it was mostly a project he was doing for fun rather than out of necessity.

[–] db2@lemmy.world 4 points 1 day ago

Grok says it's right so it must be 🤤