LifeInMultipleChoice@lemmy.world 2 points 1 day ago (last edited 1 day ago)

Oh, I agree it should be, but following the judge's ruling, I don't see how it could be. You trained an LLM on textbooks that were purchased, not pirated, and the LLM distributed the responses.

(Unless you mean the human reworded them; then yeah, we aren't special, apparently.)

WraithGear@lemmy.world 3 points 1 day ago (last edited 1 day ago)

Yes, on the second part. Just rearranging or replacing words in a text is not transformative, which is a requirement. There is an argument that 'AI' is capable of doing transformative work, but the tokenizing and weighting process is not magic, and in my use of multiple LLMs they do not have an understanding of the material any more than a dictionary understands the material printed on its pages.
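A minimal toy sketch of what I mean by tokenizing and weights (purely illustrative; the vocabulary and vectors are made up, this is not any real model's tokenizer):

```python
# Toy sketch, not a real tokenizer: the point is that the model only ever
# sees integer IDs and the numeric vectors attached to them, never "meaning".
import random

vocab = {"the": 0, "glass": 1, "is": 2, "full": 3, "of": 4, "wine": 5, "water": 6}

def tokenize(text: str) -> list[int]:
    # Words outside the vocabulary simply have no representation here.
    return [vocab[w] for w in text.lower().split() if w in vocab]

random.seed(0)
# The "weights": every token ID maps to an arbitrary list of numbers that
# training nudges around to better predict neighbouring token IDs.
embeddings = {tid: [round(random.uniform(-1, 1), 2) for _ in range(4)]
              for tid in vocab.values()}

ids = tokenize("The glass is full of wine")
print(ids)                            # [0, 1, 2, 3, 4, 5]
print([embeddings[i] for i in ids])   # just lists of floats; nothing about glasses
```

Nothing in there knows what a glass is; it's lookup tables and numbers all the way down.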

An example was the wine glass problem. Art 'AI's were unable to render a wine glass filled to the brim. No matter how it was prompted, or what style it aped, it would fail to do so and report back that the glass was full. But it could render a full glass of water. It didn't understand what a full glass was, not even for the water. How was this possible? Well, there was very little art of a full wine glass, because society has an unspoken rule that a full wine glass is the epitome of gluttony, and wine is to be savored, not drunk. Whereas reference images of full glasses of water were abundant. The model doesn't know what 'full' means, just that pictures of full glasses of water are tied to the phrases 'full', 'glass', and 'water'.
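You can get a crude feel for that with made-up caption counts (purely illustrative numbers, not real training data):

```python
# Purely illustrative: invented caption counts standing in for training data.
# If "full" almost never co-occurs with "wine glass" in the captions, nothing
# ties the word to that kind of image, regardless of what "full" means.
from collections import Counter

captions = (
    ["a full glass of water"] * 600
    + ["a glass of water"] * 300
    + ["a glass of wine"] * 900
    + ["a full glass of wine"] * 3      # rare: nobody photographs these
)
counts = Counter(captions)

wine = {cap: n for cap, n in counts.items() if "wine" in cap}
water = {cap: n for cap, n in counts.items() if "water" in cap}

print(f"'full' share among wine captions:  {wine['a full glass of wine'] / sum(wine.values()):.3f}")    # ~0.003
print(f"'full' share among water captions: {water['a full glass of water'] / sum(water.values()):.3f}")  # ~0.667
```

The association the model can lean on for "full" + "wine" is vanishingly weak, so it falls back on what wine glasses usually look like.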

LifeInMultipleChoice@lemmy.world 3 points 1 day ago (last edited 1 day ago)

Yeah, we had a fun example a while ago, let me see if I can still find it.

We would ask it to create a photo of a cat with no tail.

And then tell it there was indeed a tail, and ask it to draw an arrow to point to it.

It just points to where the tail most commonly is, or where one was said to be in pictures it wasn't actually referencing.

Edit: granted, nowadays it shows a picture of a cat where you just can't see the tail in the picture.