Traister101

joined 2 years ago
[–] Traister101@lemmy.today 13 points 9 months ago (1 children)

Yes, but a lot of us who block ads do so largely because they're intolerable. I only started blocking ads at all because of how utterly miserable YouTube ads became.

[–] Traister101@lemmy.today 8 points 9 months ago

While this is pretty hilarious, LLMs don't actually "know" anything in the usual sense of the word. An LLM, or Large Language Model, is basically a system that maps "words" to other "words" so a computer can process language. I.e. all an LLM knows is that when it sees "I love", what probably comes next is "my mom", "my dad", etc. Because of this behavior, and because we can train them on the massive swath of people asking questions and getting answers on the internet, LLMs are, essentially by chance, mostly okay at "answering" a question. Really, though, they're just picking the next most likely word over and over based on their training, which usually ends up reasonably accurate.
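The "pick the most likely next word, over and over" loop can be sketched with a toy bigram model. This is a deliberately simplified illustration, not how real LLMs work (they use neural networks over subword tokens and sample from a probability distribution), but the generation loop is the same basic idea:

```python
from collections import Counter, defaultdict

# Toy bigram "language model": count which word follows which in a
# tiny corpus, then always emit the most frequent successor.
corpus = "i love my mom . i love my dad . i love my cat .".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def generate(start, steps):
    out = [start]
    for _ in range(steps):
        candidates = successors[out[-1]]
        if not candidates:
            break  # nothing ever followed this word in training
        # Greedily pick the most likely next word -- this is the
        # whole "answering" mechanism, just scaled way down.
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("i", 3))  # → "i love my mom"
```

The model has no concept of what "mom" means; it only knows that in its training data, "my" was most often followed by "mom".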

[–] Traister101@lemmy.today 0 points 9 months ago (1 children)

Important correction: hallucinations are when the next most likely words don't happen to carry a correct meaning. LLMs are incapable of deliberately making things up, because they don't know anything to begin with. They're just fancy autocorrect.

[–] Traister101@lemmy.today 1 points 11 months ago

The way to get around it is respecting robots.txt lol

[–] Traister101@lemmy.today -1 points 1 year ago (4 children)

95% of the people in a dictatorship like the dictator! That's crazy
