This is a wild take. You can get chatbots to vomit out entire paragraphs of published works verbatim. There is functionally no mechanism to a chatbot other than looking at a bunch of existing texts, picking one at random, and copying the next word from it. There's no internal processing or logic you could call creative; it's just sticking one Lego at a time onto a tower, and every Lego is someone's unpaid intellectual property.
There is no definition of plagiarism or copyright that LLMs don't violate extremely hard. They're just getting away with it because of the billions of dollars of capital pushing the tech. I am hypothetically very much for the complete abolition of copyright and the free use of information, but a) that means everyone can copy stuff freely, not just AI companies, and b) it first requires an actually functional society that provides for the needs of its citizens, so they have the time to do things like create art without needing to turn a livable profit at it. And even then, I would still think the current implementation of AI is pretty shitty if it burns the same ludicrous amounts of energy to do its parlor tricks.
The energy costs are overblown. A response costs about 3 Wh, which is roughly 1 minute of runtime for a 200 W PC, or about 10 seconds of a 1000 W microwave. See the calculations made here and below for the energy costs. If you want to save energy, go vegan and ditch your car; completely disbanding ChatGPT amounts to 0.0017% of the CO2 reduction during COVID in 2020 (this guy gave the numbers but had an error in magnitude, which I fixed in my reply; the calculator output is attached). It would help climate activists if they concentrated on something that is worthwhile to criticize.
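For reference, a minimal sketch of the arithmetic behind those equivalences, taking the commenter's ~3 Wh per response figure and the device wattages at face value (none of these numbers are measured here):

```python
# Back-of-the-envelope check of the energy figures above.
# The ~3 Wh per response is the commenter's claim, not a measured value.

RESPONSE_ENERGY_WH = 3.0    # assumed energy per ChatGPT response
PC_POWER_W = 200.0          # typical desktop PC under load
MICROWAVE_POWER_W = 1000.0  # typical microwave

def runtime_seconds(energy_wh: float, power_w: float) -> float:
    """Seconds a device drawing `power_w` watts runs on `energy_wh` watt-hours."""
    return energy_wh / power_w * 3600.0

print(f"200 W PC:         {runtime_seconds(RESPONSE_ENERGY_WH, PC_POWER_W):.0f} s")        # ~54 s, about a minute
print(f"1000 W microwave: {runtime_seconds(RESPONSE_ENERGY_WH, MICROWAVE_POWER_W):.0f} s")  # ~11 s
```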
If I read a book and use phrases from that book in my communication, it is covered under fair use - the same should apply to scraping the web, or else we can close the Internet Archive next. Since LLM output isn't copyrightable, I see no issue with it - and copyright law in the US is an abomination that is only useful as a weapon for big companies; small artists don't really profit from it.
The costs for responses are overblown, but the costs for training are not.
Adding the cost of training, which is a one-time cost, raises ChatGPT's per-response energy use from about 3 Wh to 4 Wh. That's the high-end calculation, btw.
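A rough sketch of that amortization idea. The training-energy and lifetime-request figures below are hypothetical placeholders chosen only to illustrate how a one-time cost spreads over responses; the 3 Wh → 4 Wh claim is the commenter's, not derived from real data:

```python
# How a one-time training cost gets amortized across responses.
# Both inputs below are illustrative assumptions, not measured figures.

TRAINING_ENERGY_WH = 50e9   # hypothetical: ~50 GWh to train the model
LIFETIME_RESPONSES = 50e9   # hypothetical: responses served over the model's lifetime

RESPONSE_ENERGY_WH = 3.0    # per-response inference cost from the comment above

amortized_training_wh = TRAINING_ENERGY_WH / LIFETIME_RESPONSES
total_per_response_wh = RESPONSE_ENERGY_WH + amortized_training_wh

print(f"Amortized training cost: {amortized_training_wh:.1f} Wh per response")
print(f"Total per response:      {total_per_response_wh:.1f} Wh")  # ~4 Wh under these assumptions
```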