this post was submitted on 18 Feb 2026
904 points (99.3% liked)

[–] SCmSTR@lemmy.blahaj.zone 1 points 11 hours ago (1 children)

I mean, yes, but also that's a bit nuclear. Machine learning has real, genuinely ethical and responsible uses... The problem is that society has yet to agree on the philosophy of what those are, and most business-first minded people have SUPER shitty, or even completely missing, moral compasses.

So, effectively, yes, what you say. But technically, with much nuance and many caveats, not entirely.

We are clearly not ready as a species to handle it. Though maybe we'll burn the shit out of our hands enough over the coming century to learn. Either way, it's DEFINITELY not an "ignore all risk and run blindly at this shiny new flame" thing, the way a lot of people seem to think of it and treat it.

[–] RalfWausE@feddit.org 2 points 3 hours ago

The thing is: "AI" can be a useful tool in the hands of a competent programmer, media creator and so forth... BUT it is literally the dark side of the Force. Just to bring in the Yoda quote:

Luke: ... Is the dark side stronger?

Yoda: No, no, no. Quicker, easier, more seductive.

The problem is that it allows a horde of fools to create software that is - at best - dysfunctional and at worst genuinely dangerous. While, yes, it has always been possible to fake photographs and fabricate video evidence of events, it used to require money, knowledge, or both. Now anyone can - with nearly no training - create realistic-looking pictures and videos, leading to god knows what.

And don't get me started on the environmental aspects of this technology...

Perhaps some day in the future, when the hype is gone (and hopefully most of the shitty people pushing it with it), it might be possible to use this technology in the right way... but the hype and push for this technology will not go away until we push back against it at least as hard as its proponents push for it.