this post was submitted on 22 Sep 2025
97 points (86.5% liked)

top 31 comments
[–] BzzBiotch@lemmy.world 45 points 3 months ago (2 children)

Let’s throw a clanker at a problem that obviously needs a human solution. What could go wrong?

Also: would it maybe be cheaper to just hire a human for this? Seems like the development of such a machine would be much more expensive than just hiring a guy.

[–] coolmojo@lemmy.world 26 points 3 months ago (1 children)

I suspect a seven year old little girl wouldn’t ask for too much salary either. /s

I'm guessing cute bandaids and an ice cream cone would be sufficient.

[–] Chronographs@lemmy.zip 5 points 3 months ago (1 children)

I mean it’s definitely more expensive to pay someone for their time, assuming you’d have an army of these to replace with people. But it’d also be an actual solution instead of whatever fresh hell this is.

[–] BzzBiotch@lemmy.world 1 points 3 months ago (1 children)

I don’t think it’s more expensive to hire people for this work than to develop and produce a bunch of robots that are obviously not up to the task.

How could they ever be? Loneliness can only be addressed through sincere human contact. Any other (robotic) solution is financially wasteful at best and disastrously counterproductive at worst.

[–] Chronographs@lemmy.zip 3 points 3 months ago

Developing the robots is mostly up-front costs (between paying someone to “code” the thing and then buying the units) vs paying however many employees forever, which is why I say it's more expensive.

If they could make an actual general AI, I think it could help with loneliness, but we won't be there anytime soon, if ever, the way they keep following a strategy of scaling and optimizing a speech center.

[–] ICastFist@programming.dev 31 points 3 months ago (1 children)

The idea isn't new or evil per se, since anyone who has had to spend 2+ days in a hospital knows how dreadful it can feel. The problem here is relying on current AI tech, which may end up suggesting suicide during the chats, and on the companies' pinky promises that your privacy and data won't be leaked or sold to the lowest bidder.

[–] freeman@feddit.org 1 points 3 months ago

I agree, but usually the vendor picks the highest bidder ;)

[–] probable_possum@leminal.space 20 points 3 months ago (1 children)

" Can I come with you?"
-- little boy in Screamers (1995)

[–] IcyToes@sh.itjust.works 3 points 3 months ago (1 children)

Oh shit. That's memories. Creeped me the hell out back in the day...

[–] probable_possum@leminal.space 1 points 3 months ago

Good times. :)

[–] ShieldsUp@startrek.website 19 points 3 months ago* (last edited 3 months ago) (2 children)

I think I would be offended if some robot came up to try to provide emotional support. It's fake, and it reminds me that our society values profit over human life. This should not be normalized as a necessity because the missing money is going into the pockets of administrators, owners, insurance, and whoever else... this is pathetic.

[–] Kolanaki@pawb.social 7 points 3 months ago* (last edited 3 months ago)

Fr, fr. At least when a human fakes it, there is the possibility that they aren't faking it. A machine that is incapable of feeling loses all ambiguity. It's even emptier than just pretending to be sympathetic.

[–] sqgl@sh.itjust.works 2 points 3 months ago (2 children)

The example in the article is of a kid patient.

[–] Landless2029@lemmy.world 4 points 3 months ago

I have to admit, if I were a small child of 8-10 and Eve from Wall-E rolled up, I'd be giddy to play with it, but even then I wouldn't expect it to replace actual human contact.

[–] ShieldsUp@startrek.website 1 points 3 months ago (1 children)

Sure, I just can't help but imagine how I personally would react if this became common for all patients.

[–] sqgl@sh.itjust.works 2 points 3 months ago* (last edited 3 months ago)

I certainly would be outraged to have a bot assigned to me, as an adult.

Worth trying with kids. They play with dolls and imaginary friends, after all.

[–] Telorand@reddthat.com 14 points 3 months ago

Oh good, now they're taking the jobs of certified comfort/support dogs? The solution nobody asked for, JFC.

[–] morto@piefed.social 12 points 3 months ago (1 children)

That would just make me feel fear and loneliness

[–] Agent641@lemmy.world 9 points 3 months ago (2 children)

To help lessen your fear and loneliness, robochild will stand silently by your bedside all night while you sleep, keeping her eyes open without blinking the whole time to make sure you're safe.

[–] Chronographs@lemmy.zip 8 points 3 months ago

Every time you finally start to fall asleep and snore a little too loudly: “Sorry, I didn’t catch that.”

[–] morto@piefed.social 1 points 3 months ago

Doctors, euthanize me, please!

[–] tidderuuf@lemmy.world 9 points 3 months ago

I take it the people who designed this never had an evil genius 7-year-old daughter who constantly terrorized them?

[–] DeathByBigSad@sh.itjust.works 7 points 3 months ago (1 children)

Bruh, just give them a cat to pet lol

[–] onslaught545@lemmy.zip 9 points 3 months ago (1 children)

It's not the best idea to have a source of allergens, dangerous pathogens, and parasites around sick children.

[–] T156@lemmy.world 1 points 3 months ago

Stuffed toy you can autoclave?

[–] markovs_gun@lemmy.world 2 points 3 months ago (1 children)

Is it bad that this is the first thing I thought of? I can't imagine people won't do absolutely horrifying things to a robot programmed to act like a 7-year-old.

https://www.youtube.com/watch?v=XQcNYb3DydA

[–] StopSpazzing@lemmy.world 1 points 3 months ago

Found a former longtime Reddit user.

[–] jjlinux@lemmy.zip 1 points 3 months ago (2 children)

That's an AI pedophile magnet 🤣

[–] kami@lemmy.dbzer0.com 2 points 3 months ago

Or a treatment

[–] helpImTrappedOnline@lemmy.world 2 points 3 months ago

Better to catch them with a robot kid than with real ones.