this post was submitted on 20 Oct 2025
29 points (91.4% liked)

Technology

all 9 comments
fubarx@lemmy.world 2 points 2 days ago

Advantages of running things locally:

  • Savings on electricity, bandwidth, and processing
  • Ability to customize for individuals or families
  • Enhanced privacy
  • Option for future federated/mesh applications
  • Keeps running when the network/cloud goes down (Hello, AWS!)
artyom@piefed.social 3 points 2 days ago

This isn’t speculative, it’s real and running, and it doesn’t pose a lot of the ethical dilemmas other AI applications face. Here’s why I think this matters: The consumer doesn’t have to do anything beyond pressing a button to use it.

  1. Whose data is it trained on? Seems like an ethical dilemma to me.

  2. Even more so than with a web-based LLM, people are going to be unlikely to fact-check the often-incorrect information it feeds them.

  3. Using it will not be the complicated part. Setting it up will be.

manualoverride@lemmy.world 1 points 2 days ago (last edited 2 days ago)
  1. Whose data is it trained on? Seems like an ethical dilemma to me.

Using a standalone LLM for personal use doesn’t seem like an ethical dilemma to me: it has already been trained on the data, and if that data was accessible on the web or via a library, then I don’t see the harm.

Getting small amounts of medium-trust information on a subject is a good way to get someone interested enough to read a book, watch a YouTube video or find a website for more information and validate the AI response.

artyom@piefed.social 1 points 2 days ago

Using a standalone LLM for personal use doesn’t seem like an ethical dilemma to me

What is the ethical dilemma, exactly, and why/how is this different?

Getting small amounts of medium-trust information on a subject is a good way to get someone interested enough to read a book, watch a YouTube video or find a website for more information and validate the AI response.

Again, how is this different? At least the web-based ones actually link to where the info came from...

manualoverride@lemmy.world 1 points 2 days ago

We’re talking about home-use AI searches… You said it was unethical, so maybe you should define exactly why you think this?

Today I wanted to know what the tyre pressures should be for my 2002 Corolla and AI gave me the answer; I would not have bought a book or gone anywhere past the first page of Google for that information.

The possible ethical dilemma is depriving someone of compensation because I used their research and deprived them of potential revenue. In reality, I would never have bought a book on tyre pressures or car maintenance, and it’s unlikely I would ever have visited a site where adverts would have paid the contributors.

Another dilemma is power consumption, but the model is already made, so the training power has already been spent, and my tiny LLM query is going to use far less power locally than a web based search.
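
That comparison can be made concrete with a back-of-envelope sketch. Every figure below is an illustrative assumption, not a measurement — real numbers depend on the device, the model size, and whose cloud estimate you believe:

```python
# Back-of-envelope energy comparison for one short query.
# All figures are assumptions: a small local model answering in
# ~5 s on a ~50 W machine, versus a commonly cited ~0.3 Wh
# estimate for a cloud LLM query (which excludes the network and
# datacentre overhead around it).

local_power_w = 50   # assumed device draw under load, in watts
local_time_s = 5     # assumed time to answer one short query
local_wh = local_power_w * local_time_s / 3600  # watt-hours used

cloud_wh = 0.3       # assumed per-query cloud estimate, in watt-hours

print(f"local: {local_wh:.3f} Wh, cloud: {cloud_wh:.3f} Wh")
```

Under these (contestable) assumptions the local query comes out several times cheaper per query, but the conclusion flips easily if the local hardware is power-hungry or the model is slow.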

For a company that might make money or achieve cost savings from using AI trained on data that some intended only for human use, I can see how this is not always ethical.

artyom@piefed.social 1 points 2 days ago

maybe you should define exactly why you think this?

It's very simple: copyright. You're benefitting from someone else's work without providing them with any compensation for said work. That doesn't suddenly change because the compute happens on your personal computer.

Today I wanted to know what the tyre pressures should be for my 2002 Corolla and AI gave me the answer

If you had actually looked it up, you might have gotten the correct answer, as well as learned that it's printed on the driver's door jamb of every car.

my tiny LLM query is going to use far less power locally than a web based search

Why would you think your local LLM would be any more efficient than a web-based one?

manualoverride@lemmy.world 1 points 2 days ago

This was exactly my point: when it’s for home use, the chance of my depriving anyone of revenue is negligible.

If I’m running a home assistant anyway, not having that assistant constantly connected to the web — relaying my audio, processing it, and sending results back — will use less power.

Finally, thanks to the solar panels on my roof, I can guarantee my searches are powered by 100% sunshine.

swelter_spark@reddthat.com 1 points 2 days ago

My bf wants to set one of these up. Didn't know you could use a pebble, or whatever they call those things.