this post was submitted on 10 Mar 2026
51 points (90.5% liked)

Technology


Debian is the latest in an ever-growing list of projects to wrestle (again) with the question of LLM-generated contributions; the latest debate started in mid-February, after Lucas Nussbaum opened a discussion with a draft general resolution (GR) on whether Debian should accept AI-assisted contributions. It seems to have mostly subsided without a GR being put forward or any decisions being made, but the conversation was illuminating nonetheless.

Nussbaum said that Debian probably needed to have a discussion "to understand where we stand regarding AI-assisted contributions to Debian" based on some recent discussions, though it was not clear what discussions he was referring to. Whatever the spark was, Nussbaum put forward the draft GR to clarify Debian's stance on allowing AI-assisted contributions. He said that he would wait a couple of days to collect feedback before formally submitting the GR.

His proposal would allow "AI-assisted contributions (partially or fully generated by an LLM)" if a number of conditions were met. For example, it would require explicit disclosure if "a significant portion of the contribution is taken from a tool without manual modification", and labeling of such contributions with "a clear disclaimer or a machine-readable tag like '[AI-Generated]'." It also spells out that contributors should "fully understand" their submissions and would be accountable for the contributions, "including vouching for the technical merit, security, license compliance, and utility of their submissions". The GR would also prohibit using generative-AI tools with non-public or sensitive project information, including private mailing lists or embargoed security reports.
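The draft GR does not prescribe any tooling, but a machine-readable tag like "[AI-Generated]" is straightforward to detect automatically. As a minimal sketch (the helper name and the convention of placing the tag in a commit-message body are assumptions for illustration, not part of the proposal):

```python
# Sketch: detect the proposed "[AI-Generated]" disclosure tag in a
# commit message. Placing the tag in the message body is an assumed
# convention here; the GR only asks for a clear, machine-readable tag.

AI_TAG = "[AI-Generated]"

def is_ai_disclosed(commit_message: str) -> bool:
    """Return True if the commit message carries the disclosure tag."""
    return AI_TAG in commit_message

msg = (
    "Fix version comparison\n\n"
    "[AI-Generated] Drafted with an LLM; reviewed by the submitter, "
    "who vouches for its merit and license compliance."
)
print(is_ai_disclosed(msg))  # → True
```

A check like this could run in CI or a pre-receive hook to flag tagged contributions for closer review.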

[–] ProdigalFrog@slrpnk.net 2 points 3 hours ago* (last edited 3 hours ago) (1 children)

Unless that AI is hosted locally and trained exclusively on public domain or GPL code (AFAIK, no AI model like that exists), it's both unethical to use and risks corrupting the codebase with proprietary code copied from somewhere else.

If a developer uses an AI hosted in a datacenter, then every use of it encourages the waste of water and fossil fuels to run the datacenter, encourages more of them to be built in vulnerable neighborhoods where residents can't do anything about the pollution they generate, and enriches the pocketbooks of the techno-fascists that run those datacenters.

[–] Venator@lemmy.nz 1 points 3 hours ago* (last edited 3 hours ago) (2 children)

it enriches the pocketbooks of the techno-fascists that run those datacenters.

Depends, in a lot of cases it costs them more money to service the query than they charge 😅.

Although it's all borrowed money so it doesn't matter to them...

Still causing all that havoc on the environment and poisoning codebases with potentially proprietary, stolen code though...

They're gonna be running those data centers regardless though, as most of the compute time is spent on training new models...

[–] ProdigalFrog@slrpnk.net 2 points 3 hours ago

Although it’s all borrowed money so it doesn’t matter to them…

Right, and they tend to convince investors to give them more by the amount of users adopting and using the product, which they can then extrapolate out with wild projections to convince venture capital to give them even more investments.

The more people abandon it entirely, the less venture capital they may be able to entice into their trap.

[–] Venator@lemmy.nz 1 points 3 hours ago

The copyright concerns can be mitigated somewhat by prompting the model to follow existing patterns in the codebase (and double-checking that it has done so when reviewing the generated code).