
Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related news or articles.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below are allowed; this includes bots using AI responses and summaries. To ask whether your bot can be added, please contact a mod.
- Check for duplicates before posting; duplicates may be removed.
- Accounts 7 days and younger will have their posts automatically removed.
Approved Bots
No backups, no pity.
This is a crazy use of AI!
What I have been considering, but haven't found a readily available setup for yet: make a user with lots of read permissions (most of /etc, with API keys & passwords kept in separate, excluded files). That could be done with very restrictive sudo patterns. Let the AI run commands under that user directly (it can do sudo -l to get an idea of what it's allowed to do). Then use it like in Star Trek: "Computer, run a level 2 diagnostic".
Not as the centre of attention when fixing a problem, but as additional input / modern rubber ducking.
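A minimal sketch of what those restrictive sudo patterns might look like, assuming a dedicated account called aidiag (the account name and the command list are illustrative, not a recommendation):

```
# /etc/sudoers.d/aidiag -- hypothetical read-only diagnostic account for an AI assistant
# Whitelist a fixed set of inspection commands; nothing here writes or deletes.
Cmnd_Alias DIAG = /usr/bin/journalctl -u *, \
                  /usr/bin/systemctl status *, \
                  /usr/bin/df -h, \
                  /usr/bin/ss -tulpn
aidiag ALL=(root) NOPASSWD: DIAG
```

With a file like this in place, sudo -l run as aidiag would list exactly the DIAG commands, so the AI can discover its own (deliberately narrow) capabilities. The point of the whitelist approach is that anything not explicitly listed, including every destructive command, is simply refused by sudo.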
haha, whoopsie lol :)
Nice ☺
No backup, no mercy.
Has anyone tried a deltree *.* /y when talking to claude? Revenge is a dish best served code.
That's it Son of Anton is banned.
The real reason I hate using LLMs is because I have to "think" like a social, non-software-engineer human.
For whatever fucking reason, I just can't get these things to be useful. And then I see idiots connecting an LLM to production like this.
Is that the problem? I literally can't turn my brain off. The only other group of people that seems nearly universally opposed to LLMs are psychologists and social workers, who are concerned about their negative effects on mental health and their encouragement of abandoning critical thinking.
Like I can't NOT think through a problem. I already know more about my software than the AI could actually figure out. Anytime I go into GitHub Copilot and say "I want this feature", I get some code and the option to apply it. But the generated code is usually a duplicate of something and doesn't pick up or update existing models. The security flaws are rampant, and the generated tests don't do much of any real testing.
You had a backup, right?
lol, lmao even
They had a backup and restored everything. This is clickbait.
No, they only had snapshots, which are not a backup. They were lucky support could restore the data, which by rights should have been wiped.
The only job AI is gonna take is the intern who fucks everything up.
It legitimately is squeezing out the entry level already, and that is its own problem. Maybe it's good for some of us, in that people with experience will be needed for a long time while all these younger people are prevented from getting that experience. But it absolutely sucks for a whole bunch of people trying to build a career, and it will eventually suck for the economy as a whole. AI, whether it's ready or not, or will ever even be what the marketing people claim it is, is leading to a whole lot of shortsighted decisions that are hurting people.