this post was submitted on 26 Oct 2025
377 points (90.9% liked)

Technology


I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross-posting so that we can have a conversation about how "work" might be changing with advancements in technology.

The headline is clickbaity: Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (Altman included) doesn't look like work.

The fact is that most of us work many levels of abstraction away from human survival. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor that humans were forced to do generations ago.

In my first job, IT support, it was not lost on me that all day long I pushed buttons to make computers beep in friendlier ways. There was no physical result to see, no produce to harvest, no pile of wood transformed from logs into firewood, nothing tangible to step back and enjoy at the end of the day.

Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding and meaningful work, yet much of the output of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowers become.

top 50 comments
[–] SapphironZA@sh.itjust.works 28 points 2 days ago (2 children)

Executive positions are probably the easiest to replace with AI.

  1. AI will listen to the employees
  2. They will try to be helpful by providing context and perspective based on information the employee might not have.
  3. They will accept being told they are wrong and update their advice.
  4. They will leave the employee to get the job done, trusting that the employee will get back to them if they need more help.
[–] Tire@lemmy.ml 13 points 2 days ago
  1. The AI won’t have a twitter account to go on racist rants.

  2. The AI won’t end up on the Epstein list.

  3. The AI won’t drunkenly send nudes to an intern.

[–] IAmNorRealTakeYourMeds@lemmy.world 3 points 2 days ago (1 children)

Don't executives spend their day talking to AI and doing whatever it says?

[–] SapphironZA@sh.itjust.works 2 points 2 days ago* (last edited 2 days ago)

Exactly. No need to add the executive toxicity filter

[–] Snowclone@lemmy.world 10 points 2 days ago (2 children)

I've worked for big corporations that employ a lot of people. Every job has a metric showing how much money every single task creates. Believe me, they would never pay you if your tasks didn't generate more money than it costs to pay you to do them.

[–] Knock_Knock_Lemmy_In@lemmy.world 3 points 1 day ago (1 children)

Every job has a metric showing how much money every single task they do creates.

Management accountants would love to do this. In practice, you can only do this for low-level, commoditised roles.

[–] Snowclone@lemmy.world 1 points 4 hours ago* (last edited 3 hours ago) (1 children)

Mopping a floor has a determined metric. I'm not kidding. It's a metric. Clean bathrooms are worth a determined dollar amount. It's not simply sales or production; every task has a dollar amount. The amount of time it takes to do the task has a dollar value, determined and on paper. Corporations know what every task is worth in dollars. Processing hazmat? Prevents the fine. Removing trash or pallets? Prevents lawsuits and workplace injuries. Level of light reflected from the floor? Has a multiplier effect on sales. Determined. Defined. Training salespeople on language choices has a massive effect on sales. They know how much money every single task generates: fines or lawsuits prevented, multiplier effects on average ticket sales, training people to say 'highest consumer rated repair services' instead of 'extended warranty'. These are on-paper, defined dollar amounts. There is NO JOB in which you are paid to do something of no financial value. There are no unprofitable positions or tasks.

[–] Knock_Knock_Lemmy_In@lemmy.world 1 points 3 hours ago* (last edited 3 hours ago)

Your examples are all commoditized and measurable. Many roles are not this quantifiable.

There is NO JOB in which you are paid to do something of no financial value.

Compliance, marketing, social outreach, branding.

Putting a $ amount on these and other similar roles is very difficult.

But I agree, if the value added is known to be zero or negative then usually no-one is paid to do it.

There are no unprofitable positions or tasks.

Not when they are set up, but they can become unprofitable over time, and get overlooked.


If OpenAI gets wiped out, maybe it wasn’t even a “real company” to start with

[–] biofaust@lemmy.world 9 points 2 days ago

That would actually be true if companies were run by the people doing the work.

[–] remon@ani.social 10 points 2 days ago (1 children)

This guy needs to find Luigi.

[–] 1985MustangCobra@lemmy.ca 2 points 1 day ago

that's a smart comment to make.

[–] sobchak@programming.dev 17 points 2 days ago (4 children)

The problem is that the capitalist investor class, by and large, determines what work will be done, what kinds of jobs there will be, and who will work those jobs. They are becoming increasingly out of touch with reality as their wealth and power grow, and they seem to be trying to mold the world into something along the lines of what Curtis Yarvin advocates, which most people would consider very dystopian.

This discussion is also ignoring the fact that currently, 95% of AI projects fail, and studies show that LLM use hurts the productivity of programmers. But yeah, there will almost surely be breakthroughs in the future that will produce more useful AI tech; nobody knows what the timeline for that is though.

[–] Tollana1234567@lemmy.today 9 points 2 days ago

It's also hurting students currently in HS and college; they are learning less than before.

[–] mechoman444@lemmy.world 3 points 1 day ago

It’s funny: years ago, a single developer “killing it” on Steam was almost unheard of. It happened, but such cases were few and far between.

Now, with the advent of powerful engines like Unreal 5 and the latest iterations of Unity, practically anyone outside the Arctic Circle can pick one up and make a game.

Is tech like that taking jobs away from the game industry? Yes. Very much so. But since those programs aren’t technically “AI,” they get a pass. Never mind that they use LLMs to streamline the process; they’re fine because they make games we enjoy playing.

But that’s missing the point. For every job that the development of something like “Schedule 1” or “Megabonk” replaced, it enabled ten more people to play and benefit from the final product. Those games absolutely used AI in development, work that once would’ve gone to human hands.

Technology always reduces jobs in some markets and creates new ones in others.

It’s the natural way of things.

[–] LittleBorat3@lemmy.world 6 points 2 days ago

Productivity will rise again and we will not get compensated, even if we all get better, cooler jobs and do the same work 10x more efficiently. Which we won't all get to do; some of us will have no jobs.

Earnings from AI and automation need to be redistributed to the people. If it works, and AI does not blow up in their face because it's a bubble, they will be so filthy rich that they either won't know what to do with it or will lose their grip on reality and try to shape politics, countries, the world, etc.

See the walking k-hole that tried to make things "more efficient".

[–] SocialMediaRefugee@lemmy.world 25 points 2 days ago (1 children)

What do we need the mega rich for anyway? They aren't creative and are easily replaced with AI at this point.

[–] lechekaflan@lemmy.world 6 points 2 days ago

What do we need the mega rich for anyway?

Supposedly the creation of and investment in industries, then managing those businesses, which also supposedly provide employment for the thousands who make things for them. Except they'll find ways to cut costs and maximize profit, like looking for cheaper labor while thinking about building the next megayacht to flex at Monte Carlo next summer.

[–] mp3@lemmy.ca 46 points 3 days ago* (last edited 3 days ago) (1 children)

CEO isn't an actual job either; it's just the 21st century's titre de noblesse.

[–] Curious_Canid@lemmy.ca 28 points 3 days ago (1 children)

Sam Altman is a huckster, not a technologist. As such, I don't really care what he says about technology. His purpose has always been to transfer as much money as possible from investors into his own pocket before the bubble bursts. Anything else is incidental.

I am not entirely writing off LLMs, but very little of the discussion about them has been rational. They do some things fairly well and a lot of things quite poorly. It would be nice if we could just focus on the former.

[–] billwashere@lemmy.world 21 points 2 days ago

Sam, I say this with all my heart…

Fuck you very kindly. I’m pretty sure what you do is not “a real job” and should be replaced by AI.

[–] supersquirrel@sopuli.xyz 122 points 3 days ago (3 children)

Starting this conversation with Sam Altman is like showing up at a funeral in a clowncar

[–] Dojan@pawb.social 40 points 3 days ago (8 children)

At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

I'd not put an LLM in charge of developing a framework that is meant to be used in any sort of production environment. If we're talking about them setting up the skeleton of a project, then templates have already been around for decades at this point. You also don't really set up new projects all that often.

[–] 6nk06@sh.itjust.works 102 points 3 days ago (16 children)

At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

No and no. Have you ever coded anything?

[–] SocialMediaRefugee@lemmy.world 16 points 2 days ago

Can't AI replace Sam Altman?

[–] DupaCycki@lemmy.world 14 points 2 days ago* (last edited 2 days ago) (5 children)

To be fair, a lot of jobs in capitalist societies are indeed pointless. Some of them do nothing but actively subtract value from society.

That said, people still need to make a living and his piece of shit artificial insanity is only making it more difficult. How about stop starving people to death and propose solutions to the problem?

[–] SanicHegehog@lemmy.world 6 points 2 days ago (2 children)

There's a book Bullshit Jobs that explores this phenomenon. Freakonomics also did an episode referring to the book, which I found interesting.

Bullshit Jobs: A Theory is a 2018 book by anthropologist David Graeber that postulates the existence of meaningless jobs and analyzes their societal harm. He contends that over half of societal work is pointless and becomes psychologically destructive when paired with a work ethic that associates work with self-worth.

[–] LittleBorat3@lemmy.world 2 points 2 days ago

The jobs did not start out that way; I guess these people have been tossed to the side and are not where the action currently is.

Yet they are still employed because the boss does not understand what they are doing and they might embellish their contributions etc.

There are so many people who do little, drink free coffee, talk to everyone, and are seen as very social and liked by everyone. They do fucking nothing; I know a handful of them.

[–] lechekaflan@lemmy.world 8 points 2 days ago (1 children)

Thou shalt not make a machine in the likeness of a human mind.

-- The Orange Catholic Bible

Also, that pompous chucklefuck can go fuck himself. There are people who can barely feed themselves on less than a couple of dollars a day.

[–] Halcyon@discuss.tchncs.de 4 points 2 days ago (1 children)

Jobs like air traffic controllers for example?

[–] Telorand@reddthat.com 61 points 3 days ago (4 children)

Cool, know what job could easily be wiped out? Management. Sam Altman is a manager.

Therefore, Sam Altman doesn't do real work. Fuck you, asshole.

[–] MonkderVierte@lemmy.zip 27 points 3 days ago* (last edited 3 days ago) (17 children)

Psychologically speaking, please stop calling it AI. The term raises unrealistic expectations. They are Large Language Models.

[–] FireWire400@lemmy.world 20 points 3 days ago

Raising unrealistic expectations is what companies like OpenAI are all about

[–] maleable@lemmy.world 1 points 1 day ago

This was a great comment on the article. You have true expression in your words, my friend. It was a joy to read.

[–] finitebanjo@lemmy.world 1 points 1 day ago

Why do people still listen to this grifter piece of shit? I really don't get it.

[–] LodeMike@lemmy.today 40 points 3 days ago (3 children)

Says the guy who hasn't worked a day in their life

[–] drmoose@lemmy.world 5 points 2 days ago (5 children)

I've been thinking a lot about this since ChatGPT dropped, and I agree with Sam here despite the article trying to rage-bait people. We simply shouldn't protect the job market from the point of view of identity or status. We should keep an open mind about what jobs and work culture could look like in the future.

Unfortunately, this issue is impossible to discuss without conflating it with general economics and wealth imbalance, so we'll never have an adult discussion here. We can actually have both - review, kill, and create jobs and work cultures, and address wealth imbalance - just not with some single silver-bullet solution.

[–] Sam_Bass@lemmy.world 2 points 2 days ago

If Sam got wiped out, he wouldn't even be a real man anyway
