this post was submitted on 27 Apr 2026
129 points (97.8% liked)


I love how corpos can just change the rules at will.

Edit: New prices:

https://docs.github.com/en/copilot/reference/copilot-billing/models-and-pricing

And if you look at the old pricing structure, some of the models are increasing by 27x

all 42 comments
[–] halfdane@piefed.social 24 points 1 hour ago

I mean, if AI gets too expensive, companies can always hire juniors to replace them 🀣

[–] civ@lemmy.civl.cc 4 points 1 hour ago (1 children)
[–] chronicledmonocle@lemmy.world 1 points 12 minutes ago

The tears are delicious

[–] UnspecificGravity@piefed.social 33 points 3 hours ago (1 children)

Gonna be hilarious when the people who haven't been paying attention realize that they just replaced workers with shit that doesn't work AND actually costs more.

[–] disorderly@lemmy.world 15 points 1 hour ago

Yep, I've been telling anyone who'll listen: if you really want to drop juniors and give tools to seniors, then you have to pay the monthly cost (whatever it turns out to be), and you have to be ready to foot the big bill in 5 years when your seniors (with no candidate replacements) say they'll take a 50% raise or walk.

[–] Fedditor385@lemmy.world 13 points 2 hours ago

This is the right way to go. This will incentivize many companies to rethink their strategy and slow down or scale back AI adoption. After that, when revenue drops for many AI companies, they will back off from purchasing all possible RAM and storage in existence, which will drive down pricing. And when prices get back to normal, we will simply buy more RAM for our local machines and run free models.

This news kinda makes me happy. Shit's starting to fall apart. Finally.

[–] eager_eagle@lemmy.world 59 points 4 hours ago* (last edited 4 hours ago) (3 children)

Users on annual Pro or Pro+ plans will remain on their existing plan with premium request-based pricing until their plan expires, however, model multipliers will increase on June 1 (see table).

holy shit, 9x the previous cost. which was already not great. I was on the fence about cancelling it, but thanks for making up my mind, MS

[–] panda_abyss@lemmy.ca 1 points 2 minutes ago* (last edited 2 minutes ago)

They should really describe this as you’re on the same plan, but your plan gives you 80-88% less use.
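The "80-88% less use" figure above works out mechanically once you treat the multiplier as a drain on a fixed allowance. A quick sketch, using hypothetical multiplier values and a hypothetical monthly allowance rather than GitHub's actual pricing table:

```python
# Sketch: how a model-multiplier increase shrinks effective usage,
# assuming a fixed monthly allowance of premium requests.
# All numbers here are hypothetical, not GitHub's actual figures.

def effective_requests(allowance: int, multiplier: float) -> float:
    """Each request consumes `multiplier` units of the allowance."""
    return allowance / multiplier

allowance = 300  # hypothetical monthly premium-request allowance

old = effective_requests(allowance, 1.0)       # 300 usable requests
new_low = effective_requests(allowance, 5.0)   # 60 usable requests
new_high = effective_requests(allowance, 9.0)  # ~33 usable requests

print(f"5x multiplier: {1 - new_low / old:.0%} less use")
print(f"9x multiplier: {1 - new_high / old:.0%} less use")
```

A 5x multiplier leaves 20% of the old usage (80% less), and a 9x multiplier leaves about 11% (roughly 89% less), which is the range the comment is gesturing at.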

[–] Rhaedas@fedia.io 14 points 3 hours ago

That's been their business model for a while now. "Here's something you also didn't ask for"

[–] unitedwithme@lemmy.today 1 points 4 hours ago* (last edited 4 hours ago) (1 children)

You're looking at Claude, I don't see Copilot

Edit: ignore me, I finally reviewed the article and it's through GH, not the actual MS 365 page.

[–] terabyterex@lemmy.world 8 points 4 hours ago

copilot isn't a model. it's a front end that lets you pick a model. that table shows usage costs.

[–] henfredemars@infosec.pub 50 points 4 hours ago (2 children)

Tale as old as time. Corpos try to get you dependent and then give your business an atomic wedgie.

[–] eager_eagle@lemmy.world 4 points 57 minutes ago (1 children)

watch me go back to debugging like a real engineer: copying and pasting from stack overflow

[–] AceBonobo@lemmy.world 1 points 38 minutes ago (1 children)

Stack overflow is not what it used to be

[–] eager_eagle@lemmy.world 1 points 32 minutes ago

at this rate, it will be

[–] XLE@piefed.social 28 points 4 hours ago (1 children)

The good news is that none of the companies pushing these products have created the dependency yet, and they are running out of venture capital almost too fast to have the option.

[–] henfredemars@infosec.pub 10 points 3 hours ago (1 children)

I really hope you're right. My employer is using it as a crutch. I don't think they can stop using AI because they just don't have enough skilled employees to deliver on their commitments. They would pay nearly any price, and I'm sure they're not alone.

[–] empireOfLove2@lemmy.dbzer0.com 21 points 3 hours ago (1 children)

They would pay nearly any price

Literally any price except paying skilled employees.

[–] boonhet@sopuli.xyz 4 points 2 hours ago* (last edited 2 hours ago) (1 children)

That costs actual money though

For the cost of one employee you can give 5 employees AI and tell them to work 10x faster while they have to wrestle the stupid AI.

[–] XLE@piefed.social 2 points 45 minutes ago

You aren't wrong, but even if LLMs didn't exist, employers would invent a scapegoat to make the same demands of their employees.

[–] Ilixtze@lemmy.ml 9 points 3 hours ago* (last edited 3 hours ago)

Getting de-skilled is starting to look like a very expensive gamble and this is just the first price hike, expect more to come. And expect them to criminalize open source models as well with some national security concerns or something.

[–] panda_abyss@lemmy.ca 15 points 3 hours ago (2 children)

Inline completions are genuinely useful, I’m mostly replacing them with local models though. They are slower but free as in beer (once you pay the hardware cost).

[–] eager_eagle@lemmy.world 1 points 53 minutes ago (1 children)

I'm interested in setting it up, are you using vs code? Which extension or editor?

[–] panda_abyss@lemmy.ca 2 points 45 minutes ago* (last edited 45 minutes ago)

I’m using vim with minuet-ai, and it plugs the AI suggestion into my completion module. I found the Copilot style virtual text interfaces all janky.

[–] frongt@lemmy.zip 2 points 2 hours ago (1 children)

So, not free. Just capital expense instead of operational.

[–] panda_abyss@lemmy.ca 3 points 2 hours ago

Sure, but hopefully small code completion (2-4b range) models can run locally on a lot of things. They’re just less good.
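For anyone curious what "running a small model locally" looks like in practice: most local runtimes (llama.cpp's `llama-server`, Ollama, and others) expose an OpenAI-compatible HTTP API, so a completion request is just a small JSON POST. A sketch below; the URL, port, and model name are assumptions you'd swap for whatever you actually run:

```python
# Sketch: asking a small local model for an inline code completion
# through an OpenAI-compatible server (e.g. llama.cpp's llama-server
# or Ollama). The endpoint URL and model name are placeholders.
import json
import urllib.request

def build_completion_request(prefix: str,
                             model: str = "qwen2.5-coder-3b",
                             max_tokens: int = 64) -> dict:
    """Payload for a prefix completion; low temperature keeps
    suggestions stable enough for inline use."""
    return {
        "model": model,
        "prompt": prefix,
        "max_tokens": max_tokens,
        "temperature": 0.2,
        "stop": ["\n\n"],  # stop at a blank line, a common completion boundary
    }

def complete(prefix: str,
             url: str = "http://localhost:8080/v1/completions") -> str:
    """POST the payload and return the first suggested completion."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_completion_request(prefix)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]
```

Editor plugins like the minuet-ai setup mentioned upthread are essentially wrappers around this kind of request, wired into the editor's completion UI.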

[–] BannedVoice@lemmy.zip 11 points 3 hours ago (2 children)

Claude is cutting back on usage policy for pro users…

GitHub CoPilot is now doing it too…

It’s not hard to see the future: AI companies bought up all the RAM, creating a shortage that raises prices for all of us across the board.

Now they’re going to throttle and limit access to it behind a paywall per transaction. The future is so stupid. Can we just go back to dial up internet and IRC? It was a much simpler time.

[–] DeckPacker@piefed.social 8 points 2 hours ago

You could also just not use these services. You don't need them. They were a stupid idea to begin with.

[–] Casterial@lemmy.world 3 points 3 hours ago

It's why I only use the free versions of these lol

[–] vala@lemmy.dbzer0.com 8 points 3 hours ago (1 children)

Well, I'm officially canceling my GH copilot sub. Wtf is this?

Been feeling like we were going to see a cheap ai compute rug pull soon.

Imagine paying for gpt-4o in 2026.

[–] calcopiritus@lemmy.world 20 points 3 hours ago

Wtf is this?

The most foreseeable event of the last 20 years.

Massive out of this world investment + no demand = prices so cheap they were operating at a huge loss

Operating at a huge loss + time = huge enshittification

Raising prices is the easiest form of enshittification. Ads are coming too. Then comes feature degradation: incorporating more features that no one wants, and bundling with other services that no one wants.

[–] unitedwithme@lemmy.today 16 points 4 hours ago (2 children)

Of course it is... Funny though, bc nobody really uses it at work, so our pricing should theoretically go down... But I doubt it, MS will find a way to make it go up

[–] trem@lemmy.blahaj.zone 2 points 2 hours ago

Well, base prices stay the same. They seem to just be billing more per usage on top of that...

[–] vane@lemmy.world 8 points 3 hours ago (1 children)

All the 0x multiplier models are gone from the pricing.

[–] Nighed@feddit.uk 2 points 3 hours ago

People were using free agent swarms for stuff.

[–] GreenBeanMachine@lemmy.world 2 points 3 hours ago

Time to try those Chinese models. They just released a new Deepseek V4 Pro and I'm hearing great things and it's super cheap

[–] Cornpop@lemmy.world 1 points 2 hours ago (1 children)

So should I just switch to Claude max plan now?

[–] Nighed@feddit.uk 2 points 3 hours ago (1 children)

The almost endless opus usage couldn't last to be fair. I assume those costs are closer to the real cost to provide the service, ouch!

[–] stsquad@lemmy.ml -2 points 3 hours ago (1 children)

On the potentially bright side maybe this will make people think harder about which model to use for which task. You don't need to feed your entire code base into Opus when a Gemini Flash sub-agent can do a perfectly fine job running grep and compiling a summary for the main agent.
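The routing idea above is easy to sketch: classify each sub-task and send the mechanical context-gathering work to a cheap model, reserving the expensive one for actual reasoning. The model names and multiplier values below are hypothetical placeholders, not any vendor's real pricing:

```python
# Sketch: routing agent sub-tasks by model cost, per the comment above.
# Names and multipliers are hypothetical, not a real pricing table.

PRICING = {           # model -> premium-request multiplier (hypothetical)
    "opus-class": 10.0,
    "flash-class": 0.25,
}

CHEAP_TASKS = {"grep", "search", "summarize"}

def pick_model(task: str) -> str:
    """Cheap model for mechanical context-gathering, expensive
    model for reasoning and code generation."""
    return "flash-class" if task in CHEAP_TASKS else "opus-class"

def plan_cost(tasks: list[str]) -> float:
    """Total multiplier cost of a task plan under this routing."""
    return sum(PRICING[pick_model(t)] for t in tasks)

tasks = ["grep", "summarize", "search", "generate"]
routed = plan_cost(tasks)                    # cheap sub-agents + one big call
naive = len(tasks) * PRICING["opus-class"]   # everything on the big model
```

With these placeholder numbers, routing three context-gathering passes to the cheap model cuts the plan from 40 units to 10.75, which is the kind of saving the comment is pointing at.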

[–] boonhet@sopuli.xyz 4 points 2 hours ago* (last edited 2 hours ago)

If I have to think about that I'd honestly rather just do the work myself. Less effort and I trust the author more