ErmahgherdDavid

joined 1 year ago

Well, Anthropic chose to settle their piracy lawsuit out of court, which probably indicates they thought there was a reasonable chance they could have lost the case. No legally binding precedent has been set yet though, afaik.

Yeah I agree. Small models are the way. You can also use LoRA/QLoRA adapters to "fine tune" the same big model for specific tasks and swap the use case in real time. This is what Apple does with Apple Intelligence. You can outperform a big general LLM with an SLM if you have a nice specific use case and some data (which you can synthesise in some cases).
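For anyone curious what adapter-swapping looks like in practice, here's a minimal sketch using the Hugging Face peft library. The model name and adapter paths are placeholders for illustration, not what Apple (or anyone else) actually ships:

```python
# Sketch of hot-swapping LoRA adapters on one shared base model.
# Model name and adapter paths are placeholders, not a real deployment.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.2-3B")  # one base model in memory
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")

# Attach one task-specific adapter, then register a second one.
model = PeftModel.from_pretrained(base, "./adapters/summarise", adapter_name="summarise")
model.load_adapter("./adapters/codegen", adapter_name="codegen")

# Switch use case at runtime without reloading the multi-GB base weights.
model.set_adapter("summarise")
# ... handle summarisation requests ...
model.set_adapter("codegen")
# ... handle code-generation requests ...
```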

[–] ErmahgherdDavid@lemmy.dbzer0.com 5 points 1 month ago (2 children)

Unlike the dotcom bubble, another big aspect of this one is the unit cost of running the models.

Traditional web applications scale really well. The incremental cost of adding a new user to your app is basically nothing. Fractions of a cent. With LLMs, scaling is linear. Each machine can only handle a few hundred users and they're expensive to run:

Big beefy GPUs are required for inference as well as training, and they need a large amount of VRAM. Your typical home gaming GPU might have 16GB of VRAM, or 32GB if you go high end and spend $2,500 on it (just the GPU, not the whole PC). Frontier models need something like 128GB of VRAM to run, and GPUs manufactured for data centre use cost a lot more: a state-of-the-art Nvidia H200 costs $32k. The servers that can host one of these big frontier models cost, at best, $20 an hour to run and can only handle a handful of user requests, so you need to scale linearly as your subscriber count increases. If you're charging $20 a month for access to your model, you are burning a user's monthly subscription every hour for each of these monster servers you have turned on. And that's generous: it assumes you're not paying the "on-demand" price of $60/hr.
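Back-of-the-envelope, using the illustrative figures above (my rough numbers, not anyone's actual pricing):

```python
# Back-of-the-envelope unit economics using the illustrative figures above.
SERVER_COST_PER_HOUR = 20.0      # reserved price for a multi-GPU inference server (USD)
SUBSCRIPTION_PER_MONTH = 20.0    # what a typical subscriber pays (USD)
HOURS_PER_MONTH = 24 * 30

monthly_server_cost = SERVER_COST_PER_HOUR * HOURS_PER_MONTH
subscribers_needed_per_server = monthly_server_cost / SUBSCRIPTION_PER_MONTH

print(f"One always-on server burns ${monthly_server_cost:,.0f}/month")
print(f"= {subscribers_needed_per_server:,.0f} subscriptions per server just to break even on compute")
```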

Sam Altman famously said OpenAI are losing money on their $200/mo subscriptions.

If/when there is a market correction, a huge factor in how much interest continues (like with the internet after dotcom) is whether the quality of output from these models justifies the true, unsubsidised price of running them. I do think local models powered by things like llama.cpp and Ollama, which can run on high-end gaming rigs and MacBooks, might be a possible direction for these models. Currently, though, you can't get the same quality from these small, local LLMs as you can from the state-of-the-art models.
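Running a small quantised model locally is only a few lines with llama-cpp-python, for example; the GGUF file path here is a placeholder for whatever model you've downloaded:

```python
# Minimal local-inference sketch with llama-cpp-python; the model path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-model-q4_k_m.gguf",  # quantised weights that fit on a gaming GPU / MacBook
    n_gpu_layers=-1,   # offload all layers to the GPU if VRAM allows
    n_ctx=4096,        # context window
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarise why local inference avoids per-token API costs."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```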

This is nice to see. I wish they'd actually do something about our water companies too.

[–] ErmahgherdDavid@lemmy.dbzer0.com 6 points 1 month ago (1 children)

We all have to feel sorry for the drones?

[–] ErmahgherdDavid@lemmy.dbzer0.com 9 points 1 month ago* (last edited 1 month ago) (2 children)

Relative privation is when someone dismisses or minimizes a problem simply because worse problems exist: "You can't complain about X when Y exists."

I'm talking about the practical reality that you must prioritize among legitimate problems. If you're marooned at sea in a sinking ship, you need to repair the hull before you try to fix the engines in order to get home.

It's perfectly valid to say "I can't focus on everything, so I will focus on the things that provide the biggest and most tangible improvement to my situation first". It's fallacious to say "Because worse things exist, AGI concerns don't matter."

[–] ErmahgherdDavid@lemmy.dbzer0.com 8 points 1 month ago* (last edited 1 month ago) (4 children)

Here's how I see it: we live in an attention economy where every initiative with a slew of celebrities attached to it is competing for eyeballs and buy-in. It adds to information fatigue and analysis paralysis. In a very real sense, if we are debating AGI we are not debating the other stuff. There are only so many hours in a day.

If you take the position that AGI is basically not possible, or at least many decades away (I have a background in NLP/AI/LLMs and I take this view - not that it's relevant in the broader context of my comment), then it makes sense to tell people to focus on solving more pressing issues, e.g. nascent fascism, climate collapse, late-stage capitalism, etc.

[–] ErmahgherdDavid@lemmy.dbzer0.com 17 points 2 months ago* (last edited 2 months ago)

The Telegraph is a low-quality right-wing rag in the UK that frames everything with a hint of xenophobia. I'd say it's a pretty poor source for "world news". Definitely take it with a large dose of salt.

Edit: MBFC rating is "mixed" with right-wing bias

Presumably it's selling snake oil and convincing people to trust them?

In the United Kingdom, yes, because of our authoritarian Online Safety Act, which came into force earlier this year. If I join a Discord channel marked as NSFW I get prompted for ID, which I bypass with a VPN in another country.

[–] ErmahgherdDavid@lemmy.dbzer0.com 9 points 2 months ago* (last edited 2 months ago)

The issue with this is that something like 95% of Android devices run Android with GApps, and re-imaging your phone is becoming increasingly difficult as manufacturers lock down their bootloaders. Normies who aren't technical are not gonna want to mess with that shit. I'm not saying this to instill hopelessness, but to highlight that it's a challenge.

For censorship circumvention, mainstream tech is gonna continue to be increasingly useless. We need to educate people about these matters and also try to encourage people to lean heavily into decentralised comms like LoRa/Meshtastic.

They're banning non-Google-DRMed app installs.
