Harlehatschi

joined 10 months ago
[–] Harlehatschi@lemmy.ml 21 points 6 months ago

But it's 2⁵² addresses for each star in the observable universe. Or in other words: if every star in the observable universe had a planet in the habitable zone, each of those planets would get 2²⁰ times as many IPs as there are IPv4 addresses in total.
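A quick sanity check of that arithmetic, assuming roughly 2⁷⁶ (about 10²³) stars in the observable universe, which is within the commonly cited range:

```python
# IPv6 address space vs. stars in the observable universe.
ipv6_total = 2 ** 128   # total IPv6 addresses
ipv4_total = 2 ** 32    # total IPv4 addresses
stars = 2 ** 76         # assumption: ~10^23 stars

per_star = ipv6_total // stars
print(per_star == 2 ** 52)                 # addresses per star
print(per_star // ipv4_total == 2 ** 20)   # full IPv4 spaces per star
```

Both checks print `True`: each star's share is 2⁵² addresses, which is 2²⁰ complete IPv4 address spaces.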

[–] Harlehatschi@lemmy.ml 25 points 7 months ago

Just when you thought Nvidia couldn't get worse, they praise Trump.

[–] Harlehatschi@lemmy.ml 1 points 7 months ago

But spending a lot of processing power to gain smaller sizes matters mostly in cases where you want to store things long term. And you probably wouldn't want to keep the exact same LLM with the exact same weights around for that long.

[–] Harlehatschi@lemmy.ml 2 points 7 months ago

Yeah, but that would limit it to very few use cases. Most of the time you compress data either to transfer it to a different system or to store it for some time; in both cases you wouldn't want to be tied to the exact same LLM. Which leaves almost no use case.

I mean... cool research... kinda.... but pretty useless.

[–] Harlehatschi@lemmy.ml 6 points 7 months ago* (last edited 7 months ago) (3 children)

Ok, so the article is very vague about what's actually done. But as I understand it, the "understood content" is transmitted and the original data is reconstructed from that.

If that's the case, I'm highly skeptical about the "losslessness", i.e. that the output is exactly the input.

But there are more things to consider, like de-/compression speed and compatibility. I would guess it's pretty hard to reconstruct data with a different LLM, or even a newer version of the same one, so you'd have to make sure you can still decompress your data some years later with a compatible LLM.

And when it comes to speed, I doubt it's nearly as fast as using zlib (which is neither the fastest nor the best-compressing option out there...).
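For comparison, this is what the zlib baseline looks like: a guaranteed-lossless round trip in a few lines of the Python standard library, with no model dependency at decompression time (the payload below is just an illustrative repetitive buffer):

```python
import zlib

# Illustrative payload: repetitive data compresses well.
data = b"some repetitive log line, over and over\n" * 50_000

compressed = zlib.compress(data, level=6)   # default compression level
restored = zlib.decompress(compressed)

print(restored == data)                # lossless by construction
print(len(compressed) < len(data))     # and smaller
```

Any LLM-based scheme has to beat this on ratio by enough to justify being orders of magnitude slower and tied to a specific model version for decompression.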

And all that for a high risk of bricked data.

[–] Harlehatschi@lemmy.ml 56 points 7 months ago

Shoot this fucker in the face already

[–] Harlehatschi@lemmy.ml 1 points 8 months ago

Why would I need AI for that? We should really stop trying to slap AI on everything. Also no, I'm not that big of a fan of wasting energy on web crawlers.

[–] Harlehatschi@lemmy.ml 1 points 9 months ago (1 children)

Living under a rock, eh?