this post was submitted on 28 Nov 2025
337 points (98.3% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

[–] roofuskit@lemmy.world 75 points 5 days ago (4 children)

Guess what, GPU prices are about to go back up because they rely on RAM.

[–] Kolanaki@pawb.social 83 points 5 days ago (2 children)

Just buy an extra GPU and figure out how to turn its VRAM into regular RAM. 😌

[–] colourlesspony@pawb.social 37 points 5 days ago (1 children)

I know you can create a GPU RAM disk, so you could use it as swap lol.

[–] litchralee@sh.itjust.works 10 points 5 days ago (1 children)

Ok, I have to know: how is this done, and what do people use it for?

[–] Trincapinones@lemmy.dbzer0.com 14 points 5 days ago (1 children)

It uses a GitHub repo that hasn't been updated in 3 years, and it's more of a gimmick, but it's a fun weekend project.

https://www.youtube.com/watch?v=qyzBRsfQ_UM

[–] litchralee@sh.itjust.works 9 points 5 days ago (1 children)

https://github.com/Overv/vramfs

Oh, it's a user space (FUSE) driver. I was rather hoping it was an out-of-tree Linux kernel driver, since using FUSE will: 1) always pass back to userspace, which costs performance, and 2) destroy any possibility of DMA-enabled memory operations (DPDK is a possible exception). I suppose if the only objective was to store files in VRAM, this does technically meet that, but it's leaving quite a lot on the table, IMO.
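For the curious, the swap trick from the video is roughly this (an illustrative sketch only: the mount point, the 4 GiB size, and the loop-device detour are my assumptions, and vramfs's build steps or CLI may have drifted since the repo went quiet):

```shell
# Build vramfs, a FUSE filesystem backed by VRAM via OpenCL.
git clone https://github.com/Overv/vramfs && cd vramfs && make

# Mount some of the card's VRAM as a filesystem.
mkdir -p /tmp/vram
bin/vramfs /tmp/vram 4G &

# swapon can't use a file on FUSE directly, so route it
# through a loop device first.
sudo dd if=/dev/zero of=/tmp/vram/swapfile bs=1M count=4000
LOOP=$(sudo losetup --show -f /tmp/vram/swapfile)
sudo mkswap "$LOOP"
sudo swapon "$LOOP"
```

Tearing it down is the reverse: `swapon` off, detach the loop device, unmount the FUSE mount.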

If this were a kernel module, the filesystem performance would presumably improve, limited by how the VRAM is exposed by OpenCL (ie very fast if it's just all mapped into PCIe). And if it basically offered VRAM as PCIe memory, then this potentially means the VRAM can be used for certain niche RAM cases, like hugepages: some applications need large quantities of memory, plus a guarantee that it won't be evicted from RAM, and whose physical addresses can be resolved from userspace (eg DPDK, high-performance compute). If such a driver could offer special hugepages which are backed by VRAM, then those applications could benefit.

And at that point, on systems where the PCIe address space is unified with the system address space (eg x86), it's entirely plausible to use VRAM as if it were hot-insertable memory, because both RAM and VRAM would occupy known regions within the system memory address space, and the existing MMU would control which processes can access what parts of PCIe-mapped VRAM.

Is it worth re-engineering the Linux kernel memory subsystem to support RAM over PCIe? Uh, who knows. Though I've always liked the thought of DDR on PCIe cards. "All technologies are doomed to reinvent PCIe," as someone from Level1Techs once said, I think.

[–] Illecors@lemmy.cafe 3 points 4 days ago

This is such an incredible write up of something I've never even considered to exist. Thank you!

I'd love to have things like that in a form of a post at !graybeard@lemmy.cafe

[–] afk_strats@lemmy.world 27 points 5 days ago (1 children)
[–] klangcola@reddthat.com 28 points 5 days ago

The article introduction is gold:

In the unlikely case that you have very little RAM and a surplus of video RAM, you can use the latter as swap.

[–] tal@lemmy.today 42 points 5 days ago (1 children)

GPU prices are coming to earth

https://lemmy.today/post/42588975

Nvidia reportedly no longer supplying VRAM to its GPU board partners in response to memory crunch — rumor claims vendors will only get the die, forced to source memory on their own

If that's true, I doubt that they're going to be coming to earth for long.

[–] fleton@lemmy.world 11 points 5 days ago

Have to refill before being sent to titan.

[–] myfunnyaccountname@lemmy.zip 24 points 5 days ago (1 children)

Where are GPU prices coming down? A 5070 Ti is what, 800? A 3090 24 GB is still 800-900. They're not coming down anytime soon.

[–] cyberpunk007@lemmy.ca 7 points 4 days ago

Yup. Bs headline.

[–] jeena@piefed.jeena.net 33 points 5 days ago (7 children)

This is very unfortunate. About a year ago I built my PC and only put in 32 GB of RAM; it was double what I had on my laptop, so I thought it would be enough for a start and I could buy more later.

Already after 2 months I realized I could do so much more in parallel thanks to the fast CPU, but suddenly the amount of RAM became the bottleneck. When I looked at RAM prices it didn't seem quite worth it, so I waited. That backfired, because since then the prices have never gone down, only up.

[–] NotSteve_@piefed.ca 38 points 5 days ago (3 children)

What are you running that needs more than 32 GB? I'm only just barely being bottlenecked by my 24 GB when running games at 4K

[–] comrade_twisty@feddit.org 33 points 5 days ago

Chrome probably

[–] jeena@piefed.jeena.net 7 points 5 days ago (1 children)

Two browsers full of tabs, but that's not a problem. Once I start compiling AOSP (which I sometimes want to do for work at home instead of in the cloud, because it's easier and faster to debug), it eats up all the RAM immediately and I have to give it 40 more GB of swap, and then the swapping is the bottleneck. Once that's running the computer can't really do anything else; even the browser struggles.

[–] usernamesAreTricky@lemmy.ml 10 points 5 days ago (1 children)

Have you tried just compiling it with fewer threads? That would almost certainly reduce the RAM usage, and might even make the compile go faster if you're swapping that heavily
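Capping the job count by available memory instead of CPU count looks something like this (the ~4 GB-per-job ratio is a rough rule of thumb for big C++/Java builds, not an AOSP-documented figure):

```shell
# Pick a parallelism level from available RAM rather than core count,
# so a huge build doesn't dive into swap.
mem_gb=$(free -g | awk '/^Mem:/{print $2}')
jobs=$(( mem_gb / 4 ))               # assume ~4 GB of RAM per compile job
if [ "$jobs" -lt 1 ]; then jobs=1; fi
echo "building with $jobs jobs"
# m -j"$jobs"                        # AOSP's build wrapper accepts -j like make
```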

[–] hoshikarakitaridia@lemmy.world 6 points 5 days ago (5 children)

AI or servers probably. I have 40gb and that's what I would need more ram for.

I'm still salty because I had the idea of going cpu & ram sticks for AI inference literally days before the big AI companies. And my stupid ass didn't buy them in time before the prices skyrocketed. Fuck me I guess.

[–] NotMyOldRedditName@lemmy.world 6 points 5 days ago* (last edited 5 days ago) (3 children)

It does work, but it's not really fast. I upgraded from 32 GB to 96 GB of DDR4 a year or so ago, and being able to play with the bigger models was fun, but it was so slow I couldn't do anything productive with it.

[–] possiblylinux127@lemmy.zip 5 points 5 days ago (1 children)

You're bottlenecked by memory bandwidth.

You need DDR5 with lots of memory channels for it to be useful

[–] jeena@piefed.jeena.net 10 points 5 days ago

I just had a look: on the 2nd of April I paid 67,000 KRW for one 16 GB stick. Now the same one (XPG DDR5 PC5-48000 CL30 LANCER BLADE White) is only sold in pairs, and a pair costs 470,000 KRW in the same shop, so 235,000 KRW per 16 GB stick. That is a price increase of 250%, god damn.
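Sanity-checking that number (pure arithmetic on the prices quoted above):

```shell
old=67000        # KRW per 16 GB stick in April
pair=470000      # KRW for the two-stick kit now
new=$(( pair / 2 ))
echo "$new KRW per stick"                        # → 235000 KRW per stick
echo "$(( (new - old) * 100 / old ))% increase"  # → 250% increase
```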

[–] Burghler@sh.itjust.works 26 points 5 days ago (1 children)

Not for long; GPU board partners will now have to source their own VRAM to manufacture Nvidia cards.

[–] panda_abyss@lemmy.ca 13 points 5 days ago

I hope this is the beginning of the end for the CUDA monopoly. I just want good GPGPU support for numerical code.

[–] fhein@lemmy.world 18 points 5 days ago (1 children)

... I was thinking about buying a 96GB DDR5 kit from the local computer store a few weeks ago, but wasn't sure it was actually worth €700. Checked again now and the exact same product costs €1500. I guess that settles it, 32GB will have to be enough for the next couple of years then.

[–] Holytimes@sh.itjust.works 3 points 5 days ago (2 children)

I've come to learn over the years: if you want to buy computer parts, just do it.

You're actively stupid if you don't, because some bigger idiot with more money than brains will start a new grift that makes everything unaffordable.

Fuck waiting for deals, fuck thinking twice. Just buy it and ignore reality around you, because you'll be pissed either way.

Either a deal comes and you've fucked yourself, or everything goes to the moon and now you have nothing AND you're fucked.

[–] lemming741@lemmy.world 9 points 5 days ago (1 children)

Part of it is the thrill of the hunt. I've caught some great deals over the years stalking marketplace.

Got .iso storage after chia crashed
Got a 3090 after Bitcoin asics took over
Got a 5900x when the X3D parts came out

But I've never seen decent RAM for sale, only single sticks or slow kits.

[–] fhein@lemmy.world 1 points 3 days ago

I just wanted to test whether it was viable to run larger MoE LLMs on CPU, e.g. Qwen3-Next-80B-A3B. Even if I got acceptable generation speeds I'd probably get bored with it after a few hours, as with other local models. Had I got it for €700 it would have been pretty low value for money anyway, since my current RAM is enough for everything else I use the computer for. On the positive side, I can put that money towards a Steam Frame instead.

[–] krooklochurm@lemmy.ca 11 points 5 days ago

So... the models are all trained up, and now they need to run them, is what I'm reading.

You need lots of VRAM to train a model.

An LLM, once trained, can be run in much less VRAM and a lot of regular RAM

[–] W3dd1e@lemmy.zip 10 points 5 days ago (1 children)

Buying used RAM on marketplace and hoping it isn’t broken. Hoping it was just stolen from a Best Buy. Fingers crossed y’all!

[–] cooligula@sh.itjust.works 14 points 5 days ago (4 children)

I just recently bought a Samsung 16GB 5600MT/s stick for 45€ and received a 32GB stick instead! Sorry, but I wanted to brag x)

[–] orochi02@feddit.org 10 points 5 days ago (7 children)

Serious question: when will ram prices go down?

When AI dies.

[–] AliasVortex@lemmy.world 18 points 5 days ago (1 children)

Gamers Nexus did a piece on this, but short of a crash or bubble pop, it's not expected to recover any time soon.

[–] COASTER1921@lemmy.ml 5 points 4 days ago

I think a bubble pop may be closer than people think. Several F500 company CEOs are at least calling it a bubble now, though admittedly they're still 100% on board the AI hype train.

[–] cyberpunk007@lemmy.ca 7 points 4 days ago

Never. Same with GPUs. AI destroyed jobs, the gaming market, and customer service

[–] lightnsfw@reddthat.com 8 points 4 days ago

prices go down?

That's not very capitalism.

[–] Two2Tango@lemmy.ca 3 points 4 days ago* (last edited 4 days ago)

I just bought an Arc A310 for <$200. I couldn't find a 2x16 GB DDR4 kit in stock for less than that on Black Friday... Wild times for the RAM market.

[–] givesomefucks@lemmy.world 9 points 5 days ago (1 children)

Prebuilts have become about the only way to get a deal for a while now.

[–] MantisToboggon@lemmy.world 8 points 5 days ago (1 children)
[–] justdaveisfine@piefed.social 27 points 5 days ago (1 children)
[–] MantisToboggon@lemmy.world 9 points 5 days ago

Oh I like you you can come over and fuck my wife.

[–] utjebe@reddthat.com 4 points 5 days ago

Well that settles my idea of entertaining AM5.
