[–] RedWeasel@lemmy.world 53 points 1 day ago* (last edited 1 day ago) (13 children)

So, around 1947. It took about 14 years from there to being able to put them into chips. So another decade and a half?

Edit: and another 15 to 25 years after that for it to be in consumer households?

[–] kutt@lemmy.world 8 points 17 hours ago (2 children)

I don’t think it will ever reach consumer households, since it requires extremely complex and expensive materials, tools, and physical conditions. A major breakthrough could change that, but it’s highly unlikely.

Also, we don’t really have a use for them, at least not for regular users. They won’t replace classical computers.

But you can already access some QCs online; IBM offers a paid remote API, for instance.
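For example, with Qiskit and IBM's qiskit-ibm-runtime package, submitting a tiny circuit to real hardware looks roughly like this (a sketch only; the token is a placeholder and the exact runtime API shifts between versions):

```python
from qiskit import QuantumCircuit, transpile
from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2 as Sampler

# Build a 2-qubit Bell-state circuit
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()

# Authenticate against IBM's cloud service ("YOUR_API_TOKEN" is a placeholder)
service = QiskitRuntimeService(channel="ibm_quantum", token="YOUR_API_TOKEN")
backend = service.least_busy(operational=True, simulator=False)

# Circuits must be transpiled to the backend's native gate set first
isa_circuit = transpile(qc, backend=backend)

sampler = Sampler(mode=backend)
job = sampler.run([isa_circuit])
print(job.result()[0].data.meas.get_counts())  # roughly half '00', half '11'
```

Jobs sit in a queue with everyone else's, which is a good reminder that these machines are shared lab instruments, not personal hardware.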

[–] baggachipz@sh.itjust.works 6 points 17 hours ago (1 children)

> requires extremely complex and expensive materials, tools and physical conditions.

Counterpoint: they said the same thing when a computer was made of vacuum tubes and took up an entire room just to add two numbers.

[–] kutt@lemmy.world 4 points 17 hours ago (1 children)

Yeah, but you have to consider one other thing. Before building classical computers, we had already theorized them; we had algorithms, etc. We knew why we were creating them.

For QC, the pace of hardware development is outstripping our ability to create algorithms. It's very similar to what's happening with the AI bubble right now: we're investing heavily in a new technology because it looks cool to investors, but we don't even have enough algorithms to run on it. It's just a shit ton of marketing...
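To be fair, a handful of genuine quantum algorithms do exist (Shor's factoring, Grover's search); the list is just short and growing slowly. As a toy illustration, a minimal 2-qubit Grover search in Qiskit looks something like this (a sketch assuming Qiskit's standard gates; the "marked" item here is the state |11>):

```python
from qiskit import QuantumCircuit

# 2-qubit Grover search: one iteration suffices to find the marked state |11>
qc = QuantumCircuit(2)
qc.h([0, 1])   # uniform superposition over all four basis states
qc.cz(0, 1)    # oracle: phase-flip the marked state |11>
qc.h([0, 1])   # diffusion operator (inversion about the mean)...
qc.z([0, 1])
qc.cz(0, 1)
qc.h([0, 1])   # ...measuring now returns '11' with probability ~1
qc.measure_all()
```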

[–] baggachipz@sh.itjust.works 1 points 17 hours ago (1 children)

Yeah, understood. I was just saying that even though it doesn’t seem technically feasible now, don’t discount that it could become so in the future. Whether it would be useful is another debate; I have a hard time believing it has practical uses. If it does, though, the innovation will be rapid, like the shift to silicon transistors (assuming it’s even possible).

[–] kutt@lemmy.world 4 points 16 hours ago

Oh, I'm not saying it's technically impossible; quite the opposite, actually, it's developing extremely fast. And honestly, usefulness and having QCs in our homes are two separate questions. Why would John Doe have a QC at home if he's not trying to create a new medication or simulate a molecule? Probably for the same reasons he doesn't have an MRI machine in his living room :)

[–] RedWeasel@lemmy.world 1 points 16 hours ago (2 children)

Right now I can only see them used as accelerators of some type. I could see them potentially used for GPUs, but generally I suspect some form of compute first. GenAI anyone? SkyNet? But that's only if they can be made portable enough for laptops or phones, which is still a major issue that needs to be addressed.

I don't expect them to replace traditional chips in my lifetime if ever.

> Could see them potentially used for GPUs

Used as GPUs, or like GPUs? The latter, certainly; the former, not so much. They aren't a replacement for current tech; they accelerate completely different things (and currently they do nothing your average consumer would be interested in anyway).

[–] kutt@lemmy.world 2 points 15 hours ago

Yes, they will probably never replace them, because they're actually slower than classical computers at simple calculations.

Quantum ML is actively being researched, though I'm not at all up to date on advances in that field specifically.
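For a rough idea of what that research builds on: most quantum ML proposals use parameterized circuits whose rotation angles a classical optimizer tunes against a loss function. A minimal sketch of that building block, assuming Qiskit (the optimization loop itself is omitted):

```python
from qiskit.circuit import Parameter, QuantumCircuit

# A one-qubit "ansatz" with a trainable rotation angle
theta = Parameter("theta")
qc = QuantumCircuit(1)
qc.ry(theta, 0)
qc.measure_all()

# A classical optimizer would pick candidate values and bind them like so
bound = qc.assign_parameters({theta: 0.5})
```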

But the good news is that they don't need to be portable; we can use them just as we do right now, with remote access!
