
[–] JeeBaiChow@lemmy.world 80 points 1 day ago (3 children)

Good read. Funny how I always thought the sensor read RGB directly, instead of simple light levels behind a filter pattern.
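
In rough numpy terms (a toy sketch with a made-up scene and an RGGB layout, not any particular camera's pipeline), what the sensor records is just one light level per photosite:

```python
import numpy as np

# Made-up full-colour scene, shape (H, W, 3): the light arriving at the
# sensor, not something the sensor itself ever stores.
scene = np.random.rand(4, 4, 3)

# RGGB Bayer layout: which colour filter sits over each photosite.
# 0 = R, 1 = G, 2 = B
bayer = np.empty((4, 4), dtype=int)
bayer[0::2, 0::2] = 0
bayer[0::2, 1::2] = 1
bayer[1::2, 0::2] = 1
bayer[1::2, 1::2] = 2

# What the sensor actually reads out: one intensity per photosite,
# i.e. just the scene channel that site's filter lets through.
raw = np.take_along_axis(scene, bayer[..., None], axis=2)[..., 0]
print(raw.shape)  # (4, 4) -- plain light levels; colour comes later, in demosaicing
```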

[–] _NetNomad@fedia.io 32 points 1 day ago (1 children)

wild how far technology has marched on and yet we're still essentially using the same basic idea behind Technicolor. but hey, if it works!

[–] GamingChairModel@lemmy.world 5 points 18 hours ago

Even the human eye basically follows the same principle. We have three types of cones, each sensitive to a different range of wavelengths, and each cone cell reports only one number: the intensity of light hitting it within its sensitivity range. Our visual cortex combines those one-dimensional inputs from both eyes, plus the information from the color-blind rods, into a seamless single image.
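
As a toy model of that idea (made-up Gaussian curves standing in for the real cone sensitivities, so purely an illustration):

```python
import numpy as np

wavelengths = np.arange(380, 701)  # nm

def fake_cone(peak, width):
    # Crude Gaussian stand-in for a cone's sensitivity curve (not real data).
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

cones = {"S": fake_cone(440, 25), "M": fake_cone(535, 35), "L": fake_cone(565, 35)}

# Some incoming light, here a narrowband-ish source around 550 nm.
spectrum = fake_cone(550, 10)

# Each cone type reports a single scalar: its sensitivity-weighted intensity.
responses = {name: float((curve * spectrum).sum()) for name, curve in cones.items()}
print(responses)  # three numbers in, one perceived colour out
```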

[–] TheBlackLounge@lemmy.zip 13 points 1 day ago

You could see the little 2x2 blocks as one pixel and call it RGGB. It's done like this because our eyes are much more sensitive to the middle (green) wavelengths; even the red and blue cones pick up some green. So detail in that range matters much more.

A similar thing is done in JPEG: the chroma channels get subsampled, while the full-resolution luma channel, which is weighted mostly toward green, carries most of the information.
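
Concretely, JPEG converts RGB to one luma plus two chroma channels and only subsamples the chroma; the standard BT.601 luma weights show how much green dominates. A rough numpy sketch:

```python
import numpy as np

# BT.601 weights used in JPEG's RGB -> YCbCr step: green dominates the
# full-resolution luma channel.
LUMA_WEIGHTS = np.array([0.299, 0.587, 0.114])  # R, G, B

rgb = np.random.rand(8, 8, 3)            # toy image
luma = rgb @ LUMA_WEIGHTS                # Y, kept at full resolution
cb = 0.564 * (rgb[..., 2] - luma)        # blue-difference chroma
cr = 0.713 * (rgb[..., 0] - luma)        # red-difference chroma
cb_sub, cr_sub = cb[::2, ::2], cr[::2, ::2]   # 4:2:0-style chroma subsampling
print(luma.shape, cb_sub.shape)          # (8, 8) (4, 4)
```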

[–] Davel23@fedia.io 20 points 1 day ago (3 children)

For a while the best/fanciest digital cameras had three CCDs, one for each RGB color channel. I'm not sure if that's still the case or if the color filter process is now good enough to replace it.

[–] CookieOfFortune@lemmy.world 8 points 1 day ago (2 children)

There are some sensors that have each color stacked vertically instead of using a Bayer filter. Don’t think they’re popular because the low light performance is worse.

[–] GreyEyedGhost@piefed.ca 4 points 21 hours ago

This was sold by Foveon, which had some interesting differences. The sensors were layered, which, among other things, meant that moiré patterns didn't occur on them.

[–] Natanael@infosec.pub 1 points 20 hours ago

Some Sony phones have that type of sensor

[–] lefty7283@lemmy.world 7 points 1 day ago (1 children)

At least for astronomy, you just have one sensor (they’re all CMOS nowadays) and rotate out the RGB filters in front of it.
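
Once you have the separate filtered exposures, combining them is basically just stacking the aligned monochrome frames into colour channels. A minimal sketch (assuming numpy and frames that are already registered and calibrated):

```python
import numpy as np

def combine_rgb(r_frame, g_frame, b_frame):
    """Stack three aligned monochrome exposures (shot through R, G and B
    filters on a filter wheel) into one colour image, scaled to 0..1."""
    rgb = np.stack([r_frame, g_frame, b_frame], axis=-1).astype(float)
    rgb -= rgb.min()
    if rgb.max() > 0:
        rgb /= rgb.max()
    return rgb

# Toy frames standing in for real exposures.
r, g, b = (np.random.rand(100, 100) for _ in range(3))
print(combine_rgb(r, g, b).shape)  # (100, 100, 3)
```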

[–] trolololol@lemmy.world 4 points 23 hours ago (1 children)

Is that the case for big ground and space telescopes too? I can imagine this could cause wobbling.

Btw, is that also how infrared and X-ray telescopes work?

[–] lefty7283@lemmy.world 6 points 21 hours ago

It sure is! The monochrome sensors are also great for narrowband imaging, where the filters let through one specific wavelength of light (like hydrogen-alpha), which lets you do false-color imaging.

IR is basically the same. Here’s the page on JWST’s filters. No clue about X-ray scopes, but IIRC they don’t use any kind of traditional CMOS or CCD sensor.
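
For the false-color part, the usual move is just to assign each narrowband filter's frame to a colour channel, e.g. the common SII to red, H-alpha to green, OIII to blue mapping. A bare-bones sketch (real processing would also stretch each channel, which is left out here):

```python
import numpy as np

def hubble_palette(sii, h_alpha, oiii):
    """False-colour composite from narrowband frames: SII -> red,
    H-alpha -> green, OIII -> blue. Only normalises each channel."""
    channels = []
    for frame in (sii, h_alpha, oiii):
        f = frame.astype(float)
        span = f.max() - f.min()
        channels.append((f - f.min()) / span if span else np.zeros_like(f))
    return np.stack(channels, axis=-1)

# Toy frames in place of real narrowband exposures.
print(hubble_palette(*[np.random.rand(64, 64) for _ in range(3)]).shape)  # (64, 64, 3)
```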

[–] worhui@lemmy.world 2 points 23 hours ago

3-chip CMOS sensors are roughly 20-25 years out of date; mosaic-pattern sensors have eclipsed them on most imaging metrics.