This is exactly how CMOS/CCD image sensors (used in digital cameras) work [0]. Each "sub-pixel" is just a photocell that detects light intensity. A Bayer filter on top of the sensor array filters the incoming light so that different "sub-pixels" detect different colors (light intensity in different parts of the visible spectrum) [1].
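As a rough sketch of that idea (a toy simulation, not real sensor code; the RGGB layout shown is just one common Bayer arrangement): each site in the mosaic records only the channel its color filter passes.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color image through an RGGB Bayer pattern.

    Each output value is the intensity one sub-pixel would record
    behind its color filter; a real camera then demosaics
    (interpolates) the two missing channels at every site.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red-filtered sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green-filtered sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green-filtered sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue-filtered sites
    return mosaic
```

Removing the filter, as described below, amounts to letting every site record the full intensity, which is why the monochrome sensor gains resolution and sensitivity.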
Enthusiasts sometimes remove the filter to get higher optical resolution or better-contrast black-and-white images. You can even buy consumer cameras with this feature off the shelf [2].
NASA is doing the equivalent of Instagram filters, making the photos more appealing to the general public. NASA says (via Space.com [3]):
"Creating color images out of the original black-and-white exposures is equal parts art and science," NASA said.
"For example, Hubble photographed the Cat's Eye Nebula through three narrow wavelengths of red light that correspond to radiation from hydrogen atoms, oxygen atoms, and nitrogen ions (nitrogen atoms with one electron removed). In that case, they assigned red, blue and green colors to the filters and combined them to highlight the subtle differences. In real life, those wavelengths of light would be hard to distinguish for humans."
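The compositing NASA describes can be sketched as stacking three narrowband monochrome exposures into the R, G, B planes of one image (a toy illustration; the argument names and the hydrogen/oxygen/nitrogen channel ordering here are hypothetical — the quote notes Hubble teams pick the assignment per image):

```python
import numpy as np

def representative_color(hydrogen, oxygen, nitrogen):
    """Map three narrowband monochrome exposures onto RGB channels.

    The assignment is a presentation choice, not physical color:
    each channel shows relative emission in one spectral line.
    """
    stack = np.dstack([hydrogen, oxygen, nitrogen]).astype(float)
    # Normalize so the brightest sub-pixel maps to full intensity.
    peak = stack.max()
    return stack / peak if peak > 0 else stack
```

The point of the exercise is exactly what the quote says: the output highlights differences between emission lines that the eye could not separate, at the cost of no longer looking like what the eye would see.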
I'm not sure exactly what you mean by "ignore". The colors, though somewhat arbitrarily chosen, do convey physical information. It may not be a "what your eye might see" image, but it does tell the viewer about the relative emission from hydrogen, oxygen, and singly-ionized nitrogen (for the Cat's Eye Nebula). So, the coloring isn't something that could be ignored.
One way to describe those types of images is "representative color".
"Cat's Eye Nebula ... In real life, those wavelengths of light would be hard to distinguish for humans."
The context of my comment was how the photo compares to the naked eye.
I used to take IR photos for fun (a consumer camera with the IR-cut filter replaced by a visible-light-blocking filter), and used thermal and night-vision technology at my job. They all relay useful information, but are not photorealistic in any way.
An IR photo of Earth, for example, is useful but not photorealistic.
> The context of my comment was how the photo compares to the naked eye.
I agree that is true for your example of the Cat's Eye Nebula. I guess my response was to my (and presumably others') misinterpretation that "ignore the colors" was aimed at color astronomy images in general.
In other words, ignore the colors.
[0] https://en.wikipedia.org/wiki/Image_sensor
[1] https://en.wikipedia.org/wiki/Bayer_filter
[2] https://en.wikipedia.org/wiki/Leica_M_Monochrom
[3] http://www.space.com/8059-truth-photos-hubble-space-telescop...