Glad Juno has made it and appears to be working OK. Not the most amazing image, better ones are routinely taken by amateurs from earth, but the crescent can only be seen by going there as Jupiter lies outside our orbit. We'll see much closer views from Juno when it's moved to a closer orbit in October.
This isn't a photo. It's a Photoshop fiction: 400 different photos blended together. There's a big difference between that and a single frame from a spacecraft. Comparing the two seems silly.
Hardly fictional, it's a pretty accurate representation of what Jupiter actually looks like. The main reason for using this technique is that our turbulent atmosphere blurs any single image. Taking a video allows only a small percentage of the clearest frames to be combined. A spacecraft doesn't have this problem. The technique is called lucky imaging: https://en.wikipedia.org/wiki/Lucky_imaging
Stacking improves signal-to-noise ratio but does not introduce details that aren't present. A stacked image can be more accurate than a single frame as camera artifacts are reduced.
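To make the signal-to-noise point concrete, here's a toy numpy sketch (entirely made-up numbers, not real image data) showing that averaging N noisy frames of the same scene cuts random noise by roughly sqrt(N) without adding any detail that wasn't there:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" scene: a smooth gradient standing in for a planet.
true_scene = np.linspace(0.0, 1.0, 256)

# Simulate 400 noisy frames of the same scene (sensor noise sigma = 0.2).
n_frames = 400
frames = true_scene + rng.normal(0.0, 0.2, size=(n_frames, true_scene.size))

single_frame_error = np.std(frames[0] - true_scene)

# Stack by averaging: random noise partially cancels, the scene does not.
stacked = frames.mean(axis=0)
stacked_error = np.std(stacked - true_scene)

# With 400 frames, noise drops by about sqrt(400) = 20x.
print(single_frame_error, stacked_error)
```

Same idea as "averaging to get the details to the forefront without the oddities", just in numbers.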
Really well said! Also, before reading your explanation, I was a little hazy on how the image stacking thing worked. Now I understand a lot better. It's like averaging to get all of the details to the forefront without the oddities!
Glad you found it useful. If you haven't come across him before, this guy takes some of the best planetary images: http://www.damianpeach.com/index.htm
Of course it's a photo. All photography involves varying degrees of processing and subjectivity with respect to the information that is presented or discarded. Stacking may be somewhat less intuitive than a single frame, but conceptually it's no different from a long exposure - it's just selecting the moments of best seeing (minimal distortion from the atmosphere) to include in that long exposure. 400 frames of video at 60 fps is roughly analogous to a 6.7 second exposure in terms of signal vs noise.
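The back-of-envelope arithmetic there, for anyone who wants to check it:

```python
# Total integration time of a stack: frames kept divided by frame rate.
frames_stacked = 400
fps = 60
equivalent_exposure_s = frames_stacked / fps
print(equivalent_exposure_s)
```

Roughly 6.7 seconds of accumulated light, as the parent says.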
Your typical single-frame exposure involves a great number of processing operations and filtering that may be unknown to most people, but still define the "look" of the end product (which is nowhere close to a 1:1 representation of the light entering the lens).
To complement what others have said, stacking photos is mathematically the same thing as using a long exposure time and blocking the iris whenever the image would have been fuzzy, so it doesn't seem any less of a "photo" than anything else. You are still integrating across the stream of photons you want while discarding ones you don't.
It's not a fiction, nor is it "photoshop", it's actual data.
The basic problem of all astronomy from Earth's surface is that the atmosphere impedes seeing. So much so that even small backyard telescopes rapidly reach the limit of atmospheric seeing before hitting the limits of diffraction. There are several ways around this. One is to leave the atmosphere entirely, resulting in the stunning capabilities of space based observatories like Hubble. Another is adaptive optics, which relies on various techniques to measure the atmospheric disturbance degrading the seeing and precisely counteract it with a reactive optical element.
The technique in question is basically a poor man's adaptive optics, related to speckle imaging (another technique employed for the same purpose). For bright objects like planets, an exposure on a modern CCD needs to be only a fraction of a second, which provides the opportunity to collect many exposures over a period of time; that's something many cameras already do quite well by recording video. Instead of actively counteracting poor atmospheric seeing, you hope to get lucky and capture moments where atmospheric distortion is at a minimum. Such moments are rare and fleeting, but over the course of hundreds or thousands of frames from a video stream, there will be a few. The trick is to find the best moments of seeing for each portion of the image across the frames and combine them together into a single image. You then have essentially a "dream team" of atmospheric seeing conditions for every part of the image. Every part is real, and the overall image is a true representation, not a fantasy.
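In its simplest whole-frame form, the selection step can be sketched in a few lines of numpy. This is a toy illustration with a made-up sharpness metric (gradient energy); real stacking software like AutoStakkert or RegiStax uses more sophisticated quality metrics and aligns/selects per image region rather than whole frames:

```python
import numpy as np

def sharpness(frame):
    # Crude focus metric: energy of the image gradient.
    # Frames blurred by bad seeing score lower.
    gy, gx = np.gradient(frame.astype(float))
    return float(np.mean(gx**2 + gy**2))

def lucky_stack(frames, keep_fraction=0.05):
    # Keep only the sharpest few percent of frames, then average them.
    # Nothing is painted in: the result is a mean of real exposures.
    scores = np.array([sharpness(f) for f in frames])
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-n_keep:]
    return np.mean([frames[i] for i in best], axis=0)
```

The per-region "dream team" version does the same thing on small tiles of the image instead of whole frames.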
However, your point about the futility of comparing the two is accurate. Comparing a single image from Juno to the best thing a backyard astronomer has ever produced is not a good comparison, especially since Juno will collect a great many more pictures and is not at all optimized for taking pictures at its current distance from Jupiter. The best pictures of Jupiter from Juno will outclass anything we've taken with any instrument from any observatory or spacecraft so far, but it'll be a while yet before we have those.
If I'm not mistaken, they use a similar technique to make satellite images of Earth that don't have cloud cover, or maps without cars and such, even though these things were very likely present when the pictures were taken. So it's not fiction.
Stacking allows errors caused by our atmosphere to be cancelled out. It's an attempt to get closer to what a spacecraft, with no atmosphere to contend with, would see.
I came across this on Reddit the other day, taken with a smartphone and 8" telescope: https://www.reddit.com/r/astrophotography/comments/4kegyv/sm...