Unstung
I currently don't have a job other than studying, doing homework, and procrastinating, but I've done this over the summer.
Posting this here feels appropriate:
The [Mars] image was processed with custom software, like all Mars rover images, but not with Photoshop. I am familiar with the process used to calibrate the MER (Spirit and Opportunity) Pancams, which shares some similarities with what is done to Curiosity data.
Good images are selected in custom-made software: frames with data dropouts (parts of the image that have not yet been fully downlinked), overexposure, and similar problems are rejected. The selected images are then calibrated. Since the MER Pancams use a color filter wheel, multiple images of a target are taken at different wavelengths. The calibrator ensures the images are aligned and samples the colors on the calibration target (the sundial on Spirit, Opportunity, and Curiosity). From the calibration data, the software produces a color or false-color image (depending on which filters are used; IR/UV filters are often chosen to bring out the geological composition of the rocks). The results are then briefly reviewed to confirm their quality.
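To make the selection and calibration steps concrete, here is a simplified sketch in Python/NumPy. This is not the actual Pancam calibration software; the function names, the dropout check, and the single per-filter gain are all illustrative assumptions. Real radiometric calibration involves dark current, flat fields, and per-filter response curves, but the basic idea of rejecting bad frames and scaling each filter image against known calibration-target values looks roughly like this:

```python
import numpy as np

def is_usable(img, dropout_value=0, max_fraction_saturated=0.01):
    """Reject frames with data dropouts (regions not yet downlinked)
    or too many saturated (overexposed) pixels. Thresholds are made up."""
    if np.any(img == dropout_value):  # dropout pixels left at the fill value
        return False
    saturated = np.mean(img >= np.iinfo(img.dtype).max)
    return saturated <= max_fraction_saturated

def calibrate_and_compose(red, green, blue, cal_measured, cal_truth):
    """Scale each filter image so the measured calibration-target patch
    matches its known reflectance, then stack into an RGB composite."""
    channels = []
    for img, measured, truth in zip((red, green, blue), cal_measured, cal_truth):
        gain = truth / measured  # single per-filter radiometric gain (simplified)
        channels.append(np.clip(np.asarray(img, dtype=float) * gain, 0.0, 1.0))
    return np.dstack(channels)   # H x W x 3 color image
```

In the real pipeline the "measured" values would come from pixels sampled on the sundial target in each filter image, and the alignment step would happen before this composite is built.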
Curiosity has color cameras, so using three filters to make an RGB image isn't necessary, though its images may be processed in a similar way. Unlike what is seen from Curiosity, official MER panoramas include the entire frames and do not merge any parts together, so no data is removed. The seams between the frames remain visible, whereas in a Curiosity panorama they are blended together.
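The seam difference described above can be sketched with a toy example. This is not how either mission's mosaicking software actually works; it just contrasts placing whole frames side by side (seams stay visible, no pixels discarded) with feathering a linear blend across an overlap (seam disappears, but the overlapping pixels are merged):

```python
import numpy as np

def mosaic_with_seams(tiles):
    """MER-style (simplified): butt whole frames together. Any brightness
    difference between frames shows up as a visible seam, but every
    original pixel is kept unmodified."""
    return np.hstack(tiles)

def mosaic_blended(left, right, overlap):
    """Curiosity-style (simplified): feather a linear cross-fade over an
    overlapping strip so the seam is invisible; overlap pixels are merged."""
    alpha = np.linspace(0.0, 1.0, overlap)  # blend weight across the overlap
    blend = left[:, -overlap:] * (1 - alpha) + right[:, :overlap] * alpha
    return np.hstack([left[:, :-overlap], blend, right[:, overlap:]])
```

With two flat tiles of different brightness, the first function produces an abrupt step at the join, while the second ramps smoothly across the overlap.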