Earlier this year, we learned that the Mars rover Opportunity had officially ended its service. In its time on our neighboring planet it traveled more than a marathon distance, far surpassing its original mission goal. Along the way, it sent back some amazing images of Mars, which provided useful scientific data as well as the chance to marvel at the majesty of the planet's terrain.
Its final image is a massive panorama that took 29 days to shoot. It shows Perseverance Valley, where the rover now sits. The panorama combines data from 354 individual images stitched together with software.
If you look carefully at the photo, you will notice that a small patch at the bottom left is still in black and white while the rest is in color. This is not an artistic choice, but a technical detail with a surprisingly sad explanation. Opportunity shot the last pictures needed to fill in that section in color, but never got the chance to transmit them.
The charge-coupled device (CCD) camera sensors in the rover's panoramic camera (called Pancam) only take black-and-white images. The pair of cameras, mounted nearly one foot apart, helped the rover calculate the distance to objects in its field of view so it could plan its travel and position its robotic arm precisely.
According to Jim Bell, Pancam Payload Element Lead and Arizona State University professor, the color comes from a wheel of filters rotating in front of each camera lens. When a filter is in place, the sensor takes a picture limited to specific wavelengths. There are eight filters on each wheel, but one is specifically intended for photographing the sun, so it severely cuts the amount of light coming in.
So why is part of Opportunity's last panorama in black and white? While the rover shot the pictures necessary to provide the color information, it never had the bandwidth to send them back to Earth before the dust storm arrived that ultimately ended its mission. The color version of the image combines frames shot through three of the filters, centered around the following wavelengths: 753 nanometers (near-infrared), 535 nanometers (green), and 432 nanometers (blue). The result is similar to what you would get from a standard digital camera.
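The compositing step described above can be sketched in a few lines of NumPy. This is not mission code, just a minimal illustration of the idea: each filtered exposure is a grayscale array, and stacking three of them yields one RGB image. The array values and the use of the 753 nm near-infrared frame as the red channel are illustrative assumptions.

```python
import numpy as np

def composite_rgb(near_ir, green, blue):
    """Stack three single-filter grayscale frames into one RGB image.

    near_ir, green, blue: 2-D float arrays of equal shape, values in [0, 1],
    standing in for exposures through the 753 nm, 535 nm, and 432 nm
    filters. The near-infrared frame serves as the red channel.
    """
    return np.stack([near_ir, green, blue], axis=-1)

# Tiny synthetic 2x2 "frames" standing in for real downlinked images.
r = np.array([[0.8, 0.6], [0.4, 0.2]])
g = np.array([[0.5, 0.5], [0.5, 0.5]])
b = np.array([[0.1, 0.2], [0.3, 0.4]])

rgb = composite_rgb(r, g, b)
print(rgb.shape)  # (2, 2, 3): height x width x color channels
```

Because each channel comes from a separate exposure, a missing downlink leaves a hole in exactly one channel, which is why the unsent frames left part of the panorama gray.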
Unfortunately, the final frames needed to fill in the last pieces of color in the panorama never made it back to Earth.
This method of capturing images sounds complicated, but the process is in fact extremely similar to how almost all modern digital cameras work. For example, every pixel on the sensor inside your smartphone camera sits behind a filter that is either red, green, or blue. These filters are arranged in a layout commonly referred to as the Bayer pattern. When you snap a photo, the camera knows how much light each pixel received and which color filter it passed through, and it uses that information to "demosaic" the image and give it its colors. Instead of using filtered pixels on a single sensor, Pancam takes several images representing different wavelengths of light and later combines them in a similar way.
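To make the analogy concrete, here is a deliberately crude demosaicing sketch. It assumes an RGGB Bayer layout and collapses each 2x2 cell into a single RGB pixel by averaging the two green samples; real camera pipelines interpolate to full resolution instead, but the principle of reconstructing color from single-color samples is the same.

```python
import numpy as np

def demosaic_rggb(raw):
    """Crude demosaic of an RGGB Bayer mosaic.

    Each 2x2 cell (R G / G B) becomes one RGB output pixel,
    averaging the cell's two green samples. raw must have even
    dimensions, with values in [0, 1].
    """
    r = raw[0::2, 0::2]                          # top-left of each cell
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # average the two greens
    b = raw[1::2, 1::2]                          # bottom-right of each cell
    return np.stack([r, g, b], axis=-1)

# One Bayer cell: R=0.9, greens 0.4 and 0.6, B=0.1
raw = np.array([[0.9, 0.4],
                [0.6, 0.1]])

pixel = demosaic_rggb(raw)
print(pixel)  # [[[0.9 0.5 0.1]]]
```

Where a phone sensor interleaves the color samples in space and recombines them per shot, Pancam interleaves them in time, one whole frame per filter.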
Unlike your digital camera, Pancam also captures wavelengths beyond what your eye can record. According to Bell, the cameras reach further into both the red and blue ends of the spectrum, gaining access to infrared and ultraviolet light beyond human vision.
While the picture is amazing and a bit sad