We started developing a stereo camera setup so that we could capture two images simultaneously -- one visible, one infrared (for more on infrared imaging, see our near-infrared camera tool). The visible imagery can be stitched normally, or overlaid with the infrared to create NDVI (Normalized Difference Vegetation Index) imagery.
NDVI can be used to identify not only the presence but also the health of vegetation. Since the 1980s, NDVI has become one of the most important tools for mapping and monitoring vegetation.
NDVI = (NIR-RED) / (NIR+RED)
where RED and NIR are the spectral reflectance measurements acquired in the red and near-infrared regions, respectively.
Only two bands are required: a red channel (0.62–0.67 μm) and a near-infrared channel (0.84–0.87 μm). On the MODIS satellite these are Bands 1 and 2, respectively.
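As a concrete illustration, the formula above can be computed directly on co-registered image arrays. This is a minimal sketch in Python/NumPy, assuming you already have the red and NIR channels as arrays of reflectance values (the `ndvi` function name is ours):

```python
import numpy as np

def ndvi(nir, red):
    """Compute NDVI = (NIR - RED) / (NIR + RED) pixel by pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Where both bands are zero the index is undefined; mark those pixels NaN.
    denom[denom == 0] = np.nan
    return (nir - red) / denom

# Tiny 2x2 example reflectances standing in for real image channels.
nir = np.array([[0.5, 0.4], [0.3, 0.1]])
red = np.array([[0.1, 0.1], [0.3, 0.3]])
result = ndvi(nir, red)
```

Dense, healthy vegetation reflects strongly in NIR and absorbs red, so its NDVI approaches +1, while soil, water, and pavement fall near or below zero.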
Another use of a stereo camera rig would be stereoscopy -- a technique that uses two slightly offset images (from two cameras spaced as far apart as human eyes, for instance) to recreate the illusion of depth in a digital image.
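For example, the two offset views can be fused into a red-cyan anaglyph, a classic way to recreate that depth illusion for viewing with colored glasses. A minimal sketch, assuming both views are same-sized NumPy RGB arrays (the `anaglyph` helper is hypothetical):

```python
import numpy as np

def anaglyph(left, right):
    """Build a red-cyan anaglyph: the red channel comes from the left-eye
    image and the green/blue channels from the right-eye image, so each
    eye of red-cyan glasses sees only its own view."""
    out = right.copy()
    out[..., 0] = left[..., 0]  # replace red channel with the left view's
    return out

# Two tiny 2x2 RGB "images" standing in for the offset photos.
left = np.zeros((2, 2, 3), dtype=np.uint8)
left[..., 0] = 200   # left view is strongly red
right = np.zeros((2, 2, 3), dtype=np.uint8)
right[..., 1] = 150  # right view is strongly green
combined = anaglyph(left, right)
```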
How we're doing it:
- the cameras are controlled via the USB remote trigger feature of the Canon Hack Development Kit (CHDK)
- an Arduino is used to simultaneously send a 5-volt pulse to each camera.
Another way to achieve this is to simply attach two cameras to the same rig -- the above photo shows this with a pair of stabilizing chopsticks (pencils will also work) -- and set them to trigger every second or so in continuous mode, with rubber bands holding down their shutter buttons. You're not guaranteed simultaneous images, but there is usually plenty of overlap between your IR and VIS photos.
Pat Coyle's double Ocean Spray juice bottle rig is another solution for a better-protected double camera setup; just bolt two flat-sided bottles together (click to watch video):
Live video version
Above, a version using USB webcams and a Processing application. This allows the stereo camera to be used for NDVI and NRG imaging (as described on the near-infrared camera page) in real time. This is neat, and it shortens the feedback loop: you can simply put things in front of the camera and look at them, instead of taking two pictures, aligning them, and compositing them later, which can be arduous.
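The NRG compositing step itself is simple channel shuffling: the NIR image becomes the red channel, the red image becomes green, and the green image becomes blue, so vegetation glows red in the result. A sketch under those assumptions (the `nrg_composite` helper is ours):

```python
import numpy as np

def nrg_composite(nir, red, green):
    """Stack aligned single-band images into an NRG false-color image:
    NIR is displayed as red, red as green, and green as blue."""
    return np.dstack([nir, red, green])

# Tiny 2x2 stand-ins for aligned single-band captures.
nir = np.full((2, 2), 220, dtype=np.uint8)    # vegetation is NIR-bright
red = np.full((2, 2), 60, dtype=np.uint8)
green = np.full((2, 2), 80, dtype=np.uint8)
img = nrg_composite(nir, red, green)
```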
What about a version running on an Android phone or tablet, for portability? We would need a second camera, or a way to split the image from one camera and account for exposure differences.
It's apparently possible to attach external webcams to some Android phones, like the Nexus One. Read more here: http://sven.killig.de/android/N1/2.2/usb_host/ The following are required:
- A USB hub with upstream power (such as the ~$16 TM-UH710), or a Y-cable to "inject" power. Read about cable options here
This is just speculation, but it would be very interesting to design an Android peripheral that could do this. Another option might be something using a fast board like the Maple, which can handle some video.