This is an extension of the research started by Mathew Lippincott to create a low-cost gas-sensitive camera.
The high-sensitivity semiconductor sensors are all very expensive, and scanning a scene in two axes would require a significant scanning apparatus.
My investigation has been looking at what we might be able to accomplish with significantly less sensitivity. Can we slow down the time resolution or sacrifice the ability to detect/document all but the largest leaks in order to make something affordable?
Specifically, I've been looking at the use of thermopiles, all the way down near the bottom of the graph that Mathew found. A company called Heimann makes a 32x32 pixel thermopile sensor for $70, including the lens, amplifier, digital readout, etc. I am currently in the process of ordering one of these sensors through their Boston-area distributor, Boselec.
Another Boston-area company, Precision Micro-Optics, distributes a 3.4um narrow bandpass filter at the affordable price of $62. Combining these optical components with a Raspberry Pi, a display, and a visible camera to increase the perceived resolution of the IR sensor could put the camera in the $250 range.
But can we even detect anything with that low a sensitivity?
32x32 pixel sensor datasheet: HTPA_32x32d_L2.1_0.8_-Hi-S_Rev2_Datasheet.pdf
Thinking about noise
I'm not an expert, or even particularly knowledgeable, in this field, so the few methods of analyzing feasibility I've tried have given me several different answers.
Noise floor of the sensor
The noise equivalent temperature difference (NETD) of the sensor is 150 mK/√Hz. Order of magnitude, a change in temperature of 150 mK at human temperature scales results in a change of emitted power of about 1 W/m². With a lens with a FOV of 33 degrees (the L5 Ge lens), this means that the noise floor for each pixel, at 1 Hz, is about 1 nW of incident power.
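A quick sanity check on that per-pixel figure, using the derivative of the Stefan-Boltzmann law. The pixel pitch here is an assumption (the datasheet has the real value), and I'm ignoring lens transmission, fill factor, and solid-angle details, so this is only an order-of-magnitude sketch:

```python
# Order-of-magnitude check of the per-pixel noise floor via Stefan-Boltzmann.
# Pixel pitch is assumed, and optics losses are ignored.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T = 300.0         # scene temperature, K (roughly ambient/human)
NETD = 0.150      # sensor noise floor, K/sqrt(Hz)

# Change in total emitted power per unit temperature change:
dP_dT = 4 * SIGMA * T**3        # ~6.1 W/m^2 per K
dP = dP_dT * NETD               # ~0.9 W/m^2 for a 150 mK change

# Assumed ~90 um pixel pitch (hypothetical; check the datasheet).
pixel_area = (90e-6) ** 2       # m^2
noise_power = dP * pixel_area   # W of equivalent power per pixel

print(f"dP/dT = {dP_dT:.2f} W/m^2/K, dP = {dP:.2f} W/m^2")
print(f"noise ~ {noise_power * 1e9:.1f} nW per pixel")
```

This lands at a few nW per pixel, the same order of magnitude as the 1 nW above; the difference is plausibly absorbed by the optics factors I've ignored.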
How much does methane absorb?
Let's assume that this is a reasonable amount of methane to try to measure (an unverified assumption! It seems high, but maybe in a real-world environment we'll be looking through more methane (10 m instead of 5 cm) at a lower concentration. I have scrawled in some notes that 200 ppm methane = 0.245 g/m³, but I may have screwed up somewhere.) This means that, order of magnitude, we need 10 nW of power incident on each pixel in the array in order to see variation caused by methane: if the methane absorbs on the order of 10% of the in-band light, 10 nW of incident power gives a 1 nW change, right at the noise floor. (We'd pick that out either through variation over time, through subtraction of two sensors, or by chopping the filter.)
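Since I flagged that scrawled 0.245 g/m³ figure as suspect, here's an ideal-gas sanity check (temperature and pressure assumed to be room conditions):

```python
# Ideal-gas check of the scrawled "200 ppm methane = 0.245 g/m^3" figure.
# Room temperature and pressure are assumed.
R = 8.314         # gas constant, J/(mol K)
P = 101325.0      # pressure, Pa
T = 293.0         # assumed room temperature, K
M_CH4 = 16.04e-3  # molar mass of methane, kg/mol

ppm = 200e-6
# Partial pressure of methane, then mass density from the ideal gas law:
rho = ppm * P * M_CH4 / (R * T)   # kg/m^3
print(f"200 ppm methane ~ {rho * 1e3:.3f} g/m^3")
```

This gives roughly 0.13 g/m³, about half my scrawled number, so it does look like I screwed up somewhere in those notes.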
Where does that power come from?
In the IPython notebook, I tried to evaluate what the baseline power is from the scattered illumination of the blue sky. (Assuming we're looking at outdoor leaks against a clear sky.) I found a received power of 0.0056 W/m² from the entire blue sky. This doesn't look good: from it I calculated a received power of only 0.015 nW per pixel.
But maybe it would work against a cloudy sky? Maybe it would work against buildings or other thermal radiators? More to investigate.
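To put the clear-sky number in perspective, here's the shortfall against the ~10 nW target from the absorption estimate above (both inputs are rough estimates, so treat the ratio as indicative only):

```python
# How far short of the ~10 nW target is the clear-sky baseline?
# Both figures are rough estimates from earlier in this note.
needed_nw = 10.0       # nW per pixel, from the methane absorption estimate
clear_sky_nw = 0.015   # nW per pixel, from the blue-sky calculation
shortfall = needed_nw / clear_sky_nw
print(f"clear sky falls short by a factor of ~{shortfall:.0f}")
```

A factor of several hundred is a lot to make up, but a ~300 K thermal background (clouds, building walls) emits far more in-band radiance than scattered blue sky, which is why those backgrounds are worth investigating.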
Comparison with existing gas cameras
FLIR's gas camera has a NETD of 15 mK/√Hz, about an order of magnitude better than the sensor I'm evaluating. A factor of 10 doesn't sound so bad! But I think it's worse than that -- the NETD of the thermopile array assumes it's receiving a whole bunch of radiation in the 5-12um range, whereas the sensor in the FLIR camera can only receive in the 2-5um range. (And maybe that figure is even reported with the gas-sensitive IR filter taken into account.) If this is the case, then we lose another 1.5 orders of magnitude, and the cheap sensor is maybe 0.3% as sensitive. Guessing from FLIR's reported sensitivity, maybe this means that leaks in the 200 g/hr range could be detectable? I believe this is around what a typical stove burns, so this seems like a reasonable range to try testing and measuring.
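Spelling out that "0.3% as sensitive" arithmetic (the 1.5-order-of-magnitude bandwidth penalty is my guess, not a datasheet figure):

```python
# The "0.3% as sensitive" estimate: 10x worse NETD combined with an
# assumed 1.5-order-of-magnitude penalty for the wider NETD band.
flir_netd = 15.0          # mK/sqrt(Hz)
thermopile_netd = 150.0   # mK/sqrt(Hz)
bandwidth_penalty = 10 ** 1.5   # ~32x, assumed

relative_sensitivity = (flir_netd / thermopile_netd) / bandwidth_penalty
print(f"cheap sensor is ~{relative_sensitivity:.1%} as sensitive")
```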
I'm a bit overwhelmed by assumptions and uncertainty over figures in datasheets, so my plan is to order one of the sensors, start measuring things with it, and evaluate empirically whether any of this looks feasible. Worst case scenario, I'll have made an expensive DIY thermal camera.