This answer was developed for Boson; however, the content applies generally to any thermal camera. Latency of the Boson signal pipeline is defined as the time difference between when the signal level of a given pixel is read from the sensor and when that signal is available as output from the camera. In other words, it is the amount of time for “raw data in” to be fully processed to “data out” at the selected video channel.
The value varies depending upon where in the signal chain the output is tapped, as follows:
- Pre-AGC: ~108% of the frame time
- Post-AGC: ~113% of the frame time
- Post-zoom: ~222% of the frame time
For all three tap points, the output channel utilizes a multi-frame buffer. This buffer introduces a frame of latency. For the post-zoom tap-point, the zoom operation itself also utilizes a multi-frame buffer, introducing a second frame of latency. The remaining fractions of a frame-time in the latency values provided above are processing time required by the various blocks in the signal pipeline.
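To make these percentages concrete, the following sketch converts each tap point's latency (expressed as a fraction of the frame time) into milliseconds. The frame rates shown are illustrative example values, not a statement of which modes a particular camera supports.

```python
# Convert tap-point latency (fraction of one frame time) to milliseconds.
TAP_POINT_LATENCY = {
    "pre-AGC": 1.08,
    "post-AGC": 1.13,
    "post-zoom": 2.22,
}

def latency_ms(frame_rate_hz: float) -> dict:
    """Return the approximate latency in milliseconds at each tap point."""
    frame_time_ms = 1000.0 / frame_rate_hz
    return {tap: frac * frame_time_ms for tap, frac in TAP_POINT_LATENCY.items()}

# At 60 Hz (frame time ~16.7 ms): pre-AGC ~18 ms, post-zoom ~37 ms.
print(latency_ms(60.0))
```

At a lower frame rate the same fractional latencies correspond to proportionally longer absolute delays, since both the buffered frames and the processing time scale with the frame period.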
Like all thermal detectors, Boson’s sensor assembly has a characteristic thermal time constant. By convention, the time constant is not included in the latency definition.
For a photon detector, such as a typical visible CMOS sensor, incoming photons are converted to electrons with near-instantaneous response time; however, the sensor only collects information from the scene during an integration period, which is often a small fraction of the frame time. In other words, high-speed phenomena (such as a strobed signal) can be missed entirely if the resulting photons are incident at a point in time when the detector is not integrating. On the other hand, a thermal detector such as Boson’s is always changing temperature in response to incident radiation. That is to say, it is always “on” regardless of whether or not it is being actively sensed. The ability to detect high-speed phenomena is instead a function of the detector’s characteristic thermal time constant, which governs the rate of temperature change. For Boson, the detector time constant is on the order of 8 msec, meaning that an instantaneous change in irradiance will result in a temperature change of the detector as shown in the figure below.
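The detector's step response can be modeled as a first-order exponential. A minimal sketch, assuming an 8 msec time constant and a 60 Hz frame rate (both values from the text above):

```python
import math

TAU_MS = 8.0  # approximate Boson detector thermal time constant, per the text

def step_response(t_ms: float, tau_ms: float = TAU_MS) -> float:
    """Fraction of a step irradiance change reflected in detector
    temperature after t_ms, assuming a first-order exponential response."""
    return 1.0 - math.exp(-t_ms / tau_ms)

frame_time_ms = 1000.0 / 60.0  # one frame period at 60 Hz
print(round(100 * step_response(frame_time_ms)))  # prints 88 (percent)
```

This reproduces the ~88% figure discussed below: after one full 60 Hz frame period (~16.7 msec, roughly two time constants), the detector temperature has reached about 88% of its new steady-state value.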
[Figure: detector temperature response to an instantaneous change in irradiance (thermal time constant)]
From a latency standpoint, if the irradiance upon a given pixel changes just before that detector is read out, the detector temperature will not have changed appreciably, meaning the effect of the irradiance change will not register on that frame period. By the next readout event, 1/60th of a second later, the detector temperature will have changed to ~88% of its new value, meaning the effect of the irradiance change will be almost completely sensed. For the more general case in which an irradiance change is not perfectly synchronized with readout timing, the signal at the next readout event will be somewhere between 0% and 88% of its steady-state value. This uncertainty makes it difficult to precisely define the “latency” introduced by the detector time constant. The delay between an irradiance change and a detectable signal at the output of the detector is a function of numerous factors including signal strength, detector noise, and relative timing. It is feasible to derive a statistical model of “average latency”, but generally speaking, values will range between approximately 0 and 1 frame period.
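As an illustration of such a statistical model, the sketch below estimates the average fraction of a step irradiance change sensed at the next readout, assuming a first-order response and an irradiance change arriving at a uniformly random point within the frame period. This is a hypothetical model for illustration, not a published figure.

```python
import math
import random

TAU_MS = 8.0            # detector thermal time constant, per the text
FRAME_MS = 1000.0 / 60.0  # frame period at 60 Hz

def mean_sensed_fraction(n_trials: int = 100_000, seed: int = 1) -> float:
    """Monte Carlo estimate of the average fraction of a step irradiance
    change that is sensed at the next readout event, with the change
    arriving at a uniformly random instant within the frame period."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        # Time elapsed between the irradiance change and the next readout.
        elapsed = rng.uniform(0.0, FRAME_MS)
        total += 1.0 - math.exp(-elapsed / TAU_MS)
    return total / n_trials

print(mean_sensed_fraction())  # ~0.58 on average for these parameters
```

Under these assumptions the average sensed fraction is roughly 58%, consistent with individual readouts ranging anywhere between 0% and ~88% depending on relative timing.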