Real-time visual-to-infrared image generation

ABSTRACT
IR sensor simulation has become an increasingly important component of simulation and training for defense, security, transportation, and law enforcement agencies. The steady improvement in real-time graphics has led to the development of simulators capable of displaying imagery over very large geospecific training areas in both visual and IR sensor channels. Unlike out-the-window simulation, which can use visual imagery directly, high-quality IR sensor simulation must be physics-based and must rely on knowledge of the material properties of the simulation area.

High-resolution material classification maps of large global areas are generally not available. To create material maps of geospecific imagery, the state-of-the-art process typically feeds visual-spectrum imagery to a material classifier, producing a raster of material codes. These material codes can then be used to drive a physics-based IR model. The three major problems with this offline material classification process are:
• A raster of material codes cannot be mip-mapped at run-time, leading to undesirable texture sampling artifacts.
• The classifier is forced into making a per-pixel choice from a discrete set of materials, creating hard edges and pixelation in the resulting sensor image.
• The image generator (IG) has to store both a visual and a sensor representation of the same area. If the physics processing is performed offline, several versions may be required for different times of day. For very large simulation scenes, this is a major burden.

The real-time classification and visual-to-IR transformation technology developed by Technology Service Corporation (TSC) and MetaVR addresses these problems. Our process uses pixel-shader technology to convert the per-pixel filtered visual-spectrum RGB color into its component materials, which are then used by the physics-based IR model to compute IR radiance and IR sensor display intensity. The physics-based model takes full account of the local environment, time of day, and sensor characteristics. For a given set of environmental conditions, a 3D lookup table is pre-computed, allowing any possible RGB color to be remapped to material properties; this remapping is performed per-pixel on the GPU (a sketch of the lookup follows the list below). The material properties are then used in conjunction with pre-computed real-time parameters and time-of-day effects to produce a physics-based radiance result. The result is a physically accurate sensor scene derived from a visual-spectrum database. The key advantages of the new approach include:
• The filtered RGB value is used to produce physical material properties, avoiding the problem of texture aliasing.
• Because RGB colors represent a blend of the discrete materials in the classification palette, the resulting physical properties are also a blend. This avoids the problem of hard edges and pixelation.
• No additional sensor database needs to be stored by the IG. The process is driven entirely by the existing visual-spectrum "out-the-window" database.
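To make the per-pixel remapping concrete, the following minimal sketch (in Python with NumPy, standing in for the actual pixel shader) shows how a pre-computed 3D lookup table indexed by quantized R, G, B values can return per-material blend weights, which are then combined into a blended physical property. The table size, palette, and emissivity values are illustrative assumptions, not values from the actual system.

    import numpy as np

    # Illustrative sizes: 32 quantization levels per color channel and a
    # four-material classification palette (all values are assumptions).
    LUT_BITS = 5                   # 32 levels per channel -> 32^3 LUT entries
    NUM_MATERIALS = 4              # e.g. vegetation, asphalt, concrete, stucco

    rng = np.random.default_rng(0)
    # Stand-in for the pre-computed table: each RGB bin holds normalized
    # per-material blend weights (rows of a Dirichlet draw sum to 1).
    lut = rng.dirichlet(np.ones(NUM_MATERIALS),
                        size=(2**LUT_BITS,) * 3).astype(np.float32)

    def rgb_to_material_weights(rgb_image):
        """Map filtered RGB pixels (uint8, HxWx3) to material blend weights.

        This emulates the shader's volume-texture fetch: quantize the color,
        index the 3D LUT, and return HxWxNUM_MATERIALS blend weights.
        """
        idx = rgb_image >> (8 - LUT_BITS)      # quantize 0..255 into LUT bins
        return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

    # Blend a physical property (hypothetical per-material emissivities).
    frame = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)
    weights = rgb_to_material_weights(frame)
    emissivity = np.array([0.98, 0.95, 0.91, 0.90], dtype=np.float32)
    pixel_emissivity = weights @ emissivity    # per-pixel blended property

Because the weights come from the filtered RGB value, pixels whose colors fall between palette materials receive smoothly blended properties, which is the mechanism behind the first two advantages listed above.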

The only offline processing required is the creation of a classification palette for supervised material classification, in which the user associates RGB colors with physical materials from a material database.
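The offline pre-computation of the 3D lookup table from such a palette can be sketched as follows. The inverse-distance blend in RGB space is an illustrative stand-in, since the abstract does not specify how TSC and MetaVR decompose a color into component materials; the palette colors are likewise hypothetical.

    import numpy as np

    # Hypothetical supervised palette: representative RGB colors, each
    # associated with a physical material from the material database.
    palette_rgb = np.array([[ 60, 110,  50],   # vegetation
                            [ 70,  70,  70],   # asphalt
                            [180, 180, 175],   # concrete
                            [210, 190, 160]],  # stucco
                           dtype=np.float32)

    def build_lut(palette, bits=5, p=2.0):
        """Pre-compute per-material blend weights for every quantized RGB bin.

        Each bin center is decomposed into a blend of palette materials by
        inverse-distance weighting in RGB space (an illustrative choice).
        """
        n = 2 ** bits
        centers = (np.arange(n, dtype=np.float32) + 0.5) * (256.0 / n)
        r, g, b = np.meshgrid(centers, centers, centers, indexing="ij")
        grid = np.stack([r, g, b], axis=-1).reshape(-1, 3)     # (n^3, 3)
        d2 = ((grid[:, None, :] - palette[None, :, :])**2).sum(-1)
        w = 1.0 / np.maximum(d2, 1e-6) ** (p / 2.0)            # inverse distance
        w /= w.sum(axis=1, keepdims=True)                      # normalize blends
        return w.reshape(n, n, n, len(palette)).astype(np.float32)

    lut = build_lut(palette_rgb)  # uploaded to the GPU as a volume texture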

Using the per-pixel material properties and the real-time parameters derived from environmental conditions, a per-pixel radiance raster image is computed for each frame, as would be seen by an actual thermal sensor. Simulated sensor processing is then performed on this radiance image in real time on the GPU, implementing blurring, noise, and Automatic Gain Control (AGC). The AGC process uses a histogram analysis of the radiance image to determine an appropriate display dynamic range and maps the radiance range into that display range; it can be disabled if the user prefers manual level and gain controls. Sensor noise is automatically computed as a function of the display’s dynamic range.
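A minimal sketch of the histogram-driven AGC and noise stage might look like the following. The percentile thresholds and Gaussian noise model are assumptions, since the abstract does not give the exact histogram analysis or noise formula.

    import numpy as np

    def agc_display(radiance, lo_pct=1.0, hi_pct=99.0, noise_frac=0.01,
                    rng=None):
        """Histogram-based AGC: map scene radiance to an 8-bit display range.

        Percentile clipping stands in for the histogram analysis; noise is
        drawn as a fraction of the display dynamic range (both assumptions).
        """
        lo, hi = np.percentile(radiance, [lo_pct, hi_pct])  # scene range
        display = (radiance - lo) / max(hi - lo, 1e-12)     # normalize to 0..1
        if rng is not None:
            display = display + rng.normal(0.0, noise_frac, radiance.shape)
        return np.clip(display * 255.0, 0.0, 255.0).astype(np.uint8)

    radiance = np.abs(np.random.default_rng(1).normal(5.0, 1.0, (480, 640)))
    image = agc_display(radiance, rng=np.random.default_rng(2))

Disabling AGC corresponds to replacing the percentile-derived range with user-supplied level and gain values.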

Geotypical content, such as 3D culture, can be directly mapped to material codes using an external file which associates texture names with material codes. The geospecific and geotypical content use the same underlying physics model, resulting in a radiometrically consistent scene.
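For the geotypical path, the external association file can be as simple as one texture-name/material-code pair per line. The format and parser below are a hypothetical example; the abstract does not specify VRSG's actual file layout.

    # Hypothetical mapping file (materials.map), one pair per line:
    #     barn_roof.rgb    12
    #     pine_bark.rgb    7
    def load_material_map(path):
        """Parse a file associating texture names with material codes."""
        mapping = {}
        with open(path) as f:
            for line in f:
                line = line.split("#", 1)[0].strip()  # drop comments/blanks
                if not line:
                    continue
                name, code = line.split()
                mapping[name] = int(code)   # e.g. {"barn_roof.rgb": 12}
        return mapping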

A sequence of simulated IR sensor images, generated using the new technology described here, will be included in the paper. The images show a simulation scene at various times during a 24-hour IR sensor simulation. At midnight, the IR image shows the warmer vegetation and the cooler asphalt road and concrete structures. The cool 3D structures are difficult to discern, as they have cooled to an even temperature and blend into the background. In the afternoon, the sun has heated the asphalt road and the stucco sides of the buildings. The vegetation is now cool relative to the sun-heated concrete and asphalt, illustrating the thermal crossover effect.

The presented IR images are generated using the IR sensor model in MetaVR’s Virtual Reality Scene Generator™ (VRSG™) version 5.8 software. VRSG is a Microsoft DirectX-based IG that provides geospecific sensor simulation with game-quality graphics. Its IR model is built on top of TSC’s RealIR™ application programming interface (API) library. The RealIR API provides complete functionality for a physics-based IR model, including sensor specification, time-varying environment and atmosphere, physics-based computation of scene temperatures and radiances, and atmospheric propagation using LOWTRAN or MODTRAN. These tools meet the needs of users who wish to incorporate real-time imaging IR into their simulation systems.

VITA
Dr. Uri Bernstein is a physicist with 35 years of experience in the simulation of IR and radar sensors, simulation database development, and software development. He is the product manager for TSC’s RealIR simulation software and has managed several programs for the development of hardware-in-the-loop (HWIL) IR simulators. His simulation experience ranges from very accurate non-real-time simulations, such as TSC’s hyperspectral IR simulation and pulse-by-pulse radar simulations, to real-time IR simulations for HWIL and man-in-the-loop applications. Dr. Bernstein has also managed or participated in several TSC programs for feature extraction from optical, multispectral, LIDAR, and IFSAR data. These programs were intended to automate the process of constructing simulation databases in industry-standard formats such as CDB. He has presented several papers on IR simulation at national conferences.