ACRS 2004, New Generation Sensors and Applications: Airborne Sensing

Research on verification and calibration of the multi-spectral aerial photographing system (PKNU 3)

Nam-Chun, Cho
Master 1, Pukyong National University, Dept. of Satellite Information Sciences, KOREA,
Telephone : 051)620-6578,
Email: pkphoto78@hanmail.co.kr

Chul-Uong, Choi
Professor, Pukyong National University, Dept. of Satellite Information Sciences, KOREA,
Email: cuchoi@pknu.ac.kr

Seong-Woo, Jeon
Researcher, Korea Environment Institute, KOREA,
Telephone: 02) 380-7661,
Email: swjeon@kei.re.kr

Hui-Chul, Jung
Researcher, Korea Environment Institute, KOREA,
Telephone: 02) 380-7780,
Email: hchjung@kei.re.kr


Abstract
In this study, we quantify the spectral characteristics of the multi-spectral camera (REDLAKE MS 4000) and the thermal infrared sensor (Raytheon IRPro) used in the small-format aerial photographing system, PKNU 3, developed by our laboratory. We computed the radial lens distortion of the multi-spectral camera by surveying a series of ground control points (GCPs), and analyzed the accuracy after correcting the detected distortion. By analyzing the correlation between the radiant energy in each band and the brightness value on the image, we also computed the sensitivity of each band on each CCD, which must be corrected to maintain the proper balance among the R, G, B, and IR bands of an image. We estimated the sensitivity of the thermal infrared sensor by analyzing the correlation between the brightness value on the image and the radiant temperature associated with the emissivity of each object. By determining the spectral characteristics of each sensor, this study allows normalized data to be generated from PKNU 3 imagery. These equipment calibration techniques can thus give equipment such as the PKNU 3 added value in roles such as environmental monitoring.

1. Introduction
The multi-spectral sensor (MS 4000) implemented in the multi-spectral aerial photographing system (PKNU 3) uses a triple-CCD (R, B&G, and IR) sensor, as opposed to a camera with a single CCD, so that more precise data may be obtained on each band channel of any particular image. Quantifying camera lens distortion, the individual band sensitivity of each CCD, and the sensitivity of the thermal infrared sensor allows the data to be normalized into a more accurate image. To date, few studies have analyzed lens distortion and mitigated its effects in digital camera photography. One domestic study analyzed and corrected the lens distortion of a CCD camera for a mobile mapping system (Dong-hun Jeong, 2002). Foreign studies on the possibility of generating a digital map of a narrow area using Kodak DCS460 cameras also discussed the issue of lens distortion (S. Manson et al., 1997; Clive S. Fraser et al., 1997).

2. PKNU 3 sensor
The multi-spectral aerial photographing system designated PKNU 3 consists of a multi-spectral camera (REDLAKE MS 4000) that can take R, G, B, and IR images simultaneously, and a thermal infrared camera (Raytheon IRPro).

2.1. REDLAKE MS 4000
The REDLAKE MS 4000 is a triple-CCD camera that can take R, G, B, and IR images simultaneously, producing RGB and CIR images of the target area at 1600 × 1200 pixels (7.4 μm per pixel).


Figure 1 B,G,R,IR Band and FWHM of MS 4000 sensor

The green and blue bands are detected on one CCD with a Bayer pattern consisting of rows of red-green-red-green and blue-green-blue-green pixels, while a separate monochrome CCD acquires the red plane at full resolution. Images captured with an RGB/CIR triple-CCD camera such as the MS 4000 used in this study therefore have comparably higher resolution than images acquired with a single-chip color camera, because in the RGB/CIR configuration the red image is acquired at full resolution.
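For comparison, a short sketch (our illustration, not from the paper) of how many samples each band receives on a standard 2×2-tile Bayer CCD versus a dedicated monochrome CCD of the same 1600 × 1200 format:

```python
# Sketch: per-band sample counts for a Bayer-patterned CCD versus a
# dedicated full-resolution CCD, assuming a standard 2x2 Bayer tile
# (2 green cells, 1 red, 1 blue). The MS 4000's exact tiling is an
# assumption here; the paper only describes alternating rows.
def bayer_sample_counts(width, height):
    total = width * height
    return {"green": total // 2, "blue": total // 4, "red": total // 4}

full_resolution = 1600 * 1200          # every pixel sampled on one CCD
mosaic = bayer_sample_counts(1600, 1200)
# A band sharing a Bayer CCD is sampled at only 1/4 to 1/2 of the
# sites that a dedicated monochrome CCD would give it.
```

This is why the full-resolution red plane of the MS 4000 carries more native detail than the interpolated green/blue planes.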

2.2 Raytheon IRPro
The thermal infrared camera used in this study, a Raytheon IRPro, senses energy in the 7–14 μm wavelength range as still and moving pictures and displays it through an LCD viewer. Thermal energy patterns on the images are displayed via five colors (red, orange, yellow, green, and blue), and temperatures are expressed in terms of brightness values.

3. Calibration and correction of Sensors

3.1. Multi-spectral camera (RGB/CIR MS 4000 sensor)
Prior to obtaining data from aerial photography, the RGB/CIR MS 4000 sensor was first calibrated so that geometrically and radiometrically accurate images could be taken by the PKNU 3.

3.1.1 Geometric correction of multi-spectral camera
Camera lens distortion due to the geometric structure of the lens comprises varying degrees of radial and tangential distortion. Tangential distortion typically has a much smaller effect than radial distortion and can be safely disregarded during calibration; we therefore focus on the correction of radial distortion in this study.

A panel of 121 evenly arrayed GCPs, surveyed with a total station, was used to determine radial lens distortion. The panel was then imaged by the MS 4000 sensor. Ground coordinates (x, y, z) were calculated from the horizontal angle (HA), vertical angle (VA), and slope distance (SD) values surveyed by the total station, and from these we computed the radial distance of the ground coordinates, the radial distance on the image (r), and the radial distortion (Δr).
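Under a common total-station convention (vertical angle measured from the zenith, horizontal angle clockwise from local north; the paper does not state its convention), the ground coordinates could be derived as follows:

```python
import math

def polar_to_ground(ha_deg, va_deg, sd):
    """Convert total-station observations (horizontal angle HA,
    vertical angle VA, slope distance SD) to local ground
    coordinates. Assumes VA is a zenith angle and HA is measured
    clockwise from local north (a common, but here assumed, setup)."""
    ha = math.radians(ha_deg)
    va = math.radians(va_deg)
    hd = sd * math.sin(va)      # horizontal distance
    x = hd * math.sin(ha)       # east component
    y = hd * math.cos(ha)       # north component
    z = sd * math.cos(va)       # height component
    return x, y, z
```

For a horizontal sight (VA = 90°) the full slope distance maps into the horizontal plane and z is zero.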

Lens distortions were then computed by comparing the ground coordinates (set as a reference) surveyed by the total station with the coordinates converted from pixel numbers on the image of the panel. The lens distortion characteristic was ascertained by fitting the coefficients of radial distortion in eq. (3-1) as K0 = 6.4817200e-3, K1 = -4.4270500e-4, K2 = 3.9596100e-6.



Δr = K0 r + K1 r^3 + K2 r^5    (3-1)

(Δr: radial distortion, r: radial distance, K0~Kn: coefficients of radial distortion)

Image coordinates (x, y) were then corrected for the radial lens distortion through eq. (3-2):

x' = x - (Δr/r) x,  y' = y - (Δr/r) y    (3-2)

The precision of x and y and the RMSE (Root Mean Square Error) on the image before and after correction of the lens distortion are shown in table 1.
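Eqs. (3-1) and (3-2) with the fitted coefficients can be sketched as below; the odd-polynomial distortion model and a principal point at the image center are assumptions, since the paper does not print the equations in full:

```python
import math

# Coefficients of radial distortion fitted in the calibration (eq. 3-1)
K0 = 6.4817200e-3
K1 = -4.4270500e-4
K2 = 3.9596100e-6

def radial_distortion(r):
    """Radial distortion dr at radial distance r (eq. 3-1),
    assuming the usual odd-polynomial model."""
    return K0 * r + K1 * r**3 + K2 * r**5

def correct_point(x, y, xc=0.0, yc=0.0):
    """Shift an image point (x, y) back along the radial direction
    from the principal point (xc, yc) (eq. 3-2)."""
    dx, dy = x - xc, y - yc
    r = math.hypot(dx, dy)
    if r == 0.0:
        return x, y
    dr = radial_distortion(r)
    return x - dx * dr / r, y - dy * dr / r
```

Points on the optical axis are left untouched; all others move inward or outward along the radius by Δr.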


Table 1 Precision

The small lens distortion determined above (0.57–0.92 pixels) is attributed to the use of only the central 12% of the available photographic area of the lens, where lens distortion is typically lowest. It is thus possible to obtain geometrically precise images with less than 1 pixel of lens distortion through the calibration and lens distortion compensation procedures of the REDLAKE MS 4000 sensor.
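As an illustration of how the precision figures in Table 1 are obtained, the RMSE over the GCP residuals is computed as below; the residual values are hypothetical, not the paper's data:

```python
import math

def rmse(errors):
    """Root Mean Square Error of a list of residuals (in pixels)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical residuals (pixels) before and after distortion correction
before = [0.9, -0.8, 0.7, -0.6]
after = [0.3, -0.2, 0.25, -0.15]
```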

3.2. Radiometric correction
Radiometric correction is necessary so that the emitted and reflected energy of an object measured by the multi-spectral sensor agrees with that detected by a spectrometer. Radiometric correction should therefore be conducted to find the actual intensity of solar radiant energy and reflectance. The radiometric distortion of the MS 4000 was verified before it was used for multi-spectral photography.

3.2.1. Comparison and analysis between ground truth data and pixel values on the image
To compare and analyze actual ground data against pixel values of the image, a GretagMacbeth ColorChecker Color Rendition Chart was used; the results are shown in table 2.

Table 2 Radiometric correction in each band (RGB)

Calibration between the measured RGB reflectance of the color chart and the RGB values on the image (radiometric correction) to find the CF (correction factor) values was done as follows:

Correction factor (CF) = ground target spectral reflectance / (RGB value / 255)

The RGB sensitivity of the sensor is calculated from the CF values. According to the CF values in each band of the MS 4000, the CCD sensor for the G/B bands is more sensitive than that for the red band. Therefore, the gain value, which adjusts the incoming light to the CCD through the camera lens, should be raised for the CCD measuring the G/B bands relative to the CCD reserved for the red band, to maintain the proper balance among the R, G, and B bands of an image.
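The CF computation above can be expressed directly; the patch reflectance and DN below are hypothetical examples, not values from Table 2:

```python
def correction_factor(ground_reflectance, dn, max_dn=255):
    """CF = ground target spectral reflectance / (DN / max_dn),
    as defined in the text for an 8-bit image."""
    return ground_reflectance / (dn / max_dn)

# Hypothetical ColorChecker patch: 20% reflectance imaged at DN 120
cf_example = correction_factor(0.20, 120)
```

A CF above the value obtained for a reference band indicates the sensor under-responds there, so its gain should be raised accordingly.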

3.2.2. Analysis of the correlation between brightness values and reflected radiant energy
The correlation between brightness values on the image and the measured solar radiant energy was analyzed using different f-stops (2.8, 8, 11, 16) and exposure times to determine the maximum and minimum quantized pixel values (the effective sensing range of the MS 4000 sensor), as shown in figure 2. The effective QCAL values were then converted to spectral radiance at the sensor's aperture through the following equation, to compare with actual ground values of radiant energy:

L = ((LMAX - LMIN) / (QCALMAX - QCALMIN)) × (QCAL - QCALMIN) + LMIN

(L = spectral radiance at the sensor's aperture (W/(m²·sr·μm)), QCAL = the quantized calibrated pixel value in DN, LMIN = the spectral radiance scaled to QCALMIN, LMAX = the spectral radiance scaled to QCALMAX, QCALMIN = the minimum quantized calibrated pixel value (corresponding to LMIN) in DN, QCALMAX = the maximum quantized calibrated pixel value (corresponding to LMAX) in DN)
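The DN-to-radiance conversion defined above, as a small sketch (the LMIN/LMAX values in the test are placeholders, not the paper's calibration constants):

```python
def dn_to_radiance(qcal, lmin, lmax, qcalmin, qcalmax):
    """Convert a quantized calibrated pixel value (DN) to spectral
    radiance at the sensor's aperture, W/(m^2 sr um), by linear
    rescaling between the sensor's minimum and maximum radiances."""
    return ((lmax - lmin) / (qcalmax - qcalmin)) * (qcal - qcalmin) + lmin
```

By construction, QCALMIN maps to LMIN and QCALMAX to LMAX, with all intermediate DN values interpolated linearly.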


Figure 2 Effective sensed range of MS 4000 sensor

From that analysis, the effective sensing ranges of the MS 4000 sensor were found to be 0–125 in the blue band, 24–130 in the green band, and 0–125 in the red band, using the central 31–47% of the image sensing area of the CCD sensor.

The relationship between the spectral radiance at the sensor's aperture within the effective sensing range and the surveyed radiant energy was proportional, with correlation coefficients of 0.83, 0.81, and 0.86 in the blue, green, and red bands respectively, as shown in figure 3. The sensitivity of the red-band CCD was superior to the others because a monochrome CCD sensor acquires the red plane at full resolution, while the green and blue bands are detected on the same CCD with a Bayer pattern.


Figure 3 Correlation between spectral radiance in effective sensed range and radiant energy

3.3. Calibration and correction of thermal infrared sensor
If the emissivity (ε) of an object is known, the total radiant flux of an actual object can be calculated by modifying the Stefan-Boltzmann law for black-body radiation (M_b = σT_kin^4). The following grey-body equation factors in the temperature and emissivity of an object, allowing comparison between the calculated radiant flux and the values reported by the thermal IR sensor:

M = ε σ T_kin^4


The thermal infrared sensor reports not the kinetic temperature (T_kin) but the outward radiant temperature (T_rad) of an object. The relationship between the radiant temperature detected by the sensor and the kinetic temperature is T_rad = ε^(1/4) T_kin. Therefore, radiant temperatures for each object were calculated with regard to emissivity, to analyze their correlation with brightness values on an image, as shown in table 3.
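The grey-body flux and radiant temperature relations above can be sketched as follows (the object values used in the test are hypothetical):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_flux(emissivity, t_kin):
    """Grey-body radiant exitance M = eps * sigma * T_kin^4."""
    return emissivity * SIGMA * t_kin**4

def radiant_temperature(emissivity, t_kin):
    """Radiant temperature seen by the sensor:
    T_rad = eps^(1/4) * T_kin."""
    return emissivity ** 0.25 * t_kin
```

A perfect black body (ε = 1) radiates at its kinetic temperature; any real object with ε < 1 appears cooler to the sensor than it actually is.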

Table 3 Surveyed Temperatures, Radiant Temperatures, and Brightness Values

The correlation coefficients between the outward radiant temperature (with regard to each object's emissivity) and the sensitivity of the thermal IR sensor were as high as 0.834 on the concrete, 0.889 on the wall, and 0.725 on the grass.

4. Conclusion
Prior to the actual aerial photography test of the PKNU 3, we conducted an investigation to determine the operational characteristics of each sensor. We analyzed the lens distortion, sensitivity to each band of each CCD sensor, as well as the thermal response of the thermal infrared sensor and have made the following conclusions:

First, the relatively small 0.57–0.92 pixel distortion is due to the use of the central 12% of the available photographic area; lens distortion typically increases radially outward from the center of a lens.

Second, the light sensitivity gain value must be kept at a higher level for the CCD sensing the G/B bands than for the CCD sensing the red band, in order to maintain the proper balance among the R, G, and B bands of any particular image.

Third, the sensitivity of the red-band CCD was superior to the others because a monochrome CCD sensor acquires the red plane at full resolution, while the green and blue bands are sensed on the same CCD with a Bayer pattern.

Fourth, the correlation coefficients between the outward radiant temperature (with regard to each object's emissivity) and the sensitivity of the thermal IR sensor were as high as 0.834 on the concrete, 0.889 on the wall, and 0.725 on the grass.

5. Acknowledgements
We express our gratitude to Pukyong National University, the Ministry of Maritime Affairs & Fisheries, and the Korea Environment Institute for their support of this study.

6. References
  • Livingstone, D., Raper, J. and McCarthy, T., 1999. Integrating areal videography and digital photography with terrain modelling. Geomorphology, 29, 77-92.
  • Jensen, J. R., 1996. Introductory Digital Image Processing: A Remote Sensing Perspective. Prentice-Hall, pp. 17-62, 107-135.
  • Jeong, D.-H. and Kim, B.-G., 2002. Lens Distortion Correction for CCD Cameras Using the Projective Transformation Method. Korean Society of Civil Engineers, 22(5), 995-1001.
  • Lichti, D. D. and Chapman, M. A., 1995. CCD Camera Calibration Using the Finite Element Method. SPIE Videometrics IV.