GISdevelopment.net ---> AARS ---> ACRS 1990 ---> Digital Image Processing

A method for cloud classification of AVHRR image data with fractal dimension

Ryoichi Kawads and Mikio Takagi
Institute of Industrial Science
University of Tokyo, Tokyo, Japan


Abstract
The need for automated cloud detection and classification arises from the massive quantities of data produced by satellites; only automated processing can handle such volumes of data.

In this paper, we present a method for multispectral cloud classification of Advanced Very High Resolution Radiometer (AVHRR) data, which uses fractal dimension for texture analysis as well as other features such as channel-1 visible reflectivity and channel-4 infra-red brightness temperature.

Features that measure texture are important, especially for night-time analysis of infra-red data (visible features are not available for night-time analysis). These features provide information for distinguishing cumuliform clouds from stratiform clouds or clear regions.

In order to represent different textures, we calculated the fractal dimension of each pixel in the images, which was found to be more effective than local difference. Use of fractal dimension leads to correct interpretation of coast lines and tidal fronts, which are often mis-classified as clouds when difference is used.

The classification is based on the maximum likelihood method, which uses features extracted from AVHRR data of NOAA (National Oceanic and Atmospheric Administration) satellites.

Introduction
In multispectral classification of satellite images, local texture, such as variance, standard deviation, and difference, is a very important feature, especially at night. However, use of these features often leads to false classification of coast lines and tidal fronts as clouds.

In this study we used fractal dimension to represent different textures. Unlike the features described above, fractal dimension values at the border lines of different domains are not as large as those of clouds. Although the calculation time of fractal dimension is generally larger than that of the other features, we calculated it pixel by pixel along lines through the pixel considered, which is much faster than calculation over a block whose centre point is the pixel. Thus, we obtained good results almost free of the problems above.

The images used are map images made from Advanced Very High Resolution Radiometer data of the NOAA meteorological satellites. Channel 1 (0.58-0.68 μm; visible) and channel 2 (0.725-1.1 μm; near infra-red) contain reflectivity, while channel 4 (10.3-11.3 μm; far infra-red) and channel 5 (11.5-12.5 μm; far infra-red) contain brightness temperature.

Fractal dimension
  1. Calculation method of fractal dimension in this study

    There are several ways to calculate fractal dimension (hereafter FD). The procedure we used is described below:

    Calculation is performed along two lines whose middle point is the pixel considered (FD is estimated pixel by pixel), namely the local latitude and longitude lines (see Fig. 1(a)). The pixel values along each of the two lines are used to estimate a fractal dimension, and the mean of the two estimates is taken as the fractal dimension of the pixel.

    The fractal dimension along a line is calculated in this way. See Fig. 1(b), where I(a) is the pixel value at position a and l is the length of the domain. First, with b = a + r, we calculate n(r) below for all a in (0, l - r):

    n(r) = ( | I(a) - I(b) | / r ) + 1 .................(1)

    Next, using the mean of n(r) over all a, N(r) is derived:

    N(r) = n(r) l / r ..........................(2)

    N(r) is calculated for all r in (0, l), and the magnitude of the gradient of the regression line of log N(r) against log r is the estimated fractal dimension.
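The per-line estimate above can be sketched in Python as follows. This is a minimal illustration of eqs. (1)-(2) and the two-line averaging of Fig. 1(a); the function names are our own and NumPy is assumed.

```python
import numpy as np

def fd_along_line(values):
    """Estimate the fractal dimension of a 1-D profile of pixel values.

    For each separation r, eq. (1) gives n(r) = |I(a) - I(a+r)|/r + 1,
    averaged over all valid a; eq. (2) scales it to N(r) = mean_n * l / r.
    The magnitude of the slope of log N(r) against log r is the FD.
    """
    I = np.asarray(values, dtype=float)
    l = len(I)
    rs, Ns = [], []
    for r in range(1, l):
        n = np.abs(I[:-r] - I[r:]) / r + 1.0   # eq. (1) for every a
        rs.append(r)
        Ns.append(n.mean() * l / r)            # eq. (2)
    slope = np.polyfit(np.log(rs), np.log(Ns), 1)[0]
    return -slope                              # N(r) ~ r**(-FD)

def fd_of_pixel(image, y, x, half=4):
    """Mean of the FDs along the local row and column segments
    (length 2 * half, i.e. l = 8 when half = 4) centred on (y, x),
    as in Fig. 1(a)."""
    row = image[y, x - half:x + half]
    col = image[y - half:y + half, x]
    return 0.5 * (fd_along_line(row) + fd_along_line(col))
```

A smooth profile yields an FD near 1, while a rough, cloud-like profile pushes the value toward 2, which is what makes this feature discriminative.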


    Figure 1: (a) The lines along which fractal dimension is calculated.
    (b) Calculation of fractal dimension along a line.

  2. Examples of FD calculation

    As written in chapter 1, the fractal dimension of coast lines and tidal fronts is not as large as that of clouds. That is a merit of using fractal dimension, rather than variance or difference, for texture representation.

    Fig. 3 (a) and (b) show the values of difference and fractal dimension for each pixel of the image shown in Fig. 2. The local difference assigned to each pixel is defined as follows [5]:

    S = ( |a0 - a1| + |a0 - a2| + |a0 - a3| + |a0 - a4| ) / 4 ..........................(3)

    where a0 is the ch4 brightness temperature of the pixel considered, and a1-a4 are those of the neighbouring pixels shown in Fig. 4. In Fig. 3(a), coasts and tidal fronts have large difference values, like clouds, while in Fig. 3(b) they are not clearly seen.
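For reference, the local difference of eq. (3) can be computed for every interior pixel at once with a few array shifts. This is a sketch (the function name is ours), assuming NumPy:

```python
import numpy as np

def local_difference(img):
    """Mean absolute difference between each pixel (a0) and its four
    neighbours (a1-a4, as in Fig. 4), i.e. eq. (3), evaluated on the
    interior of the image."""
    c = img[1:-1, 1:-1]                        # a0
    up, down = img[:-2, 1:-1], img[2:, 1:-1]   # vertical neighbours
    left, right = img[1:-1, :-2], img[1:-1, 2:]
    return (np.abs(c - up) + np.abs(c - down) +
            np.abs(c - left) + np.abs(c - right)) / 4.0
```

A uniform region gives zero difference, while any sharp boundary, cloud edge, coast line, or tidal front alike, gives a large value, which is exactly the ambiguity the paper attributes to this feature.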

    These results suggest that if we use fractal dimension to represent different textures, we can distinguish low clouds from coast lines or tidal fronts more effectively than when using difference.

Figure 2: Original image (NOAA-10, 1987.5.9.1sh, ch1).


Figure 3: (a) The brightness of each pixel in this image represents the
value of the local difference around the pixel in the fig. 2 image.
The difference is calculated by eq. 3.
(b) The brightness represents the fractal dimension of fig. 2.


Figure 4: Neighborhood pixels considered when calculating difference.

Maximum Likelihood Method
A point-wise maximum likelihood classifier is used. The features are:

NOAA-10 (night) .......... ch4, ch3-ch4, FD
NOAA-10 (day) .......... ch1, ch4, ch1-ch2, ch3-ch4, FD
NOAA-9, 11 (night) .......... ch4, ch3-ch4, ch4-ch5, FD
NOAA-9, 11 (day) .......... ch1, ch4, ch1-ch2, ch3-ch4, ch4-ch5, FD

FD of the ch4 brightness temperature is calculated for each pixel by the method described in chapter 2, with l in equation (2) set to 8. The features other than FD are from reference [6].
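Assembling the per-pixel feature vector is then a matter of stacking channel values, channel differences, and the FD image. A minimal sketch for the NOAA-10 night-time case (the helper name is hypothetical; `fd_image` is assumed to hold the per-pixel FD of ch4 brightness temperature):

```python
import numpy as np

def night_features(ch3, ch4, fd_image, y, x):
    """Hypothetical helper: the NOAA-10 night-time feature vector
    (ch4, ch3 - ch4, FD) for the pixel at (y, x)."""
    return np.array([ch4[y, x],
                     ch3[y, x] - ch4[y, x],
                     fd_image[y, x]])
```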

There are six classification classes:

sea, land, high cloud, middle cloud, low cloud, and sun glint.

The following is the definition of the membership function for class c [7]:


fc(x) = Pc(x) / ( P1(x) + P2(x) + ... + Pm(x) ) .................(4)

where

Pi(x) = 1 / ( (2π)^(N/2) |Σi|^(1/2) ) · exp{ -1/2 (x - mi)^T Σi^(-1) (x - mi) } .................(5)

x : a pixel vector,
N : the dimension of the pixel vectors,
m : the number of predefined classes (1 ≤ i ≤ m),
mi : the mean vector of class i,
Σi : the covariance matrix of class i.
In this study, m = 6 (described above). Six membership grades are assigned to each pixel to indicate the extents to which the pixel belongs to the six predefined classes. Then, for hard classification, the class that has the largest membership value is assigned to each pixel [7][8].
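The classifier can be sketched as below, with the membership grade for class c taken as the Gaussian density of eq. (5) normalized over all m classes, in the spirit of the fuzzy supervised scheme of [7]. The function names are ours, and in practice the class means and covariances would be estimated from the supervising data.

```python
import numpy as np

def memberships(x, means, covs):
    """Membership grades of pixel vector x for m Gaussian classes:
    each class-conditional density Pi(x) (eq. 5) divided by the sum
    over all classes, so the grades sum to one."""
    N = len(x)
    ps = []
    for m_i, S_i in zip(means, covs):
        d = x - m_i
        norm = (2 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(S_i))
        ps.append(np.exp(-0.5 * d @ np.linalg.solve(S_i, d)) / norm)
    ps = np.array(ps)
    return ps / ps.sum()

def classify(x, means, covs):
    """Hard classification: the index of the largest membership grade."""
    return int(np.argmax(memberships(x, means, covs)))
```

Keeping the full grade vector (rather than only the argmax) is what allows the soft, fuzzy interpretation of mixed pixels before the hard decision is made.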

Results and discussions
We performed the calculation on a Sequent S81 computer. The original map images have 512 x 512 pixels and are already calibrated and transformed onto Mercator projection planes. The distance between a pixel and its neighbouring pixel is about 4 km. Each pixel has five values: the reflectivities of ch1 and ch2 and the brightness temperatures of ch3, ch4, and ch5.

Fig. 6(a) and Figs. 7-9 show the original images and classified images. For comparison, a classification result using difference instead of fractal dimension is shown in Fig. 6(b) (difference is defined by eq. (3)).

In Fig. 6(a), classification is performed with only two features besides FD, and the accuracy is rather lower than in the other results. However, because fractal dimension is used for texture representation, coast lines and tidal fronts are classified into their proper classes, although some pixels in the sea are misinterpreted as land. Fig. 6(b) is derived from the same original image as Fig. 6(a), using difference instead of fractal dimension. In this result image, tidal fronts and coast lines are misinterpreted as low clouds.

In Fig. 7, for which the season is winter, snow-covered land is mis-classified as low cloud. If a class for snow-covered land were added, this might be improved.

Figs. 8 and 9 are day images, and the classification results are generally good. In Fig. 9, sun glint is well detected. However, compared with the classification of night images, the calculation time is rather long, as noted in the figure captions, because more features are used in the maximum likelihood classifier.

In this classification, supervising data is made for each image, so the process is not completely automatic at this time. However, by building a database of supervising data for several different conditions, such as the four seasons and the times of day, automatic classification will be realized.


Figure 5: (a) Gray levels and classes for figs. 6-8.
(b) Gray levels and classes for fig. 9.


Figure 6: Classified image of fig.2. (a) The features are ch4, ch3-ch4, and FD.
Elapsed time on Sequent S81 computer is about 180 seconds.
(b) The features are ch4, ch3-ch4, and difference by eq.3.
Elapsed time is about 90 seconds.


Figure 7: (a) Original image (NOAA-11, 1988.12.14.1h, ch4).
(b) Classified image of (a). The features are ch4, ch3-ch4,
ch4-ch5, and FD. Elapsed time is about 220 seconds.


Figure 8: (a) Original image (NOAA-10, 1987.5.9.6h, ch2).
(b) Classified image of (a). The features are ch1, ch4, ch1-ch2,
ch3-ch4, and FD. Elapsed time is about 270 seconds.


Figure 9: (a) Original image (NOAA-9, 1987.5.9.13h, ch2).
(b) Classified image of (a). The features are ch1, ch1-ch2, ch3-ch4,
ch4-ch5 and FD. Elapsed time is about 310 seconds.


Conclusion
It is shown in this study that fractal dimension is generally more useful than difference for texture representation, especially for the classification of coast lines and tidal fronts. The calculation time of fractal dimension is larger than that of difference, but by calculating FD along local lines, not over blocks, we reduce this problem to such an extent that we can perform the FD calculation pixel by pixel, which leads to more accurate classification.

References
  • J.T. Bunting and K.R. Hardy, "Cloud identification and characterization from satellites," in Satellite Sensing of a Cloudy Atmosphere: Observing the Third Planet, A. H. Sellers, Ed., Taylor & Francis, 1984.
  • H. Takayasu and M. Takayasu, "What is Fractal?", Diamond-sha, Tokyo, Japan, 1988.
  • H. Nakayama, M. Sone, and M. Takagi, "Meteorological satellite analysis using fractal dimension and lower-order statistics," Proc. 5th Scandinavian Conf. on Image Analysis, 1987, pp. 261-268.
  • A. Detwiler, "Analysis of cloud imagery using box counting," Int. J. Remote Sensing, 1990, Vol. 11, No. 5, pp. 887-898.
  • H. Le Gleau, M. Derrien, L. Harang, L. Lavanant, and A. Noyalet, "Operational cloud mask using the AVHRR of NOAA-11," in Proc. 4th AVHRR Data Users' Meeting (Rothenburg, F.R. Germany), 1989, pp. 69-72.
  • K.G. Karlsson, "Development of an operational cloud classification model," Int. J. Remote Sensing, 1989, Vol. 10, Nos. 4 and 5, pp. 687-693.
  • F. Wang, "Fuzzy supervised classification of remote sensing images," IEEE Trans. Geosci. Remote Sensing, Vol. GE-28, pp. 194-201, Mar. 1990.
  • J.R. Key, J.A. Maslanik, and R.G. Barry, "Cloud classification from satellite data using a fuzzy sets algorithm: A polar example," Int. J. Remote Sensing, 1989, Vol. 10, No. 12, pp. 1823-1842.