Multisensor image fusion

Date

1998-12

Publisher

Texas Tech University

Abstract

In recent years, a new discipline called multisensor data fusion has been developed to solve a diverse set of problems with common characteristics. Multisensor fusion seeks to combine data from multiple sensors to perform inferences that may not be possible from any single sensor alone. Several image fusion techniques are available, viz. statistical methods, feature selection methods, neural networks, and pyramidal methods. In this thesis, fusion is performed using wavelets. Wavelet-based fusion techniques have the advantage of processing different frequency ranges differently, while providing a fast way to integrate local spatial information, good control over noise, and an easy way to reconstruct the final product.

This thesis reviews common image fusion techniques and describes the implementation of a wavelet-based image fusion algorithm. The results obtained by applying different fusion schemes in the wavelet domain are presented and compared with those obtained by applying fusion schemes in the spatial domain. Edge detection is performed to show the additional features present in the fused images.
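To illustrate the general idea of fusing in the wavelet domain (decompose both source images, combine coefficients band by band, then reconstruct), the following is a minimal sketch in Python. It assumes the PyWavelets (pywt) and NumPy libraries; the choice of wavelet ('db2'), decomposition level, and fusion rules (average the approximation band, keep the larger-magnitude detail coefficient) are illustrative assumptions, not necessarily the schemes evaluated in the thesis.

import numpy as np
import pywt

def fuse_images(img_a, img_b, wavelet="db2", level=2):
    """Fuse two registered, equally sized grayscale images in the wavelet domain."""
    coeffs_a = pywt.wavedec2(img_a, wavelet, level=level)
    coeffs_b = pywt.wavedec2(img_b, wavelet, level=level)

    fused = []
    # Approximation (low-frequency) band: simple average of the two sources.
    fused.append((coeffs_a[0] + coeffs_b[0]) / 2.0)
    # Detail (high-frequency) bands: keep the coefficient with larger magnitude,
    # which tends to preserve edges and salient features from either sensor.
    for a_bands, b_bands in zip(coeffs_a[1:], coeffs_b[1:]):
        fused.append(tuple(
            np.where(np.abs(a) >= np.abs(b), a, b)
            for a, b in zip(a_bands, b_bands)
        ))

    # Inverse transform reconstructs the fused image in the spatial domain.
    return pywt.waverec2(fused, wavelet)

Usage would look like fused = fuse_images(visible.astype(float), infrared.astype(float)), where visible and infrared are hypothetical co-registered sensor images; other fusion rules can be substituted per band without changing the overall decompose-combine-reconstruct structure.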
