Title: A New Fusion Approach for Panchromatic and Multi-spectral Images Based on Multi-scale Bilateral Filtering and IHS Transform
Author: Wang, Shih-Yi (王世翼)
Keywords: image fusion; remote sensing imagery; panchromatic image; multispectral image; bilateral filter; Phen-Lan Lin; Chu-Hui Lee
Publisher: Department of Computer Science and Engineering
In this thesis, we propose a new multi-resolution method that extracts image content at an appropriate decomposition scale. For satellite imagery, fusing the panchromatic (PAN) image with the multispectral (MS) image to obtain a high-resolution color satellite image is an important problem. We combine two families of approaches, methods that put spatial resolution first and multi-scale decomposition methods that put preservation of the MS image's color information first: exploiting the properties of the IHS transform together with a multi-scale bilateral filter, we design a new and effective fusion method that resolves the spectral distortion caused by traditional fusion methods. The fusion procedure has two stages. In the first stage, the MS image is converted by the IHS transform, and its intensity (I) component and the PAN image are each decomposed by the multi-scale bilateral filter into corresponding spectral-information and spatial-information images. An effective evaluation measure lets us choose just the right number of decomposition levels, avoiding both under-decomposition, which extracts insufficient spatial detail, and over-decomposition, which over-dilutes the spectral component of the fused image. The proposed method reduces the spectral error very effectively. Experimental results show that our method best preserves spatial resolution and, compared with methods proposed by other researchers, reduces the spectral error to a minimum.
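As background, the conventional fast-IHS fusion that the thesis improves on injects the PAN-minus-intensity difference equally into every band, which is the source of the spectral distortion discussed above. Below is a minimal numpy sketch of that conventional baseline only, not the proposed multi-scale method; the function name and array shapes are illustrative assumptions.

```python
import numpy as np

def fast_ihs_fusion(ms, pan):
    """Classic fast-IHS (additive) fusion baseline.

    ms:  (H, W, 3) float array, multispectral bands (R, G, B)
    pan: (H, W)    float array, panchromatic image
    """
    intensity = ms.mean(axis=2)      # I = (R + G + B) / 3
    detail = pan - intensity         # spatial detail carried by PAN
    # Injecting the same detail into every band preserves spatial
    # resolution but causes the spectral distortion that multi-scale
    # decomposition methods aim to reduce.
    return ms + detail[..., None]
```

When the PAN image equals the MS intensity, no detail is injected and the MS image passes through unchanged, which makes the baseline easy to sanity-check.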

In this paper, a new fusion approach for panchromatic (PAN) and multispectral (MS) images is proposed. To process the MS image, we first apply the IHS transform to separate the intensity component from the hue and saturation components. Then, the salient and complementary information of the PAN and MS images is extracted with a multi-scale bilateral filtering strategy, so that high-frequency information in the PAN image, such as edges and texture, is extracted as completely as possible, while most of the spectral information in the intensity component of the MS image is retained in the final approximation image. The fused intensity image is generated by reversing the multi-scale bilateral filtering, with the final approximation image of the PAN image replaced by that of the MS image. Finally, our fusion result is obtained by performing the inverse IHS transform on the new fused intensity image together with the original hue and saturation of the MS image. With this fusion scheme, high-frequency information from the PAN image is preserved as much as possible and the spectral distortion problem is reduced. In our experiments, the SAM, ERGAS, and PSNR measures are adopted to fairly evaluate the performance of each method. On three typical image sets, our fusion method outperforms the others in both quantitative and visual analysis.
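The multi-scale bilateral filtering strategy described above can be sketched as repeated bilateral filtering: each level peels off a detail (high-frequency) layer, the final approximation retains the low-frequency spectral content, and summing all layers reconstructs the input exactly. The following is a slow, illustrative pure-numpy sketch; the filter parameters and level-scaling rule are assumptions for demonstration, not the thesis's settings.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter on a 2-D float image (edge-preserving
    smoothing: weights combine spatial closeness and intensity similarity)."""
    H, W = img.shape
    pad = np.pad(img, radius, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-((patch - img[i, j]) ** 2) / (2.0 * sigma_r**2))
            w = spatial * rng_w
            out[i, j] = (w * patch).sum() / w.sum()
    return out

def multiscale_decompose(img, levels=3):
    """Split img into `levels` detail layers plus a final approximation by
    repeated bilateral filtering, widening the spatial scale at each level
    (doubling sigma_s here is an assumed schedule)."""
    details, approx = [], img
    for k in range(levels):
        smoothed = bilateral_filter(approx, sigma_s=2.0 * 2**k)
        details.append(approx - smoothed)   # high-frequency layer at scale k
        approx = smoothed                   # low-frequency residual
    return details, approx
```

In the fusion scheme described above, the PAN image's final approximation would be replaced by that of the MS intensity component before the layers are summed back, so that PAN contributes the detail layers and MS contributes the spectral base.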
Appears in Collections: Department of Computer Science and Engineering
