Citation: Zhang Yaling, Ji Linna, Yang Fengbao, et al. Characterization of dual-mode infrared images fusion based on cosine similarity[J]. Opto-Electronic Engineering, 2019, 46(10): 190059. doi: 10.12086/oee.2019.190059
Overview: Existing fusion of infrared intensity and infrared polarization images lacks a method for identifying the optimal measure of fusion validity, so the true fusion performance in different imaging scenes cannot be reflected accurately. To address this problem, this paper first constructs a class set of difference features and a class set of fusion algorithms based on the image features and fusion characteristics of the dual-mode images. The difference features and the meaning of fusion validity are then defined, and fusion validity evaluation functions are constructed from distance measures, using three common forms: Euclidean distance, cosine similarity, and the Lance and Williams distance. The range between the maximum and minimum difference-feature amplitudes in the source images is divided into equal-width intervals (20 groups here), the fusion validity within each interval is measured, and the approximate fusion validity of each amplitude interval is obtained, yielding approximate fusion-validity distribution curves over the 20 amplitude intervals of the source images, and hence the fusion-validity distribution curves of the different fusion algorithms for the different difference features. Following the idea that difference features drive the selection of the optimal fusion algorithm, the three measures are applied to the 12 fusion algorithms in the algorithm set over the difference-feature classes of the dual-mode images, producing scatter distributions of fusion validity, after which the difference-feature amplitudes are classified into intervals.
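The three distance measures named above can be sketched as follows. This is a minimal NumPy illustration, not the authors' code; the Lance and Williams distance is given in its common Bray-Curtis style form, whose normalization may differ from the paper's exact definition.

```python
import numpy as np

def euclidean_distance(x, y):
    # Straight-line distance between two difference-feature vectors
    return np.sqrt(np.sum((x - y) ** 2))

def cosine_similarity(x, y):
    # Cosine of the angle between the two vectors (1 = same direction)
    return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

def lance_williams_distance(x, y):
    # Common Bray-Curtis style form; assumes non-negative feature values
    return np.sum(np.abs(x - y)) / np.sum(x + y)
```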
The discrete fusion-validity points within each amplitude interval are averaged, which yields the fusion-validity distribution curve of each fusion algorithm over the amplitudes of each difference feature. Within each amplitude interval, the algorithm with the maximum fusion-validity value is then selected, giving both the optimal fusion algorithm for each difference-feature amplitude interval and the overall fusion validity of that interval as represented by the optimal algorithm. The frequency with which each algorithm is optimal across the difference-feature amplitude intervals of ten groups of source images is counted, from which the optimal fusion algorithm for each difference feature is obtained. The experimental results show that, among the three measures, cosine similarity is the most stable in measuring the fusion validity of the various fusion algorithms and agrees well with human visual analysis.
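The interval-averaging and per-interval selection steps described above can be sketched as follows. This is a hypothetical NumPy illustration under the stated 20-interval scheme; the function and variable names are assumptions, not the authors' implementation.

```python
import numpy as np

def fusion_validity_curve(amplitudes, validities, n_bins=20):
    # Divide the amplitude range into n_bins equal-width intervals and
    # average the scattered fusion-validity points falling in each one,
    # giving the approximate fusion-validity distribution curve.
    edges = np.linspace(amplitudes.min(), amplitudes.max(), n_bins + 1)
    idx = np.clip(np.digitize(amplitudes, edges) - 1, 0, n_bins - 1)
    curve = np.full(n_bins, np.nan)
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            curve[b] = validities[mask].mean()
    return edges, curve

def optimal_algorithm_per_interval(curves):
    # curves: dict mapping algorithm name -> per-interval validity curve.
    # Return the algorithm with the maximum validity in each interval.
    names = list(curves)
    stacked = np.vstack([curves[n] for n in names])
    best = np.nanargmax(stacked, axis=0)
    return [names[i] for i in best]
```

Counting how often each algorithm wins an interval across the ten groups of source images then reduces to a frequency tally over the lists returned by `optimal_algorithm_per_interval`.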
Source infrared intensity and infrared polarization images. (a) Infrared intensity image; (b) Infrared polarization image
Scatter distribution of fusion validity. (a) Based on Euclidean distance; (b) Based on cosine similarity; (c) Based on Lance and Williams distance
Curve of fusion validity. (a) Based on Euclidean distance; (b) Based on cosine similarity; (c) Based on Lance and Williams distance
The distribution of the maximum value of the fusion validity in the amplitude interval under multiple algorithms. (a), (b) Based on Euclidean distance; (c), (d) Based on cosine similarity; (e), (f) Based on Lance and Williams distance
The frequency of occurrence of the optimal fusion algorithm in the amplitude interval of different difference features. (a), (b) Based on Euclidean distance; (c), (d) Based on cosine similarity; (e), (f) Based on Lance and Williams distance
Source dual-mode infrared images. (a) Infrared intensity image; (b) Infrared polarization image
Fusion results. (a) PCA; (b) DWT; (c) NSCT; (d) NSST; (e) DTCWT; (f) TH; (g) LP; (h) WPT; (i) GFF; (j) CVT; (k) MSVD; (l) QWT