Selecting an appropriate evaluation index for image fusion remains difficult. To address this problem, a synthesis evaluation index is proposed based on the correlation between subjective and objective evaluations. First, a variety of fusion results are evaluated subjectively in terms of edge clarity, naturalness, information content, and overall quality. Second, 14 commonly used objective indexes are applied to the same fusion results. Then, the subjective and objective scores are normalized, and the Spearman correlation coefficient is used to analyze the correlation between each of the four subjective evaluations and each objective index. Finally, according to these correlations, a comprehensive index is constructed from the 14 objective indexes across the 4 aspects. Experimental results show that the synthesis index correlates more strongly with subjective evaluation than any individual evaluation index or other comprehensive indexes.
Multiband fusion image evaluation method based on correlation between subjective and objective evaluation
First published: Sep 15, 2017
Opto-Electronic Engineering, Vol. 44, Issue 09, pp. 895-902 (2017). DOI: 10.3969/j.issn.1003-501X.2017.09.006
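The pipeline the abstract describes (normalize subjective and objective scores, measure their Spearman correlation, then weight the objective indexes by that correlation to form a synthesis index) can be sketched in pure Python. All function names and the weighting scheme (absolute correlations normalized to sum to 1) are illustrative assumptions, not the paper's exact construction:

```python
def rank(values):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, converted to 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den if den else 0.0

def normalize(scores):
    """Min-max normalize a score list to [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) if hi > lo else 0.5 for s in scores]

def synthesis_index(objective, subjective):
    """Correlation-weighted combination of objective indexes.

    objective: dict mapping index name -> per-fusion-result scores
    subjective: per-fusion-result subjective scores (one aspect)
    Each index is weighted by |rho| with the subjective scores,
    normalized so the weights sum to 1 (an illustrative choice).
    """
    subj = normalize(subjective)
    rhos = {k: abs(spearman(normalize(v), subj)) for k, v in objective.items()}
    total = sum(rhos.values()) or 1.0
    n = len(subjective)
    return [
        sum(rhos[k] / total * normalize(objective[k])[i] for k in objective)
        for i in range(n)
    ]
```

In the paper's setting there would be 14 entries in `objective` and four subjective score lists (edge clarity, naturalness, information, overall), yielding a synthesis index built across all four aspects; the sketch above shows the mechanics for a single aspect.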
Get Citation: Han Ze, Lin Suzhen. Multiband fusion image evaluation method based on correlation between subjective and objective evaluation[J]. Opto-Electronic Engineering, 2017, 44(9): 895-902.