DOI: 10.17586/1023-5086-2019-86-07-19-26
УДК: 535.8
Infrared and visible face fusion recognition based on extended sparse representation classification and local binary patterns for the single sample problem
Z. Xie, S. Zhang, X. Yu, G. Liu Infrared and visible face fusion recognition based on extend sparse representation classification and local binary patterns for single sample problem (Распознавание лиц по совместным изображениям инфракрасного и видимого диапазонов на основе расширенных разреженных представлений и локальных бинарных паттернов) [in English] // Opticheskii Zhurnal. 2019. V. 86. № 7. P. 19–26. http://doi.org/10.17586/1023-5086-2019-86-07-19-26
Z. Xie, S. Zhang, X. Yu, and G. Liu, "Infrared and visible face fusion recognition based on extended sparse representation classification and local binary patterns for the single sample problem," Journal of Optical Technology. 86(7), 408-413 (2019). https://doi.org/10.1364/JOT.86.000408
While near-infrared and visible fusion recognition has been actively researched in recent years, most theoretical results and algorithms concentrate on the setting with sufficient training samples. This paper focuses on a general fusion method for the case of insufficient training samples, with only one pair of near-infrared and visible face images per subject. Compared with existing methods, the proposed method requires neither abundant samples nor a training step. To obtain a robust and time-efficient fusion model for unconstrained face recognition in the single-sample situation, two models are proposed that fuse local-binary-pattern-based descriptors with sparse-representation-based classification: the first model fuses the representation errors directly, while the second is an accelerated version that learns a cross-spectral dictionary. Experiments performed on the HITSZ LAB2 database show that the proposed fusion models extract complementary features of the near-infrared and visible-light images, and the resulting fusion face recognition method outperforms state-of-the-art fusion methods.
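The following is a minimal illustrative sketch of the error-level fusion idea described in the abstract, not the authors' exact pipeline: block-wise LBP histograms are extracted from the visible and near-infrared face images, each modality is sparse-coded against its own single-sample gallery dictionary, and the per-class reconstruction errors of the two modalities are fused before the final decision. The libraries (NumPy, scikit-image, scikit-learn), the grid size, the fusion weight `alpha`, and the sparsity penalty `lam` are assumed choices for illustration only.

```python
# Illustrative error-level fusion of LBP + SRC for paired VIS/NIR face images.
# Parameter values and helper names are assumptions, not the paper's settings.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.linear_model import Lasso

def lbp_hist(img, P=8, R=1, grid=(4, 4)):
    """Concatenate uniform-LBP histograms computed over a grid of image blocks."""
    lbp = local_binary_pattern(img, P, R, method="uniform")
    n_bins = P + 2                      # uniform LBP yields values in [0, P+1]
    h, w = lbp.shape
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = lbp[i * h // grid[0]:(i + 1) * h // grid[0],
                        j * w // grid[1]:(j + 1) * w // grid[1]]
            hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins))
            feats.append(hist / (hist.sum() + 1e-8))
    return np.concatenate(feats)

def class_residuals(D, labels, y, lam=0.01):
    """Sparse-code y over dictionary D (columns = gallery features) and return
    the per-class reconstruction error, as in sparse-representation classification."""
    coder = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
    coder.fit(D, y)                     # solves min ||y - D x||^2 + lam ||x||_1
    x = coder.coef_
    return {c: np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
            for c in np.unique(labels)}

def fused_identity(res_vis, res_nir, alpha=0.5):
    """Error-level fusion: weighted sum of the two modalities' residuals,
    then pick the class with the smallest fused error."""
    fused = {c: alpha * res_vis[c] + (1 - alpha) * res_nir[c] for c in res_vis}
    return min(fused, key=fused.get)
```

The second, accelerated model mentioned in the abstract would replace the two per-modality dictionaries with a single learned cross-spectral dictionary; that learning step is not sketched here.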
Keywords: local binary patterns, sparse representation based classification, error-level fusion, cross-spectral joint dictionary representation, face fusion recognition
Acknowledgements: This paper is supported by the National Natural Science Foundation of China (No. 61861020), the Natural Science Foundation of Jiangxi Province of China (No. 20171BAB202006), the Science & Technology Project of the Education Bureau of Jiangxi Province (No. GJJ160767), and the Natural Science Project of Jiangxi Science and Technology University (No. 2013QNBJRC005).
OCIS codes: 100.0010
References:
1. Lu G., Wang Y. Feature extraction using a fast null space based linear discriminant analysis algorithm // Information Sci. 2012. V. 193. № 1. P. 72–80.
2. Tian G., Zhang C., Sun Q. FFT Consolidated sparse and collaborative representation for image classification // Arabian J. Sci. and Eng. 2018. V. 43. № 2. P. 741–758.
3. Bebis G., Pavlidis I. Infrared and visible image fusion for face recognition // Proc. SPIE. 2004. V. 5404. P. 585–596.
4. Desa S.M., Hati S. IR and visible face recognition using fusion of kernel based features // Proc. 19th Internat. Conf. Pattern Recognition. 2008. V. 1. P. 1–4.
5. Raghavendra R., Dorizzi B., Rao A., Kumar H. Particle swarm optimization based fusion of near infrared and visible images for improved face verification // Pattern Recognition. 2011. V. 44. № 2. P. 401–411.
6. Ma Z., Wen J., Liu Q., Tuo G. Near-infrared and visible light image fusion algorithm for face recognition // J. Modern Opt. 2015. V. 62. № 9. P. 745–753.
7. Guo K., Wu S., Xu Y. Face recognition using both visible light image and near-infrared image and a deep network // CAAI Trans. Intelligence Technol. 2017. V. 2. № 1. P. 39–47.
8. Wright J., Yang A.Y., Ganesh A. Robust face recognition via sparse representation // IEEE Trans. Pattern Analysis and Machine Intelligence. 2009. V. 31. № 2. P. 210–227.
9. He R., Zheng W., Hu B. Maximum correntropy criterion for robust face recognition // IEEE Trans. Pattern Analysis and Machine Intelligence. 2011. V. 33. № 8. P. 1561–1576.
10. Lai Z., Dai D., Ren C. Discriminative and compact coding for robust face recognition // IEEE Trans. Cybernetics. 2015. V. 45. № 9. P. 1900–1912.
11. Yang M., Zhang L., Yang J. Robust sparse coding for face recognition // Proc. 24th IEEE Conf. Computer Vision and Pattern Recognition. 2011. V. 1. P. 625–632.
12. Yang M., Zhang L., Yang J. Regularized robust coding for face recognition // IEEE Trans. Image Proc. 2013. V. 22. № 5. P. 1753–1766.
13. Li X., Dai D., Zhang X. Structured sparse error coding for face recognition with occlusion // IEEE Trans. Image Proc. 2013. V. 22. № 5. P. 1889–1900.
14. Deng W., Hu J., Guo J. Extended SRC: Under-sampled face recognition via intra-class variant dictionary // IEEE Trans. Pattern Analysis and Machine Intelligence. 2012. V. 34. № 9. P. 1864–1870.
15. Yu Y., Dai D., Ren C. Discriminative multi-scale sparse coding for single-sample face recognition with occlusion // Pattern Recognition. 2017. V. 66. № 2. P. 302–312.
16. Ahonen T., Hadid A., Pietikäinen M. Face recognition with local binary patterns // Proc. 8th European Conf. Computer Vision. 2004. V. 1. P. 476–481.
17. Ojala T., Pietikäinen M., Mäenpää T. Multiresolution gray-scale and rotation invariant texture classification with local binary patterns // IEEE Trans. Pattern Analysis and Machine Intelligence. 2002. V. 24. № 6. P. 971–987.
18. Xu Y., Zhong A., Yang J., Zhang D. Bimodal biometrics based on a representation and recognition approach // Opt. Eng. 2011. V. 50. № 3. P. 037202-1–037202-7.