Citation: Xiang Z L, Zhang Q C, Wu Z J. 3D shape measurement and texture mapping method based on structured light projection[J]. Opto-Electron Eng, 2022, 49(12): 220169. doi: 10.12086/oee.2022.220169
In traditional optical 3D measurement, the ultimate goal is to obtain the 3D shape of the measured object, but 3D data that ignores texture information often leaves the measured scene lacking realism. To make the acquired point cloud more realistic and immersive, texture mapping is introduced to attach color information to the reconstructed 3D point cloud. Texture mapping itself faces two technical problems. The first is how to complete color texture mapping without a color camera. The second is, when a color camera is used for texture recording, how to move the texture camera freely and capture texture images from different perspectives without frequent recalibration, so that an accurate mapping from the 3D point cloud to the texture images can still be established.
This paper addresses these two problems. For the case without an additional texture camera, it proposes a fixed-view grayscale texture mapping method and a color texture mapping method; for the case with a color texture camera, it proposes a free texture mapping method and an unconstrained free texture mapping method. The specific contents are as follows:
1) When no color camera is available for texture capture, a color projector projects sinusoidal fringes of three different frequencies through the R, G, and B channels, respectively. The deformed fringe images are collected, and the periodic intensity distribution of each frequency's fringes is eliminated by averaging its phase-shifting patterns. The corresponding color texture mapping is then completed by combining the textures of the three channels and applying color correction.
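The fringe-cancellation step above relies on a standard property of N-step phase shifting: the sinusoidal terms of equally spaced phase shifts sum to zero, so averaging the N deformed patterns leaves only the background intensity, i.e. the grayscale texture of that channel. A minimal numpy sketch on synthetic data (the function name and all image parameters are illustrative, not from the paper):

```python
import numpy as np

def texture_from_phase_shifts(frames):
    """Average N equally spaced phase-shifted fringe images.

    For an N-step phase shift, sum_n cos(phi + 2*pi*n/N) = 0,
    so the mean of the frames cancels the sinusoidal fringe and
    leaves only the background intensity A(x, y) -- the texture.
    """
    return np.mean(np.asarray(frames, dtype=np.float64), axis=0)

# --- Synthetic demonstration for one color channel ---
N = 4                                           # number of phase steps
h, w = 64, 64
A = np.random.uniform(80, 200, (h, w))          # background = texture
B = 50.0                                        # fringe modulation depth
phi = np.random.uniform(0, 2 * np.pi, (h, w))   # deformed fringe phase

frames = [A + B * np.cos(phi + 2 * np.pi * n / N) for n in range(N)]
texture = texture_from_phase_shifts(frames)

print(np.max(np.abs(texture - A)))  # residual fringe is negligible
```

Repeating this per R, G, and B channel and stacking the three recovered textures gives the uncorrected color texture; color correction then compensates the channel crosstalk.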
2) When an additional texture camera is used, marker points are placed in the measured field, and the mapping relationship between the 3D point cloud and the 2D texture image is obtained from the camera imaging model. To further remove the constraint of adding markers, an unconstrained free texture mapping method is proposed for objects with rich natural textures. The idea is to perform feature matching between the object images captured by the left and right cameras; from the correspondence between the matched feature points and the 3D point cloud, the PnP problem is solved to obtain the pose relationship, which establishes the mapping between the 3D point cloud and the 2D texture image and finally realizes texture mapping. Experiments verify the feasibility of both methods. The results of this paper provide a simple and practical means of color 3D information acquisition for fields such as cultural relics digitization and reverse engineering.
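The core of the unconstrained method is recovering the texture camera's projection from 3D-2D correspondences and then reprojecting the point cloud into the texture image to sample colors. As a sketch of that pipeline, the following uses a plain direct linear transform (DLT) on synthetic correspondences as a linear stand-in for the PnP solver discussed in the paper; the camera intrinsics, pose, and function names are all hypothetical:

```python
import numpy as np

def dlt_projection(pts3d, pts2d):
    """Estimate the 3x4 projection matrix P from >= 6 exact 3D-2D
    pairs by the direct linear transform: each pair contributes two
    linear equations in the 12 entries of P, solved via SVD."""
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=np.float64))
    return Vt[-1].reshape(3, 4)       # right singular vector, up to scale

def map_to_texture(P, pts3d):
    """Project the 3D point cloud into the 2D texture image plane."""
    Xh = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]       # perspective division

# --- Synthetic check with a hypothetical camera pose ---
rng = np.random.default_rng(0)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R, t = np.eye(3), np.array([[0.1], [-0.2], [2.0]])
P_true = K @ np.hstack([R, t])

cloud = rng.uniform(-1, 1, (20, 3)) + [0, 0, 5]   # points in front of camera
uv = map_to_texture(P_true, cloud)                # "matched feature" pixels

P_est = dlt_projection(cloud, uv)                 # recover the projection
uv_est = map_to_texture(P_est, cloud)
print(np.max(np.abs(uv_est - uv)))                # small reprojection residual
```

In practice one would use a calibrated, robust PnP solver on real feature matches; the sketch only illustrates how solving for the pose/projection lets every reconstructed 3D point be assigned a pixel in the freely captured texture image.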
Flow chart of acquiring color texture information from a fixed viewing angle
Schematic diagram of the free texture mapping
Schematic diagram of the unconstrained free texture mapping method
Experimental setup of (a) fixed-view texture mapping and (b) free-view texture mapping
Sheep face mask to be measured
Sheep face mask texture. (a) Low frequency (R channel) grayscale texture; (b) Intermediate frequency (G channel) grayscale texture; (c) High frequency (B channel) grayscale texture; (d) Uncorrected color texture; (e) Corrected color texture
3D reconstruction and texture mapping results of sheep face mask. (a) 3D point cloud of sheep face; (b) Corrected color texture mapping result
Reconstructed 3D point clouds of a ceramic cat face from different perspectives
2D textures of a ceramic cat face obtained from different positions by the texture camera and their mapping results. (a)~(b) Texture images freely captured by the texture camera; (c)~(f) Texture mapping results in different perspective views
Objects to be measured. (a) Human face mask; (b) Fox mask
Texture images of the human face mask captured by the texture camera at three different angles
3D reconstruction and texture mapping results of human face mask. (a) Reconstructed point cloud results; (b)~(d) Mapping results of texture 1~3; (e)~(h) Left view corresponding to (a)~(d) results; (i)~(l) Right view corresponding to (a)~(d) results
Texture images of the fox mask captured by the texture camera at three different angles
3D reconstruction and texture mapping results of the fox mask. (a) Reconstructed point cloud results; (b)~(d) Mapping result of texture 1~3; (e)~(h) Left view corresponding to (a)~(d) results; (i)~(l) Right view corresponding to (a)~(d) results