Space Coordinate Calculation and Attitude Estimation Based on Binocular Stereo Vision
Abstract: Based on a method for target space coordinate calculation and attitude estimation with binocular stereo vision, together with binocular stereo vision system calibration and rectification techniques, a correspondence is constructed between the spatial coordinates of a target in the 3D scene and the pixel coordinates of points in the image. Target images are acquired with the binocular stereo vision system, and the three-dimensional coordinates of the target center point are computed with the semi-global block matching (SGBM) method. The target contour is then obtained by background subtraction, from which the rotation angle of the target in the horizontal plane is derived; combined with the marked coordinate position of the focus point, this yields the target attitude information. The proposed method is applied to estimate and verify the attitude of power meters. The results show that the designed method can complete the space coordinate calculation and attitude estimation of a power meter, enables accurate grasping of the meter, and provides a reference for the application of robots in the field of artificial intelligence for power systems.
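To make the processing chain concrete, the following Python/OpenCV sketch outlines the steps summarized above: an SGBM disparity map is computed from a rectified stereo pair, reprojected to 3D coordinates, the target is segmented by background subtraction, and the in-plane rotation angle is taken from the minimum-area bounding rectangle of the target contour. The function name target_pose_from_stereo and all numeric settings (numDisparities, blockSize, the binarization threshold, and so on) are illustrative assumptions rather than the parameters used in this work; the reprojection matrix Q is assumed to come from the calibration and rectification stage.

# Minimal sketch of the pipeline described in the abstract (assumes OpenCV >= 4).
# All parameter values are illustrative, not the settings used in the paper.
import cv2
import numpy as np

def target_pose_from_stereo(left_gray, right_gray, background_gray, Q):
    """Estimate the target center's 3D coordinates and its in-plane rotation.

    left_gray/right_gray : rectified grayscale stereo pair (uint8)
    background_gray      : background image of the same (left) view
    Q                    : 4x4 reprojection matrix from stereo rectification
    """
    # 1. Semi-global block matching (SGBM) disparity map.
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,      # must be a multiple of 16
        blockSize=5,
        P1=8 * 5 * 5,            # 8 * channels * blockSize^2
        P2=32 * 5 * 5,           # 32 * channels * blockSize^2
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    disparity = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # 2. Reproject disparities to 3D points in the left-camera frame
    #    (millimeters if the calibration was done in millimeters).
    points_3d = cv2.reprojectImageTo3D(disparity, Q)

    # 3. Background subtraction on the left view to segment the target
    #    and extract its outer contour.
    diff = cv2.absdiff(left_gray, background_gray)
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    target = max(contours, key=cv2.contourArea)

    # 4. Target center pixel -> 3D coordinate; the minimum-area rectangle
    #    gives the rotation angle of the target in the image plane.
    (cx, cy), (_, _), angle = cv2.minAreaRect(target)
    center_3d = points_3d[int(round(cy)), int(round(cx))]
    return center_3d, angle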
Table 1. Matching results and errors (unit: mm)

No.  3D coordinate             Matched coordinate        Error
1    [27.16, 2.57, 511.72]     [28.23, 2.89, 510.88]     [-1.07, -0.32, 0.84]
2    [4.15, -1.32, 509.78]     [4.21, 0.11, 508.16]      [-0.06, -1.43, -1.24]
3    [-43.14, -14.93, 510.22]  [-42.07, -10.28, 508.76]  [-1.07, -4.65, -1.46]
4    [-38.76, -12.47, 509.84]  [-36.47, -7.19, 509.13]   [-2.29, -5.28, 0.71]
5    [-40.76, -17.23, 509.87]  [-38.45, -16.01, 509.52]  [-2.31, -1.22, -0.35]
6    [96.21, -16.32, 509.75]   [92.17, -14.91, 509.58]   [4.04, -1.41, 0.17]
7    [35.87, 8.49, 510.72]     [34.68, 9.51, 509.89]     [1.19, -1.02, -0.83]
8    [-47.09, -18.93, 509.94]  [-44.25, -16.80, 509.32]  [-2.84, -2.13, -0.62]
9    [-37.91, -13.67, 511.84]  [-36.34, -10.13, 511.68]  [-1.43, -2.57, -1.72]
10   [31.45, -0.27, 509.67]    [28.69, 2.15, 509.92]     [2.76, 1.88, -0.25]

Table 2. Comparison of the proposed method and the SIFT feature matching method

Matching method                Mean absolute coordinate error/mm   Mean computation time/s
Proposed method                6.156                               8.101
SIFT feature matching method   8.412                               8.056
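For context on the SIFT baseline in Table 2, the sketch below shows one generic way to obtain stereo correspondences with SIFT features and triangulate them to 3D points. The function name sift_stereo_points, the Lowe ratio threshold, and the use of brute-force matching are illustrative assumptions, not the exact implementation compared against here.

# Generic SIFT-based stereo correspondence and triangulation (assumes OpenCV >= 4.4).
import cv2
import numpy as np

def sift_stereo_points(left_gray, right_gray, P_left, P_right, ratio=0.75):
    """Match SIFT keypoints across a stereo pair and triangulate them to 3D.

    P_left/P_right : 3x4 projection matrices from the stereo calibration
    ratio          : Lowe ratio-test threshold (illustrative value)
    """
    sift = cv2.SIFT_create()
    kp_l, des_l = sift.detectAndCompute(left_gray, None)
    kp_r, des_r = sift.detectAndCompute(right_gray, None)

    # Brute-force matching with the ratio test to reject ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_l, des_r, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]

    pts_l = np.float32([kp_l[m.queryIdx].pt for m in good]).T  # 2xN
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in good]).T  # 2xN

    # Triangulate homogeneous 4D points and convert to Euclidean coordinates.
    pts_4d = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
    pts_3d = (pts_4d[:3] / pts_4d[3]).T  # Nx3, in the calibration units (e.g. mm)
    return pts_3d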