DOI: 10.17586/1023-5086-2024-91-07-71-79
UDC: 004.94
Computer simulation of the influence of optical system parameters on the error in determining the orientation and position of a fiducial marker
Shmatko E.V., Sivov N.Yu., Eremin D.V., Poroykov A.Yu. Computer simulation of the influence of optical system parameters on the error in determining the orientation and position of a fiducial marker [in Russian] // Opticheskii Zhurnal. 2024. V. 91. № 7. P. 71–79. http://doi.org/10.17586/1023-5086-2024-91-07-71-79
Subject of study. The influence of optical system parameters on the error in determining the orientation and position of fiducial markers. Aim of study. To obtain the dependences of the absolute position and orientation errors on various factors. Method. An approach to assessing the error of a fiducial-marker-based machine vision system by means of computer image simulation in the Unity 3D graphics engine. Main results. During the simulation, more than 100,000 images of AprilTag markers in different positions and orientations were synthesized and processed. Processing of the simulation results yielded the dependences of the absolute position and orientation errors on the distance between the camera and the marker, on the marker rotation angle, and on the camera focal length. Practical significance. The results will be used to optimize the placement of markers on the platform, to choose the positions of the video cameras and the focal lengths of their lenses, and to modify the image processing algorithm in order to improve the measurement accuracy of a system for developing microsatellite orientation algorithms.
Keywords: computer simulation, fiducial marker, orientation and position of the object in space
OCIS codes: 120.0120, 100.2000, 100.4145
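The processing step behind the Method and Main results above can be illustrated with a minimal, hypothetical sketch: the four detected corners of a square fiducial marker (e.g., an AprilTag) are passed to a planar PnP solver to recover the camera-to-marker pose, which is then compared with the ground-truth pose known from the synthetic scene to obtain the absolute position and orientation errors. The sketch below is not the authors' implementation; OpenCV's solvePnP stands in for the actual detection and pose-estimation pipeline, and the marker size, camera intrinsics, and ground-truth pose are assumed placeholder values.

```python
import numpy as np
import cv2

# Hypothetical camera and marker parameters (placeholders, not from the paper).
MARKER_SIZE = 0.10                      # marker edge length, m
FX = FY = 1200.0                        # focal length, pixels
CX, CY = 960.0, 540.0                   # principal point for a 1920x1080 sensor
K = np.array([[FX, 0.0, CX],
              [0.0, FY, CY],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)                      # synthetic images: no lens distortion

# Marker corners in the marker frame (centre at the origin, Z = 0),
# in the corner order required by cv2.SOLVEPNP_IPPE_SQUARE.
h = MARKER_SIZE / 2.0
OBJ_PTS = np.array([[-h,  h, 0.0],
                    [ h,  h, 0.0],
                    [ h, -h, 0.0],
                    [-h, -h, 0.0]])


def estimate_pose(corners_px):
    """Recover the marker pose (3x3 rotation R, translation t in metres)
    from the four detected corner pixel coordinates (shape (4, 2))."""
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS,
                                  np.asarray(corners_px, dtype=np.float64),
                                  K, DIST, flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> rotation matrix
    return R, tvec.reshape(3)


def pose_errors(R_est, t_est, R_true, t_true):
    """Absolute position error (Euclidean norm, m) and absolute orientation
    error (angle of the residual rotation R_true^T * R_est, degrees)."""
    dt = float(np.linalg.norm(t_est - t_true))
    cos_a = np.clip((np.trace(R_true.T @ R_est) - 1.0) / 2.0, -1.0, 1.0)
    return dt, float(np.degrees(np.arccos(cos_a)))


if __name__ == "__main__":
    # Self-check on an ideal synthetic view: a marker 2 m from the camera,
    # rotated 20 degrees about the vertical axis, projected without noise.
    rvec_true = np.array([0.0, np.radians(20.0), 0.0])
    t_true = np.array([0.05, -0.02, 2.0])
    R_true, _ = cv2.Rodrigues(rvec_true)
    img_pts, _ = cv2.projectPoints(OBJ_PTS, rvec_true, t_true, K, DIST)
    R_est, t_est = estimate_pose(img_pts.reshape(4, 2))
    dp, da = pose_errors(R_est, t_est, R_true, t_true)
    print(f"position error: {dp * 1e3:.3f} mm, orientation error: {da:.4f} deg")
```

Running such a routine on each synthesized frame and grouping the resulting errors by camera-to-marker distance, marker rotation angle, and lens focal length would yield dependences of the kind reported in Main results.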
References:
1. Kok M., Hol J.D., Schön Th.B. Using inertial sensors for position and orientation estimation // Foundations and Trends in Signal Proc. 2017. V. 11. № 1–2. 153 p. http://dx.doi.org/10.1561/2000000094
2. Wu R., Chen Y., Pan Y., et al. Determination of three-dimensional movement for rotary blades using digital image correlation // Opt. and Lasers in Eng. 2015. V. 65. P. 38–45. https://doi.org/10.1016/j.optlaseng.2014.04.020
3. Liu T., Burner A., Jones T., et al. Photogrammetric techniques for aerospace applications // Progress in Aerospace Sci. 2012. V. 54. P. 1–58. https://doi.org/10.1016/j.paerosci.2012.03.002
4. Jurado-Rodriguez D., Muñoz-Salinas R., Garrido-Jurado S., et al. 3D model-based tracking combining edges, keypoints and fiducial markers // Virtual Reality. 2023. V. 27. P. 3051–3065. https://doi.org/10.1007/s10055-023-00853-5
5. Kansal S., Mukherjee S. Vision-based kinematic analysis of the Delta robot for object catching // Robotica. 2022. V. 40. № 6. P. 2010–2030. https://doi.org/10.1017/S0263574721001491
6. Vela C., Fasano G., Opromolla R. Pose determination of passively cooperative spacecraft in close proximity using a monocular camera and AruCo markers // Acta Astronautica. 2022. V. 201. P. 22–38. https://doi.org/10.1016/j.actaastro.2022.08.024
7. Vörös V., Page A.S., Deprest J., et al. Motion and viewing analysis during minimally invasive surgery for autostereoscopic visualization // Int. J. Computer Assisted Radiology and Surgery. 2023. V. 18. № 3. P. 527–535. https://doi.org/10.1007/s11548-022-02753-6
8. Olson E. AprilTag: A robust and flexible visual fiducial system // 2011 IEEE Int. Conf. Robotics and Automation. 2011. P. 3400–3407. https://doi.org/10.1109/ICRA.2011.5979561
9. Olson E., Wang J. AprilTag 2: Efficient and robust fiducial detection // 2016 IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS). 2016. P. 4193–4198. https://doi.org/10.1109/IROS.2016.7759617
10. Garrido-Jurado S., Muñoz-Salinas R., Madrid-Cuevas F., et al. Automatic generation and detection of highly reliable fiducial markers under occlusion // Pattern Recognition. 2014. V. 47. № 6. P. 2280–2292. https://doi.org/10.1016/j.patcog.2014.01.005
11. Calvet L., Gurdjos P., Griwodz C., et al. Detection and accurate localization of circular fiducials under highly challenging conditions // 2016 IEEE Conf. Computer Vision and Pattern Recognition (CVPR). Las Vegas, NV, USA. 2016. P. 562–570. https://doi.org/10.1109/CVPR.2016.67
12. Fischler M.A., Bolles R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography // Commun. ACM. 1981. V. 24. № 6. P. 381–395. https://doi.org/10.1145/358669.358692
13. Shapiro L., Stockman G. Computer vision: textbook. 3rd (electronic) ed. / Transl. from English by Boguslavsky A.A.; ed. by Sokolov S.M. Moscow: BINOM. Laboratoriya Znanii, 2015. 763 p. [in Russian]
Shapiro L.G., Stockman G.C. Computer vision. Prentice Hall, 2001. 580 p.
14. Bolotskikh A.A., Ivanov D.S., Tkachev S.S. Laboratory study of algorithms for determining the angular motion of a nanosatellite on a test bench with an aerodynamic suspension [in Russian] // XLV Academic Readings on Cosmonautics dedicated to the memory of academician S.P. Korolev and other outstanding Russian scientists, pioneers of space exploration: Book of abstracts. V. 1. Moscow: Bauman Moscow State Technical University Publ., 2021. P. 387–388.
Bolotskikh A., Ivanov D., Tkachev S. Laboratory study of angular motion determination algorithms for nanosatellite on laboratory facility with aerodynamic suspension // AIP Conf. Proc. 2023. P. 060004. https://doi.org/10.1063/5.0107896
15. Shmatko E.V., Poroykov A.Yu. The estimation of inertial measurement units accuracy using digital image processing algorithms // 2022 Wave Electronics and its Application in Information and Telecommunication Systems (WECONF), IEEE. 2022. P. 1–4. https://doi.org/10.1109/WECONF55058.2022.9803505
16. Kalaitzakis M., Carroll S., Ambrosi A., et al. Experimental comparison of fiducial markers for pose estimation // Int. Conf. Unmanned Aircraft Systems (ICUAS). 2020. P. 781–789. https://doi.org/10.1109/ICUAS48674.2020.9213977
17. Ullah S., Javed M., Rabbi I., et al. Analysing the attributes of fiducial markers for robust tracking in augmented reality applications // Int. J. Computational Vision and Robotics. 2017. V. 7. № 1/2. P. 68–82. https://doi.org/10.1504/IJCVR.2017.10001817
18. Abbas S.M., Aslam S., Berns K., et al. Analysis and improvements in AprilTag based state estimation // Sensors. 2019. V. 19. № 24. P. 1–32. https://doi.org/10.3390/s19245480
19. Liu Y., Schofield H., Shan J. Intensity image-based LiDAR fiducial marker system // IEEE Robotics and Automation Lett. 2022. V. 7. № 3. P. 6542–6549. https://doi.org/10.1109/LRA.2022.3174971
20. Górski F., Wichniarek R., Kuczko W., et al. Influence of marker arrangement on positioning accuracy of objects in a virtual environment // Advances in Sci. and Technol. Research J. 2015. V. 9. № 28. P. 112–119. https://doi.org/10.12913/22998624/60797
21. Shmatko E.V., Sivov N.Yu., Poroykov A.Yu. Estimation of rotation measurement error of objects using computer simulation // 2023 5th Int. Youth Conf. Radio Electronics, Electrical and Power Engineering (REEPE), IEEE. 2023. P. 1–6. https://doi.org/10.1109/REEPE57272.2023.10086903
22. Poroykov A., Pechinskaya O., Shmatko E., et al. An error estimation system for close-range photogrammetric systems and algorithms // Sensors. 2023. V. 23. № 24. P. 9715. https://doi.org/10.3390/s23249715