Journal of Shanghai Jiaotong University (Science)
Passive Binocular Optical Motion Capture Technology Under Complex Illumination
Accepted date: 2022-01-14
Published online: 2025-03-21
Fu Yujia, Zhang Jian, Zhou Liping, Liu Yuanzhi, Qin Minghui, Zhao Hui, Tao Wei. Passive Binocular Optical Motion Capture Technology Under Complex Illumination [J]. Journal of Shanghai Jiaotong University (Science), 2025, 30(2): 352-362. DOI: 10.1007/s12204-023-2578-y
[1] QIN T, LI P L, SHEN S J. VINS-Mono: A robust and versatile monocular visual-inertial state estimator [J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[2] SUN S Z, LI G Y, FENG Q Q, et al. Indoor positioning based on visible light
communication and binocular vision [J]. Optics and Precision Engineering, 2020,
28(4): 834-843 (in Chinese).
[3] MENOLOTTO M, KOMARIS D S, TEDESCO S, et al. Motion capture technology in
industrial applications: A systematic review [J]. Sensors, 2020, 20(19): 5687.
[4] EICHELBERGER P, FERRARO M, MINDER U, et al. Analysis of accuracy in optical
motion capture - A protocol for laboratory setup evaluation [J]. Journal of
Biomechanics, 2016, 49(10): 2085-2088.
[5] OptiTrack. Prepare setup area [EB/OL]. [2021-12-16]. https://v30.wiki.optitrack.com/index.php?title=Prepare_Setup_Area.
[6] SHAFIQ M S, TÜMER S T, GÜLER H C. Marker detection and trajectory generation algorithms for a multicamera based gait analysis system [J]. Mechatronics, 2001, 11(4): 409-437.
[7] KURIHARA K, HOSHINO S, YAMANE K, et al. Optical motion capture system with
pan-tilt camera tracking and real time data processing [C]//2002 IEEE
International Conference on Robotics and Automation. Washington: IEEE, 2002:
1241-1248.
[8] LI B, MENG Q, HOLSTEIN H. Articulated motion reconstruction from feature
points [J]. Pattern Recognition, 2008, 41(1): 418-431.
[9] HUANG B S, CHEN F M, ZHANG J J. Optical motion capture system with improved algorithms [J]. Journal of Tongji University (Natural Science), 2005, 33(10): 1372-1376 (in Chinese).
[10] XIAO Z. Research and implementation of camera calibration and 3D
reconstruction in optical motion capture system [D]. Changsha: Hunan
University, 2018 (in Chinese).
[11] QIN Z B. Research on camera calibration technology based on
one-dimensional calibration object [D]. Guangzhou: South China University of
Technology, 2019 (in Chinese).
[12] VON GIOI R G, RANDALL G. A sub-pixel edge detector: An implementation of the Canny/Devernay algorithm [J]. Image Processing On Line, 2017, 7: 347-372.
[13] RAAJAN N R, RAMKUMAR M, MONISHA B, et al. Disparity estimation from stereo
images [J]. Procedia Engineering, 2012, 38: 462-472.
[14] GUO J, ZHU C. Dynamic displacement measurement of large-scale structures
based on the Lucas-Kanade template tracking algorithm [J]. Mechanical Systems
and Signal Processing, 2016, 66: 425-436.
[15] DONG X L, YUAN J, HUANG S Z, et al. RGB-D visual odometry based on
features of planes and line segments in indoor environments [J]. Robot, 2018,
40(6): 921-932 (in Chinese).
[16] GUO Q D, QUAN Y M, YU G P, et al. Improved binocular calibration based on
ICP algorithm [J]. Acta Optica Sinica, 2016, 36(12): 191-198 (in Chinese).
[17] LEVENBERG K. A method for the solution of certain non-linear problems in
least squares [J]. Quarterly of Applied Mathematics, 1944, 2(2): 164-168.
[18] STURM J, ENGELHARD N, ENDRES F, et al. A benchmark for the evaluation of
RGB-D SLAM systems [C]//2012 IEEE/RSJ International Conference on Intelligent
Robots and Systems. Vilamoura-Algarve: IEEE, 2012: 573-580.
[19] LI J Y, YANG B B, CHEN D P, et al. Survey and evaluation of monocular
visual-inertial SLAM algorithms for augmented reality [J]. Virtual Reality
& Intelligent Hardware, 2019, 1(4): 386-410.
[20] LIU Y, FU Y, CHEN F, et al. Simultaneous localization and mapping related datasets: A comprehensive survey [DB/OL]. (2021-10-16). https://arxiv.org/abs/2102.04036.
[21] GEIGER A, LENZ P, URTASUN R. Are we ready for autonomous driving? The
KITTI vision benchmark suite [C]//2012 IEEE Conference on Computer Vision and
Pattern Recognition. Providence: IEEE, 2012: 3354-3361.
[22] KÜMMERLE R, GRISETTI G, STRASDAT H, et al. g2o: A general framework for graph optimization [C]//2011 IEEE International Conference on Robotics and Automation. Shanghai: IEEE, 2011: 3607-3613.
[23] UMEYAMA S. Least-squares estimation of transformation parameters between
two point patterns [J]. IEEE Transactions on Pattern Analysis and Machine
Intelligence, 1991, 13(4): 376-380.
[24] GRUPP M. evo: Python package for the evaluation of odometry and SLAM [EB/OL]. [2021-12-16]. https://github.com/MichaelGrupp/evo.