For safety reasons, robots and humans cooperate in the automated medicine-dispensing process to accomplish the task of drug sorting and distribution. In such a dynamic unstructured environment, namely a human-robot collaboration scenario, the safety of humans, robots, and equipment is paramount. In this work, a practical and effective robot motion planning method is proposed for dynamic unstructured environments.
To address the problems of blind zones caused by a single depth sensor and of dynamic obstacle avoidance, we first propose a method that establishes an offline mapping between multi-sensor depth images and the 3D grids of the robot workspace and fuses them online; this mapping is used to determine the occupancy states of the grids occluded by the robot and obstacles and to estimate, in real time, the minimum distance between the robot and the obstacles. Then, based on the reactive control method, attractive and repulsive forces are computed and transformed into robot joint velocities to avoid obstacles in real time. Finally, the robot's dynamic obstacle avoidance capability is evaluated on an experimental platform with a UR5 robot and two Kinect V2 RGB-D sensors, and the effectiveness of the proposed method is verified.
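The following is a minimal sketch (not the authors' implementation) of the grid-based perception step described above: the voxel centres of the workspace grid are projected into each depth camera offline, each sensor's live depth image is then compared with the stored expected depths to label voxels as free, occupied, or occluded, the per-sensor labels are fused, and the minimum robot-obstacle distance is computed. The voxel size, the fusion policy, and all function and parameter names are assumptions.

```python
import numpy as np

FREE, OCCUPIED, OCCLUDED = 0, 1, 2
VOXEL_SIZE = 0.05  # edge length of a workspace grid cell in metres (assumed value)

def offline_map_voxels(voxel_centers, K, T_cam_world):
    """Offline step: project every grid centre into one depth camera.
    voxel_centers: (N, 3) centres in the world frame; K: (3, 3) intrinsics;
    T_cam_world: (4, 4) world-to-camera transform. Assumes all voxels lie in
    front of the camera. Returns integer pixel coordinates (N, 2) and the
    expected depth (N,) of each voxel."""
    homo = np.hstack([voxel_centers, np.ones((len(voxel_centers), 1))])
    cam = (T_cam_world @ homo.T).T[:, :3]      # voxel centres in the camera frame
    z = cam[:, 2]                              # expected depth of each voxel
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                # perspective division
    return np.round(uv).astype(int), z

def online_occupancy(depth_image, uv, z_expected, margin=0.5 * VOXEL_SIZE):
    """Online step: label each voxel FREE, OCCUPIED or OCCLUDED for one sensor."""
    h, w = depth_image.shape
    labels = np.full(len(uv), OCCLUDED, dtype=np.uint8)   # default: not observable
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    d_meas = np.zeros(len(uv))
    d_meas[inside] = depth_image[uv[inside, 1], uv[inside, 0]]
    labels[inside & (d_meas > z_expected + margin)] = FREE            # ray passes through
    labels[inside & (np.abs(d_meas - z_expected) <= margin)] = OCCUPIED
    # voxels whose projection is hidden behind a nearer surface stay OCCLUDED
    return labels

def fuse_sensors(labels_per_sensor):
    """Conservative fusion policy (an assumption, not necessarily the paper's rule):
    OCCUPIED if any sensor reports occupied; FREE if at least one sensor sees the
    voxel free and none sees it occupied; OCCLUDED otherwise."""
    stacked = np.stack(labels_per_sensor)
    occupied = (stacked == OCCUPIED).any(axis=0)
    free = (stacked == FREE).any(axis=0) & ~occupied
    return np.where(occupied, OCCUPIED, np.where(free, FREE, OCCLUDED)).astype(np.uint8)

def min_distance(robot_points, voxel_centers, fused_labels):
    """Minimum distance between robot control points and non-free (occupied or
    occluded) voxels; occluded voxels are treated as obstacles for safety."""
    obstacles = voxel_centers[fused_labels != FREE]
    if len(obstacles) == 0:
        return np.inf, None
    d = np.linalg.norm(robot_points[:, None, :] - obstacles[None, :, :], axis=2)
    return d.min(), obstacles[d.min(axis=0).argmin()]
```

Treating occluded voxels as obstacles in the distance query is a deliberately conservative choice: grid cells that no sensor can observe are assumed to be potentially occupied.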
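Below is a minimal sketch, under the same caveat, of the reactive obstacle-avoidance layer: an attractive force pulls the end effector towards its Cartesian goal, a Khatib-style repulsive force pushes the robot point closest to the obstacle away from it, and both vectors are mapped to joint velocities through damped least-squares pseudo-inverses of the corresponding Jacobians and then saturated. The gains k_att and k_rep, the influence radius rho_0, the damping factor, the speed limit, and the Jacobian interface are all assumptions; the Cartesian forces are used directly as velocity commands, a common simplification in reactive control.

```python
import numpy as np

def attractive_force(x_ee, x_goal, k_att=1.0):
    """Linear attractive force pulling the end effector towards its Cartesian goal."""
    return k_att * (x_goal - x_ee)

def repulsive_force(p_robot, p_obstacle, k_rep=0.5, rho_0=0.4):
    """Classical potential-field repulsion, active only inside the influence radius rho_0."""
    diff = p_robot - p_obstacle
    rho = np.linalg.norm(diff)
    if rho >= rho_0 or rho < 1e-6:
        return np.zeros(3)
    # gradient of the Khatib-style repulsive potential
    return k_rep * (1.0 / rho - 1.0 / rho_0) / rho**2 * (diff / rho)

def joint_velocity_command(J_ee, J_cp, f_att, f_rep, damping=1e-2, v_max=0.5):
    """Map the Cartesian vectors to joint velocities using damped least-squares
    pseudo-inverses of the end-effector Jacobian J_ee (3 x n) and of the Jacobian
    J_cp (3 x n) of the robot point closest to the obstacle."""
    def dls_pinv(J):
        return J.T @ np.linalg.inv(J @ J.T + damping * np.eye(J.shape[0]))
    dq = dls_pinv(J_ee) @ f_att + dls_pinv(J_cp) @ f_rep
    # saturate to keep the command within a safe joint-speed bound (assumed limit)
    norm = np.linalg.norm(dq)
    return dq if norm <= v_max else dq * (v_max / norm)
```

The damped least-squares inverse is used here instead of a plain pseudo-inverse so that the joint-velocity command stays bounded near kinematic singularities.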
WANG Zheng (王正), XU Hui (许辉), LV Na (吕娜), TAO Wei∗ (陶卫), CHEN Guodong (陈国栋), CHI Wenzheng (迟文正), SUN Lining (孙立宁). Dynamic Obstacle Avoidance for Application of Human-Robot Cooperative Dispensing Medicines [J]. Journal of Shanghai Jiaotong University (Science), 2022, 27(1): 24-35.
DOI: 10.1007/s12204-021-2366-5