Research on Visual Guidance and Control Methods for UR Robots Based on Gesture Understanding
Source: Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences; 2016 master's thesis. Document type: degree thesis.
Keywords: visual guidance and control; UR robot; gesture recognition; gesture tracking; Shi-Tomasi algorithm; KLT algorithm
【Abstract】: As robotics finds wide application in industrial manufacturing, military operations, and medicine, human-robot interaction is increasingly moving toward a human-centered paradigm. Traditional interaction methods that rely on hardware such as the mouse, keyboard, and operator panel can no longer meet users' needs, so control based on gesture understanding has emerged as a new direction for natural human-machine interaction. This thesis studies visual guidance and control methods for a UR robot based on gesture understanding. It describes and validates methods for gesture recognition, gesture tracking, and remote control of the UR robot, and builds a human-machine interaction system based on gesture recognition and tracking that visually guides the motion of the manipulator's end-effector.

For gesture recognition, a method combining skin-color segmentation with the Viola-Jones algorithm is proposed. A skin-color segmentation module first removes most non-skin regions from the background of the input image; a gesture detector trained offline with the Viola-Jones algorithm then completes recognition. The segmentation module builds a skin-color model in the YCbCr color space to separate skin regions from the background, after which morphological filtering suppresses noise. The Viola-Jones stage uses Haar features, the integral-image strategy, and a cascade structure to train detectors for three target gestures. Comparative experiments on the three gestures show that the proposed method recognizes gestures more reliably than a conventional detector and meets the requirements of the human-machine interaction system.

For gesture tracking, an improved Shi-Tomasi feature-point extraction algorithm is combined with a two-module tracker built from the KLT tracking algorithm and a Kalman filter. The improved Shi-Tomasi algorithm discards feature points that do not lie on the gesture target or that are sensitive to noise, and feeds the remaining reliable, stable points to the tracker, whose KLT module matches them to localize the gesture target. When feature points are lost, the tracker's Kalman-filter module predicts the gesture position and narrows the detector's search region, achieving efficient detection and continuous tracking and solving the tracking failures and discontinuous guidance signals caused by occlusion or overlap.

For remote control of the UR robot, the robot's motion-control mechanism is analyzed in depth and a remote motion-control method is designed. The method is verified on the MATLAB platform: the UR robot's end-effector tracks a sinusoidal trajectory, and the robot's state is monitored, confirming the method's correctness and effectiveness.

Finally, building on the above, gesture detection, gesture tracking, and remote robot control are integrated into a vision-guided robot control system: a human-machine interaction system that takes the user's gestures as input and controls the manipulator by recognizing and tracking them. This achieves real-time, friendly interaction between the operator and the UR manipulator platform, realizing visual guidance and control of the UR robot based on gesture understanding.
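The skin-color segmentation step described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the thesis's implementation; the Cb/Cr thresholds are a commonly used skin-color range and are assumptions, not values taken from the thesis.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an H x W x 3 uint8 RGB image to YCbCr (ITU-R BT.601 coefficients)."""
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Binary mask of likely skin pixels; the threshold ranges are illustrative."""
    ycbcr = rgb_to_ycbcr(rgb)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
```

In practice the raw mask is then cleaned with morphological opening/closing (the thesis's "morphological filtering" step), e.g. with OpenCV's `cv2.morphologyEx`, before being passed to the detector.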
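The integral-image strategy used by the Viola-Jones stage lets any rectangular pixel sum, and hence any Haar feature, be evaluated from four table lookups regardless of rectangle size. A minimal sketch (function names are mine, not the thesis's):

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a leading zero row/column for clean indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, top, left, h, w):
    """Sum of img[top:top+h, left:left+w] from four table lookups."""
    return (ii[top + h, left + w] - ii[top, left + w]
            - ii[top + h, left] + ii[top, left])

def haar_two_rect(ii, top, left, h, w):
    """Two-rectangle (left/right) Haar feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, top, left, h, half) - rect_sum(ii, top, left + half, h, half)
```

The cascade then applies many such features at increasing cost, rejecting most non-gesture windows in the first few stages.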
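When feature points are lost under occlusion, the tracker falls back on its Kalman-filter module to predict the gesture position and narrow the detector's search region. A minimal constant-velocity sketch in NumPy; the state model and noise magnitudes are illustrative assumptions, not the thesis's tuned values.

```python
import numpy as np

class GestureKalman:
    """Constant-velocity Kalman filter over the state [x, y, vx, vy]."""

    def __init__(self, dt=1.0, q=1e-2, r=1.0):
        self.F = np.array([[1, 0, dt, 0],      # state transition
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],       # we only measure position
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)                 # process noise
        self.R = r * np.eye(2)                 # measurement noise
        self.x = np.zeros(4)
        self.P = np.eye(4) * 500.0             # large initial uncertainty

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                      # predicted (x, y)

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

During normal tracking each frame calls `predict()` then `update()` with the KLT-localized position; during occlusion only `predict()` is called, and the predicted position centers the detector's reduced search window.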
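The thesis verifies its remote-control method on MATLAB; the sketch below, in Python for consistency with the other examples, shows the general idea of the sinusoidal-trajectory test: generate joint-space targets and stream URScript motion commands to the controller's script interface (TCP port 30002 on UR controllers). The base pose, joint index, amplitude, and rate are illustrative assumptions.

```python
import math

UR_SCRIPT_PORT = 30002  # UR controller's secondary (URScript) interface

def sine_joint_targets(t, base=(0.0, -1.57, 1.57, -1.57, -1.57, 0.0),
                       joint=0, amp=0.5, period=8.0):
    """Joint-space target at time t: a sine offset applied to one joint of a base pose."""
    q = list(base)
    q[joint] += amp * math.sin(2.0 * math.pi * t / period)
    return q

def movej_command(q, a=1.0, v=0.5):
    """Format a URScript movej command for a 6-joint target (radians)."""
    joints = ", ".join(f"{x:.4f}" for x in q)
    return f"movej([{joints}], a={a}, v={v})\n"

def stream_sine(host, duration=8.0, dt=0.5):
    """Stream a sinusoidal joint trajectory to the robot (needs a live controller)."""
    import socket
    with socket.create_connection((host, UR_SCRIPT_PORT)) as s:
        t = 0.0
        while t <= duration:
            s.sendall(movej_command(sine_joint_targets(t)).encode("ascii"))
            t += dt
```

State monitoring, as in the thesis's experiment, can be implemented alongside this by parsing the binary state packets the controller publishes on its real-time client interface.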
【Degree-granting institution】: Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences
【Degree level】: Master
【Year conferred】: 2016
【CLC number】: TP391.41; TP242