Research on Real-Time Localization and Map Building for a Guide Robot Based on ROS
Published: 2018-05-12 09:04
Topic: guide robot + simultaneous localization and mapping; Reference: Master's thesis, Jiangsu University of Science and Technology, 2016
【Abstract】: The number of visually impaired people in China has grown steadily in recent years, so it is both necessary and urgent to develop a tool that assists the visually impaired in walking: a guide robot. For a guide robot to operate autonomously, an indispensable prerequisite is that it can build a map of its environment while localizing itself within that environment at any time, i.e., simultaneous localization and mapping (SLAM); this is the focus of this thesis. Based on the laboratory's Turtlebot2, an experimental guide-robot prototype is developed that runs under the ROS operating system, is driven by a wireless gamepad, and senses the environment with a laser rangefinder. The research covers the construction of the experimental prototype, the modeling of the guide-robot system, the study of SLAM algorithms with MATLAB simulations, and experiments and analysis in real environments.

First, the prototype is built: the Kinect vision sensor on the laboratory Turtlebot2 is replaced with a laser rangefinder, which serves as the robot's external observation sensor, and a dual-mode wireless vibration force-feedback gamepad is connected to the robot to control its motion. Under ROS, communication nodes for the laser rangefinder and the gamepad are created so that all of the robot's nodes can communicate with one another.

Second, the prototype and the basic ideas of the SLAM problem are analyzed, and the guide robot's motion model, observation model, and map model are established; to simplify computation, the robot is assumed to move in straight-line segments.

Then, algorithms for real-time localization and map building are studied. Building on the theory of the Kalman filter and the particle filter, and on the system models established in the thesis, the EKF-SLAM algorithm (based on the extended Kalman filter) and the FastSLAM algorithm (based on the particle filter) are adopted. A simulation environment for the robot is created in MATLAB, EKF-SLAM and FastSLAM simulations are run separately, and the position errors in the X direction, the Y direction, and the heading angle are analyzed. The simulations and their comparison show that, with essentially the same parameters, FastSLAM accumulates less error than EKF-SLAM.

Finally, using the self-built experimental platform, two-dimensional maps of three real environments are constructed and the problems encountered are analyzed. The results show that the quality of the created map depends on the stability of the robot body, on wheel slippage during motion, and on the complexity of the environment.
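The system models described above (a straight-line motion model and laser range-bearing observations) can be sketched as follows. This is an illustrative Python sketch, not the thesis's MATLAB code; the function names and the unicycle parameterization are assumptions.

```python
import numpy as np

def motion_model(pose, v, omega, dt):
    """Propagate the robot pose (x, y, theta) one time step.

    A simple unicycle model; the thesis's straight-line motion
    assumption corresponds to omega = 0.
    """
    x, y, theta = pose
    return np.array([
        x + v * dt * np.cos(theta),
        y + v * dt * np.sin(theta),
        theta + omega * dt,
    ])

def observation_model(pose, landmark):
    """Range and bearing to a landmark from the current pose,
    as a 2D laser rangefinder would measure them."""
    dx = landmark[0] - pose[0]
    dy = landmark[1] - pose[1]
    rng = np.hypot(dx, dy)
    bearing = np.arctan2(dy, dx) - pose[2]
    return np.array([rng, bearing])
```

Both EKF-SLAM and FastSLAM are built on exactly this pair of functions: the motion model drives the prediction step, and the observation model supplies the measurement used in the correction step.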
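The EKF-SLAM algorithm compared above rests on the extended Kalman filter's predict/update cycle. Below is a minimal sketch of one such cycle for the robot pose only, assuming a single landmark at a known position; full EKF-SLAM additionally stacks the landmark estimates into the state vector. This is a hypothetical Python illustration, not the thesis's MATLAB implementation.

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, landmark, R, Q, dt=1.0):
    """One EKF predict/update cycle.

    mu: pose estimate [x, y, theta]; Sigma: its covariance.
    u = [v, omega]: control input; z = [range, bearing]: measurement
    of a landmark at a known position. R, Q: motion and measurement
    noise covariances.
    """
    x, y, th = mu
    v, om = u
    # Predict with the (straight-line) motion model and its Jacobian G.
    mu_bar = np.array([x + v*dt*np.cos(th), y + v*dt*np.sin(th), th + om*dt])
    G = np.array([[1.0, 0.0, -v*dt*np.sin(th)],
                  [0.0, 1.0,  v*dt*np.cos(th)],
                  [0.0, 0.0,  1.0]])
    Sigma_bar = G @ Sigma @ G.T + R
    # Update with the range-bearing observation model and its Jacobian H.
    dx, dy = landmark[0] - mu_bar[0], landmark[1] - mu_bar[1]
    q = dx*dx + dy*dy
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu_bar[2]])
    H = np.array([[-dx/np.sqrt(q), -dy/np.sqrt(q),  0.0],
                  [ dy/q,          -dx/q,          -1.0]])
    S = H @ Sigma_bar @ H.T + Q
    K = Sigma_bar @ H.T @ np.linalg.inv(S)          # Kalman gain
    innov = z - z_hat
    innov[1] = (innov[1] + np.pi) % (2*np.pi) - np.pi  # wrap bearing
    mu_new = mu_bar + K @ innov
    Sigma_new = (np.eye(3) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new
```

FastSLAM replaces this single Gaussian pose belief with a set of particles, each carrying independent per-landmark EKFs; avoiding one global linearization of the whole state is one reason its accumulated error can be smaller than EKF-SLAM's, consistent with the comparison reported above.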
【Degree-granting institution】: Jiangsu University of Science and Technology
【Degree level】: Master's
【Year conferred】: 2016
【CLC number】: TP242
Article ID: 1877991
Link: https://www.wllwen.com/kejilunwen/zidonghuakongzhilunwen/1877991.html