RobotCraft
Bearnav is a simple teach-and-repeat visual navigation system robust to appearance changes induced by varying illumination and naturally-occurring environment changes. Its core method is computationally efficient, does not require camera calibration, and can learn and autonomously traverse arbitrarily-shaped paths. During the teaching phase, where the robot is driven by a human operator, the robot stores its velocities and the image features visible from its on-board camera. During autonomous navigation, the method does not perform explicit robot localisation in 2D/3D space: it simply replays the velocities learned during the teaching phase while correcting its heading relative to the path based on its camera data. The experiments performed indicate that the navigation system corrects position errors of the robot as it moves along the path. Therefore, the robot can repeatedly drive along the desired path previously taught by the human operator. Early versions of the system proved their ability to reliably traverse polygonal trajectories indoors and outdoors under adverse illumination conditions [1,2], in environments undergoing drastic appearance changes [2,3] and on flying robots [4]. The version presented here is described in [5,6]; it can learn arbitrary, smooth paths, is fully integrated into the Robot Operating System (ROS) and is available online in this repository.
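The core idea above can be sketched in a few lines: the replayed forward and angular velocities come from the teaching phase, and only the heading is corrected from the observed image displacement. This is an illustrative sketch with assumed names and a simple proportional correction, not the actual stroll_bearnav implementation:

```python
def replay_step(recorded_v, recorded_omega, image_displacement, gain=0.1):
    """One step of teach-and-repeat replay (illustrative sketch).

    recorded_v, recorded_omega -- forward/angular velocity stored
                                  during the teaching phase
    image_displacement         -- horizontal shift (in pixels) between
                                  features in the map and the current view
    gain                       -- assumed proportional steering gain
    """
    # The forward velocity is replayed as taught; only the heading is
    # adjusted, proportionally to the observed image displacement.
    v = recorded_v
    omega = recorded_omega + gain * image_displacement
    return v, omega
```

With a displacement of zero, the robot simply replays the taught command; a non-zero displacement steers it back towards the taught path.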
- You should install Ubuntu 16.04 with ROS Kinetic or Ubuntu 18.04 with ROS Melodic.
- Make sure your system is up to date:
```bash
sudo apt update
```
- Also, install the other prerequisites:
```bash
sudo apt install git
```
- Run the following commands one by one:
```bash
cd
mkdir -p ~/bearnav_ws/src
cd ~/bearnav_ws/src
catkin_init_workspace
```
- Clone the usb_cam ROS driver:
```bash
git clone https://github.com/gestom/usb_cam.git
```
- Compile it:
```bash
cd ..
catkin_make
```
- Source your environment:
```bash
source devel/setup.bash
```
- Make your camera easy to access:
```bash
sudo chmod 777 /dev/video0
```
- Run the camera node:
```bash
roslaunch usb_cam usb_cam-test.launch
```
- You should see a live feed from your camera. If so, terminate the program (e.g. with Ctrl+C in the terminal you launched it from).
- Clone the stroll_bearnav package:
```bash
cd ~/bearnav_ws/src
git clone --branch coimbra_2019 --single-branch https://github.com/gestom/stroll_bearnav.git
```
- Compile it:
```bash
cd ..
catkin_make
```
- Source your environment:
```bash
source ~/bearnav_ws/devel/setup.bash
```
- Run the mapping:
```bash
roslaunch stroll_bearnav stroll-map.launch
```
- You should see an image with the detected features and a graph with the ROS nodes.
- Examine the graph of the ROS nodes and the topics used for communication.
- Find the `mapper` client GUI and create a map by entering its name, e.g. `A`, inside the parentheses behind the `fileName` field. Then click `Send goal`, wait for feedback (you should see something like `500 features saved at distance 0.000`) and then click `Cancel goal`.
- When creating the map, make sure that you have enough features on stationary objects. You can retry the map building simply by clicking `Send goal` again.
- Once you have a good map, terminate the mapping (e.g. with Ctrl+C in the terminal you launched it from).
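Conceptually, the mapper stores snapshots of image features indexed by the distance travelled along the taught path, which is the distance reported in the feedback message. A hypothetical sketch (the key format and function name are assumptions for illustration, not the package's actual storage scheme):

```python
def record_snapshot(maps, file_name, distance, features):
    """Store a local feature map under a distance-stamped key (sketch).

    maps      -- dictionary holding the recorded snapshots
    file_name -- map name entered in the mapper GUI, e.g. 'A'
    distance  -- distance travelled along the taught path (metres)
    features  -- image features detected at that distance
    """
    # Assumed key format for illustration, echoing the feedback message.
    key = f"{file_name}_{distance:.3f}"
    maps[key] = list(features)
    return key
```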
- Source your environment:
```bash
source ~/bearnav_ws/devel/setup.bash
```
- Run the navigation:
```bash
roslaunch stroll_bearnav stroll-nav.launch
```
- Examine the graph of the ROS nodes and the topics used for communication.
- Now find the `loadMap` GUI, enter the map name (e.g. `A`) in the prefix field and click `Send goal`.
- Start the navigation by clicking `Send goal` in the `navigator` GUI.
- Test how the matches of image features between the map and the current view reflect panning of the camera. In the navigation client window, you should see information about the displacement between the map and the current image.
A detailed system description is provided in [5], theoretical foundations are in [6], experiments are described in [2].
The software is free to use for non-commercial applications. If you use it for your research, please cite [2] and [6]. If you would like to use it commercially, drop me an email at [email protected] and we will strike a deal.
1. T. Krajnik, L. Preucil: A simple visual navigation system with convergence property. In European Robotics Symposium, 2008. [bibtex]
2. T. Krajnik, J. Faigl et al.: Simple yet stable bearing-only navigation. Journal of Field Robotics, 2010. [bibtex]
3. T. Krajnik, S. Pedre, L. Preucil: Monocular navigation for long-term autonomy. In 16th International Conference on Advanced Robotics (ICAR), 2013. [bibtex]
4. T. Krajnik, M. Nitsche et al.: A simple visual navigation system for a UAV. In 9th International Multi-Conference on Systems, Signals and Devices (SSD), 2012. [bibtex]
5. F. Majer, L. Halodova, T. Krajnik: A precise teach and repeat visual navigation system based on the convergence theorem. In Student Conference on Planning in Artificial Intelligence and Robotics (PAIR), 2017 (in review). [bibtex]
6. T. Krajnik, F. Majer, L. Halodova, T. Vintr: Navigation without localisation: reliable teach and repeat based on the convergence theorem. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). [bibtex]
This research is supported by the Czech Science Foundation project 17-27006Y STRoLL - Spatio-Temporal Representations for Mobile Robot Navigation.