Simulation for SLAM and faking sensor data #6
Data set from the Intel cameras, for simulating SLAM without the hardware setup. The data is stored in a rosbag file (information on the exact contents of the rosbag file is given here).
Record and Playback Example
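The record-and-playback workflow can be sketched as follows (assuming a ROS 1 installation; the topic names are illustrative, not the exact ones published by the Realsense driver):

```shell
# Record the camera topics into a bag file (topic names are illustrative;
# run `rostopic list` to see what your driver actually publishes).
rosbag record -O intel_cams.bag /camera/depth/image_rect_raw /camera/odom/sample

# Later, replay the recorded data as if the hardware were attached.
# --clock publishes simulated time so downstream nodes stay in sync.
rosbag play --clock intel_cams.bag
```

This lets the SLAM pipeline be developed and tested offline against the same message stream the real cameras would produce.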
A well-documented simulator with a rich model library is preferred. Given the abundance of online resources and its excellent support for ROS, Gazebo was chosen for the simulation part.
Faking and Simulating sensor data:
Gazebo model for the Intel Realsense cameras: which combination of ROS/Gazebo versions should we use?
An alternative for learning Gazebo and SLAM is the Udemy course ROS for beginners II: Localization, Navigation and SLAM, which provides an excellent understanding of how robot navigation works using ROS and Gazebo.
To model the dynamics of the drone in Gazebo, it is suggested to use the existing models prepared by the PX4 team. Refer to their official site at the following link: Gazebo Simulation
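As a sketch, the PX4 SITL models can be launched in Gazebo with the targets bundled in the firmware repository (command names follow the PX4 v1.x build system; check the PX4 Gazebo Simulation docs for the variants available in your checkout):

```shell
# Fetch the PX4 firmware, which bundles the SITL Gazebo models and worlds.
git clone --recursive https://github.com/PX4/PX4-Autopilot.git
cd PX4-Autopilot

# Launch SITL with the default iris quadcopter model in Gazebo.
make px4_sitl gazebo

# Other airframe variants exist as separate targets, for example:
# make px4_sitl gazebo_typhoon_h480
```

The same SITL instance can then be connected to from ROS via MAVROS, so the simulated quadcopter is controlled exactly like the real one.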
I did a bit of background checking regarding the Gazebo model for
But I was able to find an alternative third-party plugin in the same issue, which we can use.
The final Gazebo simulation model for the quadcopter has been completed. For the IMU and depth camera we are using models from the default PX4-SITL_gazebo package; the specific model number of the IMU is unknown, while a Kinect RGBD camera model is used for the depth camera. For the LiDAR, a Velodyne VLP-16 model is used. The set of sensors deployed on the actual quadcopter will be different, though. The following table lists the components selected.
The latest version of the PX4 firmware includes support for the Realsense camera through the official librealsense plugin by Intel, and the mesh files are included by PX4. It is therefore suggested to replace the Kinect with the Realsense depth camera model for further testing. Although it does not make much of a practical difference, the librealsense plugin has detailed and articulate documentation. The official team inventory includes a set of Realsense tracking and depth cameras, so the algorithms developed can be tested directly on real hardware before being transferred to the ZED 2.
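Swapping the Kinect for the Realsense model amounts to changing the sensor include in the quadcopter's SDF. A minimal sketch, assuming the model directory name and pose below (check the models/ directory of PX4-SITL_gazebo for the exact model name in your version):

```xml
<!-- Illustrative SDF fragment: attach the Realsense model from the
     PX4-SITL_gazebo model library in place of the Kinect.
     The model URI and pose here are assumptions. -->
<include>
  <uri>model://realsense_camera</uri>
  <pose>0.1 0 0 0 0 0</pose>
</include>
<joint name="realsense_joint" type="fixed">
  <parent>base_link</parent>
  <child>realsense_camera::link</child>
</joint>
```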
The world file used for the initial simulations was the willow-garage office model listed in the official osrf/gazebo_models repository. The walls of that model are very small in scale compared to the quadcopter model, so the world file was replaced by @NidheeshJain with a custom-designed world model included in this repo (check building1.world).
It is important to work in a simulated environment rather than going straight to on-field testing, to avoid any unwanted damage to property, especially when working with aerial robots. The challenges are:
To decide on the appropriate simulation software for the type of environment.
To fake sensor data that closely matches the real data, which requires understanding how the Intel Realsense t265 and d435 cameras produce their output.
To prepare an appropriate 3D environment or import existing ones from online databases.
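The sensor-faking challenge above can be sketched in a few lines: take ground-truth distances (e.g. from the simulator) and corrupt them the way a real depth camera would. The noise standard deviation and range limits below are illustrative assumptions, not the measured d435 noise model (which is depth-dependent):

```python
import random

def fake_depth_scan(true_depths, noise_std=0.01, min_range=0.2, max_range=10.0):
    """Simulate noisy depth readings from ground-truth distances (metres).

    noise_std, min_range and max_range are hypothetical values; a real
    d435 noise model grows with distance and has dropout pixels.
    """
    readings = []
    for d in true_depths:
        # Add zero-mean Gaussian noise to the true distance.
        noisy = d + random.gauss(0.0, noise_std)
        # Clamp to the sensor's working range, as a real camera would.
        readings.append(min(max(noisy, min_range), max_range))
    return readings

# Example: fake one scan line of a flat wall 2 m away.
scan = fake_depth_scan([2.0] * 640)
```

In a full simulation Gazebo's sensor plugins do this internally, but a small standalone model like this is useful for unit-testing the SLAM pipeline without either hardware or a running simulator.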