- Introduction
- Build component
- Ports and Services
  - 3.1. init_grabber
  - 3.2. enable_display
  - 3.3. init_camera
  - 3.4. init_detector
  - 3.5. init_servo
- Testing the component
  - 4.1. Using Matlab
## Introduction

UAV Visual Servoing is a genom3 component that is launched on the embedded PC of the MikroKopter to perform visual servoing. It uses the ViSP library to:
- Grab the image
- Display the image
- Detect AprilTag
- Extract visual features from the image related to the position and size of the tag
- Calculate the desired velocity that the drone should have in order to keep the tag in the middle of the image
This component supports the following cameras:
- FLIR cameras through FlyCapture SDK
- Any camera that is compatible with OpenCV
## Build component

In order to use this component, you should proceed as follows:

- Download the latest sources using git:

```
$ mkdir devel
$ cd devel
$ mkdir components
$ cd components
$ git clone https://gitlab.inria.fr/lagadic/uavvs-genom3.git
```
- Change to the project's directory and run the `bootstrap.sh` script:

```
$ cd uavvs-genom3
$ ./bootstrap.sh
```
- Create a `build` directory and build the project:

```
$ mkdir build
$ cd build
$ ../configure --prefix=$HOME/devel --with-templates=pocolibs/server,pocolibs/client/c
$ make
$ make install
```
- Once this process is done, create a link to the created libraries (to be able to interface the component with MATLAB):

```
$ cd /opt/openrobots/lib/genom/pocolibs/plugins
$ sudo ln -s $HOME/devel/lib/genom/pocolibs/plugins/uavvs* .
```
## Ports and Services

The component presents an output port containing linear and angular velocities. The port's type is `rigid_body`, and thus similar to the input port `reference` of NHFC.

When first launched, the component won't automatically grab images or detect the AprilTag, so it publishes zero linear and angular velocities. The interface with the component is done through the set of services described below.
### 3.1. init_grabber

Initializes the image grabber. We can specify which grabber we want to use according to the camera used. The parameters of this function are:

- `device`: specifies the camera used
- `cam_index`: index of the camera, usually 0 (useful in case of multiple cameras)
- `frame_rate`: frame rate, for cameras that allow setting it (through the FlyCapture SDK)
- `shutter`: when true, enables auto shutter (through the FlyCapture SDK)
- `gain`: when true, enables auto gain (through the FlyCapture SDK)
### 3.2. enable_display

After initializing the grabber, it is useful to display the images.

- `frequency`: display frequency. For example, if the grabber is initialized at 25 Hz and `enable_display` is called with `25` as parameter, every image is displayed.
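The relation between the grabber rate and the display frequency can be sketched as follows (a hypothetical helper for illustration, not part of the component):

```python
def display_decimation(grabber_hz, display_hz):
    """Return how many grabbed frames correspond to one displayed frame."""
    return max(1, round(grabber_hz / display_hz))

# Grabber initialized at 25 Hz:
assert display_decimation(25, 25) == 1   # frequency 25 -> display every image
assert display_decimation(25, 1) == 25   # frequency 1  -> one image per second
```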
### 3.3. init_camera

This function initializes the camera parameters. In order to detect the AprilTag, we should specify the intrinsic parameters of the camera. Note that this function should be called after calibrating the camera using ViSP.

It uses the perspective projection model with distortion, so calibrating the camera yields the parameters `px`, `py`, `u0`, `v0`, `kud` and `kdu`.
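In ViSP's perspective-with-distortion model, `px`/`py` are the focal lengths in pixels, `u0`/`v0` the principal point, and `kud` the radial distortion coefficient applied when projecting from metres to pixels (`kdu` is used for the inverse, pixel-to-metre conversion). A minimal sketch of that projection, using the calibration numbers from the Matlab example below (the helper itself is illustrative, not part of the component):

```python
def project_with_distortion(X, Y, Z, px, py, u0, v0, kud):
    """Project a 3D point (camera frame, metres) to pixel coordinates,
    following ViSP's perspective projection with distortion model."""
    x, y = X / Z, Y / Z               # normalized image coordinates
    r2 = x * x + y * y                # squared radial distance
    u = u0 + px * x * (1 + kud * r2)  # apply radial distortion, scale to pixels
    v = v0 + py * y * (1 + kud * r2)
    return u, v

# A point on the optical axis projects to the principal point:
u, v = project_with_distortion(0.0, 0.0, 1.0, 1200, 1200, 516.97, 391, -0.219)
assert (u, v) == (516.97, 391)
```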
### 3.4. init_detector

This function initializes the AprilTag detector, with `tagSize` being the width of the AprilTag; `quad_decimate` and `nThreads` are two parameters that improve speed at the cost of pose accuracy.
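As a rough illustration of the trade-off, `quad_decimate` subsamples the image before the tag's quads are searched, so detection runs at a reduced resolution (the image size below is hypothetical, not from the component):

```python
def detection_resolution(width, height, quad_decimate):
    """Resolution at which AprilTag quad detection runs: the image is
    downsampled by quad_decimate, trading pose accuracy for speed."""
    return width // quad_decimate, height // quad_decimate

# With quad_decimate = 3 (as in the Matlab example below), a 640x480
# image is searched at roughly 213x160:
assert detection_resolution(640, 480, 3) == (213, 160)
```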
### 3.5. init_servo

This function initializes the tracking of the visual features. Note that we should specify the gains `lambda_0`, `lambda_inf` and `lambda_dot_0`, which are respectively the gain at 0, the gain at infinity and the slope at 0, as well as `z_desired`, which is the distance to keep from the tag. `camera_pan` and `camera_pos` depend on how the camera is attached relative to the drone's frame.
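These three gains match ViSP's adaptive gain (`vpAdaptiveGain`); assuming that formula, the effective gain as a function of the task error norm can be sketched as:

```python
import math

def adaptive_gain(x, lambda_0, lambda_inf, lambda_dot_0):
    """Adaptive visual-servoing gain, as in ViSP's vpAdaptiveGain:
    lambda(x) = (l0 - linf) * exp(-ldot0 * x / (l0 - linf)) + linf,
    where x is the norm of the task error."""
    return (lambda_0 - lambda_inf) * math.exp(
        -lambda_dot_0 * x / (lambda_0 - lambda_inf)) + lambda_inf

# With the gains from the Matlab example below (0.5, 0.3, 30):
assert abs(adaptive_gain(0.0, 0.5, 0.3, 30) - 0.5) < 1e-12  # gain at zero error
assert abs(adaptive_gain(1e6, 0.5, 0.3, 30) - 0.3) < 1e-12  # gain at large error
```

The gain thus decreases smoothly from `lambda_0` for small errors to `lambda_inf` for large ones, which damps the response when the tag is far from the image centre.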
## Testing the component

### 4.1. Using Matlab

- You first have to launch the component on the drone (for example `mkQuadro1`) along with all the other genom3 components such as `NHFC`, `maneuver`, `pom`, ... This is done by modifying the `mk_launch_exp_poco.sh` file and adding the lines corresponding to the uavvs component:

```
#!/bin/bash
robot_id=$1
h2 init
sleep 1
mikrokopter-pocolibs -f --name mikrokopter_mk$1 &
sleep 1
nhfc-pocolibs -f --name nhfc_mk$1 &
sleep 1
maneuver-pocolibs -f --name maneuver_mk$1 &
sleep 1
vicon-pocolibs -f --name vicon_mk$1 &
sleep 1
pom-pocolibs -f --name pom_mk$1 &
sleep 1
uavvs-pocolibs -f --name uavvs_mk$1 &
sleep 1
genomixd -p 1000$1
```
Now that the components are launched, along with the genomix server, we proceed as follows in Matlab:

- Create a connection to the genomix server of the drone by specifying `host:port`:

```
>> client = genomix.client('192.168.30.151:10001');
```

- Load the `uavvs` component on the server:

```
>> uavvs = client.load('uavvs')

uavvs =

  component with properties:

             meta: [1×1 struct]
          desired: @()handle.port(args)
      genom_state: @()handle.port(args)
         log_stop: @(varargin)handle.rqst(args,varargin{:})
     init_grabber: @(varargin)handle.rqst(args,varargin{:})
     connect_port: @(varargin)handle.rqst(args,varargin{:})
   enable_display: @(varargin)handle.rqst(args,varargin{:})
       init_servo: @(varargin)handle.rqst(args,varargin{:})
  connect_service: @(varargin)handle.rqst(args,varargin{:})
   abort_activity: @(varargin)handle.rqst(args,varargin{:})
    close_display: @(varargin)handle.rqst(args,varargin{:})
    init_detector: @(varargin)handle.rqst(args,varargin{:})
             kill: @(varargin)handle.rqst(args,varargin{:})
      init_camera: @(varargin)handle.rqst(args,varargin{:})
              log: @(varargin)handle.rqst(args,varargin{:})
```
As you can see, to call the services all you need to do is the following:

```
>> uavvs.init_grabber(0,0,25,1,1);
>> uavvs.enable_display(1);
>> uavvs.init_camera(1200,1200,516.97,391,-0.219,0.231);
>> uavvs.init_detector(0.23,3,4);
>> uavvs.init_servo(0.5,0.3,30,1.25,deg2rad(-45),{0,-0.11,0});
```
We initialized the image grabber associated with the PointGrey Flea3 camera, enabled a 1 image/s display, initialized the camera with its calibration parameters, initialized the AprilTag detector for an AprilTag of 23 cm width, and initialized the tracking task with the gains and the desired distance.