
Congratulations on making it through the readme.

You can now begin using ein to interact with objects using the robot. This tutorial is organized as follows:

  • Calibration
  • Scanning objects
  • Process raw data of each collected class.
  • Cache detector data for each class.
  • Load detectors for a specified subset of objects online.
  • Map objects in workspace yielding detections.
  • Get workspace detections to your external program.

The program has a main executable, ein, with a GUI for monitoring behavior, and a client for sending it commands, ein_client.py, which sends commands to ein over a ROS topic. Any program can publish commands on the same topic and control ein. It is thus possible to use ein as a standalone apparatus for experiments or to hand control of the robot back and forth between ein and another program.
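For example, you can send a single command from the shell with rostopic (this assumes the command topic carries std_msgs/String messages; the topic names used on our setup are listed at the end of this page). The xUp word used here is a movement command introduced below:

rostopic pub -1 /ein/left/forth_commands std_msgs/String "xUp ;"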

When ein starts, the arm it controls will move to a ready position in front of the robot in a crane pose with the end effector pointed in the negative z direction.

When you issue a command through the repl you must end it with " ;", a space followed by a semicolon. If you do not leave the space, the semicolon will be parsed as part of the final word. As a general rule, all ein words begin with lower case and proceed in camel case. Tab complete can reveal related words in the repl, and all words are listed when the repl starts. When you add a word, you must recompile and run ein once to regenerate the word list before starting the repl, or tab complete will not recognize your new word.
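For example, this line is parsed as the word xUp followed by the terminator:

xUp ;

whereas "xUp;" with no space would be parsed as a single, unrecognized word.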

Stock Calibration

Especially if you never have, now would be a good time to run the arm calibration that Rethink provides. First make sure that there are no objects or obstructions, including tables and people, within Baxter's reach; calibration involves a wide range of arm movements and you don't want to damage anything.

Remove the parallel electric (or any other) gripper from the arm.

Next, run

rosrun baxter_tools calibrate_arm.py -l left

and reboot the robot. Then run

rosrun baxter_tools calibrate_arm.py -l right

and reboot the robot. Then run

rosrun baxter_tools tare.py -l left

and reboot the robot. Finally, run

rosrun baxter_tools tare.py -l right

and reboot the robot. If calibration was especially bad and movement jittery, perhaps after the arm was wrenched while the cuff wasn't activated, this procedure may appear to have no effect. If so, put about an hour of fairly active, varied movement on the arm and repeat. One case has been reported that improved only after 4 iterations and 4 hours of movement, with some space between sessions.

Basic Movement

You can move the end effector in Cartesian coordinates using "wasdqe" when focused on the "Commands" text box, or with commands issued via the repl:

xUp 
zDown 

These commands move the end effector by an increment which you can set. The default is 1 cm (0.01 m). You can reset to the default by issuing

setMovementSpeedMoveFast 

Use tab complete to explore related setMovementSpeedMove* commands, and you can find these words in ein_movement.cpp. Next, you can change the orientation of the end effector by issuing

oXUp 

and related commands; oXUp rotates the end effector about the X axis in the positive direction. Shift + "wasdqe" in a gui window does the same.
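By analogy (an assumption on our part; use tab complete in the repl to confirm the exact names), the remaining directions and axes should follow the same naming pattern:

oXDown 
oYUp 
oYDown 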

Moving one increment at a time is tiresome, so try issuing

( xUp ) 5 replicateWord 

This pushes 5 copies of xUp onto the stack and executes, resulting in a movement of 5 cm by default.
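The parenthesized block can hold more than one word. Assuming replicateWord replicates the entire quoted block (worth verifying with tab complete and a small test), the following should move 3 cm diagonally in x and y:

( xUp yUp ) 3 replicateWord 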

Calibration

There are some quantities pertaining to the workspace and end effector that need to be collected so that ein can make simplifying assumptions about your workspace and calibrate the sensors to your Baxter and gripper of choice.

  • Setting the fence.
  • Table height.
  • Gripper mask.
  • IR sensor offset.
  • Camera intrinsics.

These quantities need to be set for each arm separately.

Setting the fence.

Ein assumes you will be working on a flat table parallel to Baxter's xy plane and that you will restrict the working area so that distracting objects are not visible from the crane grasp pose within the workspace, or "fenced region". This can be observed in the Object Map View, which can be viewed in its own window and is replicated in the ein main window. It is the red box. The large oriented circle in the center of the map is Baxter and the small oriented circle is the end effector projected into the plane of the workspace.

Use the getters and setters

mapSearchFenceXMin
-0.75 setMapSearchFenceXMin
1.00 setMapSearchFenceXMax
-1.25 setMapSearchFenceYMin
1.25 setMapSearchFenceYMax

to change the fence. Note that these are set twice, once for each hand, so make sure you are changing the values you think you are.

If the fence is not set sufficiently within the working area, ein will become distracted during mapping, causing the workflow to slow and introducing spurious detections. Although spurious detections can be ignored by creating a background image class, it is best to set the fence carefully. If ein gets distracted by the table edge, floor, or another out of workspace object, you should adjust your fence to prevent it.

Table height.

Ein assumes that there is a table inside the workspace, parallel to Baxter's xy plane and thus at a constant z-coordinate. You must set this height manually, and it must be set carefully if you require flush grasps. It is important to stay on top of the table height setting because Baxter's sensor values can drift over a short period: if flush grasps work one day, they may miss a couple of days later.

To set the table height, drive the end effector so that the gripper is 1-3 cm above the table and issue

setTable 

which should yield something like

Looks like the table reading hasn't steadied for the wait of 2 .
  current, last, delta: 0.192946 0.1931 0.000154744

repeated in ein's stdout. If you see zeros or spurious values, something is wrong. The values around 0.19 are the NEGATIVE of the z-coordinate of the table in Baxter's base frame, so a reading of 0.193 means the table plane sits at z ≈ -0.193. One is the current reading and the other is an average; the third number is the difference between them. If the difference is large, the average did not "catch up", so run the command again. If the end effector is stationary when you issue the command, it should run for enough iterations to stabilize in one call. Note that the end effector position as reported by ein in the main window is NOT the coordinate of the tip of the gripper; it is about 10 cm behind the stock parallel gripper tip.

Issue

saveCalibration 

and you should see something like

Writing calibration information to /home/oberlin/dev/catkin_ws/src/ein/streamTest/config/leftCalibration.yml ...done.

The table height is now set for this arm and will be reloaded to this value if you restart.

Gripper mask.

It is important to mask out the gripper from the wrist camera image. Ein ships with masks for the small parallel electric gripper finger that comes with Baxter, but odds are you will have to make your own. Use

assumeCalibrationPose 1 changeToHeight 

to move to a good IK spot. It would be good to include this point in your workspace. Now, put a colorfully textured background on the table so that it covers the entire wrist cam image. The camera should not change saturation level significantly while looking at different portions of the background. Issue

setGripperMaskWithMotion 

and watch as

(data_dir)/config/leftGripperMask.bmp

changes. Experiment with different backgrounds. You can manually modify this mask and save it as a binary bitmap if the auto mask fails. You can check gripper mask coverage in the Object Map Viewer. Issue

density 

to refresh the frame in Object Map Viewer. The gripper mask should appear in transparent blue.

IK Map.

Ein caches the IK success for the arm at various locations in the crane pose. This forms a matrix indexed by x, y, z (the orientation is always fixed with the arm pointing downward). Each cell records whether the IK succeeded, failed, or found the arm in collision for that pose.

Ein still uses the Baxter IK service at runtime to compute joint angles, since the service does a good job of finding joint angles that require relatively little motion between the arm's current position and the desired position (exploiting the arm's free parameter to do this). Additionally, the service performs self-collision checking.

We have integrated IKFast into ein and verified that its end effector position error is very small (indistinguishable from the IK service that comes with Baxter). However, IKFast does not perform collision checking, and we have not yet added a collision checking capability to ein, so we still use the service within ein. (Note: you might wonder if you need new IKFast plugins for each Baxter, since each robot has its own URDF. In fact, since the only joint that differs between Baxters is where the arm is welded to the body, the IKFast model does not need to change between robots. However, there is a bug in which the left_endpoint and right_endpoint links in the URDF that ships with the Baxter SDK are offset by 4.5mm compared to the URDF obtained from /robot_description. This difference results in 4.5mm of position error when using IKFast naively. It is possible to correct for this at runtime by transforming the desired end effector position by 4.5mm before solving and transforming the result back. Another fix would be to rerun IKFast to generate new plugins, but I couldn't get it to run on the /robot_description URDF.)
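To make the runtime correction concrete, here is a minimal sketch in Python. The direction of the offset (along the gripper's local z axis) and its sign are assumptions you would verify against your own robot; only the 4.5mm magnitude comes from the note above.

import numpy as np

OFFSET_M = 0.0045  # the 4.5mm endpoint discrepancy noted above

def rotate_by_quat(q, v):
    # Rotate vector v by unit quaternion q = (x, y, z, w).
    x, y, z, w = q
    u = np.array([x, y, z])
    return 2.0 * np.dot(u, v) * u + (w * w - np.dot(u, u)) * v + 2.0 * w * np.cross(u, v)

def correct_target_for_ikfast(position, orientation):
    # Shift the desired end effector position by 4.5mm along the gripper's
    # local z axis so that the IKFast solution (computed against the offset
    # URDF) places the real endpoint where we actually wanted it. Flip the
    # sign of OFFSET_M if the error moves the wrong way on your robot.
    local_z = rotate_by_quat(orientation, np.array([0.0, 0.0, 1.0]))
    return np.asarray(position) + OFFSET_M * local_z, orientation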

Before filling the IK map, it's useful to run

toggleDrawClearanceMap 

so that you can see the IK map unobscured by the clearance map. Then run

fillIkMapAtHeights

It will take a while to run, filling the IK map starting at the table height and going up to the mapping height.

After it runs you can run

0 fillIkMapFromCachedHeightIdx

for different indices to see the IK Map at that height. Finally you can run the following command to compute a merged ikMap that is only valid if the IK at all heights is valid:

fillIkMapFromCachedHeights

Lastly, to save the IK information, run:

saveIkMap saveIkMapAtHeight

IR sensor offset.

If your range map scans have a double image, you probably need to calibrate the offset of the IR range finder from the end effector z axis. You need an IR opaque cylindrical object (henceforth IR Marker) of about 0.5 cm diameter and 1-2 cm height. Issue

closeGripper assumeCalibrationPose 1 changeToHeight 

then drive the arm down so that the end effector tip clears the IR Marker by 2 cm and position the marker directly under the gripper tip, which is at 0 in the local xy plane of the end effector. Issue

setIROffset 

and read the output. It will tell you how much it was adjusted. If it is calibrated well, you should see a single image of IR Marker in the Range Map View.

Upon successful calibration, issue

saveCalibration

to save.

Camera intrinsics.

You must perform table height calibration with

setTable 

as described above before calibrating the camera.

Now we need to calibrate the camera intrinsics and extrinsics, like vanishing point, radial distortion, and the projection of the end effector into the image at different heights. This will allow us to localize points in the image for servoing.

You need a light colored background on the table that will span the camera image during the process. You will place a flat, black marker (Image Marker) of small diameter, around 0.5 cm, on the background, and it will be localized repeatedly by finding the darkest point in a time averaged image. Gaffer tape works well; it's not too shiny. You must monitor the process to detect failure. The most common failure mode is that something in the background is detected instead of the marker. If this happens you must restart the calibration process.

Issue

closeGripper assumeCalibrationPose 1 changeToHeight

and steer the end effector down so that the tip of the gripper (or 0 in the end effector local xy plane) is just above Image Marker. Now issue

openGripper calibrateRGBCameraIntrinsics 

Check the Object Viewer to make sure the gripper is masked out; it can easily register as the darkest point and require a restart. Watch the Wrist View either in the Ein window or in its own window. Each time the darkest point is detected, a red targeting box will appear around it. The arm will move soon after detection, so you must watch and ensure that the box is around Image Marker as soon as it appears; the marker will remain in the old spot as the camera image changes. If a single red box appears anywhere other than directly over the center of Image Marker, the calibration is bad and you must restart. You may find it necessary to increase the coverage of your light colored background.

First the vanishing point reticle is calibrated. It is a blue and yellow circle and should remain near the center of the image. The crop Baxter sends from the wrist cam will be adjusted so that the vanishing point is at the center of the image. It probably shouldn't move too much from factory settings.

If the blue-yellow end effector projection line becomes severely distorted, it is a sign something has gone wrong. Sometimes this happens due to a bad darkness detection, and very rarely it can happen due to instability of calibration points leading to an ill defined system. If this happens, you should calibrate again.

It is very useful to monitor ein's stdout to get a feel for the rhythm of the process.

Once calibration has finished, you should see a green cross over Image Marker. Calibration happens at 4 heights (heights 0, 1, 2, and 3) and the results are interpolated. This has worked pretty well so far. If one of the 4 levels is bad, it will affect adjacent levels. Right now everything happens on level 1 by default, so let's check the accuracy at level 1. Issue

1 changeToHeight 

and make sure the green cross is directly over Image Marker. Move around in the xy plane. The green cross should stay within a centimeter or two of Image Marker within a quarter screen width of the vanishing point, and should exhibit less error near the vanishing point. You will notice the quantization induced by the joint sensors; if you are accurate to within a small multiple of that noise, things are good.

Upon successful calibration, issue

saveCalibration

to save.

Scanning objects

To view the current classes, issue:

printClassLabels 

Mapping the Table

Ein ships with models for a few objects around our lab, and we plan to add models for standard objects that most labs should have, such as an ICRA Standard Duckie. If you don't yet have a model, you'll need to train a model (described below).

For the mapping commands, there is a concept of "active" classes that will be used during mapping. If there are many active classes (more than 10 or so; we haven't determined an exact upper bound), classification will be slower and will also make more errors. So at any given time you want just a few to be active.

You can see which classes are active by running "pushClassLabels", which pushes them onto the stack as strings.

You can set them by running 'endArgs "redMug" "blueMug" setClassLabels'. redMug and blueMug must be directories in default/objects (or whatever your config directory is). Those directories should contain "thumbnail.png" as well as an "ein" directory which contains the object model.

After doing a scan, the directory will have a weird name that contains the robot's serial number and a timestamp; we often go back and rename the directory to something human-meaningful (e.g., "redMug") and reload it.

Scans

Scanning consumes EePoseWords from the stack. See Arm Buttons and Annotation for details on how to put pose words on the stack. Running

currentPose 

is one way. The consumed poses are interpreted differently depending upon which of the two scans is performed:

<EePoseWord> <EePoseWord> scanObjectStreamWaypoints3d 

or

<EePoseWord> <EePoseWord> scanObjectStreamWaypointsIR 

For scanObjectStreamWaypoints3d, poses are interpreted as grasp annotations. For scanObjectStreamWaypointsIR, poses are interpreted as heights and base points for IR scanning.
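For example (an illustrative sequence; adapt the poses to your object), drive the end effector to the first grasp pose and issue

currentPose 

then drive to the second grasp pose and issue

currentPose 

leaving two pose words on the stack, and finally issue

scanObjectStreamWaypoints3d 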

Object folders will be generated in

(data_dir)/objects

and named according to the time at which they were created, the robot serial number, and the arm of the robot that scanned it. You can change the name of the folder to whatever you want, but you must reload the labels after renaming.

Process raw data of each collected class.

endArgs "autoClass5_left" "autoClass4_left" setClassLabels ;
"autoClass4_left" setTargetClass 
populateStreamBuffers 

After this operation completes (ein will return from being frozen) run:

integrateImageStreamBufferCrops 

Wait for populateStreamBuffers to complete successfully before moving on to integrateImageStreamBufferCrops.

Alternatively, you can do the three operations above in one line per class. Words on a line are pushed onto the stack and executed in reverse order (last in, first out), so this line runs setTargetClass first and integrateImageStreamBufferCrops last:

integrateImageStreamBufferCrops populateStreamBuffers "class" setTargetClass 

Then run the "trainModelsFromLabels" command shown below.

During training, these two commands are issued to generate the range map and servo images, respectively:

integrateRangeStreamBuffer 
integrateImageStreamBufferServoImages 

You don't need to run them again but can if you collect more registered data.

Cache detector data for each class.

trainAndWriteFocusedClassKnn 

Load detectors for a specified subset of objects.

trainModelsFromLabels endArgs "autoClass5_left" "autoClass4_left" setClassLabels 
"autoClass4_left" setTargetClass 
assumeBeeHome 1 changeToHeight pickFocusedClass  

or

createCachedClassifierFromClassLabels 

Map objects in workspace yielding detections.

clearMapForPatrol clearBlueBoxMemories mappingPatrol 
mapLocal 
"autoClass4_left" deliverTargetObject 
setPlaceModeToHold 
zeroGOn 
zeroGOff 

Get workspace detections to your external program.

Ein publishes its state, including workspace detections, on one ROS topic and listens for repl commands on another:

/ein_left/state
/ein/left/forth_commands
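As a minimal sketch of an external client (assuming both topics carry std_msgs/String; check the real message types on your system with rostopic info), a Python node could look like this:

import rospy
from std_msgs.msg import String

def state_callback(msg):
    # Do something with ein's published state, e.g. parse detections out of it.
    rospy.loginfo("ein state: %s", msg.data)

rospy.init_node("ein_external_client")
rospy.Subscriber("/ein_left/state", String, state_callback)
commands = rospy.Publisher("/ein/left/forth_commands", String, queue_size=10)
rospy.sleep(1.0)  # give the publisher time to connect to ein
commands.publish("xUp ;")  # note the space before the semicolon
rospy.spin()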