
This page describes methods for estimating the relative pose between different camera systems and the robot, and how to compute these poses in practice.

1. Single camera relative pose estimation: arm camera

An easy-to-use method for estimating the relative pose of objects in the robot's workspace is one based on Augmented Reality (AR) markers.

The main idea is to detect a custom AR marker in a camera stream and compute its pose relative to the camera frame.

Fortunately, ROS-enabled software tools are already available; two of them are alvar and aruco, each with its own advantages and disadvantages.

The one used in this case is Alvar.

 


 

 

Step-by-step guide for single camera relative pose estimation

Prerequisite

...

baxter_indiv_no_kinect.launch

<launch>
    <arg name="marker_size" default="4.4" />
    <arg name="max_new_marker_error" default="0.08" />
    <arg name="max_track_error" default="0.2" />
    <arg name="cam_image_topic" default="/cameras/left_hand_camera/image" />
    <arg name="cam_info_topic" default="/cameras/left_hand_camera/camera_info" />
    <arg name="output_frame" default="/left_hand_camera" />

    <node name="ar_track_alvar_left" pkg="ar_track_alvar" type="individualMarkersNoKinect"
          respawn="false" output="screen"
          args="$(arg marker_size) $(arg max_new_marker_error) $(arg max_track_error) $(arg cam_image_topic) $(arg cam_info_topic) $(arg output_frame)" />
</launch>

 

...

Step-by-step guide for using Alvar

  1. Fire up either the Baxter simulator or the real Baxter
    ./baxter.sh, or ./baxter.sh sim when in simulation
  2. roslaunch baxter_gazebo baxter_marker.launch – in simulation, this fires up Gazebo with a marker on the table
  3. Launch ar_track with roslaunch ar_track_alvar baxter_indiv_no_kinect.launch
  4. Run RViz to visualise the results with rosrun rviz rviz, then add the TF display

 

What can go wrong

To get proper results, measure the size of the marker you are using and pass it to alvar as an argument when starting the launch file.

For example, roslaunch ar_track_alvar baxter_indiv_no_kinect.launch marker_size:=10 if using a 10x10 cm marker.

Providing the correct marker size is essential for accurate pose estimation.
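To see why this matters, here is a quick back-of-the-envelope sketch (not part of alvar; it simply assumes an ideal pinhole camera): the detector infers depth from the marker's apparent size in pixels, so the estimated distance scales linearly with the marker size you configure.

```python
# Illustration only: under a pinhole model, apparent size s = f * L / Z,
# so solving for depth with an assumed size L' gives Z' = Z * (L' / L).
def estimated_distance(true_size_cm, assumed_size_cm, true_distance_cm):
    """Distance the detector would report if configured with the wrong size."""
    return true_distance_cm * (assumed_size_cm / true_size_cm)

# A real 10 cm marker 50 cm away, but alvar left at the 4.4 cm default:
print(estimated_distance(10.0, 4.4, 50.0))  # ~22 cm: less than half the true distance
```

The whole translation of the pose is rescaled by the same factor, which is why a wrongly configured marker size silently ruins any offset target built on top of it.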

...

Setting up an offset target

If you want to place the arm at a position relative to the marker's pose, one solution is to publish a static transform from the marker to the offset position with the command:

rosrun tf static_transform_publisher x y z qx qy qz qw frame_id child_frame_id period_in_ms

...

rosrun tf static_transform_publisher 0.05 0 0 0 0 0 1 /ar_marker_10 /offset_target 100

In this example the offset target is placed 5 cm in front of the marker and has the same orientation as it. If the marker rotates or moves, so does the target.
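As a sanity check, the maths behind that static transform can be sketched in plain Python (no ROS needed; the helper names and sample pose below are illustrative, not from alvar):

```python
import math

# The 5 cm offset is expressed in the marker frame, so it is first rotated
# by the marker's orientation, then added to the marker's position.

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w), tf's ordering."""
    x, y, z, w = q
    # v' = v + w*t + qvec x t, where t = 2 * (qvec x v)
    t = (2 * (y * v[2] - z * v[1]),
         2 * (z * v[0] - x * v[2]),
         2 * (x * v[1] - y * v[0]))
    return (v[0] + w * t[0] + y * t[2] - z * t[1],
            v[1] + w * t[1] + z * t[0] - x * t[2],
            v[2] + w * t[2] + x * t[1] - y * t[0])

def offset_in_parent(marker_pos, marker_quat, offset):
    """Express an offset given in the marker frame in the marker's parent frame."""
    r = quat_rotate(marker_quat, offset)
    return tuple(p + d for p, d in zip(marker_pos, r))

# Marker 40 cm from the camera, rotated 90 degrees about its z-axis:
# the 5 cm x-offset then ends up pointing along the camera's y-axis.
q_z90 = (0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(offset_in_parent((0.0, 0.0, 0.4), q_z90, (0.05, 0.0, 0.0)))  # ~ (0.0, 0.05, 0.4)
```

This is exactly what following the TF chain camera → ar_marker_10 → offset_target does for you at runtime.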

 

 

Step-by-step guide for depth camera and robot calibration  

...

 

Info: update this doc with calibration info!



In our demo, we need an additional 3D camera for detecting objects. The camera is fixed on one of the robot's arms. This tutorial explains how to determine the camera position and orientation relative to the robot coordinate frame.
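For intuition, the calibration result is a fixed mounting transform that composes with the arm's forward kinematics to give the camera pose in the robot base frame. A minimal sketch with homogeneous matrices (hypothetical frame names and made-up numbers, not values from this robot):

```python
# T_base_camera = T_base_gripper * T_gripper_camera
# T_base_gripper comes from the arm's forward kinematics (TF on a real robot);
# T_gripper_camera is the fixed transform the calibration determines.

def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Assumed gripper pose: 60 cm in front of and 30 cm above the base, no rotation.
T_base_gripper = [[1, 0, 0, 0.6],
                  [0, 1, 0, 0.0],
                  [0, 0, 1, 0.3],
                  [0, 0, 0, 1.0]]

# Assumed calibration result: camera 4 cm along the gripper's z-axis.
T_gripper_camera = [[1, 0, 0, 0.0],
                    [0, 1, 0, 0.0],
                    [0, 0, 1, 0.04],
                    [0, 0, 0, 1.0]]

T_base_camera = matmul4(T_base_gripper, T_gripper_camera)
print([row[3] for row in T_base_camera[:3]])  # camera position in the base frame
```

Once T_gripper_camera is known, any object the camera detects can be expressed in the base frame by one more composition, which is what the robot needs for grasping.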