Name: extrinsic_calibration
Owner: ros-naoqi
Description: Extrinsic calibration for the color and depth cameras of Pepper robot
Created: 2017-11-04 07:58:07.0
Updated: 2017-12-09 20:36:35.0
Pushed: 2017-12-01 09:35:51.0
Size: 48
Language: C++
This is a short tutorial for extrinsic calibration of Pepper's depth and RGB cameras.
First of all, print a calibration pattern (a chessboard) if you do not have one; we recommend A3 size. Once you have it, count the number of rows and columns and measure the size of a square. Adjust these parameters in launch/calibration_cameras_pepper.launch.
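The exact parameter names depend on the launch file; a hedged sketch of what the adjusted section might look like (argument names and values here are illustrative, check launch/calibration_cameras_pepper.launch for the real ones):

```xml
<launch>
  <!-- Illustrative names; verify against the actual launch file -->
  <arg name="pattern_rows" default="6" />      <!-- inner corners per column -->
  <arg name="pattern_cols" default="9" />      <!-- inner corners per row -->
  <arg name="square_size"  default="0.025" />  <!-- square edge length, meters -->
</launch>
```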
turn on your robot, wake it up, and move the head to the "zero" pose, for example:
ssh nao@<PEPPER_IP>
qicli call ALMotion.wakeUp
qicli call ALMotion.setAngles "HeadPitch" 0.0 0.3
take two Post-it notes (one on top of the other) and fix them over Pepper's right eye (to cover the camera inside this eye)
start naoqi_driver, ensuring that color and IR images are enabled in share/boot_config.json:
roslaunch naoqi_driver naoqi_driver.launch nao_ip:=<PEPPER_IP>
create an empty folder for data recording and provide its name in launch/extrinsic_calibration_pepper.launch
update ROS topics for color and IR images in launch/extrinsic_calibration_pepper.launch
launch the code:
roslaunch extrinsic_calibration extrinsic_calibration_pepper.launch
if the software can subscribe to the ROS topics, you should see an image from the color camera
place the calibration pattern in front of the robot
once the pattern is detected (you can see all detected corners), press "s" on your keyboard to save images from both cameras
repeat the two previous steps, each time changing the distance and/or angle to cover different views (at roughly 60-80 cm distance), until you have a sufficient number of image pairs (for example, 30 or more)
launch the code to compute extrinsic parameters based on recorded images:
roslaunch extrinsic_calibration extrinsic_calibration_pepper.launch
press “c” on your keyboard to compute calibration values
check that the resulting XYZ and RPY values for the last processed image pair correspond to reality
Once the calibration parameters are found, you can apply them using the ROS static_transform_publisher. Check the example in register_depth.launch, update the values (x y z yaw pitch roll), and launch it after naoqi_driver:
roslaunch naoqi_driver naoqi_driver.launch nao_ip:=<PEPPER_IP>
roslaunch extrinsic_calibration register_depth.launch
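For reference, a static_transform_publisher node in a launch file looks roughly like this (the frame names, offsets, and node name below are placeholders; see register_depth.launch for the actual ones):

```xml
<launch>
  <!-- args: x y z yaw pitch roll parent_frame child_frame period_ms -->
  <node pkg="tf" type="static_transform_publisher" name="depth_to_rgb_tf"
        args="0.03 0.0 0.0 0.0 0.0 0.0 CameraTop_optical_frame CameraDepth_optical_frame 100" />
</launch>
```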