RealSense cameras have recently proven to be a valid technology for 3D perception. It would therefore be beneficial to mount one on the iCub robot, to facilitate perception and grasping while allowing a comparison with the cameras currently mounted on the robot.
A possible solution is to design a holder for the camera to be placed on the robot's head:
This sandbox contains the software to run the experiments needed to identify a set of suitable camera poses. To define a suitable pose, we considered the following points:
- a typical grasping task consists of having the robot grasp an object placed on a table. To maximize the visibility of the object from the cameras, the robot's neck is usually fully tilted. In such a configuration, it is crucial that:
- the RealSense pose allows the object to fall entirely within the field of view;
- the RealSense pose provides a view covering most of the object, rather than (for example) a top view, which might be problematic for grasping;
- ideally, the RealSense should not fall into the field of view of the robot's cameras.
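The first constraint above amounts to checking that every object point projects inside the image plane. A minimal sketch of this check, assuming a simple pinhole model with hypothetical intrinsics (the values below are illustrative, not the actual RealSense calibration):

```python
def in_field_of_view(point_cam, fx, fy, cx, cy, width, height):
    """Check whether a 3D point (in camera coordinates, z pointing forward)
    projects inside the image plane of a pinhole camera."""
    x, y, z = point_cam
    if z <= 0:  # point behind the camera can never be visible
        return False
    u = fx * x / z + cx
    v = fy * y / z + cy
    return 0 <= u < width and 0 <= v < height

# Illustrative intrinsics (assumed values for a 640x480 stream)
print(in_field_of_view((0.0, 0.1, 0.5),
                       fx=600, fy=600, cx=320, cy=240,
                       width=640, height=480))
```

A candidate camera pose would pass this test only if the check holds for all points of the object's cloud expressed in the camera frame.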
To this aim, the test bench we considered includes an object from the YCB dataset (specifically, the mustard bottle) located on the table in front of iCub. The 3D mesh of the RealSense was imported into the Gazebo simulation environment and moved within a range around the head:
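Sweeping the mesh within a range around the head amounts to enumerating candidate poses on a grid. A sketch of such an enumeration, with purely hypothetical ranges (the actual bounds used in the experiments may differ):

```python
import itertools
import numpy as np

# Hypothetical sweep ranges relative to the head frame:
# vertical offset (meters) and downward tilt (degrees).
heights = np.linspace(0.05, 0.15, 3)
pitches = np.linspace(-40.0, -10.0, 4)

# Each (height, pitch) pair is one candidate camera pose to evaluate.
candidate_poses = list(itertools.product(heights, pitches))
print(len(candidate_poses))  # 3 x 4 = 12 candidate poses
```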
The analysis carried out relies on the following steps:
- the 3D point cloud of the scene is acquired from the RealSense and segmented from the table;
- the retrieved point cloud is used to construct a compact 3D representation based on superquadrics, relying on superquadric-lib;
- the optimal superquadric is retrieved beforehand, considering a reasonable view of the object fully inside the field of view;
- for each pose, a superquadric is extracted and compared to the optimal one.
The camera pose is finally selected so as to minimize the difference between the dimensions of the optimal and the computed superquadric.
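The selection step above can be sketched as follows. The dimension vectors here are placeholders standing in for the semi-axes of the superquadrics fitted by superquadric-lib; the pose names and values are hypothetical:

```python
import numpy as np

def pose_error(dims, optimal_dims):
    """L2 distance between the semi-axes of the superquadric fitted at a
    candidate pose and those of the reference (optimal) superquadric."""
    return float(np.linalg.norm(np.asarray(dims) - np.asarray(optimal_dims)))

# Placeholder data: semi-axes (meters) of the superquadric fitted at each
# candidate pose, plus the reference fitted from a fully visible view.
optimal = [0.04, 0.03, 0.09]
fitted = {
    "pose_a": [0.04, 0.03, 0.09],
    "pose_b": [0.05, 0.03, 0.05],  # e.g. a top view truncating the height
}

best = min(fitted, key=lambda p: pose_error(fitted[p], optimal))
print(best)  # → pose_a
```

A pose whose view truncates the object (as in `pose_b`) yields a superquadric with distorted dimensions and hence a larger error, so it is discarded.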
A full description of the performed analysis can be found in the following report.
This repository is maintained by:
- @vtikha
- @vvasco