TSC bringup

Object learning

To start the object learning action server run:

roslaunch tsc_bringup object_learning.launch model_path:=/path/to/models record_run:=false

If the parameter record_run is set to true, a rosbag of the robot navigating around the object is recorded using rosbag_openni_compression. The bag is saved as tracking.bag in the folder where the sweep is saved.
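
To check what was recorded, you can inspect the bag with the standard rosbag tooling (the sweep folder below is only a placeholder; use wherever your sweeps are stored):

rosbag info /path/to/sweep/tracking.bag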

Note that recording a rosbag can take up to 1 GB of disk space per run.

Note that when the sweep finishes, the xml of the sweep (the folder in which the rosbag is saved) is published on the topic /local_metric_map/room_observations.
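
To watch for this announcement, echo the topic:

rostopic echo /local_metric_map/room_observations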

To trigger an action run:

rosrun actionlib axclient.py /learn_object

The goal argument is the name of the node (from the topological map) at which to run the learn_object action.
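
If you are unsure which node names are available, and assuming the standard STRANDS topological navigation setup (which publishes the map on a latched /topological_map topic), you can inspect them with:

rostopic echo -n 1 /topological_map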

Nodes started

The launch file above is equivalent to the following:

  • roslaunch semantic_map_launcher semantic_map.launch
      • dependencies installed through strands_3d_mapping
  • roslaunch observation_registration_launcher observation_registration.launch
      • dependencies installed through strands_3d_mapping
  • roslaunch learn_objects_action learn_objects_dependencies.launch
      • a list of dependencies can be found here

Debug topics

For debugging, the following topics are useful:

  • /local_metric_map/merged_point_cloud_downsampled - the point cloud of the sweep
  • /local_metric_map/dynamic_clusters - the detected dynamic clusters
  • /object_manager/requested_object_mask - the mask of the dynamic cluster which will be learned (an easy way to view it is shown below)
  • /object_learning/learned_object_xml - the xml of the dynamic cluster learning, pointing to the additional views and masks (among other things)
  • /additional_view_registration/registered_view_cloud - the point cloud of the registered additional views
  • /incremental_object_learning/learned_model - the model learned using the RAL16 method
  • /object_learning/learned_object_model - the model learned using the IROS16 method
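
For example, assuming the mask is published as a sensor_msgs/Image, it can be viewed with image_view (the point cloud topics can be added as displays in rviz):

rosrun image_view image_view image:=/object_manager/requested_object_mask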

Point Cloud History Search & Incremental Model Building

Note that you need the newest versions of strands_3d_mapping and quasimodo to be able to run this. quasimodo will shortly be merged into strands_3d_mapping. For more information, see the quasimodo readme.

In the tsc_start.sh tmux session, tab 10, there are two panes that start the entire pipeline for retrieval and incremental model building. These need to be started via ssh so that the display variable is set correctly. Note that you need to run both

rosrun tsc_bringup tsc_headless.sh

to start a virtual display (be sure to type the password), and

DISPLAY=:0 roslaunch tsc_bringup tsc_quasimodo.launch data_path:=/home/strands/.semanticMap

If you want some data to query for, we have uploaded all of the metric map data from the Birmingham deployment week to the server. Otherwise you will have to perform at least 20 sweeps with the robot to be able to query. Get the already collected data with:

wget https://strands.pdc.kth.se/public/semanticMap_BHAM.tar.gz
tar -zxvf semanticMap_BHAM.tar.gz

Note that the folder takes 16 GB on disk. If you do not want to replace the already existing ~/.semanticMap data, you can use the downloaded folder directly; in that case, change the data_path argument of tsc_quasimodo.launch to /path/to/Birmingham/semanticMap. If you want to search the data collected on the robot, keep the default /home/strands/.semanticMap.
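
For example (the path below is a placeholder for wherever you extracted the archive):

DISPLAY=:0 roslaunch tsc_bringup tsc_quasimodo.launch data_path:=/path/to/Birmingham/semanticMap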

If you want to add previously collected metric maps to the retrieval representation (as opposed to processing them as they are collected), you can also set the parameter add_previous_maps:=true; see the example below. Processing only has to be done once; if the maps have been processed previously (as is the case for the data on the server), they are loaded automatically. You will then have to wait until the processing of the previously collected maps has finished before collecting new metric maps.
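
For example, to process and load everything already stored in the default data folder at startup:

DISPLAY=:0 roslaunch tsc_bringup tsc_quasimodo.launch data_path:=/home/strands/.semanticMap add_previous_maps:=true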

Debug topics

  • /quasimodo_retrieval/visualization - An image showing the result of the retrieval component. The leftmost image shows the masked RGB image of the query object, and to the right are rendered views of the ten closest matches, represented as 3D surfel clouds (an easy way to view this topic is shown below).
  • /models/fused - The fused model of the additional view observation that is queried at the moment. Published as sensor_msgs/PointCloud2, displayed in relation to /map.
  • /retrieval_processing/segmentation_cloud - The segmentation of a metric map cloud, published as sensor_msgs/PointCloud2 after a sweep finishes.
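
The visualization image can, for example, be viewed with image_view (the point cloud topics can be added as displays in rviz):

rosrun image_view image_view image:=/quasimodo_retrieval/visualization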

Note

You can manually trigger a search for an object with additional views (i.e. without using the incremental object building framework) by starting

rosrun quasimodo_retrieval quasimodo_retrieve_observation

and then, in another terminal, publishing the path to the xml of the additional views:

rostopic pub /object_learning/learned_object_xml std_msgs/String "data: '/path/to/.semanticMap/201422/patrol_run_56/room_0/2016-Apr-22 14:58:33.536964_object_0.xml'"

You can also use soma to visualize the queries over time.

Original page: https://github.com/strands-project/g4s_deployment/blob/indigo-devel/tsc_bringup/README.md