7. Onboarding

The idea of this onboarding page is to give new members different exercises over multiple weeks so that they get to know the Autonomous System and how to work with it.

There are also tutorials for learning supporting software such as Git, Docker and ROS.

7.1. External Tutorials

7.1.1. Git

Harvard Git Lecture: Github, Commits, Merge Conflicts and Branching

Git submodule tutorial

7.1.2. Docker

tbc

7.1.3. ROS

e.g. ROS tutorial series: Installation, Topics, Messages, Nodes, …
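
If you are new to ROS, the talker example from the official tutorial series is a good first exercise. For reference, a minimal rospy publisher node looks roughly like this (the topic name, message type and rate are just the tutorial defaults, not anything AS-specific):

  #!/usr/bin/env python
  import rospy
  from std_msgs.msg import String

  def talker():
      # Publish a string on the "chatter" topic at 10 Hz
      pub = rospy.Publisher('chatter', String, queue_size=10)
      rospy.init_node('talker', anonymous=True)
      rate = rospy.Rate(10)
      while not rospy.is_shutdown():
          pub.publish(String(data='hello'))
          rate.sleep()

  if __name__ == '__main__':
      try:
          talker()
      except rospy.ROSInterruptException:
          pass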

7.2. Exercises

7.2.1. First week

In the first week, you have to clone the git repository to your host system (Windows, Linux, Mac, …). Afterwards, you will set up the development environment by building the AS Docker image and running an AS image as a container. Subsequently, you will set up our standardized IDE (VSCode). Finally, you will run our pipeline on a recorded test run. To do so, you first have to migrate the recorded rosbag; afterwards, you can visualize the system output.

It is recommended that you read at least the following pages to complete the exercises:

  1. Set up the Docker environment

    1. Clone the git repository

    2. Build the docker image

    3. Run an AS container

    4. Build the catkin workspace initially

  2. Set up the IDE

    1. Install VSCode

    2. Install Extensions

    3. Connect to AS container

  3. Run the pipeline

    1. Download the recorded rosbag for run 370

    2. Migrate the rosbag by deleting the output of modules you can run yourself (see the sketch after this list)

      1. Depending on whether you own an NVIDIA GPU, delete the output from Perception or from SLAM

      2. Delete any transforms that may have been recorded

      3. Specify additional necessary options such as fixing or migrating strategies

    3. Debug the pipeline

      1. Set the vehicle accordingly (eva)

      2. Disable modules you cannot run (e.g. Control or Perception)

      3. Disable preloading and enable the GPS position and heading sensor for SLAM

    4. Visualize the system outputs

    5. View the visualization
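
The migration itself is presumably done with dedicated tooling (hence the fixing and migrating strategies mentioned above), but as an illustration of what deleting module output means, topics can also be filtered out of a bag with the rosbag Python API. The bag and topic names below are only examples; which topics you drop depends on the modules you will run yourself:

  import rosbag

  # Copy run 370 into a new bag, dropping the topics whose output we want to
  # regenerate ourselves (topic names here are only examples).
  DROP = ('/tf', '/tf_static', '/perception/cones')

  with rosbag.Bag('run_370_migrated.bag', 'w') as out:
      for topic, msg, t in rosbag.Bag('run_370.bag').read_messages():
          if topic not in DROP:
              out.write(topic, msg, t)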

7.2.2. Second week

In the second week, you will learn how to create tracks from drone footage and how to calibrate cameras.

It is recommended that you read at least the following pages to complete the exercises:

  1. Create a track from drone footage

    1. Download the drone footage for run 592 and for the calibration, as well as the calibration rosbag

    2. Create the track for run 592 and for the calibration with the track creator

  2. Calibrate both cameras with the calibration rosbag (see the calibration sketch after this list)

  3. Generate an acceleration map according to the track used for run 592

  4. Run the pipeline for the rosbag of run 592

    1. Set the vehicle accordingly (emma)

    2. Preload the generated map

    3. Align the map with reasonable parameters

    4. Disable the GPS position and heading sensor for SLAM

  5. Correct at least some mistakes you found in the documentation

    1. Create a custom branch for the corrections

    2. Create a pull request for the corrections
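
The calibration exercise presumably uses our own tooling; as background, intrinsic calibration from checkerboard views boils down to a handful of OpenCV calls. The sketch below assumes you have exported checkerboard frames from the calibration rosbag as images; the board dimensions, square size and paths are assumptions, not our actual values:

  import glob
  import cv2
  import numpy as np

  BOARD = (7, 5)        # inner corner count of the checkerboard (assumption)
  SQUARE_SIZE = 0.04    # edge length in metres (assumption)

  # 3D reference points of one board view (z = 0 plane)
  objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
  objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_SIZE

  obj_points, img_points = [], []
  for path in glob.glob('calib_frames/*.png'):  # frames exported from the rosbag
      gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
      found, corners = cv2.findChessboardCorners(gray, BOARD)
      if found:
          obj_points.append(objp)
          img_points.append(corners)

  ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
      obj_points, img_points, gray.shape[::-1], None, None)
  print('camera matrix:\n', K)
  print('distortion coefficients:', dist.ravel())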

7.2.3. Third week

In the third and last week, you will learn how to debug and profile a ROS module. You will also visualize the output of a previous pipeline simulation on top of drone footage.

  1. Debug the ROS module for which you are responsible

    1. You do not have to fix any bugs, but add at least one breakpoint and make sure that the program stops there (see the debugging sketch after this list).

  2. Profile and analyze the ROS module for which you are responsible (see the profiling sketch after this list)

    1. Create an SVG image via gprof2dot

    2. View the profile stats in KCachegrind

  3. Visualize the system output of run 592, which you simulated in the previous week, onto the corresponding drone footage
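
How you set the breakpoint depends on the module's language: for a C++ node you would attach the VSCode debugger or gdb, while for a Python node one option is to let the node wait for VSCode via debugpy. A minimal sketch, assuming a Python node; the port and node name are placeholders, and the port must be reachable from your host:

  import debugpy
  import rospy

  # Let a VSCode debug client attach to this node inside the AS container.
  debugpy.listen(("0.0.0.0", 5678))   # example port, must be forwarded
  debugpy.wait_for_client()           # block until VSCode is attached
  debugpy.breakpoint()                # execution stops here in the debugger

  rospy.init_node("my_module")        # hypothetical node name
  rospy.loginfo("breakpoint was hit, node keeps running")
  rospy.spin()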
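
gprof2dot accepts several profile formats. For a C++ node you would feed it gprof or callgrind output (which KCachegrind can also open directly); for a Python node, one alternative is the built-in cProfile module, whose pstats dump gprof2dot can turn into a call-graph SVG. File and function names below are examples only:

  import cProfile
  import pstats

  def expensive_callback():
      # placeholder for the code path you actually want to analyze
      sum(i * i for i in range(1_000_000))

  # Record a profile and dump it in pstats format
  cProfile.run('expensive_callback()', 'module.pstats')

  # Quick textual summary; for the SVG call graph run for example:
  #   gprof2dot -f pstats module.pstats | dot -Tsvg -o module.svg
  pstats.Stats('module.pstats').sort_stats('cumulative').print_stats(10)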