2.2.1.1. How to calibrate the camera

The current system relies on identifying cones in camera images; we use the ZED 2i stereo camera. In the next step, the local mapping module calculates where each identified cone is in the real world.

The local mapping module needs an intrinsic and an extrinsic matrix to compute this information.

The intrinsic camera matrix is calculated by the ZED camera itself and published as a ROS message. The extrinsic camera matrix has to be calculated by an alternative method. We chose [coming soon].
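To make the roles of the two matrices concrete, here is a minimal sketch of the pinhole model that local mapping inverts: a point in vehicle coordinates is moved into camera coordinates by the extrinsic matrix [R | t] and then mapped to a pixel by the intrinsic matrix K. All numbers below are illustrative, not the real ZED 2i calibration values.

```python
def mat_vec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, v)) for row in m]

# Intrinsic matrix K (published by the ZED camera): focal lengths
# and principal point, in pixels. Made-up values.
K = [[700.0,   0.0, 640.0],
     [  0.0, 700.0, 360.0],
     [  0.0,   0.0,   1.0]]

# Extrinsics (what this calibration produces): rotation from the
# vehicle frame (x forward, y left, z up) into the camera frame
# (x right, y down, z forward), with a hypothetical camera mounted
# 1 m above the rear axle.
R = [[0.0, -1.0,  0.0],
     [0.0,  0.0, -1.0],
     [1.0,  0.0,  0.0]]
t = [0.0, 1.0, 0.0]

def project(p_vehicle):
    """Project a 3D point in vehicle coordinates to a pixel (u, v)."""
    p_cam = [a + b for a, b in zip(mat_vec(R, p_vehicle), t)]
    u, v, w = mat_vec(K, p_cam)
    return u / w, v / w

# A cone 10 m ahead and 2 m to the left, standing on the ground:
print(project([10.0, 2.0, 0.0]))  # (500.0, 430.0)
```

Local mapping runs this relation backwards: given the pixel position of a detected cone (plus stereo depth), it recovers the cone's position in the vehicle frame.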

This process used to be an elaborate manual task; since October 2022 we have a graphical user interface for the camera calibration.

In the future there should be a new graphical user interface for a new, more precise calibration method.

2.2.1.1.1. calibrate_extrinsic_matrix

calibrate_extrinsic_matrix [OPTIONS] INPUT_ROSBAG

Options

-s, --start <start>

relative start time of the new bag

-d, --duration <duration>

maximal duration of the new bag

-e, --end <end>

relative end time of the new bag

-l, --distance <distance>

Distance between the camera and the beginning of the calibration board

-cms, --calibration-matrices-subdirectory <calibration_matrices_subdirectory>

subdirectory for camera calibration matrices

-c, --camera <camera>

camera to calibrate

Options:

left | right | both

-ds, --data-source <data_source>

Data source for calibration: standardized calibration board, manual points, or cones from ground truth

Options:

board | manual | ground_truth

-tl, --track-layout <track_layout>

Track layout the ground truth is saved under

-td, --test-day <test_day>

Test day the ground truth is saved under

Arguments

INPUT_ROSBAG

Required argument
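The option table above can also be read as ordinary CLI code. The following is a hypothetical sketch using Python's argparse (the real tool may be built with a different framework, e.g. click), just to show how the options, their choices, and the required argument fit together:

```python
import argparse

# Hypothetical reconstruction of the option table above; not the
# actual implementation of calibrate_extrinsic_matrix.
parser = argparse.ArgumentParser(prog="calibrate_extrinsic_matrix")
parser.add_argument("input_rosbag", help="rosbag to read images from")
parser.add_argument("-s", "--start", type=float,
                    help="relative start time of the new bag")
parser.add_argument("-d", "--duration", type=float,
                    help="maximal duration of the new bag")
parser.add_argument("-e", "--end", type=float,
                    help="relative end time of the new bag")
parser.add_argument("-l", "--distance", type=float,
                    help="distance between camera and calibration board")
parser.add_argument("-cms", "--calibration-matrices-subdirectory",
                    help="subdirectory for camera calibration matrices")
parser.add_argument("-c", "--camera", choices=["left", "right", "both"],
                    help="camera to calibrate")
parser.add_argument("-ds", "--data-source",
                    choices=["board", "manual", "ground_truth"],
                    help="data source for calibration")
parser.add_argument("-tl", "--track-layout",
                    help="track layout the ground truth is saved under")
parser.add_argument("-td", "--test-day",
                    help="test day the ground truth is saved under")

# Example invocation, mirroring the instructions below:
args = parser.parse_args(["calibration1.bag", "-ds", "board", "-c", "left"])
print(args.data_source, args.camera)  # board left
```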

2.2.1.1.2. Prerequisites

  1. Connect a display to the ACU, or

  2. Connect to the ACU via Ethernet or WiFi and open a VNC session to the ACU

    For data source board:

    1. Lay the calibration board in front of the car in such a way that the middle crosses align with the x axis of the vehicle.

      1. The weird corner of the blue polygon must be in the upper right.

    2. Make sure that all crosses are visible in the camera image:

      1. Open Foxglove on your local machine or on the ACU itself (display or VNC)

      2. Open a new data source: Open connection to live robot: ROS 1: localhost or <ACU_IP> (based on the decision in the previous step)

      3. Open an image panel and check both cameras.

      4. You may have to block direct sunlight with some objects.

    3. Measure the projected distance between the camera and the lowest cross.

    For data source ground_truth:

    1. Mark a big (e.g. 15 x 15 m or 20 x 20 m) perfect square with big orange cones.

    2. Place yellow, blue and small orange cones inside this square so that they are evenly distributed.

      Yellow and blue cones will be used for calibrating the camera (use registration for those); orange cones are used to check the calibration (don’t use registration for those).

    3. Place the vehicle approximately in the middle of one side of the square. Take a picture with a drone.

    4. Create a track from this picture following Creating Tracks from drone footage. Specify a useful test day and calibration as the track layout. No manual map is needed.

2.2.1.1.3. Instructions

  1. Follow the basic instructions in Using GUIs inside an AS ROS container

  2. Record a rosbag with some images from both cameras:

    roslaunch local_mapping camera_calibration.launch
    

    Cancel the recording after 3-5 seconds by pressing Ctrl + C

  3. Rename the new rosbag under ./rosbags/ to calibration<number>.bag.

  4. Calibrate the camera via the graphical user interface

    If you are using a VNC session, don’t forget to specify which display the GUI should use: export DISPLAY=:1

    For data source board:

    1. Start the GUI (see above for options):

      calibrate_extrinsic_matrix /workspace/as_ros/rosbags/calibration<number>.bag -ds board
      
    2. If the terminal outputs that it did not find any image, something is wrong. You have to investigate yourself :o

    3. If the shown image is black, you can restart the graphical user interface with the option -s <delay in seconds>

    4. Click on each cross from left to right, top to bottom. You can delete a point by pressing <backspace>.

    5. After clicking on nine crosses, the blue control polygon is overlaid. The terminal will ask you whether this looks good or whether you want to try again.

    6. You have to do the previous two steps for the other camera as well.

    7. When you are finished, please commit the matrices at some point in the near future.
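    Conceptually, the nine clicks build the 2D–3D correspondences that the extrinsic solver needs: each clicked pixel is paired, in click order, with the known position of the corresponding cross on the board. A toy sketch (grid spacing, measured distance, and pixel values are all made up):

```python
SPACING = 0.5        # assumed grid spacing between crosses, in metres
DIST_TO_FIRST = 1.2  # measured distance from camera to the lowest cross

# 3x3 grid of crosses lying on the ground (z = 0) in the vehicle
# frame (x forward, y left), ordered to match the required click
# order: left to right, top (farthest) to bottom (nearest).
board_points = [
    (DIST_TO_FIRST + row * SPACING, SPACING - col * SPACING, 0.0)
    for row in (2, 1, 0)
    for col in range(3)
]

# Pixel positions of the nine clicks, in the same order (made up):
clicked_pixels = [(300 + col * 220, 200 + row * 150)
                  for row in range(3) for col in range(3)]

# These (pixel, world) pairs are what a perspective-n-point solver
# (e.g. OpenCV's solvePnP) consumes to estimate R and t.
correspondences = list(zip(clicked_pixels, board_points))
print(len(correspondences))  # 9
```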

    For data source ground_truth:

    1. Start the GUI (see above for options):

      calibrate_extrinsic_matrix /workspace/as_ros/rosbags/calibration<number>.bag -ds ground_truth -tl calibration -td <test_day>
      
    2. Enter the projected distance between the rear axle and the camera in x direction in mm (emma: 590 mm)

    3. If the terminal outputs that it did not find any image, something is wrong. You have to investigate yourself :o

    4. If the shown image is black, you can restart the graphical user interface with the option -s <delay in seconds>

    5. Click on the tip of every yellow and blue cone in the same order as when creating the track.

    6. After you have clicked on each cone, the small orange cones should be visualized as orange crosses.

    7. You have to do the previous two steps for the other camera as well.

    8. When you are finished, please commit the matrices at some point in the near future.
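Besides the visual check with the overlaid orange crosses, the quality of a calibration can be judged numerically, e.g. as the root-mean-square pixel distance between the projected ground-truth control cones and their observed positions in the image. This helper is hypothetical and not part of the tool:

```python
import math

def rms_pixel_error(projected, observed):
    """RMS distance in pixels between corresponding (u, v) points."""
    assert len(projected) == len(observed) > 0
    squared = [(u1 - u2) ** 2 + (v1 - v2) ** 2
               for (u1, v1), (u2, v2) in zip(projected, observed)]
    return math.sqrt(sum(squared) / len(squared))

# Two control cones: one re-projects 5 px away, one matches exactly.
print(rms_pixel_error([(100, 200), (300, 400)],
                      [(103, 204), (300, 400)]))  # ~3.54
```

An error of a few pixels for the orange control cones usually indicates a usable calibration; large or systematically growing errors with distance suggest the extrinsic matrix should be re-estimated.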