Sensor Calibration Tool

For Autonomous Vehicles

The integration of multiple sensor types has become standard on most autonomous vehicles, but calibrating them consumes hours of valuable testing and development time. The Dataspeed Sensor Calibration Tool provides a set of utilities for setting up and calibrating a suite of lidars and cameras. This convenient tool reduces the hassle of sensor setup and lets the user spend more time on essential autonomy or data collection research.

Lidar Ground Plane Alignment

This calibration mode inputs the point cloud from a 3D lidar sensor, detects the ground plane in the cloud, and then adjusts the roll angle, pitch angle, and z offset of the transform from vehicle frame to lidar frame such that the ground plane is level and positioned at z = 0 in vehicle frame.
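Dataspeed does not document the fitting method, but this kind of alignment can be sketched in plain Python: fit the plane z = a·x + b·y + c to the lidar points by least squares, then derive roll, pitch, and z corrections from the fitted coefficients. All function names and sign conventions below are illustrative, not part of the tool.

```python
import math

def fit_ground_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to (x, y, z) points.
    Returns (a, b, c); solves the 3x3 normal equations by Cramer's rule."""
    sxx = sxy = syy = sx = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y; n += 1.0
        sxz += x * z; syz += y * z; sz += z
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    out = []
    for col in range(3):          # Cramer's rule, one column at a time
        m = [row[:] for row in A]
        for r in range(3):
            m[r][col] = rhs[r]
        out.append(det3(m) / d)
    return tuple(out)

def ground_alignment_correction(a, b, c):
    """Roll/pitch/z corrections that level the fitted plane at z = 0
    (illustrative convention: pitch about y, roll about x)."""
    pitch = math.atan(a)   # tilt of the plane along x
    roll = -math.atan(b)   # tilt of the plane along y
    z_offset = -c          # shift so the plane sits at z = 0
    return roll, pitch, z_offset
```

A real implementation would first separate ground points from obstacles (e.g. with RANSAC) before fitting, rather than fitting all returns.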

[Figures: lidar point clouds before and after ground plane alignment]

Lidar-Lidar Alignment

This calibration mode inputs point clouds from two 3D lidar sensors and computes the translation and orientation between the sensors’ coordinate frames. It does this by comparing distinguishing features in the overlapping point clouds.
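The tool's feature-matching method is not published; the standard family of techniques here is point-cloud registration (e.g. ICP), whose inner step estimates a rigid transform from matched point pairs. A minimal 2D version of that closed-form step, with illustrative names:

```python
import math

def estimate_rigid_2d(src, dst):
    """Closed-form 2D rigid transform (theta, tx, ty) mapping src onto dst,
    given matched point pairs -- the core step inside one ICP iteration."""
    n = float(len(src))
    cxs = sum(p[0] for p in src) / n; cys = sum(p[1] for p in src) / n
    cxd = sum(p[0] for p in dst) / n; cyd = sum(p[1] for p in dst) / n
    s_cos = s_sin = 0.0  # cross-covariance of the centered point sets
    for (xs, ys), (xd, yd) in zip(src, dst):
        ax, ay = xs - cxs, ys - cys
        bx, by = xd - cxd, yd - cyd
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # Translation moves the rotated source centroid onto the target centroid.
    tx = cxd - (math.cos(theta) * cxs - math.sin(theta) * cys)
    ty = cyd - (math.sin(theta) * cxs + math.cos(theta) * cys)
    return theta, tx, ty
```

A full 3D registration repeats this estimate while re-matching nearest neighbors until the alignment converges; the 2D case keeps the closed form easy to follow.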

Camera-Lidar Alignment

This calibration mode inputs a camera image and a point cloud from a 3D lidar sensor and computes the translation and orientation between the sensors’ coordinate frames. It does this by detecting edges and corners of a rectangular target board in both the camera image and the lidar point cloud and comparing multiple samples.
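The page does not spell out how samples are compared, but camera-lidar calibration of this shape is typically scored by reprojection error: project the target corners seen by the lidar into the image and measure the pixel distance to the corners detected in the camera image. A hedged pinhole-model sketch (no lens distortion, illustrative names):

```python
import math

def project_point(p_cam, fx, fy, cx, cy):
    """Project a 3D point in camera frame (x right, y down, z forward)
    to pixel coordinates with an undistorted pinhole model."""
    x, y, z = p_cam
    return (fx * x / z + cx, fy * y / z + cy)

def reprojection_error(corners_3d, corners_px, fx, fy, cx, cy):
    """Mean pixel distance between projected lidar target corners and the
    corners detected in the image -- the quantity a camera-lidar
    calibration drives down by adjusting the extrinsic transform."""
    total = 0.0
    for p3, (u, v) in zip(corners_3d, corners_px):
        pu, pv = project_point(p3, fx, fy, cx, cy)
        total += math.hypot(pu - u, pv - v)
    return total / len(corners_3d)
```

Comparing multiple board poses, as the tool does, constrains all six degrees of freedom of the transform; a single sample leaves the solution ambiguous.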

[Figures: before and after camera-lidar alignment]

[Figures: bad vs. good camera-camera extrinsics]

Camera-Camera Extrinsics Overlay

The camera validation GUI can be used to validate the extrinsics between multiple cameras with overlapping fields of view.
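One simple way to see what such an overlay checks: if each camera's pose in the vehicle frame is known, the relative extrinsics between two cameras follow by composing the transforms, and a shared landmark should then project consistently into both images. A small pure-Python sketch of that composition using 4x4 homogeneous transforms (illustrative names; not the GUI's actual code):

```python
def mat_mul(a, b):
    """Product of two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_se3(t):
    """Invert a rigid 4x4 transform: inv([R t]) = [R^T  -R^T t]."""
    r_t = [[t[j][i] for j in range(3)] for i in range(3)]
    ti = [-sum(r_t[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r_t[0] + [ti[0]], r_t[1] + [ti[1]], r_t[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

def relative_extrinsics(t_vehicle_cam1, t_vehicle_cam2):
    """Transform taking points from cam1's frame into cam2's frame."""
    return mat_mul(invert_se3(t_vehicle_cam2), t_vehicle_cam1)
```

If the stored extrinsics are wrong, points mapped through this relative transform land in the wrong place in the second image, which is exactly the misalignment the overlay makes visible.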
