Software for Autonomy

Sensor Calibration

Dataspeed Inc. provides cutting-edge software solutions for autonomy, enabling seamless integration of autonomous systems across a variety of applications. A key part of this process is sensor calibration, which ensures that the sensors on autonomous vehicles or robotic systems are accurately aligned and synchronized for optimal performance.

Time Reduction

Dataspeed's Sensor Calibration Tool streamlines the extrinsic calibration process, cutting hours of setup time and enabling faster deployment and testing.

Enhanced Accuracy

With advanced calibration algorithms, the tool ensures that LiDAR, cameras, and radar are accurately calibrated, optimizing data collection for safer, more reliable autonomous systems.

Seamless Integration

Designed for easy integration with autonomous vehicles, the tool ensures smooth coordination between sensors, reducing compatibility issues and enhancing overall system performance.

Dataspeed Sensor Calibration

Ensuring Accuracy and Reliability in Sensor Performance

Sensor integration has become standard on most autonomous vehicles, but spending hours calibrating those sensors consumes valuable testing and development time. The Dataspeed Sensor Calibration Tool is designed to help users set up the extrinsic calibration of a suite of sensors on a vehicle. This convenient tool reduces the hassle of sensor setup and lets the user spend more time on essential autonomy and data collection research.

Lidar Ground Plane Alignment

This calibration mode inputs the point cloud from a 3D lidar sensor, detects the ground plane in the cloud, and then adjusts the roll angle, pitch angle, and z offset of the transform from vehicle frame to lidar frame such that the ground plane is level and positioned at z = 0 in vehicle frame.
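
As a rough illustration of how this kind of ground plane alignment can work, the sketch below fits a plane to a lidar cloud with RANSAC and derives the roll, pitch, and z corrections that would level it at z = 0. The function names, thresholds, and sign conventions are assumptions for the example, not Dataspeed's actual implementation.

```python
import numpy as np

def fit_ground_plane(points, max_iters=200, dist_thresh=0.05, seed=0):
    """RANSAC fit of a plane n . p + d = 0 to an (N, 3) lidar point cloud."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = 0, None
    for _ in range(max_iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-6:
            continue  # degenerate (collinear) sample
        normal /= norm
        d = -normal.dot(p1)
        inliers = np.count_nonzero(np.abs(points @ normal + d) < dist_thresh)
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (normal, d)
    return best_model

def ground_plane_correction(points):
    """Roll and pitch (radians) that level the detected ground, plus its z height."""
    normal, d = fit_ground_plane(points)
    if normal[2] < 0:
        normal, d = -normal, -d                    # make the normal point upward
    nx, ny, nz = normal
    roll = np.arctan2(ny, nz)                      # rotation about x that zeros the normal's y
    pitch = -np.arctan2(nx, np.hypot(ny, nz))      # rotation about y that zeros its x
    z_ground = -d / nz                             # ground height in the current frame
    # How these corrections are folded into the vehicle-to-lidar transform
    # depends on the frame conventions in use.
    return roll, pitch, z_ground
```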

Multi-Lidar Alignment

This calibration mode inputs point clouds from two 3D lidar sensors and computes the translation and orientation between the sensors’ coordinate frames. It does this by comparing distinguishing features in the overlapping point clouds.
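
A minimal sketch of this kind of two-lidar registration is shown below, using Open3D's point-to-point ICP as a stand-in for the tool's feature matching (which is not detailed here); the voxel size and correspondence distance are illustrative assumptions.

```python
import numpy as np
import open3d as o3d

def align_lidars(source_pts, target_pts, init_guess=np.eye(4), max_corr_dist=0.5):
    """Estimate the 4x4 transform mapping the source lidar frame into the target frame."""
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_pts))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_pts))
    # Downsample dense clouds so correspondence search stays fast.
    source = source.voxel_down_sample(voxel_size=0.1)
    target = target.voxel_down_sample(voxel_size=0.1)
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, init_guess,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # rotation + translation between the two sensors
```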

Lidar-Camera Overlay

This tool overlays a lidar point cloud on a camera image in real time according to the current extrinsic transform between the two sensors. This can be used to adjust and verify the transform.
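
The sketch below shows one way such an overlay can be rendered: transform the lidar points into the camera frame with the current extrinsic, project them through the pinhole model, and draw them on the image. The 4x4 extrinsic `T_cam_from_lidar`, the intrinsic matrix `K`, and the depth-based coloring are assumptions for the example.

```python
import numpy as np
import cv2

def overlay_lidar_on_image(image, points, T_cam_from_lidar, K):
    """Draw an (N, 3) lidar cloud on a camera image, colored by depth."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])    # homogeneous (N, 4)
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]                    # keep points in front of the camera
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                               # pinhole projection to pixels
    h, w = image.shape[:2]
    for (u, v), depth in zip(uv, pts_cam[:, 2]):
        if 0 <= u < w and 0 <= v < h:
            # Near points drawn red, far points green; misalignment is easy to spot.
            shade = int(np.clip(255 * depth / 50.0, 0, 255))
            cv2.circle(image, (int(u), int(v)), 2, (0, shade, 255 - shade), -1)
    return image
```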

Camera-Camera Overlay

This tool constructs a composite image that visualizes the overlap between two camera images according to the current extrinsic transform between the cameras. This can be used to adjust and verify the transform.
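
One simple way to build such a composite, sketched below, is to warp one camera's image into the other camera's view and blend the two. This uses a rotation-only homography H = K2 · R · K1⁻¹, a reasonable approximation when the scene is far away relative to the camera baseline; the matrix names and the blending choice are assumptions for the example. Adjusting the extrinsic until the ghosting in the overlap disappears is one way to verify the transform.

```python
import numpy as np
import cv2

def camera_overlap_composite(img1, img2, K1, K2, R_2_from_1, alpha=0.5):
    """Warp camera 1's image into camera 2's view and blend the overlap."""
    H = K2 @ R_2_from_1 @ np.linalg.inv(K1)       # rotation-only homography
    h, w = img2.shape[:2]
    warped = cv2.warpPerspective(img1, H, (w, h))
    # Equal-weight blend; a misaligned extrinsic shows up as ghosting in the overlap.
    return cv2.addWeighted(warped, alpha, img2, 1.0 - alpha, 0.0)
```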

Want to learn more?

Tell us more about your project