Localization in ISS

The ISS localization module ensures that the autonomous vehicle not only understands its position on the road and in its surroundings, but does so with high accuracy. This page outlines our current pipeline, from the individual sensors to the sensor fusion algorithms.

The table below lists the localization algorithms that are currently supported in ISS:

| Type | Inputs | Outputs | Algorithms | Source |
| --- | --- | --- | --- | --- |
| LiDAR odometry | Point cloud map; LiDAR scan | Estimated states | NDT [1] | ISS Sim |
| Encoder odometry | Wheel encoder / IMU data | Estimated states | Dead reckoning | ISS Sim |
| Fusion | Dynamic model; multiple state estimates | Optimal estimation | EKF [2] | ISS Sim |
| Laser scan matching | Grid map; odometry information; vehicle transformations; laser scan | Vehicle position (x, y) and yaw (θ) on the map | AMCL | ROS Navigation |
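As a concrete illustration of the encoder-odometry row above, here is a minimal dead-reckoning sketch in Python. It is a simple unicycle model with made-up inputs, not the ISS Sim implementation:

```python
# Minimal dead-reckoning sketch (illustrative, not the ISS Sim code):
# integrate wheel-encoder speed and IMU yaw rate to propagate (x, y, theta).
import math

def dead_reckon(pose, speed, yaw_rate, dt):
    """Propagate an (x, y, theta) pose one time step with a unicycle model."""
    x, y, theta = pose
    x += speed * math.cos(theta) * dt
    y += speed * math.sin(theta) * dt
    theta += yaw_rate * dt
    return (x, y, theta)

# Example: drive straight at 1 m/s for 2 s in 0.1 s steps.
pose = (0.0, 0.0, 0.0)
for _ in range(20):
    pose = dead_reckon(pose, speed=1.0, yaw_rate=0.0, dt=0.1)
print(pose)  # close to (2.0, 0.0, 0.0); errors accumulate without correction
```

Because the integration is open-loop, any bias in the speed or yaw-rate inputs grows without bound, which is why dead reckoning is fused with other sensors below.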

Individual Sensors

To achieve precise localization, ISS combines several sensors and algorithms:

  1. LiDAR:
    • Iterative Closest Point (ICP) [3]: this algorithm aligns two point clouds by iteratively finding the rigid transform that best maps one onto the other.
    • Normal Distributions Transform (NDT) [1]: this method represents point cloud data as a set of normal distributions, offering a robust and efficient means of registration.
  2. IMU (Inertial Measurement Unit):
    • Dead Reckoning: by integrating motion sensor data, dead reckoning provides a continuous estimate of the vehicle’s position. However, its accuracy degrades over time, so it requires correction from other sensors.
  3. GPS (Global Positioning System):
    • While GPS offers a global reference for positioning, its precision may not be sufficient for the tight tolerances of autonomous driving. Thus, GPS data is often fused with other sensor data to enhance accuracy.
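The ICP registration described above can be sketched in a few lines. The following is an illustrative 2-D point-to-point implementation with brute-force nearest-neighbour matching, not the ISS code (which operates on full 3-D LiDAR scans):

```python
# Illustrative 2-D point-to-point ICP (a sketch, not the ISS implementation).
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp(src, dst, iters=10):
    """Alternate nearest-neighbour matching with transform re-estimation."""
    cur = src.copy()
    for _ in range(iters):
        dists = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[dists.argmin(axis=1)]   # brute-force nearest neighbours
        R, t = best_fit_transform(cur, matched)
        cur = cur @ R.T + t
    return cur

# Example: a 5x5 grid of points, rotated by 0.03 rad and shifted, is re-aligned.
dst = np.array([[i, j] for i in range(5) for j in range(5)], dtype=float)
ang = 0.03
R = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
src = dst @ R.T + np.array([0.1, 0.2])
aligned = icp(src, dst)
print(np.abs(aligned - dst).max())  # near zero once the transform is recovered
```

Real implementations replace the brute-force matching with a k-d tree and add outlier rejection; the structure of the loop, however, is exactly the match-then-solve alternation described above.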

Sensor Fusion

The key to robust localization lies not only in the individual strengths of the sensors, but in how their outputs are combined:

  1. Filter-based Methods:
    • Recursive algorithms, such as the Kalman Filter and the Particle Filter [4], are well suited to real-time state estimation and prediction.
  2. Optimization-based Methods:
    • Holistic approaches like GraphSLAM [5] adjust and refine entire trajectories or maps, ensuring the highest accuracy in post-processing scenarios.
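As an illustration of the filter-based methods above, here is a minimal linear Kalman filter that fuses a constant-velocity motion model with noisy position fixes. All noise parameters are made up for the example; the ISS pipeline itself uses an EKF [2] with a vehicle dynamic model:

```python
# Minimal 1-D Kalman filter sketch (hypothetical values, not the ISS EKF):
# fuse a constant-velocity prediction with noisy GPS-like position fixes.
import numpy as np

def kf_step(x, P, z, dt=0.1, q=0.01, r=0.5):
    """One predict/update cycle for the state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    H = np.array([[1.0, 0.0]])              # only position is measured
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update.
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: true motion at 1 m/s, position fixes corrupted by noise (std 0.5 m).
rng = np.random.default_rng(42)
x, P = np.zeros(2), np.eye(2)
for k in range(100):
    true_pos = 1.0 * 0.1 * (k + 1)
    z = np.array([true_pos + rng.normal(0.0, 0.5)])
    x, P = kf_step(x, P, z)
print(x)  # position near 10.0, velocity near 1.0
```

The same predict/update structure carries over to the EKF, where F and H are replaced by Jacobians of the nonlinear vehicle model, and to multi-sensor fusion, where each sensor contributes its own update step.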

Implementation Roadmap

  1. Maintain LiDAR (ICP, NDT), IMU (Dead Reckoning), GPS, and filter-based fusion.
  2. Study optimization-based fusion: GraphSLAM, Bundle Adjustment, Pose Graph Optimization.
  3. Gather and process datasets for optimization methods.
  4. Integrate optimization-based fusion into the current pipeline.

References

  1. Biber P, Straßer W. The normal distributions transform: A new approach to laser scan matching. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2003, 3: 2743-2748.

  2. Kong F, Chen Y, Xie J, et al. Mobile Robot Localization Based on Extended Kalman Filter. 6th World Congress on Intelligent Control and Automation (WCICA), IEEE, 2006: 9242-9246. 

  3. Besl P J, McKay N D. Method for registration of 3-D shapes. Sensor fusion IV: control paradigms and data structures. Spie, 1992, 1611: 586-606. 

  4. Thrun S. Probabilistic robotics. Communications of the ACM, 2002, 45(3): 52-57. 

  5. Shan T, Englot B, Meyers D, et al. Lio-sam: Tightly-coupled LiDAR inertial odometry via smoothing and mapping. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2020: 5135-5142.