Laser ranging plus GNSS

May 4, 2016

Context-dependent scan matching for aided navigation

By Jyh-Ching Juang, Shang-Lin Yu and Shun-Hung Chen


Context-dependent scan matching for aided navigation — finding the rotation and translation that best align two consecutive scans — provides laser-ranging data that can be blended into a GNSS navigation system. A quality index based on analysis of intra-frame point clouds assesses the scan context, accounting for variations in feature richness, to yield a robust aided navigation solution.

For robust and autonomous navigation, many different sensors have been incorporated and, indeed, fused to form a navigation suite that typically includes a GNSS receiver, inertial measurement unit, vision sensor, laser rangefinder, odometer and others. Recently, driven by the goal of achieving autonomous driving, laser range data and image data have been widely adopted in establishing vehicle safety and autonomy functions. Laser range data can facilitate navigation and guidance. Through the use of scan matching, vehicle motion can be detected and used in dead reckoning. A map of the vehicle's surroundings can also be built from point clouds, so that a feasible path can be generated for obstacle avoidance and vehicle guidance. To some extent, image data can be exploited in a similar manner: visual odometry techniques estimate the relative motion between two consecutive images for dead-reckoning navigation.

This article addresses a limitation in scan matching for vehicular navigation and proposes a context-dependent scheme to account for the variation of the richness of features in scan-matching-based navigation. Environmental context in terms of the richness of features is known to affect the quality of the resulting navigation performance. Thus, in scan matching, we seek to establish a quality index to quantify the quality of the resulting estimates of rotation and translation. In this manner, after fusion with other sensors, a robust positioning solution can be obtained.

Here, we briefly review the scan-matching technique and discuss the aforementioned limitation using a real-world example. We then investigate a context-dependent weighting concept, and the entropy of a scan is used to quantify the richness of its features. We find that a scan with low entropy may be prone to improper registration and an erroneous navigation result. Thus, a weighting is assigned to the scan-matching result for integrated navigation processing. To verify and demonstrate the proposed context-dependent weighting approach, the method is implemented and tested in a vehicle. The result verifies that the proposed scheme can indeed avoid improper registration and lead to robust navigation performance.

Scan Matching

Scan matching is an enabling technique for vehicle navigation, map building and obstacle avoidance using the point clouds produced by laser-ranging devices. Scan matching finds the rotation and translation that best align two consecutive scans. Given two point sets {p_n, n = 1, 2, …, N} and {q_m, m = 1, 2, …, M} at two consecutive instants, the scan-matching problem is to determine a correspondence n → m(n) for the registration of the two scans, together with a rotation matrix R and translation (shift) vector s, such that the following objective function is minimized:

$$E(R, s) = \sum_{n=1}^{N} \left\| q_{m(n)} - (R\,p_n + s) \right\|^2 \tag{1}$$

Once the mapping m(n) is determined, the optimization of (1) can be solved analytically. The determination of the mapping from n to m(n) is typically accomplished with an iterative method. This class of methods is termed iterative closest point (ICP), in which the mapping m(n) is determined by searching for the closest point in the target point cloud. Many variations of ICP exist, using a different objective function for minimization, point-to-plane matching, the removal of boundary and/or low-quality correspondences, and so forth. By repeating the scan-matching process, the rotation matrices and translation vectors can be determined and used in the dead-reckoning navigation process to estimate the position and attitude of the vehicle. In robotics and autonomous vehicles, scan matching is typically integrated with the map-building process for simultaneous localization and mapping (SLAM).
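To make the procedure concrete, the following Python sketch (using NumPy and SciPy) shows one plain ICP iteration and its outer loop. The nearest-neighbor search and the SVD-based closed-form solution of (1) are generic choices for illustration, not the authors' specific implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(P, Q):
    """One ICP iteration: match each point of the source scan P (N x 3)
    to its nearest neighbor in the target scan Q (M x 3), then solve for
    the rotation R and translation s minimizing (1) in closed form."""
    # Correspondence n -> m(n): closest point in the target cloud.
    tree = cKDTree(Q)
    _, idx = tree.query(P)
    Qm = Q[idx]

    # Closed-form (SVD/Kabsch) alignment of the matched pairs.
    p_bar, q_bar = P.mean(axis=0), Qm.mean(axis=0)
    H = (P - p_bar).T @ (Qm - q_bar)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T          # reflection-corrected rotation
    s = q_bar - R @ p_bar

    err = np.linalg.norm(Qm - (P @ R.T + s), axis=1).sum()
    return R, s, err

def icp(P, Q, iters=30, tol=1e-6):
    """Repeat matching and alignment until the error stops improving."""
    R_tot, s_tot, prev_err = np.eye(3), np.zeros(3), np.inf
    for _ in range(iters):
        R, s, err = icp_step(P @ R_tot.T + s_tot, Q)
        R_tot, s_tot = R @ R_tot, R @ s_tot + s   # compose increments
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_tot, s_tot, err
```

Point-to-plane variants replace the point-to-point residual with its projection onto local surface normals, which often converges faster in structured environments.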

Figure 1 depicts a representative result when the scan-matching technique is used in SLAM. In the figure, the vehicle moves from the bottom to the top. As the vehicle moves, the laser rangefinder collects measurements for the determination of the vehicle's pose and the mapping of the environment. The location of the vehicle can be estimated (in green) and the environment can be mapped (in blue) using the scan-matching and filtering techniques. However, as also depicted in the figure, as the vehicle moves to the end of the corridor, the point clouds obtained from the laser rangefinder (in red) are constrained, and the change in the pose of the vehicle cannot be accurately determined.

Figure 1. Representative SLAM result.

Figure 2 shows the original scans at two consecutive instants (in blue and gray, respectively) and the matched scan after the scan-matching process (in red) when the vehicle moves along the corridor.

Figure 2. Scan-matching result 1.

At this point, the laser rangefinder obtains measurements that are rich in features. The rotation and translation of the vehicle can be estimated with an acceptable level of accuracy, and the vehicle can be located. In this example, the translation vector is found to be s = [11.07 0.50 −0.58]^T mm, and the minimal error of the objective function is 3.47. When the vehicle moves to the end of the corridor, the scans at two consecutive instants, together with the matched scan, are depicted in Figure 3.

Figure 3. Scan-matching result 2.

In this case, only the end wall is observed by the laser scanner, and the determination of the rotation and translation based on scan matching is subject to errors due to the lack of features. Indeed, by applying the scan-matching technique, the translation vector is found to be s = [9.18 −2.84 13.22]^T, which is obviously incorrect in the z-axis component. Also, the minimal error of the objective function is 3.20, which is smaller than the error in Figure 2. Thus, the error may not provide a fair assessment of the scan matching, primarily because the error in registration is not taken into account in the objective function (1). In short, a lack of features in the environment may induce improper registration and lead to navigation error.

To address this limitation, several methods can be adopted. One can resort to variations of the scan-matching technique, for example, by using feature extraction and matching. Blending with other sensors can also be employed; in this case, the vehicle can be equipped with gyros to give information on the change of attitude so that the translation can be better estimated. This research project addresses the issue by using a context-dependent weighting to quantify the scan-matching results.

Context-Dependent Weighting

Scan matching investigates the relationship between two consecutive scans, that is, the inter-frame characteristics. However, as discussed, the quality of the scan-matching result depends on the richness of features in the scan, which is revealed by examining the intra-frame characteristics. Given a scan in 2D or 3D, quality indices can be established to assess these characteristics. For example, principal component analysis (PCA) is a widely applied technique to quantify a scan and to obtain normal vectors in a polygonal environment. For vehicle navigation in an outdoor environment, the PCA approach may be limited. Here, we propose the use of entropy to assess the complexity of the environment captured by a scan (or image).
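As a point of comparison, a generic PCA-based index can be sketched as follows; this particular eigenvalue-ratio measure is an illustrative assumption, not the metric adopted in this work.

```python
import numpy as np

def pca_quality_index(scan_xy):
    """Illustrative PCA-based scan quality index: the ratio of the
    smallest to the largest eigenvalue of the point-cloud covariance.
    A corridor-end scan dominated by a single wall direction yields a
    ratio near zero, flagging a feature-poor frame."""
    centered = scan_xy - scan_xy.mean(axis=0)   # scan_xy: (N, 2) array
    cov = centered.T @ centered / len(scan_xy)
    eigvals = np.linalg.eigvalsh(cov)           # ascending order
    return eigvals[0] / (eigvals[-1] + 1e-12)
```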

Given a discrete distribution over K outcomes, the entropy is defined as

$$H = -\sum_{k=1}^{K} p_k \log p_k, \tag{2}$$

where p_k stands for the probability of the k-th outcome. The entropy is a measure that can be used to probe the randomness of the distribution. As each probability is bounded by 1, the entropy in (2) ranges between 0 and log K.

To assess the entropy of a scan, which is characterized in terms of a combination of angle and range, the scan is converted through a kernel function to become a density-based map. Several different kernel functions can be used. With the density-based scan, the histogram can be formed to obtain an estimate of the probabilities and, consequently, (2) is used to evaluate the entropy.
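A minimal sketch of this entropy evaluation is given below; the Gaussian kernel, grid resolution and number of histogram bins are assumed parameters for illustration and are not taken from the article.

```python
import numpy as np

def scan_entropy(points_xy, grid=0.5, bandwidth=1.0, bins=32):
    """Sketch of the entropy evaluation described in the text:
    1) spread each scan point onto a grid with a Gaussian kernel to
       form a density-based map,
    2) histogram the density values to estimate the probabilities p_k,
    3) evaluate equation (2)."""
    xs = np.arange(points_xy[:, 0].min(), points_xy[:, 0].max() + grid, grid)
    ys = np.arange(points_xy[:, 1].min(), points_xy[:, 1].max() + grid, grid)
    gx, gy = np.meshgrid(xs, ys)
    density = np.zeros_like(gx)
    for px, py in points_xy:
        density += np.exp(-((gx - px) ** 2 + (gy - py) ** 2)
                          / (2 * bandwidth ** 2))

    counts, _ = np.histogram(density, bins=bins)
    p = counts / counts.sum()      # probability estimates p_k
    p = p[p > 0]                   # ignore empty bins
    return -(p * np.log(p)).sum()  # entropy, eq. (2)
```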

Figure 4 and Figure 5 show the original scan and the corresponding density-based scan, respectively. The entropy of the scan in Figure 4 is evaluated to be 1.17. In contrast, the scan in Figure 6 is found to have an entropy of 0.86. Note that the scan in Figure 6 is limited in terms of its features, leading to a smaller entropy.

Figure 4. A representative laser range measurement.

Figure 5. A density-based scan.

Figure 6. Another scan.

By evaluating the entropy of the scan, the scan-matching result can be quantified. A weighting can then be assigned as a function of the entropy for integration with other sensors in the integrated navigation system. A limitation of using laser scan data for the assessment of entropy is the need to convert each scan to its corresponding density-based map. In vehicular navigation, a camera is often mounted together with the laser rangefinder, so it is possible to use the image data from the camera for the assessment of entropy instead.

Figure 7 depicts the navigation system design when the context-dependent weighting is used. The navigation suite uses the laser rangefinder, camera and other navigation sensors to estimate the position, velocity and attitude of the vehicle. In this approach, the reference scan is matched with the current scan using the scan-matching technique to produce estimates of the rotation and translation. In the meantime, the current scan is overlaid on the image obtained from the camera. The region of interest, which is the part of the image that covers the scan points, is extracted, and its entropy is evaluated. The entropy then serves as an indicator for adjusting the weighting of the rotation and translation. The advantage of using image data is the saving in computational complexity. A potential limitation is that the entropy may be sensitive to variations in grayscale or RGB values, which may affect the result.
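One plausible realization of this weighting, sketched below under assumed threshold values (not taken from the article), maps the entropy to a weight in [0, 1] and inflates the measurement covariance of the scan-matching increment accordingly before it enters the fusion filter.

```python
import numpy as np

def entropy_weight(entropy, e_low=0.9, e_high=1.2):
    """Map scan/image entropy to a weight in [0, 1]; the thresholds are
    illustrative assumptions. Low-entropy (feature-poor) frames receive
    a small weight."""
    return float(np.clip((entropy - e_low) / (e_high - e_low), 0.0, 1.0))

def weighted_measurement_cov(base_cov, entropy, min_weight=1e-3):
    """Inflate the measurement covariance of the scan-matching
    rotation/translation increment when the context entropy is low, so
    the fusion filter leans on the IMU and odometer instead. This is one
    plausible realization of the weighting in Figure 7, not necessarily
    the authors' exact scheme."""
    w = max(entropy_weight(entropy), min_weight)
    return base_cov / w
```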

Figure 7. Integrated navigation with context-dependent weighting.

Experiments

To verify the applicability of the context-dependent weighting, an experiment is conducted. The vehicle is equipped with the following navigation sensors for the determination of position, velocity and attitude.

  • laser rangefinder
  • camera
  • IMU
  • GPS receiver
  • odometer

In addition, a GPS real-time kinematic (RTK) receiver provides ground truth; the RTK solution is used only in the evaluation process. Figure 8 depicts the locations of the sensors installed in the test vehicle, a Luxgen U7.

Figure 8. Test vehicle and the locations of sensors.

The experiment was conducted at a test track of the Automotive Research and Test Center (ARTC), Taiwan, and Figure 9 depicts the track as well as the RTK result. The starting point is at the right upper corner of the track, and the vehicle moves in a counter-clockwise direction.

Figure 9. Test track at ARTC, Taiwan.

The proposed context-dependent weighting approach is then evaluated. To assess the significance of the context-dependent weighting, the navigation system processes only the laser rangefinder, IMU and odometer data, as these are dead-reckoning sensors. More precisely, the GPS receiver data is not used in the processing, to better quantify the contribution of the proposed approach. In practice, the GPS receiver data can be used to account for dead-reckoning sensor errors.

Figure 10 compares the estimated trajectories. In the figure, the RTK result is used as a reference, and the dead-reckoning results with and without the context-dependent weighting are shown. Note that when the context-dependent weighting is not used, the estimated trajectory (in red) is subject to two erroneous turns, at the lower left corner and the upper right corner, respectively.

Figure 10. Estimated trajectories.

The entropy as a function of time is evaluated and shown in Figure 11. Note that the entropy is relatively low at around 240 seconds and 1960 seconds; these two instants correspond to the moments when the vehicle is at the aforementioned corners. Through the use of entropy-based context-dependent weighting in the dead-reckoning process, the navigation error is significantly reduced, as shown in the estimated trajectory (in blue). Thus, the effectiveness of the proposed scheme is verified.

Figure 11. Entropy as a function of time.

Conclusion

For autonomous vehicle applications, knowledge of the current state (such as position, velocity and attitude) of the host vehicle is needed. For robust and autonomous navigation, many different sensors have been incorporated and fused to form a navigation suite. In fusing different sensor data for better accuracy and integrity, the quality of the sensors must be considered. We investigated the use of a scan-matching technique for aided navigation. The context of the environment, in terms of the richness of features, may affect the quality of the resulting navigation solution.

To address this context-dependent issue, we used an entropy measure to assess the quality of scan matching. In addition to the increments in translation and rotation, the corresponding quality indices are obtained to better blend the scan-matching result into the navigation system. As a result, anomalous navigation results due to a lack of features and improper registration can be better dealt with. The proposed scheme is experimentally verified.

Acknowledgments

The work is supported by the joint NCKU-ARTC research project, Taiwan.


JYH-CHING JUANG received a Ph.D. in electrical engineering from the University of Southern California, Los Angeles. He was with Lockheed Aeronautical System Company, Burbank, before joining the faculty of the Department of Electrical Engineering, National Cheng Kung University, Tainan, Taiwan. His research interests include sensor networks, GNSS signal processing and software-based receivers.

SHANG-LIN YU is an M.S. student in the Department of Electrical Engineering, National Cheng Kung University.

SHUN-HUNG CHEN received a Ph.D. from the Department of Electrical Engineering, National Cheng Kung University. He is with the Electronic Control Technology Group, Research & Development Division, Automotive Research & Testing Center in Taiwan. His research interests include vehicle navigation and autonomous driving.