Autonomous platforms, whether a car, UAV, rover or hauling truck, require a system that delivers assured positioning, navigation and other sensor outputs. The basic build of an autonomous system requires navigation systems, GNSS positioning, and sensors like RADAR, LiDAR, vision and ultrasonics. There are many possible combinations of sensors and solutions; however, the key to enabling safe autonomous operations is the software stack that brings these positioning and navigation measurements together to control the platform.

Photo: Hexagon | AutonomouStuff

Figure 1: The autonomy software is housed within the AutonomouStuff Spectra (top), a GPU-based edge AI computing system.

The software used in autonomous platforms must be robust and reliable. Hexagon | AutonomouStuff leveraged decades of robotics experience and control-algorithm knowledge when deploying its software stack. Building on Apollo, an open-source autonomy software stack, AutonomouStuff refined it into a comprehensive and reliable software platform that is flexible to a platform’s sensor requirements yet robust enough for GPU-accelerated edge computing.
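The pattern behind a stack like this is modularity: perception, planning and control run as decoupled modules that exchange messages over named channels (Apollo implements this pattern in C++ on its Cyber RT middleware). The toy Python sketch below illustrates only the pattern; the channel names, classes and messages are assumptions for illustration, not Apollo APIs.

```python
# Illustrative only: a minimal publish/subscribe bus standing in for the
# middleware a real autonomy stack would use. Each module subscribes to
# the channels it needs and publishes its results for the next stage.
from collections import defaultdict

class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        self.subscribers[channel].append(callback)

    def publish(self, channel, message):
        for callback in self.subscribers[channel]:
            callback(message)

bus = Bus()

# Perception: consume a raw scan, publish detected obstacles.
bus.subscribe("sensor/lidar",
              lambda scan: bus.publish("perception/obstacles", ["pedestrian_12m_ahead"]))
# Planning: consume obstacles, publish a trajectory decision.
bus.subscribe("perception/obstacles",
              lambda obs: bus.publish("planning/trajectory", {"target_speed_mps": 2.0, "reason": obs}))
# Control: turn the trajectory into a drive-by-wire command.
bus.subscribe("planning/trajectory",
              lambda traj: print("drive-by-wire command for", traj))

bus.publish("sensor/lidar", b"raw scan bytes")  # one new scan triggers the whole chain
```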

“Think of this software stack as a brain powering the autonomous platform,” said Kevin Fay, portfolio manager of vehicle platforms at AutonomouStuff. “Without an adequate brain taking in all measurements from perception sensors and positioning systems, the platform cannot be fully autonomous.”

Fay added that the software stack is adaptable and can be customised across platforms and equipment needs: “You can use different sensors and other hardware, but your software has to be powerful enough to process the data – in near real-time.”

Navigating the world through sensors

The Apollo software stack with AutonomouStuff’s refinements has been implemented across various platforms, most recently a Ford Transit vehicle.

Photo: Hexagon | AutonomouStuff

Figure 2: The inner workings of an autonomous vehicle, from perception sensors to GNSS+INS positioning hardware.

AutonomouStuff implemented a drive-by-wire system that enabled electronic control of the vehicle, then installed positioning, navigation and perception sensors. The result is a platform ready to be autonomous as soon as the software stack is integrated.

“The drive-by-wire system acts as the platform’s nervous system,” said Fay. “The brain – autonomy stack – controls the vehicle through drive-by-wire. The software only understands what controls and decisions need to be made thanks to its perception sensors.”
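To make that relationship concrete, the sketch below shows the kind of actuator command a software stack might hand to a by-wire interface. The field names, units and limits are assumptions for illustration; real by-wire kits define their own message formats, typically sent over the vehicle’s CAN bus at a fixed rate.

```python
# Hypothetical drive-by-wire command message; not a real product API.
from dataclasses import dataclass

@dataclass
class ByWireCommand:
    steering_angle_rad: float  # positive = left; limits are vehicle-specific
    throttle_pct: float        # 0.0 to 1.0
    brake_pct: float           # 0.0 to 1.0
    enable: bool = True        # by-wire systems require explicit engagement

    def clamped(self) -> "ByWireCommand":
        """Saturate actuator requests to safe ranges before sending."""
        return ByWireCommand(
            steering_angle_rad=max(-0.6, min(0.6, self.steering_angle_rad)),
            throttle_pct=max(0.0, min(1.0, self.throttle_pct)),
            brake_pct=max(0.0, min(1.0, self.brake_pct)),
            enable=self.enable,
        )

# Gentle left steer at light throttle, clamped to safe ranges:
cmd = ByWireCommand(steering_angle_rad=0.05, throttle_pct=0.2, brake_pct=0.0).clamped()
print(cmd)
```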

AutonomouStuff supplies high-quality perception sensors from vendors like Velodyne and Valeo. Visible and thermal vision sensors are used for object detection, classification and lane modelling. LiDAR uses laser measurements for localisation, object detection and ground plane detection and segmentation, while RADAR identifies the range, direction or speed of moving and fixed objects. Ultrasonic sensors act as the system’s ears, monitoring the echo of high-frequency sound waves to gauge the platform’s proximity to nearby objects. GNSS systems with inertial measurements also help orient the system in free space as the platform navigates.
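One reason the GNSS+INS pose matters to perception is frame conversion: a LiDAR or RADAR detection is measured relative to the vehicle, and the fused pose is what places it in the world. The sketch below (an assumed illustration, not AutonomouStuff’s implementation) shows the 2D version of that transform.

```python
# Place a vehicle-frame detection into the world frame using the fused pose.
import math

def vehicle_to_world(pose, detection_xy):
    """Rotate and translate a vehicle-frame point into the world frame.

    pose: (x, y, heading_rad) from the GNSS+INS solution
    detection_xy: (x, y) of an object relative to the vehicle
    """
    x, y, heading = pose
    dx, dy = detection_xy
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    return (x + dx * cos_h - dy * sin_h,
            y + dx * sin_h + dy * cos_h)

# A pedestrian seen 12 m ahead and 2 m to the left of the vehicle:
print(vehicle_to_world((100.0, 50.0, math.radians(30)), (12.0, 2.0)))
```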

“It’s incredibly vital for the software stack to have the power to handle measurements coming in from multiple sensors for near real-time processing,” said Fay. “There are sensors you can add that help with post-processing or mapping an area, but for the platform to operate autonomously and safely, the computation must be done in near real-time like any other brain.”
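The “near real-time” constraint Fay describes can be pictured as a fixed time budget per control cycle: every pass through perception, planning and control must finish inside it. The 50 ms budget (a 20 Hz loop) in the toy sketch below is an assumption for illustration, not a product specification.

```python
# Toy fixed-rate control loop: flag any cycle that exceeds its budget.
import time

CYCLE_BUDGET_S = 0.050  # assumed 20 Hz control loop

def run_cycle():
    """Stand-in for one perception -> planning -> control pass."""
    time.sleep(0.01)  # pretend work

for _ in range(3):
    start = time.monotonic()
    run_cycle()
    elapsed = time.monotonic() - start
    if elapsed > CYCLE_BUDGET_S:
        print(f"overrun: {elapsed * 1000:.1f} ms > 50 ms budget")
    time.sleep(max(0.0, CYCLE_BUDGET_S - elapsed))  # hold the cycle rate
```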

Validating a high-tech brain on rural roads

Photo: Hexagon | AutonomouStuff

In a collaborative project with the National Advanced Driving Simulator at the University of Iowa, AutonomouStuff worked with the Automated Driving Systems (ADS) for Rural America project to outfit a Ford Transit 350HD shuttle for autonomous operation.

Rural roads provided a unique challenge for testing the Apollo-based autonomy stack, said Fay. “There’s a broader range of speeds, the possibility of wildlife or heavy equipment on the roads, and the variety of the roads themselves, from asphalt to gravel. It was the best place to put our autonomy system to the test.”

In the phased project, AutonomouStuff deployed, refined and further developed the Apollo software stack to ensure the Ford Transit would be reliable, functional and, above all, safe for passengers and other drivers on the road.

While the project is ongoing, the autonomy software is ready to be shared through AutonomouStuff’s by-wire and software solution, designed to fast-track the development of autonomous transit shuttles, last-mile delivery vehicles and university R&D programs built on a Ford Transit platform.

A brain for any situation

The Ford Transit is the latest in a long line of platforms that AutonomouStuff has supported on the journey to autonomy. While the Ford Transit is ideal for commercial shuttle transit, last-mile delivery and university autonomy programs, AutonomouStuff supports other platforms, from Class 8 trucks to off-road vehicles. Drive-by-wire and electronic controls, perception kits and positioning systems can be customised to the unique needs of each platform, and the autonomy software stack is installed across platforms and tailored to the operational and environmental needs of the application.

“Flexibility is key,” added Fay. “We want customers to be able to come to us with a list of their requirements for an autonomous platform to be successful. Then we get to work to make that happen.”

To learn more about AutonomouStuff’s platforms, autonomy software stack and positioning, navigation and perception sensors, visit autonomoustuff.com.


Images: Hexagon | AutonomouStuff

This page was produced by North Coast Media’s content marketing staff in collaboration with Hexagon | AutonomouStuff. NCM Content Marketing connects marketers to audiences and delivers industry trends, business tips and product information. The GPS World editorial staff did not create this content.