Through the Eyes of Robots

As robots travel, whether by land or air, they need to perceive what’s going on around them.

They require a 360-degree view of their surroundings to safely make it from Point A to Point B as they gather data for specific missions. This awareness is key to a successful operation for any autonomous vehicle, and it is the driving force behind the researchers at the Waterloo Autonomous Vehicles Laboratory (WAVELab) working to advance robotic capabilities. Professor Steven Waslander has led those efforts since founding the lab in 2009.

“There’s a lot of perception and planning that goes into making these robots do what you want them to do,” said Waslander, who is also an associate professor in the Department of Mechanical and Mechatronics Engineering at the University of Waterloo in Ontario. “The question is, how do you take the data from your sensors, cameras, and GPS and fuse that into a picture around the robot so it can make smart decisions about its actions?”

With the help of NovAtel, the team at the University of Waterloo is using Real-Time Kinematic (RTK) technology to answer that question. RTK positioning techniques use the GNSS signal carrier phase to provide ranges, and therefore positions, that are orders of magnitude more precise than those available through code-based positioning. The real-time corrections NovAtel’s technology provides give WAVELab’s vehicles the accuracy they need to complete their missions safely, whether localizing a single vehicle on its own or determining relative positions between multiple vehicles.
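To see why carrier-phase measurements are so much more precise than code-based ones, consider a rough back-of-the-envelope comparison. The sketch below is illustrative only, not part of WAVELab's or NovAtel's software; it assumes the GPS L1 C/A code chipping rate (1.023 MHz), the L1 carrier frequency (1575.42 MHz), and a common rule of thumb that a receiver resolves on the order of 1% of a code chip or carrier cycle.

```python
# Illustrative comparison of code-based vs carrier-phase GNSS ranging precision.
# Assumption: receivers resolve roughly 1% of a chip (code) or cycle (carrier).
C = 299_792_458.0           # speed of light, m/s

chip_rate_hz = 1.023e6      # GPS L1 C/A code chipping rate
carrier_hz = 1575.42e6      # GPS L1 carrier frequency

chip_length_m = C / chip_rate_hz         # length of one code chip (~293 m)
carrier_wavelength_m = C / carrier_hz    # one carrier wavelength (~0.19 m)

resolution_fraction = 0.01  # assumed measurement resolution (~1% of a chip/cycle)

code_precision_m = chip_length_m * resolution_fraction
carrier_precision_m = carrier_wavelength_m * resolution_fraction

print(f"code-based ranging precision:    ~{code_precision_m:.1f} m")
print(f"carrier-phase ranging precision: ~{carrier_precision_m * 1000:.1f} mm")
print(f"improvement factor:              ~{code_precision_m / carrier_precision_m:.0f}x")
```

Under these assumptions, code-based ranging lands in the meter range while carrier-phase ranging reaches the millimeter range, a roughly three-orders-of-magnitude gap, which is why RTK (carrier phase plus real-time corrections to resolve its ambiguities) can deliver centimeter-level positions.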