As robots travel, whether by land or air, they need to perceive what’s going on around them.
They require a 360-degree view of their surroundings so that they can safely make it from Point A to Point B as they gather data for specific missions. This awareness is key to a successful operation for any autonomous vehicle, and it is the driving force behind the researchers at the Waterloo Autonomous Vehicles Laboratory (WAVELab) as they work to advance robotic capabilities. Professor Steven Waslander has led those efforts since he founded the lab in 2009.
“There’s a lot of perception and planning that goes into making these robots do what you want them to do,” said Waslander, who is also an associate professor in the Department of Mechanical and Mechatronics Engineering at the University of Waterloo in Ontario. “The question is, how do you take the data from your sensors, cameras, and GPS and fuse that into a picture around the robot so it can make smart decisions about its actions?”
With the help of NovAtel, the team at the University of Waterloo is using Real Time Kinematic (RTK) technology to answer that question. RTK positioning techniques use the GNSS signal’s carrier phase to provide ranges (and therefore positions) that are orders of magnitude more precise than those available through code-based positioning. The real-time corrections that NovAtel’s technology provides give WAVELab’s vehicles the high accuracy they need to safely complete their missions, whether localizing a single vehicle or positioning multiple vehicles relative to one another.
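To illustrate why the carrier phase matters, here is a minimal numeric sketch (not NovAtel’s or WAVELab’s code) comparing the scatter of code-based and carrier-phase range measurements, using assumed, representative noise levels of roughly 0.3 m for code and 2 mm for a fixed carrier-phase solution:

```python
import numpy as np

# Assumed, representative 1-sigma range noise: code (pseudorange) measurements
# sit at the decimetre level, while carrier-phase measurements, once the integer
# ambiguity is resolved, sit at the millimetre level.
CODE_SIGMA_M = 0.3
CARRIER_SIGMA_M = 0.002

rng = np.random.default_rng(seed=1)
true_range_m = 21_500_000.0  # a plausible receiver-to-satellite range

code_ranges = true_range_m + rng.normal(0.0, CODE_SIGMA_M, size=2000)
carrier_ranges = true_range_m + rng.normal(0.0, CARRIER_SIGMA_M, size=2000)

print(f"code-based range scatter:    {code_ranges.std():.4f} m")
print(f"carrier-phase range scatter: {carrier_ranges.std():.4f} m")
# That two-orders-of-magnitude gap in range noise is what lets RTK deliver
# centimetre-level positions once the base-station corrections are applied.
```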
At the WAVELab, Waslander and his team focus on outdoor autonomy for aerial and ground vehicles, and how they can advance robot perception, planning, coordination, and control for new applications. The team conducts a variety of research projects through the lab and has completed experiments involving quadrotor flight control in the presence of winds, visual Simultaneous Localization And Mapping (SLAM), laser scan registration, and autonomous driving.
The lab is organized around three research topics: aerial vehicles, ground rover vehicles, and autonomous driving. Lately, the team has been looking for a way to replace expensive laser scanner sensors with vision systems that perform the same function, Waslander said. Laser scanners are known for providing high-quality data about the environment around robots as they move, making it easy for them to travel safely without bumping into objects. But these laser scanners are very expensive (about $70,000 each), and their bulk makes them impractical for smaller vehicles.
To make safe movement more economical, the WAVELab team is developing new state-of-the-art robot perception algorithms, Waslander said. As the “ground truth” for their research, the WAVELab uses NovAtel’s OEM615™, a dual-frequency GNSS receiver card that can track all four satellite systems—GPS, GLONASS, Galileo, and BeiDou.
NovAtel’s RTK system employs a fixed base station that broadcasts RTK data to the aerial vehicle, Waslander said. This determines the vehicle location, relative to the base station, with an accuracy of two to five centimetres. The GNSS measurements enable the researchers to define the “true” motion of the platform, and then compare their processed results to the true path to see how well their algorithmic innovations work.
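As a rough illustration of that comparison step, the sketch below scores an estimated trajectory against an RTK reference, assuming both have already been aligned and resampled to common epochs; the array layout and error metrics are illustrative rather than WAVELab’s actual evaluation pipeline:

```python
import numpy as np

def trajectory_errors(estimated_xyz: np.ndarray, rtk_xyz: np.ndarray) -> dict:
    """Compare an estimated path against the RTK reference, both N x 3 arrays
    of positions sampled at the same epochs (metres, same frame)."""
    residuals = estimated_xyz - rtk_xyz            # per-epoch 3D error vectors
    per_epoch = np.linalg.norm(residuals, axis=1)  # 3D error magnitude per epoch
    return {
        "rmse_m": float(np.sqrt(np.mean(per_epoch ** 2))),
        "max_m": float(per_epoch.max()),
        "final_m": float(per_epoch[-1]),           # drift at the end of the run
    }

# Illustrative data: a straight-line "truth" and an estimate with slow drift.
t = np.linspace(0.0, 60.0, 601)
rtk = np.column_stack([t, np.zeros_like(t), 2.0 * np.ones_like(t)])
estimate = rtk + np.column_stack([0.002 * t, 0.001 * t, np.zeros_like(t)])

print(trajectory_errors(estimate, rtk))
```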
“Robots measure things that are relative to them, taking image or laser scans of the environment immediately around them,” Waslander said. “You’re collecting information as you move through the environment. If there’s an error in how you think you’re moving, every step in a new area accumulates more error. Over time, the errors grow and grow.”
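That growth is easy to reproduce with a toy dead-reckoning example; the step size and per-step error below are assumed values chosen only to show how small relative errors compound:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

STEP_M = 0.5          # true forward motion per step
STEP_SIGMA_M = 0.01   # assumed 1-sigma error on each relative motion estimate
N_STEPS = 2000        # roughly 1 km of travel

true_x = np.cumsum(np.full(N_STEPS, STEP_M))
estimated_x = np.cumsum(STEP_M + rng.normal(0.0, STEP_SIGMA_M, size=N_STEPS))

drift = np.abs(estimated_x - true_x)
for i in (100, 500, 1000, 2000):
    print(f"after {i:4d} steps ({i * STEP_M:6.0f} m): drift = {drift[i - 1]:.3f} m")
# On average the drift grows with the square root of the number of steps;
# without an absolute reference such as RTK GNSS, nothing pulls it back.
```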
The researchers’ approach seeks to minimize those errors and enable the team to fly an Unmanned Aerial Vehicle (UAV) even in environments where GNSS isn’t available (through tunnels or under a canopy of trees, for example), while giving the team confidence that the vision system is working the way they expect it to. The next step is to build maps from the visual data collected by the onboard sensors, reconstructing the obstacles, trees, and buildings around the robot.
“My end goal is to use both systems together for inspection operations; so, we have RTK from NovAtel and vision estimates of our motion, and we can compare and fuse these measurements to monitor the quality of both sets of data,” Waslander said. “When GPS fails, the vision can take over, and when we are far away from useful visual features, the GPS measurements will be more than enough to safely and accurately control the vehicle.”
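A minimal sketch of that idea, assuming each source reports a position with an associated variance and that the fusion is a simple inverse-variance weighting with a fallback when one source drops out (the real system would involve a full state estimator rather than this illustrative blend):

```python
from typing import Optional, Tuple
import numpy as np

Estimate = Tuple[np.ndarray, float]  # (position [m], variance [m^2])

def fuse(rtk: Optional[Estimate], vision: Optional[Estimate]) -> Estimate:
    """Blend RTK and vision position estimates by inverse-variance weighting;
    fall back to whichever source is available when the other drops out."""
    if rtk is None and vision is None:
        raise RuntimeError("no position source available")
    if rtk is None:
        return vision          # e.g. flying under a bridge: vision takes over
    if vision is None:
        return rtk             # e.g. featureless terrain: RTK carries the load
    (p1, v1), (p2, v2) = rtk, vision
    w1, w2 = 1.0 / v1, 1.0 / v2
    fused = (w1 * p1 + w2 * p2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)

# Example: RTK good to ~3 cm, vision good to ~10 cm (assumed values).
rtk_est = (np.array([10.02, 5.01, 2.00]), 0.03 ** 2)
vision_est = (np.array([10.10, 4.95, 2.05]), 0.10 ** 2)
print(fuse(rtk_est, vision_est))
print(fuse(None, vision_est))   # GNSS outage: vision alone
```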
With NovAtel’s solution, Waslander said he can trust that the data gathered is precise and will enable the team to know exactly how reliable the measurements are.
In June 2015, the WAVELab team took a DJI Spreading Wings S900 UAS out to Knox Mountain in Kelowna, British Columbia, to participate in field trials of the Natural Sciences and Engineering Research Council of Canada (NSERC) Canadian Field Robotics Network. They used NovAtel technology to test a new algorithm during flight, Waslander said, employing a method known as Multi-Camera Parallel Tracking and Mapping, or MCPTAM.
For this experiment, MCPTAM used three wide-field-of-view cameras running at 30 Hz to track UAV motion through unknown environments, Waslander said. During the June experiment they flew over varied terrain that included cliffs, trees, fields, paths, and a nearby lake, and tracked the vehicle’s motion throughout with MCPTAM and a NovAtel OEM615 receiver.
As part of the experiment, the team looked for distinct features in each image, tracked them in subsequent frames and triangulated both the feature points and the vehicle motion at the same time, Waslander said.
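The geometric core of that step is triangulation: a feature observed from two known camera poses pins down a 3D point. Below is a minimal sketch of standard two-view linear (DLT) triangulation with made-up intrinsics and poses, not MCPTAM’s actual multi-camera formulation:

```python
import numpy as np

def triangulate(P1: np.ndarray, P2: np.ndarray,
                x1: np.ndarray, x2: np.ndarray) -> np.ndarray:
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2 are 3x4 projection matrices; x1, x2 are pixel coordinates (u, v)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]   # de-homogenise

def project(P: np.ndarray, X: np.ndarray) -> np.ndarray:
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Illustrative setup: identical intrinsics, second camera shifted 1 m along x.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

point = np.array([0.3, -0.2, 8.0])   # true 3D point, 8 m in front of camera 1
print(triangulate(P1, P2, project(P1, point), project(P2, point)))  # ~[0.3 -0.2 8.0]
```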
“We get very accurate estimates of our own motion, comparable in quality to the RTK GPS solutions from NovAtel, and therefore can augment our GPS-only systems to be able to handle flights under bridges, tree cover, in urban canyons, etc., so as to maintain control when too few satellites are available,” he said. “We can use this work to inspect bridges and transmission towers, and to travel quickly through cities or forests for delivery and search and rescue type applications.”
One of the main challenges with this type of experiment is getting the algorithms to work during flight, Waslander said. “As you can imagine, we can’t just keep strapping laptops on these vehicles,” he said. “There’s limited computation we can do, and with the speed these vehicles capture images, things change very rapidly. So, you have a huge amount of data coming into a single embedded computer. You have to be careful about how you deal with that information.”
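A back-of-envelope estimate shows the scale of the problem; the image resolution and bit depth below are assumptions for illustration, since the actual camera configuration isn’t specified here:

```python
# Rough data-rate estimate for a three-camera rig at 30 Hz.
WIDTH, HEIGHT = 640, 480        # pixels (assumed)
BYTES_PER_PIXEL = 1             # 8-bit greyscale (assumed)
CAMERAS = 3
FRAME_RATE_HZ = 30

bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * CAMERAS * FRAME_RATE_HZ
print(f"{bytes_per_second / 1e6:.1f} MB/s of raw image data")   # ~27.6 MB/s
```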
Waslander and the WAVELab team have also used NovAtel’s RTK positioning in precise docking, successfully landing an aerial vehicle on a moving ground target during one of their experiments.
To make this happen, the team put NovAtel OEMStar® GPS+GLONASS single-frequency receivers on an Ascending Technologies (AscTec) Pelican quadrotor and a Clearpath Husky ground vehicle. They used relative positioning to determine how far apart the two vehicles were and a decentralized controller to converge on a common point, enabling the quadrotor to land on the ground vehicle autonomously. Again, the relative accuracy required was in the two-to-five-centimetre range. Multiple successful landings were performed, even in 15 kph winds.
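The convergence idea can be sketched with a toy decentralized controller in which each vehicle steers toward a shared rendezvous point computed from the RTK relative position; the gains, speeds, initial positions, and 2D dynamics below are illustrative, not the controllers used in the experiment:

```python
import numpy as np

DT = 0.1          # control period [s]
UAV_GAIN = 1.0    # illustrative proportional gains
UGV_GAIN = 0.4

uav = np.array([6.0, -4.0])   # horizontal positions in metres (assumed start)
ugv = np.array([0.0, 0.0])

for step in range(300):
    relative = ugv - uav               # relative vector from RTK positioning
    rendezvous = uav + 0.5 * relative  # common point both controllers aim for
    uav += UAV_GAIN * (rendezvous - uav) * DT
    ugv += UGV_GAIN * (rendezvous - ugv) * DT
    if np.linalg.norm(ugv - uav) < 0.05:   # within the few-centimetre window
        print(f"converged after {step * DT:.1f} s at {uav.round(2)}")
        break
```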
“We got great signals on both units,” Waslander said. “You can rely almost entirely on the RTK solution for the landing.”
Although they haven’t landed a UAV on a slow-moving boat using RTK just yet, it’s an experiment the WAVELab team would like to take on. Normally, accomplishing such a feat involves cutting power to the motors over a safety net or enlisting someone on the boat to seize the legs of the UAV and perform a “grab and kill” maneuver. Neither approach is fully autonomous or safe for the operator or the UAV. Fully automating the landing procedure would reduce those risks.
Much like landing on a moving ground rover, landing a UAV on a moving boat requires accurate relative position measurements of the quadrotor, and standalone GNSS just isn’t enough. Standalone position errors on the UAV and the boat can add up to 10 metres of relative error between the two vehicles, making a successful landing impossible. NovAtel’s RTK system eliminates that error.
The ability to safely land a UAV on ground or water-based platforms offers a variety of benefits, Waslander said, including making it possible to swap out batteries and sensors during field operations in order to extend mission times.
“The aerial vehicles are very limited in payload capacity and battery life; so, you can only do small missions,” Waslander said. “We looked at trying to couple them with ground vehicles and boats because they can carry extra batteries and swappable sensors. So, you can dock and swap the batteries or the sensors, then boat or drive off to the next task.”
Beyond that, teaming up unmanned aerial and ground vehicles can dramatically improve ground vehicle coverage for search or mapping operations, Waslander said, while allowing the aerial vehicle to go further afield without running out of power.
This ability also allows automatic deployment from ships, he pointed out, enabling UAVs, for example, to scout for or track icebergs outside their ship’s field of view. This application could be useful for oil drilling in the Arctic, or for fully autonomous border surveillance and other long-duration aerial missions in which multiple vehicles are switched in and out of operation as needed.
This same technology could also make it possible to dock UAVs on cars, trucks, trains, and large aircraft that are in motion, Waslander said, opening up even more applications.
With the emergence of a commercial UAS market, robots now have the capability to perform many tasks, from precision agriculture to cinematography.
Waslander predicts the next wave of advancement in this technology will focus on precision positioning and relative positioning, making it possible to perform more complicated tasks such as detailed pipeline and mine inspections. This is an area in which the WAVELab is looking to advance—with the help of high-accuracy, global and relative positioning solutions from NovAtel.
“My sense is we’re on the cusp of big things in robotics, and from what I’ve seen it’s finally at the point where the sensing is good enough and the computation is fast enough that we can make smart decisions on robots in real time,” Waslander said.
“The kind of information we can collect will enable us to make smarter decisions about the operations of facilities and infrastructures, but we can only do that if we precisely control how the vehicle moves in relation to dangerous objects,” he added. “That’s why we’re focused on getting vision, GPS, and other sensors all integrated into a common picture of what’s going on around the robot, so that it can make the right decision and keep itself safe while still providing the information the operator needs.”