Driving Off-Road Autonomy: An AG Showcase

A farmer's work is never done. From month to month, week to week, day to day, there’s always a new task to perform: mow, harrow, till, plant, weed, spot spray, fertilize, inspect, spray for pests, harvest, fallow. With multiple fields and multiple crops in those fields—often an economic necessity to survive—the succession and rotation of chores can be dizzying and never-ending. Each task calls for a different set of equipment and imposes a different set of positioning, navigation and awareness requirements.

Farmers today must automate at least some of their workflows as the workforce available to them shrinks continually. To do so, they need autonomous equipment as multi-talented and nearly as intelligent as themselves. Until now, that has not been available.

“Autonomous ag vehicles today are built for one task,” says Mike Martinez, agriculture segment manager at Hexagon | NovAtel. “These robotic machines are built for a specific purpose like picking strawberries or spot-spraying weeds. Farmers are looking for vehicles that can do multiple things. Machine manufacturers are able to integrate other types of implements [with tractors] now.

“The autonomous tractor, or power unit if you will, has to be able to do everything. It has a long learning curve. It needs to know what kind of field, what kind of crop, what kind of implement it’s pulling—those are all different, and they all have different requirements.”

For years, the NovAtel team has been focused on the positioning aspect in agriculture, providing many different OEM manufacturers with technology like the SMART7 antenna for assured positioning. The giant leap into building an autonomous tractor came from a desire to demonstrate to these ag powerhouses how to unlock the full potential of positioning and properly integrate it with other sensors, enabling autonomous farm operations.

“We’re applying autonomy to agriculture where the industry really needs it today,” adds Martinez. The message to the ag industry is: Hexagon, with its brands NovAtel and AutonomouStuff, is now in this space and leveraging its experience from other autonomy markets.

Hexagon | AutonomouStuff Goes Off-Road

Long known for its expertise in putting driverless kits and capability aboard automobiles, AutonomouStuff plows new ground here with its algorithms and fusion of perception sensors and data. Combining its command of relative positioning with NovAtel’s vast repertoire of knowledge in absolute positioning sets the mold for autonomy in many off-road environments and applications.

“We’re excited to use this tractor as a platform to validate the human identification, obstacle detection and enhanced environmental awareness that our sensing kits add to our assured positioning solutions in agriculture,” said John Buszek, VP of products and services at Hexagon | AutonomouStuff. “The sensing and positioning technologies we’ve integrated on this demonstration platform showcase the Smart Autonomous Mobility portfolio, which enables and accelerates the development of autonomy in agriculture applications from prototyping to production.”

Designed for OEMs

“In choosing a demonstration platform for off-road autonomy,” added Tanner Whitmire, business development manager, agriculture at Hexagon | NovAtel, “we wanted to have a vehicle or platform that would be recognizable to a global audience, something that was also a little different than what has been seen in the marketplace.”

“The platform, the tractor, is to demonstrate different types of autonomous technology,” rejoined Martinez. “We don’t provide it to the end customer, we tailor our services to the OEM manufacturers. Here we’ve fashioned a tool, a very advanced tool, to educate the manufacturers as to what type of perception sensors will be used and how they will be integrated with the smart software that will bring them into working together.”

“For example, today a human driver plowing a field can stop the tractor when he sees a person or an object in its path. When you remove that driver, the tractor needs the same or better intelligence to understand if it will run into a person or structure. What types of sensors are required? A camera or LiDAR? These sensors are very new in the ag industry, and manufacturers are trying to define what the accuracy requirements are.”

The NovAtel team, in addition to targeting OEM agricultural manufacturers, will demonstrate their capabilities to a large number of other autonomous robotic machine designers and integrators, as well as engineers tasked with developing autonomous solutions. “It takes a lot of expertise to build a full solution,” says Whitmire. “There are a bunch of different categories of tasks—vehicle control, position, perception, path planning and so on—that must all work together flawlessly. There are pieces that different manufacturers will concentrate on, and there are specialties that integrators will focus on, like path planning, or the decision-making process. But they may not have that full technical competency yet to tie it all together.”

“There are countless companies trying to do pieces of this. Where we are different is that we have developed a way of successfully integrating it all: the corrections services, the GNSS and inertial sensors, and the algorithms that synthesize that data and make it intelligently useful.”

“We concentrate on providing influential technology to the manufacturers to make their machine operate more safely and accurately.”

The Elements of Autonomy

The autonomous tractor is engineered to overcome all the obstacles it may encounter in the field. It combines positioning solutions, correction services, a range of complementary sensors, sensor integration and fusion, and system safety analysis to create a comprehensive autonomous platform designed specifically for agricultural needs. The suite can be broken down conceptually into four main sectors: a sensing electronic control unit (ECU), a positioning ECU, a safety ECU and an automation ECU.

“Each sensor has its pros and cons,” says Martinez. “Manufacturers need to be able to better understand how they could use them. So, the idea was to showcase the sensors that fit this type of platform and demonstrate what proper sensor selection looks like and how important that is to performance.”

While accurate positioning and navigation may at first seem to be the primary concerns, safety is of paramount importance, and the different control units have overlapping and complementary responsibilities for providing safety in an autonomous work environment. “Today there isn’t a good understanding of the safety requirements in the ag industry like there is in automotive or mining,” says Martinez. “We are leveraging the expertise of NovAtel and AutonomouStuff; we’re incorporating their safety knowledge into agriculture. There’s no safety protocol in ag today for a fully autonomous level.”

Sensing ECU

This high-level component typically includes LiDAR for obstacle detection and relative localization, radar for obstacle detection, cameras for scene and object classification and an ultrasonic or sonar sensor for obstacle detection.

Relative localization involves multiple aspects: localizing the tractor to any other vehicles in the field or farmyard, localizing the tractor to a farming implement that it is pulling and localizing the tractor to the crop in the field. A LiDAR laser measurement sensor accomplishes these tasks. It possesses a 360° surround view for localization with up to a 200-meter range. With 32 channels, it can generate 600,000 points per second to furnish real-time 3D data and modeling, including distance and calibrated reflectivity measurements along with rotational angles.

In addition to 3D modeling, the LiDAR unit furnishes object detection, continuous tracking of those objects as the tractor moves, and, with a little bit of additional intelligence, object classification.

“LiDAR-supplied data enables relative localization to help navigate in an orchard, for example, or any other less favorable GNSS conditions,” says Whitmire.

Obstacle detection, avoidance, environmental awareness and an emergency stop function are the responsibility of the radar unit. The requisite forward detection of both moving and fixed objects calls for identifying the range, altitude, direction and speed of objects, to enable tractor functions such as forward collision warning, brake support and headway alert. The autonomous tractor carries radar units furnishing simultaneous long-range (174 meters) and mid-range (60 meters) awareness.

Object classification encompasses identifying objects in motion, situational awareness and operational safety. The high-sensitivity cameras on the autonomous tractor provide object detection, delivering the information needed for decision-making. Increasingly, these are thermal vision or thermal infrared cameras to cope with the variety of difficult visual environments that may be encountered on the farm, from complete darkness to drifting smoke, rain, snow, dust and fog.

The infrared camera on the autonomous tractor sees up to 200 meters away. Multiple cameras may be mounted atop the tractor, pointed at different angles, with variable settings for 24°, 32°, 50° and 75° horizontal field-of-view options. They provide either 30 Hz or 60 Hz thermal video to the tractor’s central processing system.

Two other visual cameras aboard the tractor complement the thermal infrared sensor. “Cameras may determine which weed to pick and which weed not to pick,” Whitmire points out.

Processing and applying all the high-rate data that flows in from the sensors is an extremely complicated task, requiring very advanced software stacks. This is a crucial area where NovAtel and AutonomouStuff provide essential expertise to the ag OEMs, as we’ll see further on.

Positioning ECU

NovAtel’s expertise, quite naturally, lies in powerful GNSS receivers for static and dynamic heading, an expertise that has further developed into GNSS + INS integration for full attitude and positioning awareness in both open-sky and obstructed environments. The tractor relies on the industry-leading accuracy, availability and reliability of the SMART7 receiver with its built-in precision antenna to provide resilient and reliable continuous positioning and navigation. It is multi-GNSS, drawing on GPS, GLONASS, BeiDou, Galileo and QZSS signals for the ultimate in availability and accuracy.

The SMART7 is enhanced with tightly coupled SPAN GNSS and inertial technology for continuous, assured, 3D positioning, velocity and attitude through GNSS outages. SPAN technology draws on micro-electromechanical systems (MEMS) gyros and accelerometers for data at a 200 Hz rate, generating continuously available high-precision positioning.
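How GNSS and inertial data complement each other can be illustrated with a toy one-dimensional sketch. The dead-reckoning model and the blend toward the fix below are simplifications for illustration only, not SPAN's actual tightly coupled filter:

```python
# A toy one-dimensional illustration of GNSS + INS coupling: the inertial
# sensor propagates position at a high rate, and each GNSS fix corrects
# the accumulated drift. SPAN's actual tightly coupled filter is far more
# sophisticated; this only shows why the combination stays continuous
# through a GNSS outage.

def propagate(pos, vel, accel, dt):
    """Dead-reckon one inertial step (constant-acceleration model)."""
    vel += accel * dt
    pos += vel * dt + 0.5 * accel * dt * dt
    return pos, vel

pos, vel = 0.0, 1.0       # start at origin, moving at 1 m/s
dt = 1.0 / 200.0          # 200 Hz inertial rate, as on the tractor
for _ in range(200):      # one full second with no GNSS available
    pos, vel = propagate(pos, vel, accel=0.0, dt=dt)
print(round(pos, 3))      # → 1.0 (position carried through the outage)

gnss_fix = 1.02                     # next fix corrects inertial drift
pos = pos + 0.5 * (gnss_fix - pos)  # naive blend toward the fix
```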

“Dual SMART7 antennas utilizing SPAN technology provide instantaneous heading on start-up,” says Whitmire. “Most vehicles today have a single receiver with a single antenna, which requires the vehicle to move a particular distance to help calibrate the gyros inside the receiver, to define the heading, then the semi-autonomous system can take control. As we transition to full autonomous, the requirement for heading on startup becomes more vital.”

Further, NovAtel’s ALIGN firmware combines two GNSS receivers mounted atop the tractor to provide high-precision heading and pitch angles for relative positioning. For precision agriculture, this feature supplies terrain compensation that is fully adjustable, for more accurate heading in uneven topography—as a field slopes or as the tractor rises and falls while crossing tilled rows, for example.
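The geometry behind dual-antenna heading is simple to sketch: the baseline vector between the two antennas yields heading and pitch directly. The function below assumes the baseline has already been resolved into local east/north/up coordinates; it illustrates the principle, not the ALIGN algorithm itself:

```python
import math

def heading_pitch_from_baseline(east: float, north: float, up: float):
    """Derive heading and pitch from the baseline vector between two
    GNSS antennas, expressed in local east/north/up coordinates (meters).

    Heading is measured clockwise from north in degrees [0, 360);
    pitch is the elevation of the baseline above the horizontal plane.
    """
    heading = math.degrees(math.atan2(east, north)) % 360.0
    horizontal = math.hypot(east, north)
    pitch = math.degrees(math.atan2(up, horizontal))
    return heading, pitch

# A baseline pointing due east and level with the ground:
h, p = heading_pitch_from_baseline(1.0, 0.0, 0.0)
print(round(h, 6), round(p, 6))  # → 90.0 0.0
```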

TerraStar-X GNSS Correction Services bring the positioning accuracy into pinpoint range. “We can execute at 2.5 centimeter accuracy with convergence in less than one minute using TerraStar-X,” says Martinez.

Safety ECU

The concern for safety extends into every aspect of autonomous vehicle operation. It involves vehicle plausibility, handshakes between the different ECUs, emergency brake and emergency stop (E-brake and E-stop) and, naturally, object detection. It can also factor in remote manual control of autonomous machinery, furnished by wireless mobility for personal protection.

As mentioned earlier, the safety concept has not yet been rigorously defined in the autonomous agricultural realm. “We are using the tractor to help identify the gaps from a semi-autonomous to a fully autonomous machine,” says Martinez. “Today, there are different levels of autonomy. And the terms in use at these levels are very vague. At a high level, everyone has a general understanding of what the term is, but when you get into the details of what the specific application needs to be, it seems the terminology starts to change drastically, depending on what’s needed from the autonomy.”

“One example would be Level 4 for autonomous applications. Today if you were to pull up a Google search for formalized terms or protocols, Level 4—which is formularized for automobiles on a road—would generally state that the autonomous vehicle would execute tasks autonomously with a person monitoring the vehicle.

“But if it’s executing out in the field in an agricultural setting, you need a human within line of sight of the vehicle to ensure no harm is being done. It’s that last safety point, providing an emergency stop in case something happens.”

Martinez continues, “In the automotive industry, a car would drive on the highway by itself. In ag, that tractor may have to drive down a dirt road and go to that field and execute those tasks. The field may be 10 or 30 miles away. Depending on who you talk to, that vehicle may need to drive itself to the field. Or, another person may think, load/drive/unload, and then it becomes autonomous in the field.”

“Today, we are making a tractor with the ability to drive by wire. A person can sit behind the wheel, control the vehicle manually. Then when you get to the field, take the operator out of the cab and operate autonomously in the field. It’s not designed today to drive from the machine yard to the field. That could be incorporated down the road.”

People assume that if you follow ASIL D, the highest level of protection in the automotive industry’s risk classification system, systems should work together seamlessly once ag fully defines its terminology. That may not be the case. Hexagon | NovAtel is learning and incorporating those types of practices into its development as it turns a semi-autonomous vehicle into a fully autonomous one.

Automation ECU

Software stacks are where autonomy comes into its own: all the sensors and control modules work together to power a machine and make crucial decisions without human intervention. This involves sensor fusion, situational awareness, time synchronization, vision processing, local path planning and more. Processing the information in a computing system that safely executes autonomous operations ultimately provides a tractor that will function nearly as intelligently as the farmer-operator does.

Elements of the autonomy compute module include:

  • An advanced computer for autonomous applications, designed to fuel emerging GPU-accelerated applications and support relative localization and object detection and identification processing.
  • The Platform Actuation & Control Module (PACMod), a fully integrated by-wire kit and key software stack from AutonomouStuff, performing electronic control of the vehicle, taking the information from the absolute positioning system and hosting the speed and steering control software.
  • Speed and steering control software, which mimics human behavior; it is responsible for pulling a particular implement at an appropriate rate and with appropriate maneuvering as it drives through the field.
  • Shuttle automation software is responsible for path planning.
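The division of labor among these modules can be sketched as a single control-loop tick. All class and function names below are hypothetical illustrations, not the actual PACMod or AutonomouStuff APIs:

```python
# A single control-loop tick sketching how the automation modules could
# divide the work: path planning supplies a waypoint, the speed/steering
# controller follows it, and a safety flag can override everything.
import math

class SpeedSteeringController:
    """Mimics operator behavior: holds a working speed and steers
    toward the next waypoint on the planned path."""

    def command(self, position, waypoint):
        dx = waypoint[0] - position[0]
        dy = waypoint[1] - position[1]
        heading = math.atan2(dy, dx)  # desired heading toward waypoint, radians
        speed = 2.0                   # m/s, a fixed working speed
        return speed, heading

def automation_step(position, path, obstacle_in_buffer):
    """One tick: safety check, then path following via the controller."""
    if obstacle_in_buffer:   # the safety function overrides everything
        return 0.0, None     # stop command handed to the by-wire layer
    waypoint = path[0]       # the path-planning software supplies waypoints
    return SpeedSteeringController().command(position, waypoint)

print(automation_step((0.0, 0.0), [(10.0, 0.0)], False))  # → (2.0, 0.0)
print(automation_step((0.0, 0.0), [(10.0, 0.0)], True))   # → (0.0, None)
```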


“Not every sensor is integrated today in the by-wire system,” says Martinez. “What’s included helps educate autonomous machine manufacturers. We will fuse the cameras and radar into our autonomy-compute module for 360-degree obstacle detection and human identification. The LiDAR and thermal sensors are there to showcase what raw data you would be able to receive so they can better understand how to integrate for a specific application.

“Here’s how it works in the case of obstacle detection, say of a person in the field where it is operating. The autonomy-compute module would receive the absolute position of a vehicle, the LiDAR and camera will send their data, then the compute module will fuse the information to provide the intelligence, the relative localization of the obstacle. It will also calculate the motion of the human walking, and that data will be tied back into the decision-making. It knows its buffer zone is 15 meters, and calculates that the human is within that range. So a command is sent to stop the vehicle,” says Martinez.
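The decision sequence Martinez describes can be sketched in a few lines. The 15-meter buffer comes from his example; the data structure and function names are illustrative only, and a real system would also consider the obstacle's velocity and the tractor's planned path:

```python
from dataclasses import dataclass
import math

BUFFER_ZONE_M = 15.0  # stop radius around the vehicle, per the example above

@dataclass
class Detection:
    """A fused LiDAR/camera detection in vehicle-relative coordinates."""
    x: float        # meters forward of the vehicle
    y: float        # meters left of the vehicle
    is_human: bool  # camera-based classification result

def should_stop(detections: list[Detection]) -> bool:
    """Command a stop if any detected human is inside the buffer zone."""
    return any(
        d.is_human and math.hypot(d.x, d.y) <= BUFFER_ZONE_M
        for d in detections
    )

print(should_stop([Detection(10.0, 3.0, True)]))  # human ~10.4 m away → True
print(should_stop([Detection(40.0, 0.0, True)]))  # outside buffer → False
```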

The autonomous tractor is designed as a showcase of these ECUs. Not every piece of it will be put onboard for OEM customers, nor will these manufacturers use it straight out of the box, straight off the shelf.

“The prospective clients will want to do some of the components themselves,” adds Martinez, “and then get information and expertise from us. They will tailor it to blueberry farms, for example. They buy the sensor package from us and the human identification piece. Then they’ll make the decisions of what to have the vehicle do with that information. For instance, you can’t swerve to avoid an obstacle in permanent row crops. So, our customer will make that decision on what maneuver is undertaken.

“What goes hand in hand with that goes back to positioning. Today when you look at steering technologies, they record their guidance lines for path planning, and it goes off of their tractor’s GNSS position. The recorded position typically doesn’t account for what goes on with the implement, or if the vehicle is not set up correctly. The rows may be at a different location than the path planning thought. Today the operator can make it perform how it is supposed to. Take the operator out of the picture and you need artificial intelligence to make it pick the right weed or harvest the right strawberry, to execute the task efficiently.”

Levels of Autonomy

The NovAtel team is using the tractor to help identify the gaps between a semi-autonomous and a fully autonomous machine in agricultural operations. As we transition from well-defined road operations to operating a tractor in a field, the different levels of autonomous operation become vague.

As the team continues to work closely with agricultural OEMs, it will adapt various models of the autonomous tractor to different specialized operations. These models will feature different sets of equipment and different sets of positioning, navigation and awareness requirements, and the promise of autonomous agriculture will gradually come into focus.

The farmer may never leave the field, and the farmer’s work may never be totally and finally “done.” But with the autonomous tractor, a new age of agriculture is coming in which a farmer’s expertise in cultivation and harvesting can be extended to autonomous equipment.
