July 30, 2024

Autonomous mobile robots (AMRs) can increase productivity, enhance safety and offer substantial cost savings for manufacturers. For these reasons, AMR adoption is expected to spread to almost every industry. The global AMR market, valued at $8.65 billion in 2022, is forecast to grow at a compound annual growth rate (CAGR) of 18.3% from 2022 to 2028.

The idea behind Industry 5.0 is that humans work alongside artificial intelligence (AI)-powered robots that support rather than supersede them. That’s the vision. Before it can be realized, AMRs must overcome several challenges, and one of the keys to overcoming them lies in the inclusion of various sensors and the emerging field of sensor fusion.


Challenges Facing AMR Adoption

The number one challenge for AMR adoption is the sheer number of different applications and environments in which they operate. Uses for AMRs have already been identified in warehouses; agricultural technology; commercial landscaping; healthcare; smart retail; security and surveillance; delivery; inventory; and picking and sorting. In all these different environments, AMRs are expected to operate safely with and around people.

There are also situational complexities that make the AMR’s job extremely challenging: circumstances we take for granted as humans but that AMRs struggle with. For example, imagine a delivery robot on its way to make a final package delivery that sees a ball in the middle of its path. The robot may have no problem identifying the ball and avoiding it, but would it be smart enough to anticipate a young child running out to retrieve the ball? There are many such complex situations. Would an AMR be able to use a 90-degree mirror mounted on a pole to peek around a corner and anticipate traffic? Would an AMR know that it cannot drive over freshly poured concrete?

Circumstances easily understood by people are more challenging for robots, though the reverse can also hold: with the right sensors, an AMR can detect objects against a bright sun more easily than a person can. But poured concrete and spilled liquids can be hard to identify, and edges, cliffs, ramps and stairs are all challenging for AMRs. Then there are special circumstances, such as vandalism where someone tips over an AMR, which inspired the first escape maneuver systems.

Addressing many of these challenges requires AI built on state-of-the-art large language models (LLMs), combined with various types of high-performance sensors.


High-Performance Sensors for AMRs

AMRs can draw on several types of sensors for simultaneous localization and mapping (SLAM) and for distance and depth measurement. Important sensor metrics include object detection, object identification, color recognition, resolution, power consumption, size, cost, range, dynamic range, speed and whether the sensor can operate in various lighting and weather conditions.

Sensor modalities available for use in AMRs include the following:

  • CMOS Imaging
  • Direct time-of-flight (dToF) and indirect time-of-flight (iToF) depth sensing
  • Ultrasonic
  • Radar
  • Inductive positioning
  • Bluetooth® Low Energy (Bluetooth LE) technology
  • Inertial

Each of the sensor modalities above comes with advantages and tradeoffs. For example, radar offers excellent range and speed measurement in low light or adverse weather conditions, but it has poor color detection, a high initial cost and a relatively large size (an important consideration for AMRs). LiDAR has a relatively low initial cost, thanks to high-volume CMOS silicon foundry processes, and detects well at night and in direct sunlight, but it is poor at object classification. Likewise, iToF depth sensors offer excellent resolution and low-power processing.

It is evident that a single sensor modality cannot provide all the information required by an AMR to handle all of the challenges mentioned above. Depending on the application and environment, the AMR will require a few to several sensor modalities. And those sensors will not operate in isolation, but rather will function collectively in a process known as sensor fusion.


How Sensor Fusion Enables Autonomous Mobile Robots

Sensor fusion is the process of combining two or more data sources (from sensors and/or an algorithm or a model) to generate a better understanding of the system and its surroundings. In AMRs, sensor fusion is essential because it provides better reliability, redundancy and, ultimately, safety: assessments become more consistent, more accurate and more dependable.
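
To make the idea concrete, here is a minimal sketch of one common fusion technique: combining two independent distance readings of the same object by inverse-variance weighting. The sensor pairing (ultrasonic plus iToF) and the noise variances are illustrative assumptions, not values from any specific part.

    # Minimal sensor-fusion sketch: fuse two independent distance
    # readings by inverse-variance weighting. The more trustworthy
    # (lower-variance) sensor gets the larger weight.

    def fuse_measurements(z1: float, var1: float, z2: float, var2: float):
        """Return the fused estimate and its variance for two
        independent measurements of the same quantity."""
        w1 = 1.0 / var1
        w2 = 1.0 / var2
        fused = (w1 * z1 + w2 * z2) / (w1 + w2)
        fused_var = 1.0 / (w1 + w2)  # always smaller than var1 and var2
        return fused, fused_var

    # Assumed example: an ultrasonic sensor reads 2.10 m (variance
    # 0.04 m^2) and an iToF depth sensor reads 2.02 m (variance
    # 0.01 m^2). The fused estimate leans toward the lower-noise iToF
    # reading and is more certain than either reading alone.
    distance, variance = fuse_measurements(2.10, 0.04, 2.02, 0.01)
    print(f"fused: {distance:.3f} m, variance: {variance:.4f} m^2")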

As shown in Figure 1 below, sensor fusion combines two functions: data collection and data interpretation.

Figure 1: Sensor fusion process

The “interpret data” step in sensor fusion requires the implementation of either an algorithm or a model. Sometimes sensor fusion results are designed for human consumption, such as the display shown when backing up an automobile, and sometimes they are meant for machine consumption in a subsequent step, such as facial recognition in a security system.
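
A hypothetical sketch of that two-stage structure appears below: a “collect data” stage that polls sensors and an “interpret data” stage that applies a simple model. The sensor interface and the 0.5 m stop threshold are assumptions made for illustration only.

    from typing import Callable, Dict

    def collect(sensors: Dict[str, Callable[[], float]]) -> Dict[str, float]:
        """Stage 1 of Figure 1: poll every registered sensor and
        return its raw reading."""
        return {name: read() for name, read in sensors.items()}

    def interpret(readings: Dict[str, float], stop_threshold: float = 0.5) -> str:
        """Stage 2 of Figure 1: a simple model that stops the AMR if
        any modality reports an obstacle closer than the threshold.
        The result here is meant for machine consumption (the motion
        planner), not for a human display."""
        nearest = min(readings.values())
        return "STOP" if nearest < stop_threshold else "PROCEED"

    # Assumed sensors returning distances in metres.
    sensors = {"ultrasonic": lambda: 1.8, "itof": lambda: 0.4}
    print(interpret(collect(sensors)))  # -> STOP (iToF sees 0.4 m)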

Sensor fusion provides several benefits, such as reducing signal noise. Uncorrelated noise can be reduced with homogeneous sensor fusion (multiple sensors of the same type), while correlated noise can be reduced with heterogeneous sensor fusion (sensors of different types), as the sketch below illustrates for the homogeneous case.
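
As a small numerical illustration of the homogeneous case, averaging readings from N identical sensors with uncorrelated noise shrinks the noise standard deviation by a factor of the square root of N. The noise level and sensor count below are assumptions chosen for the demonstration.

    import random
    import statistics

    TRUE_DISTANCE = 2.0   # metres (assumed ground truth)
    NOISE_STD = 0.10      # per-sensor uncorrelated noise (assumed)
    N_SENSORS = 4
    N_TRIALS = 10_000

    single, fused = [], []
    for _ in range(N_TRIALS):
        readings = [random.gauss(TRUE_DISTANCE, NOISE_STD)
                    for _ in range(N_SENSORS)]
        single.append(readings[0])               # one sensor alone
        fused.append(sum(readings) / N_SENSORS)  # homogeneous fusion

    print(f"single-sensor std: {statistics.stdev(single):.4f}")  # ~0.100
    print(f"fused std:         {statistics.stdev(fused):.4f}")   # ~0.050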

By its inherent nature, sensor fusion improves reliability through redundancy. Since there are at least two sensors, if data from one sensor is lost, quality is reduced but sensor data is still available from the other sensor(s). Sensor fusion can also be used to estimate unmeasured states such as occlusions, when an object or part of an object is hidden from a camera, and reflections, when an object or surface reflects light from one camera to another.
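
The redundancy benefit can be sketched by extending the inverse-variance fusion above to skip any sensor that stops reporting: the fused estimate degrades gracefully instead of disappearing. The `None`-for-dropout convention and the variances are illustrative assumptions.

    from typing import List, Optional, Tuple

    def fuse_available(readings: List[Tuple[Optional[float], float]]):
        """Inverse-variance fusion over only the sensors that actually
        reported. Each entry is (measurement or None, variance)."""
        live = [(z, var) for z, var in readings if z is not None]
        if not live:
            return None  # total sensing loss: no estimate possible
        total_w = sum(1.0 / var for _, var in live)
        fused = sum(z / var for z, var in live) / total_w
        return fused, 1.0 / total_w

    all_ok = [(2.10, 0.04), (2.02, 0.01)]
    one_lost = [(None, 0.04), (2.02, 0.01)]  # ultrasonic dropped out
    print(fuse_available(all_ok))    # tight estimate from both sensors
    print(fuse_available(one_lost))  # degraded but still usable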

As a result of these benefits and its accelerating adoption, several trends have emerged in sensor fusion including the following:

  • Using AI-powered algorithms
  • Enhanced object detection and classification
  • Sensor fusion for cooperative perception
  • Multiple sensor modalities
  • Environmental perception in adverse conditions
  • Sensor fusion for 360-degree surround view
  • Real-time sensor calibration

At the heart of sensor fusion are the sensors themselves. The best algorithms will not produce quality results if the data coming from the sensors is not good. Fortunately, onsemi offers a library of best-in-class sensors and tools to support sensor fusion in AMRs.


Summary

Autonomous mobile robots have many use cases, and their adoption is accelerating. A set of best practices has emerged to support this rapid adoption. First, it is essential to control the environment to reduce the potential collisions an AMR may encounter; designated routes for AMRs/automated guided vehicles (AGVs) in manufacturing or warehouse facilities are one example. Second, it is important to simulate the exact use cases (including corner cases) using a digital twin during development. Finally, it is crucial to incorporate sensor fusion with intelligent sensors, algorithms and models.

onsemi has taken a leadership position in intelligent sensing technologies. It offers a variety of rolling shutter and global shutter image sensors with industry-leading dynamic range and innovative features like wake on motion. In addition to image sensors, onsemi supplies silicon photomultipliers (SiPMs) for range detection (LiDAR). The portfolio also includes ultrasonic sensors, inductive sensors and Bluetooth® LE technology enabled microcontrollers that support angle of arrival (AoA) and angle of departure (AoD), which can be used for position finding.

Sensor fusion in AMRs is poised to significantly impact industrial and transportation applications on the road to Industry 5.0, and onsemi is positioned to provide the sensors and subsystems that make it happen effectively. Learn more about our AMR solutions and system solution guides.


Additional Resources:

Steering Future Autonomous Mobile Robots On-Demand Webinar

Smart & Mobile Robot System Solution Guide

Machine Vision System Solution Guide

