DigiKey, onsemi discuss the intersection of robotics and physical AI


DigiKey and onsemi recently discussed how the latest generation of sensing and AI can help robotics development. Source: DigiKey

Combining robotics with physical AI (artificial intelligence systems embedded in the physical world) has the potential to transform industrial and commercial environments, particularly through autonomous mobile robots, or AMRs.

Shawn Luke, technical marketing engineer at DigiKey; Bob Card, marketing manager at onsemi; and Theo Kersjes, industrial business development and solutions leader at onsemi, recently discussed this topic.

AMRs rely on a range of sensors—such as lidar, cameras, and ultrasonic detectors—to enhance safety, improve productivity, and operate effectively in complex spaces. Drawing a parallel to self-driving vehicles, this conversation highlights how AMRs use similar technologies and principles, including simultaneous localization and mapping (SLAM), to create accurate, real-time maps and localize themselves within dynamic environments.

The experts also explained how AMRs, once limited to controlled indoor settings, are increasingly being adapted for outdoor and unpredictable environments, thanks to advances in sensor integration, edge computing, and AI. As these technologies evolve, AMRs could become even more autonomous, adaptive, and essential in sectors ranging from logistics and manufacturing to agriculture and infrastructure inspection.

Luke: What design considerations are necessary with today’s smarter robots? 

Card and Kersjes: Industrial robots have been around for decades, so they’re experts at what they do, but they can also be dangerous for humans to work alongside in warehouses, factories, and the like. Industrial robots aren’t designed to move around freely in their environment, especially if that environment is dynamic. Smarter robots come into play to ensure that humans and robots can work side by side in physical harmony.

A variety of sensors, including ultrasonic, image, lidar, radar, and more, allow a robot’s algorithm to process and navigate its environment with human safety at the forefront.

Robots can solve problems that humans cannot, like lifting a car or large equipment, and complete tasks that are hazardous or highly repetitive. Smarter robots can accomplish more with greater flexibility, and, thanks to sensors, they can complete a broader range of tasks enabled by the convergence of physical AI and robotics.

Luke: How are AMRs similar to autonomous vehicles?

Card and Kersjes: AMRs and self-driving automobiles are most similar in their internal communication systems. Traditionally, robots have used CAN (Controller Area Network), a two-wire, multi-drop communication protocol.

However, onsemi is leading in a new technology: 10BASE-T1S, which is an Ethernet-based, multi-drop protocol that also uses a two-wire unshielded twisted pair.


onsemi uses 10BASE-T1S for simpler communications for AMRs. Source: DigiKey

Key benefits of 10BASE-T1S over CAN:

  • Higher data rates: 10BASE-T1S runs at 10 Mbps, compared to 1 Mbps for classical CAN and up to 5 Mbps for CAN FD under ideal conditions.
  • Reduced wiring complexity and weight: This is important for compact, mobile systems like AMRs.
  • No need for gateways: It eliminates the need to bridge CAN and Ethernet networks.
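The data-rate difference above can be made concrete with a rough frame-time comparison. The sketch below is illustrative, not from the interview: the frame-overhead figures are approximate assumptions, and it ignores arbitration, retries, and PLCA scheduling on the T1S bus.

```python
# Rough frame-time comparison for moving one 8-byte payload (illustrative).
# Overhead figures are approximate assumptions, not exact protocol counts.

def frame_time_us(payload_bits: int, overhead_bits: int, rate_bps: float) -> float:
    """Time in microseconds to transmit one frame on the wire."""
    return (payload_bits + overhead_bits) * 1e6 / rate_bps

# Classical CAN: 64 payload bits plus roughly 47 bits of framing, at 1 Mbps
can_us = frame_time_us(64, 47, 1e6)

# 10BASE-T1S: the same 8 bytes padded to a 64-byte minimum Ethernet frame
# (512 bits), plus roughly 160 bits of preamble and inter-packet gap, at 10 Mbps
t1s_us = frame_time_us(512, 160, 10e6)

print(f"CAN frame: {can_us:.1f} us, 10BASE-T1S frame: {t1s_us:.1f} us")
```

Even with Ethernet's larger minimum frame, the 10x raw bit rate wins for small payloads, and the gap widens quickly for the larger sensor payloads AMRs actually carry.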

This innovation aligns with a broader trend in automotive and robotics, moving toward zonal architectures and converging communication technologies, where 10BASE-T1S is expected to replace CAN in both fields. 

onsemi offers two 10BASE-T1S controllers, the NCN26010 (MAC and PHY) and the NCN26000 (PHY only), both fully compliant with the IEEE 802.3cg specification and supporting onsemi’s Enhanced Noise Immunity (ENI) feature. ENI extends a single 10BASE-T1S segment beyond the standard 40-node, 25-meter single-pair Ethernet (SPE) limit, to 50 meters with 16 nodes or 60 meters with 6 nodes.

This also highlights how advanced computing technologies—once exclusive to data centers—are now running on edge devices in robotics, powered by platforms like NVIDIA Jetson and other embedded processors. This marks an exciting time in the industry, with significant overlap between robotics and automotive solutions.

Luke: What’s next for AMRs? 

Card and Kersjes: Sensors have improved in dynamic range, which has enabled robots to be more effective in uncontrolled environments, such as agricultural settings, outdoor delivery routes, and the like. Force feedback sensors, rotational positioning sensors, and moisture sensors can account for more environmental variability and complete finer tasks, like picking berries.

The NCS32100 inductive position sensor (IPS) is an absolute, contactless rotary position sensor with ±50 arcsec or better accuracy up to 6,000 RPM (revolutions per minute), and a maximum of 45,000 RPM at reduced accuracy. onsemi offers a free online PCB (printed circuit board) design tool enabling quick rotor and stator PCB design. This facilitates the fabrication of a cost-effective, accurate encoder solution to meet the most demanding robotics applications.
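To put the quoted ±50 arcsec figure in perspective, the short calculation below converts it to degrees and to an equivalent encoder bit depth. The "one count per ±50 arcsec window" mapping is our illustrative assumption, not an onsemi specification.

```python
import math

# Translate ±50 arcsec accuracy into degrees and an equivalent encoder
# resolution. Assumption (ours): one distinguishable count per 100 arcsec
# (the full ±50 arcsec window).
ACCURACY_ARCSEC = 50
deg = ACCURACY_ARCSEC / 3600                      # 1 degree = 3600 arcseconds
positions = 360 * 3600 / (2 * ACCURACY_ARCSEC)    # distinguishable positions/rev
bits = math.log2(positions)                       # equivalent bit depth

print(f"±{ACCURACY_ARCSEC} arcsec = ±{deg:.4f} deg "
      f"≈ {positions:.0f} positions/rev (~{bits:.1f} bits)")
```

That works out to roughly a 13.7-bit absolute encoder, which explains why the part targets demanding servo and joint-angle applications.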

Robots also have great potential to alleviate dangerous, mundane, or undesirable tasks humans aren’t interested in, both in the industrial space and in our personal lives, such as cleaning gutters or house painting. 

Autonomous forklifts have become a safety necessity in the industrial environment. OSHA numbers show about 35,000 accidents happen with forklift operators each year in the U.S. alone, plus there’s a 40% turnover in this role, so tapping into a robot for those functions can increase overall warehouse safety.


Luke: Could you tell us more about SLAM and how it works?

Card and Kersjes: SLAM starts with using a virtual model of a warehouse for a mobile robot to learn its environment. The robot uses trial and error to learn how to navigate the environment before it is ever physically in it.

When the AMR is deployed, it is well-trained and will keep updating its environmental map. It can even navigate around dynamic objects like other robots that are in the environment.
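The "keep updating its environmental map" half of SLAM can be sketched as an occupancy-grid update. This toy example assumes the robot's pose is already known (localization omitted) and uses made-up sensor-model constants; real SLAM stacks do both jointly.

```python
import numpy as np

# Minimal occupancy-grid map update (toy sketch; pose assumed known).
# Cells hold log-odds of being occupied: each range reading raises the
# cell where the beam hit and lowers the cells the beam passed through.

GRID = np.zeros((10, 10))      # log-odds map, 0 = unknown
L_HIT, L_MISS = 0.9, -0.4      # assumed sensor-model constants

def integrate_beam(grid, cells, hit):
    """cells: (row, col) tuples along the beam; the last one is the endpoint."""
    for rc in cells[:-1]:
        grid[rc] += L_MISS     # beam passed through: evidence of free space
    if hit:
        grid[cells[-1]] += L_HIT   # beam ended here: evidence of an obstacle

# One beam from (0,0) that hits an obstacle at (0,3)
integrate_beam(GRID, [(0, 0), (0, 1), (0, 2), (0, 3)], hit=True)
print(GRID[0, :4])
```

Dynamic objects like other robots naturally fade from such a map: once they move on, later beams pass through their old cells and push the log-odds back toward free.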

In the automotive space, this concept is known as the “first car.” A vehicle that first encounters a road or a new roadblock must learn from its environment and then send the experience back to the network, so that other vehicles, or in our case other AMRs, can learn from it and update their maps.

The operating system in mobile robots lets us integrate sensors through NVIDIA Holoscan, which provides a fast interface that copies what high-definition or high-bandwidth sensors, like image sensors, see directly into the robot’s memory for processing. This can be vital in other uses, such as telemedicine, where a robot operating on a patient is controlled remotely by a physician and both the latency and the bandwidth of the network are critical.
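The zero-copy idea behind such sensor pipelines can be illustrated in plain Python with a `memoryview`: a consumer reads a window of the camera buffer without duplicating the bytes. The names here are ours for illustration, not an NVIDIA Holoscan API.

```python
# Toy illustration of zero-copy sensor access (names are illustrative,
# not a Holoscan API): a memoryview exposes the camera buffer to a
# consumer without copying any bytes.
frame = bytearray(640 * 480)    # stand-in for a grayscale camera frame
view = memoryview(frame)        # zero-copy window onto the buffer
roi = view[0:640]               # first row of pixels, still no copy

frame[0] = 255                  # the "camera" writes a pixel...
print(roi[0])                   # ...and the consumer sees it immediately
```

Because `roi` aliases the original buffer rather than holding a copy, the write is visible through it at once, which is exactly the latency property high-bandwidth sensor paths are after.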

Modern robots can use two distinct approaches to control and decision-making, referred to as “System 1” and “System 2,” analogous to the two modes of human cognition. System 1 in robotics is characterized by fast, reactive, and often pre-programmed behaviors, similar to how humans react instinctively.

System 2, on the other hand, involves more deliberate, analytical, and potentially slower decision-making processes, requiring more complex computations and reasoning. It also taps into AI and symbolic reasoning to do more complex and higher-level tasks. Both types are needed for robots to become safe and self-sufficient around humans. 

Luke: What types of robotic technology are standing out to you?

Card and Kersjes: Image sensors are the technology we see most across various evaluation boards and customer products. For AMRs, sensor placement is important. For example, if a view is blocked (e.g., by carried objects), sensor clusters in the corners are often used to achieve a 360° field of view.

Technologies like e-fuse (electronic fuses) and graceful reset capabilities are important for managing power and enabling higher-level robot functions, such as intelligent behavior during power faults or navigation issues.

Power management is especially critical in mobile robots, which rely on battery packs rather than stable AC power. Because battery voltage can vary significantly (e.g., 30–42V for a 10-cell pack), efficient DC-DC converters (like the FAN65000 family) are necessary. These converters, with an efficiency of over 95%, help maintain multiple DC rails for subsystems and directly impact battery life, providing longer runtime.
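A back-of-envelope calculation shows why that efficiency figure matters for runtime. The pack capacity and load below are our assumptions for illustration; only the 30–42 V range and the 95% efficiency come from the interview.

```python
# Back-of-envelope runtime estimate for a 10-cell pack behind a 95%-efficient
# DC-DC stage. Capacity and load are our assumptions, not onsemi's numbers.
V_PACK = 36.0        # average pack voltage over the 30-42 V range
CAPACITY_AH = 5.0    # assumed pack capacity
LOAD_W = 100.0       # assumed total subsystem load on the DC rails
EFF = 0.95           # converter efficiency quoted in the interview

input_power_w = LOAD_W / EFF                      # power actually drawn from the pack
runtime_h = (V_PACK * CAPACITY_AH) / input_power_w

print(f"Estimated runtime: {runtime_h:.2f} h")
```

Rerunning the same arithmetic at 85% efficiency drops the estimate by about 11%, which is the kind of system-level difference a few points of converter efficiency buys on a mobile platform.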

At onsemi, we aim to demonstrate product advantages—like battery life improvements from our latest Trench 10 efficient MOSFET—in digital environments such as NVIDIA Omniverse Isaac Sim. The idea is to simulate robot behaviors (e.g., driving specific paths) and link performance outcomes (like longer battery life) to the underlying hardware benefits.

There’s also interest in demonstrating real-world system-level benefits of components (beyond lab testing like pulse or thermal tests) by integrating them into functional robotics simulations and evaluations.

Additionally, the team collaborates with multiple channel partners who have different microcontroller platforms. To support this, the robot system is designed using Docker containers, allowing the Robot Operating System (ROS) to run in a portable and flexible manner across different hardware platforms. These platforms include NVIDIA Jetson, D3 Embedded, Advantech, Renesas, and AMD.

This approach enables easy adaptation of the robotics software across various partner ecosystems.
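A minimal container recipe for the approach described above might look like the following. This is a hypothetical sketch: the base image tag is a real ROS 2 distribution image, but the workspace layout and launch file names are our illustrative assumptions, not the team's actual setup.

```dockerfile
# Hypothetical sketch of the container approach: a ROS 2 base image plus
# the robot's own packages, so the same image runs on Jetson, x86, or
# other partner hardware (package and launch names are illustrative).
FROM ros:humble-ros-base

# Copy the robot's ROS packages into a workspace and build them
COPY ./src /ws/src
RUN cd /ws && . /opt/ros/humble/setup.sh && colcon build

# Launch the (hypothetical) bringup stack when the container starts
CMD ["bash", "-c", ". /ws/install/setup.sh && ros2 launch my_robot bringup.launch.py"]
```

Because the ROS graph communicates over the network, a container built this way can be dropped onto any of the partner platforms with a matching architecture build, which is the portability the Docker approach is after.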

Luke: Where do you see the future of robotics headed?

Card and Kersjes: We see 2025 as “the year of proof that the robot can do it,” with so many new innovations maturing in the space. These range from indirect time-of-flight (iToF) cameras, which measure distance to objects by analyzing how modulated light waves reflect off surfaces and return to the sensor, to physical AI that helps accelerate the learning and training process for robots.
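The iToF principle mentioned above reduces to a single formula: distance follows from the measured phase shift as d = c·φ / (4π·f_mod). The 20 MHz modulation frequency and the phase value below are illustrative assumptions.

```python
import math

# iToF distance from phase shift: d = c * phase / (4 * pi * f_mod).
# Modulation frequency and measured phase are illustrative assumptions.
C = 299_792_458.0        # speed of light, m/s
F_MOD = 20e6             # modulation frequency (assumed)
phase = math.pi / 2      # measured phase shift in radians (assumed)

distance_m = C * phase / (4 * math.pi * F_MOD)
unambiguous_m = C / (2 * F_MOD)   # range before the phase wraps around

print(f"{distance_m:.3f} m of {unambiguous_m:.3f} m unambiguous range")
```

The wrap-around term is why practical iToF cameras often interleave two modulation frequencies: a phase of φ and φ + 2π are indistinguishable at a single frequency, capping the unambiguous range at c / (2·f_mod).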

All the different sensor technologies that are needed for various use cases help the robotics systems remain safe and effective across multiple applications — leading to robotic use and volumes taking off in the coming years.




About the experts

Shawn Luke is a technical marketing engineer at DigiKey Electronics with over two years of experience at the company. Previously, he was a product manager at Total Expert and a product owner at Medtronic.

Luke holds an MBA from the University of Vermont and a BS in electrical engineering from the University of Minnesota.

Bob Card is the Americas marketing manager of the Analog Mixed Signal Group at onsemi. Card has over 16 years of experience at the Scottsdale, Ariz.-based company, where he previously worked as an applications engineer.

Prior to onsemi, Card was a product engineer at National Semiconductor. He holds a BA in Literature from the University of Massachusetts Amherst and an associate’s in electronics from the Wentworth Institute of Technology.

Theo Kersjes is a product manager of systems engineering and industrial business development and solutions leader at onsemi. With over six years of experience at the company, he has extensive experience working in and leading multi-site projects. Kersjes has proven success in product development, software, application support, and business development.

Editor’s note: This article was transcribed from DigiKey’s video series.

Learn about the latest in AI at RoboBusiness

This year’s RoboBusiness, which will be on Oct. 15 and 16 in Santa Clara, Calif., will feature the Physical AI Forum. This track will feature talks about a range of topics, including conversations around safety and AI, simulation-to-reality reinforcement training, data curation, deploying AI-powered robots, and more. Attendees can hear from experts from Dexterity, ABB Robotics, UC Berkeley, Roboto, GrayMatter Robotics, Diligent Robotics, and Dexman AI.

In addition, the show will start with a keynote from Deepu Talla, the vice president of robotics and edge AI at NVIDIA, on how physical AI is ushering in a new era of robotics.

RoboBusiness is the premier event for developers and suppliers of commercial robots. The event is produced by WTWH Media, which also produces The Robot Report, Automated Warehouse, and the Robotics Summit & Expo.

This year’s conference will include more than 60 speakers, a startup workshop, the annual Pitchfire competition, and numerous networking opportunities. Over 100 exhibitors on the show floor will showcase their latest enabling technologies, products, and services to help solve your robotics development challenges.

Registration is now open for RoboBusiness 2025.



