Collision detection and tracking technologies designed to make collaborative robots safer around humans and other moving objects are emerging from startups and research labs. A University of California, San Diego (UCSD) team has designed a faster, machine learning-based algorithm to help robots avoid obstacles, and Massachusetts Institute of Technology (MIT) spinoff Humatics is developing artificial intelligence (AI)-assisted indoor radar systems so robots can precisely track human movements.
The UCSD algorithm, called Fastron, uses machine learning to speed and simplify collision detection. It classifies configurations of moving objects as collisions or non-collisions based on a model of the robot's configuration space (C-space) built from only a small number of collision and collision-free sample points. Existing collision detection algorithms are computation-intensive because they specify all the points in the 3D geometries of robots and their obstacles, and then check every point for possible collisions between two bodies. When those bodies are moving, the computational load increases drastically.
Fastron's C-space model serves as a proxy for kinematics-based collision detection. The algorithm combines a modified kernel perceptron learning algorithm with an active learning strategy to reduce the number of kinematics-based collision checks. Instead of checking every point, it samples near the classification boundary and labels configurations as collisions or non-collisions. Because that boundary shifts as objects move, the algorithm rapidly updates its classifier and then repeats the cycle.
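The published Fastron work goes well beyond what is described here (including its choice of kernel and its active-learning heuristics), so the following is only a minimal Python sketch of the general idea: a greedy kernel-perceptron classifier trained on sampled configurations and then queried as a cheap proxy for geometric collision checks. The Gaussian kernel, the class name `ProxyCollisionClassifier`, and the toy disk-shaped collision region are illustrative assumptions, not the team's actual implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=5.0):
    # Gaussian kernel between rows of A (n, d) and rows of B (m, d) -> (n, m)
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2)

class ProxyCollisionClassifier:
    """Greedy kernel-perceptron proxy for C-space collision checks (illustrative)."""

    def __init__(self, gamma=5.0, max_updates=500):
        self.gamma = gamma
        self.max_updates = max_updates

    def fit(self, X, y):
        # X: (n, d) sampled configurations; y: +1 = in collision, -1 = collision-free
        self.X, self.y = X, y.astype(float)
        self.alpha = np.zeros(len(X))
        K = rbf_kernel(X, X, self.gamma)
        for _ in range(self.max_updates):
            margins = self.y * (K @ (self.alpha * self.y))
            worst = int(np.argmin(margins))   # sample nearest to, or across, the boundary
            if margins[worst] > 0:            # every sample already classified correctly
                break
            self.alpha[worst] += 1.0          # kernel-perceptron update on that sample
        return self

    def predict(self, Q):
        # Cheap proxy check: no robot or obstacle geometry is touched, only the model
        K = rbf_kernel(Q, self.X, self.gamma)
        return np.sign(K @ (self.alpha * self.y))

# Toy usage: a 2-DOF robot whose true collision region is a disk in C-space.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.where(np.linalg.norm(X - 0.3, axis=1) < 0.4, 1.0, -1.0)  # "expensive" ground truth
clf = ProxyCollisionClassifier().fit(X, y)
print(clf.predict(rng.uniform(-1.0, 1.0, size=(5, 2))))  # +1 = collision, -1 = free
```

When a query's proxy prediction sits close to the learned boundary, an exact geometric check can still be run on that configuration alone, which is how a proxy model of this kind keeps the expensive checks to a small fraction of all queries.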
In simulations, the team has demonstrated that proxy collision checks can be done twice as fast as an efficient polyhedral checker, and eight times as fast as an efficient high-precision collision checker, without the need for GPU acceleration or parallel computing. In addition to use on the factory floor, one potential application is helping robotic surgical arms autonomously perform assistive tasks more safely during surgery, without interfering with either the patient's organs or surgeon-controlled robot arms.
Startup Humatics was co-founded by CEO David Mindell, who is also a professor of aerospace engineering and history of technology at MIT. Its chief product officer is Stephen Toebes, former senior vice president of product development and operations for collaborative robotics leader Rethink Robotics, makers of Baxter and Sawyer robots. Its vice president and principal software architect is Michael Barbehenn, former vice president of software and a longtime leader at Kiva Systems/Amazon Robotics, pioneers in mobile warehouse robots.
The company's Spatial Intelligence Platform combines a micro-location system based on inexpensive RF technology with AI-assisted analytics software. A single system can track multiple, moving transponder targets with millimeter-scale precision at ranges of up to 30 meters. Multiple systems can be networked together for broader coverage, from factory work cells to entire distribution centers.
Humatics wants to create a world with small, inexpensive RF beacons that can do centimeter- and millimeter-scale absolute reference positioning: indoors, outdoors, and in all weather, said Mindell. "Collaborative robots don't really know where people are. To be incorporated into human environments they will all have to be position-navigating the things around them. So we think the future of autonomous robots, whether they are cars, or automated robots in factories, or drones, will all be part of a connected world."
Although the company is working in small, short-range radar, "we're ambivalent about the term radar," said Mindell. "Our Spatial Intelligence Platform is not a backscatter system, it's a secondary radar system in the way air traffic control is secondary, or beacon-to-beacon." The current system is a single unit that makes millimeter-accurate 3D measurements and can track a large number of small, battery- or vehicle-powered mobile beacons, or "pucks," mounted on moving objects such as people or other robots, delivering millimeter-scale tracking at very high update rates, he said.
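Humatics has not disclosed its ranging protocol, but the basic secondary-radar arithmetic is straightforward: the unit measures the round-trip time to a transponder, subtracts the transponder's known reply delay, and converts the remaining time of flight into distance at the speed of light. A minimal sketch, with all timing values invented for illustration:

```python
# Generic two-way-ranging arithmetic for a secondary (transponder-based) radar.
# The reply delay and round-trip timings below are illustrative, not Humatics figures.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def two_way_range_m(round_trip_ns: float, reply_delay_ns: float) -> float:
    # One-way range = c * (round-trip time - transponder turnaround delay) / 2
    time_of_flight_s = (round_trip_ns - reply_delay_ns) * 1e-9
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0

# A ~200 ns two-way flight corresponds to roughly the 30-meter range quoted above.
print(round(two_way_range_m(round_trip_ns=210.1, reply_delay_ns=10.0), 3))  # ~30 m
```

At these scales, a timing error of a few picoseconds already translates into a range error on the order of a millimeter, which is why millimeter-scale precision is above all a timing problem.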
The company is building its own analytics, said Mindell. "Everything is moving around at the millimeter scale, so there's a lot of richness in this precision micro-location data." The core algorithms are basic recursive estimators, which are self-tuning and self-optimizing. As they gather reams of data they become better at analyzing motion and position, he said.
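Mindell does not specify which estimators Humatics uses; the textbook example of a recursive position estimator is a Kalman filter, which refines its estimate with each new measurement instead of reprocessing the full history. Below is a minimal constant-velocity sketch for one tracked beacon; the noise parameters are assumed for illustration, and the self-tuning behavior Mindell describes is not modeled here.

```python
# A generic recursive estimator (constant-velocity Kalman filter) for one beacon.
# All model and noise parameters are illustrative assumptions.
import numpy as np

class BeaconTracker:
    def __init__(self, dt=0.01, meas_noise_mm=2.0, accel_noise=0.5):
        # State: [x, y, vx, vy]; measurements: noisy (x, y) position fixes in metres
        self.x = np.zeros(4)
        self.P = np.eye(4)
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt   # constant-velocity model
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        q = accel_noise ** 2                                    # simplified process noise
        self.Q = q * np.diag([dt**4 / 4, dt**4 / 4, dt**2, dt**2])
        self.R = ((meas_noise_mm / 1000.0) ** 2) * np.eye(2)    # measurement noise

    def update(self, z):
        # Predict forward one time step
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the new position measurement z = (x, y)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]   # current position estimate

# Toy usage: smooth a stream of noisy fixes for one puck moving in a straight line.
rng = np.random.default_rng(1)
tracker = BeaconTracker()
truth = np.column_stack([np.linspace(0.0, 1.0, 100), np.linspace(0.0, 0.5, 100)])
for z in truth + rng.normal(scale=0.002, size=truth.shape):
    estimate = tracker.update(z)
print(np.round(estimate, 3))   # approaches the final true position (1.0, 0.5)
```

The recursive structure is what makes the approach scale: each new measurement costs a fixed amount of computation per beacon, no matter how long the system has been running.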
The system's hardware and software are inexpensive and scalable. The industry has driven down the cost of microwave and millimeter-wave electronics, and standard APIs for the technology let the data be used by other applications and services. The architecture is extensible, so the system can be networked throughout a large factory or other space to provide broader coverage, said Mindell. The solution will be piloted in 2018 and launched in early 2019.