Prophesee Foresees Event-Driven CIS, Lidar

Published: 2018-03-14 00:00
By: Ameya360
Source: Junko Yoshida
Views: 1081

  In the fast-growing markets for factory automation, IoT, and autonomous vehicles, CMOS image sensors appear destined for a role capturing data not for human consumption but for machines to see what they need to make sense of the world.

  CMOS image sensors “are becoming more about sensing rather than imaging,” said Pierre Cambou, activity leader, MEMS & Imaging at Yole Développement. The Lyon, France-based market research and technology analysis company boldly predicts that by 2030, 50% of CMOS image sensors will serve “sensing” devices.

  Paris-based Prophesee SA (formerly known as Chronocam) styles itself as a frontrunner in that revolution. A designer of advanced neuromorphic vision systems, it advocates an event-based approach to sensing and processing. Prophesee’s bio-inspired vision technology has been deemed too radically different from conventional machine vision — and perilously “ahead of its time.” But Luca Verre, co-founder and CEO of Prophesee, told us that this is no longer the case.

  In a one-on-one interview here, Verre said that his company has secured its Series B plus funding (the startup raised $40 million in funding in the last three years). It now has a partnership deal with a large unnamed consumer electronics company. Most importantly, Prophesee is now advancing its neuromorphic vision system from the usual technology concept pitch to promoting its reference system for tinkering by developers.

  Prophesee’s first reference design, available in VGA resolution, consists of Prophesee’s Asynchronous Time-Based Image Sensor (ATIS) chip and software algorithms. The ASIC will be manufactured by a foundry partner in Israel, said Verre — most likely TowerJazz.

  The company declined to detail its ASIC or the specifications of the reference design. Prophesee said it plans a formal product announcement in several weeks.

  Nonetheless, the startup has reached a milestone: the reference design gives system designers the opportunity to see and experience just what an ATIS can accomplish in data sensing. The ATIS is characterized by high temporal resolution, a low data rate, high dynamic range, and low power consumption, said Prophesee.

  Cameras are bottlenecks

  Makers of cameras for machine-vision systems — whether in smart factories, IoT, or autonomous vehicles — have begun to heed the event-based approach promoted by Prophesee’s co-founders such as Ryad Benosman and Christoph Posch.

  For all of the detailed visual information that traditional cameras can capture, “the camera has become a technology bottleneck,” said Verre. Cameras are unquestionably the most powerful sensing devices. Yet in automation systems, surveillance networks, and highly automated vehicles, the sheer volume of visual data they produce can slow down processing.

  Consider self-driving cars, said Verre. The central processing system inside the vehicle is bombarded with data from cameras, lidars, radars, and other sources. The key to managing this overload is figuring out how best to “reduce the amount of raw data” streamed from sensors. The sensors should capture only data that matters to “a region of interest,” said Verre.

  As Prophesee explained in past interviews with EE Times, the company’s event-driven vision sensors are inspired by biology. The approach derives from the co-founders’ research on how the human eye and brain work.

  Ryad Benosman, Prophesee’s co-founder, told us that human eyes and brains “do not record the visual information based on a series of frames.” Biology is much more sophisticated. “Humans capture the stuff of interest — spatial and temporal changes — and send that information to the brain very efficiently,” he said. That’s principally what Prophesee’s ATIS does.

  Noting that the ATIS is not bound by frames, Verre explained, “Our technology will not have to miss important events that might have happened between frames.”
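This frame-free principle can be illustrated with a toy simulation. The sketch below is not Prophesee’s actual ATIS pipeline; it only models the general idea behind event cameras, in which each pixel independently emits an “event” whenever its log-intensity changes by more than a threshold, so a static scene produces no data at all.

```python
import numpy as np

def frames_to_events(frames, threshold=0.2):
    """Toy event-camera model: emit (t, x, y, polarity) whenever a pixel's
    log-intensity drifts past a threshold since that pixel's last event.
    Illustrative only -- not Prophesee's actual algorithm."""
    ref = np.log1p(frames[0].astype(np.float64))  # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logf = np.log1p(frame.astype(np.float64))
        diff = logf - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
            ref[y, x] = logf[y, x]  # reset the reference at each event
    return events

# A static scene generates no events; only the pixel that changes does.
static = np.zeros((4, 4))
moving = static.copy()
moving[2, 3] = 10.0
evts = frames_to_events([static, static, moving])
print(evts)  # a single ON event at the changed pixel
```

Because nothing happens between events, the stream also carries the precise time of every change, which is what lets an event sensor avoid missing anything “between frames.”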

  In short, what Prophesee’s ATIS offers is everything that frame-based image sensors are not. In the view of another co-founder, Christoph Posch, “Frame-based methodology results in redundancy in the recorded data, which triggers higher power consumption.” He said, “Results include inefficient data rates and inflated storage volume. Frame-based video, at 30 or 60 frames per second, or even a much higher rate, causes a catastrophe in image capturing.”
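Posch’s redundancy point is easy to quantify with a back-of-the-envelope comparison. The numbers below (activity rate, bytes per event) are illustrative assumptions, not Prophesee’s measured figures:

```python
# Frame-based: VGA, 8-bit mono, 60 fps streams every pixel every frame.
width, height, fps = 640, 480, 60
frame_rate_bytes = width * height * 1 * fps  # bytes per second
print(frame_rate_bytes / 1e6)                # ~18.4 MB/s

# Event-based: only changed pixels report. Suppose 2% of pixels change
# per frame interval and each event costs ~8 bytes (x, y, time, polarity).
activity, event_size = 0.02, 8
event_rate_bytes = width * height * activity * fps * event_size
print(event_rate_bytes / 1e6)                # ~2.9 MB/s
```

Under these assumptions the event stream is roughly 6x leaner, and in a genuinely static scene it drops toward zero, which is where the power and storage savings come from.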

  Event-driven approach for lidars

  Verre last week disclosed to us that Prophesee is exploring the possibility that its event-driven approach can apply to other sensors such as lidars and radars. Verre asked: “What if we can steer lidars to capture data focused on only what’s relevant and just the region of interest?” If it can be done, it will not only speed up data acquisition but also reduce the data volume that needs processing.

  Prophesee is currently “evaluating” the idea, said Verre, cautioning that it will take “some months” before the company reaches a conclusion. But he added, “We’re quite confident that we can pull it off.”

  Asked about Prophesee’s new idea — to extend the event-driven approach to other sensors — Yole Développement’s analyst Cambou told us, “Merging the advantages of an event-based camera with a lidar (which offers the ‘Z’ information) is extremely interesting.”

  Noting that problems with traditional lidars are tied to limited resolution — “relatively less than typical high-end industrial cameras” — and the speed of analysis, Cambou said that the event-driven approach can help improve lidars, “especially for fast and close-by events, such as a pedestrian appearing in front of an autonomous car.”

  The downside is that lidar hardware would have to be changed, he added. More importantly, though, Prophesee needs a strong buy-in from lidar companies to this event-driven approach.

  Cambou said, “Sure, this is always the problem for a technology startup.” He pointed out that Mobileye needed lead customers such as Volvo and Tesla before its technology went mainstream and gained broad acceptance. Movidius, now an Intel company, needed DJI to become successful. “Prophesee will need a strong partner in order to have its solution largely adopted,” said Cambou.

  “Given the market drivers in the realm of robotic vehicles (safety first, technology-driven, not so cost-conscious),” he added, “this should be possible.”

  Although Cambou raised the question of whether a large player such as Google would depend on a small startup for its technology, he noted that the small volumes involved make this less of an issue.

  ISSCC demo

  While Prophesee is not forthcoming with details of its reference design, it is not vaporware.

  At the International Solid-State Circuits Conference (ISSCC) last month, Prophesee was among companies invited to present their technologies at an Industry Showcase session. The startup attached a high-speed VGA camera unit (based on its reference design) to a guitar, demonstrating its Asynchronous Time-Based Image Sensor’s ability to measure and visualize the invisible vibrations of guitar strings in real time at frequencies up to 1.5 kHz.
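The demo underlines why temporal resolution matters. By the Nyquist criterion, resolving a 1.5 kHz vibration requires sampling at more than 3 kHz, far beyond any conventional video frame rate, whereas fine-grained event timestamps clear it easily. The microsecond timestamp resolution below is an illustrative assumption:

```python
# Nyquist check: minimum sampling rate needed to resolve a vibration.
def min_sampling_rate_hz(signal_hz):
    return 2 * signal_hz  # Nyquist criterion

guitar_string_hz = 1500          # ~1.5 kHz vibration, as in the ISSCC demo
needed = min_sampling_rate_hz(guitar_string_hz)

frame_camera_fps = 60            # conventional video camera
event_timestamp_hz = 1_000_000   # microsecond timestamps (illustrative)

print(needed)                          # 3000 samples/s required
print(frame_camera_fps >= needed)      # False: 60 fps cannot resolve it
print(event_timestamp_hz >= needed)    # True
```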

  Prophesee’s market opportunities

  Prophesee expects the company’s reference design to find its first home in machine-vision applications on the smart factory floor. Verre is counting on event-driven vision systems as Prophesee’s first revenue-generating commercial product.

  But why smart factories?

  Robotic system builders using machine vision worry about three things, said Verre. “First, they want to decrease machines’ downtime via predictive analysis. Second, they must ensure safety for workers as co-bots increase. Third, they are eager to increase the speed of production by accelerating the cadence of what machines can detect.”

  He added, “Our technology can offer high temporal resolution at great precision,” making it ideal for “predictive maintenance — detecting abnormal behavior, area monitoring, and making systems run fast.”

  No frames, no clock

  Prophesee’s ambition extends to IoT. “Our event-driven image sensors are always on, featuring no clock, and running on ultra-low power,” said Verre.

  Last summer, a study by LDV Capital, a VC firm that invests in visual technologies such as computer vision, predicted that 45 billion cameras will be watching the world by 2022. Many will be ultra-low-power streaming cameras doing surveillance in retail spaces and smart buildings, noted Verre. With that much more data to process, Prophesee’s event-driven sensors can offer increased value, he added.

  According to Prophesee, the company’s ATIS offers not only extremely fast vision processing but also a high dynamic range of more than 120 dB. It is very power-efficient, added the company, consuming less than 10 mW in operation.
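The 120 dB figure translates into scene contrast via standard sensor dynamic-range arithmetic (this is generic dB math, not anything Prophesee-specific):

```python
import math

def db_to_intensity_ratio(db):
    # Image-sensor dynamic range convention: DR = 20 * log10(I_max / I_min),
    # so the intensity ratio is recovered as 10^(dB / 20).
    return 10 ** (db / 20)

# 120 dB corresponds to a brightest-to-darkest ratio of about a million to one,
# e.g. resolving detail in deep shadow and direct sunlight in the same scene.
print(db_to_intensity_ratio(120))
# A typical ~60 dB sensor covers only about a 1000:1 ratio:
print(db_to_intensity_ratio(60))
```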

  ATIS for redundancy?

  The startup today sees ADAS and highly automated vehicles as the default opportunities for its sensors. Until last year, said Verre, many autonomous vehicle technology suppliers, including Prophesee, were concerned that the overhyped robotic vehicle market was a bubble destined to pop. He said, “But obviously that did not happen. Instead, we are seeing the acceleration of autonomous vehicle development and more investment pouring into the field.”

  As Euro NCAP updates its reward system for 2021 and 2022 vehicles, prompting OEMs to include features like autonomous emergency braking and vision enhancement systems, many traditional tier ones and OEMs are “making decisions this year” about what to incorporate into their ADAS/AV cars, explained Verre.

  Of course, Prophesee’s Asynchronous Time-Based Image Sensors fundamentally change the nature of the data an image sensor captures. Furthermore, to exploit event-driven sensors, users will need a new mathematical model, meaning new algorithms, for machine vision. Given this, Verre believes that OEMs are likely to use Prophesee’s ATIS for redundancy. “We provide temporal resolution that conventional image sensors can’t provide.”

  Alternatively, “our system can be used as a replacement for some lidars in ADAS,” added Verre, “as we can capture relevant information in a more efficient manner.”

  More pre-processing on the edge

  Speaking of recent imaging developments, Verre said, “The industry’s trend is to do a lot more pre-processing on the edge.” A good example is Sony’s three-layer stacked imaging sensor, he noted.

  Sony’s CIS chip consists of three stacked dies, with a 40-nm logic substrate at the bottom. On top of the logic die sits DRAM, flip-chip bonded face-down toward the logic. At the very top are 90-nm backside-illuminated pixels.

  In contrast, conventional non-stacked CIS chips collect signal data from the pixels and send it serially through the logic circuit and out through the interface. This inherently restricts the chip’s throughput to the output speed of the interface, which in turn caps the pixel readout speed.

  The new three-layer stacking CIS is designed to speed up such pre-processing operations.
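The serial-readout limit, and why an on-chip DRAM buffer relieves it, can be sketched with rough numbers. All figures below (pixel count, bit depth, link speed, DRAM size) are illustrative assumptions, not Sony’s actual specifications:

```python
# Maximum frame rate when every pixel must exit through one serial interface.
pixels = 12_000_000            # a 12 MP sensor (illustrative)
bits_per_pixel = 10
interface_bps = 2_400_000_000  # 2.4 Gbit/s serial link (illustrative)

max_fps_serial = interface_bps / (pixels * bits_per_pixel)
print(max_fps_serial)          # the interface caps the frame rate at 20 fps

# Buffering a burst into stacked DRAM decouples pixel readout from the
# interface: the sensor can capture a short burst at full pixel speed,
# then drain the buffer out over the slow link afterward.
dram_bits = 1_000_000_000      # 1 Gbit stacked DRAM (illustrative)
burst_frames = dram_bits // (pixels * bits_per_pixel)
print(burst_frames)            # 8 full frames fit in the burst buffer
```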

  Does this mean that if Prophesee wants to go mainstream, it must be able to offer something similar to Sony’s Pixel/DRAM/logic three-layer stacked CIS?

  Yole’s Cambou said, “Definitely, in the future Prophesee will require 3D semiconductor technology to shrink its pixel.”

  He added, “Nevertheless, I think they are able to do a 15-μm pixel without a 3D stack, which should be okay to start doing business.”

  He added, “As a fabless company, it will depend on the availability of open foundries for future products.” As time goes by, Cambou suspects that access to 3D technology will get easier. He said, “TSMC is probably able to offer a solution today. TowerJazz has announced it would go in this direction. SMIC and Dongbu are other potential foundry partners.”
