Inuitive Announces NU4100 Robotics Processor IC
Inuitive, Ltd., a Vision-on-Chip processor company, has announced the launch of the NU4100, an extension of its Vision and AI IC portfolio. Based on Inuitive’s unique architecture and an advanced 12nm process technology, the NU4100 combines a built-in dual-channel 4K ISP, enhanced AI processing, and depth sensing in a low-power, single-chip design, setting a new industry standard for Edge-AI performance.
The NU4100 is the second generation of the NU4x00 product series. The NU4x00 series is ideal for robotics, drone, virtual reality, and artificial intelligence applications that require aggregation, processing, conditioning, and streaming from multiple sensors. It is specifically designed for robots and other applications that need to sense and analyze the environment using three, six, or more cameras while making actionable decisions in real time based on this input.
“Robot designers are demanding higher resolutions, ever-increasing channel counts, and improved, high-performance AI and VSLAM capabilities,” said Shlomo Gadot, CEO of Inuitive. “The addition of the NU4100 to the Vision-on-Chip series of processors is a real breakthrough, thanks to all of the integrated vision capabilities combined in a single full-mission computer chip. The integrated dual-camera ISP provides much-needed flexibility without the need to add components which, in turn, would require additional processing power at a higher price.”
Mr. Gadot also said, “Inuitive is committed to bringing the most advanced technology to market. The NU4500, the next processor on our roadmap, is slated for tape-out in the first quarter of 2023 with eight additional ARM Cortex-A55 cores, more than double the AI computing power, and H.265 and H.264 video encoders and decoders, and it will be the ultimate single-chip solution for robotics applications.”
The NU4100 supports multi-camera designs and can simultaneously process and stream two imager channels of up to 12MP, or 4K resolution, each at 60 frames per second (fps), while running advanced AI networks. The IC raises the level of integration of products using Inuitive technology and accelerates AI processing by 2X to 4X while consuming 20% less power than the first-generation Inuitive processor.
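To put those streaming figures in perspective, a back-of-envelope calculation shows the aggregate pixel throughput implied by two 4K channels at 60 fps. This is an illustrative sketch assuming 4K UHD (3840×2160) frames; it is not a specification of the NU4100's internal pipeline.

```python
# Illustrative throughput estimate for two 4K UHD streams at 60 fps.
# Assumes 3840x2160 frames; actual NU4100 pipeline details may differ.
WIDTH, HEIGHT = 3840, 2160
FPS = 60
CHANNELS = 2

pixels_per_frame = WIDTH * HEIGHT                      # 8,294,400 pixels
pixels_per_second = pixels_per_frame * FPS * CHANNELS  # aggregate across both channels

print(f"{pixels_per_second / 1e6:.0f} Mpixels/s")      # -> 995 Mpixels/s
```

Roughly a billion pixels per second must pass through the ISP and vision pipeline, which illustrates why a dedicated single-chip design is pitched over a general-purpose processor for this workload.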
The new NU4100 has already been adopted by industry leaders in the consumer electronics and Metaverse markets, who have secured it for their products over alternative solutions. Client products powered by the NU4100 will be available from Q1 2023.
“Robots are increasingly dependent on vision processors. Their ability to perceive and understand the environment is fundamental to achieving a higher level of robot autonomy,” said Dor Zepeniuk, CTO and VP of Product at Inuitive. “Processing input streams from multiple cameras extends the independence and flexibility of the robot, while the built-in dual-channel 4K ISP improves system capabilities. Both, in turn, serve the end goal of designing powerful products at lower cost.”
Key features and capabilities of the new NU4100 include:
- Proprietary Inuitive Deep Vision Accelerators (IDVA):
  - High-throughput, low-latency stereo hardware engine
  - SLAM hardware accelerators
  - General-purpose imaging/vision engines
- Dual-camera ISP unit – up to 12MP per video stream
- Dual-core Vision-DSP with 384 GOPS – optimized for computer vision functions
- Efficient AI engine with 3.2 TOPS of processing power for DNNs
- ARM Cortex-A5 processor running Linux operating system
- Connectivity for up to 6 cameras
- Fast interfaces – USB3.0, MIPI CSI/DSI – Rx & Tx, LPDDR4 and more
The high resolution and advanced AI processing provided by the new IC can benefit many other Edge-AI applications. Industry 4.0 facilities, for example, can take advantage of its performance and image resolution for better process control and a higher level of automation. Similarly, drones can use the ISP and neural-network vision functions, such as low-light enhancement, to operate autonomously in both dark and bright environments.
NU4100 samples are already available and will be ready for mass production by January 2023.