Algorized and KUKA Redefine Robot Safety at CES 2026: The First Robot with "Intuition"

Algorized, the pioneer in people-sensing foundational models, and KUKA, one of the global leaders in intelligent automation, are introducing a breakthrough that redefines human–robot collaboration: the industry's first Predictive Safety Engine, powered entirely by real-time Edge AI.

Emergency stops, frozen robots, flashing red lights - that's how safety has looked for decades. Automation has lived with a rigid trade-off: the safer the robot, the slower the line. That era ends today.

Debuting at CES 2026, this collaboration integrates Algorized's new foundational model directly into KUKA's robotic arm, allowing machines to do what was previously impossible: perceive human physiology, understand intent, anticipate movement, and adapt in real time - even through occlusion, darkness, or visual clutter. The result: high-speed productivity in shared human–robot spaces. Enabling this capability is the IWR6843AOP mmWave radar sensor from Texas Instruments, which combines SIL2-certified safety with high-resolution sensing to detect human presence and movement with precision – even in conditions where cameras and lidars fail.
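The announcement does not describe the software interface between the radar and the model, so the following is a rough, hypothetical sketch of the data flow it implies: point-cloud frames from the mmWave sensor are consumed and classified entirely on the device. RadarFrame and PeopleSensingModel are invented placeholder names, not TI or Algorized APIs.

```python
# Illustration only: RadarFrame and PeopleSensingModel are hypothetical
# placeholders, not the TI or Algorized APIs. The point is the data flow:
# radar point clouds are classified on-device, with no cloud round trip.
from dataclasses import dataclass
from typing import Iterable, List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in metres, robot base frame

@dataclass
class RadarFrame:
    timestamp_s: float
    points: List[Point]  # one mmWave point cloud

@dataclass
class PresenceEstimate:
    person_present: bool
    nearest_range_m: float

class PeopleSensingModel:
    """Placeholder for an edge-deployed people-sensing model."""

    def infer(self, frame: RadarFrame) -> PresenceEstimate:
        # A real model would classify radar returns; this stub only
        # reports whether any return exists and how close it is.
        if not frame.points:
            return PresenceEstimate(False, float("inf"))
        nearest = min((x * x + y * y + z * z) ** 0.5 for x, y, z in frame.points)
        return PresenceEstimate(True, nearest)

def run(frames: Iterable[RadarFrame], model: PeopleSensingModel):
    """Feed each radar frame to the on-device model, frame by frame."""
    for frame in frames:
        yield model.infer(frame)

# Two synthetic frames: an empty one, then a return roughly 1 m away.
demo = [RadarFrame(0.0, []), RadarFrame(0.1, [(0.6, 0.8, 0.0)])]
for estimate in run(demo, PeopleSensingModel()):
    print(estimate.person_present, round(estimate.nearest_range_m, 2))
```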

Our goal goes beyond human detection - we're addressing the full handover problem.

The Shift: Physics, Not Pixels

Current vision systems are fragile - they fail in low light, get blocked by occlusion, and require massive cloud compute. They see pixels, but they don't understand context.

The Algorized engine runs entirely at the edge, using wireless sensors to digitize the environment based on physics. It delivers five capabilities that redefine the standard for human–machine interaction (a conceptual sketch of how they might combine into one decision follows the list):

  1. Entity Classification: The system knows the difference between a human, a robot, and an asset, directing safety logic precisely where it matters and keeping the rest of the cell productive.
  2. Micro-Motion & Vital Sign Perception: It detects breathing and heart rate. A motionless operator is no longer invisible to the machine.
  3. Non-Visual Intent Recognition: It understands motion trajectories, posture, and approach angles to predict intent - handover, walkthrough, or potential hazard - and adapts robot behavior accordingly.
  4. Occlusion Immunity: It sees through smoke, low light, clutter, and partial blockers.
  5. Sovereign Edge Processing: Zero latency. Zero cloud dependency. The intelligence lives on the machine.
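None of this is published as code; purely as a conceptual sketch, the example below shows how signals like these five could fold into a single speed-scaling decision. Every class name, label, and threshold is invented for illustration.

```python
# Conceptual sketch only, not Algorized's or KUKA's implementation: it shows
# how signals like the five capabilities above could combine into one
# speed-scaling decision. All names and thresholds are invented.
from dataclasses import dataclass

@dataclass
class ZoneState:
    entity: str        # "human", "robot", or "asset"  (capability 1)
    distance_m: float  # separation between person and robot flange
    intent: str        # "handover", "walkthrough", or "hazard"  (capability 3)

def speed_scale(state: ZoneState) -> float:
    """Return a factor in [0.0, 1.0] applied to the robot's programmed speed."""
    if state.entity != "human":
        return 1.0  # assets and other robots: keep full speed
    if state.intent == "hazard" or state.distance_m < 0.5:
        # Protective stop; the person may have been detected from breathing
        # alone (capability 2) even if they are standing perfectly still.
        return 0.0
    if state.intent == "handover":
        return 0.3  # slow, cooperative motion for the handover
    # Walkthrough: scale smoothly with separation instead of freezing the cell.
    return min(1.0, max(0.3, (state.distance_m - 0.5) / 2.0))

print(speed_scale(ZoneState("human", 1.2, "walkthrough")))  # 0.35
```

The shape of the decision is the point: classification gates the safety logic, predicted intent selects a behavior, and separation distance scales speed continuously instead of toggling between full speed and an emergency stop.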

The CES Experience: The "Glass Box"

At CES 2026, we are stripping away the marketing layers. The "Glass Box" demonstration visualizes the robot's brain in real time. You will see exactly what the machine sees: the precise moment it classifies a human, predicts the intent vector, and dynamically adjusts its speed - without ever stopping.
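The release does not say how the Glass Box display is fed. As a purely hypothetical sketch, a demo of this kind might stream one record per control cycle to the visualization, carrying the entity class, the predicted intent, and the speed factor actually commanded to the arm; the field names below are invented.

```python
# Hypothetical telemetry record for a "Glass Box"-style visualization;
# field names are invented, not taken from Algorized's or KUKA's software.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class GlassBoxRecord:
    timestamp_s: float
    entity: str          # what the engine classified ("human", "asset", ...)
    intent: str          # predicted intent ("handover", "walkthrough", ...)
    speed_factor: float  # fraction of programmed speed actually commanded

def emit(record: GlassBoxRecord) -> str:
    """Serialize one decision cycle as JSON for a live dashboard."""
    return json.dumps(asdict(record))

print(emit(GlassBoxRecord(time.time(), "human", "handover", 0.3)))
```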

Executive Comments

"We are moving from the age of blind automation to the age of aware machines," said Natalya Lopareva, CEO & Co-Founder of Algorized. "We treat safety as an intelligence problem. By giving KUKA robots a foundational model for intuition, we are enabling a workflow where the machine understands the operator's next move before they make it."

"Our mission is to make automation easier and build on top of our deterministic capabilities an evolving technology stack adding software and AI driven solutions of the future," said Christoph Schell, CEO of KUKA Group. "This partnership transforms safety in robotics from a constraint into an intelligent enabler. With Algorized, we're creating robots that are more intuitive, more adaptive, and dramatically easier for people to work with."

SOURCE Algorized
