In Pixar’s WALL-E, a lonely robot evolves beyond its programming. It learns, adapts, and acts intelligently in an uncertain world. That same principle, active inference, underpins the VERSES approach to robotics.
Traditional robots rely on rigid scripts or training on massive datasets, so they excel only in predictable settings like assembly lines. VERSES flips that model with Genius™, an intelligence platform that builds predictive world models. Rather than memorizing data, agents anticipate, infer, and adapt when faced with uncertainty, mirroring how living systems learn.
In Meta’s Habitat benchmark simulation, our robotics models cleaned rooms, stocked fridges, and set tables without pre-training, outperforming other systems with a 66.5% success rate versus 54.7% for the previous best alternative. The key is Variational Bayes Gaussian Splatting (VBGS), a method that represents 3D environments through billions of probabilistic Gaussian primitives, allowing robots to perceive, reason, and continuously update their understanding without forgetting what they already know.
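The core idea can be sketched in miniature. The snippet below is a toy illustration of probabilistic splatting, not the actual VBGS implementation: each scene element is a Gaussian whose parameters are refined by a conjugate Bayesian update, so new observations sharpen the map rather than overwriting it (all names and numbers here are invented for illustration).

```python
import numpy as np

class GaussianPatch:
    """One probabilistic scene element: an isotropic Gaussian over position."""

    def __init__(self, prior_mean, prior_var):
        self.mean = np.asarray(prior_mean, dtype=float)
        self.var = float(prior_var)

    def update(self, points, obs_var=0.01):
        """Conjugate Normal-Normal update: blend prior belief with new points."""
        points = np.atleast_2d(points)
        n = len(points)
        post_var = 1.0 / (1.0 / self.var + n / obs_var)
        self.mean = post_var * (self.mean / self.var +
                                n * points.mean(axis=0) / obs_var)
        self.var = post_var

rng = np.random.default_rng(0)
patch = GaussianPatch(prior_mean=[0.0, 0.0, 0.0], prior_var=1.0)
# 50 noisy depth points observed on a surface near (1.0, 2.0, 0.5):
patch.update(rng.normal([1.0, 2.0, 0.5], 0.1, size=(50, 3)))
# The posterior mean moves toward the observed surface, while earlier
# evidence stays encoded in the (now smaller) posterior variance.
```

Because the update is Bayesian rather than gradient-based retraining, each patch keeps a calibrated memory of everything it has seen, which is one way to avoid the catastrophic forgetting the paragraph alludes to.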
To help interpret and prioritize what robots perceive, Genius™ turns to hierarchical active inference, which unifies two functions of the human brain that are usually treated separately: control, which pushes goals downward into concrete actions, and planning, which adjusts how strongly each goal should be pursued as conditions shift. These systems work together constantly, balancing competing demands rather than following rigid rules.
These capabilities are now being tested beyond the lab. In our research labs in Belgium, VERSES demonstrated how drones and security cameras could coordinate autonomously during a mock warehouse break-in, forming a kind of digital nervous system.
If a traditional system faces conflicting goals, for example tracking a moving object while also avoiding an obstacle, it can behave unpredictably when the environment changes. Our system instead updates both the goals and their priorities as conditions evolve. If visibility drops or uncertainty spikes, the planning layer raises the level of caution; the control layer then adapts, slowing, repositioning, or reallocating attention as needed.
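As a hedged sketch of that two-layer loop (the weights, names, and numbers are invented for illustration, not drawn from Genius™), the planning layer can be modeled as turning uncertainty into goal precisions, and the control layer as picking the action that minimizes the precision-weighted cost:

```python
import numpy as np

def plan_precisions(visibility):
    """Planning layer: lower visibility -> less trust in tracking, more caution."""
    track_precision = visibility
    avoid_precision = 2.0 - visibility
    return track_precision, avoid_precision

def control(candidate_speeds, target_speed, obstacle_risk, precisions):
    """Control layer: choose the speed that minimizes the weighted cost."""
    w_track, w_avoid = precisions
    costs = [w_track * (s - target_speed) ** 2 + w_avoid * obstacle_risk(s)
             for s in candidate_speeds]
    return candidate_speeds[int(np.argmin(costs))]

speeds = np.linspace(0.0, 2.0, 21)        # candidate speeds in m/s
risk = lambda s: s ** 2                   # faster = riskier near obstacles

clear = control(speeds, 1.6, risk, plan_precisions(visibility=1.0))
foggy = control(speeds, 1.6, risk, plan_precisions(visibility=0.2))
# When visibility drops, the same controller slows down: caution's
# precision rose, so the cost of speed outweighs the cost of lagging.
```

No rule says "slow down in fog"; the slower speed falls out of rebalanced precisions, which is the point of the paragraph above.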
Genius™ enables machines to reason about changing conditions, anticipate what comes next, and act to coordinate resources and demands effectively. An array of devices, from drones to industrial robots to traffic sensors, can share knowledge securely and collaborate intelligently, communicating with one another directly. This showcases “distributed intelligence” at the edge: software that runs locally, without relying on cloud connectivity.
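One way to picture that knowledge sharing (a hypothetical sketch, not the Genius™ protocol): each device keeps a probabilistic estimate of a shared quantity as a (mean, precision) pair, and neighboring devices fuse their estimates with standard Gaussian fusion, locally, with no cloud round-trip.

```python
def fuse(estimates):
    """Precision-weighted Gaussian fusion of (mean, precision) pairs."""
    total_precision = sum(p for _, p in estimates)
    mean = sum(m * p for m, p in estimates) / total_precision
    return mean, total_precision

# Two edge devices estimate an intruder's position along a corridor (meters).
camera = (12.0, 4.0)   # fixed camera: confident, high precision
drone  = (10.0, 1.0)   # moving drone: noisier, low precision
fused = fuse([camera, drone])   # -> (11.6, 5.0)
# The fused estimate leans toward the more confident device, and the
# combined precision exceeds either device's alone: shared knowledge
# makes every node's picture sharper.
```

Because the fusion rule is symmetric and local, any device can run it over whatever neighbors happen to be in range, which is the essence of intelligence at the edge.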
Much like a company removing internal silos between departments to improve collaboration and efficiency, our system dissolves barriers between devices. This shift toward adaptive intelligence echoes recent MIT research and is at the core of the Spatial Web, the next evolution of the Internet. VERSES is integrating Spatial Web standards into Genius™, and with trial runs in homes, streets, and skies, an increasingly interconnected future is no longer speculative. Our real-world WALL-E moment has arrived.