MENLO PARK — Facebook AI has launched a significant update to its open-source AI Habitat platform, aimed at enhancing the training of embodied AI agents in realistic 3D virtual environments. This update introduces interactive objects, realistic physics modeling, and more flexible simulation options, enabling researchers to train AI agents to not only navigate but also interact with their surroundings.
AI Habitat now supports importing objects from libraries such as the YCB dataset, so researchers can construct scenes programmatically. It also integrates rigid-body physics through the Bullet engine, letting AI agents push, pull, and otherwise manipulate objects for far more dynamic, realistic interaction with the virtual world.
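To give a sense of how this surfaces to researchers, here is a minimal sketch of loading an object into a physics-enabled scene with the habitat_sim Python API. The scene and object paths are placeholders, and the manager-based calls follow later habitat_sim releases, so the exact names may differ from the version described here.

```python
import habitat_sim
import magnum as mn

# Configure the simulator with Bullet physics enabled
# (scene path is a placeholder for a local test scene).
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_id = "data/scene_datasets/habitat-test-scenes/apartment_1.glb"
backend_cfg.enable_physics = True

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

# Load object templates (e.g. YCB models converted for Habitat) and
# instance one of them into the scene (object path is a placeholder).
obj_templates_mgr = sim.get_object_template_manager()
rigid_obj_mgr = sim.get_rigid_object_manager()
template_ids = obj_templates_mgr.load_configs("data/objects/ycb")
obj = rigid_obj_mgr.add_object_by_template_id(template_ids[0])
obj.translation = mn.Vector3(1.0, 1.5, 0.0)  # place the object above the floor

# Step the rigid-body simulation; the object falls and settles under gravity.
for _ in range(60):
    sim.step_physics(1.0 / 60.0)
print(obj.translation)

sim.close()
```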
Additionally, the platform makes it easier to transfer agents from virtual to physical environments, with enhanced support for robots like LoCoBot through the Habitat-PyRobot Bridge. This includes realistic noise models for actuators and depth sensors, further narrowing the gap between simulation and real-world deployment.
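As a hedged illustration of how those noise models appear in the simulator, the sketch below configures LoCoBot-style noisy base motions and a noisy depth sensor. The class, action, and attribute names follow habitat_sim's pyrobot_noisy_controls and sensor-spec modules and should be treated as assumptions that may vary by release.

```python
import habitat_sim
from habitat_sim.agent import ActionSpec, AgentConfiguration
from habitat_sim.agent.controls.pyrobot_noisy_controls import PyRobotNoisyActuationSpec

# Base motions that apply LoCoBot actuation noise instead of moving exactly
# the commanded amount.
agent_cfg = AgentConfiguration()
agent_cfg.action_space = {
    "move_forward": ActionSpec(
        "pyrobot_noisy_move_forward",
        PyRobotNoisyActuationSpec(amount=0.25, robot="LoCoBot"),
    ),
    "turn_left": ActionSpec(
        "pyrobot_noisy_turn_left",
        PyRobotNoisyActuationSpec(amount=10.0, robot="LoCoBot"),
    ),
}

# A depth camera whose readings pass through a realistic depth-noise model
# (the noise model name is an assumption about the built-in registry).
depth_spec = habitat_sim.CameraSensorSpec()
depth_spec.uuid = "depth"
depth_spec.sensor_type = habitat_sim.SensorType.DEPTH
depth_spec.noise_model = "RedwoodDepthNoiseModel"
agent_cfg.sensor_specifications = [depth_spec]
```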
One of the most exciting improvements is the ability to run AI Habitat in a web browser using WebGL and a JavaScript API, making it easy to compare the performance of AI agents against that of real human users. The release also brings TensorBoard logging, an improved API, and preliminary support for Oculus Quest VR.
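On the TensorBoard side, logging is driven by the habitat-baselines training configuration. The sketch below assumes the yacs-style config layout and key names used by habitat-baselines around this release, so the paths and keys are illustrative rather than definitive.

```python
from habitat_baselines.config.default import get_config

# Load a baseline PPO PointNav config and point its TensorBoard output at a
# local directory (the config path and TENSORBOARD_DIR key are assumptions
# about the habitat-baselines layout of this era).
config = get_config("habitat_baselines/config/pointnav/ppo_pointnav.yaml")
config.defrost()
config.TENSORBOARD_DIR = "tb/pointnav_run"
config.freeze()

# habitat_baselines/run.py consumes a config like this for training; the
# resulting curves can be viewed with `tensorboard --logdir tb/pointnav_run`.
print(config.TENSORBOARD_DIR)
```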
A new embodied question-answering task has also been added, extending the platform to interactive scenarios in which an agent must both navigate its environment and manipulate objects in order to answer a question.
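For a sense of how such a task is exposed, here is a minimal sketch of loading an embodied question-answering episode through the habitat Python API. The config path assumes the Matterport3D EQA task configuration shipped with habitat-lab, and the attribute names may differ between releases.

```python
import habitat

# Load the EQA task configuration (assumes the habitat-lab repo layout and
# that the Matterport3D EQA dataset has been downloaded locally).
config = habitat.get_config("configs/tasks/eqa_mp3d.yaml")

env = habitat.Env(config=config)
observations = env.reset()

# Each episode pairs a start position with a natural-language question the
# agent must answer after exploring the scene.
episode = env.current_episode
print("Question:", episode.question.question_text)
print("Ground-truth answer:", episode.question.answer_text)

env.close()
```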
These updates are intended to accelerate the development of AI assistants and robots that can operate intelligently in real-world environments, bringing researchers a step closer to more capable embodied AI agents.