January 21, 2020
Written by Alexander William Clegg, Abhishek Kadian, Erik Wijmans, Mandeep Baines, Oleksandr Maksymets, Yili Zhao, Amanpreet Singh, Aaron Gokaslan, Wojciech Galuba, Dhruv Batra
A major update to Facebook AI's open source AI Habitat platform, which enables significantly faster training of embodied AI agents in a variety of photorealistic 3D virtual environments. AI Habitat now supports interactive objects, realistic physics modeling, improved rendering, seamless transfer from virtual to physical environments, and a more flexible user interface with in-browser support for running simulations.
With these enhancements, researchers can use AI Habitat to train and test agents that not only move around in photorealistic virtual environments but also interact with them and the objects they contain.
When we released AI Habitat last year, it offered compatibility with Facebook Reality Labs' Replica dataset, one of the most photorealistic collections of 3D environment reconstructions available; a flexible and modular design; and highly efficient training, with rendering at 10,000 frames per second on a single GPU.
In addition to numerous new tools and performance improvements, today's release builds on these existing features in significant ways:
Researchers can now import objects from a library (e.g., household objects from the YCB dataset or furniture models) and perform programmatic scene construction with instructions such as "Add a chair here."
Habitat now offers support for rigid-body physics via the Bullet physics engine, for example, to apply forces and torques or to check for collisions. (Both capabilities are illustrated in the first sketch after this list.)
Researchers can now run the same code in AI Habitat and on a physical robot (such as a LoCoBot) using the Habitat-PyRobot-Bridge. This includes realistic noise models for LoCoBot actuators and depth sensors. (More details are available in this paper; a configuration sketch follows this list.)
Habitat now runs in a browser. With WebGL rendering and a JavaScript API, researchers can easily compare agents' performance with that of real people.
Habitat now offers TensorBoard support and an improved API for the Habitat baselines. (See the training sketch after this list.)
AI Habitat now supports HDR textures in Replica environments and offers preliminary support for Oculus Quest VR.
A new embodied question-answering (EQA) task has been added to Habitat-API. (A usage sketch follows.)
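To make the object and physics items concrete, here is a minimal sketch of inserting an object into a scene and stepping rigid-body physics with Habitat-Sim's Python API. It assumes a Bullet-enabled build of Habitat-Sim, a test scene and object library downloaded under data/, and the physics method names from this era of the API; the scene path and object index are placeholders.

```python
import habitat_sim

# Configure the simulator with physics enabled (requires a Bullet-enabled build).
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene.id = "data/scene_datasets/habitat-test-scenes/van-gogh-room.glb"  # placeholder scene
backend_cfg.enable_physics = True

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

# "Add a chair here": instance object 0 from the object template library
# (loaded from data/objects/ by the default physics config) and place it.
obj_id = sim.add_object(0)
sim.set_translation([2.0, 1.5, 0.5], obj_id)

# Nudge the object with a force applied at its center of mass.
sim.apply_force([0.0, 0.0, -10.0], [0.0, 0.0, 0.0], obj_id)

# Step the simulation at 60 Hz for one second, then read back the pose.
for _ in range(60):
    sim.step_physics(1.0 / 60.0)
print(sim.get_translation(obj_id))

sim.close()
```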
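The Habitat-PyRobot-Bridge works by registering a PyRobot-backed backend behind the same simulator interface, so agent code is unchanged between simulation and the robot. A rough sketch, assuming the registered backend name "PyRobot-v0" and the PYROBOT config section from habitat-api at the time of this release:

```python
import habitat

# Load a standard task config; its PYROBOT section configures the bridge.
config = habitat.get_config("configs/tasks/pointnav.yaml")

# Instantiate the PyRobot-backed "simulator" -- the same interface the
# agent used in simulation now drives a physical LoCoBot.
reality = habitat.sims.make_sim("PyRobot-v0", config=config.PYROBOT)

observations = reality.reset()
# Command a 0.25 m forward move; the argument names follow PyRobot's base
# controller and should be checked against the current API.
observations = reality.step(
    "go_to_relative",
    {"xyt_position": [0.25, 0.0, 0.0], "use_map": False, "close_loop": True, "smooth": False},
)
```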
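Likewise, TensorBoard logging in the baselines is driven by config. A sketch of launching PPO training from Python with habitat_baselines; the config path and the TENSORBOARD_DIR key reflect the repo layout at the time and should be treated as assumptions:

```python
from habitat_baselines.config.default import get_config
from habitat_baselines.rl.ppo.ppo_trainer import PPOTrainer

# Load a baseline experiment config and point TensorBoard output at ./tb.
config = get_config("habitat_baselines/config/pointnav/ppo_pointnav.yaml")
config.defrost()
config.TENSORBOARD_DIR = "tb"  # a non-empty directory enables TensorBoard logging
config.freeze()

# Train PPO; metrics can then be viewed with `tensorboard --logdir tb`.
trainer = PPOTrainer(config)
trainer.train()
```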
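Finally, the new EQA task plugs into the standard habitat.Env loop. A minimal sketch, assuming the task config ships as configs/tasks/eqa_mp3d.yaml, that the Matterport3D EQA episodes are downloaded, and that the question is exposed under a "question" observation key (an assumption to verify against the repo):

```python
import habitat

# EQA uses the same Env interface as the navigation tasks.
env = habitat.Env(config=habitat.get_config("configs/tasks/eqa_mp3d.yaml"))

observations = env.reset()
print(observations["question"])  # assumed sensor key for the episode's question

# A random agent, purely to show the control loop; a real agent would
# navigate the scene and then emit an answer action.
while not env.episode_over:
    observations = env.step(env.action_space.sample())

env.close()
```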
By teaching agents in virtual worlds, researchers can make much faster progress on tasks necessary to build better AI assistants and robots that operate more intelligently in complex situations in the physical world. For this training to be most effective, these agents must not only move through virtual environments, but also push, pull, and manipulate objects in those spaces. AI Habitat now makes it easy to do this efficiently and then benchmark results and compare performance across different data sets. These improvements in Habitat will accelerate and simplify the use of virtual environments to develop smarter and more capable agents.
https://github.com/facebookresearch/habitat-sim
https://github.com/facebookresearch/habitat-api
Alexander William Clegg
Research Engineer
Abhishek Kadian
Software Engineer
Erik Wijmans
AI Research Intern
Mandeep Baines
Software Engineer
Oleksandr Maksymets
Research Engineer
Yili Zhao
Research Scientist
Amanpreet Singh
Software Engineer
Aaron Gokaslan
AI Resident
Wojciech Galuba
Research Engineering Manager
Dhruv Batra
Research Scientist