🤖 Facebook releases platform for robot development

Facebook has created a platform where developers can build robots with the ability to recognize natural language, perform tasks and navigate their surroundings.

Elina Holmgren Tyskling

Facebook has released Droidlet, an open-source platform. With it, developers can build an embodied robot that can recognize natural language, navigate its surroundings, and respond to input. Robots built on the Droidlet platform can also be tested in simulated environments.

According to Facebook, the purpose of Droidlet is to deepen the understanding of various problems in robotics and to simplify the use of machine learning in robots. The platform is built from interchangeable modules: a memory system, perceptual modules, an interpretation module that reads the memory to decide what to do, and a task module that stores the tasks the robot is to perform, such as moving an arm.
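To illustrate the modular design described above, here is a minimal, hypothetical Python sketch. The class and method names are illustrative assumptions, not the real droidlet API; the point is only that each module hides behind a small interface and can be swapped out independently.

```python
# Hypothetical sketch of a modular embodied-agent loop.
# Names are illustrative, not the real droidlet API.

class Memory:
    """Stores what the agent has perceived and the tasks it has been given."""
    def __init__(self):
        self.percepts = []
        self.tasks = []

class Perception:
    """Perceptual module: turns raw input into facts written to memory."""
    def perceive(self, raw_input, memory):
        memory.percepts.append(raw_input)

class Interpreter:
    """Reads a command plus the memory and decides what task to queue."""
    def interpret(self, command, memory):
        if command.startswith("move"):
            memory.tasks.append(("move_arm", command.split()[1:]))

class TaskRunner:
    """Task module: pops queued tasks and executes them one step at a time."""
    def step(self, memory):
        if memory.tasks:
            name, args = memory.tasks.pop(0)
            return f"executing {name} with {args}"
        return "idle"

# Wiring the modules together; any module could be replaced by an
# ML-based implementation exposing the same interface.
memory = Memory()
Perception().perceive("camera frame 0", memory)
Interpreter().interpret("move left", memory)
print(TaskRunner().step(memory))  # executing move_arm with ['left']
```

Because each module only touches the shared memory, swapping (say) a rule-based interpreter for a learned language model does not require changing the rest of the agent.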

The systems the user plugs into these modules may or may not be based on machine learning, so the robot can be designed to learn on its own or not. The platform comes with a dashboard that serves as a control interface, letting the user communicate with the robot via chat or other means. The dashboard also shows the robot's current state, including its memory and tasks.

Facebook believes that the development of smart robots will move faster when everyone builds on the same basic system while individual components can still be replaced. The aim is also to make it easier for developers to share improved systems. In an article, the developers behind Droidlet write:

"The goal of the droidlet platform is to make it so these improvements are practical (as opposed to theoretical)."

The platform includes what they call a "model zoo," where the user can choose between different capabilities. For example, there are several vision systems, such as the "Laser Pointer handler," which gives the robot a means of proprioception: a sense of its own body and of where its different parts are.

Read more here: GitHub - facebookresearch/droidlet: A modular embodied agent architecture and platform for building embodied agents