Had not heard of this before, but an interesting take by Facebook. I still have my doubts about Facebook; they seem too intent on getting into our brains, but some of their research work is worth understanding. I like thinking about elements of the process as being composed of agents that have intent, and the ability to readily test alternatives. That gets at intelligence. Tell me more, Facebook.
Introducing droidlet, a one-stop shop for modularly building intelligent agents
The annals of science fiction are brimming with robots that perform tasks independently in the world, communicate fluently with people using natural language, and even improve themselves through these interactions. These machines do much more than follow preprogrammed instructions; they understand and engage with the real world much as people do.
Robots today can be programmed to vacuum the floor or perform a preset dance, but the gulf is vast between these machines and ones like Wall-E or R2-D2. This is largely because today’s robots don’t understand the world around them at a deep level. They can be programmed to back up when bumping into a chair, but they can’t recognize what a chair is or know that bumping into a spilled soda can will only make a bigger mess.
To help researchers and even hobbyists build more intelligent real-world robots, we've created and open-sourced the droidlet platform.
Droidlet is a modular, heterogeneous embodied agent architecture and a platform for building embodied agents that sits at the intersection of natural language processing, computer vision, and robotics. It simplifies integrating a wide range of state-of-the-art machine learning (ML) algorithms into embodied systems and robotics, facilitating rapid prototyping.
People using droidlet can quickly test out different computer vision algorithms with their robot, for example, or replace one natural language understanding model with another. Droidlet enables researchers to easily build agents that can accomplish complex tasks either in the real world or in simulated environments like Minecraft or Habitat.
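To make that modularity concrete, here is a minimal Python sketch of the idea. It is not the actual droidlet API; the class names (ModularAgent, StubDetector, KeywordParser) and their interfaces are hypothetical, and only illustrate how a perception module or a language-understanding module could be swapped behind a small, shared interface.

```python
# Conceptual sketch (not the droidlet API): a minimal modular agent whose
# perception and language-understanding components are interchangeable.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Detection:
    label: str
    confidence: float


class PerceptionModule(Protocol):
    def detect(self, image) -> list[Detection]: ...


class NLUModule(Protocol):
    def parse(self, utterance: str) -> dict: ...


class StubDetector:
    """Placeholder vision module; swap in any detector exposing detect()."""
    def detect(self, image) -> list[Detection]:
        return [Detection(label="chair", confidence=0.9)]


class KeywordParser:
    """Placeholder language module; swap in any parser exposing parse()."""
    def parse(self, utterance: str) -> dict:
        action = "pick_up" if "pick up" in utterance.lower() else "noop"
        return {"action": action, "text": utterance}


class ModularAgent:
    """Agent assembled from interchangeable modules, in the spirit of droidlet."""
    def __init__(self, perception: PerceptionModule, nlu: NLUModule):
        self.perception = perception
        self.nlu = nlu

    def step(self, image, utterance: str) -> dict:
        objects = self.perception.detect(image)
        intent = self.nlu.parse(utterance)
        # A real controller would ground the parsed intent against the
        # detected objects and emit robot actions; here we just return both.
        return {"intent": intent, "objects": objects}


if __name__ == "__main__":
    agent = ModularAgent(perception=StubDetector(), nlu=KeywordParser())
    print(agent.step(image=None, utterance="Pick up the blue tube"))
```

Replacing the vision model or the language model then amounts to passing a different object into the agent's constructor, which is the kind of drop-in swap the platform is meant to enable.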
There is much more work to do — both in AI and in hardware engineering — before we will have robots that are even close to what we imagine in books, movies, and TV shows. But with droidlet, robotics researchers can now take advantage of the significant recent progress across the field of AI and build machines that can effectively respond to complex spoken commands like “pick up the blue tube next to the fuzzy chair that Bob is sitting in.” We look forward to seeing how the research community uses droidlet to advance this important field. ... '