Why ML interfaces will be more like pets than machines
When I talk to people about what’s happening in deep learning, I often find it hard to get across why I’m so excited. If you look at a lot of the examples in isolation, they just seem like incremental progress over existing features, like better search for photos or smarter email auto-replies. Those are great of course, but what strikes me when I look ahead is how the new capabilities build on each other once they’re combined. I believe they will totally change the way we interact with technology, moving from the push-button model we’ve had since the industrial revolution to something that’s more like a collaboration with our tools. It’s not a perfect analogy, but the most useful parallel I can think of is how our relationship with pets differs from our interactions with machines.
To make what I’m saying more concrete, imagine a completely made-up device for helping around the house (I have no idea if anyone’s building something like this, so don’t take it as any kind of prediction, but I’d love one if anybody does get round to it!). It’s a small indoor drone that assists with the housework, with cleaning attachments and a grabbing arm. I’ve used some advanced rendering technology to visualize a mockup below: