Should we be delighted or worried?
The world’s most freakishly realistic text-generating A.I. just got gamified By Luke Dormehl in DigitalTrends
What would an adventure game designed by the world’s most dangerous A.I. look like? A neuroscience grad student is here to help you find out.
Earlier this year, OpenAI, an A.I. startup once backed by Elon Musk, created a text-generating bot deemed too dangerous to ever release to the public. Called GPT-2, the algorithm was designed to generate text so humanlike that it could convincingly pass itself off as being written by a person. Feed it the start of a newspaper article, for instance, and it would dream up the rest, complete with imagined quotes. The results were a Turing Test tailor-made for the fake news-infused world of 2019.
Of course, like Hannibal Lecter, Heath Ledger’s Joker, or any other top-notch antagonist, it didn’t take GPT-2 too long to escape from its prison. Within months, a version of it had found its way online (you can try it out here). Now it has formed the basis for a text adventure game created by Northwestern University neuroscience graduate student Nathan Whitmore. Built on GPT-2’s predictive neural network, GPT Adventure promises to rewrite itself every time it’s played: a procedurally generated game in which players can do whatever they want within the confines of a world controlled by the fugitive A.I.
And you know what? Not since Sarah and John Connor teamed up with The Terminator to take on Skynet has the world’s most dangerous artificial intelligence been quite so much fun. ...
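To get a feel for the mechanics described above, here is a minimal sketch of the kind of loop a GPT-2-driven text adventure could run. This is not Whitmore's actual GPT Adventure code; it assumes the open-source Hugging Face transformers library and the public "gpt2" checkpoint, and simply feeds the running transcript plus the player's latest command back into the model to generate the next passage.

```python
# Minimal sketch of a GPT-2-driven text adventure loop (illustrative only;
# not Nathan Whitmore's GPT Adventure code). Assumes the Hugging Face
# "transformers" library and PyTorch: pip install transformers torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Seed the "world" with an adventure-game style prompt.
story = "You are standing in a torchlit dungeon. Exits lead north and east.\n"

while True:
    command = input("> ")
    if command.strip().lower() in {"quit", "exit"}:
        break
    # Append the player's command and let GPT-2 continue the transcript.
    # (Trimming the transcript to GPT-2's 1024-token window is omitted here.)
    story += f"> {command}\n"
    inputs = tokenizer(story, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=60,           # length of the next passage
        do_sample=True,              # sample so every playthrough differs
        top_k=50,
        temperature=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated text, not the echoed prompt.
    new_text = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    story += new_text + "\n"
    print(new_text)
```

Because the generation is sampled rather than deterministic, the same command produces different continuations on each run, which is what lets a game built this way "rewrite itself every time it's played."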
Sunday, September 15, 2019