AI’s 6 Worst-Case Scenarios. Who needs Terminators when you have precision clickbait and ultra-deepfakes? By NATASHA BAJEMA in IEEE Spectrum
HOLLYWOOD’S WORST-CASE scenario involving artificial intelligence (AI) is familiar as a blockbuster sci-fi film: Machines acquire humanlike intelligence, achieve sentience, and inevitably turn into evil overlords that attempt to destroy the human race. This narrative capitalizes on our innate fear of technology, a reflection of the profound change that often accompanies new technological developments.
However, as Malcolm Murdock, machine-learning engineer and author of the 2019 novel The Quantum Price, puts it, “AI doesn’t have to be sentient to kill us all. There are plenty of other scenarios that will wipe us out before sentient AI becomes a problem.”
“We are entering dangerous and uncharted territory with the rise of surveillance and tracking through data, and we have almost no understanding of the potential implications.”
—Andrew Lohn, Georgetown University
In interviews with AI experts, IEEE Spectrum has uncovered six real-world AI worst-case scenarios that are far more mundane than those depicted in the movies. But they’re no less dystopian. And most don’t require a malevolent dictator to bring them to full fruition. Rather, they could simply happen by default, unfolding organically—that is, if nothing is done to stop them. To prevent these worst-case scenarios, we must abandon our pop-culture notions of AI and get serious about its unintended consequences.
1. When Fiction Defines Our Reality…
Unnecessary tragedy may strike if we allow fiction to define our reality. But what choice is there when we can’t tell the difference between what is real and what is false in the digital world?
In a terrifying scenario, the rise of deepfakes—fake images, video, audio, and text generated with advanced machine-learning tools—may someday lead national-security decision-makers to take real-world action based on false information, leading to a major crisis, or worse yet, a war…