I remember having to explain to managers why the dice had to be "loaded" in some of the simulations we built. Isn't that biased? Yes, but loading is exactly what an accurate model requires. The Quanta Magazine article below shows why; see also the link to the MIT work, which explains the reasoning further.
How and Why Computers Roll Loaded Dice
Stephen Ornes, Contributing Writer, Quanta Magazine
Researchers are one step closer to injecting probability into deterministic machines.
Here’s a deceptively simple exercise: Come up with a random phone number. Seven digits in a sequence, chosen so that every digit is equally likely, and so that your choice of one digit doesn’t affect the next. Odds are, you can’t. (But don’t take my word for it: Studies dating back to the 1950s reveal how mathematically nonrandom we are, even if we don’t recognize it.)
Don’t take it to heart. Computers don’t generate randomness well, either. They’re not supposed to: Computer software and hardware run on Boolean logic, not probability. “The culture of computing is centered on determinism,” said Vikash Mansinghka, who runs the Probabilistic Computing Project at the Massachusetts Institute of Technology, “and that shows up at pretty much every level.”
But computer scientists want programs that can handle randomness because sometimes that’s what a problem requires. Over the years, some have developed sleek new algorithms that, while they don’t themselves generate random numbers, offer clever and efficient ways to use and manipulate randomness. One of the most recent efforts comes from Mansinghka’s group at MIT, which will present an algorithm called Fast Loaded Dice Roller, or FLDR, at the online International Conference on Artificial Intelligence and Statistics this August. …
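The excerpt doesn't describe how FLDR itself works, but the problem it addresses is easy to illustrate: given a die whose faces have unequal weights, produce rolls with exactly those probabilities. A minimal sketch of the classic inverse-CDF baseline (not FLDR, which improves on approaches like this in speed and memory):

```python
import bisect
import itertools
import random

def make_loaded_die(weights):
    """Return a sampler for a 'loaded die' with the given positive integer weights.

    This is the standard inverse-CDF method, shown only as a baseline
    illustration of loaded-dice sampling; FLDR's own construction is not
    described in the excerpt above.
    """
    cumulative = list(itertools.accumulate(weights))  # running totals (the CDF)
    total = cumulative[-1]

    def roll():
        # Draw a uniform integer in [0, total), then binary-search the CDF
        # to find which face's interval it landed in.
        u = random.randrange(total)
        return bisect.bisect_right(cumulative, u)  # face index 0..len(weights)-1

    return roll

# A six-sided die loaded so face 5 carries half the total weight:
roll = make_loaded_die([1, 1, 1, 1, 1, 5])
counts = [0] * 6
for _ in range(10_000):
    counts[roll()] += 1
# Face 5 should come up roughly half the time.
```

Each roll costs one uniform draw plus a binary search, and the table costs memory proportional to the number of faces; part of what makes FLDR notable, per the MIT work, is doing this kind of sampling with far less overhead.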
Wednesday, July 08, 2020