We tend to think of 'analog' as an ancient technology, long superseded by digital. So businesses are constantly figuring out how to 'go digital' and thus become better, faster, smarter. I learned analog computing before digital, but that path is rarely taken today. We first went digital in the 1940s because many of the things we like to do fast, like arithmetic, turn out to be easier to codify that way. That's good.
But most of the nature that we depend upon remains analog: our brains, our sensors, our muscles, our neurons. So why shouldn't we think of our models of complex systems that way? Most recently, deep learning, built on models of 'neurons', has turned out to be very successful as a perception engine. Still digital, but an analog of some very analog forms.
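As a reminder of how simple each modeled 'neuron' is, here is a minimal sketch (mine, not from the article) of the standard artificial neuron: a weighted sum of inputs squashed through a smooth nonlinearity. The continuous response is the analog-flavored part; evaluating it in floating point is the digital part. All names and values are illustrative.

import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum plus bias, through a sigmoid."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # smooth, analog-like output in (0, 1)

# Example: a neuron with two inputs
print(neuron([0.5, -1.2], [0.8, 0.3], bias=0.1))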
In CACM:
Neil Savage, "Building a Brain May Mean Going Analog," Communications of the ACM, Vol. 60, No. 7, pages 13-15.