In the CACM: a thoughtful piece on 'Good enough' computing. Many people think of computers as perfectly accurate entities, built on the certainty of 0s and 1s. But measurements are always imprecise to some degree, so to what degree should calculations be allowed to be imprecise? For example, a concept in the mathematics of optimization known as delta-optimality says that it is sometimes far easier, or even only possible, to prove that an answer is within some delta of the very best than to find the very best itself. And that is often good enough.
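A rough illustration of the idea (my own sketch, not from the article): if you know a Lipschitz bound on how fast a function can change, a coarse grid of samples certifies a delta-optimal answer without ever locating the true optimum. The function, bound, and delta below are made-up examples.

```python
# Minimal sketch of a delta-optimality certificate (my illustration):
# with a known Lipschitz constant L, sampling f every h = 2*delta/L
# guarantees the best sample is within delta of the true maximum,
# even though we never find the exact optimum.
import math

def delta_optimal_max(f, lo, hi, lipschitz, delta):
    """Return (x, f(x)) with f(x) >= (max of f on [lo, hi]) - delta."""
    h = 2 * delta / lipschitz                 # spacing the certificate allows
    n = math.ceil((hi - lo) / h) + 1          # number of grid points
    xs = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    # Any point in [lo, hi] is within h/2 of a sample, so f there can
    # exceed the best sample by at most L * h/2 = delta.
    return max(((x, f(x)) for x in xs), key=lambda p: p[1])

# Example: peak of sin(x) + sin(3x)/3 on [0, 4], certified within 1e-3.
x, val = delta_optimal_max(lambda t: math.sin(t) + math.sin(3 * t) / 3,
                           0.0, 4.0, lipschitz=2.0, delta=1e-3)
print(x, val)
```

The point of the sketch: proving "within delta of the best" costs a predictable number of cheap evaluations, while pinning down the exact best could be arbitrarily harder.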
That's my technical take, but the article looks at this much more broadly. How much computing do we really need, and what does that look like? Are we spending too much for accuracy we do not need? I had not thought of it this way before.