It amazes me that the first thing I learned in engineering analytics, a methodology for optimizing systems, was originally invented for military supply chains in WWII. Here is a technical overview and a good historical piece.
Why the Simplex Method, at Age 75, Is Still the Go-To Algorithm
By Allyn Jackson, Commissioned by CACM Staff, January 31, 2023
Since its birth more than two decades ago, smoothed analysis has been used to analyze the performance of algorithms other than the simplex method, including interior-point methods for linear programming. It also has guided the design of new algorithms.
In 1947, mathematical scientist George Dantzig invented the simplex method, a powerful and practical means of solving linear programming problems in optimization. Scientists lost no time putting the simplex method to use in a variety of applications across government, industry, science, and engineering.
Half a century later, when Daniel A. Spielman was a Ph.D. student at the Massachusetts Institute of Technology, the simplex method stood at the top of the pantheon of important algorithms, yet theoretical analysis had exposed pitfalls: by worst-case measures, it ought not to perform as well as it did. What was going on?
Spielman solved the mystery in 2001, in joint work with Shang-hua Teng, now University Professor and Seeley G. Mudd Professor of Computer Science and Mathematics in the Viterbi School of Engineering at the University of Southern California. Pioneering a technique called smoothed analysis, their work provided an original and compelling explanation of the power of the simplex method and suggested a new paradigm for gauging the effectiveness of algorithms.
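Informally, the smoothed complexity of an algorithm is the worst case, over inputs, of its expected cost once each input is slightly perturbed at random. The sketch below is a paraphrase of that standard definition with hypothetical function names; none of it comes from the article, and a real study would count simplex pivots rather than an abstract cost.

# A rough sketch of the idea behind smoothed complexity (a paraphrase of the
# standard Spielman-Teng definition, not text from the article): instead of
# the worst case over all inputs, take the worst case of the EXPECTED cost
# after each input is perturbed by a little Gaussian noise of magnitude sigma.
import numpy as np

def smoothed_cost(step_counter, base_inputs, sigma=0.01, trials=100, seed=0):
    """Estimate max over inputs of the average cost under random perturbation.

    step_counter: hypothetical function mapping a problem instance (NumPy
                  array) to a cost, e.g. the number of pivots a simplex
                  implementation performs.
    base_inputs:  list of NumPy arrays representing adversarial instances.
    """
    rng = np.random.default_rng(seed)
    worst = 0.0
    for instance in base_inputs:
        costs = [step_counter(instance + sigma * rng.standard_normal(instance.shape))
                 for _ in range(trials)]
        worst = max(worst, float(np.mean(costs)))
    return worst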
This work was recognized in 2008 by the Gödel Prize, sponsored jointly by ACM's Special Interest Group on Algorithms and Computation Theory (SIGACT) and the European Association for Theoretical Computer Science. Now Spielman, Sterling Professor of Computer Science and a professor of Statistics and Data Science and of Mathematics at Yale University, has been awarded the 2023 Breakthrough Prize in Mathematics, in part for this work in optimization.
Spielman has a great ability "to come up with new approaches," said Lance Fortnow, dean of the College of Computing at the Illinois Institute of Technology. "It wasn't like someone else had invented smoothed analysis and Spielman said, 'Let's try it on linear programming'. This was, 'I want to understand linear programming. What's the right way to do it?'"
Simplex Bests Polynomial-time Competition
Dantzig formulated the concept of linear programming as a way to model optimization problems. The model produces a polyhedron, possibly of very high dimension, where the corners represent solutions. The simplex method provides a highly efficient way of moving along polyhedron edges toward an optimal corner. The algorithm was tailor-made for the computing machines that were just beginning to appear when Dantzig did this work.
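To make the geometry concrete, here is a minimal sketch of a tiny linear program solved with a simplex-type method via SciPy's linprog. The toy problem and the solver choice are illustrative assumptions, not taken from the article; "highs-ds" selects the HiGHS dual-simplex solver (SciPy 1.6 or later).

# Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
# The feasible region is a polygon; its corners are the candidate optima,
# and simplex walks along the edges between them. linprog minimizes, so the
# objective is negated.
from scipy.optimize import linprog

c = [-3, -2]
A_ub = [[1, 1],
        [1, 3]]
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)],
              method="highs-ds")
print(res.x, -res.fun)  # optimal corner (4, 0) with objective value 12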
In the 1970s, the rise of complexity theory brought new precision to the study of efficiency of algorithms. Generally, an algorithm is considered efficient if it runs in polynomial time, meaning that even for the worst-case inputs to the algorithm, the running time is always bounded by a polynomial function of the input size. This is in contrast with algorithms whose running time can increase exponentially with input size.
Soon a curious fact arose: despite its excellent performance in practice, the simplex method does not run in polynomial time. Examples were found on which simplex ran in exponential time. Eventually, polynomial-time algorithms for linear programming were found, but the simplex method continued to be used, and in many situations it outperformed its polynomial-time competitors.
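The best known of those examples is the Klee-Minty cube, a family of linear programs on which Dantzig's original pivot rule visits all 2^n vertices of a distorted cube. Below is a hedged construction of that family, assuming NumPy and SciPy are available; it is written for illustration, and modern pivot rules typically avoid the exponential walk on it.

# The Klee-Minty cube: maximize sum_j 2^(n-j) * x_j subject to, for each i,
# sum_{j<i} 2^(i-j+1) * x_j + x_i <= 5^i and x >= 0. The optimum sits at
# (0, ..., 0, 5^n), but Dantzig's pivot rule can pass through every vertex.
import numpy as np
from scipy.optimize import linprog

def klee_minty(n):
    """Return (c, A_ub, b_ub) for: maximize c @ x s.t. A_ub @ x <= b_ub, x >= 0."""
    c = np.array([2.0 ** (n - j - 1) for j in range(n)])   # 2^(n-1), ..., 2, 1
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i):
            A[i, j] = 2.0 ** (i - j + 1)                    # 4, 8, 16, ... pattern
        A[i, i] = 1.0
    b = np.array([5.0 ** (i + 1) for i in range(n)])        # 5, 25, 125, ...
    return c, A, b

c, A_ub, b_ub = klee_minty(3)
res = linprog(-c, A_ub=A_ub, b_ub=b_ub, method="highs-ds")  # negate to maximize
print(res.x)  # expected optimum (0, 0, 125), i.e. the last coordinate is 5**3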
Why does simplex work so well?
This question was in the air when Fortnow was a Ph.D. student at the Massachusetts Institute of Technology (MIT) in the 1980s, several years before Spielman became a graduate student there. However, he recalls, few were attempting to resolve it. "The simplex method had already been around for 40 years," said Fortnow. The general attitude was, "Well, it works well in practice."
When Spielman and Teng came up with smoothed analysis, it was a big surprise. "It's a whole different way of looking at worst-case complexity," said Fortnow, "and they could apply it to linear programming. Both of these pieces were very exciting."