Thanks to Vincent Granville for sending along his “Empirical Optimization with Divergent Fixed Point Algorithm – When All Else Fails”.
The full version, entitled “Empirical Optimization with Divergent Fixed Point Algorithm – When All Else Fails”, is accessible in PDF format in the “Free Books and Articles” section, here. The technique is also discussed in detail, with Python code, in his book “Synthetic Data”, available here. Contact him directly here.
While the technique discussed here is a last-resort solution for when all else fails, it is more powerful than it seems at first glance. First, it also works in standard cases with “nice” functions, although better methods exist when the function behaves nicely, exploiting its differentiability, such as the Newton algorithm (itself a fixed-point iteration). The method can be generalized to higher dimensions, though I focus on univariate functions here.
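The full details of the divergent variant are in the article and the book; as a minimal sketch of the fixed-point framing only (not the author's algorithm), the snippet below shows how Newton's method for solving f(x) = 0 on a differentiable univariate function can be written as the fixed-point iteration x ← g(x) with g(x) = x − f(x)/f′(x). The example function is an arbitrary choice for illustration.

```python
def fixed_point(g, x0, n_iter=50, tol=1e-10):
    """Iterate x_{k+1} = g(x_k) from x0; stop when successive iterates
    are within tol of each other, or after n_iter steps."""
    x = x0
    for _ in range(n_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Newton's method for f(x) = 0 is the fixed-point iteration with
# g(x) = x - f(x) / f'(x).  Example: f(x) = x**2 - 2, root sqrt(2).
f  = lambda x: x**2 - 2
df = lambda x: 2 * x
g_newton = lambda x: x - f(x) / df(x)

print(fixed_point(g_newton, x0=1.0))   # ~1.4142135623730951
```

When the function is well behaved near the root, such a contraction converges quickly; the empirical method in the article targets precisely the cases where no convergent iteration of this kind is available.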
Perhaps its most attractive features are that it is simple and intuitive, and that it quickly leads to a solution despite the absence of convergence. However, it is an empirical method and may require trying different parameter sets to actually find a solution. Still, it can be turned into a black-box solution by automatically testing different parameter configurations, as sketched below. In that respect, I compare it to the empirical elbow rule used to detect the number of clusters in unsupervised clustering problems; I also turned the elbow rule into a fully automated black-box procedure, with full details offered in the same book.
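The exact parameterization used in the article is not reproduced here. As a hypothetical illustration of the black-box idea (automatically testing several parameter configurations and keeping the best candidate, even when no run converges), the sketch below sweeps the damping factor of a simple damped iteration x ← x − λ·f(x). The update rule, the damping values, the blow-up threshold, and the test function are all assumptions made for the sake of the example, not the author's procedure.

```python
import math

def damped_iteration(f, x0, lam, n_iter=200, blow_up=1e12):
    """Run x_{k+1} = x_k - lam * f(x_k) and keep the visited point with the
    smallest residual |f(x)|, even if the sequence itself never converges."""
    x, best_x, best_res = x0, x0, abs(f(x0))
    for _ in range(n_iter):
        x = x - lam * f(x)
        if not math.isfinite(x) or abs(x) > blow_up:
            break  # this parameter choice diverged badly; give up on it
        res = abs(f(x))
        if res < best_res:
            best_x, best_res = x, res
    return best_x, best_res

def black_box_solve(f, x0, lams=(0.01, 0.05, 0.1, 0.5, 1.0)):
    """Automatically test several parameter configurations (here, the damping
    factor lam) and return the best candidate solution found overall."""
    candidates = [damped_iteration(f, x0, lam) for lam in lams]
    return min(candidates, key=lambda c: c[1])

# Usage: approximate a root of f(x) = x**3 - x - 2 near x0 = 1.5.
x_star, residual = black_box_solve(lambda x: x**3 - x - 2, x0=1.5)
print(x_star, residual)   # root near 1.5214, residual close to 0
```

The point of the wrapper is that the user never has to pick the parameter by hand: each configuration is scored by its best residual, and the overall winner is returned.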
About the author
Vincent Granville is a pioneering data scientist and machine learning expert, co-founder of Data Science Central (acquired by TechTarget in 2020), founder of MLTechniques.com, former VC-funded executive, author and patent owner. Vincent’s past corporate experience includes Visa, Wells Fargo, eBay, NBC, Microsoft, and CNET. Vincent is also a former post-doc at Cambridge University, and the National Institute of Statistical Sciences (NISS).
Vincent has published in the Journal of Number Theory, the Journal of the Royal Statistical Society (Series B), and IEEE Transactions on Pattern Analysis and Machine Intelligence. He is also the author of multiple books, including “Intuitive Machine Learning and Explainable AI”, available here. He lives in Washington state and enjoys doing research on spatial stochastic processes, chaotic dynamical systems, experimental math, and probabilistic number theory.