McAnulty College and Graduate School of Liberal Arts
No Free Lunch (NFL) theorems, optimization, probability theory, learning theory
The No Free Lunch (NFL) theorems for optimization tell us that, when averaged over all possible optimization problems, the performance of any two optimization algorithms is statistically identical. This seems to imply that there are no "general-purpose" optimization algorithms: the NFL theorems show that, mathematically, any superior performance of an optimization algorithm on one set of problems is offset by inferior performance of that algorithm on the set of all other problems. In this thesis we consider the seemingly negative implications of the NFL theorems. We first extend a previous NFL theorem to obtain a new NFL result. We then use ideas from probability theory and cryptography to show that if we believe that events of extraordinarily small probability will not occur, then there exists at least one algorithm that is indeed a general-purpose algorithm. Thus, the implications of the new NFL result are not as negative as expected.
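The averaging claim can be checked exhaustively on a toy instance. The sketch below is an illustration of my own, not taken from the thesis: on a three-point search space with binary objective values, it enumerates all 2^3 objective functions and compares three non-repeating search strategies (two fixed orderings and one adaptive rule) by the best value each finds within a two-evaluation budget. Averaged over all functions, every strategy scores the same, as the NFL theorems predict.

```python
from itertools import product

X = [0, 1, 2]  # tiny search space
Y = [0, 1]     # possible objective values
M = 2          # evaluation budget (number of distinct points sampled)

def run(strategy, f):
    """Sample M distinct points chosen by `strategy`; return best value seen."""
    seen = {}
    for _ in range(M):
        x = strategy(seen)
        seen[x] = f[x]
    return max(seen.values())

# Three non-repeating strategies: fixed ascending order, fixed descending
# order, and an adaptive rule that branches on the first observed value.
ascending  = lambda seen: min(x for x in X if x not in seen)
descending = lambda seen: max(x for x in X if x not in seen)
def adaptive(seen):
    if not seen:
        return 0
    return 1 if seen[0] == 1 else 2

def average_performance(strategy):
    """Mean best-found value over ALL objective functions f: X -> Y."""
    fs = [dict(zip(X, vals)) for vals in product(Y, repeat=len(X))]
    return sum(run(strategy, f) for f in fs) / len(fs)

for strategy in (ascending, descending, adaptive):
    print(average_performance(strategy))  # identical for every strategy: 0.75
```

Any gain the adaptive rule makes on functions where its branch pays off is exactly cancelled on the remaining functions, which is the offsetting behavior the abstract describes.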
Smith, M. (2009). A No Free Lunch Result for Optimization and Its Implications (Master's thesis, Duquesne University). Retrieved from https://dsc.duq.edu/etd/1216