Author

Marisa Smith

Defense Date

5-5-2009

Graduation Date

2009

Availability

Immediate Access

Submission Type

Thesis

Degree Name

MS

Department

Computational Mathematics

School

McAnulty College and Graduate School of Liberal Arts

Committee Chair

Jeffrey Jackson

Committee Member

John Kern

Committee Member

Mark Mazur

Keywords

No Free Lunch (NFL) theorems, optimization, probability theory, learning theory

Abstract

The No Free Lunch (NFL) theorems for optimization tell us that, when averaged over all possible optimization problems, the performance of any two optimization algorithms is statistically identical. This seems to imply that there are no "general-purpose" optimization algorithms. That is, the NFL theorems show that, mathematically, any superior performance of an optimization algorithm on one set of problems is offset by inferior performance of that algorithm on the set of all other problems. In this thesis we consider the seemingly negative implications of the NFL theorems. We first extend a previous NFL theorem to obtain a new NFL result. We then use ideas from probability theory and cryptography to show that if we believe that extraordinarily small-probability events will not happen, then there exists (at least) one algorithm that is indeed a general-purpose algorithm. Thus, the implications of the new NFL result are not as negative as expected.
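For reference, the NFL theorem the thesis extends is the original result of Wolpert and Macready (1997). In their notation (assumed here; the thesis itself may use different symbols), it states that for any pair of search algorithms $a_1$ and $a_2$,

\[
\sum_{f} P(d_m^y \mid f, m, a_1) \;=\; \sum_{f} P(d_m^y \mid f, m, a_2),
\]

where the sum ranges over all cost functions $f : \mathcal{X} \to \mathcal{Y}$ on a finite search space, $m$ is the number of distinct points sampled, and $d_m^y$ is the sequence of cost values observed after $m$ samples. Averaged uniformly over all $f$, no algorithm outperforms any other.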

Format

PDF

Language

English
