May 10, 2011: Bodo Manthey: Smoothed Analysis of Algorithms

Date: May 10, 2011
Title: Smoothed Analysis of Algorithms
Speaker: Bodo Manthey
Room: Zi 5126
Time: 12:30-13:30

Many algorithms and heuristics perform remarkably well in practice, yet theoretical analyses of them are negative or inconclusive. The reason is that algorithms are usually studied by worst-case or average-case analysis: worst-case analysis is often dominated by artificially constructed instances that rarely show up in applications, while average-case analysis assumes fully random inputs, which may not resemble realistic inputs either.

Smoothed analysis was invented to cope with this dilemma: we analyze the performance of algorithms on arbitrary inputs that are subject to slight random perturbations. These perturbations model, e.g., measurement errors or numerical imprecision. They rule out pathological worst-case instances while preserving most of the structure of the input, so the instances are not fully random.
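For concreteness (this formalization follows the standard Spielman-Teng definition and is not quoted from the abstract itself): if A is an algorithm with running time T_A and inputs of size n are perturbed by Gaussian noise of magnitude sigma, the smoothed complexity of A is

  \operatorname{smooth}_A(n, \sigma) \;=\; \max_{x \in \mathcal{X}_n} \; \mathbb{E}_{g \sim \mathcal{N}(0, \sigma^2 I)} \bigl[ T_A(x + g) \bigr],

i.e., an adversary picks the worst input of size n, and the expectation is taken over the random perturbation. A has polynomial smoothed complexity if this quantity is bounded by a polynomial in n and 1/sigma; for sigma -> 0 one recovers worst-case analysis, for large sigma essentially average-case analysis.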

We will introduce smoothed analysis and present two examples: the k-means method for clustering and heuristics for Euclidean optimization problems.
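To illustrate the perturbation model, here is a minimal Python sketch (not part of the talk; the instance, k, and the sigma values are arbitrary assumptions): take a fixed point set, add Gaussian noise of standard deviation sigma, and count how many iterations Lloyd's k-means method needs until the clustering stabilizes.

    import numpy as np

    def lloyd_iterations(points, k, rng, max_iter=10000):
        # Start from k distinct input points chosen uniformly at random.
        centers = points[rng.choice(len(points), size=k, replace=False)]
        labels = None
        for it in range(1, max_iter + 1):
            # Assignment step: each point goes to its nearest center.
            dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
            new_labels = dists.argmin(axis=1)
            if labels is not None and np.array_equal(new_labels, labels):
                return it  # clustering is stable: Lloyd's method has converged
            labels = new_labels
            # Update step: move each center to the centroid of its cluster.
            for j in range(k):
                members = points[labels == j]
                if len(members) > 0:
                    centers[j] = members.mean(axis=0)
        return max_iter

    rng = np.random.default_rng(0)
    instance = rng.uniform(size=(200, 2))  # stand-in for an arbitrary (adversarial) instance
    for sigma in (0.0, 0.01, 0.1):
        noise = rng.normal(scale=sigma, size=instance.shape) if sigma > 0 else 0.0
        print(sigma, lloyd_iterations(instance + noise, k=5, rng=rng))

The case sigma = 0 corresponds to the unperturbed instance; for sigma > 0 the run is on a slightly perturbed copy. While worst-case instances can force the k-means method to take exponentially many iterations, its smoothed number of iterations is polynomial in the input size and 1/sigma.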