Prof. dr. Hans Bodlaender

Professor
Algorithms and Complexity
+31 30 253 4409
h.l.bodlaender@uu.nl

My research focuses on Algorithms and Complexity. Many computational tasks for computers (and other computing devices) are hard and time-consuming. A central question is: how can we design methods (algorithms) that solve these tasks efficiently? Given a specific task, we can ask: how hard is it for a computer to solve? Are there fast algorithms, or are all algorithms necessarily slow? How much time is needed? And how many other resources (e.g., memory for the computation, number of parallel processes) are needed?

I work and have worked on various topics in this field, including:

  • Fixed-parameter complexity. Suppose some parameter of our task is small. For example, in a facility location problem we want to find the optimal locations of k hospitals on a map, with the map and k given, such that the maximum travel time to a hospital is as small as possible, and the number k of facilities to be placed is small. Several tasks become much easier when such a parameter is small. The area of fixed-parameter complexity studies such cases and provides faster algorithms, but also a mathematical analysis of what is, and is not, possible to achieve.
  • Kernelization: the mathematical analysis of preprocessing. A common approach when solving a hard problem is to apply preprocessing before running a computationally expensive exact algorithm. The notion of kernelization makes this mathematically precise: can we guarantee a bound on the size of the instance produced by a 'safe and sound' preprocessing method?
  • Graphs and networks with a tree-like structure. Many computationally hard problems on graphs and networks become significantly easier if the network (or graph) has a representation with a tree structure. One of the most successful approaches uses the notions of tree decomposition and treewidth. My work includes algorithms to determine whether such tree representations exist and, if so, to find them, as well as (improved) methods to exploit these representations.
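The first two themes can be illustrated with a classic textbook example (an illustrative sketch, not an algorithm from my own papers): deciding whether a graph has a vertex cover of size at most k. A simple kernelization rule (often attributed to Buss) observes that a vertex whose degree exceeds the remaining budget must be in every small cover; after preprocessing, a bounded search tree of depth at most k finishes the job.

```python
def kernelize(edges, k):
    """Buss-style kernelization for Vertex Cover: a vertex whose degree
    exceeds the remaining budget must be in every cover of size <= k,
    so it can be taken greedily and its incident edges removed."""
    edges = {frozenset(e) for e in edges}
    forced = set()  # vertices that every cover of size <= k must contain
    while len(forced) <= k:
        budget = k - len(forced)
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        high = next((v for v, d in degree.items() if d > budget), None)
        if high is None:
            break
        forced.add(high)
        edges = {e for e in edges if high not in e}
    return edges, forced

def branch(edges, k):
    """Bounded search tree: for any edge {u, v}, some endpoint must be
    in the cover; branching on the two choices gives depth <= k."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = next(iter(edges))
    return any(branch({e for e in edges if w not in e}, k - 1)
               for w in (u, v))

def has_vertex_cover(edges, k):
    """Decide whether the graph has a vertex cover of size at most k."""
    edges, forced = kernelize(edges, k)
    return len(forced) <= k and branch(edges, k - len(forced))
```

The running time depends exponentially on k but only polynomially on the graph size, which is exactly the kind of behavior that fixed-parameter complexity studies; the preprocessing step is a (very small) example of a kernelization with a provable guarantee.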

To study the algorithmic complexity of problems, we design new algorithms and analyze their efficiency, but we also use complexity-theoretic tools to argue that algorithms of a certain efficiency are unlikely to exist.

Completed Projects
Project
KERNELS 01.09.2009 to 31.08.2013
General project description
NWO project on Kernelization: Analysis of preprocessing of combinatorial problems
Role
Project Leader
Funding
NWO grant