March 17, Wednesday
12:00 – 13:30

Distributed (Delta+1)-coloring in linear (in Delta) time
Graduate seminar
Lecturer : Leonid Barenboim
Affiliation : CS,BGU
Location : 202/37
Host : Graduate Seminar
The distributed (Delta + 1)-coloring problem is a fundamental problem in distributed computing, motivated by various tasks such as symmetry breaking, channel allocation in wireless networks, and more. The network is modeled by a graph G = (V,E), where vertices represent processors and edges represent communication links. The goal of legal vertex coloring is to assign colors to vertices such that any two adjacent vertices receive distinct colors. The number of colors used is at most (Delta + 1), where Delta is the maximum degree of G. The distributed (Delta + 1)-coloring problem has been intensively studied since the mid-eighties. Most algorithms use a common technique called "iterative color reduction". In 1993, Szegedy and Vishwanathan gave a heuristic argument stating that any algorithm using the iterative color reduction technique is unlikely to achieve a running time smaller than Omega(Delta log Delta). In 2006, Kuhn and Wattenhofer presented an algorithm with running time O(Delta log Delta + log^* n). This algorithm was the best known prior to our work.
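To make the problem concrete, here is a minimal sketch (not the distributed algorithm from the talk): a sequential greedy coloring illustrating why Delta + 1 colors always suffice, together with a legality check. The adjacency-dict graph representation and function names are illustrative assumptions.

```python
def greedy_coloring(adj):
    """Color vertices greedily; each vertex takes the smallest color
    not used by an already-colored neighbor. Since a vertex has at most
    Delta neighbors, some color in {0, ..., Delta} is always free."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

def is_legal(adj, color):
    """A coloring is legal if adjacent vertices get distinct colors."""
    return all(color[u] != color[v] for u in adj for v in adj[u])

# Example: a 4-cycle (Delta = 2), so colors stay within {0, 1, 2}.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
col = greedy_coloring(adj)
assert is_legal(adj, col)
assert max(col.values()) <= 2  # at most Delta + 1 = 3 colors
```

The distributed setting studied in the talk is much harder: vertices must decide on colors via message passing in few rounds, rather than in a global sequential order.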

We improve this result, and break the heuristic barrier of Szegedy and Vishwanathan, by devising a (Delta + 1)-coloring algorithm with running time O(Delta) + 1/2 log^* n. Our algorithm employs a novel technique in distributed coloring that is apparently stronger than the iterative color reduction technique. In this technique, we efficiently construct colorings with the relaxed requirement that each vertex has at most m neighbors colored by the same color. Such a coloring is called an m-defective coloring. We show how to construct such colorings very efficiently for a certain range of values of m. This construction can then be transformed into a legal (Delta + 1)-coloring. In addition, this technique yields a tradeoff between the number of colors and the running time, resulting in an O(Delta * t)-coloring with running time O(Delta/t + log^* n). Since many problems use the coloring problem as a black box, our results imply better solutions for these problems. In particular, our results yield an O(Delta) + 1/2 log^* n time algorithm for the Maximal Independent Set problem, which improves previous results on bounded-degree graphs. Joint work with Michael Elkin. Appeared in STOC 2009.
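The relaxed notion above can be stated precisely with a short check. The following is a minimal sketch, with an assumed adjacency-dict representation; the efficient distributed construction of such colorings is the subject of the talk and is not reproduced here.

```python
def defect(adj, color, v):
    """Number of v's neighbors that share v's color."""
    return sum(1 for u in adj[v] if color[u] == color[v])

def is_m_defective(adj, color, m):
    """True if every vertex has at most m same-colored neighbors.
    A 0-defective coloring is exactly a legal coloring."""
    return all(defect(adj, color, v) <= m for v in adj)

# Example: a triangle colored with 2 colors is 1-defective but not legal,
# since two adjacent vertices share color 0.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
color = {0: 0, 1: 0, 2: 1}
assert is_m_defective(adj, color, m=1)
assert not is_m_defective(adj, color, m=0)
```

Allowing a small defect m lets a defective coloring use far fewer colors than a legal one, which is what makes the subsequent transformation into a legal (Delta + 1)-coloring efficient.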