May 4, Tuesday
12:00 – 13:30
1. What constraints should we relax to obtain such an effective over-approximating abstraction?
2. How should we combine information provided by multiple such abstractions?
In this talk we consider both of these questions and present recent formal and empirical results that help answer them (sometimes even to optimality). Specifically,
- Considering Q1, we introduce a generalization of explicit abstractions (such as pattern databases (PDBs) and merge-and-shrink) to what we call implicit abstractions. The basic idea is to abstract the problem at hand into provably tractable fragments of optimal planning, thereby lifting the restriction of PDBs to projections of only low dimensionality. We then introduce a concrete instance of this framework, called fork-decomposition, and show both formally and empirically that the admissible heuristics induced by these abstractions provide state-of-the-art worst-case informativeness guarantees on several standard domains. (A brief recap of abstraction heuristics and their additive composition is given after this list.)
- Considering Q2, we describe a procedure that takes a classical planning task, a forward-search state, and a set of abstraction-based admissible heuristics, and derives an optimal additive composition of these heuristics with respect to the given state. Most importantly, we show that this procedure runs in polynomial time for arbitrary sets of all abstraction-based heuristics known to us, including PDBs, constrained PDBs, merge-and-shrink abstractions, fork-decomposition implicit abstractions, and implicit abstractions based on tractable constraint optimization. (A toy sketch of such a per-state composition is given after this list.)
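
For context, the following is standard background on abstraction heuristics, not taken from the abstract itself. An abstraction \alpha maps the states of a planning task onto the states of a smaller abstract transition system T^\alpha, and induces the admissible heuristic

    h^\alpha(s) = h^*_{T^\alpha}(\alpha(s)) \le h^*(s),

the goal distance of \alpha(s) in T^\alpha. A sum of abstraction heuristics h^{\alpha_1}, ..., h^{\alpha_k} remains admissible if the operator costs are partitioned among the abstractions,

    \sum_{i=1}^{k} cost_i(o) \le cost(o)   for every operator o,

and each h^{\alpha_i} is evaluated under its cost share cost_i; then \sum_i h^{\alpha_i}(s) \le h^*(s).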
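
As a purely illustrative sketch of deriving an additive composition for a fixed search state, the following linear program maximizes the summed abstract goal distances over a cost partition. The two abstract transition systems, operators, and costs are invented for this toy example; this is not the procedure presented in the talk, which in particular also handles implicitly represented abstractions.

# Toy sketch: state-dependent optimal additive composition of abstraction
# heuristics via linear programming, in the spirit of optimal cost partitioning.
# The task, abstractions, operators, and costs below are invented examples.
from scipy.optimize import linprog

# Operators of the (hypothetical) original task and their costs.
op_cost = {"o1": 1.0, "o2": 4.0, "o3": 5.0}

# Two explicitly represented abstract transition systems (e.g. projections),
# each with its abstract goal states and the abstract image of the current
# forward-search state.
abstractions = [
    {"states": ["a0", "a1"],
     "transitions": [("a0", "o1", "a1"), ("a0", "o3", "a1")],
     "goals": {"a1"}, "current": "a0"},
    {"states": ["b0", "b1"],
     "transitions": [("b0", "o2", "b1"), ("b0", "o3", "b1")],
     "goals": {"b1"}, "current": "b0"},
]

# LP variables: a goal-distance variable d(i, t) per abstract state t of each
# abstraction i, and a cost share c(i, o) per abstraction i and operator o.
var_index = {}
for i, ab in enumerate(abstractions):
    for t in ab["states"]:
        var_index[("d", i, t)] = len(var_index)
    for o in op_cost:
        var_index[("c", i, o)] = len(var_index)
n = len(var_index)

def row(coeffs):
    r = [0.0] * n
    for key, val in coeffs.items():
        r[var_index[key]] = val
    return r

A_ub, b_ub = [], []
for i, ab in enumerate(abstractions):
    # Along every abstract transition t --o--> t':  d(i,t) <= c(i,o) + d(i,t').
    for t, o, t2 in ab["transitions"]:
        A_ub.append(row({("d", i, t): 1.0, ("d", i, t2): -1.0, ("c", i, o): -1.0}))
        b_ub.append(0.0)

# The cost shares of every operator must not exceed its original cost.
for o, cost in op_cost.items():
    A_ub.append(row({("c", i, o): 1.0 for i in range(len(abstractions))}))
    b_ub.append(cost)

# Abstract goal states get distance 0; all other variables are non-negative.
bounds = [(0.0, None)] * n
for (kind, i, x), j in var_index.items():
    if kind == "d" and x in abstractions[i]["goals"]:
        bounds[j] = (0.0, 0.0)

# Maximize the summed abstract goal distances of the current search state
# (linprog minimizes, hence the negated objective coefficients).
objective = row({("d", i, ab["current"]): -1.0 for i, ab in enumerate(abstractions)})

result = linprog(c=objective, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal additive heuristic value:", -result.fun)  # 5.0 on this toy input

Here the optimal partition gives the full composed value of 5.0, whereas a uniform split of the shared operator's cost would yield only 3.5.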
The talk is based on joint work with Michael Katz (Technion).