Title: Near Linear Lower Bound for Dimension Reduction in L_1
Authors: Alexandr Andoni, Moses S. Charikar, Ofer Neiman and Huy L. Nguyen
Abstract: Given a set of n points in L_1, how many dimensions
are needed to represent all pairwise distances within a specific
distortion? This dimension-distortion tradeoff question is well understood
for the L_2 norm, where O((\log n)/\eps^2) dimensions suffice
to achieve 1+\eps distortion. In sharp contrast, there is a significant
gap between upper and lower bounds for dimension reduction in
L_1. A recent result shows that distortion 1+\eps can be achieved
with n/\eps^2 dimensions. On the other hand, the only lower bounds
known are that distortion D requires n^{1/O(D^2)} dimensions and that
distortion 1+\eps requires n^{1/2-O(\eps\log(1/\eps))} dimensions. In this work,
we show the first near-linear lower bounds for dimension reduction
in L_1. In particular, we show that distortion 1+\eps requires at least
n^{1-O(1/\log(1/\eps))} dimensions.
Our proofs are combinatorial but inspired by linear programming.
In fact, our techniques lead to a simple combinatorial
argument that is equivalent to the LP-based proof of Brinkman
and Charikar for lower bounds on dimension reduction in L_1.
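The L_2 upper bound that the abstract contrasts with L_1 is the Johnson-Lindenstrauss phenomenon, and it can be checked empirically. The sketch below (illustrative only, not from the paper; all parameter choices n, d, k are arbitrary) projects n points from R^d to R^k via a random Gaussian matrix and measures the worst multiplicative distortion over all pairwise L_2 distances:

```python
import math
import random

# Illustrative JL sketch: a random Gaussian projection to
# k = O((log n)/eps^2) dimensions preserves all pairwise L_2 distances
# within 1+eps with high probability. No analogous dimension bound is
# possible in L_1, which is the subject of the abstract above.
random.seed(0)

n, d, k = 10, 1000, 400  # n points in R^d, projected down to R^k

points = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
# Projection matrix with i.i.d. N(0, 1/k) entries.
proj = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
        for _ in range(k)]

def project(p):
    # Matrix-vector product: image of p in R^k.
    return [sum(row[j] * p[j] for j in range(d)) for row in proj]

def dist(p, q):
    # Euclidean (L_2) distance.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

images = [project(p) for p in points]

# Worst multiplicative distortion over all pairs: the larger of
# expansion and contraction, so it is always >= 1.
worst = max(
    max(dist(images[i], images[j]) / dist(points[i], points[j]),
        dist(points[i], points[j]) / dist(images[i], images[j]))
    for i in range(n) for j in range(i + 1, n)
)
print(f"worst pairwise distortion: {worst:.3f}")
```

With these parameters the measured distortion is close to 1 even though k is far smaller than d; the results of this paper show that no such dimension-independent-of-n guarantee can hold in L_1.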