TR95-041 Authors: Alexander E. Andreev, Andrea E. F. Clementi, Jose Rolim

Publication: 1st August 1995 18:47


We prove an optimal bound on the Shannon function

$L(n,m,\epsilon)$ which describes the trade-off between the

circuit-size complexity and the degree of approximation; that is

$$L(n,m,\epsilon)\ =\

\Theta\left(\frac{m\epsilon^2}{\log(2 + m\epsilon^2)}\right)+O(n).$$
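To see the shape of the trade-off, consider the specialization (ours, for illustration) to constant approximation degree: when $\epsilon = \Theta(1)$ the argument $m\epsilon^2$ is $\Theta(m)$, and the bound reduces to

$$L(n,m,\Theta(1))\ =\ \Theta\left(\frac{m}{\log m}\right)+O(n),$$

the familiar $m/\log m$ growth of circuit-size Shannon functions.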

Our bound applies to any partial boolean function

and any approximation degree, and thus completes the study of

boolean function approximation, introduced by Pippenger, concerning

circuit-size complexity. As a consequence, we provide the approximation

degree achieved by polynomial size circuits on a `random' boolean function;

that is

$$App_0(n,\,l(n)=n^k)\ =\ \Theta\left(\frac{\sqrt{n^k-n}}{2^{n/2}}\right)\ , \quad k>1.$$
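Since $n^k-n=\Theta(n^k)$ for $k>1$, this expression can be rewritten (a routine simplification on our part) as

$$App_0(n,\,n^k)\ =\ \Theta\left(n^{k/2}\,2^{-n/2}\right),$$

which makes explicit that the achievable approximation degree decays exponentially in $n$ for any fixed polynomial circuit size.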

As an application, we obtain a non-trivial upper bound on the hardness function

$H(f)$ introduced by Nisan and Wigderson; that is, for any boolean function $f: \{0,1\}^n \rightarrow \{0,1\}$:

$$H(f)\ \le\ 2^{n/3}+n^2+O(1).$$

The optimal bound for $L(n,m,\epsilon)$ also gives a general criterion for assessing the quality of a given learning algorithm for partial boolean functions.

The main technical contribution of our proofs is a new technique, based on certain algebraic properties of linear operators, for approximating partial boolean functions.