University of California, San Diego

****************************

Special Recruitment Colloquium

Dr. Misha Alekhnovich

Institute for Advanced Study, Princeton

Learnability and Automatizability

Abstract:

We consider the complexity of properly learning concept classes, i.e., when the learner must output a hypothesis of the same form as the unknown concept. We present the following new upper and lower bounds on well-known concept classes:

• We show that unless $NP = RP$, there is no polynomial-time learning algorithm for DNF formulae where the hypothesis is an OR-of-thresholds. As special cases, this shows that neither DNF nor OR-of-thresholds are properly learnable unless $NP = RP$.

• Assuming that $NP \not\subseteq DTIME(2^{n^\epsilon})$ for a certain constant $\epsilon < 1$, we show that it is not possible to learn size $s$ decision trees by size $s^{k}$ decision trees for any $k \geq 0$. Previous hardness results for learning decision trees held only for $k \leq 2$.

• We present the first non-trivial upper bounds on properly learning DNF formulae and decision trees. In particular, we show how to learn size $s$ DNF by DNF in time $2^{\tilde{O}(\sqrt{n \log s})}$, and how to learn size $s$ decision trees by decision trees in time $n^{O(\log s)}$.
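To make the notion of "proper" learning concrete, here is a minimal sketch of a brute-force proper DNF learner: given labeled examples of a size-$s$ DNF over $n$ variables, it searches for a hypothesis that is itself a DNF with at most $s$ terms. The encoding, function names, and exhaustive search strategy are illustrative only (feasible just for tiny $n$ and $s$) and are not the algorithms from the talk, whose results concern far more efficient learners and matching hardness bounds.

```python
from itertools import combinations, product

# A literal is (variable_index, required_bit); a term is a tuple of
# literals (a conjunction); a DNF is a collection of terms (a disjunction).
# This toy learner is "proper": its hypothesis is a DNF, the same form
# as the unknown concept. (Illustrative sketch, not the talk's algorithm.)

def eval_term(term, x):
    return all(x[i] == b for i, b in term)

def eval_dnf(dnf, x):
    return any(eval_term(t, x) for t in dnf)

def all_terms(n):
    """Every non-empty conjunction of literals over n variables."""
    terms = []
    for k in range(1, n + 1):
        for idxs in combinations(range(n), k):
            for bits in product([0, 1], repeat=k):
                terms.append(tuple(zip(idxs, bits)))
    return terms

def learn_dnf_properly(sample, n, s):
    """Return a DNF with at most s terms consistent with the sample, or None."""
    terms = all_terms(n)
    for k in range(s + 1):
        for dnf in combinations(terms, k):
            if all(eval_dnf(dnf, x) == y for x, y in sample):
                return list(dnf)
    return None

# Toy target: (x0 AND x1) OR (NOT x2), a size-2 DNF on 3 variables.
target = [((0, 1), (1, 1)), ((2, 0),)]
sample = [(x, eval_dnf(target, x)) for x in product([0, 1], repeat=3)]
h = learn_dnf_properly(sample, n=3, s=2)
```

The search over all term subsets takes time exponential in the number of candidate terms, which is exactly why the non-trivial $2^{\tilde{O}(\sqrt{n \log s})}$ upper bound in the abstract is interesting.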

Host: Sam Buss

January 24, 2005

3:00 PM

AP&M 7321

****************************