Dear colleagues,
Let me invite you to the doctoral seminar tomorrow at 9:50 in S6.
The topic of the presentation will be:
Ran Raz: Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning
In parity learning, the algorithm tries to learn a hidden vector x by
requesting samples, where a sample is a pair (r, b), with r a uniformly random
vector in {0,1}^n and b = <r,x> mod 2. The algorithm can keep requesting
samples, but cannot influence r.
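For intuition, here is a minimal Python sketch of the sample oracle described
above (the names and the dimension n = 8 are purely illustrative, not from the
paper):

    import random

    n = 8                                                  # illustrative dimension
    hidden_x = [random.randint(0, 1) for _ in range(n)]    # the vector to be learned

    def sample():
        # one sample (r, b): r uniform in {0,1}^n, b = <r, x> mod 2
        r = [random.randint(0, 1) for _ in range(n)]
        b = sum(ri * xi for ri, xi in zip(r, hidden_x)) % 2
        return r, b

    print(sample())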
The paper shows that any algorithm for parity learning that uses less than
n^2/25 bits of memory requires an exponential number of samples. Previously,
there was no non-trivial lower bound on the number of samples needed for any
learning problem, even when the allowed memory size is O(n) (where n is the
space needed to store one sample).
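To see why the memory threshold is meaningful, here is a hedged sketch (not
part of the paper or the talk) of the standard full-memory baseline: store
about n samples, roughly n^2 bits in total, and recover x by Gaussian
elimination over GF(2) after only O(n) samples.

    import random

    n = 8                                                  # illustrative dimension
    hidden_x = [random.randint(0, 1) for _ in range(n)]

    def sample():
        # one sample (r, b): r uniform in {0,1}^n, b = <r, x> mod 2
        r = [random.randint(0, 1) for _ in range(n)]
        return r, sum(ri * xi for ri, xi in zip(r, hidden_x)) % 2

    def solve_gf2(rows):
        # Gauss-Jordan elimination over GF(2) on augmented rows r + [b];
        # returns x if the stored samples determine it uniquely, else None
        a = [row[:] for row in rows]
        pivot_of_col = {}
        for col in range(n):
            pivot = next((i for i in range(len(a))
                          if i not in pivot_of_col.values() and a[i][col] == 1), None)
            if pivot is None:
                return None                                # rank < n so far
            pivot_of_col[col] = pivot
            for i in range(len(a)):
                if i != pivot and a[i][col] == 1:
                    a[i] = [u ^ v for u, v in zip(a[i], a[pivot])]
        return [a[pivot_of_col[col]][n] for col in range(n)]

    def learn_with_full_memory():
        # keep every sample in memory until the linear system determines x
        rows, x = [], None
        while x is None:
            r, b = sample()
            rows.append(r + [b])
            x = solve_gf2(rows)
        return len(rows), x

    used, recovered = learn_with_full_memory()
    print(used, recovered == hidden_x)                     # typically about n samples, True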
Requirements: basic probability; some Fourier analysis appears in the proofs
but I do not think we will get to see it.
IMPORTANT: There will be tea, cookies and fruit at 9:40! Come and have some to
get energized before the presentation. Exactly at 9:50, I will eat and drink
what's left.
See you there,
Martin Böhm