Classification noise
Random noise vs.\ adversarial noise: the learner receives $(x, \ell)$, with $\ell$ possibly different from $c(x)$, instead of $(x, c(x))$.
Malicious noise: both $x$ and the label may be corrupted (adversarially).
Random Classification noise model
Uniform noise: each label is flipped independently with the same probability $\eta < 1/2$.
Statistical Query (SQ) learning model
(Kearns). A \textbf{statistical query} is a pair (criterion, tolerance) $= (\chi, \tau)$, where $\chi : X \times \{0,1\} \to \{0,1\}$ and $\tau \in (0,1)$. The oracle replies with any value within $\tau$ of $\Pr_{x \sim D}[\chi(x, c(x)) = 1]$.
Show non-constructive SQ learnability
If
Non-uniform algorithm
So make SQs:
Also
Simulate SQ learning
Noise absent: take many examples and answer each query with its empirical average; by Hoeffding, $O(\tau^{-2}\log(1/\delta))$ examples per query suffice for tolerance $\tau$.
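The noise-free simulation can be sketched as follows. This is a minimal illustration, not from the notes: the names `simulate_stat` and `draw_example` are hypothetical, and the sample size comes from Hoeffding's inequality.

```python
import math
import random

def simulate_stat(chi, draw_example, tau, delta=0.01):
    """Answer the statistical query (chi, tau) from labeled examples.

    draw_example() returns a pair (x, c(x)).  By Hoeffding's inequality,
    m = ln(2/delta) / (2 tau^2) samples make the empirical mean of chi
    tau-accurate with probability at least 1 - delta.
    """
    m = math.ceil(math.log(2 / delta) / (2 * tau ** 2))
    return sum(chi(x, y) for x, y in (draw_example() for _ in range(m))) / m

# Toy target (hypothetical): c(x) = x[0] on uniform 2-bit inputs.
rng = random.Random(0)
def draw():
    x = (rng.randint(0, 1), rng.randint(0, 1))
    return x, x[0]

est = simulate_stat(lambda x, y: y, draw, tau=0.05)  # estimates Pr[c(x)=1] = 1/2
```

A constant criterion is answered exactly, and a nontrivial one to within the tolerance with high probability.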
Efficient PAC learnability with noise
Any C efficiently learnable using SQs is efficiently PAC learnable with classification noise. The converse fails, so the PAC model is strictly more powerful than SQ; parity is the separating example.
(Also learnable using statistical queries): conjunctions; k-CNF, by feature expansion (treat each clause of at most $k$ literals as a new variable, reducing to a conjunction).
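The feature expansion can be sketched concretely; the function name `expand` and the literal encoding are assumptions for illustration only.

```python
import itertools

def expand(x, k):
    """Map x in {0,1}^n to the truth values of every clause of at most k
    literals.  Any k-CNF over x is then a plain conjunction over these
    features, so a conjunction learner applies to the expanded examples."""
    n = len(x)
    lits = [(i, neg) for i in range(n) for neg in (False, True)]
    feats = []
    for size in range(1, k + 1):
        for clause in itertools.combinations(lits, size):
            # Clause is true if any of its literals is satisfied by x.
            feats.append(int(any((x[i] == 0) if neg else (x[i] == 1)
                                 for i, neg in clause)))
    return feats
```

The expansion has $O(n^k)$ features, which is why this only works for constant $k$.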
Halfspaces (\why: Blum et al.): an important subroutine for various learning algorithms.
Learning Parity
Learning k-PARITY with random noise: measure the empirical errors of all $\binom{n}{k}$ candidate parities on the noisy sample; the true parity has error $\approx \eta$ while every other parity has error $\approx 1/2$, so output the minimizer.
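The brute-force k-parity learner can be sketched as follows; the names `learn_k_parity` and `sample` and all parameter values are hypothetical.

```python
import itertools
import random

def learn_k_parity(samples, n, k):
    """Try all C(n, k) parities and keep the one agreeing with the most
    noisy labels: the true parity agrees on a ~(1 - eta) fraction, every
    other parity on only ~1/2, so enough samples separate them."""
    def agreement(S):
        return sum((sum(x[i] for i in S) % 2) == y for x, y in samples)
    return max(itertools.combinations(range(n), k), key=agreement)

# Toy run (hypothetical): n = 8 bits, true parity on {1, 4}, 10% noise.
rng = random.Random(2)
def sample():
    x = [rng.randint(0, 1) for _ in range(8)]
    y = (x[1] + x[4]) % 2
    if rng.random() < 0.1:
        y = 1 - y
    return x, y

learned = learn_k_parity([sample() for _ in range(500)], n=8, k=2)
```

Runtime is $\binom{n}{k} \cdot m$, polynomial only for constant $k$; this is exactly why full parity under noise remains hard.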
If parity is learnable in polynomial time under noise, then DNFs are efficiently learnable under the uniform distribution $U$: see the reduction from agnostic parity learning.
\oprob: If you can learn parity when there is constant noise, can you learn it when the noise rate tends to $1/2$?
Random persistent classification noise
Classification noise model with one modification: each example's noisy label is fixed once and for all, so querying the same $x$ repeatedly returns the same (possibly flipped) answer.
Motivation: random (non-persistent) classification noise can be beaten if an MQ(c) oracle is present: query each example many times and take the majority vote of the answers.
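The majority-vote trick can be sketched as below; `denoised_mq`, `noisy_mq`, and the parameter values are hypothetical names for illustration.

```python
import random

def denoised_mq(noisy_mq, x, repeats=101):
    """Repeat the membership query and take a majority vote.  With
    NON-persistent noise each answer is an independent eta-flip of c(x),
    so for eta < 1/2 the vote errs with probability exp(-Omega(repeats)).
    Persistent noise defeats this: every repeat returns the same answer."""
    votes = sum(noisy_mq(x) for _ in range(repeats))
    return int(votes > repeats // 2)

# Toy target (hypothetical): c(x) = x mod 2 behind a 30%-noisy,
# non-persistent membership oracle.
rng = random.Random(3)
def noisy_mq(x):
    y = x % 2
    return y if rng.random() > 0.3 else 1 - y

answers = [denoised_mq(noisy_mq, x) for x in range(10)]
```

This is precisely the motivation for the persistent model: once the flip for each $x$ is fixed, repetition gives no new information.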