Logic Seminar

James Freitag
UIC
Random query learning and model theory
Abstract: In this talk, we will describe the work of Angluin and Dohrn (2017), which analyzes a variation of equivalence query learning in which the teacher chooses counterexamples from a fixed (but arbitrary) probability distribution. Angluin and Dohrn show that the expected number of queries needed to exactly identify a target concept grows linearly in $\log(n)$, where $n$ is the size of the concept class. They also show that no better upper bound can be obtained in terms of the VC-dimension of the class: even when the VC-dimension is one, the $\log(n)$ bound can be attained.
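As a rough illustration of the query model, here is a toy sketch in Python; it is not the algorithm from the paper or from the talk. Concepts are finite sets, a hypothetical teacher answers each equivalence query with a counterexample drawn from a fixed distribution over the points of disagreement, and the learner predicts with the majority vote of the concepts still consistent with past answers (a hypothesis that may fall outside the class, which is a simplification). Each counterexample then discards at least half of the remaining concepts, so roughly $\log_2(n)$ queries suffice in this toy setting.

import random


def learn_with_random_counterexamples(concepts, target, domain, weights, seed=0):
    """Toy halving-style learner in the random-counterexample model.

    concepts : list of frozensets over `domain` (the concept class)
    target   : the frozenset to be identified (must belong to `concepts`)
    weights  : dict mapping each domain point to its probability mass
    Returns the number of equivalence queries used.
    """
    rng = random.Random(seed)
    version_space = list(concepts)  # concepts consistent with all answers so far
    queries = 0
    while True:
        # Hypothesis: majority vote of the remaining consistent concepts.
        hypothesis = frozenset(
            x for x in domain
            if 2 * sum(x in c for c in version_space) > len(version_space)
        )
        queries += 1
        disagreement = [x for x in domain if (x in hypothesis) != (x in target)]
        if not disagreement:
            return queries  # exact identification
        # The teacher draws a counterexample from the fixed distribution,
        # restricted to the points where hypothesis and target disagree.
        x = rng.choices(disagreement, weights=[weights[p] for p in disagreement], k=1)[0]
        # At least half of the version space labels x like the hypothesis
        # (hence unlike the target), so this step at least halves it.
        version_space = [c for c in version_space if (x in c) == (x in target)]


# Example: singletons over a 16-point domain, uniform teacher distribution.
domain = list(range(16))
concepts = [frozenset({i}) for i in domain]
weights = {i: 1 / 16 for i in domain}
print(learn_with_random_counterexamples(concepts, concepts[3], domain, weights))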
We will show that the expected number of queries grows linearly in the Littlestone dimension. For many set systems this gives better bounds than those of Angluin and Dohrn, since the Littlestone dimension is bounded above by $\log(n)$, while in general concept classes $n$ may be arbitrarily large even when the Littlestone dimension is fixed. Many examples are provided by formulas in stable theories (a formula is stable if and only if it has finite Littlestone dimension).
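As a small concrete illustration (again not from the talk), the Littlestone dimension of a finite concept class can be computed directly from its recursive mistake-tree characterization; the brute-force Python sketch below does exactly that. The class of singletons shows how the domain, and the number of concepts, can grow without bound while the Littlestone dimension stays equal to one.

def littlestone_dimension(concepts, domain):
    """Littlestone dimension of a finite concept class, computed from the
    recursive mistake-tree characterization:
        Ldim(C) >= d + 1  iff  some point x splits C into two nonempty
        subclasses (those labelling x with 0 and with 1), each of
        Littlestone dimension >= d.
    Exponential-time; intended only for tiny illustrative examples.
    """
    def ldim(cls):
        best = 0
        for x in domain:
            c0 = [c for c in cls if x not in c]
            c1 = [c for c in cls if x in c]
            if c0 and c1:
                best = max(best, 1 + min(ldim(c0), ldim(c1)))
        return best

    return ldim([frozenset(c) for c in concepts])


# Singletons over a domain of size n: VC-dimension 1 and Littlestone
# dimension 1, no matter how large n is, even though the class has n concepts.
domain = list(range(8))
singletons = [{i} for i in domain]
print(littlestone_dimension(singletons, domain))  # prints 1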
This is joint work with Hunter Chase.
Tuesday October 16, 2018 at 3:30 PM in 427 SEO