
165 SW Sackett Place, Corvallis, OR 97321

Free Event

Nishant Mehta, Assistant Professor
Department of Computer Science
University of Victoria

I will talk about a new notion of complexity, "COMP", that interpolates between and generalizes some existing complexity notions from statistical learning theory. I will first derive COMP as a generalization of the Shtarkov sum, the normalizer of the normalized maximum likelihood (NML) distribution. When the NML distribution exists, the logarithm of the Shtarkov sum is precisely the minimax regret for individual-sequence prediction under log loss.

Next, via a PAC-Bayesian analysis, I will show how COMP can be used to obtain tight upper bounds, stated in terms of COMP itself, on the excess risk of randomized estimators (which include generalized Bayesian estimators). Under one specialization, further upper bounding COMP leads to a standard PAC-Bayesian excess risk bound whose right-hand side is the information complexity: essentially the empirical excess risk plus a complexity term involving the KL divergence from the posterior distribution to the prior distribution. Under a different specialization, covering empirical risk minimization with VC-type classes and with "large classes" (those whose empirical L2 entropy exhibits polynomial growth), we will see how COMP can be upper bounded in a way that yields optimal rates of convergence of the excess risk for such classes. Along the way, we will see connections to Rademacher complexity; in particular, we will recover bounds based on local Rademacher complexity while completely avoiding the complicated arguments on which local Rademacher complexity analyses usually rely.

This is joint work with Peter Grünwald of CWI (Amsterdam) and Leiden University.
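As background for the abstract, the sketch below records the standard definitions of the Shtarkov sum and the NML distribution, the minimax-regret identity the abstract refers to, and the usual form of the information complexity in PAC-Bayesian bounds. The notation (S_n, eta, the prior pi, the posterior pi-hat, the empirical excess risk R-hat_n) is our own, chosen for illustration; the abstract does not give the exact form of COMP, so it is not reproduced here.

% A minimal compilable sketch (our notation, not the speaker's).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Shtarkov sum and NML distribution for a class F of probability
% distributions over sequences x^n taking values in X^n:
\[
  S_n(\mathcal{F}) = \sum_{x^n \in \mathcal{X}^n} \sup_{f \in \mathcal{F}} f(x^n),
  \qquad
  p_{\mathrm{NML}}(x^n) = \frac{\sup_{f \in \mathcal{F}} f(x^n)}{S_n(\mathcal{F})}.
\]

% When S_n(F) is finite, the minimax regret for individual-sequence
% prediction under log loss equals its logarithm, and the NML
% distribution attains it:
\[
  \inf_{q} \, \sup_{x^n} \Bigl( -\log q(x^n) - \inf_{f \in \mathcal{F}} \bigl( -\log f(x^n) \bigr) \Bigr)
  = \log S_n(\mathcal{F}).
\]

% A standard form of the information complexity appearing on the
% right-hand side of PAC-Bayesian excess risk bounds: prior pi,
% posterior pi-hat, learning rate eta > 0, sample size n, and
% empirical excess risk R-hat_n:
\[
  \mathbb{E}_{f \sim \hat{\pi}} \bigl[ \widehat{R}_n(f) \bigr]
  + \frac{\operatorname{KL}(\hat{\pi} \,\|\, \pi)}{\eta \, n}.
\]

\end{document}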

Additional information: http://eecs.oregonstate.edu/colloquium-series
