Wednesday, March 22, 2023 - 4:00pm to 4:30pm
Event Calendar Category
LIDS & Stats Tea
Speaker Name
Zeyu Jia
Affiliation
LIDS & EECS
Building and Room Number
LIDS Lounge
We consider the question of estimating multi-dimensional Gaussian mixtures (GM) with compactly supported or subgaussian mixing distributions. The minimax estimation rate for this class (under the Hellinger, TV, and KL divergences) is a long-standing open question, even in dimension one. In this paper we characterize this rate (in all dimensions) in terms of the metric entropy of the class. Such characterizations originate from the seminal works of Le Cam (1973); Birgé (1983); Haussler and Opper (1997); Yang and Barron (1999). However, for GMs a key ingredient missing from earlier work (and widely sought-after) is a comparison result showing that the KL and the squared Hellinger distance are within a constant multiple of each other uniformly over the class. Our main technical contribution is showing this fact, from which we derive the entropy characterization of the estimation rate under Hellinger and KL. Interestingly, the sequential (online learning) estimation rate is characterized by the global entropy, while the single-step (batch) rate corresponds to the local entropy, paralleling a similar recent discovery for the Gaussian sequence model in a pair of works, Neykov (2022) and Mourtada (2023). Additionally, since Hellinger is a proper metric, our comparison shows that GMs under KL satisfy a version of the triangle inequality (with a multiplicative constant), implying that proper and improper estimation rates coincide.
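As a rough symbolic sketch of the comparison described above (an illustration only: the constants C and C' and the precise conditions on the mixing distributions stand in for the paper's exact statement):

% Sketch of the KL vs. squared Hellinger comparison over the GM class (constants are placeholders)
\[
  H^2(P,Q) \;\le\; \mathrm{KL}(P\,\|\,Q) \;\le\; C\,H^2(P,Q)
  \qquad \text{uniformly over GMs } P, Q \text{ in the class.}
\]
% Since H is a proper metric, the two-sided bound yields an approximate triangle inequality for KL:
\[
  \mathrm{KL}(P\,\|\,R) \;\le\; C'\bigl(\mathrm{KL}(P\,\|\,Q) + \mathrm{KL}(Q\,\|\,R)\bigr).
\]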
Zeyu Jia is a third-year PhD student in EECS, advised by Sasha Rakhlin and Yury Polyanskiy. He completed his undergraduate studies at Peking University. His research focuses on machine learning theory and statistics.