Comparison and anti-concentration bounds for maxima of Gaussian random vectors

Slepian and Sudakov-Fernique type inequalities, which compare expectations of maxima of Gaussian random vectors under certain restrictions on the covariance matrices, play an important role in probability theory, especially in empirical process and extreme value theories. Here we give explicit comparisons of expectations of smooth functions and distribution functions of maxima of Gaussian random vectors without any restriction on the covariance matrices. We also establish an anti-concentration inequality for the maximum of a Gaussian random vector, which yields a useful upper bound on the Lévy concentration function for the Gaussian maximum. The bound is dimension-free and applies to vectors with arbitrary covariance matrices. This anti-concentration inequality plays a crucial role in establishing bounds on the Kolmogorov distance between maxima of Gaussian random vectors. These results have immediate applications in mathematical statistics. As an application, we establish a conditional multiplier central limit theorem for maxima of sums of independent random vectors where the dimension of the vectors is possibly much larger than the sample size.
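The Lévy concentration function mentioned in the abstract, sup over x of P(x ≤ Z ≤ x + ε) for Z the maximum of a Gaussian vector, can be estimated by simulation. The sketch below is illustrative only, not the paper's method: it takes the simple identity-covariance case with an assumed dimension `p` and band width `eps`, and approximates the supremum over a finite grid.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_sim, eps = 200, 20_000, 0.05  # illustrative choices, not from the paper

# Draw maxima of p i.i.d. standard Gaussians (identity covariance, a simple case)
Z = rng.standard_normal((n_sim, p)).max(axis=1)

# Estimate the Lévy concentration function sup_x P(x <= Z <= x + eps)
# by scanning a grid of candidate locations x.
grid = np.linspace(Z.min(), Z.max(), 400)
conc = max(((Z >= x) & (Z <= x + eps)).mean() for x in grid)
print(conc)
```

For small ε the estimate is small even for large p, consistent with the dimension-free flavor of the bound described above; shrinking `eps` shrinks the estimate roughly proportionally.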


Bibliographic Details
Main Authors: Chernozhukov, Victor V. (Contributor), Chetverikov, Denis (Author), Kato, Kengo (Author)
Other Authors: Massachusetts Institute of Technology. Department of Economics (Contributor), Massachusetts Institute of Technology. Operations Research Center (Contributor)
Format: Article
Language: English
Published: Springer Berlin Heidelberg, 2016-06-27T16:05:18Z.
Subjects:
Online Access: Get fulltext
LEADER 01952 am a22002053u 4500
001 103354
042 |a dc 
100 1 0 |a Chernozhukov, Victor V.  |e author 
100 1 0 |a Massachusetts Institute of Technology. Department of Economics  |e contributor 
100 1 0 |a Massachusetts Institute of Technology. Operations Research Center  |e contributor 
100 1 0 |a Chernozhukov, Victor V.  |e contributor 
700 1 0 |a Chetverikov, Denis  |e author 
700 1 0 |a Kato, Kengo  |e author 
245 0 0 |a Comparison and anti-concentration bounds for maxima of Gaussian random vectors 
260 |b Springer Berlin Heidelberg,   |c 2016-06-27T16:05:18Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/103354 
520 |a Slepian and Sudakov-Fernique type inequalities, which compare expectations of maxima of Gaussian random vectors under certain restrictions on the covariance matrices, play an important role in probability theory, especially in empirical process and extreme value theories. Here we give explicit comparisons of expectations of smooth functions and distribution functions of maxima of Gaussian random vectors without any restriction on the covariance matrices. We also establish an anti-concentration inequality for the maximum of a Gaussian random vector, which yields a useful upper bound on the Lévy concentration function for the Gaussian maximum. The bound is dimension-free and applies to vectors with arbitrary covariance matrices. This anti-concentration inequality plays a crucial role in establishing bounds on the Kolmogorov distance between maxima of Gaussian random vectors. These results have immediate applications in mathematical statistics. As an application, we establish a conditional multiplier central limit theorem for maxima of sums of independent random vectors where the dimension of the vectors is possibly much larger than the sample size. 
546 |a en 
655 7 |a Article 
773 |t Probability Theory and Related Fields