LJK Probability & Statistics Seminar

On Thursday, March 2, 2017, at 14h00 in Room 106, IMAG Building

Talk by François PORTIER (TELECOM ParisTech)

Ordinary least squares Monte Carlo

Summary

The use of control variates is a well-known method to reduce the variance of the naive Monte Carlo estimator of an integral. A formulation of the method is presented based on a regression model in which the dependent variable is the function to integrate and the predictors are elements of a linear space of test functions with known integral, assumed to be zero without loss of generality. It is shown that the ordinary least-squares estimator of the intercept is equal to a certain control-variate enhanced Monte Carlo estimator. The asymptotic variance of the estimator is equal to the variance of the residual variable in the regression model. More importantly, it is demonstrated that if the number of predictors is allowed to grow to infinity with the number of Monte Carlo replicates, convergence takes place at a faster rate than for ordinary Monte Carlo integration, the limit distribution of the standardized errors still being Gaussian. In addition, the estimator of the residual variance in the regression model is a consistent estimator of the asymptotic variance. The performance of the method in finite samples is investigated through a simulation study for various choices of the test functions and in various dimensions.
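To make the construction concrete, here is a minimal sketch of the OLS formulation described in the abstract. The integrand, the uniform sampling distribution on [0, 1], and the polynomial test functions h_j(x) = x^j - 1/(j+1) (centered so each has integral zero) are illustrative choices, not taken from the talk; the key point is only that the OLS intercept in a regression of f(X) on the test functions recovers a control-variate enhanced Monte Carlo estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy integrand on [0, 1]; its true integral is e - 1 (an assumption
    # for illustration, not the example used in the talk)
    return np.exp(x)

n = 10_000  # number of Monte Carlo replicates
q = 5       # number of test functions (control variates)
x = rng.uniform(size=n)

# Test functions h_j(x) = x^j - 1/(j+1), chosen so each integrates
# to zero over [0, 1], as the abstract assumes without loss of generality
H = np.column_stack([x**j - 1.0 / (j + 1) for j in range(1, q + 1)])

# Regress f(X) on an intercept column plus the test functions; the fitted
# intercept is the control-variate enhanced Monte Carlo estimate
design = np.column_stack([np.ones(n), H])
beta, *_ = np.linalg.lstsq(design, f(x), rcond=None)
cv_estimate = beta[0]

naive_estimate = f(x).mean()
print(f"naive MC : {naive_estimate:.6f}")
print(f"OLS MC   : {cv_estimate:.6f}")
print(f"truth    : {np.e - 1:.6f}")
```

Because exp is well approximated by a degree-5 polynomial on [0, 1], the residual variance in the regression is tiny here, so the intercept estimate is far more accurate than the naive mean; letting q grow with n is exactly the regime in which the abstract claims a faster-than-root-n rate.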
This is a joint work with Johan Segers.
