Bayesian Indirect Inference and the ABC of GMM

Speaker: Han Hong
Stanford University

Venue: Packard 101
Time: 4:15 pm to 5:15 pm
Date: Thursday, May 12, 2016


In this paper we propose and study local linear and local polynomial regression-based estimators for implementing Approximate Bayesian Computation (ABC)-style indirect inference and GMM estimators. The method uses nonparametric regression in the computation of GMM and indirect inference models. We provide formal conditions under which frequentist inference is asymptotically valid, and we demonstrate the validity of the estimated posterior quantiles for confidence interval construction. We also show that in this setting, local linear kernel regression methods have theoretical advantages over local constant kernel methods, advantages that are also reflected in finite-sample simulation results. Our results apply to both exactly identified and overidentified models. These estimators do not rely on numerical optimization or Markov Chain Monte Carlo (MCMC) simulation. They provide an effective complement to classical M-estimators and to MCMC methods, and can be applied to both likelihood-based models and method-of-moments-based models.
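To fix ideas, the local constant and local linear kernel regression steps contrasted in the abstract can be sketched on a toy example. The setup below (a normal location model with the sample mean as summary statistic, and all parameter values such as the bandwidth) is purely illustrative and not taken from the paper; it follows the standard regression-adjusted ABC recipe, where parameters are drawn from a prior, summary statistics are simulated, and the posterior mean is estimated by a kernel-weighted regression of the parameter draws on the statistic discrepancy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data-generating process (illustrative): y ~ N(mu, 1), mu unknown.
y_obs = rng.normal(1.0, 1.0, size=100)
s_obs = y_obs.mean()  # observed summary statistic

# Step 1: draw parameters from a prior and simulate summary statistics.
n_sim = 5000
theta = rng.normal(0.0, 2.0, size=n_sim)  # prior draws for mu
s_sim = np.array([rng.normal(t, 1.0, size=100).mean() for t in theta])

# Step 2: kernel weights on the discrepancy between simulated and
# observed statistics (Gaussian kernel, bandwidth h is a tuning choice).
h = 0.2
d = s_sim - s_obs
w = np.exp(-0.5 * (d / h) ** 2)

# Step 3a: local constant (Nadaraya-Watson) posterior mean estimate.
post_mean_lc = np.sum(w * theta) / np.sum(w)

# Step 3b: local linear estimate -- weighted least squares of theta on d;
# the intercept is the regression function evaluated at d = 0.
X = np.column_stack([np.ones(n_sim), d])
sw = np.sqrt(w)
beta, *_ = np.linalg.lstsq(X * sw[:, None], theta * sw, rcond=None)
post_mean_ll = beta[0]

# Regression-adjusted draws give approximate posterior quantiles,
# usable for confidence interval construction.
theta_adj = theta - beta[1] * d
keep = w > 1e-3  # retain draws with non-negligible kernel weight
q_lo, q_hi = np.quantile(theta_adj[keep], [0.025, 0.975])
```

The local linear step differs from the local constant one only in also fitting a slope on the statistic discrepancy, which removes the leading smoothing bias at the boundary of the acceptance region; this is the kind of theoretical advantage the abstract refers to.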

Speaker Bio

Han Hong has been a Professor of Economics at Stanford University since 2007. He obtained his B.A. in Economics from Zhongshan University in 1993 and his PhD in Economics from Stanford University in 1998. Prior to joining Stanford, he was an Assistant Professor in the Department of Economics at Princeton University from 1998 to 2003, and was an Associate Professor and subsequently Professor in the Department of Economics at Duke University from 2003 to 2005 and from 2005 to 2006, respectively. His research interests include econometrics, statistics, and applied microeconomics.