Special Colloquium
Jing Wang, Ph.D.
Department of Statistics and Probability, Michigan State University
Spline-Backfitted Kernel Smoothing of Additive Regression Model
Abstract: A great deal of effort has been devoted to inference for additive models in the last decade. Among the many existing procedures, kernel-type estimators are too costly to implement for a large number of variables or for large sample sizes, while spline-type estimators provide no asymptotic distribution or any measure of uniform accuracy. We propose a synthetic estimator of the component functions in an additive regression model that uses one-step backfitting, with spline estimators in the first stage and kernel estimators in the second stage. We establish that, under very weak conditions, the pointwise distribution of the proposed estimator is asymptotically equivalent to that of an ordinary univariate Nadaraya-Watson estimator; hence the dimension is effectively reduced to one at any point. Under the stronger assumption of normal errors, this dimension reduction holds uniformly over an interval. Monte Carlo evidence supports the asymptotic results for dimensions ranging from low to very high and sample sizes ranging from moderate to large.
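The two-stage procedure described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the paper's exact construction: the simulated data, the truncated-power spline basis, the knot placement, the Gaussian kernel, and the bandwidth are all assumptions made for the example. Stage one fits all additive components jointly by spline least squares; stage two applies a univariate Nadaraya-Watson smoother to pseudo-responses from which the spline estimates of the other components have been subtracted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(-1, 1, size=(n, 2))           # two covariates (illustrative)
m1 = lambda t: np.sin(np.pi * t)              # hypothetical component functions
m2 = lambda t: t ** 2 - 1.0 / 3.0
y = m1(x[:, 0]) + m2(x[:, 1]) + 0.2 * rng.standard_normal(n)

def spline_basis(t, knots):
    # Truncated-power cubic spline basis: a simple choice for the sketch.
    cols = [t, t ** 2, t ** 3] + [np.maximum(t - k, 0.0) ** 3 for k in knots]
    return np.column_stack(cols)

# Stage 1: additive spline fit of all components jointly by least squares.
knots = np.linspace(-0.6, 0.6, 4)
B = np.column_stack([np.ones(n),
                     spline_basis(x[:, 0], knots),
                     spline_basis(x[:, 1], knots)])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
p = 3 + len(knots)                            # basis columns per component
m2_spline = spline_basis(x[:, 1], knots) @ coef[1 + p: 1 + 2 * p]

# Stage 2: univariate Nadaraya-Watson smoothing of the pseudo-responses
# y minus the stage-one spline estimates of the *other* component.
pseudo = y - coef[0] - m2_spline

def nadaraya_watson(t0, t, z, h):
    # Gaussian kernel weights; each row of w corresponds to one grid point.
    w = np.exp(-0.5 * ((t0[:, None] - t[None, :]) / h) ** 2)
    return (w @ z) / w.sum(axis=1)

grid = np.linspace(-0.9, 0.9, 50)
m1_hat = nadaraya_watson(grid, x[:, 0], pseudo, h=0.1)
```

Because the second stage is an ordinary one-dimensional kernel regression, its cost and asymptotic behavior do not depend on the number of covariates, which is the dimension-reduction point the abstract makes.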
Tuesday, February 7, 2006, at 4:00 PM in SEO 636