Adaptive Regression on the Real Line in Classes of Smooth Functions

Authors

  • L.M. Artiles, Eurandom, Eindhoven, The Netherlands
  • B.Y. Levit, Queen’s University, Kingston, Canada

DOI:

https://doi.org/10.17713/ajs.v32i1&2.452

Abstract

Adaptive pointwise estimation of an unknown regression function f(x), x ∈ R, corrupted by additive Gaussian noise is considered in the equidistant design setting. The function f is assumed to belong to a class A of functions whose Fourier transforms decrease rapidly in the weighted L2-sense. The rate of decrease is described by a weight function depending on a vector of parameters which, in the adaptive setting, is typically unknown. For each such class, with the parameter vector fixed, we describe minimax estimators up to a constant as the bin-width goes to zero. Conditions under which an adaptive study is suitable are presented, and a notion of adaptive asymptotic optimality is introduced, based on distinguishing, among all possible functional scales, between the so-called non-parametric (NP) and pseudo-parametric (PP) scales. We propose adaptive estimators which ‘tune up’ pointwise to the unknown smoothness of f. We prove them to be asymptotically adaptively minimax for large collections of NP functional scales, subject to being rate efficient for any of the PP functional scales.
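The paper's construction is analytic and specific to its Fourier-analytic smoothness classes. As a purely illustrative sketch of the general idea of pointwise adaptation to unknown smoothness, the following toy example applies a Lepski-type bandwidth selection rule (not the authors' estimator) to equidistant regression data with Gaussian noise; all function choices, constants, and thresholds here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Equidistant design on [0, 1] with additive Gaussian noise (illustrative;
# the paper works on the whole real line).
n = 400
x = np.linspace(0.0, 1.0, n)
f = np.sin(2 * np.pi * x)          # hypothetical smooth regression function
sigma = 0.3
y = f + sigma * rng.normal(size=n)

def kernel_estimate(y, h):
    """Gaussian-kernel smoother at every design point, bandwidth h."""
    t = np.linspace(0.0, 1.0, len(y))
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    w /= w.sum(axis=1, keepdims=True)   # normalize weights row-wise
    return w @ y

# Estimates at a single point x0 = 0.5 over a grid of bandwidths.
i0 = n // 2
bandwidths = np.geomspace(0.01, 0.3, 12)          # ordered small -> large
est = {h: kernel_estimate(y, h)[i0] for h in bandwidths}

def lepski_select(est, bandwidths, sigma, n, c=2.0):
    """Lepski-type rule: largest bandwidth whose estimate stays within a
    noise-calibrated band of every smaller-bandwidth estimate."""
    chosen = bandwidths[0]
    for j, h in enumerate(bandwidths):
        ok = all(
            abs(est[h] - est[g]) <= c * sigma * np.sqrt(np.log(n) / (n * g))
            for g in bandwidths[:j]
        )
        if not ok:
            break
        chosen = h
    return chosen

h_star = lepski_select(est, bandwidths, sigma, n)
fhat = est[h_star]                 # adaptive pointwise estimate at x0
```

The selected bandwidth trades bias against variance without knowing the smoothness of f in advance, which is the mechanism behind 'tuning up' pointwise to unknown smoothness.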

References

P. Antosik, J. Mikusiński, and R. Sikorski. Theory of Distributions. The Sequential Approach. Elsevier, Amsterdam, 1973.

L.D. Brown and M.G. Low. Asymptotic equivalence of nonparametric regression and white noise. Ann. Statist., 24:2384–2398, 1996.

W. Feller. An Introduction to Probability Theory and its Applications, volume I. Wiley, New York, 3rd edition, 1968.

G.K. Golubev and B.Y. Levit. Asymptotically efficient estimation for analytic distributions. Math. Meth. Statist., 5:357–368, 1996.

G.K. Golubev, B.Y. Levit, and A.B. Tsybakov. Asymptotically efficient estimation of analytic functions in Gaussian noise. Bernoulli, 2:167–181, 1996.

H.-H. Kuo. Gaussian Measures in Banach Spaces. Number 463 in Lect. Notes Math. Springer-Verlag, Berlin-Heidelberg-New York, 1975.

I.A. Ibragimov and R.I. Has’minskii. Statistical Estimation, Asymptotic Theory. Springer, New York, 1981.

I.A. Ibragimov and R.I. Has’minskii. Bounds for the risks of non-parametric regression estimates. Theor. Probab. Appl., 27:84–99, 1982.

I.A. Ibragimov and R.I. Has’minskii. Estimation of distribution density. Journ. Sov. Math., 25:40–57, 1983.

O.V. Lepski. On a problem of adaptive estimation in Gaussian noise. Theory Probab. Appl., 35:454–466, 1990.

O.V. Lepski. Asymptotically minimax adaptive estimation. I: Upper bounds. Optimally adaptive estimates. Theory Probab. Appl., 36:682–697, 1991.

O.V. Lepski. Asymptotically minimax adaptive estimation. II: Schemes without optimal adaptation. Adaptive estimators. Theory Probab. Appl., 37:433–448, 1992a.

O.V. Lepski. On problems of adaptive estimation in white Gaussian noise. Adv. Soviet Math., 12:87–106, 1992b.

O.V. Lepski and B.Y. Levit. Adaptive minimax estimation of infinitely differentiable functions. Math. Meth. Statist., 7:123–156, 1998.

O.V. Lepski and B.Y. Levit. Adaptive non-parametric estimation of smooth multivariate functions. Math. Meth. Statist., 8:344–370, 1999.

B.Y. Levit. On the asymptotic minimax estimates of the second order. Theory Prob. Appl., 25:552–568, 1980.

S.M. Nikol’skii. Approximation of Functions of Several Variables and Imbedding Theorems. Springer-Verlag, Berlin-Heidelberg-New York, 1975.

M. Nussbaum. Asymptotic equivalence of density estimation and Gaussian white noise. Ann. Statist., 24:2399–2430, 1996.

C.J. Stone. Optimal global rates of convergence for nonparametric regression. Ann. Statist., 10:1040–1053, 1982.

Published

2016-04-03

Issue

Section

Articles

How to Cite

Adaptive Regression on the Real Line in Classes of Smooth Functions. (2016). Austrian Journal of Statistics, 32(1&2), 99–129. https://doi.org/10.17713/ajs.v32i1&2.452