Uniform Convergence and Rate Adaptive Estimation of Convex Functions via Constrained Optimization

Date

2013-01

Citation of Original Publication

Wang, Xiao, and Jinglai Shen. “Uniform Convergence and Rate Adaptive Estimation of Convex Functions via Constrained Optimization.” SIAM Journal on Control and Optimization 51, no. 4 (January 2013): 2753–87. https://doi.org/10.1137/120887837.

Rights

© 2013 Society for Industrial and Applied Mathematics

Abstract

This paper studies the asymptotic analysis and adaptive design of convex estimators over the Hölder class under the sup-norm risk and the pointwise risk, using constrained optimization and asymptotic statistical techniques. Specifically, convex B-spline estimators are proposed to achieve optimal uniform convergence rates, and adaptive procedures are constructed. The presence of the convex shape constraint complicates asymptotic performance analysis, particularly uniform convergence analysis; this in turn requires a deep understanding of a family of size-varying constrained optimization problems on spline coefficients. To address these issues, we establish the uniform Lipschitz property of optimal spline coefficients in the ℓ∞-norm by exploiting the structure of the underlying constrained optimization problems. Using this property, polyhedral theory, and statistical techniques, we show that the convex B-spline estimator attains uniform consistency and optimal rates of convergence on the entire interval of interest over the Hölder class, under both the sup-norm risk and the pointwise risk. In addition, adaptive estimates are constructed under both risks when the Hölder exponent lies between one and two; these estimates achieve a maximal risk within a constant factor of the minimax risk over the Hölder class.
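To make the flavor of the construction concrete, the following is a minimal NumPy sketch, not the paper's estimator: it fits a convex linear (order-2) B-spline to noisy data by least squares, enforcing convexity by construction through a reparameterization in which the spline's slope increments are constrained to be nonnegative, and solving the resulting box-constrained problem with accelerated projected gradient descent. The knot grid, toy data, and all names here are illustrative assumptions; the paper treats general-order B-splines, where the convexity constraint is a family of linear inequalities on the coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from a convex regression function f(x) = (x - 0.5)^2 (illustrative).
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = (x - 0.5) ** 2 + 0.05 * rng.normal(size=n)

# Linear B-spline (hat function) basis on a uniform knot grid.
m = 12
knots = np.linspace(0.0, 1.0, m + 1)
h = knots[1] - knots[0]
B = np.maximum(0.0, 1.0 - np.abs(x[:, None] - knots[None, :]) / h)

# Reparameterize spline coefficients so convexity holds by construction:
#   c_j = c_0 + j*s + sum_{i<j} (j - i) d_i,   with d_i >= 0.
# For a linear spline, d_i >= 0 <=> nondecreasing slopes <=> convexity.
p = len(knots)
M = np.zeros((p, p))
M[:, 0] = 1.0                        # intercept c_0
M[:, 1] = np.arange(p)               # base slope s (in knot-spacing units)
for j in range(2, p):
    M[j:, j] = np.arange(1, p - j + 1)   # increment d_{j-1} acts on later knots

A = B @ M
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient

# Accelerated projected gradient (FISTA-style) on the box constraint d >= 0.
theta = np.zeros(p)
z = theta.copy()
t = 1.0
for _ in range(5000):
    grad = A.T @ (A @ z - y)
    theta_new = z - grad / L
    theta_new[2:] = np.maximum(theta_new[2:], 0.0)   # project increments
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    z = theta_new + ((t - 1.0) / t_new) * (theta_new - theta)
    theta, t = theta_new, t_new

c = M @ theta                        # fitted spline coefficients
assert np.all(np.diff(np.diff(c)) >= -1e-8)   # fitted spline is convex
```

Because the increments are clipped to be nonnegative at every iteration, the returned spline is exactly convex, not merely approximately so; the paper's analysis concerns how the optimal coefficients of such size-varying constrained problems behave as the number of knots grows with the sample size.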