BIAS REDUCTION IN SAMPLE-BASED OPTIMIZATION

Research output: Contribution to journal › Article › peer-review


Abstract

We consider stochastic optimization problems that use observed data to estimate essential characteristics of the random quantities involved. Sample average approximation (SAA), or empirical (plug-in) estimation, is a very popular way to use data in optimization. It is well known that SAA suffers from downward bias. We propose to use smooth estimators rather than empirical ones in the optimization problems. We establish consistency results for the optimal value and the set of optimal solutions of the new problem formulation. The performance of the proposed approach is compared to SAA theoretically and numerically. We analyze the bias of the new problems and identify sufficient conditions under which the optimal value of the true problem is estimated with less bias, while the error of the new estimator remains controlled. These conditions are satisfied for many popular statistical problems, such as regression models, classification problems, and optimization problems with the average (conditional) value at risk. We prove that smoothing the least-squares objective in a regression problem with a normal kernel leads to ridge regression. Our numerical experiments show that the new estimators also frequently exhibit smaller variance and smaller mean-square error than those of SAA.
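The ridge-regression claim in the abstract admits a short verification. The sketch below is a minimal derivation, assuming the smoothed problem perturbs each regressor x_i by an independent normal kernel noise hZ with Z ~ N(0, I) while leaving the responses y_i fixed; the paper's exact smoothing scheme may differ.

    % Smoothing one squared-error term with a standard normal kernel:
    \[
    \mathbb{E}_{Z}\bigl[(y_i - (x_i + hZ)^\top \beta)^2\bigr]
      = (y_i - x_i^\top \beta)^2 + h^2 \lVert \beta \rVert_2^2 ,
    \]
    % since E[Z] = 0 kills the cross term and E[Z Z^T] = I gives the quadratic.
    % Averaging over the sample, the smoothed least-squares problem becomes
    \[
    \min_{\beta}\ \frac{1}{n}\sum_{i=1}^{n} (y_i - x_i^\top \beta)^2
      + h^2 \lVert \beta \rVert_2^2 ,
    \]
    % i.e., ridge regression with penalty parameter lambda = h^2.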
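The downward bias of SAA, and how kernel smoothing can offset it, can also be seen on a toy one-dimensional problem. The Python sketch below uses min_x E[(x - xi)^2] with xi ~ N(0, 1), where both the SAA and the smoothed optimal values have closed forms; the bandwidth choice h^2 = 1/n is purely illustrative and is not the paper's rule.

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 20, 20_000
    h = n ** -0.5  # illustrative bandwidth with h**2 = 1/n; not the paper's rule

    # Toy problem: min_x E[(x - xi)^2], xi ~ N(0, 1); the true optimal value is 1.
    # The SAA optimal value equals the biased sample variance, so E[SAA] = (n-1)/n.
    # Smoothing each observation with a normal kernel of bandwidth h adds h**2
    # to the objective, so the smoothed optimal value has mean (n-1)/n + h**2.
    saa_vals, smooth_vals = [], []
    for _ in range(reps):
        xi = rng.standard_normal(n)
        saa = np.mean((xi - xi.mean()) ** 2)  # closed-form SAA optimal value
        saa_vals.append(saa)
        smooth_vals.append(saa + h ** 2)      # closed-form smoothed optimal value

    print("true optimal value :", 1.0)
    print(f"mean SAA value     : {np.mean(saa_vals):.4f}  (downward bias)")
    print(f"mean smoothed value: {np.mean(smooth_vals):.4f}")

With these choices the smoothed estimator is exactly unbiased for this toy problem, which mirrors the abstract's point that suitable smoothing can reduce the downward bias of SAA while keeping the estimation error controlled.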

Original language: English
Pages (from-to): 130-151
Number of pages: 22
Journal: SIAM Journal on Optimization
Volume: 32
Issue number: 1
DOIs
State: Published - 2022

Keywords

  • kernel estimators
  • regularized regression
  • sample average approximation
  • smoothing
  • stochastic programming
  • strong law of large numbers
