
Bayesian Designs for Sequential Learning Problems

by Jing Xie




Institution: Cornell University
Department:
Year: 2014
Keywords: Bayesian Statistics; Simulation; Optimization
Record ID: 2052792
Full text PDF: http://hdl.handle.net/1813/37171


Abstract

We consider the Bayesian formulation of a number of learning problems, focusing on sequential sampling procedures that allocate simulation effort efficiently. We derive Bayes-optimal policies for the problem of multiple comparisons with a known standard, showing that they can be computed efficiently when sampling is limited by probabilistic termination or by sampling costs. We provide a tractable method for computing upper bounds on the Bayes-optimal value of a ranking and selection problem, which enables evaluation of the optimality gaps of existing ranking and selection procedures. Applying techniques from optimal stopping, multi-armed bandits, and Lagrangian relaxation, we solve the corresponding dynamic programs efficiently. We develop a new value-of-information-based procedure for Bayesian optimization via simulation that incorporates both correlated prior beliefs and correlated sampling distributions. We also introduce a sequential Bayesian algorithm for optimizing expensive functions under low-dimensional input uncertainty. These implementations take advantage of machine learning tools that make it possible to explore combinatorially large solution spaces or to estimate expectations of simulation outputs with random inputs. We present theoretical results characterizing the proposed procedures, compare them numerically against previously developed and standard benchmark procedures, and apply them to problems in emergency services, manufacturing, and health care.
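
To make the value-of-information idea with correlated prior beliefs concrete, the sketch below estimates the one-step value of sampling a single alternative under a correlated multivariate normal belief, using Monte Carlo. This is a generic illustration rather than the dissertation's procedure; the function name, the independent-Gaussian-noise assumption, and the parameter choices are ours.

    import numpy as np

    def value_of_information(mu, Sigma, noise_var, i, n_samples=10000, rng=None):
        """One-step value of sampling alternative i under a correlated
        multivariate normal belief N(mu, Sigma), with independent Gaussian
        observation noise of variance noise_var (hypothetical setup,
        for illustration only)."""
        rng = np.random.default_rng() if rng is None else rng
        # Change in the posterior mean vector per standard unit of the
        # predictive observation of alternative i.
        sigma_tilde = Sigma[:, i] / np.sqrt(Sigma[i, i] + noise_var)
        # Monte Carlo estimate of E[max_j posterior_mean_j] after one sample of i.
        z = rng.standard_normal(n_samples)
        updated_max = np.max(mu[:, None] + np.outer(sigma_tilde, z), axis=0)
        # Value of information = expected improvement in the best posterior mean.
        return updated_max.mean() - mu.max()

    # Example: choose the alternative whose single sample is expected to
    # improve the estimated best the most.
    mu = np.array([0.0, 0.2, -0.1])
    Sigma = np.array([[1.0, 0.5, 0.2],
                      [0.5, 1.0, 0.3],
                      [0.2, 0.3, 1.0]])
    best_to_sample = max(range(len(mu)),
                         key=lambda i: value_of_information(mu, Sigma, 0.5, i))

In a sequential procedure one would update the belief after each observation and repeat the selection; the procedures summarized above additionally account for correlated sampling distributions and for limits on sampling imposed by termination or sampling costs.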