Abstract
Bounding the expectation of a convex function of a multivariate random variable is of great importance for solving stochastic linear programming (SLP) problems with recourse. The classic bounds are those of Jensen and Edmundson-Madansky: Jensen's inequality is available in the independent as well as in the dependent case, in contrast with the Edmundson-Madansky (E-M) inequality (Madansky, A. 1959. Bounds on the expectation of a convex function of a multivariate random variable. Ann. Math. Statist. 30 743–746.), which is valid only in the independent case. This paper presents a straightforward extension of the E-M inequality to the dependent case, yielding a discrete distribution that is extremal with respect to a partial ordering of a set of distribution functions. Further, we prove that the bounds for the expectation behave monotonically when the obtained inequality is applied to successively smaller subintervals. As this requires the evaluation of conditional probabilities and expectations, we replace the given distribution by a discrete one obtained by sampling, and determine the sample size that keeps the statistical error negligible. Finally, we apply the results to SLP recourse problems and report computational results illustrating the effort required to solve them.
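As a hedged illustration (not taken from the paper itself), the two classic bounds mentioned above can be sketched in the univariate, independent case: for a convex function f of a random variable X supported on [a, b] with mean mu, Jensen's inequality gives a lower bound on E[f(X)], and the Edmundson-Madansky inequality gives an upper bound via the extremal two-point distribution on the endpoints. The function names below are illustrative choices, not notation from the paper.

```python
def jensen_lower(f, mu):
    """Jensen lower bound: f(E[X]) <= E[f(X)] for convex f."""
    return f(mu)

def em_upper(f, a, b, mu):
    """Edmundson-Madansky upper bound for convex f on [a, b]:
    the expectation under the extremal two-point distribution that
    places mass (b - mu)/(b - a) at a and (mu - a)/(b - a) at b."""
    return (b - mu) / (b - a) * f(a) + (mu - a) / (b - a) * f(b)

# Example: f(x) = x^2 with X uniform on [0, 1], so mu = 1/2 and
# E[f(X)] = 1/3 exactly; the bounds bracket this value.
f = lambda x: x * x
lower = jensen_lower(f, 0.5)          # f(0.5) = 0.25
upper = em_upper(f, 0.0, 1.0, 0.5)    # 0.5*f(0) + 0.5*f(1) = 0.5
true_value = 1.0 / 3.0
print(lower, true_value, upper)
```

Refining the partition of [a, b] and applying the same bounds on each subinterval tightens both bounds, which is the monotone behavior the abstract refers to (there extended to the multivariate, dependent case).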