We describe an approximation algorithm A which, given positive parameters c and d, runs independent experiments and produces an estimate that is within a factor of 1+c of the mean M of a distribution Z with probability at least 1-d. The number of experiments run by A is proportional to quantities that depend on Z and are unknown a priori. We prove that the expected number of experiments run by A is optimal to within a constant factor for every Z.
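To make the adaptive flavor of such an algorithm concrete, here is a minimal sketch of a sequential stopping-rule estimator for a distribution Z taking values in [0, 1]: it keeps drawing samples until the running sum first crosses a threshold determined by c and d, so the number of experiments automatically scales with 1/M. The threshold constant used below is one known choice from the stopping-rule literature; the function names and the Bernoulli example are illustrative, not from the abstract.

```python
import math
import random

def stopping_rule_estimate(sample, c, d):
    """Sketch of a sequential (stopping-rule) Monte Carlo estimator.

    `sample` draws i.i.d. values in [0, 1] with unknown mean M > 0.
    Returns (estimate, num_experiments). The estimate is within a
    factor of 1 + c of M with probability at least 1 - d, and the
    number of experiments adapts to M (roughly proportional to 1/M).
    """
    # Threshold constant; 4(e - 2) ln(2/d) / c^2 is one standard choice.
    upsilon = 4 * (math.e - 2) * math.log(2 / d) / (c * c)
    threshold = 1 + (1 + c) * upsilon
    total, n = 0.0, 0
    while total < threshold:          # run experiments until the running
        total += sample()             # sum first reaches the threshold
        n += 1
    return threshold / n, n

# Illustrative use: estimate the mean of a Bernoulli(0.3) experiment.
rng = random.Random(0)
est, n = stopping_rule_estimate(lambda: float(rng.random() < 0.3),
                                c=0.1, d=0.05)
```

Note that the sample complexity is data-dependent: for a distribution with small mean M, the loop runs roughly threshold/M times, which is the adaptivity the abstract's optimality claim is about.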
There are a number of applications in which Monte Carlo simulation of the type described above has been used, and in which the standard analysis of the number of experiments to run is far from tight for particular problem instances. The algorithm A can be used directly in these applications to produce a provably good estimate while running the minimal number of experiments needed for the particular problem instance.
JOINT WORK WITH: Paul Dagum, Richard Karp, and Sheldon Ross