Uncertainty estimation using the moments method facilitated by automatic differentiation in Matlab
Abstract
Computational models have long been used to predict the performance of a baseline design given its design parameters. Because of inconsistencies in manufacturing, the manufactured product always deviates from the baseline design. There is currently much interest both in evaluating the effects of variability in design parameters on a design's performance (uncertainty estimation) and in robust optimization of the baseline design, so that near-optimal performance is obtained despite variability in the design parameters. Traditionally, uncertainty analysis is performed by expensive Monte Carlo methods. This work considers the alternative moments method for uncertainty propagation and its implementation in Matlab.

In computational design it is assumed that a computational model gives a sufficiently accurate approximation to a design's performance. The model can therefore be used to estimate statistical moments (expectation, variance, etc.) of that performance arising from known statistical variation of the model's parameters, for example by the Monte Carlo approach. In the moments method we further assume the model is sufficiently differentiable that a Taylor series approximation to the model may be constructed; the moments of the Taylor series may then be taken analytically to yield approximations to the model's moments.

In this thesis we generalise techniques considered within the engineering community, and we design and document software that generates arbitrary-order Taylor series approximations to arbitrary-order statistical moments of computational models implemented in Matlab; the Taylor series coefficients are calculated using automatic differentiation. This approach is found to be more efficient than a standard Monte Carlo method for the small-scale test problems we consider. Christianson and Cox (2005) previously indicated that the moments method will be non-convergent in the presence of complex poles of the computational model and suggested a partitioning method to overcome this problem. We implement a version of the partitioning method and demonstrate that it does restore convergence of the moments method. Additionally, we consider what we term the branch detection problem, to ascertain whether our Taylor series approximation might be valid only piecewise.
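For orientation, the lowest-order form of the moments method follows directly from taking expectations of the Taylor expansion of a model f about the input mean mu: E[f(X)] ≈ f(mu) + (1/2) f''(mu) sigma^2 and Var[f(X)] ≈ f'(mu)^2 sigma^2 for a scalar input X with mean mu and standard deviation sigma. The Matlab lines below are a minimal sketch of this idea, not the thesis software: the model f(x) = exp(x) and the normally distributed input are assumed toy choices, and the derivatives are hand-coded here rather than obtained by automatic differentiation as in the thesis. The result is compared against a plain Monte Carlo estimate.

% Minimal sketch: second-order moments method for a scalar model,
% checked against Monte Carlo (toy example, hand-coded derivatives).
mu = 0.5;  sigma = 0.1;                  % mean and std. dev. of the input X
f   = @(x) exp(x);                       % toy computational model
df  = @(x) exp(x);                       % first derivative of f
d2f = @(x) exp(x);                       % second derivative of f

% Moments method: take moments of the Taylor expansion of f about mu.
meanMM = f(mu) + 0.5*d2f(mu)*sigma^2;    % E[f(X)] to second order
varMM  = df(mu)^2*sigma^2;               % Var[f(X)] to first order

% Monte Carlo reference estimate.
N  = 1e6;
fs = f(mu + sigma*randn(N,1));
meanMC = mean(fs);
varMC  = var(fs);

fprintf('mean: moments %.5f, Monte Carlo %.5f\n', meanMM, meanMC);
fprintf('var : moments %.5f, Monte Carlo %.5f\n', varMM,  varMC);

For this toy model the two estimates agree closely, and the moments-method figures require only a handful of function and derivative evaluations rather than a large sample; the thesis extends this construction to arbitrary expansion order and arbitrary-order moments, with the Taylor coefficients supplied by automatic differentiation.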