It took us a couple of days longer than expected, but we are proud to release SUMO Toolbox 2017a. The new release can be downloaded from the download page.

## Sensitivity analysis

Sensitivity analysis is a powerful tool for understanding the importance of the inputs of your simulator and the interactions among those inputs. The resulting indices measure how much each input influences the output, both directly and through interactions with other inputs. This information can be used to gain a detailed understanding of the inner workings of the simulator and to potentially reduce its dimensionality.

The most common approach to sensitivity analysis is Monte Carlo estimation. In this release, we added Monte Carlo variance-based sensitivity indices for all models and Monte Carlo derivative-based sensitivity indices for the Kriging models. For selected models (Kriging, GPML and LS-SVM), analytic expressions of these sensitivity indices have also been added to the toolbox. For these models, this yields a significant gain in computational efficiency and accuracy over the standard Monte Carlo method.
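To illustrate the Monte Carlo variance-based approach outside the toolbox, the sketch below uses a standard Saltelli/Jansen-style estimator (not the SUMO implementation) to recover the well-known Sobol indices of the Ishigami function:

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Ishigami test function, a standard sensitivity-analysis benchmark."""
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

def sobol_indices(f, d, n, rng):
    """Monte Carlo estimates of first-order and total Sobol indices
    for f on the cube [-pi, pi]^d (Saltelli/Jansen estimators)."""
    A = rng.uniform(-np.pi, np.pi, (n, d))
    B = rng.uniform(-np.pi, np.pi, (n, d))
    yA, yB = f(A), f(B)
    var = np.var(np.concatenate([yA, yB]))
    S, ST = np.empty(d), np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # A with column i taken from B
        yABi = f(ABi)
        S[i] = np.mean(yB * (yABi - yA)) / var         # first-order index
        ST[i] = 0.5 * np.mean((yA - yABi) ** 2) / var  # total index
    return S, ST

rng = np.random.default_rng(0)
S, ST = sobol_indices(ishigami, d=3, n=200_000, rng=rng)
# Analytic values: S = [0.3139, 0.4424, 0.0], ST = [0.5574, 0.4424, 0.2437]
```

Note that the third input has a zero first-order index but a non-zero total index: it influences the output only through its interaction with the first input, which is exactly the kind of structure these indices expose.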

The figure below shows the computation of the sensitivity indices for the Ishigami function (a typical sensitivity analysis benchmark) using the Kriging model and the FLOLA-Voronoi sequential sampling algorithm. To activate this feature, add the corresponding profiler to the XML configuration. Afterwards, the sensitivity indices can also be computed from the generated model.
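As a rough illustration, a profiler entry in the XML configuration could look like the following. This is a hypothetical sketch: the exact element name and attribute syntax depend on your SUMO configuration files, and only the profiler name itself is taken from the list below.

```xml
<!-- Hypothetical sketch: check the SUMO example configuration files
     for the exact profiler element syntax in your installation. -->
<Profiler type="sensitivitySobolProfiler"/>
```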

The following sensitivity analysis profilers are available:

- Variance based
  - sensitivitySobolProfiler
  - sensitivitySobolMCProfiler
  - sensitivityTotalSobolMCProfiler
  - sensitivityTotalSobolProfiler
- Derivative based
  - sensitivityDerivativeProfiler

The sensitivity indices can also be computed from the model itself with the following code, where *order* is the desired order of the sensitivity indices.

```
>> [sensitivityIndices, totalSensitivityIndices] = model.getSensitivityIndices(order)
```

More detailed aspects of the algorithm can be specified using the following code, where *algorithm* can be either "variance" or "derivative" and *method* can be either "analytic" or "montecarlo". The default is variance-based sensitivity indices computed with the most accurate method available for that model. When using derivative-based sensitivity indices, only *totalSensitivityIndices* is returned.

```
>> [sensitivityIndices, totalSensitivityIndices] = model.getSensitivityIndices(order, algorithm, method)
```

To help you judge when the results of the sensitivity analysis have become stable and accurate enough, we also added two sensitivity cross-validation measures. These can be seen in the figure below.

To activate sensitivity cross-validation, use the "SensitivityCrossValidation" measure. Note that an *aggregation* option has to be supplied, which can be either "meanvar" or "maxvar" to activate the mean-variance or the maximal-variance criterion, respectively. The *algorithm* and *method* can again be specified with the same settings as described above for the sensitivity indices. To use total sensitivity indices, supply the boolean option *total*. To use normalized derivative-based sensitivity indices, supply the boolean option *norm*.
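In the XML configuration, this could be sketched as follows. The element and option syntax here are assumptions based on the general SUMO configuration style; only the measure name and the *aggregation* option come from the description above.

```xml
<!-- Hypothetical sketch: adapt to the measure syntax used in your
     SUMO configuration files. -->
<Measure type="SensitivityCrossValidation" target="0.01">
  <Option key="aggregation" value="meanvar"/>
</Measure>
```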

More detailed information on sensitivity analysis and the sensitivity cross-validation algorithms can be found in the following publication:

*[1] Van Steenkiste T., J. van der Herten, I. Couckuyt and T. Dhaene. 2016. “Sensitivity Analysis of Expensive Black-Box Systems using Metamodeling”. In Proceedings of the 2016 IEEE Winter Simulation Conference, edited by T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick.*

## Changelog

- Added support for sensitivity analysis
  - Monte Carlo Sobol indices and derivative-based measures
  - Analytical derivations implemented for Kriging, GPML and LS-SVM
- Added the SensitivityCrossValidation measure
  - Calculates the error on the sensitivity indices
