The Comparison experiment allows you to compare the results of several Simulation experiments.
You must have at least two scenarios to run the Comparison experiment.
Create a Comparison experiment by selecting scenarios to compare and specifying statistics to collect. When launched, the experiment passes through a series of iterations, one for each included scenario. When the experiment is finished, you can compare the values of the collected statistics to examine the effect of the different initial conditions on the outcome.
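Conceptually, the experiment runs one Simulation iteration per included scenario and records the configured statistics for each. The following minimal Python sketch illustrates that loop; anyLogistix exposes no such scripting API, so `run_simulation`, the scenario names, and the single-number result are purely hypothetical stand-ins.

```python
import random

def run_simulation(scenario_name: str) -> float:
    # Hypothetical stand-in for one Simulation run: a real run would
    # produce the full set of configured statistics, not one number.
    return random.gauss(100.0, 10.0)

def comparison_experiment(scenario_names: list[str]) -> dict[str, float]:
    """One iteration per included scenario, collecting a statistic each."""
    return {name: run_simulation(name) for name in scenario_names}

# Compare the collected values across scenarios once all iterations finish.
print(comparison_experiment(["Baseline", "Alternative network"]))
```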
To run the Comparison experiment
- Click the Simulation scenario type tab in the anyLogistix toolbar, and select the scenario to work with.
- Navigate to the experiments section and select Comparison experiment. The experiment settings will open over the map area.
- Set the experiment start and end dates in the Experiment duration section.
- Click the Use replications toggle button if multiple replications are required, and specify their number in the Replications per iteration field. If replications are enabled, each iteration will include several repeated runs of the Simulation experiment; otherwise, each iteration will include a single run (see the sketch after this procedure).
- Choose the required scenarios from the list of available scenarios by selecting the checkboxes next to them.
- If needed, specify the measurement units to be used in the collected statistics.
- Configure statistics that will be collected during the experiment execution:
- Click the Statistics Configuration button in the experiment settings. The dashboard will open.
- Select the statistics to be collected during the experiment by clicking the toggle buttons next to them.
- Close the dialog box to save the changes.
- Click Run to start the experiment execution.
- Observe the collected data on the Comparison results page.
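As a minimal sketch of the replication behavior described above (again with hypothetical names, since the source describes a GUI workflow, not an API): with replications enabled, an iteration repeats the stochastic run several times and aggregates the outcomes; with replications disabled, it is a single run.

```python
import random
from statistics import mean, stdev

def run_simulation() -> float:
    # Stand-in for one stochastic Simulation run; repeated runs differ.
    return random.gauss(100.0, 10.0)

def run_iteration(replications: int = 1) -> tuple[float, float]:
    """Aggregate one iteration: mean and spread over its replications."""
    outcomes = [run_simulation() for _ in range(replications)]
    spread = stdev(outcomes) if replications > 1 else 0.0
    return mean(outcomes), spread

print(run_iteration(replications=5))  # Use replications enabled
print(run_iteration())                # single run per iteration
```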