
Running the Comparison Experiment

The Comparison experiment allows you to compare the results of several Simulation experiments.

You must have at least two scenarios to run the Comparison experiment.

Create a Comparison experiment by selecting scenarios to compare and specifying statistics to collect. When launched, the experiment passes through a series of iterations, one for each included scenario. When the experiment is finished, you can compare the values of the collected statistics to examine the effect of the different initial conditions on the outcome.
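Conceptually, the experiment's iteration loop can be pictured as follows. This is only an illustrative Python sketch: anyLogistix is configured through its UI, and every name below (Scenario, run_simulation, the statistics and their values) is a hypothetical stand-in, not part of any anyLogistix API.

    # Hypothetical sketch of a Comparison experiment: one iteration per
    # included scenario, each collecting the same set of statistics so the
    # results can be compared side by side. Not anyLogistix code.
    import random
    from dataclasses import dataclass

    @dataclass
    class Scenario:
        name: str
        demand_multiplier: float  # an example of a differing initial condition

    def run_simulation(scenario: Scenario) -> dict:
        # One run of the Simulation experiment, returning collected statistics
        # (invented placeholder values for illustration).
        revenue = 1_000_000 * scenario.demand_multiplier * random.uniform(0.95, 1.05)
        service_level = random.uniform(0.90, 0.99)
        return {"Revenue": revenue, "Service level": service_level}

    def comparison_experiment(scenarios):
        # One iteration per scenario; results are kept together for comparison.
        return {s.name: run_simulation(s) for s in scenarios}

    results = comparison_experiment([Scenario("Baseline", 1.0),
                                     Scenario("High demand", 1.2)])
    for name, stats in results.items():
        print(name, {k: round(v, 2) for k, v in stats.items()})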

To run the Comparison experiment

  1. Click the Simulation scenario type tab in the anyLogistix toolbar, and select the scenario to work with.
  2. Navigate to the experiments section and select Comparison experiment.
    The experiment settings will open over the map area.
  3. Set the experiment start and end dates in the Experiment duration section.
  4. Click the Use replications toggle button if multiple replications are required and specify their number in the Replications per iteration field. If replications are enabled, each iteration will include several repeating runs of the Simulation experiment; otherwise, each iteration will include a single run (see the sketch after this list).
  5. Choose the required scenarios from the list of available scenarios by selecting the checkboxes next to them.
  6. If needed, specify the measurement units that will be used in the collected statistics.
  7. Configure the statistics to be collected during the experiment:
    • Click the Statistics Configuration button in the experiment settings. The dashboard will open.
    • Select the statistics to be collected during the experiment by clicking the toggle buttons next to them.
    • Close the dialog box to save the changes.
  8. Click Run to start the experiment execution.
  9. Observe the collected data on the Comparison results page.
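The replications option from step 4 can be pictured in the same spirit: each iteration repeats the stochastic run several times, and the statistic is aggregated across those runs. Again a hypothetical Python sketch, not anyLogistix code; run_once is an invented stand-in for a single Simulation run.

    # Hypothetical sketch of "Replications per iteration": repeating a
    # stochastic run and aggregating the resulting statistic.
    import random
    import statistics

    def run_once() -> float:
        # One stochastic Simulation run returning a single statistic.
        return 1_000_000 * random.uniform(0.95, 1.05)

    replications_per_iteration = 10
    values = [run_once() for _ in range(replications_per_iteration)]
    print(f"mean = {statistics.mean(values):,.0f}, "
          f"stdev = {statistics.stdev(values):,.0f}")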