Running the Comparison Experiment

The Comparison experiment allows you to compare the results of several Simulation experiments.

Note: You must have at least two scenarios to run the Comparison experiment.

Create a Comparison experiment by selecting scenarios to compare and specifying statistics to collect. When launched, the experiment passes through a series of iterations, one for each included scenario. When the experiment is finished, you can compare the values of the collected statistics to examine the effect of the different initial conditions on the outcome.

To run the Comparison experiment
  1. Click the SIM tab in the top left area of the ALX workspace, and select the scenario to work with.
  2. Navigate to the experiments section and select Comparison experiment.
    The experiment parameters will replace the GIS map.
  3. Set the experiment start and end dates in the Experiment duration section.
  4. Click the Use replications toggle button if multiple replications are required, and specify their number in the Replications per iteration field. If replications are enabled, each iteration will comprise several repeated runs of the Simulation experiment. Otherwise, each iteration will comprise a single run of the Simulation experiment.
  5. Choose the required scenarios from the list of available scenarios by selecting the check boxes next to them.
  6. Configure the statistics to collect during the experiment.
  7. Click Run to start the experiment. Once the experiment is over, the results will be added to the Comparison experiment branch of the experiments tree.

    Note: If one or more experiments of any type are currently running, a warning message will pop up, notifying you of the number of running experiments.

  8. Observe the collected data in the Comparison results tab.
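ALX runs the experiment for you, but the iteration logic described above (one iteration per selected scenario, optionally repeated over several replications whose statistics are averaged) can be illustrated with a short conceptual sketch. The function and statistic names below are hypothetical stand-ins, not the ALX API:

```python
import random

def run_simulation(scenario, replication):
    """Stand-in for a single Simulation experiment run (hypothetical)."""
    random.seed(hash((scenario, replication)))
    # Pretend the run produces one collected statistic.
    return {"service_level": round(random.uniform(0.8, 1.0), 3)}

def comparison_experiment(scenarios, statistics, replications=1):
    """One iteration per scenario; with replications enabled, each
    iteration repeats the Simulation run and averages the statistics."""
    results = {}
    for scenario in scenarios:
        runs = [run_simulation(scenario, r) for r in range(replications)]
        results[scenario] = {
            stat: sum(run[stat] for run in runs) / len(runs)
            for stat in statistics
        }
    return results

# At least two scenarios are required, as noted above.
results = comparison_experiment(["Baseline", "New DC"],
                                ["service_level"], replications=5)
```

Comparing `results["Baseline"]` against `results["New DC"]` mirrors what the Comparison results tab shows: the same statistics collected under different initial conditions, side by side.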

Related topics

Results of the Comparison experiment

Adding a new dashboard element