4. Verification and Validation (V&V)
Verification and validation are two distinct but essential steps in establishing the credibility of a simulation model. The analyst’s key challenge is validation: a model is useful only if it is an accurate representation of the actual system.
- Verification: The process of ensuring the model’s implementation and its data align with the developer’s conceptual description and specifications.
- Validation: The process of comparing the conceptual model’s representation to the real system.
V&V Techniques
Verification Techniques:
- Writing and debugging the program in modular sub-programs, following good programming practice.
- Employing a “Structured Walk-through” policy where multiple people review the program.
- Tracing intermediate results and comparing them with observed outcomes.
- Checking the model’s output against a variety of input combinations.
- Comparing the final simulation result with known analytic results.
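As a sketch of the last technique, the following hypothetical example (the function name and parameters are illustrative, not from the source) simulates an M/M/1 queue and compares the observed mean time in system against the known analytic result W = 1/(μ − λ):

```python
import random

def simulate_mm1(lam, mu, num_customers, seed=1):
    """FIFO M/M/1 queue via the Lindley recursion; returns mean time in system."""
    rng = random.Random(seed)
    arrival = 0.0        # arrival time of the current customer
    depart_prev = 0.0    # departure time of the previous customer
    total = 0.0
    for _ in range(num_customers):
        arrival += rng.expovariate(lam)      # exponential interarrival times
        service = rng.expovariate(mu)        # exponential service times
        start = max(arrival, depart_prev)    # wait if the server is still busy
        depart_prev = start + service
        total += depart_prev - arrival       # time in system for this customer
    return total / num_customers

lam, mu = 0.5, 1.0
w_sim = simulate_mm1(lam, mu, 100_000)
w_analytic = 1.0 / (mu - lam)   # known M/M/1 result: W = 1/(mu - lambda) = 2.0
print(f"simulated W = {w_sim:.3f}, analytic W = {w_analytic:.3f}")
```

A close match increases confidence in the implementation; for a full model with no analytic solution, the same check can still be run on simplified parameter settings where a known result exists.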
Validation Techniques:
- Design for High Validity:
  - Discuss the model design with system experts.
  - Maintain interaction with the client throughout the development process.
  - Have system experts review the output.
- Test Model Assumptions:
  - Quantitatively test the model’s assumptions against data collected from the system.
  - Perform sensitivity analysis to observe how model output responds to changes in input data.
- Determine How Representative the Output Is:
  - Assess how closely the simulation output matches the real system’s output.
  - Use a Turing test: present model-generated and real data in the same system format and ask experts to distinguish them.
  - Employ statistical methods (e.g., the chi-square test or the Kolmogorov-Smirnov test) to compare model output with real system data.
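As a minimal sketch of the last point, a two-sample Kolmogorov-Smirnov statistic can be computed with the standard library alone. The function name and the data below are illustrative assumptions, not from the source:

```python
import math
import random

def ks_two_sample(x, y):
    """Two-sample KS statistic: maximum distance between the empirical CDFs."""
    xs, ys = sorted(x), sorted(y)
    nx, ny = len(xs), len(ys)
    i = j = 0
    d = 0.0
    while i < nx and j < ny:
        v = min(xs[i], ys[j])
        while i < nx and xs[i] == v:   # advance past ties in x
            i += 1
        while j < ny and ys[j] == v:   # advance past ties in y
            j += 1
        d = max(d, abs(i / nx - j / ny))
    return d

# Hypothetical data: "real" service times and model output, both exponential.
rng = random.Random(7)
real = [rng.expovariate(1.0) for _ in range(1000)]
model = [rng.expovariate(1.0) for _ in range(1000)]

d = ks_two_sample(real, model)
# Approximate 5% critical value for large samples: 1.36 * sqrt((nx+ny)/(nx*ny)).
crit = 1.36 * math.sqrt(2 / 1000)
print(f"D = {d:.3f}, 5% critical value = {crit:.3f}")
```

If D exceeds the critical value, the hypothesis that the model output and the real data come from the same distribution is rejected at that significance level.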
Data Comparison Approaches
- Validating an Existing System: Use real-world inputs in the model and compare its output with the real system’s output. This may involve statistical hypothesis testing for metrics like waiting time or idle time.
- Validating a New/Hypothetical System: When historical data are unavailable, validation relies on:
  - Subsystem Validity: Testing known subsystems within the larger model separately.
  - Internal Validity: Ensuring the model does not show so much internal variance across replications that the impact of input changes is obscured.
  - Sensitivity Analysis: Identifying the parameters the output is most sensitive to, so they receive the greatest attention.
  - Face Validity: Rejecting a model whose internal logic contradicts the real system’s, even if its output behavior mimics the real system.
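The hypothesis-testing step for an existing system can be sketched as follows. Welch’s t statistic is implemented directly here; the waiting-time data and function name are illustrative assumptions, not from the source:

```python
import math
import random

def welch_t(x, y):
    """Welch's t statistic for the difference between two sample means."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Hypothetical waiting-time samples; in practice the model would be driven
# by the real system's recorded inputs before comparing outputs.
rng = random.Random(3)
real_wait = [rng.expovariate(0.5) for _ in range(500)]
model_wait = [rng.expovariate(0.5) for _ in range(500)]

t = welch_t(real_wait, model_wait)
# If |t| stays below roughly 1.96, we fail to reject H0 at the 5% level:
# the mean waiting times of model and system are statistically consistent.
print(f"t = {t:.3f}")
```

The same comparison applies to other metrics mentioned above, such as server idle time, by substituting the corresponding samples.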