AMSS Lecture 10: Evaluating & Testing UML Models

Traian-Florin Șerbănuță

2025

Agenda

Goal

Learn how to evaluate, validate, verify, and test UML models using structured techniques and common tools.

General evaluation principles

Tools

General Evaluation Principles

Why Evaluate UML Models?

Evaluation dimensions

Evaluation dimensions: Consistency

Evaluation dimensions: Completeness

Evaluation dimensions: Correctness

Evaluation dimensions: Usability

Evaluation dimensions: Maintainability

UML Model Quality Criteria

Semantic quality

Syntactic quality

Pragmatic quality

Static Evaluation Techniques

Checklist-based evaluation

Useful for manual reviews.

Typical questions

Benefits

Limitations

Traceability checks

Goal

Ensure that every element of a UML model is properly linked to related artifacts across the software lifecycle.

What Do You Check in UML Traceability?

Use Cases – Requirements

Use Cases – Interaction Diagrams (Sequence/Communication)

Interaction Diagrams – Class Diagrams

Class Diagrams – State Machine Diagrams

Design Models – Test Cases
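Each of these check pairs amounts to searching a traceability matrix for gaps: requirements no use case covers, and use cases that trace back to nothing. A minimal sketch (the artifact IDs such as `REQ-1` and `UC-Login` are purely illustrative, not from any real model):

```python
def find_traceability_gaps(requirements, use_cases):
    """Return (requirements covered by no use case,
               use cases tracing to no requirement)."""
    covered = set().union(*use_cases.values()) if use_cases else set()
    orphan_reqs = requirements - covered
    untraced_ucs = {uc for uc, reqs in use_cases.items() if not reqs}
    return orphan_reqs, untraced_ucs

# Hypothetical traceability data: use case -> set of covered requirements.
requirements = {"REQ-1", "REQ-2"}
use_cases = {"UC-Login": {"REQ-1"}, "UC-Logout": set()}

orphans, untraced = find_traceability_gaps(requirements, use_cases)
print(orphans)   # {'REQ-2'}      -- requirement with no use case
print(untraced)  # {'UC-Logout'}  -- use case with no requirement
```

The same pattern applies to the other pairs (use cases to interaction diagrams, interactions to classes, and so on); only the two artifact sets change.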

Cross-diagram consistency checks

Goal

Ensure that multiple UML diagrams describing the same system do not contradict one another.

Types of Consistency

Syntactic Consistency

Semantic Consistency

Behavioral Consistency

Naming Consistency
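A simple instance of syntactic and naming consistency is checking that every message sent in a sequence diagram names an operation declared on the receiver's class. A sketch, with illustrative class and message names (not from any actual model):

```python
def undefined_messages(class_operations, messages):
    """Messages whose operation is not declared on the receiving class."""
    return [(cls, op) for cls, op in messages
            if op not in class_operations.get(cls, set())]

# Class diagram fragment: class name -> declared operations.
class_operations = {
    "Order": {"addItem", "total"},
    "Cart": {"checkout"},
}
# Sequence diagram fragment: (receiver class, message name) pairs.
messages = [("Order", "addItem"), ("Cart", "checkOut")]

print(undefined_messages(class_operations, messages))
# [('Cart', 'checkOut')]  -- naming inconsistency: case mismatch with 'checkout'
```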

Interactive Exercise

Identify three possible inconsistencies in the following class diagram:

Possible solution

Tools for UML OCL Consistency Checking

Expressing constraints using OCL
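An OCL invariant such as `context Account inv NonNegativeBalance: self.balance >= 0` can be checked by evaluating the condition over every instance of the context class. A minimal sketch, assuming a hypothetical `Account` class (the class and invariant are illustrative, not part of the lecture's running example):

```python
from dataclasses import dataclass

@dataclass
class Account:            # hypothetical context class from the model
    owner: str
    balance: float

# OCL: context Account inv NonNegativeBalance: self.balance >= 0
invariants = {"NonNegativeBalance": lambda self: self.balance >= 0}

def check_invariants(objects, invariants):
    """Return (object, invariant name) pairs where the invariant fails."""
    return [(obj, name) for obj in objects
            for name, inv in invariants.items() if not inv(obj)]

accounts = [Account("ana", 100.0), Account("bob", -5.0)]
for obj, name in check_invariants(accounts, invariants):
    print(f"{name} violated by {obj}")
```

Dedicated OCL tools evaluate the constraints directly against model instances; the value of the sketch is only to show what "checking an invariant" means operationally.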

OCL consistency checking tools

Static analysis

Model validation inside modeling tools

Facilities

Simulation and Execution Tools

Sequence diagram execution

Tools

State machine simulation

Tools

Activity diagram execution

Tools

Testing Behavioral Models

Model-Based Testing (MBT)

Use the UML model as the basis for generating tests.
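For a state machine model, one common MBT strategy is transition coverage: derive, for each transition, an event sequence that reaches its source state and then fires it. A sketch over a hypothetical door state machine (the states and events are assumptions for illustration):

```python
from collections import deque

# Hypothetical state machine: (state, event) -> next state.
transitions = {
    ("Closed", "open"):   "Open",
    ("Open",   "close"):  "Closed",
    ("Open",   "lock"):   "Locked",
    ("Locked", "unlock"): "Open",
}

def transition_coverage_tests(transitions, initial):
    """For each transition, BFS a shortest event path from `initial`
    to its source state, then append the transition's own event."""
    tests = []
    for (src, event), _dst in transitions.items():
        queue, seen = deque([(initial, [])]), {initial}
        while queue:
            state, path = queue.popleft()
            if state == src:
                tests.append(path + [event])
                break
            for (s, e), d in transitions.items():
                if s == state and d not in seen:
                    seen.add(d)
                    queue.append((d, path + [e]))
    return tests

for test in transition_coverage_tests(transitions, "Closed"):
    print(" -> ".join(test))
```

Each printed event sequence is one abstract test case; a test adapter would replay it against the implementation and compare observed states with the model.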

Tools (test generation from UML models)

State-Based Test Example

For a DoorSensor state machine:

Test coverage:
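Since the DoorSensor diagram itself is a figure, the sketch below assumes a two-state machine (`Closed`/`Open` with `doorOpened`/`doorClosed` events); the test walks every assumed transition once and also checks that a repeated event is a no-op:

```python
class DoorSensor:
    """Minimal executable model. States and events are assumptions
    standing in for the state machine diagram."""
    def __init__(self):
        self.state = "Closed"

    def door_opened(self):
        if self.state == "Closed":
            self.state = "Open"

    def door_closed(self):
        if self.state == "Open":
            self.state = "Closed"

def test_transition_coverage():
    sensor = DoorSensor()
    assert sensor.state == "Closed"   # initial state
    sensor.door_opened()
    assert sensor.state == "Open"     # Closed --doorOpened--> Open
    sensor.door_closed()
    assert sensor.state == "Closed"   # Open --doorClosed--> Closed
    sensor.door_opened()
    sensor.door_opened()              # repeated event must be a no-op
    assert sensor.state == "Open"

test_transition_coverage()
print("all transitions covered")
```

Transition coverage here means each arc of the state machine is exercised at least once; stronger criteria (transition pairs, sneak paths) add tests for event/state combinations the diagram leaves implicit.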

Cross-Model Evaluation

Tools supporting this:

Interactive Exercise

What should happen? What can go wrong? Design test cases.

Produce:

1. A list of possible failure points
2. A state-based test
3. A message-based consistency check

Wrap-Up