Copyright © Snyders.US. All rights reserved.

Entry Criteria

A process is defined by its entry and exit criteria. The entry standards for a verification and validation (V&V) exercise involving software design documents must be explicitly documented and enforced. The suggested minimum entry criteria are:

  • The requirements specification has successfully exited from a methodical verification and validation exercise.
  • The individual(s) responsible for the design artifacts have indicated readiness for verification and validation.


Do not consider evaluating a software design specification that has been generated from unverified requirements. In that case, step back and ensure that the requirements are correct and complete.


Software Life Cycle Considerations

The development of software uses many techniques, but in general, only two need to be considered relevant to design specification V&V: the System Analysis and Design (SAD) approach and the Object Oriented (OO) method. In the former, the focus is on the flow of data, using data flow diagrams and program structure charts. In the latter, the concentration is on object interaction, with use case models that evolve into object and class representations. Use of the Unified Modeling Language (UML) has become prevalent in object-oriented design specifications.

The consequence of the life cycle to V&V is the management of document scope. For example, an iterative life cycle may produce four or five versions of the system design specification, one for each respective iteration. The designers may have purposely ignored entire sections of the requirements--and V&V must be able to recognize that. Ultimately, this comes down to a matter of communication. If the document contents do not explicitly define the scope of the specification, the intent must be confirmed with the authors.


Approach

At a high level, successful software design specification V&V involves static analysis and static testing techniques. The primary goals are to:

  • ensure the specification document includes the format and content specified in the implicit and explicit requirements, and
  • confirm that the system elements (the attributes of the content) described in the document or model will meet the current and anticipated requirements for the system.


The first objective closely aligns with the quality function of verification: "are we building the product right?" (Dorfman & Thayer, 1997). Aspects of the verification function are also present in the second objective; however, the quality focus in that case is intimately tied to the validation function. Software validation should answer the question "are we building the right product?" (Dorfman & Thayer, 1997).

The methodologies presented will be a combination of static analysis and static testing techniques. Static analysis is the "process of evaluating a system or component based on its form, structure, content, or documentation" (IEEE, 1990). Kaner, Falk, and Nguyen define "static testing" as "examining a program without executing it using methods such as walkthroughs, inspections, reviews, and desk checking" (1993). Depending on the application domain, static testing of specifications may also involve modeling state, data and time. Factors such as risk, cost, and time to market will determine the applicability of these techniques.


Static Analysis

Given a correctly constructed checklist, even a junior engineer can verify the format and content of the most complex software design specification. Admittedly, the design may have serious flaws from a software point of view, but that is another matter. The goal of this verification step is to compare the contents of the document to what is expected. The attributes of that content are not under consideration in this step.

Consider again the analogy of an architectural drawing submitted to the building inspector. The building inspector expects to see consistency in the drawing contents: a plan view, elevation view, annotations of dimension, notes of material attributes, a title block with revision information and dates. The software design specification should also have the format and contents arranged to suit the primary audience. In the majority of cases, this will translate into a format based on IEEE recommendations, or will follow object-oriented design guidelines. The standard must be predetermined and agreed to by stakeholders.

A well-designed document content checklist will be based on the document format the sponsor expects. This expectation may be implicit. Sometimes the format of the specification document is specified in the system requirements, or written into a contract. Sometimes it is not. If the customer is indifferent, then the document author has the prerogative. The V&V engineer should develop a content checklist that follows a well-documented and practiced guideline to ensure that the specification author adheres to an accepted, uniform standard.

IEEE Standard 1016-1998, Recommended Practice for Software Design Descriptions, gives guidance for a specification based on the SAD methodology. The following table lists the minimum elements that should be included in a design specification based on this format.


Document Sections                     Section Content
Introduction                          Purpose, scope, definitions
System decomposition description      Module, process, data descriptions
Dependency description                Module, process, data dependencies
Interface description                 Module, process interfaces
Detailed design                       Module, data detail

The Rational Unified Process (Kruchten, 2000) offers suggested contents for a software design specification using an object-oriented approach. These are listed in the next table.


Document Sections                     Section Content
Introduction                          Purpose, scope, definitions
Architectural Representation          Describes the software architecture
Architectural Goals and Constraints   Significant impacts on the architecture
Use-Case View                         Use-case realizations
Logical View                          Significant design packages
Process View                          Communication between processes
Deployment View                       Physical (hardware) configurations
Implementation View                   Overall structure of the model
Layers                                Logical layers and their contents
Data View                             Persistent data storage
Size and Performance                  Dimensioning characteristics
Quality                               How quality objectives are realized

The information in these checklists is neither definitive nor universal. For example, the document author may decide to omit sections or to include content not listed here. A document content checklist could also be considered subjective. To avoid these pitfalls, manage expectations before you begin any verification steps. Distribute your artifacts for review and set expectations early. Use document templates. Keep open lines of communication between the specification authors and those who are to verify the specification. This is especially important with an iterative design cycle, as you do not want to be recording issues against elements that are out of scope.

Consider the content the customer will take for granted but that could easily be overlooked in the document. These are implicit requirements and must be reflected in a document content checklist; a minimal automated sketch follows the list below. For example:

  • the document version should be unique and easy to identify,
  • application domain terminology should be consistent and harmonious with the requirements, and
  • correct spelling and grammar should go without saying--but say them if you need to.
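
The checklist itself can be applied by hand, but parts of it lend themselves to simple automation. The following is a minimal sketch in Python, assuming the specification is available as plain text; the required section names follow the IEEE 1016-style table above, and the file name and version pattern are illustrative assumptions, not requirements.

    # Minimal sketch of an automated document content check. Section names and
    # the version pattern are illustrative only.
    import re

    REQUIRED_SECTIONS = [
        "Introduction",
        "System decomposition description",
        "Dependency description",
        "Interface description",
        "Detailed design",
    ]

    def check_content(document_text):
        """Return a list of checklist findings for the given specification text."""
        findings = []
        lowered = document_text.lower()
        for section in REQUIRED_SECTIONS:
            if section.lower() not in lowered:
                findings.append("Missing required section: " + section)
        # Implicit requirement: the document version should be unique and easy to find.
        if not re.search(r"version\s+\d+\.\d+", document_text, re.IGNORECASE):
            findings.append("No identifiable document version (expected e.g. 'Version 1.2')")
        return findings

    if __name__ == "__main__":
        # 'design_spec.txt' is a hypothetical file name for the specification under review.
        with open("design_spec.txt", encoding="utf-8") as spec:
            for finding in check_content(spec.read()):
                print(finding)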


Static Testing

Static testing of software specifications spans a spectrum from radical simplicity to fanatical complexity. At one extreme, an engineer hands a document draft to a peer for a desk check. At the other, the specification is modeled using Colored Petri Nets (CP-nets or CPN), a graphical language for the design, specification, simulation, and verification of systems (Jensen, 1994). In between these extremes are the more typical review and inspection techniques. The salient features of each are examined in this section in order of increasing formality.


The Desk Check

Desk checking does not often come to mind when reviews are discussed, but the desk check is indeed a type of peer review. The desk check is a one-person walkthrough in which "someone other than the author" (Frank, Marriott, & Warzusen, 2002) examines the work product for errors or adherence to specification based on the instruction of the designer. This is typically a casual review, as no documentation or process is usually required; however, a standard checklist may be used.


Walkthroughs

A walkthrough is more of an analysis technique than it is a class of review. However, when project controls are applied--a schedule, defined participants, a statement of objectives, output recording--a walkthrough becomes a formal class of review. Without these controls it is informal. A walkthrough may be held to educate the audience about various technical aspects of the specification, for example, the scope of the current iteration in an object-oriented design cycle. Another important objective of a walkthrough can be to exchange ideas as each subject area is sequentially addressed (IEEE, 1998).

Typically, the designer of the element under review makes an overview presentation, followed by a general discussion from the participants. Then, the designer (author) "walks through" the artifact under review in detail. Suggested changes and improvements are noted and, optionally, recorded. If the walkthrough is structured, the recorded minutes are consolidated into a review report and distributed (Frank et al., 2002). If defined roles are used in the walkthrough, they will be a leader (facilitator), a recorder (scribe), and the author.


Technical Review

The technical review identified in the IEEE 1028 standard could also be referred to as an engineering review, or generically as a peer review. The standard defines it as "a systematic evaluation" performed "by a team of qualified personnel" (IEEE, 1998). The primary purpose is to identify "discrepancies from specification and standards" (IEEE, 1998). However, recommendations of alternatives and examination of optional approaches may also be on the agenda for this type of review. Defined roles include a decision maker, a leader, and a recorder.


Artifact Inspections

This type of review is developed from the concept of the software inspection identified in IEEE 1028 and pioneered by Michael Fagan at IBM (Fagan, 1976). The original aim was to improve productivity and quality through inspection by a panel of experts. Inspection techniques have since evolved and matured in the software engineering field through additional contributions (Frank et al., 2002). These techniques can be applied to software design specifications. An inspection is in a class by itself for two main reasons: the author of the artifact under inspection should not fill the role of leader, reader, or recorder--and the issues identified must be resolved to meet the exit criteria. Errors identified in an inspection are tracked to resolution much like errors reported from a test.

An artifact inspection is a visual examination of a specification to identify errors and deviations from standards and requirements (IEEE, 1998). What differentiates an inspection from a technical review is that the "determination of remedial or investigative action for an anomaly is a mandatory element" (IEEE, 1998). Developing a solution to the issues discovered is not an agenda item in an inspection; however, the inspection is not complete until all of the identified issues are resolved. For example, a technical review could identify issues, report them, and be considered a completed activity. In contrast, an inspection is not complete until it has been shown, in subsequent sessions, that the identified issues have been resolved to the satisfaction of the inspection team.

Another differentiator for the inspection is the role of "the inspector". This ominous-sounding term can be euphemistically rendered as subject matter expert, and the role may be filled by a team of experts. Where other review types imply that the individual or group scrutinizing the work product be skilled in the field, for an inspection it is an explicit requirement. The IEEE standard calls for inspectors "chosen to represent different viewpoints" and gives examples such as "sponsor, requirements, design, safety, test" (IEEE, 1998) and others. The standard goes on to state that the inspectors should be assigned "specific review topics", for example, adherence to a "specific standard" (IEEE, 1998).

Roles for an inspection are the leader, the recorder, a reader, and the inspector. As stated previously, the author (designer) may be present, but should not act as the leader, the reader, or the recorder, so that any potential influence or bias is negated.


Criteria for Design Specification Review and Inspection

According to the ISO standard ISO/IEC 10746-1: Reference Model – Open Distributed Processing (RM-ODP), there are two dimensions to system architecture (Rational, 2002):

  • Viewpoint – the context for addressing a limited set of quality concerns.
  • Model level – UML models that capture various levels of design specificity. 


The different viewpoints allow for separation of concerns. The verification criteria attempt to answer the following questions objectively (a structured sketch of this checklist follows the list):


  • Computation Viewpoint
    • Is the system functionality adequate to realize the use cases? If use cases have not been developed, can the functional requirements be realized?
    • Is the system extendible and maintainable?
    • Does the design exhibit internal reuse?
    • Does the design exhibit good cohesion and connectivity?
  • Engineering Viewpoint
    • Are the system physical characteristics adequate to host the required functionality?
  • Information Viewpoint
    • Does the system have sufficient capacity to store data?
    • Does the system have sufficient throughput to provide timely access to data?
  • Process Viewpoint
    • Does the system have sufficient partitioning of processing to support concurrency and reliability needs?
  • Stakeholder Viewpoint
    • Does the system fail to meet any specific requirement?
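
To keep review findings consistent across sessions, the viewpoint questions can be captured as structured data. The Python sketch below is illustrative only; the viewpoint names and questions mirror the list above, and the recording mechanism is an assumption, not part of any cited standard.

    # Illustrative only: the viewpoint questions above captured as structured data
    # so that reviewer answers can be recorded and gaps reported consistently.
    VIEWPOINT_CRITERIA = {
        "Computation": [
            "Is the system functionality adequate to realize the use cases?",
            "Is the system extendible and maintainable?",
            "Does the design exhibit internal reuse?",
            "Does the design exhibit good cohesion and connectivity?",
        ],
        "Engineering": [
            "Are the system physical characteristics adequate to host the required functionality?",
        ],
        "Information": [
            "Does the system have sufficient capacity to store data?",
            "Does the system have sufficient throughput to provide timely access to data?",
        ],
        "Process": [
            "Does the system have sufficient partitioning of processing to support concurrency and reliability needs?",
        ],
        "Stakeholder": [
            "Does the system fail to meet any specific requirement?",
        ],
    }

    def unanswered(answers):
        """List the questions that have not yet received a recorded answer."""
        return [question
                for questions in VIEWPOINT_CRITERIA.values()
                for question in questions
                if question not in answers]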


The model-level evaluation measures are adherence to UML guidelines and specifications, and applicability and relevance to the specification goals.

Rakitin offers some additional guidelines to use in a high-level design inspection checklist (2001, p. 278):

  • Are all assumptions documented?
  • Have major design decisions been documented?
  • Is the design consistent with those decisions?


One method to consider when attempting an objective static analysis of a specification is to walk the requirements through the design. Use cases lend themselves to this technique better than functional specifications do; however, both requirement types can be used. For example, a simple sign-in use case can be used as input to the specification document; a subset is shown here:


Main Flow

  1. The user enters a user name and password combination and submits the request.
  2. The system validates the credentials and presents the main window.

Alternate Flow

  1. The system invalidates the credentials and presents the user with 2 additional opportunities to submit a valid user name and password combination.
  2. After 3 invalid sign-in attempts, the system ends the user session and exits.


The engineer can inspect the software design specification and determine whether a class or module exists to perform the sign-in validation. If the class does exist, it must contain a member to track the number of sign-in attempts in the session and terminate the application after 3 unsuccessful attempts, as the sketch below illustrates. If any aspect of the required functionality has been omitted from the design, the specification is in error.
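
As a minimal, hypothetical Python sketch of what the engineer would look for, the design would need an element along the following lines. The class, member, and collaborator names are invented for illustration; the essential points are the attempt counter and the termination rule required by the alternate flow.

    # Hypothetical design element implied by the sign-in use case. Names are
    # illustrative; what matters is that the design documents an attempt counter
    # and a termination rule after three failures.
    class SessionTerminated(Exception):
        """Raised when the sign-in attempt limit has been exceeded (alternate flow, step 2)."""

    class SignInValidator:
        MAX_ATTEMPTS = 3  # limit required by the alternate flow of the use case

        def __init__(self, credential_store):
            self.credential_store = credential_store  # assumed collaborator that checks credentials
            self.failed_attempts = 0                   # member that tracks session sign-in attempts

        def sign_in(self, user_name, password):
            """Return True on success; raise SessionTerminated after the third failure."""
            if self.credential_store.is_valid(user_name, password):
                self.failed_attempts = 0
                return True   # main flow: the system presents the main window
            self.failed_attempts += 1
            if self.failed_attempts >= self.MAX_ATTEMPTS:
                raise SessionTerminated("Three invalid sign-in attempts")
            return False      # alternate flow: allow another attempt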


Exit Criteria

Exit conditions from a design specification V&V exercise should be consistent with the process model and quality assurance objectives used in the organization. Typically, the criteria used will be:

  • All identified errors have been addressed to an agreed upon resolution. 
  • All high severity errors have been resolved.


Consider classifying the errors reported from specification V&V so that they serve as quality metrics. This cataloging will provide management reporting and process improvement initiatives with information about the source of errors rather than just raw numbers. The following table lists suggested error categories that relate to a design specification document.


Error Category      Class of Error
Document            Grammatical, syntax, prose, and document format errors
Model               Errors in the application or misuse of graphical models
Omission            Requirements that have been overlooked
Interpretation      Business requirements misrepresented in the design
Contractual         Omissions of, or issues with, contractual requirements
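
The categories can also be recorded in a simple structure so that counts per category, rather than raw totals, feed management reporting. The Python sketch below is illustrative; the Finding fields and the example findings are assumptions, while the category names mirror the table above.

    # Illustrative sketch: tally V&V findings by the error categories in the table
    # above so that reporting shows the source of errors, not just raw numbers.
    from collections import Counter
    from dataclasses import dataclass

    CATEGORIES = {"Document", "Model", "Omission", "Interpretation", "Contractual"}

    @dataclass
    class Finding:
        category: str      # one of CATEGORIES
        severity: str      # e.g. "high", "medium", "low"
        description: str

    def summarize(findings):
        """Count findings per error category for a management report."""
        for finding in findings:
            if finding.category not in CATEGORIES:
                raise ValueError("Unknown error category: " + finding.category)
        return Counter(finding.category for finding in findings)

    # Hypothetical example usage
    report = summarize([
        Finding("Omission", "high", "Sign-in attempt limit not addressed in the design"),
        Finding("Document", "low", "Inconsistent module naming between sections"),
    ])
    print(report.most_common())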








Summary

Verification and validation of software design specifications should become part of every software project plan. The techniques illustrated here have shown that static analysis and static testing involve more than just document review and inspection. The specification needs the support of a verified requirements specification. The format of the design artifacts should be compared to accepted standards with a checklist. The document content should be verified against viewpoints that address systemic quality concerns in review and inspection exercises. Where modeling has been used to describe the design, validate the content and verify the attributes of the content.

A great deal of quality injection can be accomplished before the first line of source code has been keyed. This effort is valuable, as it prevents mistakes from propagating into the binaries, where they are exponentially more expensive to repair. Static techniques will never totally replace software testing, as they cannot verify that the system is operationally useful, nor quantify performance or reliability. However, the potential for early error detection and the consequent cost savings is great--software architects should get used to submitting their plans to a software verification and validation board for a building permit!


References

Beizer, B. (1984). Software System Testing and Quality Assurance. NY: Van Nostrand Reinhold.

Blanchard, B. S., & Fabrycky, W. J. (1998). Systems Engineering and Analysis (Third Edition). NJ: Prentice Hall.

Dorfman, M., Thayer, R. (Eds.) (1997). Software Engineering. CA: IEEE Computer Society Press. 

Fagan, M., (1976). Design and Code Inspections to Reduce Errors in Program Development. IBM Systems Journal, Vol. 15, No. 3. 

Frank, B., Marriott, P., & Warzusen, C. (2002). The Software Quality Engineer Primer. IN: Quality Council of Indiana.  

IEEE Standard 610.12-1990, Standard Glossary of Software Engineering Terminology. NY: IEEE Service Center. 

IEEE Standard 1028-1997, IEEE Standard for Software Reviews. NY: IEEE Service Center.

Jensen, K. (1994). An Introduction to the Theoretical Aspects of Coloured Petri Nets. In J.W. de Bakker, W.-P. de Roever, & G. Rozenberg (Eds.), A Decade of Concurrency (Lecture Notes in Computer Science, Vol. 803). Berlin: Springer-Verlag.

Kaner, C., Falk, J. & Nguyen H., (1993). Testing Computer Software (Second Edition). NY: Van Nostrand Reinhold. 

Kruchten, P., (2000). The Rational Unified Process, An Introduction (Second Edition). NJ: Addison Wesley. 

Rakitin, Steven R. (2001). Software Verification and Validation for Practitioners and Managers (2nd ed.). Boston: Artech House. 

Rational Software Company, (2002). Rational Unified Process for Systems Engineering (RUP SE 1.1). Rational Software White Paper TP 165A.


Sommerville, I. (2001). Software Engineering (Sixth Edition). UK: Addison-Wesley.


Verification and Validation of Software Design Specifications

An overview of techniques


John R. Snyder, October 2003

Snyders.US

Other engineering disciplines developed techniques for the verification and approval of design specifications long ago; consider the construction of a home as one example. The plans are developed and approved before any construction begins.

  • Architects of software systems are not accustomed to having their work checked by others, but this should change.
  • The software process lifecycle is a prime consideration for the quality assurance team due to the management of scope: what is intended to be included in the specification document--and what is intentionally excluded. The specification document scope must be clearly communicated between the architecture team and the verification team. 
  • The quality assurance process must be explicitly defined and documented--understood by all. 
  • A common misconception is that a quality assurance team does not possess the engineering expertise to verify a software design specification. With the creation of agreed-upon standards for design document format, simple checklists can be developed that any competent engineer can apply. This is document content verification.
  • The validation of a software design specification entails developing a consensus from many different engineering viewpoints. Design specification validation confirms that the attributes of the document content will meet the requirements of the anticipated system.
  • Requirements can be used as simulated input to the specification, ensuring the functionality is supported by the envisioned system.
  • Explicit entry and exit criteria must be established for the verification and validation process.


Conclusion: To be successful with specification verification and validation, apply the fundamentals: defined process, specific objectives, preconditions, and a clear definition of test success.


Introduction

Architects of software systems are not accustomed to having their work checked by others--this should change. Verification of executing code is accepted practice; testers and tools abound that are dedicated to that purpose. When the deliverable is a software design specification, quality assurance may not even be considered. Outside of the world of low-level protocols and embedded systems, verification and validation of software specifications is rare.

A home is not built before the plans are checked and approved by a third-party inspector. The inspector decides if a building permit should be issued based on an evaluation of the design documents. Plans for a mechanical system often require the stamp of a licensed, professional engineer. Why are software design specifications assumed to be accurate and complete, and to reflect the sponsors' intent?

One pervasive misconception is that a quality assurance team could not possibly have the engineering expertise to challenge the almighty software architect--the lord of the domain. The fact is that simple methods are available--this is not, usually, rocket science. In the cases when it is rocket science, a full complement of certified software engineers had better be on staff!

This document will present a general process and procedure framework for the verification of software design specifications. Entry criteria, life cycle considerations, and universal aspects of static analysis and static testing are presented.

The scope of this report is limited to the analysis of verification and validation techniques as applied to software design specifications. Formal methods are not considered. Formal techniques such as algebraic descriptions of software are, pragmatically, beyond the reach of most software development organizations today due to cost. In addition, numerical descriptions of system behavior have been identified as archaic and difficult to comprehend (Sommerville, 2001, p. 201).


Software Design Specification Verification and Validation

The primary consideration in the verification of software design specifications is to remove ambiguity. A functional test case would be considered incomplete if it did not include an expected result. You should approach software design document validation similarly. The keys to success are clear and explicit benchmarks.

It is not adequate to state that the document format must be well organized--targets must be more precise. Use criteria such as: the document format shall conform to IEEE Standard 1016.1-1993, Guide to Software Design Descriptions. If no applicable standards or guidelines exist, develop a document template to serve as a standard in your organization. Regardless of the chosen benchmark, you must develop a consensus that it is the best approach for the organization. The document authors must agree that it suits their needs and is not obstructive to them. A review and circulation process should be initiated to develop that consensus on document format and content.

Document content criteria will not be as prescriptive as format criteria. However, consider that the typical validation process will include group walkthroughs and team inspections, where consensus building is integral. Guidelines must be agreed upon as a prerequisite to specification verification. Stick to the basics learned in functional testing: defined test objectives, preconditions, postconditions, and a clear definition of test success. The figure below illustrates a basic process flow that will facilitate the verification and validation of software specifications. Define the benchmark of success--step one!

