Role of Verification and Validation (V&V) in SDLC
Verification is the process of checking that the software conforms to its specified requirements, i.e. that we are building the product right. It confirms that the product being developed fulfils the requirements we started with. Verification is static testing: documents, design, and code are examined without executing the software.
Validation is the process of checking whether the software product meets the user's needs and high-level requirements. It checks the validity of the product, i.e. whether we are building the right product, by comparing the actual product against the expected one.
A detailed comparison of Verification and Validation can be found here: https://www.geeksforgeeks.org/differences-between-verification-and-validation/
The various roles of Verification and Validation (V&V) in the SDLC testing process are given below:
1. Traceability Analysis:
Traceability is the degree to which a work product can be traced back to its point of origin; it describes the ability to establish a predecessor-successor relationship between one work product and another. Traceability analysis traces every software requirement back to the original system requirements established in the concept activity. The goal is to ensure that every software requirement correctly satisfies a system requirement and that no extraneous software requirements have been added. The analysis also determines whether any derived requirements are consistent with the original objectives, physical laws, and the technologies described in the system documentation.
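The two checks described above can be sketched in code. This is a minimal, illustrative example (the requirement IDs and descriptions are invented, not from any standard): each software requirement records the system requirement it derives from, and the analysis flags orphans in both directions.

```python
# Hypothetical system requirements established in the concept activity.
system_reqs = {"SYS-1": "Log every transaction",
               "SYS-2": "Respond within 2 seconds"}

# Each software requirement names the system requirement it derives from.
software_reqs = {"SW-10": "SYS-1",   # audit-log writer
                 "SW-11": "SYS-2",   # response-time budget
                 "SW-12": "SYS-9"}   # traces to a nonexistent origin

def trace(system_reqs, software_reqs):
    """Return (untraceable, uncovered): software requirements with no
    known origin, and system requirements no software requirement covers."""
    untraceable = [sw for sw, origin in software_reqs.items()
                   if origin not in system_reqs]
    covered = set(software_reqs.values())
    uncovered = [sys_id for sys_id in system_reqs if sys_id not in covered]
    return untraceable, uncovered

untraceable, uncovered = trace(system_reqs, software_reqs)
print(untraceable)  # ['SW-12'] -- an "extra" requirement with no origin
print(uncovered)    # []        -- every system requirement is satisfied
```

In practice this information lives in a requirements traceability matrix maintained by a requirements-management tool, but the underlying check is exactly this two-way set comparison.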
2. Interface Analysis:
Interface analysis is the detailed analysis of the interface requirements specifications in the software development life cycle. It identifies the interfaces between applications and determines the requirements needed to ensure that components interact with each other effectively. The evaluation criteria are the same as those for the requirements specification, since the analysis establishes the requirements for interoperability. Its main focus is on the interfaces between the software, the hardware, and the user.
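One concrete form this check can take is verifying that what one component provides matches what another component requires. A minimal sketch, with hypothetical component and field names:

```python
# What each component's interface provides: field name -> declared type.
provided = {
    "sensor_driver":   {"temperature": "float", "timestamp": "int"},
    "storage_service": {"record_id": "str"},
}

# What each consumer requires: (provider it talks to, expected fields).
required = {
    "display_ui": ("sensor_driver", {"temperature": "float",
                                     "humidity": "float"}),
}

def check_interfaces(provided, required):
    """Report every required field that a provider is missing or mistypes."""
    problems = []
    for consumer, (provider, fields) in required.items():
        offered = provided.get(provider, {})
        for name, typ in fields.items():
            if name not in offered:
                problems.append(f"{consumer}: {provider} lacks '{name}'")
            elif offered[name] != typ:
                problems.append(f"{consumer}: '{name}' type mismatch")
    return problems

print(check_interfaces(provided, required))
# ["display_ui: sensor_driver lacks 'humidity'"]
```

The same idea scales up to schema validation between services or signature checks between modules; the point of the analysis is to catch such mismatches before integration.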
3. Criticality Analysis:
A criticality rating is assigned to every software requirement; when requirements are combined into functions, the combined criticality of those requirements becomes the criticality of the aggregate function. The criticality analysis is updated whenever requirements change, because a change can raise or lower a function's criticality depending on how the revised requirement affects system criticality.
Criticality analysis has the following steps:
- A. Construct a control flow diagram (CFD) of the system and its elements, with each block representing a single software function.
- B. Trace every critical function and critical quality requirement through the control flow diagram.
- C. Classify every software function on which the critical functions and quality requirements depend as critical as well.
- D. Focus additional analysis on these traced critical software functions.
- E. Repeat the criticality analysis for every life-cycle process to check whether the implementation details have shifted the emphasis of the criticality.
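Steps B and C above amount to a reachability computation over the control flow diagram. A minimal sketch (the function names are invented for illustration): starting from the functions traced to critical requirements, every function they transitively invoke is classified as critical too.

```python
# Control flow diagram as an adjacency map: function -> functions it invokes.
cfd = {
    "process_payment": ["validate_card", "log_event"],
    "validate_card":   ["check_expiry"],
    "log_event":       [],
    "check_expiry":    [],
    "render_banner":   [],   # not on any critical path
}

# Functions traced directly to a critical requirement (step B).
critical = {"process_payment"}

def propagate(cfd, critical):
    """Return every function the critical functions transitively depend on
    (step C), found by a depth-first walk of the CFD."""
    marked, stack = set(critical), list(critical)
    while stack:
        for callee in cfd.get(stack.pop(), []):
            if callee not in marked:
                marked.add(callee)
                stack.append(callee)
    return marked

print(sorted(propagate(cfd, critical)))
# ['check_expiry', 'log_event', 'process_payment', 'validate_card']
```

Note that `render_banner` stays non-critical because nothing on a critical path reaches it; this is what lets step D focus the additional analysis on a smaller set of functions.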
4. Hazard and Risk Analysis:
Hazard and risk analysis is performed during the requirements definition activity. As the system requirements are refined into detailed software requirements, the hazards and risks associated with them are identified and assessed.
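A common way to assess the identified hazards is to score each one by severity and likelihood and rank them by the product. The hazards and the 1-5 scales below are illustrative assumptions, not part of any particular standard:

```python
# (hazard description, severity 1-5, likelihood 1-5) -- invented examples.
hazards = [
    ("sensor reading lost",      4, 3),
    ("UI freezes briefly",       2, 4),
    ("corrupted config on boot", 5, 1),
]

def rank_risks(hazards):
    """Rank hazards by risk score = severity * likelihood, highest first,
    so the riskiest items are refined into detailed requirements first."""
    return sorted(((desc, sev * lik) for desc, sev, lik in hazards),
                  key=lambda item: item[1], reverse=True)

for desc, score in rank_risks(hazards):
    print(f"{score:2d}  {desc}")
# 12  sensor reading lost
#  8  UI freezes briefly
#  5  corrupted config on boot
```

The ranking drives the rest of the V&V effort: the highest-scoring hazards get the most detailed software requirements and the heaviest verification attention.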