You say to-MAY-toe, I say to-MAH-toe.
I hear the questions daily — “Did you validate the system? Did you verify the feature?” The words validate and verify are used interchangeably, but what do they really mean? Is there a difference? In the world of software development and quality assurance, yes… and you need to do both. It’s even more important for the tester to understand what they are, and what each entails – and how some definitions may change in a world where waterfall is out and continuous delivery is king.
What’s the difference?
The English definitions of validate and verify are pretty close; in fact, they are listed as synonyms of each other.
Validate – to recognize, establish, or illustrate the worthiness or legitimacy of (Def 2b, Merriam-Webster Dictionary Online http://www.merriam-webster.com/dictionary/validate)
Verify – to prove, show, find out, or state that (something) is true or correct (Merriam-Webster Dictionary Online http://www.merriam-webster.com/dictionary/verify)
In other words, is something right?
But how do we use them in software development? As I started thinking about this (and slowly started to get more and more irked by people using the words interchangeably), I realized that we may see a shift in how most people view verification and validation as more development teams adopt continuous delivery practices.
There are a lot of sites that indicate that verifying software asks, “Did we build the software right?”, while validating software asks, “Did we build the right software?” [1] – and those are great questions that definitely still stand going from a waterfall to a continuous world. However, I’ve been reading other posts that indicate verification does not actually include testing the code [2], and I just cannot get behind this. I’m not saying that the authors are necessarily wrong. Some of the posts are pretty old (in software, things age in dog years), and posts like these are more than likely written for teams using a waterfall methodology, or for teams trying to obtain certifications like CMMI or adhere to standards like IEEE that spell certain things out. But as we make the shift to continuous delivery, I think we need to change our perceptions of what ‘artifacts’ and verification really mean.
My opinion (perhaps mine alone) is that in continuous integration and delivery, where testing is brought up front and not held off until the end, verification may also come in the form of testing each user story. Since development is not complete until all of those tests pass, I am answering the question, “Did I build the software right?” Therefore, my testing is verification. This contradicts posts claiming that verification is done without using the software and consists purely of reviewing various artifacts. Others may disagree, but I think that testing the software is an important part of verification. How else do you know it was built correctly if you are not testing?
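To make this concrete, here is a minimal sketch of what story-level verification might look like in practice. The user story, the `letter_grade` function, and the grade thresholds are all hypothetical, invented purely for illustration; the point is that the story is not “done” until its tests pass, which answers “did I build the software right?”

```python
# Hypothetical user story: "As a student, I can see my course grade
# as a letter based on my percentage score."

def letter_grade(score: float) -> str:
    """Map a percentage score (0-100) to a letter grade."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 70:
        return "C"
    if score >= 60:
        return "D"
    return "F"

# Verification: the story is not complete until these checks pass.
assert letter_grade(95) == "A"
assert letter_grade(72) == "C"
assert letter_grade(59.9) == "F"
```

In a real continuous pipeline these assertions would live in a test runner such as pytest and gate the build, but the idea is the same: passing tests are the evidence of verification.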
From my point of view, part of validation comes from the acceptance criteria the team defines up front: diving into whether the software is useful for the client, and what is considered acceptable. Yes, a lot of validation activity still occurs at the end (such as usability tests, customer or beta testing, etc.), but if you are using Acceptance Test Driven Development, you are identifying early on that what you are building is what is best for the client (whether through prototyping, creating wireframes, writing acceptance tests, and so on) and answering for yourself, “Am I building the right software?” Validation can go beyond the acceptance criteria, though hopefully the upfront discussions of user stories, business reasons, and acceptance criteria help you avoid discovering a validation gap very late in the game. Some things you just do not know until the software is built, so you may not be able to fully validate it until an audit, an end-to-end workflow, or exploratory testing reveals how usable it really is.
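An acceptance criterion agreed with the client up front can itself become an executable test, which is the heart of ATDD. The scenario and the `course_summary` function below are hypothetical examples, not anything from a specific product; they just show how a Given/When/Then criterion can be encoded so that passing it speaks to “am I building the right software?”

```python
# Hypothetical acceptance criterion, agreed with the client before coding:
#   Given a student with several graded assignments,
#   When they view their course summary,
#   Then they see their average score and whether they are passing.

def course_summary(scores: list[float]) -> dict:
    """Build the summary a student sees for a course."""
    avg = sum(scores) / len(scores)
    return {"average": round(avg, 1), "passing": avg >= 60}

# Executable acceptance test derived directly from the criterion above.
summary = course_summary([70, 80, 90])
assert summary == {"average": 80.0, "passing": True}
```

Tools like Cucumber or behave let teams keep the Given/When/Then wording in plain text, but even bare assertions like these capture the client-facing expectation early rather than at the end.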
Why do I need to do both?
It is important to both verify and validate your product. Just because you perform one does not mean that the other can be ignored. For example, I can verify that all tests have passed and that technically the software is built correctly. But that does not tell me that I have actually developed something usable and meaningful for the client. In other words, what has been verified is not validated.
Does the vocabulary really matter? In all likelihood, people are still going to use the words interchangeably. But now you know the difference.
Ashley Hunsberger is a Quality Architect at Blackboard, Inc. and co-founder of Quality Element. She’s passionate about making an impact in education and loves coaching team members in product and client-focused quality practices. Most recently, she has focused on test strategy implementation and training, development process efficiencies, and preaching Test Driven Development to anyone that will listen. In her downtime, she loves to travel, read, quilt, hike, and spend time with her family.
References
1. An example can be found here: http://softwaretestingfundamentals.com/verification-vs-validation/
2. Examples that illustrate verification does not include testing the code can be found here: http://testingbasicinterviewquestions.blogspot.com/2012/01/difference-between-verification-and.html and here: http://www.softwaretestingclass.com/difference-between-verification-and-validation/