QA at Michigan Labs
A Multi-Faceted Approach
MichiganLabs takes a unique approach to Software Quality Assurance. We believe that quality is the responsibility of everyone, from the business stakeholder to the team lead to each developer. Quality assurance practices pervade our entire software development lifecycle.
From the very start, quality is built in when writing user stories. As part of story creation, stakeholders define acceptance criteria for each story that will be worked on, along with any additional information that belongs in the requirements tracker. The acceptance criteria are recorded as part of the story artifact and are visible to the developer when development on the story begins.
A useful by-product of this story analysis is the decomposition that is forced when the acceptance criteria are too complicated or the requirement is too difficult to describe. Smaller stories have a smaller test footprint, allowing developers and project stakeholders to quickly understand what is required and shortening the cycle needed to verify that the requirement is met.
Quality Development Practices
Developers at MichiganLabs work in a team environment where collaboration is the default. During Sprint Planning, the team collaborates on understanding the story and on a technical approach that fulfills the requirements. On stories that are technically challenging or high impact, the development team will often choose to pair-program on particularly complicated code. Having two people evaluate the code as it is written helps ensure it is feature-complete and error-free.
As part of the development process, developers are expected to write unit tests alongside the code they are writing. MichiganLabs utilizes a continuous integration server which runs a build every time code is pushed to the central repository. As part of that build process, unit tests are run for all new code, and any failed tests are reported to the entire development team immediately. Test coverage is also tracked during the build and is checked during the code review process.
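As a sketch of what "unit tests alongside the code" can look like in practice (the function and tests here are purely illustrative, not MichiganLabs code), a developer might commit something like the following, which a continuous integration server would then run on every push:

```python
# Hypothetical example: a small function and its unit tests committed together.
# Names and behavior are assumptions for illustration only.

def apply_discount(price: float, percent: float) -> float:
    """Return the price reduced by the given percentage (0-100)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


# Unit tests written alongside the code; a CI build runs these automatically
# and reports any failure to the whole team.
def test_apply_discount_reduces_price():
    assert apply_discount(100.0, 25.0) == 75.0


def test_apply_discount_rejects_invalid_percent():
    try:
        apply_discount(100.0, 150.0)
    except ValueError:
        pass  # expected: out-of-range discount is rejected
    else:
        raise AssertionError("expected ValueError for percent > 100")
```

Because the tests travel in the same commit as the code, a failed build points reviewers directly at the change that broke the expectation.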
MichiganLabs utilizes a documentation tool for all development that includes an Application Programming Interface (API). The API documentation is versioned and reviewed alongside the product code which implements it. Documentation is built and published during the build phase, and all integration against the API is done by reading the documentation rather than looking for the implementation in code. Developing against the documentation ensures that it matches the implementation.
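As an illustration of versioned, reviewable API documentation (the source does not name a specific tool, and the endpoint and schema below are hypothetical), a spec kept in the repository alongside the code might look like this OpenAPI fragment:

```yaml
# Hypothetical OpenAPI fragment; the endpoint and fields are illustrative only.
openapi: "3.0.0"
info:
  title: Example Project API
  version: "1.2.0"   # the documentation is versioned with the product code
paths:
  /users/{id}:
    get:
      summary: Fetch a single user by ID
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: integer
      responses:
        "200":
          description: The requested user
        "404":
          description: No user with that ID exists
```

A file like this is diffed and reviewed in the same pull request as the implementation, and integrating teams read the published document rather than the source code.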
All code written at MichiganLabs is reviewed by at least one other (often two or three) developer(s) before being merged into the mainline branch. Atlassian’s self-hosted version of Bitbucket provides a standard tool where developers and product owners can view the newly-written code, discuss technical details and verify that requirements are met. Developers are responsible for writing a description of the code change, calling out places where additional review may be beneficial and providing acceptance criteria (based on the product owner’s description) that need to be validated before a merge can happen.
Unit tests and documentation are reviewed alongside the product code. Reviewers check to make sure that the documentation matches the implementation and that tests cover the new code appropriately. Reviewers also look at the implemented functionality to make sure it matches the Quality Requirements that were defined during story creation and sprint planning. Feature verification is done using simulators as well as a myriad of real devices to ensure that functionality works across different operating systems, browsers, and hardware configurations.
MichiganLabs’ Bitbucket configuration enforces that the current branch builds without errors, that all tests pass, and that at least one reviewer has approved the branch before it can be merged.
Quality Client Interaction
Throughout the entire development process, the product owner can watch the stories being developed and is available for feedback and questions from the development team. An automated build is generated and provided to the product owner for final acceptance testing. All the completed stories are reviewed at the end of each sprint, and the product owner is given a last chance to sign off on the deliverable before the development team moves on to the next set of user stories.
How We Are Different
Many companies treat QA as a function separate from development, often creating a QA role filled by an individual or team. If QA is its own role, it becomes easier for development staff and stakeholders to offload quality onto that role. The development team may come to expect that a second pair of eyes will catch any issues and thus be less concerned about writing the best code the first time through. In our model, while multiple people still review for quality, the ultimate responsibility rests with the author and (more importantly) with the entire team. It is no longer an “us vs. them” conversation but rather a story of one team, with developers, business stakeholders, and leaders all achieving great quality together.
We are proud of the quality we provide at MichiganLabs, but we ultimately realize that quality is one part of the larger project picture, which is why we also strive to inject craftsmanship into the solutions we develop. We often partner with clients who need to hit a specific release date, or with companies willing to make trade-offs to get a prototype out the door. We recognize that each project is different and that, in the real world, things are not always so clear-cut. Get in touch with us so we can form a customized approach to your next project, and let us help inject quality into your process while delivering a rock-solid solution that meets your needs.