Quality culture transition guide model
Each capability below is described at four maturity levels: Chaos, Growing, Competent, and Optimizing.
Testing Breadth
Chaos: Mostly or entirely focused on functional testing and verification. Non-functional tests may be done sporadically, but usually as a reaction to an important customer issue. Focused on quadrant 1 and part of quadrant 2 of the testing quadrants.
Growing: The team understands the importance of a multidimensional test strategy, and models like the Agile Testing Quadrants are occasionally followed, at least in part. Some non-functional testing is part of each release, but some areas (e.g. performance, security, reliability) may still be ignored. Non-functional testing is performed by, or driven by, QA.
Competent: The team understands the testing quadrants and consistently adds tests from each quadrant to support functional testing. Many team members contribute to non-functional tests. The team has internalized the ability to formulate a multidimensional test strategy with significant involvement from the whole team, and the team contributes to writing and executing those tests.
Optimizing: Creating a multidimensional test strategy is an integral part of feature planning and no longer a QA-specific task. The testing approach consistently covers all areas of testing (e.g. the quadrants), including non-functional testing. Impact is considered during all aspects of feature development and bug fixing. The team makes use of data (see "Customer Data Analysis") to improve its quality strategy.
Quality and Test Ownership
Chaos: Testers "own" quality (via testing). Bugs may be "blamed" on testers, and testing activities are seen as something QA does. Testers may add regression tests in response to bugs found in later testing cycles. "Acceptable" levels of quality are typically reached via testing and bug fixing only. Testers focus mostly (or entirely) on verification of new code (features and bug fixes).
Growing: Feature developers own the majority of functional testing, including unit tests and many or most integration tests, and may write frameworks to assist in this effort. Feature developers add regression tests as needed in reaction to bugs found in later stages of testing. Testers focus on functional testing, but frequently find time to perform other testing activities, pair with developers, or develop and implement quality strategies.
Competent: Every member of the team writes tests and values quality. Feature developers write all (or nearly all) functional tests and contribute to non-functional tests. There may be holes in quality ownership, but these are usually discovered and handled quickly. Testers typically spend their time organizing and improving testing efforts (including non-functional testing) across the team. There are frequent examples of improving quality by influencing other team members.
Optimizing: Every team member values quality and customer value above all else, and contributes both ideas and implementation of functional and non-functional tests. Quality is "built in" from the start on all features. The test specialist's focus is on coaching the team and assisting quality efforts, and they are seen as a valuable resource for driving quality and quality practices within the team. Teams at this level may not need a dedicated quality specialist.
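
One small, concrete form of the ownership shift described above is the regression test a feature developer adds alongside a bug fix rather than leaving it to QA. The sketch below is hypothetical: the `config_loader` module, the `parse_timeout` function, and the bug number are all invented for the example.

```python
import pytest

# Hypothetical module, function, and bug number, invented for this sketch.
from config_loader import parse_timeout


def test_parse_timeout_rejects_negative_values_bug_1234():
    # Regression test added by the feature developer alongside the fix for
    # bug #1234 (negative timeouts were previously accepted silently), so
    # the defect cannot quietly reappear in a later testing cycle.
    with pytest.raises(ValueError):
        parse_timeout("-5")
```
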
Technical Debt and Maintenance
Chaos: Large backlogs and infrequent updates or triage of bugs; Average Bug Age can be six months or longer. Bug fixes are postponed to bug-fix weeks and may accumulate in large pull requests. Other technical debt (code refactoring, fixing analysis errors, etc.) is addressed infrequently. Maintainability is rarely considered.
Growing: Bugs are triaged frequently, but the bug backlog spans several releases; the number of bugs in the backlog is two to three times the number of team members, and Average Bug Age can be three months or longer. Bugs that will potentially be fixed by a refactor may be closed and attached or copied to the refactor work item. Other technical debt is addressed semi-frequently, but may be an individual effort rather than something the whole team owns. Maintainability of the code is primarily a reactive effort.
Competent: Bugs are triaged and assigned as they are discovered. The team maintains a small backlog of bugs, for example via a bug-ceiling policy of no more than one bug per team member. Average Bug Age is typically less than 30 days. Technical debt levels are minimal and are tracked with appropriate priority. Maintainability is a proactive effort.
Optimizing: No bug backlog, and in some cases no bug database; Average Bug Age is typically 7 days or less. Fixes are targeted in small PRs, and developers track them through backports to release branches. Non-bug technical debt is addressed continuously, and the team typically lives debt-free, fixing issues found by new tools as they are added.
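
Average Bug Age, referenced at each level above, is simply the mean time that currently open bugs have been open. The sketch below shows one way a team might compute it from a tracker export; the `created` field name and the timezone-aware ISO-8601 timestamps are assumptions made for the example.

```python
from datetime import datetime, timezone


def average_bug_age_days(open_bugs, now=None):
    """Mean age, in days, of currently open bugs.

    `open_bugs` is assumed to be a list of dicts with a timezone-aware
    ISO-8601 `created` timestamp (e.g. "2024-04-01T00:00:00+00:00"),
    such as an export from a bug tracker; the field name is an
    assumption made for this sketch.
    """
    now = now or datetime.now(timezone.utc)
    if not open_bugs:
        return 0.0
    ages = [
        (now - datetime.fromisoformat(bug["created"])).total_seconds() / 86400
        for bug in open_bugs
    ]
    return sum(ages) / len(ages)


# Two bugs opened 10 and 40 days before the reference date average out to
# 25 days, which sits inside the "Competent" band (less than 30 days).
bugs = [
    {"created": "2024-05-01T00:00:00+00:00"},
    {"created": "2024-04-01T00:00:00+00:00"},
]
print(average_bug_age_days(bugs, now=datetime(2024, 5, 11, tzinfo=timezone.utc)))  # 25.0
```
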
Code Quality and Tools
Chaos: Little usage of requirements coverage or code coverage. Inconsistent usage of static analysis tools (including linters and compiler warnings), and loosely followed coding standards. Code reviews are not done, or are done inconsistently.
Growing: Some coverage or static analysis data (including compiler warnings or linters) is used to improve testing and code quality. Check-in gates (e.g. CI) may use these tools to monitor code quality and to influence or reject check-ins. Coding standards exist and are followed in most cases. Code coverage is used to identify holes in test coverage.
Competent: Code analysis tools are used consistently and are typically part of check-in gates. There may be exceptions in place (e.g. legacy code, or for some rules), but the team generally values coverage and analysis tools and tweaks them frequently in order to improve code correctness. Additional code-correctness tools are investigated and adopted on a regular basis, and the team uses tools beyond those available in the check-in suites. Code coverage is used regularly to find test holes and to focus testing needs.
Optimizing: Coverage and code-correctness tools are used consistently to track potential testing holes and risk areas. The team frequently investigates and adds additional tools to developer-desktop and CI test suites. These tools run frequently and are constantly updated in order to provide relevant and actionable information.
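
A check-in gate of the kind mentioned at the Growing and Competent levels can be as small as a script that fails the build when overall coverage drops below a team-agreed floor. The sketch below assumes a Cobertura-style report (such as the coverage.xml produced by coverage.py's `coverage xml` command) and an 80% threshold; both are assumptions, not part of the model.

```python
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 80.0  # assumed team-agreed minimum line coverage, in percent


def main(report_path="coverage.xml"):
    # Cobertura-style reports expose overall line coverage as a
    # `line-rate` attribute (0.0 to 1.0) on the root element.
    root = ET.parse(report_path).getroot()
    covered = float(root.get("line-rate", 0.0)) * 100
    if covered < THRESHOLD:
        print(f"Coverage gate failed: {covered:.1f}% is below {THRESHOLD:.1f}%")
        return 1
    print(f"Coverage gate passed: {covered:.1f}%")
    return 0


if __name__ == "__main__":
    sys.exit(main(*sys.argv[1:]))
```
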
Customer Data Analysis (Analytics)
Chaos: Data from customers is either untracked or ignored. Forum and Twitter feedback is frequently considered to be sufficient.
Growing: Some data is tracked and may be used to understand quality or prioritization. Data may be discarded if it does not match intuition or anecdotal feedback. Analytics / instrumentation is usually added after the feature is complete.
Competent: The team has at least a handful of metrics in place that help them understand feature quality and customer experience. There may be holes in interpreting the data, but these are addressed quickly. Analytics / instrumentation is part of the requirements or the definition of done, and is almost always implemented as part of feature development.
Optimizing: Customer data drives most or all decisions about solution development.
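
Treating instrumentation as part of the definition of done usually means the feature code emits its own events as it is written. The sketch below is a minimal, assumed example: the event name, fields, and logging-based transport are invented for illustration and do not refer to any specific analytics product's API.

```python
import json
import logging
import time

log = logging.getLogger("analytics")


def track(event, **properties):
    # Minimal stand-in for an analytics client: emits one structured event
    # that a real pipeline or log shipper could pick up downstream.
    log.info(json.dumps({"event": event, "ts": time.time(), **properties}))


def export_report(rows, fmt="csv"):
    # Feature code instrumented while it is being written, so the usage and
    # duration metrics exist from the first release instead of being added
    # after the feature is "complete".
    start = time.perf_counter()
    # ... the actual export work would happen here ...
    track(
        "report_exported",
        format=fmt,
        row_count=len(rows),
        duration_ms=round((time.perf_counter() - start) * 1000, 2),
    )
```
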
Development Approach
Chaos: Ad hoc, or loosely structured. Emphasis on getting features out over quality. Complex features are developed in long-running branches and merged infrequently. "Done" is not defined, or is defined incompletely.
Growing: Some structure to development, usually a partial implementation of an agile framework (e.g. Scrum with sprints only, or Kanban without a WIP limit). "Done" definitions begin to be defined, but often have holes or are sometimes ignored. Development is mostly done in large batches (big check-ins).
Competent: Development processes and structure focus on quality and flow. Development is typically done in small batches, with some exceptions. A "Done" definition for each feature is vetted and completed as part of feature planning.
Optimizing: Feature development is organized and thoughtful, and delivered in small batches, vertical slices, or behind feature flags. Where possible (e.g. for services), most changes are released as experiments (A/B testing) in order to quickly understand business value and quality.
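
Releasing a small slice behind a flag and treating it as an experiment, as described at the Optimizing level, can be done with a deterministic assignment function. The sketch below is illustrative only: the flag name, the hash-based 50% split, and the checkout example are all assumptions rather than a specific feature-flag product.

```python
import hashlib


def in_experiment(flag_name: str, user_id: str, rollout_percent: int = 50) -> bool:
    # Deterministic variant assignment: hash the flag/user pair into the
    # range 0-99 so a given user always sees the same variant while the
    # flag is live. The 50% default split is an assumption for the sketch.
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < rollout_percent


def checkout(user_id: str, cart: list) -> str:
    # Hypothetical feature code: the new flow ships behind a flag as a small
    # slice, with the existing flow serving as the control group.
    if in_experiment("new_checkout_flow", user_id):
        return f"new flow handled {len(cart)} items"
    return f"legacy flow handled {len(cart)} items"
```
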
Learning & Improvement
Chaos: No organized efforts to learn and improve approaches, processes, techniques, or policies.
Growing: Some learning activities, typically root cause analysis, begin to occur, but are usually an individual rather than a team effort. The team may perform retrospectives, but improvement actions occur infrequently.
Competent: Retrospectives and root cause analysis are valued by most members of the team, and learning and improvement occur frequently.
Optimizing: The team recognizes the value of learning; it uses frequent and varied retrospectives, root cause analysis of customer issues, and other means to discover areas of improvement, and consistently strives to learn and improve from this information.
Leadership Emphasis
Chaos: No comments or direct support from leadership regarding quality. The values and actions leadership demonstrates do not include quality. Quality is seen as "over-engineering" by team leadership.
Growing: Team leadership communicates quality as a value or a quality vision, but reinforcement or demonstration of the vision or values is infrequent.
Competent: Leadership provides and reinforces a clear vision of the what and why of quality. Leader-defined principles are executed consistently at the team level.
Optimizing: Quality is internalized by the team to a level where leadership support is rarely necessary in order to reinforce quality approaches and practices.
