SOFTWARE
TESTING
FUNDAMENTALS
Agenda
Chapter 1: Software Development Life Cycles and Testing
Chapter 2: Software Testing Overview
Chapter 3: Test Requirements
Chapter 4: Test Design Techniques
Chapter 5: Software Error
CHAPTER 1
Software Development
Lifecycles (SDLC) and
Testing
CHAPTER 1
SDLC and Testing
Waterfall Model
Agile Model
Testing Phases and Milestones
DevOps and CI/CD
OBJECTIVES
An introduction to Common Software Development Life Cycle (SDLC)
practices and how software testing fits in those contexts.
Understanding testing phases, milestones, and checkpoints in an SDLC and
their purposes
Learn about how each SDLC choice affects software testing and its
implementation
Learn about DevOps and CI/CD
SDLC and Testing
1.1- SDLC and Testing
1.2- Waterfall Model
1.3- Agile Model
1.4- Testing Phases and Milestones
1.5- DevOps and CI/CD
SDLC and Testing
How testing is directly affected by the SDLC choice:
Documentation availability to test against
Time to test
Time to automate or produce effective automation
Knowledge and understanding of the application as we plan our tests
Amount of regression testing
SDLC and Testing
1.1- SDLC and Testing
1.2- Waterfall Model
1.3- Agile Model
1.4- Testing Phases and Milestones
1.5- DevOps and CI/CD
Waterfall Model
The Waterfall model is a sequential software development model. It moves through the following phases, each verified before the next:
Requirements Definition
Functional Design
Technical Design
Coding
Testing
Deployment
Transition between phases is done by a formal review. The review is a checkpoint to see that you are on the right track.
Testing in Waterfall Model
Testing is not inherent to every phase of the Waterfall model.
Constant testing during the design, implementation, and verification phases is required to validate the phases preceding them.
Testing phase: software testing begins only after the coding phase. Different testing methods are available to detect the bugs that were introduced during the previous phases, and a number of testing tools and methods are already available for testing purposes.
SDLC and Testing
1.1- SDLC and Testing
1.2- Waterfall Model
1.3- Agile Model
1.4- Testing Phases and Milestones
1.5- DevOps and CI/CD
Agile Model
Key principles of Agile:
Top priority is customer satisfaction
Welcome changing requirements, even late in the project
Deliver working software frequently
Business and development people should work together daily
Communicate face-to-face, whenever possible
Working software is the primary measure of progress
Teams constantly reflect on how to improve their processes and
methods
(From http://www.agilemanifesto.org/principles.html)
Agile Model
What is a User Story?
A user story is a story about how the system is supposed to solve a problem or support a business process.
A user story is a very high-level definition of a requirement, containing just enough information collected from the client so that the developers can implement it.
User stories are simple enough that people can understand them easily.
Each user story is written on a user card (collected directly from the client) and stored in full documents or in an ALM tool such as Rally, TFS, or IBM Rational.
Agile Model
Acceptance Criteria
Acceptance criteria are the set of conditions or business rules which the functionality or feature should satisfy and meet in order to be accepted by the Product Owner/Stakeholders.
Acceptance criteria define the boundaries of a user story, and are used to confirm when a story is completed and working as intended.
Testing in Agile Model
Delivery cycles are short. Development is very focused. Diagrams, use
cases, user stories, index cards, or discussions of functionality serve as
“documentation” to test against.
Agile projects usually have more unit testing by developers.
Dynamic nature of development needs structured regression testing.
Testing is often exploratory, but focused.
Dynamic-style projects include much change, but the change is discussed and side effects are noted.
SDLC and Testing
1.1- SDLC and Testing
1.2- Waterfall Model
1.3- Agile Model
1.4- Testing Phases and Milestones
1.5- DevOps and CI/CD
Phases & Milestones
Development Phase: a time block within the SDLC. There are a number of activities that have to be done in this time block.
Examples of some test phases: Unit Test, Integration Test, System Test, User Acceptance Test, Release Test, and Maintenance Test.
Development Milestone: a significant event in the software development process in which the software moves from one phase to another. The milestone lets us know where the application is in the SDLC.
Criteria: built or created from the standards of the project.
Testing Phases and Milestones
Pre-Alpha (Unit Test): performed by developers. Test one unit of code at a time. Outcome: fewer bugs. Goal: validate each module in isolation.
Alpha (Integration Test): performed by developers/(technical) testers. Iteratively more complex system components are integrated, checking that the units of code work together incrementally. Outcome: bug finding. Goals: find bugs, learn about the product, validate the integration requirements.
Beta (System Test): performed by testers. Big bang: the entire product as it is intended to be used. Outcome: bug finding/QC. Goals: find bugs; validate the system requirements, data flow, scenarios, load.
GM/Release (User Acceptance Test): performed by users. Requirements-based testing by the project sponsor, testing the production system. Outcome: QC. Goal: validate user requirements.
SDLC and Testing
1.1- SDLC and Testing
1.2- Waterfall Model
1.3- Agile Model
1.4- Testing Phases and Milestones
1.5- DevOps and CI/CD
DevOps Overview
DevOps = “software DEVelopment” and “OPerationS”
DevOps is a software engineering culture and practice that aims at
unifying software development (Dev) and software operation (Ops).
The main characteristic of the DevOps movement is to strongly
advocate automation and monitoring at all steps of software
construction, from integration, testing, releasing to deployment and
infrastructure management.
DevOps aims at shorter development cycles,
increased deployment frequency, and more
dependable releases, in close alignment with
business objectives.
Life Before DevOps
DevOps Goals
Improved deployment frequency;
Faster time to market;
Lower failure rate of new releases;
Shortened lead time between fixes;
Faster mean time to recovery (in the event of a new release crashing or
otherwise disabling the current system).
Continuous Integration
Continuous Integration is a key aspect of rapid development:
Build often
Automated builds
Automated unit tests re-run
Automated build acceptance (smoke) testing
Automated functional/regression testing
CI / CD
CI / CD / CD
Continuous Integration: Developers practicing continuous integration
merge their changes back to the main branch as often as possible.
Continuous Delivery: Continuous Delivery is an extension of Continuous Integration that makes sure new changes can be deployed to a testing/staging environment and run through automated tests quickly, in a sustainable way.
Continuous Deployment: Continuous Deployment goes one step
further than Continuous Delivery. Every change that passes all stages of
the production pipeline is released to customers. There's no human
intervention.
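To make the distinction concrete, here is a minimal sketch (illustrative only; the stage names and functions are our own invention, not from any specific CI tool): the only difference between the two "CD"s is whether the final release step is automatic.

```python
# Minimal sketch: continuous delivery vs. continuous deployment.

def run_pipeline(change, auto_release=False):
    stages = ["build", "unit tests", "smoke tests", "regression tests",
              "deploy to staging"]
    for stage in stages:
        # In a real pipeline each stage can fail and stop the run;
        # here we assume every stage passes.
        print(f"{stage}: OK for {change}")
    if auto_release:
        # Continuous Deployment: every passing change goes to production
        print(f"deploy to production: {change} released automatically")
    else:
        # Continuous Delivery: the change is deployable, released on demand
        print(f"{change} is ready to release at the push of a button")

run_pipeline("change-42")                     # Continuous Delivery
run_pipeline("change-43", auto_release=True)  # Continuous Deployment
```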
CI/CD
Continuous Integration – Continuous Delivery – Continuous Deployment
SUMMARY
Depending on the type of application to be built and/or the customer's requirements, we choose a suitable SDLC model.
There are several phases in the SDLC. In each phase, there are milestones and criteria for each milestone. The criteria, milestones and phases of the project help the product to be developed correctly, on time, and effectively.
Depending on the SDLC, testing and its implementation are affected, usually in available time, knowledge about the application, documentation, and the amount of regression testing done.
DevOps is a software engineering culture and practice, aiming at
shorter development cycles, increased deployment frequency, and
more dependable releases, in close alignment with business
objectives.
CHAPTER 2
Software Testing
Overview
CHAPTER 2
Testing Group
Objectives of Testing
Test Coverage
Test Planning Framework
Manual Testing vs. Automated Testing
Key Lessons-Learned
OBJECTIVES
An introduction to the concept of quality assurance (QA), quality control
(QC), and software testing
The roles of software testing groups
Objectives of testing and some testing issues
An introduction to Manual Testing and Automated Testing
An introduction to LogiGear© Software Test Planning Framework and
possible test strategies, approaches, methods, types and techniques
used in various software application development phases, for specific
quality objectives
An Overview of Testing
2.1- Testing Group
2.2- Objectives of Testing
2.3- Test Coverage
2.4- Test Planning Framework
2.5- Manual Testing vs. Automated Testing
2.6- Key Lessons-Learned
Testing, QA, QC
Quality Assurance: A process for providing adequate assurance that the
software products and processes in the product life cycle conform to
their specific requirements and adhere to their established plans.
Quality Control: A set of activities designed to evaluate a developed
working product.
Testing: The process of executing a system with the intent of finding
defects including test planning prior to the execution of the test cases.
Quality Assurance is interested in the process, whereas Testing and Quality Control are interested in the product.
Roles of the Testing Group
The Testing Group provides important technical services, including:
Finding and reporting bugs.
Identifying weak areas of the program.
Identifying high risk areas of a project.
Explaining findings in ways that help:
Engineering solve or fix the problems
The customer service staff help customers
Management make sound business decisions about each bug
Typical Responsibilities of a Tester
Find Problems
Find bugs.
Find design issues.
Find more efficient ways to find bugs.
Communicate Problems
Report bugs and design issues.
Report on testing progress.
Evaluate and report the program’s stability.
More Senior Testers Manage/Supervise Testing Projects
Prepare test plans and schedules.
Estimate testing tasks, resources, time and budget.
Measure and report testing progress against milestones.
Teach other testers to find bugs.
An Overview of Testing
2.1- Testing Group
2.2- Objectives of Testing
2.3- Test Coverage
2.4- Test Planning Framework
2.5- Manual Testing vs. Automated Testing
2.6- Key Lessons-Learned
Example Test Series
The program’s specification:
This program is designed to add two numbers, which you will enter
Each number should be one or two digits
Equivalence Class & Boundary Analysis
There are 199 x 199 = 39,601 test cases for valid values:
– definitely valid: 0 to 99
– might be valid: -99 to -1
There are infinitely many invalid cases:
– 100 and above
– -100 and below
– anything non-numeric
Equivalence Class & Boundary Analysis
Equivalence Class: two tests belong to the same equivalence class if the expected result of each is the same. Executing multiple tests of the same equivalence class is, by definition, redundant testing.
Boundaries: these mark the point or zone of transition from one equivalence class to another. The program is more susceptible to errors at the boundary conditions, so boundary values are powerful tests to use within the equivalence class.
Generally, each class is partitioned by boundary values. Nevertheless, not all equivalence classes have boundaries. For example, browser equivalence classes might consist of Chrome and Firefox.
Test Design
For each equivalence class partition, we'll have at most 9 tests to execute:
1. A valid input from within the partition
2. LB+1
3. The lower boundary (LB)
4. LB-1
5. UB+1
6. The upper boundary (UB)
7. UB-1
8. The smallest possible value allowed via the UI
9. The largest possible value allowed via the UI
It is essential to understand that each identified equivalence class represents a specific risk.
Test Design
General Steps
Identify the classes
Identify the boundaries
Identify the expected output(s) for valid input(s)
Identify the expected error-handling for invalid inputs
Generate a table of tests (a maximum of 9 tests for each partition of each class).
Test Design
Example:
Valid values: 1 to 9999 (i.e., any 1- to 4-digit value)
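As a sketch of how the 9 tests fall out of this example (the function and names below are our own, assuming the valid partition 1 to 9999 with the UI limits coinciding with the boundaries):

```python
# Minimal sketch: enumerating the at-most-9 boundary-analysis tests
# for one valid partition, here the range 1..9999 from the example.

def boundary_tests(lb, ub):
    """Return the candidate inputs for one partition: a mid-range
    valid value, both boundaries, and the values just inside and
    just outside them."""
    return {
        "valid value inside the partition": (lb + ub) // 2,
        "LB+1": lb + 1,
        "lower boundary (LB)": lb,
        "LB-1 (just invalid)": lb - 1,
        "UB+1 (just invalid)": ub + 1,
        "upper boundary (UB)": ub,
        "UB-1": ub - 1,
    }

for name, value in boundary_tests(1, 9999).items():
    print(f"{name}: {value}")
```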
Objectives of Testing
Your objective should not be to verify that the program works
correctly, because:
If you can’t test the program completely, you can’t verify that it works
correctly.
The program doesn’t work correctly, so you can’t verify that it does.
If your goal is to verify correctness, then as a tester you fail to reach your goal every time you find an error, and you'll be more likely to miss problems than if you want and expect the program to fail.
Objectives of Testing
The Objective of Testing a Program is to Find Problems.
Finding problems is the core of your work. You should want to find as many problems as possible. The more serious the problems a tester finds, the better the tester is.
A test that reveals a problem is a success. A test that did not reveal a
problem is (often) a waste of time!
Objectives of Testing
The Purpose of Finding Problems is to Get Them Fixed.
The point of the exercise is quality improvement!
The best tester is not the one who finds the most bugs or who
embarrasses the most programmers.
The best tester is the one who gets the most bugs fixed.
Objectives of Testing
We cannot test everything, thus,
We want to focus and use our testing time in finding as many bugs as
possible through good test design
There is no such thing as bug-free software
We want to cover as many areas of the application as possible within
the time constraint.
We want to find the most serious bugs as quickly as possible.
We want to be very familiar with the test design technique called equivalence class and boundary condition analysis. It is an effective way to find bugs quickly.
An Overview of Testing
2.1- Testing Group
2.2- Objectives of Testing
2.3- Test Coverage
2.4- Test Planning Framework
2.5- Manual Testing vs. Automated Testing
2.6- Key Lessons-Learned
Test Coverage
We know:
Complete Testing is impossible.
There are too many tests to run.
We have limited time.
So, what do we test? How deep? Where? With what data? On what
platforms? …
Test Coverage
Coverage is a measure of the amount of testing done of a certain type. Since testing is done to find bugs, coverage is a measure of your effort to detect a certain class of potential errors.
So what kind(s) and level(s) of coverage are considered appropriate?
There is no magic answer.
Test Coverage
Basic or simplistic coverage is the percentage of the product "tested", based on some aspect of the project:
Coverage = (# of tests executed) / (total # of tests)
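For example (illustrative numbers): if 150 of 200 planned test cases have been executed, coverage is 150 / 200 = 75%.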
Test Coverage
Let's look at possible coverage metrics, or measurements of how much is tested and how much is not.
External: functionality requirements, functionality, screens, windows, use cases, forms, user scenarios…
Internal: modules, lines of code, number of APIs, # of parameters of each API…
Quantity of testing for each: test cases, data, platforms
Common Coverage Measures
Most Common for test team:
Requirements or User Story coverage (Acceptance Criteria)
Functional Coverage
Test Case Coverage
Method / Code coverage (sometimes)
An Overview of Testing
2.1- Testing Group
2.2- Objectives of Testing
2.3- Test Coverage
2.4- Test Planning Framework
2.5- Manual Testing vs. Automated Testing
2.6- Key Lessons-Learned
Testing Paradigm Definition - Glass-box
“In glass box testing, the tester (usually the programmer) uses his/her
knowledge of the source code to create test cases.” - Hung Nguyen
“Test design based on the knowledge of the code to exercise the code.” - Doug Hoffman
Example:
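The original example did not survive extraction, so here is a minimal substitute sketch (our own code, not from the course): the tests are derived by reading the branches of the source, one test per branch.

```python
# Minimal sketch of glass-box test design: the tester reads the source
# of `classify` and writes one test per branch.

def classify(n):
    """Code under test -- its branches are visible to the tester."""
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"

assert classify(-5) == "negative"  # exercises the n < 0 branch
assert classify(0) == "zero"       # exercises the n == 0 branch
assert classify(7) == "positive"   # exercises the fall-through branch
print("all branch tests passed")
```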
Testing Paradigm Definition - Black-box
“Black box tests execute the running program, without reference to the
underlying code. This is testing from the customer’s view rather than from the
programmer’s.” - Hung Nguyen
“Focus on input, outputs, and an “externally derived” theory of operation.” - Cem
Kaner
Example:
Step 1: Navigate to Gmail. Expected result: Gmail login page is displayed.
Step 2: Enter a valid email into the ‘Email or phone’ textbox (e.g., [email protected]).
Step 3: Click on the Next button.
Step 4: Enter a valid password into the ‘Enter your password’ textbox (e.g., 123456).
Step 5: Click on the Next button. Expected result: user can log in successfully.
Testing Paradigm Definition - Gray-Box
“Gray-box testing consists of methods and tools derived from the knowledge of
the application internals and the environment with which it interacts, that can be
applied in black-box testing to enhance testing productivity, bug finding and bug
analyzing efficiency.” - Hung Nguyen
“Testing involving inputs, outputs, but test design is educated by information
about the code and the program operation of a kind that would normally be out of
scope of the view of the tester.” - Cem Kaner
Example:
Step 1: Navigate to Gmail. Expected result: Gmail login page is displayed.
Step 2: Enter an HTML command into the ‘Email or phone’ textbox (e.g., </BODY></HTML>).
Step 3: Click on the Next button. Expected result: an error message is displayed to inform the user about the invalid email or phone.
Testing Effectiveness
Testing effectiveness draws on the overlap of IT knowledge, testing knowledge, domain knowledge, and technical knowledge.
Test Planning Framework Diagram
Test Strategy:
Determine Quality Objectives
Select Test Approaches
Select Test Types (informed by Bug Types, Targeted Interface, and Application Stability Requirements)
Determine Execution Test Phases
Design Methods/Techniques
Apply Test
Select Test Tools
Then write the Test Plan and write the Test Cases.
Test Strategy
Definition: A plan that starts with a clear understanding of the core objective of testing, from which we derive a structure for testing by selecting from the many testing styles and approaches available to help us meet our objectives.
Goals for testing usually fall into 2 categories: Bug finding or validation.
Once you have settled upon the goal for the test project, you can
strategize how you will accomplish that goal.
Quality Objectives
Definition: A quality objective is a concrete requirement that specifies the type of quality objective and the level of acceptability for each.
Example: Functionality, Usability, Security, Load, Stress, Performance,
Recovery…
Test Approaches
Definition: A testing philosophy or style that drives the selection of
certain test design methods.
Examples: Requirement-based testing, Exploratory testing, Scenario-
based testing, Attack-based testing…
(…will be discussed more in a later section)
Test Types
Definition: A category of test activities with the objective of exposing specific types of bugs at a certain interface, applicable at various levels of product maturity.
The selection of the test types must satisfy the test or quality
objectives.
Examples: Requirement Review, Design Review, Code Review, Code
Walkthrough, Design Walkthrough, Unit, Module, API, Usability,
Configuration, Compatibility, I18N, L10N, Regression, Performance,
Load, Stress, Failover/ Recovery, Installation, Security, Data
Verification, Compliance…
(…will be discussed more in a later section)
Bug Types
Definition: Bugs or software errors are often grouped into categories
or types so we can condense a huge list of issues into a conceptually
manageable group of ideas. Any approach to categorizing bugs is
arbitrary.
Examples: User Interface, Error Handling, Boundary-Related,
Calculation, Initial and Later States, Control Flow, Handling or
Interpreting Data, Race Conditions, Load Conditions, Hardware/
Environment Compatibility, Source, Version, and ID Control,
Documentation…
(…will be discussed more in a later section)
Targeted Interface
Definition: A boundary across which two independent systems meet and act on or communicate with each other. In computer technology, there are several types of interfaces.
Examples: Source-level, Calling Functions and Classes, API, GUI,
CLI…
Application Stability Requirements
Definition: The state or quality of being stable of a software
application.
In software development, a stable application or functionality is one that is reliable and dependable, and will consistently produce the same expected outputs given the same set of inputs and conditions.
Example: a software application that has passed the Alpha testing phase is more stable than it was at the beginning of the Alpha testing phase.
Test Phases
Definition: A test phase is a time block often derived from the
SDLC phases.
Within each SDLC phase, there might be more than one test phase.
Each test phase has different objectives related to bug finding or
validation, described in each section.
Examples: Pre-coding, Coding, Pre-integration, System, Integration, Final, Beta / User Acceptance, Post, Production…
Test Design Methods/Techniques
Definition: A test design method is a systematic procedure in which
you create tests to be executed. A test type or test approach may
use one or more test design methods.
Examples: Boundary condition, Equivalence class analysis, Constraint analysis, State transition, Condition combination…
(…will be discussed more in a later section)
Test Tools
Some test tools that can be used in a test project:
Project Management Tools
Test Case Management Tools
Bug Management Tools
Automation Tools: functional testing, load and performance testing, unit test, IoT, API, security…
Test Plan
Definition: A management document outlining risks, priorities, and
schedules for testing. A document prescribing the approach to be
taken for intended testing activities.
Test Case
Definition: A specific set of test data and associated procedures developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement (IEEE 729-1983)
Definition: A test that (ideally) executes a single well-defined test objective (i.e., a specific behavior of a feature under a specific condition) (Testing Computer Software – Kaner, Falk, Nguyen)
(…will be discussed more in a later section)
An Overview of Testing
2.1- Testing Group
2.2- Objectives of Testing
2.3- Test Coverage
2.4- Test Planning Framework
2.5- Manual Testing vs. Automated Testing
2.6- Key Lessons-Learned
Manual & Automated Testing
Manual Testing
Performed by carrying out all the actions manually against the application under test (AUT), step by step, revealing whether a particular step completed successfully or failed.
Manual testing is always a part of any testing effort. It is especially useful in the initial phase of software development, when the software and its user interface are not yet stable enough for test automation to make sense.
Manual & Automated Testing
Automated Testing
Using software to control the execution of tests and the comparison of actual outcomes with the expected (specified) outcomes.
Setting up test preconditions, and other test control and test reporting functions.
Commonly, test automation involves automating a manual process already in place that uses a formalized testing process.
An Overview of Testing
2.1- Testing Group
2.2- Objectives of Testing
2.3- Test Coverage
2.4- Test Planning Framework
2.5- Manual Testing vs. Automated Testing
2.6- Key Lessons-Learned
Key Lessons-Learned
The Program Doesn’t Work. Your task is to find errors.
Be Methodical
Complete Testing is Impossible
Use Powerful Representatives of Tests
Communication Must be Effective
Change is Normal
The Program Doesn’t Work
All programs have bugs. Nobody would pay you to test if their program
didn’t have bugs.
Any change to a program can cause new bugs, and any aspect of the
program can be broken.
You DON’T “verify that the program is working.” You FIND bugs.
If you set your mind to show that a program works correctly, you’ll be
more likely to miss problems than if you want and expect the
program to fail.
Be Methodical
Testing isn’t just banging away at the keyboard.
To have any hope of doing an efficient job, you must be methodical:
Break the testing project down into tasks and subtasks so that you
have a good idea of what has to be done and what has been done.
Track what has been done so that people don’t repeat the same
tasks and you know what tasks are left.
Prioritize the tasks.
Complete Testing is Impossible
There are a nearly infinite number of paths through any non-trivial
program.
There are a virtually infinite set of combinations of data that you can
feed the program.
You can’t test them all.
Therefore, your task is to find bugs -- not to find all the bugs.
You want to:
Find as many bugs as possible
Find the most serious bugs
Find bugs as early as possible
Use Powerful Representatives of Tests
Develop powerful tests by analyzing
Equivalence classes
Boundary conditions
Input combinations
Data/functionality relationships
…
Communication Must be Effective
Your bug reports advise people of difficult situations. The problems that you report can affect the project schedule and the company's cash flow.
The clearer your reports are, the more likely it will be that the company
will make reasonable business decisions based on them.
Persuasive and technical writing, oral argument, face-to-face
negotiation, and diplomacy are core skills for your job.
Change is Normal
Project requirements change, as do design and specifications.
The market changes.
Development groups come to understand the product more thoroughly
as it is being built.
Some companies accept late changes into their products as a matter of
course.
Take steps to make yourself more effective in dealing with late changes.
SUMMARY
The Testing Group includes QA, QC, and Testing staff, who perform very important services such as finding and reporting bugs, communicating problems, and evaluating product quality.
The objective of testing is to find as many good bugs as possible.
For testing software, LogiGear uses a test planning framework. This
framework includes several parts: test strategies, approaches, test
plans, test methods, test cases…
CHAPTER 3
Test Requirement
CHAPTER 3
Product’s Documents
What is Test Requirement (TR)?
Requirement Attributes
OBJECTIVES
An introduction to requirement, specification and design
An introduction to test requirements and how to use test requirement to
implement test case
Test Requirement
3.1- Product’s Documents
3.2- What is Test Requirement (TR)?
3.3- Requirement Attributes
Product’s Documents
(User) Requirements – Statements that identify attributes, capabilities,
characteristics, or qualities of a system. This is the foundation for what
will be or has been implemented. This can be written as User Story or
Use Case.
Architecture/Design/Specification – Overview of the software, including
relations to an environment and construction principles to be used in
design of software components.
Technical documents – Documentation of code, algorithms, interfaces, APIs…
End-user – Manuals for end-users, system administrators, and support staff.
Test Requirement
3.1- Product’s Documents
3.2- What is Test Requirement (TR)?
3.3- Requirement Attributes
Test Requirement
A statement of what should be tested in the application under test (AUT).
A Test Requirement is created based on product documents. The tester uses Test Requirements to create Test Cases.
Functional Requirement: the requirements for the functions that the application should perform.
Non-functional Requirement: the requirements for the properties that the functions should have or how they should look. There are 3 types of non-functional TRs:
Look & Feel
Boundary
Negative
Test Requirement
Suggested syntax:
Function + Operating Condition
Examples:
Functional: User can login successfully with valid username and
password.
Look & Feel: Login button is highlighted when hovering mouse over it.
Boundary: Username must have at least 5 characters.
Negative: Error message displays when user tries to login with
blank username.
Test Requirement
3.1- Product’s Documents
3.2- What is Test Requirement (TR)?
3.3- Requirement Attributes
Test Requirement Attributes
Written
Prioritized
Specific
Clear/Unambiguous
Complete
Concise
Correct
Verifiable (key attribute)
Feasible
In understandable language
Consistent
(From Barry Boehm, Founding Director of the Center for Systems and
Software Engineering at the University of Southern California)
SUMMARY
There are 4 types of Product Documents: (User) Requirements,
Architecture/Design/Specification, Technical documents, and End-User
documents.
A Test Requirement is created based on product documents. The tester uses Test Requirements to create Test Cases.
There are 4 types of Test Requirements:
Functional
Look & Feel
Boundary
Negative
CHAPTER 4
Test Methods and
Test Case Design
Techniques
CHAPTER 4
Some Common Test Approaches and Test Types
Smoke Test
Requirement-based Testing
Exploratory Testing
Regression Testing
Test Case
Test case criteria
Test case essentials
Test case syntax
Some Test Case Design Techniques
Equivalent Partitioning & Boundary Analysis
Condition Combination
OBJECTIVES
An introduction to commonly used test approaches and how to
apply them to test an application
An overview about Test Case
Some techniques to design Test Cases
Test Methods & Test Case Design Techniques
4.1- Some Common Test Approaches and Test Types
Smoke Test
Requirement-based Testing
Exploratory Testing
Regression Testing
4.2- Test Case
Test case criteria
Test case essentials
Test case syntax
4.3- Some Test Case Design Techniques
Equivalent Partitioning
Boundary Analysis
Condition Combination
Smoke Test
Smoke Test - also known as Build Verification Test - is a group of
written Test Cases that are performed on a system prior to being
accepted for further testing.
This is a "shallow and wide" approach to the application, to make sure
that all major functions work under “normal” conditions. The tester
"touches" all areas of the application without getting too deep. No need
to get down to field validation or business flows.
If the Smoke Test fails, the application is so badly broken that it is not effective to continue testing it.
These written tests can either be performed manually or using an
automated tool.
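A minimal sketch of what such written smoke checks might look like when automated (the two application functions below are hypothetical stand-ins for the AUT, not from the course):

```python
# Minimal smoke-suite sketch: shallow, wide checks that the major
# areas respond under normal conditions -- no field validation,
# no business flows.

def app_is_running():        # hypothetical stand-in for the AUT
    return True

def main_pages():            # hypothetical stand-in: area -> HTTP status
    return {"home": 200, "search": 200, "checkout": 200}

def smoke_test():
    assert app_is_running(), "application failed to start"
    for page, status in main_pages().items():
        # "Touch" each major area without going deep
        assert status == 200, f"{page} is broken -- stop further testing"
    print("smoke test passed: build accepted for further testing")

smoke_test()
```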
Requirement-based Testing
Requirement-based Testing is:
Testing the Requirements, User Stories, Acceptance Criteria, Use
Cases…
Testing the Architecture, Design, Specifications…
Testing the product against the technical documents
The primary goal of Requirement-based Testing is to validate
requirements. It is not experimental, behavioral, exploratory or bug
finding. It validates.
The word “requirement” is used as a catch-all word for any documentation generated by another person or team for testers to validate.
Requirement-based Testing
The “mechanical approach” for test development:
Analyze requirements
Gather more information
Make at least a test case for every requirement
Execute the test cases
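A minimal sketch (with hypothetical requirement names) of the "make at least a test case for every requirement" step, tracked as a simple traceability map:

```python
# Minimal sketch: any requirement with no test case is a coverage gap.
requirements = ["REQ-1: user can log in",
                "REQ-2: user can reset password"]

test_cases = {
    "REQ-1: user can log in": ["TC_LO01"],
    "REQ-2: user can reset password": [],   # nothing written yet
}

gaps = [req for req in requirements if not test_cases.get(req)]
print("requirements still missing test cases:", gaps)
```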
Exploratory Testing
Exploratory Testing can have many names: Ad hoc, Discovery, Error
Guessing…
The primary goal of exploratory testing is to uncover new defects in the
product on-the-fly.
The task is finding ways to creatively design experiments, exploring,
experimenting, going from a known place to an unknown place.
Exploratory Testing
Exploratory Testing is not requirement-driven. The goal is to probe for
weak areas of the program.
Some error conditions that should be checked are:
Out of boundary
Null input and overflow (storage of the sum)
Non-numeric
Hardware and memory error handling
…
Exploratory Testing is the testing approach that can produce the
highest number of bugs.
Regression Testing
Regression Testing: The testing that is performed after making a
functional improvement or repair to the program.
Regression purpose:
Re-run the test cases to verify that the modification does not
cause any unintended effects and the system still complies with its
specified requirements.
Verify the fixed bugs to make sure that the bugs marked as fixed
by developers are really fixed.
Regression Testing
Some of the most popular strategies for selecting regression test
suites:
Re-test all: Re-run all test cases. Simple but takes time.
Re-test risky Use Cases: Choose baseline tests to re-run by risk
heuristics.
Re-test by profile: Choose baseline tests to re-run by allocating
time in proportion to operational profile.
Re-test changed segment: Choose baseline tests to re-run by
comparing code changes.
Re-running test cases can be done using automated testing tools.
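A minimal sketch (the test-to-module mapping is hypothetical) of the "re-test changed segment" strategy from the list above:

```python
# Minimal sketch: select baseline tests to re-run by comparing the
# modules a code change touched against the modules each test covers.
changed_modules = {"auth", "billing"}

baseline_tests = {
    "test_login":   {"auth"},
    "test_invoice": {"billing", "reports"},
    "test_search":  {"search"},
}

to_rerun = [name for name, modules in baseline_tests.items()
            if modules & changed_modules]
print("regression suite for this change:", to_rerun)
```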
Common Types of Black Box Testing
Acceptance (into-testing) Test
Feature-Level Test
User Interface Test
Install / Uninstall Test
Import / Export Test
Configuration / Compatibility Test
Utilities, Tool Kits & Collateral Test
Documentation & Help Test
Usability Test
Performance Test
Regression Test
Beta Test
User Acceptance Test
Test Methods & Test Case Design Techniques
4.1- Some Common Test Approaches and Test Types
Smoke Test
Requirement-based Testing
Exploratory Testing
Regression Testing
4.2- Test Case
Test case criteria
Test case essentials
Test case syntax
4.3- Some Test Case Design Techniques
Equivalent Partitioning
Boundary Analysis
Condition Combination
Test Case
Definition: A test that (ideally) executes a single well-defined test objective (i.e., a specific behavior of a feature under a specific condition) (Testing Computer Software – Kaner, Falk, Nguyen)
Definition: A specific set of test data and associated procedures
developed for a particular objective, such as to exercise a particular
program path or to verify compliance with a specific requirement (IEEE
729-1983)
Why write Test Cases?
Accountability
Reproducibility
Tracking
Verifying that tests are being executed correctly
Measuring test coverage
Finding bugs
Automation
Used as a training tool for new testers
For compliance
Test Case Essentials
Things usually to include in a Test Case:
Test Case ID
Test Description/ Purpose/ Objective/ Title
Pre-condition (system requirements: printer, system clock, environment
conditions…)
Steps/ Procedures
Script
Parameters (input, output, default, options…)
Expected Result
Observed / Actual Result
Test Result: Passed / Failed / Blocked / Skipped
Bug ID
Notes / Comments
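As a sketch, the same essentials could be captured as a record in code (the field names mirror the list above; this is our illustration, not a prescribed format):

```python
# Minimal sketch of a test case record with the usual essentials.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    tc_id: str                     # Test Case ID
    title: str                     # Description / Purpose / Objective
    precondition: str = ""         # system requirements, environment...
    steps: list = field(default_factory=list)
    parameters: dict = field(default_factory=dict)
    expected_result: str = ""
    actual_result: str = ""
    result: str = "Skipped"        # Passed / Failed / Blocked / Skipped
    bug_id: str = ""
    notes: str = ""

tc = TestCase(tc_id="TC_LO01",
              title="Verify user can login with valid credentials")
print(tc)
```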
Test Case Description
The most important part of a Test Case is the one-line title, describing
the objective of the test
That title can be called: Test Description, Test Title, Test Name, Test
Case, Test Objective, Test Goal, Test Purpose…
It is most important because:
It gives the reader a description and idea of the test
A good test name makes review easier
Easier to pass to another person
Easier to pass to automation team
In many cases, it may be the only part of a test case documented
Test Case Description
Test Objective suggested syntax:
Action + Function + Operating Condition
In which:
Function may be function, feature, validation point…
Condition may be data…
Action: Verify, Test, Validate, Prove, Execute, Print, Calculate, Run… any action verb
Test Steps
Test Steps describe the execution steps.
Starting from opening the application, the Test Steps guide the tester through a sequence of steps to validate the check-point.
Test Steps should be precise, with no missing or redundant steps.
In general, we avoid using too-specific data in the Test Steps.
However, special data used for special cases, if any, should have
specific examples.
Validation Points
Tests need validation points.
This is the Expected Result. It can also be stated in the test objectives.
Do not rely on “seeing” that a test passes or fails. Write it.
Many times it is easier to define the test once you clearly state what
behavior, result or point you are attempting to validate.
Example
TC ID: TC_LO01
Description: Verify that user can login successfully with valid
username and password.
Steps:
1. Navigate to the web application
2. Enter valid username
3. Enter valid password
4. Click on ‘Login’ button
Expected Result: User can log in successfully. The welcome page displays with the text "Welcome <username>".
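A minimal sketch of TC_LO01 as an automated test (the `login` function is a hypothetical stand-in for the application under test, not part of the course material):

```python
# Minimal sketch: the manual test case above expressed as a unit test.
import unittest

def login(username, password):
    """Hypothetical stand-in for the AUT: returns the page text
    shown after a login attempt."""
    if username == "testuser" and password == "s3cret":
        return f"Welcome {username}"
    return "Invalid username or password"

class TestLogin(unittest.TestCase):
    def test_valid_login_shows_welcome_page(self):
        # Steps 1-4: navigate, enter valid username/password, click Login
        result = login("testuser", "s3cret")
        # Expected Result: welcome page displays "Welcome <username>"
        self.assertEqual(result, "Welcome testuser")

if __name__ == "__main__":
    unittest.main()
```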
Test Methods & Test Case Design Techniques
4.1- Some Common Test Approaches and Test Types
Smoke Test
Requirement-based Testing
Exploratory Testing
Regression Testing
4.2- Test Case
Test case criteria
Test case essentials
Test case syntax
4.3- Some Test Case Design Techniques
Equivalent Partitioning
Boundary Analysis
Condition Combination
Equivalence Class and Boundary Analysis
For each equivalence class partition, we'll have at most 9 test cases to execute:
1. A valid input from within the partition
2. LB+1
3. The lower boundary (LB)
4. LB-1
5. UB+1
6. The upper boundary (UB)
7. UB-1
8. The smallest possible value allowed via the UI
9. The largest possible value allowed via the UI
It is essential to understand that each identified equivalence class represents a specific risk.
Equivalence Class Example
Ranges of numbers (such as all the numbers between 10 and 99)
Membership in groups (dates, time, country names, etc.)
Invalid inputs
Equivalent output events (variation of inputs that produces the same outputs)
Equivalent operating environments
The number of times you’ve done something
The number of records in the database (or how many other equivalent objects)
Equivalent sums or other arithmetic results
Equivalent number of items entered (such as the number of characters entered
into a field)
Equivalent space (on a page or on a screen) required
Equivalent amounts of memory, disk space, or other resources available to the
program
Equivalence Class and Boundary Analysis
Other Equivalence Classes and Special Cases
Negative values
Number of digits or characters
0
Non-printable characters
Upper ASCII (128-254)
O/S reserved characters
Space
Nothing
etc.
See Input Dialog Test Matrix
Character Grouping
Low ASCII These are the Ctrl keys. Interesting ones include 000-null, 007-beep, 008-BS,
009-Tab, 010-LF, 011-VT/home, 012-FF, 013CR, 026-EOF (end of file, very
nasty sometimes), 027-ESC
Non-a lpha nume ric, Low non-alphanumeric (ASCII codes 32-47) space ! “ # $ % & ‘ ( ) * + , - . /
sta nda rd, printing ASCII
cha ra cte rs. W e ofte n Intermediate non-alphanumeric (ASCII 58-64) : ; < = > ? @
lump the se toge the r More intermediates (ASCII 91-96) [ \ ] ^ _ `
e ve n though the y’re in
four distinct groups. Top of standard ASCII (ASCII 123-126) { | } ~
Digits (ASCII 48-57) 0 1 2 3 4 5 6 7 8 9
Uppe r a nd low e r ca se (ASCII 65-90) A B C D E F G H I J K L M N O P Q R S T U V W XY Z
a lpha
(ASCII 97-122) a b c d e f g h i j k l m n o p q r s t u vw x y z
Uppe r ASCII These subdivide further depending on the character set, the
(ASCII 128-254)
user’s language, and the application.
Modifie r ke ys
These keys include (depending on the keyboard) Alt, Right-Alt, Shift, Control,
Command, Option, Left-Amiga, Right-Amiga, Open-Apple, etc. They generally
have no effect when pressed alone, but when pressed in conjunction with
some other key, they create a
It often pays to test all the “interesting” standard values, plus a sample of
others.
It is often best to assign a separate chart column to each modifier key, i.e.
one column for Ctrl, one for Shift, etc.
Function ke ys Test them alone and in combination with the modifier keys.
Cursor ke ys Test them alone and in combination with the modifier keys. It’s common for
every modifier key to have a different effect on cursor keys.
Nume ric ke ypa d ke ys These are not necessarily equivalent to number keys elsewhere on the
keyboard.
Europe a n ke yboa rds The left and right Alt keys often have different effects on non-English
keyboards. Also, these keyboards provide deal keys -- you press a dead key
to specify an accent, then press a letter key and (if it’s a valid character) the
Sillicon Valley
computer displays Testing Expertise
that let logigear.com 119
Equivalence Class and Boundary Analysis
How to input data into a text-box?
Enter / type
Copy & paste
Drag & drop
Virtual keyboard
Voice
…
Test Methods & Test Case Design Techniques
Some Common Test Approaches and Test Types
Smoke Test
Requirement-based Testing
Exploratory Testing
Regression Testing
Test Case
Test case criteria
Test case essentials
Test case syntax
Some Test Case Design Techniques
Equivalent Partitioning
Boundary Analysis
Condition Combination
Condition Combination
Condition Combination
Involves an analysis of the combination relationship of the variables.
Each combination represents a condition to be tested with the same
test script or procedure.
General Steps
Identify the variables.
Identify the number of possible unique values of each variable.
Create a table illustrating the complete unique combination of
conditions formed by the variables and their values.
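A minimal sketch (the variables are hypothetical) of those steps, using itertools.product to build the combination table:

```python
# Minimal sketch: every unique combination of variable values is one
# condition to test with the same test script or procedure.
from itertools import product

# Steps 1-2: identify the variables and their possible unique values
variables = {
    "browser":   ["Chrome", "Firefox"],
    "user_role": ["admin", "guest"],
    "language":  ["en", "vi"],
}

# Step 3: the complete table of unique combinations (2 x 2 x 2 = 8 rows)
for row, combo in enumerate(product(*variables.values()), start=1):
    print(row, dict(zip(variables.keys(), combo)))
```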
Combination Test Case Design
SUMMARY
There are many methods to test an application. The best practice is a
combination of a few methods to satisfy the project needs.
Test cases are written to test an application. Some of the best techniques to design test cases are:
Equivalent Partitioning
Boundary Analysis
Constraint Analysis
State Transition
Condition Combination
CHAPTER 5
Software Error
CHAPTER 5
What is a Software Error?
18 Common Types of Software Errors
Finding, Reproducing and Analyzing a Software Error
Reporting a Software Error
A Common Bug Life
OBJECTIVES
An introduction to Software Error and some common types of
software errors
Learn to find and analyze bugs, and write bug reports
An introduction to a common bug life
Software Error
5.1- What is a Software Error?
5.2- 18 Common Types of Software Errors
5.3- Finding, Reproducing and Analyzing a Software Error
5.4- Reporting a Software Error
5.5- A Common Bug Life
What is a Software Error?
A software error is present when the program does not do what its
user reasonably expects it to do.
It is fair and reasonable to report any deviation from high quality as a
software error.
The existence of software errors reflects an impediment to the quality of the product, but does not necessarily imply that the developers are incompetent.
Software Error
5.1- What is a Software Error?
5.2- 18 Common Types of Software Errors
5.3- Finding, Reproducing and Analyzing a Software Error
5.4- Reporting a Software Error
5.5- A Common Bug Life
18 Common Types of Software Errors
1. REQUIREMENTS: Missing function, Wrong function, Incorrect coding /
implementation of business rules…
2. FUNCTIONALITY: Function doesn’t work as expected, Unreliable
results, Excessive functionality, Inadequacy for the task at hand…
3. COMMUNICATION: Missing information, Wrong, misleading, or
confusing instructions, Display bugs, Display layout…
4. BOUNDARY-RELATED: Numeric boundaries, Boundaries in space,
Boundaries in time, Boundaries in loops, Boundaries in memory,
Boundaries within data structures…
18 Common Types of Software Errors
5. ERROR HANDLING: Error Prevention (Inadequate tests of user input,
Inadequate protection against operating system bugs…), Error
Detection (Ignores overflow, Ignores impossible values…), Error
Recovery (Automatic error correction, Where does the program go
back to…)
6. ERRORS IN PROCESSING OR INTERPRETING DATA: problems
when passing data between routines, data storage corruption…
7. CONTROL FLOW: Logic error, Interrupts, Dead crash, Run-time error,
Loops, If-then-else…
18 Common Types of Software Errors
8. MISSING FUNCTIONS: State transitions (Can't quit mid-program,
Can't pause…), Disaster prevention (No backup facility, No undo, No
Are you sure?...), Inadequate privacy or security, Doesn't support
standard O/S features…
9. USABILITY: Inconsistencies, Time-wasters (Garden paths, Choices
that can't be taken…), Menus (too complex menu hierarchy,
Inadequate menu navigation, Unrelated commands…), Software that
is difficult to use…
10. PROGRAM RIGIDITY: User tailorability (Can't turn off the noise,
Failure to execute a customization command…), Artificial intelligence
and automated stupidity, Superfluous or redundant information
required…
18 Common Types of Software Errors
11. PERFORMANCE: Slow program, Poor responsiveness, No warning
that an operation will take a long time, Problems with time-outs…
12. RACE CONDITIONS: Races in updating data, Assumption that one
task has finished before another begins, Task starts before its
prerequisites are met…
13. LOAD CONDITIONS: Required resource not available, Doesn't erase
old files from mass storage, Doesn't return unused memory…
14. HARDWARE: Wrong device address, Device unavailable, Time-out
problems, Wrong operation or instruction codes, Device protocol
error...
18 Common Types of Software Errors
15. ENVIRONMENT COMPATIBILITY
16. SECURITY
17. DOCUMENTATION
18. SOURCE, VERSION, AND ID CONTROL: Failure to update multiple
copies of data or program files, no title, no version ID, Wrong version
number on the title…
Reference:
http://www.testingeducation.org/BBST/testdesign/Kaner_Common_Software_Errors.pdf
Software Error
5.1- What is a Software Error?
5.2- 18 Common Types of Software Errors
5.3- Finding, Reproducing and Analyzing a Software Error
5.4- Reporting a Software Error
5.5- A Common Bug Life
What to do in a bug finding process?
1. Find the error
2. Reproduce the error
3. Analyze the error
4. Report the error
Reproducing a Software Error
Some bugs are always reproducible, but some reproduce only sometimes or even rarely.
Bugs don't just miraculously happen and then go away. If a bug happens intermittently, it happens under certain conditions.
When we find a bug, we are looking at a failure, which is a set of
symptoms of an underlying error.
We hypothesize the cause, then we try to re-create the conditions that
make the error visible.
If the bug is non-reproducible, you should always report it, but describe
your steps and observations precisely. Programmers will often figure
them out.
Why is a Bug Hard to Reproduce?
Memory dependent
Memory corruption, run-time errors, corrupted data
Configuration dependent: software, hardware
Timing related
Initialization
Data flow dependent
Control flow dependent
Error condition dependent
…
Making an Error Reproducible
Write down everything you remember about what you did the first time.
Note which things you are sure of and which are good guesses.
Note what else you did before starting on the series of steps that led to
this bug.
Review similar problem reports you’ve come across before.
Use tools such as capture/replay program, debugger, debug-logger,
videotape, log files or monitoring utilities that can help you identify
things that you did before running into the bug.
Talk to the programmer and/or read the code.
Analyzing a Software Error
Why Analyze a Bug?
Make your communication effective:
Make sure you are reporting what you think you are reporting.
Make sure that questionable side issues are thoroughly investigated.
Create accountability.
Support the making of business decisions.
Avoid wasting the time of the programming and management staff.
Find more bugs.
Improve the Bug Report
Good bug analysis can do a lot to improve the Bug Report, in 4 ways:
1. Simplify the report by eliminating unnecessary or irrelevant steps
2. Simplify the report by splitting it into separate reports
3. Strengthen the report by showing that it is more serious than it first
appears
4. Strengthen the report by showing that it is more general than it first
appears.
1. Eliminate Unnecessary Steps
Look for critical steps to reproduce the bug -- Sometimes the first
symptoms of an error are subtle.
Make a list of all the steps to show the error and try to shorten the list.
If there are just some critical steps, try to eliminate everything else.
2. Split into Separate Reports
Related problems can be reported together on the same report.
However, if this lengthens the report or makes it confusing, write two
reports instead.
If you have to run two tests to verify that a bug has been fixed, it’s fair
to report the two symptoms that you have to test on two separate bug
reports.
When you report related problems, it’s a courtesy to cross-reference
them. For example: “Related bug -- see Report # xxx”
3. Show that it is More Serious
Look for follow-up errors:
Keep using the program after you get this problem. Does anything else happen?
Sometimes a modest-looking bug can lead to a system crash or corrupted data.
Look for nastier variants:
Vary the conditions under which you got the bug. For example, try to get the bug
while the program is doing a background save. Does that cause data loss or
corruption along with this failure?
Look for nastier configurations:
Sometimes a bug will show itself as more serious if you run the program with less
memory, a higher resolution output, more graphics on a page, etc.
4. Show that it is More General
Look for alternative paths to the same problem:
Sometimes the bug can happen in some alternative path. For example, a
bug that happens when deleting a file can happen when deleting a folder
also.
Look for configuration dependence:
Sometimes the bug happens because of some hardware configuration.
For example, a graphic/UI bug might have some tie to the graphic card
driver, the graphic card manufacturer or the screen resolution.
Software Error
5.1- What is a Software Error?
5.2- 18 Common Types of Software Errors
5.3- Finding, Reproducing and Analyzing a Software Error
5.4- Reporting a Software Error
5.5- A Common Bug Life
How to Report a Software Error
Bug reports are your primary work product.
A Useful Bug Report
Written (not reported orally)
Uniquely numbered (ID required)
Simple (non-compound - one bug per report)
Understandable
Reproducible
Non-judgmental (only facts, no opinions)
A Sample Bug Report Form
Report Content
Summary
Description
Steps to Reproduce, including Expected Result and Observed Result
Reproducible
Severity
Frequency
Priority
Keyword (Functional Area)
Attachment (screenshot, video, log file…)
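To make the field list concrete, here is a minimal sketch of how these fields might be modeled in Python. The record and field names are illustrative assumptions, not the schema of any real bug tracking system.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical record mirroring the report fields listed above.
    @dataclass
    class BugReport:
        bug_id: int                    # unique ID, required
        summary: str                   # one-line description of the problem
        description: str               # details, quoted error messages, etc.
        steps_to_reproduce: List[str]  # numbered steps, starting from a known place
        expected_result: str
        observed_result: str
        reproducible: str              # "Yes", "No" or "Sometimes"
        severity: str                  # "Critical", "Major" or "Minor"
        frequency: str                 # "Always", "Often" or "Seldom"
        priority: int                  # priority to fix the bug
        keyword: str                   # functional area
        attachments: List[str] = field(default_factory=list)  # screenshots, videos, logs...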
Bug Summary
This one-line description of the problem is the most important part of
the report.
The ideal summary tells the reader what the bug is, what caused the
bug, what part of the program it’s from and what its worst
consequence is. It usually runs from 8 to 15 words long.
The following syntax is used for writing a Bug Summary:
Symptom + Action + Operating Condition
Bug Summary
Some good examples of Bug Summary:
Run-time error when submitting Contact Us form with first name of
more than 256 characters.
The main dialog box is resizable.
Maximize button is still active while the dialog box is maximized.
Application crashes when clicking on ‘Submit’ button in Windows 10.
Top of ‘Maria’ image in ‘Friends’ page is not displayed.
Upgrade installation from version 1.0 fails when Windows Media Player is running.
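As an illustration, the Symptom + Action + Operating Condition pattern could be assembled like this. The helper function is hypothetical; the example reproduces the first good summary above.

    def compose_summary(symptom: str, action: str, condition: str = "") -> str:
        """Build a one-line bug summary: Symptom + Action + Operating Condition."""
        parts = [symptom, action, condition]
        return " ".join(p for p in parts if p)

    print(compose_summary(
        "Run-time error",
        "when submitting Contact Us form",
        "with first name of more than 256 characters.",
    ))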
Bug Summary
Some not-so-good examples of Bug Summary:
Software fails
Can't install
Severe performance problem
Back button does not work
My browser crashed. I think I was on www.xyz.com. I play golf with Bill
Gates, so you better fix this problem, or I'll report you to him. By the
way, your Back icon looks like a squashed rodent. Too ugly. And my
grandmother's home page is all messed up in your browser.
Bug Description
Bug Description is used to describe the problem.
It can be used to:
Clarify the details of the bug
Provide more information: follow-up errors, nastier variants, nastier
configurations, alternative paths, configuration dependence…
Quote the error messages
Steps to Reproduce
Go through the steps to recreate the bug.
Start from a known place (open the application) and then describe the
steps to reveal the bug.
Describe the Expected Result and the Observed/Actual Result.
Reproducible
Never say it is reproducible unless you have recreated the bug.
(Always try to recreate the bug before writing the report.)
If you have tried and tried but you cannot recreate the bug, say No.
Then explain what steps you tried in your attempt to recreate it.
If the bug appears sporadically and you do not yet know why, say
“sometimes” and explain.
Values for this field: Yes, No, Sometimes (or Reproducibility: Always,
Intermittent, Once).
Severity
The severity of the bug is usually rated on three levels:
1. Critical: Fatal to the release (unacceptable to ship), such as an
application crash, run-time error, data loss…
2. Major: It’s a bad bug, but it’s not critical.
3. Minor: It’s a bug, but it’s not a big deal.
Different companies / projects might have different ratings.
Bug Severity is very important for evaluating product quality, so assess
this field thoughtfully.
Frequency
With some bug tracking systems, you also have to rate the bug
frequency. Frequency is usually graded by assessing the following
three characteristics:
How easy it is for the user to encounter the bug
How frequently the user would encounter the bug
How often the buggy feature is used
Many companies use a three-level rating:
1. Always
2. Often
3. Seldom
Priority
The Priority rating is either generated automatically by the bug tracking
system from the Severity and Frequency ratings (as sketched below), or
assigned by the Project Manager.
This is the order in which bugs should be fixed.
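A minimal sketch of one possible derivation, assuming the three-level Severity and Frequency ratings described above. The weights and thresholds are invented for illustration; real trackers and project managers use their own rules.

    # Hypothetical ranks for the three-level ratings above.
    SEVERITY_RANK = {"Critical": 3, "Major": 2, "Minor": 1}
    FREQUENCY_RANK = {"Always": 3, "Often": 2, "Seldom": 1}

    def derive_priority(severity: str, frequency: str) -> int:
        """Return a fix priority from 1 (fix first) to 3 (fix last)."""
        score = SEVERITY_RANK[severity] + FREQUENCY_RANK[frequency]
        if score >= 5:
            return 1  # e.g. a Critical bug the user hits often
        if score >= 3:
            return 2
        return 3

    assert derive_priority("Critical", "Always") == 1
    assert derive_priority("Minor", "Seldom") == 3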
Keyword (Functional Area)
The bug can be categorized according to its functional area.
It is important to categorize these bugs consistently. They’ll often be
assigned out and summary-reported by category. Burying a bug in the
wrong category can lead to its never getting fixed.
The Bug Tracking System should include a list of possible keywords.
Attachment
This is where we attach additional information / evidence for the bug,
such as screenshots, videos, log files…
This is very important in case the bug is difficult to describe or
reproduce.
Example
Bug ID #1
Summary: The application does not close when clicking on the End button.
Description: The application closes when clicking on the X button, but it
does not close when clicking on the End button.
Steps to Reproduce:
1. Open Minibank
2. Click on the End button
Expected Result: The application closes.
Observed Result: The application does not close.
Software Error
5.1- What is a Software Error?
5.2- 18 Common Types of Software Errors
5.3- Finding, Reproducing and Analyzing a Software Error
5.4- Reporting a Software Error
5.5- A Common Bug Life
Bug Status and Resolution
Status: The Status field indicates the current state of a bug.
Resolution: The Resolution field indicates what happened to this bug.
Bug Statuses and Resolutions, as well as the Bug Life Cycle workflow, may
differ between projects and companies.
Common Bug Status and Resolution
Status | Resolution | Description
Open | New | The newly submitted bug.
Open | Need More Information | The bug needs more information.
Open | To Be Distributed | The bug is waiting to be distributed.
Open | To Be Fixed | The bug needs to be fixed or is being fixed.
Closed As Invalid | Not Reproducible | The bug cannot be reproduced.
Closed As Invalid | Not a Bug | The application works as it is supposed to, so this is not a bug.
Closed As Invalid | Duplicated | The bug is a repeat of another bug.
Closed As Invalid | Deferred | The bug will be fixed in a later release.
Closed As Invalid | Feature Limitation | Feature limitations do not allow the bug to be fixed.
Open | Fixed | The bug is fixed.
Open | Failed Verification | The fixed bug was verified but failed verification.
Closed As Valid | Passed Verification | The fixed bug was verified and passed verification.
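The status/resolution pairs in this table could be encoded as follows. This is a sketch only, not the schema of any particular bug tracking system.

    from enum import Enum

    class Status(Enum):
        OPEN = "Open"
        CLOSED_AS_INVALID = "Closed As Invalid"
        CLOSED_AS_VALID = "Closed As Valid"

    # Which resolutions go with which status, per the table above.
    RESOLUTIONS = {
        Status.OPEN: {"New", "Need More Information", "To Be Distributed",
                      "To Be Fixed", "Fixed", "Failed Verification"},
        Status.CLOSED_AS_INVALID: {"Not Reproducible", "Not a Bug", "Duplicated",
                                   "Deferred", "Feature Limitation"},
        Status.CLOSED_AS_VALID: {"Passed Verification"},
    }

    def is_valid_pair(status: Status, resolution: str) -> bool:
        """Check that a resolution is allowed for the given status."""
        return resolution in RESOLUTIONS[status]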
A Common Bug Life
[Workflow diagram: swimlanes for Tester, Test Lead / Manager, and Developer]
The Tester submits a new bug (status: New). The Test Lead / Manager reviews
and triages it: the bug may be sent back for more information, closed as
invalid (Deferred, Duplicated, Not a Bug, Not Reproducible, or Feature
Limitation), or distributed to a Developer to be fixed. Once the Developer
marks the bug Fixed, the Tester runs a regression test: if the fix fails
verification, the bug goes back to the Developer; if it passes verification,
the bug is closed as valid.
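One way to make such a workflow explicit is a simple transition table. The sketch below follows the diagram above; the transitions allowed in a real project may well differ.

    # Allowed resolution transitions in the common bug life above (a sketch).
    TRANSITIONS = {
        "New": {"Need More Information", "To Be Fixed", "Deferred", "Duplicated",
                "Not a Bug", "Not Reproducible", "Feature Limitation"},
        "Need More Information": {"To Be Fixed", "Not a Bug", "Not Reproducible"},
        "To Be Fixed": {"Fixed"},
        "Fixed": {"Passed Verification", "Failed Verification"},
        "Failed Verification": {"To Be Fixed"},
    }

    def can_move(current: str, target: str) -> bool:
        """Return True if the workflow allows moving from current to target."""
        return target in TRANSITIONS.get(current, set())

    assert can_move("Fixed", "Passed Verification")
    assert not can_move("New", "Passed Verification")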
SUMMARY
A software error is present when the program does not do what its
user reasonably expects it to do.
Software contains many bugs, which usually belong to one of 18 common
types of software errors.
The bug report is the tester's main work product. To write one, the
tester has to find, reproduce, analyze, and finally report the bug.
Thank You!