Unit 4 ST&M

The document outlines the syllabus and key concepts of software testing and maintenance in the context of object-oriented software engineering. It covers various testing strategies, including unit testing, integration testing, regression testing, and validation testing, highlighting the importance of verification and validation in ensuring software quality. Additionally, it discusses debugging as a critical process that follows testing to identify and correct errors.


CCS356 Object Oriented Software Engineering

Dr. MAHAVISHNU.V.C
PSG Institute of Technology and Applied Research

Syllabus

UNIT IV SOFTWARE TESTING AND MAINTENANCE 9


Testing – Unit testing – Black box testing– White box testing –
Integration and System testing– Regression testing – Debugging -
Program analysis – Symbolic execution – Model Checking-Case Study
Software Testing

Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user.
What Testing Shows

– errors
– requirements conformance
– performance
– an indication of quality
Strategic Approach

To perform effective testing, you should conduct effective technical reviews. By doing this, many errors will be eliminated before testing commences.
Testing begins at the component level and works "outward" toward the integration of the entire computer-based system.
Different testing techniques are appropriate for different software engineering approaches and at different points in time.
Testing is conducted by the developer of the software and (for large projects) an independent test group.
Testing and debugging are different activities, but debugging must be accommodated in any testing strategy.
V&V

Verification refers to the set of tasks that ensure that software correctly implements a specific function.
Validation refers to a different set of tasks that ensure that the software that has been built is traceable to customer requirements. Boehm [Boe81] states this another way:
– Verification: "Are we building the product right?"
– Validation: "Are we building the right product?"
Who Tests the Software?

developer:
– understands the system, but will test "gently"
– is driven by "delivery"
independent tester:
– must learn about the system, but will attempt to break it
– is driven by quality
Testing Strategy

Each testing step pairs with a development activity:
– System engineering → System test
– Analysis modeling → Validation test
– Design modeling → Integration test
– Code generation → Unit test
Testing Strategy

We begin by 'testing-in-the-small' and move toward 'testing-in-the-large'.
For conventional software
– The module (component) is our initial focus
– Integration of modules follows
For OO software
– our focus when "testing in the small" changes from an individual module (the conventional view) to an OO class that encompasses attributes and operations and implies communication and collaboration
Strategic Issues

Specify product requirements in a quantifiable manner long before testing commences.
State testing objectives explicitly.
Understand the users of the software and develop a profile for each user category.
Develop a testing plan that emphasizes "rapid cycle testing."
Build "robust" software that is designed to test itself.
Use effective technical reviews as a filter prior to testing.
Conduct technical reviews to assess the test strategy and test cases themselves.
Develop a continuous improvement approach for the testing process.
Unit Testing

[Figure: the software engineer derives test cases, applies them to the module to be tested, and examines the results.]
Unit Testing

Test cases for the module to be tested exercise its:
– interface
– local data structures
– boundary conditions
– independent paths
– error handling paths
Unit Test Environment

A driver and one or more stubs surround the module under test: the driver applies the test cases and collects the results, while stubs stand in for modules that the tested module calls. The tests again exercise the module's interface, local data structures, boundary conditions, independent paths, and error handling paths.
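The driver/stub arrangement above can be sketched in a few lines. This is a minimal illustration with hypothetical names (`compute_total`, `lookup_tax_rate_stub`), not a prescribed framework: the driver applies test cases covering a normal path, a boundary condition, and an error-handling path, while a stub replaces a not-yet-integrated module.

```python
def lookup_tax_rate_stub(region):
    """Stub: stands in for the real (not yet integrated) rate-lookup module."""
    return 0.10  # canned answer, just enough to exercise the module under test

def compute_total(price, quantity, lookup_rate, region="default"):
    """Module under test: interface, boundaries, and error paths are checked."""
    if quantity < 0:
        raise ValueError("quantity must be non-negative")  # error-handling path
    subtotal = price * quantity
    return subtotal + subtotal * lookup_rate(region)

def driver():
    """Driver: applies test cases to the module and collects RESULTS."""
    results = []
    results.append(compute_total(100.0, 2, lookup_tax_rate_stub))  # normal path
    results.append(compute_total(100.0, 0, lookup_tax_rate_stub))  # boundary: zero passes
    try:
        compute_total(100.0, -1, lookup_tax_rate_stub)             # error-handling path
        results.append("no error")
    except ValueError:
        results.append("ValueError")
    return results

print(driver())  # [220.0, 0.0, 'ValueError']
```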
Integration Testing Strategies

Options:
• the “big bang” approach
• an incremental construction strategy
Top Down Integration

– the top module is tested with stubs
– stubs are replaced one at a time, "depth first"
– as new modules are integrated, some subset of tests is re-run
Bottom-Up Integration

– drivers are replaced one at a time, "depth first"
– worker modules are grouped into builds (clusters) and integrated
Sandwich Testing

– top modules are tested with stubs
– worker modules are grouped into builds (clusters) and integrated
Regression Testing

Regression testing is the re-execution of some subset of tests that have already been conducted to ensure that changes have not propagated unintended side effects.
Whenever software is corrected, some aspect of the
software configuration (the program, its documentation, or
the data that support it) is changed.
Regression testing helps to ensure that changes (due to
testing or for other reasons) do not introduce unintended
behavior or additional errors.
Regression testing may be conducted manually, by re-
executing a subset of all test cases or using automated
capture/playback tools.
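The idea of re-executing a recorded subset of tests after a change can be sketched as follows. This is a minimal illustration with hypothetical names (`discount`, `REGRESSION_SUITE`): a suite of (input, expected) pairs captured from earlier, already-passing runs is re-run after maintenance, and any case whose behavior changed is reported.

```python
def discount(price):
    """Function under maintenance: 10% off orders of 100 or more."""
    return price * 0.9 if price >= 100 else price

# Regression suite: inputs and expected outputs recorded from prior passing runs.
REGRESSION_SUITE = [(50, 50), (100, 90.0), (200, 180.0)]

def run_regression(fn, suite):
    """Re-execute the recorded subset; return cases whose behavior changed."""
    return [(x, expected, fn(x)) for x, expected in suite if fn(x) != expected]

failures = run_regression(discount, REGRESSION_SUITE)
print("regressions:", failures)  # [] means no unintended side effects detected
```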
Smoke Testing

A common approach for creating “daily builds” for product software.
Smoke testing steps:
– Software components that have been translated into code are
integrated into a “build.”
• A build includes all data files, libraries, reusable modules, and engineered
components that are required to implement one or more product
functions.
– A series of tests is designed to expose errors that will keep the build
from properly performing its function.
• The intent should be to uncover “show stopper” errors that have the
highest likelihood of throwing the software project behind schedule.
– The build is integrated with other builds and the entire product (in its
current form) is smoke tested daily.
• The integration approach may be top down or bottom up.
Object-Oriented Testing

Object-oriented testing begins by evaluating the correctness and consistency of the analysis and design models.
testing strategy changes
– the concept of the ‘unit’ broadens due to
encapsulation
– integration focuses on classes and their execution
across a ‘thread’ or in the context of a usage scenario
– validation uses conventional black box methods
test case design draws on conventional
methods, but also encompasses special features
Validation Testing

Validation testing succeeds when software functions in a manner that can be reasonably expected by the customer.
Like all other testing steps, validation tries to uncover
errors, but the focus is at the requirements level— on
things that will be immediately apparent to the end-user.
Reasonable expectations are defined in the Software
Requirements Specification— a document that describes
all user-visible attributes of the software.
Validation testing comprises:
– Validation Test criteria
– Configuration review
– Alpha & Beta Testing
Validation Test criteria

Validation is achieved through a series of tests that demonstrate agreement with requirements.
A test plan outlines the classes of tests to be conducted and a test
procedure defines specific test cases that will be used to
demonstrate agreement with requirements.
Both the plan and procedure are designed to ensure that
– all functional requirements are satisfied,
– all behavioral characteristics are achieved,
– all performance requirements are attained,
– documentation is correct,
– other requirements are met
After each validation test case has been conducted, one of two
possible conditions exist:
1. The function or performance characteristics conform to
specification and are accepted
2. A deviation from specification is uncovered and a deficiency list is
created
Configuration Review

The configuration review, sometimes called an audit, ensures that all elements of the software configuration have been properly developed, are cataloged, and have the necessary detail to sustain the support phase of the software life cycle.
Alpha and Beta Testing

When custom software is built for one customer, a series of acceptance tests are conducted to enable the customer to validate all requirements.
Conducted by the end-user rather than software
engineers, an acceptance test can range from
an informal "test drive" to a planned and
systematically executed series of tests.
Most software product builders use a process
called alpha and beta testing to uncover errors
that only the end-user seems able to find.
Alpha testing

The alpha test is conducted at the developer's site by a customer.
The software is used in a natural setting
with the developer "looking over the
shoulder" of the user and recording errors
and usage problems.
Alpha tests are conducted in a controlled
environment.
Beta testing

The beta test is conducted at one or more customer sites by the end-user of the software.
The beta test is a "live" application of the software in an environment that cannot be controlled by the developer.
The customer records all problems (real or imagined)
that are encountered during beta testing and reports
these to the developer at regular intervals.
As a result of problems reported during beta tests,
software engineers make modifications and then prepare
for release of the software product to the entire customer
base.
System Testing

System testing is actually a series of different tests whose primary purpose is to fully exercise the computer-based system.
Although each test has a different purpose, all
work to verify that system elements have been
properly integrated and perform allocated
functions.
Types of system tests are:
– Recovery Testing
– Security Testing
– Stress Testing
– Performance Testing
Recovery Testing

Recovery testing is a system test that forces the software to fail in a variety of ways and verifies that recovery is properly performed.
If recovery is automatic (performed by the system itself), reinitialization, checkpointing mechanisms, data recovery, and restart are evaluated for correctness.
If recovery requires human intervention, the mean-time-to-repair (MTTR) is evaluated to determine whether it is within acceptable limits.
Security Testing

Security testing attempts to verify that protection mechanisms built into a system will, in fact, protect it from improper penetration.
During security testing, the tester plays the role(s) of the individual who desires to penetrate the system.
Given enough time and resources, good security testing will ultimately penetrate a system.
The role of the system designer is to make penetration cost more than the value of the information that will be obtained.
The tester may attempt to acquire passwords through external clerical means; may attack the system with custom software designed to break down any defenses that have been constructed; may browse through insecure data; or may purposely cause system errors.
Stress Testing

Stress testing executes a system in a manner that demands resources in abnormal quantity, frequency, or volume.
For example,
1. special tests may be designed that generate ten interrupts per
second
2. Input data rates may be increased by an order of magnitude to
determine how input functions will respond
3. test cases that require maximum memory or other resources are
executed
4. test cases that may cause excessive hunting for disk-resident data
are created
A variation of stress testing is a technique called sensitivity testing
Performance Testing

Performance testing occurs throughout all steps in the testing process.
Even at the unit level, the performance of an individual
module may be assessed as white-box tests are
conducted.
Performance tests are often coupled with stress testing
and usually require both hardware and software
instrumentation
It is often necessary to measure resource utilization
(e.g., processor cycles).
Acceptance Testing

Acceptance testing is a method of software testing where a system is tested for acceptability.
The major aim of this test is to evaluate
the compliance of the system with the
business requirements and assess
whether it is acceptable for delivery or not.
Acceptance Testing

Types of Acceptance Testing:

User Acceptance Testing (UAT):
User acceptance testing is used to determine whether the product works correctly for the user. This is also termed End-User Testing.
Business Acceptance Testing (BAT):
BAT is used to determine whether the product meets the
business goals and purposes or not.
Contract Acceptance Testing (CAT):
CAT is a contract which specifies that once the product goes
live, within a predetermined period, the acceptance test must
be performed and it should pass all the acceptance use cases.
Acceptance Testing

Regulations Acceptance Testing (RAT):
RAT is used to determine whether the product violates any rules and regulations defined by the government of the country where it is released.
Operational Acceptance Testing (OAT):
OAT is used to determine the operational readiness of the product and is a form of non-functional testing. It mainly includes testing of recovery, compatibility, maintainability, reliability, etc.
OAT assures the stability of the product before it is released to production.
THE ART OF DEBUGGING

Debugging is the process that results in the removal of the error.
Although debugging can and should be an orderly process, it is still very much an art.
Debugging is not testing but always occurs as a consequence of testing.
Debugging Process
Results of testing are examined, and a lack of correspondence between expected and actual performance is encountered (due to the cause of an error).
The debugging process attempts to match symptom with cause, thereby leading to error correction.
One of two outcomes always comes from debugging
process:
– The cause will be found and corrected,
– The cause will not be found.
The person performing debugging may suspect a cause, design a test case to help validate that suspicion, and work toward error correction in an iterative fashion.
Why is debugging so difficult?

1. The symptom may disappear (temporarily) when another error is corrected.
2. The symptom may actually be caused by non-errors (e.g., round-off inaccuracies).
3. The symptom may be caused by human error that is not easily traced (e.g., wrong input, a wrongly configured system).
4. The symptom may be a result of timing problems rather than processing problems (e.g., taking too long to display a result).
5. It may be difficult to accurately reproduce input conditions (e.g., a real-time application in which input ordering is indeterminate).
6. The symptom may be intermittent (e.g., a connection that is irregular or broken). This is particularly common in embedded systems that couple hardware and software.
7. The symptom may be due to causes that are distributed across a number of tasks running on different processors.

As the consequences of an error increase, the amount of pressure to find the cause also increases. That pressure sometimes forces a software developer to fix one error and at the same time introduce two more.
Debugging Approaches or strategies
Debugging has one overriding objective: to find and correct the
cause of a software error.
Three categories for debugging approaches
– Brute force
– Backtracking
– Cause elimination
Brute Force:
– probably the most common and least efficient method for isolating the cause of a software error.
– Apply brute force debugging methods when all else fails.
– Using a "let the computer find the error" philosophy, memory dumps are taken, run-time traces are invoked, and the program is loaded with WRITE or PRINT statements.
– More often than not, it leads to wasted effort and time.
Backtracking:
– a common debugging approach that can be used successfully in small programs.
– Beginning at the site where a symptom has been uncovered, the source code is traced backward (manually) until the site of the cause is found.
Cause elimination:
– manifested by induction or deduction; introduces the concept of binary partitioning (i.e., repeatedly splitting the suspect space into valid and invalid partitions).
– A list of all possible causes is developed and tests are conducted to eliminate each.
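The binary-partitioning idea behind cause elimination can be sketched as a bisection over an ordered list of candidate causes. This is a hypothetical setup (the `changes` list and `is_bad` predicate are illustrative): if some change introduced the error, each test eliminates half of the remaining candidates.

```python
def first_bad(changes, is_bad):
    """Binary partitioning: return the index of the first change for which
    is_bad() holds, eliminating half the candidate causes per test."""
    lo, hi = 0, len(changes) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(changes[mid]):
            hi = mid          # invalid partition: cause is at mid or earlier
        else:
            lo = mid + 1      # valid partition: eliminate the first half
    return lo

changes = list(range(1, 11))                     # ten candidate changes
culprit = first_bad(changes, lambda c: c >= 7)   # pretend change 7 introduced the bug
print(culprit, changes[culprit])  # 6 7
```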
Correcting the error

The correction of a bug can introduce other errors and therefore do more harm than good.

Questions that every software engineer should ask before making the "correction" that removes the cause of a bug:
– Is the cause of the bug reproduced in another part of the program? (i.e., the cause may follow a logical pattern)
– What "next bug" might be introduced by the fix I'm about to make? (i.e., the cause may lie in the logic, structure, or design)
– What could we have done to prevent this bug in the first place? (i.e., the same kind of bug may have been generated earlier, so the developer can review those steps)
Testability

Operability—it operates cleanly
Observability—the results of each test case are readily observed
Controllability—the degree to which testing can be
automated and optimized
Decomposability—testing can be targeted
Simplicity—reduce complex architecture and logic to
simplify tests
Stability—few changes are requested during testing
Understandability—of the design
What is a “Good” Test?

A good test has a high probability of
finding an error
A good test is not redundant.
A good test should be “best of breed”
A good test should be neither too simple
nor too complex
Internal and External Views

Any engineered product (and most other things) can be tested in one of two ways:
– Knowing the specified function that a product has been designed to perform, tests can be conducted that demonstrate each function is fully operational while at the same time searching for errors in each function (the external, black-box view);
– Knowing the internal workings of a product, tests can be conducted to ensure that internal operations are performed according to specifications and all internal components have been adequately exercised (the internal, white-box view).
Test Case Design

"Bugs lurk in corners
and congregate at
boundaries ..."

Boris Beizer

OBJECTIVE to uncover errors

CRITERIA in a complete manner

CONSTRAINT with a minimum of effort and time


Exhaustive Testing

[Flow graph containing a loop that may execute up to 20 times.]
There are 10^14 possible paths! If we execute one test per millisecond, it would take 3,170 years to test this program!!
Selective Testing

[The same flow graph; rather than testing exhaustively, only a selected subset of paths through the program is exercised.]
Software Testing

Software testing draws on two families of methods: white-box methods and black-box methods, applied within the testing strategies discussed earlier.
White-Box Testing

... our goal is to ensure that all statements and conditions have been executed at least once ...
Why Cover?

logic errors and incorrect assumptions are inversely proportional to a path's execution probability

we often believe that a path is not likely to be executed; in fact, reality is often counterintuitive

typographical errors are random; it's likely that untested paths will contain some
Basis Path Testing

First, we compute the cyclomatic complexity:

V(G) = number of simple decisions + 1

or

V(G) = number of enclosed areas + 1

In this case, V(G) = 4
Cyclomatic Complexity

A number of industry studies have indicated that the higher V(G), the higher the probability of errors; plotting modules against V(G), modules in the high-V(G) range are more error prone.
Basis Path Testing

Next, we derive the independent paths. Since V(G) = 4, there are four paths:

Path 1: 1,2,3,6,7,8
Path 2: 1,2,3,5,7,8
Path 3: 1,2,4,7,8
Path 4: 1,2,4,7,2,4,...,7,8

Finally, we derive test cases to exercise these paths.
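The procedure above can be illustrated on a small function. This is a hypothetical example (`classify` is not the slide's flow graph): the function has three simple decisions, so V(G) = 3 + 1 = 4, and one test case is chosen to force each of the four linearly independent paths.

```python
def classify(n):
    """Three simple decisions => V(G) = 4 => four basis paths to exercise."""
    if n < 0:            # decision 1
        return "negative"
    if n == 0:           # decision 2
        return "zero"
    if n % 2 == 0:       # decision 3
        return "even"
    return "odd"

# One test case per basis path:
cases = {-5: "negative", 0: "zero", 4: "even", 7: "odd"}
for n, expected in cases.items():
    assert classify(n) == expected
print("all four basis paths exercised")
```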
Basis Path Testing

you don't need a flow chart, but the picture will help when you trace program paths

count each simple logical test; compound tests count as 2 or more

basis path testing should be applied to critical modules
Deriving Test Cases

Summarizing:
– Using the design or code as a
foundation, draw a corresponding flow
graph.
– Determine the cyclomatic complexity of
the resultant flow graph.
– Determine a basis set of linearly
independent paths.
– Prepare test cases that will force
execution of each path in the basis set.
Graph Matrices

A graph matrix is a square matrix whose size (i.e., number of rows and columns) is equal to the number of nodes in a flow graph.
Each row and column corresponds to an
identified node, and matrix entries
correspond to connections (an edge)
between nodes.
By adding a link weight to each matrix
entry, the graph matrix can become a
powerful tool for evaluating program
control structure during testing
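A graph matrix for a small flow graph can be built directly from an edge list. This is a sketch with a hypothetical set of edges: entry 1 in row i, column j records an edge from node i to node j; summing the entries gives the edge count E, and for a connected flow graph V(G) = E - N + 2.

```python
# Hypothetical flow-graph edges (from-node, to-node).
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (4, 5), (5, 2), (5, 6)]
nodes = sorted({n for e in edges for n in e})
index = {n: i for i, n in enumerate(nodes)}

# Square graph matrix: one row and one column per flow-graph node.
matrix = [[0] * len(nodes) for _ in nodes]
for a, b in edges:
    matrix[index[a]][index[b]] = 1   # 1 marks a connection (edge) a -> b

E = sum(sum(row) for row in matrix)  # total connections
N = len(nodes)
print("V(G) =", E - N + 2)  # 7 - 6 + 2 = 3
```

Adding link weights (e.g., execution probability or processing time) in place of the 1s turns the same structure into the "powerful tool" the slide mentions.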
Control Structure Testing

Condition testing — a test case design method that exercises the logical conditions contained in a program module
Data flow testing — selects test paths of a program according to the locations of definitions and uses of variables in the program
Loop Testing

– Simple loops
– Nested loops
– Concatenated loops
– Unstructured loops
Loop Testing: Simple Loops
Minimum conditions—Simple Loops (where n is the maximum number of allowable passes):

1. skip the loop entirely
2. only one pass through the loop
3. two passes through the loop
4. m passes through the loop, m < n
5. (n-1), n, and (n+1) passes through the loop
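The simple-loop schedule can be generated mechanically. A minimal sketch (the helper name `simple_loop_cases` and the default choice of m are assumptions): given the maximum pass count n, it returns the pass counts listed above, deduplicated and sorted.

```python
def simple_loop_cases(n, m=None):
    """Return the ordered set of loop pass-counts to test for a simple loop
    with at most n passes: skip, 1, 2, a typical m < n, and n-1, n, n+1."""
    m = m if m is not None else n // 2   # pick a typical interior value, m < n
    return sorted({0, 1, 2, m, n - 1, n, n + 1})

print(simple_loop_cases(10))  # [0, 1, 2, 5, 9, 10, 11]
```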
Loop Testing: Nested Loops
Nested Loops
Start at the innermost loop. Set all outer loops to their
minimum iteration parameter values.
Test the min+1, typical, max-1 and max for the
innermost loop, while holding the outer loops at their
minimum values.
Move out one loop and set it up as in step 2, holding all
other loops at typical values. Continue this step until
the outermost loop has been tested.
Concatenated Loops
If the loops are independent of one another,
then treat each as a simple loop;
else treat them as nested loops
(for example, when the final loop counter value of loop 1 is used to initialize loop 2).
Black-Box Testing

[Black-box view: requirements drive the selection of input events; outputs are compared with expected results.]
Black-Box Testing
How is functional validity tested?
How is system behavior and performance tested?
What classes of input will make good test cases?
Is the system particularly sensitive to certain input
values?
How are the boundaries of a data class isolated?
What data rates and data volume can the system
tolerate?
What effect will specific combinations of data have on
system operation?
Graph-Based Methods

Graph-based testing begins by understanding the objects that are modeled in software and the relationships that connect these objects. In this context, we consider the term "objects" in the broadest possible sense: it encompasses data objects, traditional components (modules), and object-oriented elements of software.

(a) Graph notation: nodes (objects) carry node weights (values) and are connected by directed links, undirected links, and parallel links, each of which may carry a link weight.

(b) Example: a "new file" menu select generates a document window (generation time ≤ 1.0 sec); the document window allows editing of document text, is represented as a document, and contains attributes such as background color (white) and text color (default color or preferences).
Equivalence Partitioning

[Equivalence partitioning divides program input (user queries, mouse picks, prompts, function-key input, data) and output (output formats) into classes from which test cases are derived.]
Sample Equivalence Classes
Valid data
user supplied commands
responses to system prompts
file names
computational data
physical parameters
bounding values
initiation values
output data formatting
responses to error messages
graphical data (e.g., mouse picks)

Invalid data
data outside bounds of the program
physically impossible data
proper value supplied in wrong place
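Equivalence partitioning can be illustrated on a single computational-data field. This is a hypothetical validator (`valid_age`, range 0..120 assumed for illustration): the input domain splits into three equivalence classes, so one representative per class suffices instead of testing every value.

```python
def valid_age(age):
    """Hypothetical input check: ages 0..120 are the valid class."""
    return 0 <= age <= 120

# One representative drawn from each equivalence class:
representatives = {
    "invalid: below range": -5,
    "valid: in range": 40,
    "invalid: above range": 200,
}
results = {label: valid_age(a) for label, a in representatives.items()}
print(results)
```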
Boundary Value Analysis

[Boundary value analysis complements equivalence partitioning: test cases are selected at the edges of each class, in both the input domain and the output domain.]
Comparison Testing
Used only in situations in which the
reliability of software is absolutely critical
(e.g., human-rated systems)
– Separate software engineering teams develop
independent versions of an application using
the same specification
– Each version can be tested with the same
test data to ensure that all provide identical
output
– Then all versions are executed in parallel with real-time comparison of results to ensure consistency
Orthogonal Array Testing

Used when the number of input parameters is small and the values that each of the parameters may take are clearly bounded. Rather than varying one input item at a time, an orthogonal array (e.g., an L9 array for three parameters X, Y, Z) covers the parameter combinations with far fewer test cases.
Model-Based Testing

Analyze an existing behavioral model for the software or create one.
– Recall that a behavioral model indicates how software will
respond to external events or stimuli.
Traverse the behavioral model and specify the inputs that
will force the software to make the transition from state to
state.
– The inputs will trigger events that will cause the transition to
occur.
Review the behavioral model and note the expected
outputs as the software makes the transition from state to
state.
Execute the test cases.
Compare actual and expected results and take corrective
action as required.
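The steps above can be sketched against a tiny behavioral model. This is a hypothetical state machine (a document editor's states and events, invented for illustration): the transition table is the model, an event sequence chosen to force each transition is the test input, and the expected state trace is the oracle that actual results are compared against.

```python
# Behavioral model: (state, event) -> next state.
MODEL = {
    ("closed", "open"): "editing",
    ("editing", "save"): "saved",
    ("saved", "edit"): "editing",
    ("editing", "close"): "closed",
}

def run(events, state="closed"):
    """Drive the model with an event sequence; record the visited states."""
    trace = [state]
    for e in events:
        state = MODEL[(state, e)]   # the input triggers the transition
        trace.append(state)
    return trace

# Inputs chosen by traversing the model so each transition fires at least once;
# the returned trace is compared against the expected states.
print(run(["open", "save", "edit", "close"]))
# ['closed', 'editing', 'saved', 'editing', 'closed']
```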
Software Testing Patterns

Testing patterns are described in much the same way as design patterns (Chapter 12).
Example:
• Pattern name: ScenarioTesting
• Abstract: Once unit and integration tests have been conducted, there is a need to determine whether the software will perform in a manner that satisfies users. The ScenarioTesting pattern describes a technique for exercising the software from the user's point of view. A failure at this level indicates that the software has failed to meet a user-visible requirement.
Program analysis
Symbolic execution
Model Checking
THANK YOU
