System Testing
Unit & Integration Testing Objective: to make sure that the program code implements the design correctly.
System Testing Objective: to ensure that the software does what the customer wants it to do.
System Testing Process
Function Testing: the system must perform
functions specified in the requirements.
Performance Testing: the system must satisfy
security, precision, load and speed
constraints specified in the requirements.
Acceptance Testing: customers try the system
(in the lab) to make sure that the system
built is the system they requested.
Deployment Testing: the software is deployed
and tested in the production environment.
Build & Integration Plan
A large, complex system is very hard to
test.
We can devise a build & integration
plan defining subsystems to be built
and tested, so that we are
managing a less complex entity.
Such subsystems usually correspond
to increments in system scope &
functionality and are called spins.
Example: Microsoft’s Synch & Stabilize
approach.
Build & Integration Plan Cont’d
Each spin is identified including its
functionality and testing schedule.
If the test of spin n succeeds but a
problem arises during the testing of
spin n+1, we know that the fault
comes from components /
functionality introduced in spin n+1.
Accounting System Example: spin 0 –
basic accounting operations are
tested; spin 1 – accounting with data
auditing is tested; spin 2 – user
access control is tested, etc.
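The spin ordering above can be sketched in code. This is a minimal illustration, not a real test harness: the three check functions are hypothetical stand-ins for the spin test suites of the accounting example, and each spin's suite runs only after all earlier spins pass, so a new failure points at the newest spin.

```python
# Hypothetical spin test suites for the accounting example.
def test_basic_accounting():      # spin 0
    return True

def test_data_auditing():         # spin 1
    return True

def test_user_access_control():   # spin 2
    return True

SPINS = [
    ("spin 0: basic accounting operations", test_basic_accounting),
    ("spin 1: accounting with data auditing", test_data_auditing),
    ("spin 2: user access control", test_user_access_control),
]

def run_plan(spins):
    """Run spin suites in order; report the first failing spin, if any."""
    for name, suite in spins:
        if not suite():
            return name          # fault introduced by this spin
    return None                  # all spins passed

print(run_plan(SPINS))           # None when every spin passes
```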
Configuration Management
Module-based systems can be shipped in various configurations
of components defining the system’s functionality and
performance.
Different configurations are referred to as product versions or
product editions.
Sequential modifications of a configuration corresponding to
product releases are identified as major versions.
Sequential modifications of the same major release
configuration are identified by minor versions.
During development & testing the current pre-release version is
identified by the build number.
Product Edition vMajor.Minor.Build
E.g. Sonar Producer v5.1.1012
Keeping track of versions is a must for phased development!
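A small sketch of working with this scheme, assuming version tags of the form "vMajor.Minor.Build" as in the Sonar Producer example:

```python
import re

def parse_version(tag):
    """Return (major, minor, build) from a tag like 'v5.1.1012'."""
    m = re.fullmatch(r"v(\d+)\.(\d+)\.(\d+)", tag)
    if m is None:
        raise ValueError(f"not a vMajor.Minor.Build tag: {tag!r}")
    return tuple(int(part) for part in m.groups())

print(parse_version("v5.1.1012"))   # (5, 1, 1012)
# Tuples compare element-wise, so versions order correctly:
print(parse_version("v5.1.1012") < parse_version("v5.2.1"))  # True
```

Representing versions as integer tuples avoids the classic mistake of comparing them as strings, where "v5.10.0" would sort before "v5.2.0".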
Regression Testing
Regression testing ensures that no new faults are
introduced when the system is modified, expanded
or enhanced, as well as when previously
uncovered faults are corrected.
Procedure:
1. Insert / modify code
2. Test the directly affected functions
3. Execute all available tests to make sure that nothing
else got broken
Invaluable for phased development!
Unit testing coupled with nightly builds serves as a
regression testing tool for agile methodology.
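The three-step procedure can be sketched with the standard unittest module. The `discount` function is a hypothetical example of freshly modified code (step 1); one test case targets the directly affected function (step 2) and the rest of the suite re-runs to make sure nothing else got broken (step 3).

```python
import unittest

def discount(price, percent):          # step 1: inserted / modified code
    """Return the price after applying a percentage discount."""
    return round(price * (1 - percent / 100), 2)

class DirectlyAffected(unittest.TestCase):   # step 2: test the change
    def test_modified_function(self):
        self.assertEqual(discount(100.0, 15), 85.0)

class ExistingBehaviour(unittest.TestCase):  # step 3: re-run everything
    def test_zero_discount_unchanged(self):
        self.assertEqual(discount(50.0, 0), 50.0)

# Run the whole suite programmatically, as a nightly build would.
loader = unittest.defaultTestLoader
suite = unittest.TestSuite()
for case in (DirectlyAffected, ExistingBehaviour):
    suite.addTests(loader.loadTestsFromTestCase(case))
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("regression suite passed:", result.wasSuccessful())
```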
Source & Version Control
Open Source: CVS flavors
Microsoft: Visual Source Safe, TFS
• Each version is stored separately
• Delta changes
• Modification comments
• Revision history including users
Do not use conditional compilation for tasks
other than debugging, tracing or checked
builds.
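Python has no preprocessor, but its built-in `__debug__` flag plays a role similar to a debug conditional-compilation symbol: code guarded by it (like all `assert` statements) is stripped when running with `python -O`. Per the guideline above, such guards are best kept to debugging and tracing. The `transfer` function is a hypothetical example.

```python
def transfer(accounts, src, dst, amount):
    """Move amount between two accounts in a balance dict."""
    if __debug__:                       # tracing only in debug runs;
        print(f"trace: moving {amount} from {src} to {dst}")  # removed by -O
    accounts[src] -= amount
    accounts[dst] += amount
    return accounts

accounts = transfer({"a": 100, "b": 0}, "a", "b", 25)
print(accounts)                         # {'a': 75, 'b': 25}
```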
Test Team
Testers: devise test plans; design,
organize and run tests
Analysts: assist testers; provide
guidance on process verification
as well as on the presentation of the
test results
Designers: assist testers; provide
guidance on testing components and
the software architecture
Users: provide feedback to the
development team
Function Testing
Thread Testing: verifying a narrow set of
actions associated with a particular function
Test process requirements:
• Test plan including the stopping criteria
• High probability of detecting a fault
• Use of independent testers
• The expected actions and the expected
output must be specified
• Both valid and invalid input must be tested
• The system must not be modified in the testing
process (e.g. to make testing easier)
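A thread test over one narrow function might look like the sketch below. The `withdraw` operation is a hypothetical example; note that the expected output is specified in advance, and that invalid input is tested too, expecting a clean rejection rather than a crash.

```python
def withdraw(balance, amount):
    """Return the new balance; reject invalid withdrawal requests."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal amount")
    return balance - amount

# Valid input: the expected action and output are stated up front.
assert withdraw(100, 30) == 70

# Invalid input must be tested as well: expect a rejection.
try:
    withdraw(100, 500)
except ValueError:
    print("invalid input correctly rejected")
```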
Performance Testing
Load / Stress Testing: large numbers of
users / requests
Volume Testing: large quantities of data
Security Testing: test access,
authentication, data safety
Timing Testing: for control and event-
driven systems
Quality / Precision Testing: verify the
result accuracy (when applicable)
Recovery Testing: verify the recovery-from-
failure process (when applicable)
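A minimal sketch of a timing test: the operation under test must finish within a budget taken from the requirements. Both `process_request` and the 50 ms budget are illustrative assumptions.

```python
import time

BUDGET_SECONDS = 0.050          # assumed timing requirement: 50 ms

def process_request():
    """Hypothetical stand-in for the operation under test."""
    time.sleep(0.001)
    return "ok"

start = time.perf_counter()
result = process_request()
elapsed = time.perf_counter() - start

assert result == "ok"
assert elapsed < BUDGET_SECONDS, f"too slow: {elapsed:.3f}s"
print(f"completed in {elapsed * 1000:.1f} ms")
```

A real load or stress test would repeat this under many concurrent requests; the point here is only that the constraint comes from the requirements, not from the code.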
Reliability, Availability, Maintainability
Reliability: the probability of the software operating
without failure under given conditions for a
given time interval T.
R = MTTF / (T + MTTF)
Availability: the probability of the software operating
without failure at a given moment in time.
MTBF (Mean Time Between Failures) = MTTF (Mean Time
To Failure) + MTTR (Mean Time To Repair)
A = MTBF / (T + MTBF)
Maintainability: the probability of a successful
software maintenance operation being
performed within a stated interval of time.
M = T / (T + MTTR)
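Plugging assumed numbers into these formulas makes the measures concrete. The MTTF, MTTR and interval T below are invented for illustration, and availability is written as A = MTBF / (T + MTBF) so that it parallels the reliability formula.

```python
MTTF = 450.0   # mean time to failure, hours (assumed)
MTTR = 50.0    # mean time to repair, hours (assumed)
T = 50.0       # time interval of interest, hours (assumed)

MTBF = MTTF + MTTR                 # mean time between failures

R = MTTF / (T + MTTF)              # reliability
A = MTBF / (T + MTBF)              # availability
M = T / (T + MTTR)                 # maintainability

print(f"MTBF = {MTBF} h, R = {R:.3f}, A = {A:.3f}, M = {M:.3f}")
```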
Acceptance Testing
The purpose is to enable customers and users to determine
whether the system built really meets their needs and
expectations.
Benchmarking: a predetermined set of test cases
corresponding to typical usage conditions is executed
against the system
Pilot Testing: users employ the software as a small-scale
experiment or in a controlled environment
Alpha-Testing: pre-release closed / in-house user testing
Beta-Testing: pre-release public user testing
Parallel Testing: old and new software are used together and
the old software is gradually phased out.
Acceptance testing uncovers requirement discrepancies
and helps users find out what they really want (hopefully
not at the developer’s expense!)
Deployment Testing
The software is installed on a target
system and tested with:
• Various hardware configurations
• Various supported OS versions and
service packs
• Various versions of third-party
components (e.g. database servers,
web servers, etc.)
• Various configurations of resident
software
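One way to organize these variation points is as a test matrix: the cross product of the supported options. All the option names below are illustrative assumptions, not a real support list.

```python
from itertools import product

# Hypothetical variation points for a deployment test matrix.
os_versions = ["Windows Server 2019", "Windows Server 2022"]
db_servers = ["PostgreSQL 15", "PostgreSQL 16"]
hardware = ["4-core/8GB", "8-core/32GB"]

matrix = list(product(os_versions, db_servers, hardware))
print(f"{len(matrix)} configurations to test")   # 2 * 2 * 2 = 8
for config in matrix:
    # A real harness would deploy and exercise the build per config here.
    pass
```

The matrix grows multiplicatively with each variation point, which is why deployment testing usually prunes it to a representative subset rather than testing every combination.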