ASM Performance Test Report
Conducted By: Murray Wardle & Srimanta Mukherjee
Access Testing
Date: 19/10/2010
1. Executive Summary
   1.1. Objectives
   1.2. Analysis Session Summary
   1.3. Conclusions
2. Scenario Configuration
   2.1. Files
   2.2. Scheduler Information
   2.3. Scripts
   2.4. Run Time Settings
        Test steps
3. Running Vusers
4. Average Transaction Response Time
5. Errors per Second
6. Throughput
7. Hits per Second
8. Connections
9. Terminology
   9.1. LoadRunner Objects
   9.2. Graph Information
1. Executive Summary
1.1. Objectives
Determine whether the system is capable of supporting the target
transaction rate while retaining its functional stability and responding within pre-
defined guidelines.
Identify any bottlenecks preventing the achievement of performance targets
and assist in their resolution.
1.2. Analysis Session Summary
Analysis Summary Period: 18-Oct-2010 [Link] PM - 18-Oct-2010 [Link] PM
Project Name: ASM
Test Name: 01_LoadTest1_100%
Test Description: ASM Load test at 100% peak load.
Run Time: 18-Oct-2010 [Link] PM
Duration: 2 hours and 14 minutes.
User Notes: none
Statistics Summary
Maximum Running Vusers: 17
Total Throughput (bytes): 148,921,442
Average Throughput (bytes/second): 18,520
Total Hits: 16,804
Average Hits per Second: 2.09
Total Errors: 17
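The average figures above are simply the totals spread over the test duration. The short Python sketch below reproduces them from the reported totals; it assumes the rounded duration of 2 hours and 14 minutes, so the throughput figure differs slightly from the reported 18,520 bytes/second, which Analysis derives from the exact elapsed time.

    # Sketch: reproduce the summary averages from the totals above, assuming
    # the rounded test duration of 2 hours and 14 minutes (8,040 seconds).
    duration_s = 2 * 3600 + 14 * 60              # 8,040 seconds
    total_throughput_bytes = 148_921_442
    total_hits = 16_804

    avg_throughput = total_throughput_bytes / duration_s   # ~18,523 bytes/second (reported 18,520)
    avg_hits = total_hits / duration_s                      # ~2.09 hits/second (reported 2.09)
    print(f"{avg_throughput:,.0f} bytes/s, {avg_hits:.2f} hits/s")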
Transaction Summary
Transactions: Total Passed: 1,377    Total Failed: 34    Total Stopped: 0

Average Response Time (seconds)
Transaction Name    Minimum    Average    Maximum    Std. Deviation    90 Percent    Pass    Fail    Stop
VU01_00_ALL 160.31 180.978 200.919 9.357 194.715 57 0 0
VU01_01_LandingPage 0.062 0.18 2.531 0.449 0.141 57 0 0
VU01_02_ClkImageASMqvw 1.048 1.121 1.484 0.061 1.172 57 0 0
VU01_03_ClickTabMnthlyRep 0.249 0.315 1.969 0.222 0.313 57 0 0
VU01_04_ClickTabHeldOfferDetails 0.156 0.18 0.203 0.01 0.187 57 0 0
VU01_05_ClickDropDownMonthFirstHeld 0.25 0.265 0.297 0.008 0.266 57 0 0
VU01_06_SelectMonth 0.203 0.216 0.719 0.068 0.219 57 0 0
VU01_07_SelectRegion 0.125 0.344 0.781 0.25 0.734 57 0 0
VU01_08_OrderByBusinessManager 0.109 0.325 0.906 0.241 0.687 57 0 0
VU01_09_FilterByProductName 0.016 0.018 0.031 0.006 0.031 57 0 0
VU01_10_BusinessVehicleLoan 0.156 0.386 2.312 0.362 0.766 57 0 0
VU02_00_ALL 97.796 121.62 136.811 7.744 131.607 56 0 0
VU02_01_LandingPage 0.062 0.273 2.453 0.602 0.219 56 0 0
VU02_02_ClickImageASMqvw 1.063 1.129 1.5 0.068 1.187 56 0 0
VU02_03_CurrentHeldOffersPage 0.187 0.202 0.266 0.013 0.219 56 0 0
VU02_04_CurrentHeldOffersRegion 0 0.001 0.062 0.008 0 56 0 0
VU02_05_BusinessManagerSummarisation 0.062 0.068 0.203 0.025 0.063 56 0 0
VU02_06_FiltersRegion 0.093 0.122 0.468 0.05 0.126 56 0 0
VU02_07_OrderByDealership 0.031 0.054 0.063 0.011 0.063 56 0 0
VU03_00_ALL 211.542 234.986 247.495 8.045 242.949 14 17 0
VU03_01_LandingPage 0.062 0.171 2.437 0.415 0.156 31 0 0
VU03_02_ClickImageASMqvw 1.031 1.107 1.235 0.041 1.156 31 0 0
VU03_03_ClickTabTimeAnalytics 0.265 0.29 0.343 0.015 0.297 30 1 0
VU03_04_Select_StartDate 0.125 0.128 0.188 0.012 0.125 28 2 0
VU03_05_Select_EndDate 0.125 0.127 0.156 0.006 0.125 27 1 0
VU03_06_SelectFirstStatus 0.204 0.219 0.234 0.004 0.219 24 3 0
VU03_07_SelectSecondStatus 0.296 0.3 0.312 0.006 0.312 23 1 0
VU03_08_SelectGranularity 0.515 0.525 0.594 0.02 0.531 23 0 0
VU03_09_FilterByReferredItems 0.484 0.496 0.547 0.013 0.501 22 1 0
VU03_10_FilterByRegion 0.484 0.512 0.562 0.025 0.562 19 3 0
VU03_11_OrderByAverageTime 0.406 0.43 0.578 0.039 0.437 16 3 0
VU03_12_ClearFiltersWorkAround 0.14 0.157 0.203 0.014 0.156 14 2 0
1.3. Conclusions
Five noticeable spikes in page response times occurred for the “QlikView Access Point” page; the impact of these spikes on the end user is minimal. All page response times were acceptable during the test.

Seventeen errors occurred during the test, all of which resulted in a “Failed to connect” message being sent to the client browser. Only pages in the “Time Analytics Tab” were affected. Error details and timestamps for each occurrence of these errors have been provided to the project for further investigation.
2. Scenario Configuration
2.1. Files
Name: d:\Apps\HP\Performance Center\orchidtmp\scenarios\
[Link]
Results in Session: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Session Name: D:\TEMP\MercuryOrchid_85829274\Sess_85835068\ASM\01_LoadTest1_100_\18_Oct_2010 03_46_43 PM\Results\R
[Link]
2.2. Scheduler Information
Started On: 10/18/2010 [Link]
Ended On: 10/18/2010 [Link]
Duration: Until Completion
Load Behavior: Load all Vusers simultaneously
2.3. Scripts
Script                                   Type          File
VU02_HeldOffersTab_v0_1_1                Multi+QTWeb   \\aupperfdb01\LRFS\2\Scripts\VU02_HeldOffersTab_v0_1\VU02_HeldOffersTab_v0_1.usr
VU01_MonthlyHeldOfferDetailsRep_v0_1_1   Multi+QTWeb   \\aupperfdb01\LRFS\2\Scripts\VU01_MonthlyHeldOfferDetailsRep_v0_1\VU01_MonthlyHeldOfferDetailsRep_v0_1.usr
VU03_TimeAnalyticsTab_v0_1_1             Multi+QTWeb   \\aupperfdb01\LRFS\2\Scripts\VU03_TimeAnalyticsTab_v0_1\VU03_TimeAnalyticsTab_v0_1.usr
2.4. Run Time Settings
Target load level: 100% of current peak load
Cache settings: Each iteration of each VU will emulate a new user with an empty
browser cache
Groups: Each script will form a single group
Load Injectors: Load from each group will be spread evenly across all injectors
Test steps
Ramp up: 15 minutes
Sustain: 90 minutes
Ramp down: 15 minutes
Process                   Total VUs   Think Time Settings*   Pacing**     Trans. Per Hour
VU01_HeldOffersReport     6           75% - 125%             576 - 864    30
VU02_HeldOffersTab        6           75% - 125%             576 - 864    30
VU03_TimeAnalytics        5           75% - 125%             960 - 1440   15
Total                     17                                              75
*The range of random deviation from the recorded think time.
**The range of start-to-start pacing, in seconds; the pacing interval for each iteration is selected at random from within this range.
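The transaction-per-hour targets in the table follow directly from the VU counts and the midpoint of the pacing range: each VU starts one iteration every mean-pacing seconds, so a group contributes VUs × 3600 / mean pacing iterations per hour. A small sketch of that arithmetic, using the figures from the table above:

    # Sketch: derive each group's transactions per hour from its VU count and
    # the midpoint of its start-to-start pacing range (figures from the table above).
    groups = {
        "VU01_HeldOffersReport": (6, (576, 864)),
        "VU02_HeldOffersTab":    (6, (576, 864)),
        "VU03_TimeAnalytics":    (5, (960, 1440)),
    }

    total = 0
    for name, (vusers, (low, high)) in groups.items():
        mean_pacing = (low + high) / 2             # seconds between iteration starts
        per_hour = vusers * 3600 / mean_pacing     # iterations started per hour by the group
        total += per_hour
        print(f"{name}: {per_hour:.0f} per hour")
    print(f"Total: {total:.0f} per hour")          # 30 + 30 + 15 = 75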
3. Running Vusers
Displays the number of Vusers that executed Vuser scripts, and their status, during each second
of a load test. This graph is useful for determining the Vuser load on your server at any given
moment.
Title: Running Vusers
Current Results: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Filters: Vuser Status = (Run)
Group By:
Granularity: 30 Seconds
Scale  Measurement  Graph Min.  Graph Ave.  Graph Max.  Graph Median  Graph SD
1 Run 0.0 8.094 17 8 5.162
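Because the graph is drawn at 30-second granularity, Analysis groups the per-second running-Vuser samples into 30-second intervals and plots one value per interval. A minimal sketch of that bucketing, using made-up samples rather than the actual result data:

    # Sketch of granularity bucketing: per-second samples are averaged into
    # fixed-size intervals (30 seconds here, matching this graph's granularity).
    # The sample data is illustrative only, not taken from the test results.
    def bucket(samples, granularity_s):
        """samples: list of (second, value) pairs; returns {interval start: average value}."""
        grouped = {}
        for second, value in samples:
            grouped.setdefault(second // granularity_s, []).append(value)
        return {start * granularity_s: sum(vals) / len(vals)
                for start, vals in sorted(grouped.items())}

    ramp = [(t, min(t // 60 + 1, 17)) for t in range(0, 300)]   # hypothetical ramp-up of running Vusers
    print(bucket(ramp, 30))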
4. Average Transaction Response Time
Displays the average time taken to perform transactions during each second of the load test.
This graph helps you determine whether the performance of the server is within acceptable
minimum and maximum transaction performance time ranges defined for your system.
Title: Average Transaction Response Time
Current Results: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Filters: Transaction End Status = (Pass), (Include Think Time)
Group By:
Granularity: 120 Seconds
Scale  Measurement  Min.  Ave.  Max.  SD
1 VU01_01_LandingPage 0.062 0.18 2.531 0.449
1 VU01_02_ClkImageASMqvw 1.048 1.121 1.484 0.061
1 VU01_03_ClickTabMnthlyRep 0.249 0.315 1.969 0.222
1 VU01_04_ClickTabHeldOfferDetails 0.156 0.18 0.203 0.01
1 VU01_05_ClickDropDownMonthFirstHeld 0.25 0.265 0.297 0.008
1 VU01_06_SelectMonth 0.203 0.216 0.719 0.068
1 VU01_07_SelectRegion 0.125 0.344 0.781 0.25
1 VU01_08_OrderByBusinessManager 0.109 0.325 0.906 0.241
1 VU01_09_FilterByProductName 0.016 0.018 0.031 0.006
1 VU01_10_BusinessVehicleLoan 0.156 0.386 2.312 0.362
1 VU02_01_LandingPage 0.062 0.273 2.453 0.602
1 VU02_02_ClickImageASMqvw 1.063 1.129 1.5 0.068
1 VU02_03_CurrentHeldOffersPage 0.187 0.202 0.266 0.013
1 VU02_04_CurrentHeldOffersRegion 0.0 0.001 0.062 0.008
1 VU02_05_BusinessManagerSummarisation 0.062 0.068 0.203 0.025
1 VU02_06_FiltersRegion 0.093 0.122 0.468 0.05
1 VU02_07_OrderByDealership 0.031 0.054 0.063 0.011
1 VU03_01_LandingPage 0.062 0.171 2.437 0.415
1 VU03_02_ClickImageASMqvw 1.031 1.107 1.235 0.041
1 VU03_03_ClickTabTimeAnalytics 0.265 0.29 0.343 0.015
1 VU03_04_Select_StartDate 0.125 0.128 0.188 0.012
1 VU03_05_Select_EndDate 0.125 0.127 0.156 0.006
1 VU03_06_SelectFirstStatus 0.204 0.219 0.234 0.004
1 VU03_07_SelectSecondStatus 0.296 0.3 0.312 0.006
1 VU03_08_SelectGranularity 0.515 0.525 0.594 0.02
1 VU03_09_FilterByReferredItems 0.484 0.496 0.547 0.013
1 VU03_11_OrderByAverageTime 0.406 0.43 0.578 0.039
1 VU03_12_ClearFiltersWorkAround 0.14 0.157 0.203 0.014
Most page response times were under 0.8 seconds and were stable and consistent throughout the entire test.
Five noticeable spikes in page response times occurred for the “QlikView Access Point”
page ([Link] - Measurement timers VU01_01, VU02_01 &
VU03_01). The worst of these spikes was 2.5 seconds.
The initial ASM page (Measurement timers VU01_02, VU02_02 & VU03_02) was the slowest page, typically taking 1.2 seconds. Again, response times for this page were stable and consistent throughout the entire test.
Overall page response times were acceptable during the test.
5. Errors per Second
Displays the average number of errors that occurred during each second of the scenario run,
grouped by error code.
Title: Errors per Second
Current Results: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Filters: None
Group By:
Granularity: 1 Second
Scale: 1
Measurement: Failed to connect | Error Description: Failed to connect | Error Number: -2147352567
Graph Min.: 0.0    Graph Ave.: 0.002    Graph Max.: 1    Graph Median: 0.0    Graph SD: 0.046
The error “Failed to connect | Error Description: Failed to connect | Error Number: -2147352567” occurred 17 times during the test on the following pages:
Time Analytics Tab
o Click Tab Time Analytics (x1)
o Select Start Date (x2)
o Select End Date (x1)
o Select First Status (x3)
o Select Second Status (x1)
o Filter By Referred Items (x1)
o Filter By Region (x3)
o Order By Average Time (x3)
o Clear Filters (x2)
Only functionality in the ‘Time Analytics Tab’ area was affected by this issue. No issues
were encountered in the ‘Held Offer Detail Report’ or ‘Held Offers Tab’ areas of ASM.
6. Throughput
Displays the amount of throughput (in bytes) on the Web server during the load test.
Throughput represents the amount of data that the Vusers received from the server at any
given second. This graph helps you to evaluate the amount of load Vusers generate, in terms of
server throughput.
Title: Throughput
Current Results: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Filters: None
Group By:
Granularity: 120 Seconds
Scale  Measurement  Graph Min.  Graph Ave.  Graph Max.  Graph Median  Graph SD
1 Throughput 0.0 18520.264 48645.475 17153.775 11771.508
7. Hits per Second
Displays the number of hits made on the Web server by Vusers during each second of the load
test. This graph helps you evaluate the amount of load Vusers generate, in terms of the
number of hits.
Title: Hits per Second
Current Results: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Filters: None
Group By:
Granularity: 120 Seconds
Scale  Measurement  Graph Min.  Graph Ave.  Graph Max.  Graph Median  Graph SD
1 Hits 0.0 2.09 5.45 2.008 1.342
8. Connections
Displays the number of open TCP/IP connections at each point during the load test.
Title: Connections
Current Results: d:\Apps\HP\Performance Center\orchidtmp\Results\5\
[Link]
Filters: None
Group By:
Granularity: 120 Seconds
Scale  Measurement  Graph Min.  Graph Ave.  Graph Max.  Graph Median  Graph SD
1 Connections 0.0 25.833 34 32 11.441
9. Terminology
9.1. LoadRunner Objects
Vuser Scripts: A Vuser script describes the actions that a Vuser performs during the scenario. Each Vuser executes a Vuser script during a scenario run. The Vuser scripts include functions that measure and record the performance of your application's components.

Load Test: Tests a system's ability to handle a heavy workload. A load test simulates multiple transactions or users interacting with the computer at the same time and provides reports on response times and system behavior.

Run-Time Settings: Run-Time settings allow you to customize the way a Vuser script is executed. You configure the run-time settings from the Controller or VuGen before running a scenario. You can view information about the Vuser groups and scripts that were run in each scenario, as well as the run-time settings for each script in a scenario, in the Scenario Run-Time Settings dialog box.

Scenario: A scenario defines the events that occur during each testing session. For example, a scenario defines and controls the number of users to emulate, the actions that they perform, and the machines on which they run their emulations.

Scheduler: The Schedule Builder allows you to set the time that the scenario will start running, the duration of the scenario or of the Vuser groups within the scenario, and to gradually run and stop the Vusers within the scenario or within a Vuser group. It also allows you to set the load behavior of Vusers in a scenario.

Session: When you work with the Analysis utility, you work within a session. An Analysis session contains at least one set of scenario results (.lrr file). The Analysis utility processes the scenario result information and generates graphs and reports. Analysis stores the display information and layout settings for the active graphs in a file with an .lra extension. Each session has a session name, result file name, database name, directory path, and type.

Transactions: A transaction represents an action or a set of actions used to measure the performance of the server. You define transactions within your Vuser script by enclosing the appropriate sections of the script with start and end transaction statements.

Vusers: Vusers, or virtual users, are used by LoadRunner as a replacement for human users. When you run a scenario, Vusers emulate the actions of human users working with your application. A scenario can contain tens, hundreds, or even thousands of Vusers running concurrently on a single workstation.
9.2. Graph Information
Average: The average value of the graph measurements.

Hits: The number of HTTP requests made by Vusers to the Web server.

Maximum: The maximum value of the graph measurements.

Measurement: The type of resource being monitored.

Median: The middle value of the graph measurements.

Minimum: The minimum value of the graph measurements.

Network Delay: The time it takes for a packet of data sent across the network to go to the requested node and return.

Network Path: The route data travels between the source machine and the destination machine.

Response time: The time taken to perform a transaction.

Scale (or granularity): In order to display all the measurements on a single graph, thus making the graphs easier to read and analyze, you can change the scale (or granularity) of the x-axis. You can either set measurement scales manually, view measurement trends for all measurements in the graph, or let Analysis scale them automatically. The Legend tab indicates the scale factor for each resource.

Standard Deviation (SD): The square root of the arithmetic mean of the squares of the deviations from the arithmetic mean.

Throughput: Throughput is measured in bytes and represents the amount of data that the Vusers received from the server.

Vuser Load: When you run a scenario, the Vusers generate load or stress on the server. LoadRunner monitors the effect of this load on the performance of your application.
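The graph statistics defined above can be reproduced for any measurement series; the sketch below computes them for a small, made-up set of samples (not values taken from the test results).

    # Sketch: compute the legend statistics defined above for a measurement series.
    # The sample values are illustrative only, not taken from the test results.
    import statistics

    samples = [0.18, 0.21, 0.19, 2.53, 0.20, 0.22]   # hypothetical response times in seconds

    print("Minimum:", min(samples))
    print("Average:", sum(samples) / len(samples))
    print("Maximum:", max(samples))
    print("Median:", statistics.median(samples))
    # Standard deviation: square root of the mean squared deviation from the mean.
    print("Standard Deviation:", statistics.pstdev(samples))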