
Camel Development with

Red Hat JBoss Fuse

COMPLETE SELF-ASSESSMENT GUIDE

PRACTICAL TOOLS FOR SELF-ASSESSMENT

Diagnose projects, initiatives, organizations, businesses and processes using accepted diagnostic standards and practices

Implement evidence-based best practice strategies aligned with overall goals

Integrate recent advances and process design strategies into practice according to best practice guidelines

Use the Self-Assessment tool Scorecard and develop a clear picture of which areas need attention

The Art of Service


Camel Development with Red Hat JBoss Fuse
Complete Self-Assessment Guide

The guidance in this Self-Assessment is based on Camel Development with Red Hat JBoss Fuse best practices and standards in business process architecture, design and quality management. The guidance is also based on the professional judgment of the individual collaborators listed in the Acknowledgments.

Notice of rights

You are permitted to use the Self-Assessment contents in your presentations and materials for internal use and customers without asking us - we are here to help.

All rights reserved for the book itself: this book may not be reproduced
or transmitted in any form by any means, electronic, mechanical,
photocopying, recording, or otherwise, without the prior written
permission of the publisher.

The information in this book is distributed on an “As Is” basis without warranty. While every precaution has been taken in the preparation of the book, neither the author nor the publisher shall have any liability to any person or entity with respect to any loss or damage caused or alleged to be caused directly or indirectly by the instructions contained in this book or by the products described in it.

Trademarks

Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those designations appear in this book, and the publisher was aware of a trademark claim, the designations appear as requested by the owner of the trademark. All other product names and services identified throughout this book are used in editorial fashion only and for the benefit of such companies with no intention of infringement of the trademark. No such use, or the use of any trade name, is intended to convey endorsement or other affiliation with this book.

Copyright © by The Art of Service


http://theartofservice.com
[email protected]

Table of Contents
About The Art of Service 3
Acknowledgments 4
Included Resources - how to access 4
Your feedback is invaluable to us 5
Purpose of this Self-Assessment 5
How to use the Self-Assessment 6
Camel Development with Red Hat JBoss Fuse
Scorecard Example 8
Camel Development with Red Hat JBoss Fuse
Scorecard 9

BEGINNING OF THE
SELF-ASSESSMENT: 10
CRITERION #1: RECOGNIZE 12
CRITERION #2: DEFINE: 23
CRITERION #3: MEASURE: 41
CRITERION #4: ANALYZE: 58
CRITERION #5: IMPROVE: 72
CRITERION #6: CONTROL: 91
CRITERION #7: SUSTAIN: 107
Index 122

About The Art of Service

The Art of Service, Business Process Architects since 2000, is dedicated to helping businesses achieve excellence.

Defining, designing, creating, and implementing a process to solve a business challenge or meet a business objective is the most valuable role… In EVERY company, organization and department.

Unless you’re talking about a one-time, single-use project within a business, there should be a process. Whether that process is managed and implemented by humans, AI, or a combination of the two, it needs to be designed by someone with a complex enough perspective to ask the right questions.

Someone capable of asking the right questions, stepping back and saying, ‘What are we really trying to accomplish here? And is there a different way to look at it?’

With The Art of Service’s Business Process Architect Self-Assessments, Research, Toolkits, Education and Certifications we empower people who can do just that — whether their title is marketer, entrepreneur, manager, salesperson, consultant, Business Process Manager, executive assistant, IT Manager, CIO etc. — they are the people who rule the future. They are the people who watch the process as it happens, and ask the right questions to make the process work better.

Contact us when you need any support with this Self-Assessment and any help with templates, blueprints and examples of standard documents you might need:

http://theartofservice.com
[email protected]

Acknowledgments

This checklist was developed under the auspices of The Art of Service, chaired by Gerardus Blokdyk.

Representatives from several client companies participated in the preparation of this Self-Assessment.

Our deepest gratitude goes out to Matt Champagne, Ph.D., Surveys Expert, for his invaluable help and advice in structuring the Self-Assessment.

Mr Champagne can be contacted at http://matthewchampagne.com/

In addition, we are thankful for the design and printing services provided.

Included Resources - how to access

Included with your purchase of the book is the Camel Development with Red Hat JBoss Fuse Self-Assessment downloadable resource, which contains all questions and Self-Assessment areas of this book.

Get it now - you will be glad you did; do it now, before you forget.

How? Simply send an email to [email protected] with this book’s title in the subject to get all the Camel Development with Red Hat JBoss Fuse Self-Assessment questions in a ready-to-use Excel spreadsheet, containing the self-assessment, graphs, and project RACI planning - all with examples to get you started right away.

Your feedback is invaluable to us

If you recently bought this book, we would love to hear from you! You can do this by writing a review on Amazon (or the online store where you purchased this book) about your last purchase! As part of our continual service improvement process, we love to hear real client experiences and feedback.

How does it work?

To post a review on Amazon, just log in to your account and click on the Create Your Own Review button (under Customer Reviews) on the relevant product page. You can find examples of product reviews on Amazon. If you purchased from another online store, simply follow their procedures.

What happens when I submit my review?

Once you have submitted your review, send us an email at [email protected] with the link to your review so we can properly thank you for your feedback.

Purpose of this Self-Assessment

This Self-Assessment has been developed to improve understanding of the requirements and elements of Camel Development with Red Hat JBoss Fuse, based on best practices and standards in business process architecture, design and quality management.

It is designed to allow for a rapid Self-Assessment of an organization or facility to determine how closely existing management practices and procedures correspond to the elements of the Self-Assessment.

The criteria of requirements and elements of Camel Development with Red Hat JBoss Fuse have been rephrased in the format of a Self-Assessment questionnaire, with a seven-criterion scoring system, as explained in this document.

In this format, even with limited background knowledge of Camel
Development with Red Hat JBoss Fuse, a facility or other business
manager can quickly review existing operations to determine
how they measure up to the standards. This in turn can serve as
the starting point of a ‘gap analysis’ to identify management tools
or system elements that might usefully be implemented in the
organization to help improve overall performance.

How to use the Self-Assessment

On the following pages are a series of questions to identify to what extent your Camel Development with Red Hat JBoss Fuse initiative is complete in comparison to the requirements set in standards.

To facilitate answering the questions, there is a space in front of each question to enter a score on a scale of ‘1’ to ‘5’.

1 Strongly Disagree
2 Disagree
3 Neutral
4 Agree
5 Strongly Agree

Read the question and rate it with the following in front of mind: ‘In my belief, the answer to this question is clearly defined’.

There are two ways in which you can choose to interpret this statement:

1. how aware are you that the answer to the question is clearly defined
2. for more in-depth analysis you can choose to gather evidence and confirm the answer to the question. This obviously will take more time; most Self-Assessment users opt for the first way to interpret the question and dig deeper later on based on the outcome of the overall Self-Assessment.

A score of ‘1’ would mean that the answer is not clear at all, where a ‘5’ would mean the answer is crystal clear and defined. Leave empty when the question is not applicable or you don’t want to answer it; you can skip it without affecting your score. Write your score in the space provided.

After you have responded to all the appropriate statements in each section, compute your average score for that section, using the formula provided, and round to the nearest tenth. Then transfer it to the corresponding spoke in the Camel Development with Red Hat JBoss Fuse Scorecard on the second next page of the Self-Assessment.
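The scoring arithmetic above can be sketched as follows (an illustrative helper, not part of the book’s toolkit); blank answers are skipped so they do not affect the average:

```python
def section_average(scores):
    """Average a section's 1-5 ratings, rounded to the nearest tenth.

    `scores` may contain None for questions left blank; those are
    skipped so they do not affect the result, as the instructions say.
    """
    answered = [s for s in scores if s is not None]
    if not answered:
        return None  # nothing answered in this section
    return round(sum(answered) / len(answered), 1)

# Example: five statements, one left blank - only four count.
print(section_average([5, 4, None, 3, 4]))  # → 4.0
```

The same result is what you would write on the corresponding Scorecard spoke.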

Your completed Camel Development with Red Hat JBoss Fuse Scorecard will give you a clear presentation of which Camel Development with Red Hat JBoss Fuse areas need attention.

Camel Development with Red Hat JBoss Fuse Scorecard Example

An example of how the finalized Scorecard can look:

Camel Development with Red Hat JBoss Fuse Scorecard

Your Scores:

BEGINNING OF THE SELF-ASSESSMENT

SELF-ASSESSMENT SECTION START
CRITERION #1: RECOGNIZE

INTENT: Be aware of the need for change. Recognize that there is an unfavorable variation, problem or symptom.

In my belief, the answer to this question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. How do you identify the kinds of information that you will need?
<--- Score

2. What tools and technologies are needed for a custom Camel Development with Red Hat JBoss Fuse project?
<--- Score

3. Does a troubleshooting guide exist or is it needed?
<--- Score

4. What does Camel Development with Red Hat JBoss Fuse success mean to the stakeholders?
<--- Score

5. Does Camel Development with Red Hat JBoss Fuse create potential expectations in other areas that need to be recognized and considered?
<--- Score

6. What problems are you facing and how do you consider Camel Development with Red Hat JBoss Fuse will circumvent those obstacles?
<--- Score

7. What is the smallest subset of the problem we can usefully solve?
<--- Score

8. How do you assess your Camel Development with Red Hat JBoss Fuse workforce capability and capacity needs, including skills, competencies, and staffing levels?
<--- Score

9. What training and capacity building actions are needed to implement proposed reforms?
<--- Score

10. Will a response program recognize when a crisis occurs and provide some level of response?
<--- Score

11. Problem statement/identification – What specifically is the problem?
<--- Score

12. What are the principles of software defect prevention?
<--- Score

13. Are the error diagnostics, such as error messages, straightforward, or does the user need a PhD in computer science to comprehend them?
<--- Score

14. Who defines the rules in relation to any given issue?
<--- Score

15. Are controls defined to recognize and contain problems?
<--- Score

16. When a Camel Development with Red Hat JBoss Fuse manager recognizes a problem, what options are available?
<--- Score

17. Create: What technologies may need to be created?
<--- Score

18. How do scaling issues affect the manner in which you fulfill your goal of identifying your initial scope?
<--- Score

19. Are there Camel Development with Red Hat JBoss Fuse problems defined?
<--- Score

20. What was the problem?
<--- Score

21. What are the business objectives to be achieved with Camel Development with Red Hat JBoss Fuse?
<--- Score

22. What could we have done to prevent this bug in the first place?
<--- Score

23. What prevents me from making the changes I know will make me a more effective Camel Development with Red Hat JBoss Fuse leader?
<--- Score

24. Who needs to know about Camel Development with Red Hat JBoss Fuse?
<--- Score

25. What would happen if Camel Development with Red Hat JBoss Fuse weren’t done?
<--- Score

26. Are there recognized Camel Development with Red Hat JBoss Fuse problems?
<--- Score

27. Will Camel Development with Red Hat JBoss Fuse deliverables need to be tested and, if so, by whom?
<--- Score

28. What if your business needs are still emerging and certain aspects of the system are rapidly changing or cannot be defined yet?
<--- Score

29. Will new equipment/products be required to facilitate Camel Development with Red Hat JBoss Fuse delivery, for example is new software needed?
<--- Score

30. Switch off during upgrade: what happens, does the application know there is a problem?
<--- Score

31. Will the program, module, or subroutine eventually terminate?
<--- Score

32. Why do we need to keep records?
<--- Score

33. Are difficult problems being deferred?
<--- Score

34. As a sponsor, customer or management, how important is it to meet goals, objectives?
<--- Score

35. How much are sponsors, customers, partners, stakeholders involved in Camel Development with Red Hat JBoss Fuse? In other words, what are the risks, if Camel Development with Red Hat JBoss Fuse does not deliver successfully?
<--- Score

36. Are there any specific expectations or concerns about the Camel Development with Red Hat JBoss Fuse team, or Camel Development with Red Hat JBoss Fuse itself?
<--- Score

37. What will be purchased, what needs to be written?
<--- Score

38. How does it fit into our organizational needs and tasks?
<--- Score

39. What problems to look for?
<--- Score

40. What vendors make products that address the Camel Development with Red Hat JBoss Fuse needs?
<--- Score

41. What Needs to Conform?
<--- Score

42. What are the expected benefits of Camel Development with Red Hat JBoss Fuse to the business?
<--- Score

43. Do we know what we need to know about this topic?
<--- Score

44. What should be considered when identifying available resources, constraints, and deadlines?
<--- Score

45. Will it solve real problems?
<--- Score

46. What are the different steps of software defect prevention?
<--- Score

47. Think about the people you identified for your Camel Development with Red Hat JBoss Fuse project and the project responsibilities you would assign to them. What kind of training do you think they would need to perform these responsibilities effectively?
<--- Score

48. How are we going to measure success?
<--- Score

49. What information do users need?
<--- Score

50. How long does it take to close a problem report?
<--- Score

51. How do you identify the information basis for later specification of performance or acceptance criteria?
<--- Score

52. Can Management personnel recognize the monetary benefit of Camel Development with Red Hat JBoss Fuse?
<--- Score

53. Are there any explicit or implicit addressing problems if, on the machine being used, the units of memory allocation are smaller than the units of memory addressability?
<--- Score

54. What else needs to be measured?
<--- Score

55. How can auditing be a preventative security measure?
<--- Score

56. Does our organization need more Camel Development with Red Hat JBoss Fuse education?
<--- Score

57. What should occur in the event of an incident?
<--- Score

58. What could have been done to prevent this bug in the first place?
<--- Score

59. How are the Camel Development with Red Hat JBoss Fuse’s objectives aligned to the organization’s overall business strategy?
<--- Score

60. How could the error have been prevented?
<--- Score

61. Consider your own Camel Development with Red Hat JBoss Fuse project. What types of organizational problems do you think might be causing or affecting your problem, based on the work done so far?
<--- Score

62. Who else hopes to benefit from it?
<--- Score

63. How do we prevent defects & increase the pace of testing?
<--- Score

64. What situation(s) led to this Camel Development with Red Hat JBoss Fuse Self-Assessment?
<--- Score

65. How many problem reports are open?
<--- Score

66. Is it clear when you think of the day ahead of you what activities and tasks you need to complete?
<--- Score

67. Have you identified your Camel Development with Red Hat JBoss Fuse key performance indicators?
<--- Score

68. How do we identify specific Camel Development with Red Hat JBoss Fuse investment and emerging trends?
<--- Score

69. Will every loop eventually terminate?
<--- Score

70. How do you prevent errors and rework?
<--- Score

71. For your Camel Development with Red Hat JBoss Fuse project, identify and describe the business environment. Is there more than one layer to the business environment?
<--- Score

72. How many tests are needed?
<--- Score

73. How many problem reports have been written?
<--- Score

74. What do we need to start doing?
<--- Score

75. Are reported problems being closed in a timely manner?
<--- Score

Add up total points for this section:

_____ = Total points for this section

Divided by: ______ (number of statements answered) = ______ Average score for this section

Transfer your score to the Camel Development with Red Hat JBoss Fuse Index at the beginning of the Self-Assessment.

SELF-ASSESSMENT SECTION START
CRITERION #2: DEFINE:

INTENT: Formulate the business problem. Define the problem, needs and objectives.

In my belief, the answer to this question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. Has the direction changed at all during the course of Camel Development with Red Hat JBoss Fuse? If so, when did it change and why?
<--- Score

2. Do you have at least one test case specifying noninteger values (such as 2.5, 3.5, 5.5)?
<--- Score

3. Non-functional requirements testing – have non-functional requirements such as usability, performance and reliability been met?
<--- Score

4. How did the Camel Development with Red Hat JBoss Fuse manager receive input to the development of a Camel Development with Red Hat JBoss Fuse improvement plan and the estimated completion dates/times of each activity?
<--- Score

5. Has the improvement team collected the ‘voice of the customer’ (obtained feedback – qualitative and quantitative)?
<--- Score

6. Are security/privacy roles and responsibilities formally defined?
<--- Score

7. Size of Test Case File?
<--- Score

8. Is it clearly defined in and to your organization what you do?
<--- Score

9. Is Camel Development with Red Hat JBoss Fuse linked to key business goals and objectives?
<--- Score

10. What are the management responsibilities regarding ISO 9001 requirements?
<--- Score

11. Who are the Camel Development with Red Hat JBoss Fuse improvement team members, including Management Leads and Coaches?
<--- Score

12. Define white box testing?
<--- Score

13. Has anyone else (internal or external to the organization) attempted to solve this problem or a similar one before? If so, what knowledge can be leveraged from these previous efforts?
<--- Score

14. Are there any constraints known that bear on the ability to perform Camel Development with Red Hat JBoss Fuse work? How is the team addressing them?
<--- Score

15. Is there a completed, verified, and validated high-level ‘as is’ (not ‘should be’ or ‘could be’) business process map?
<--- Score

16. Project scope – What are the boundaries of the scope?
<--- Score

17. Have the customer needs been translated into specific, measurable requirements? How?
<--- Score

18. What are some keys to successfully conquering ever-changing business requirements?
<--- Score

19. What would be the goal or target for a Camel Development with Red Hat JBoss Fuse’s improvement team?
<--- Score

20. Since the goal of testing is to find errors, why not make the completion criterion the detection of some predefined number of errors?
<--- Score

21. How was the ‘as is’ process map developed, reviewed, verified and validated?
<--- Score

22. Is the scope of Camel Development with Red Hat JBoss Fuse defined?
<--- Score

23. How do we test OOA models (requirements and use cases)?
<--- Score

24. What subset of all possible test cases has the highest probability of detecting the most errors?
<--- Score

25. Is the current ‘as is’ process being followed? If not, what are the discrepancies?
<--- Score

26. Has everyone on the team, including the team leaders, been properly trained?
<--- Score

27. Is there a completed SIPOC representation, describing the Suppliers, Inputs, Process, Outputs, and Customers?
<--- Score

28. Are accountability and ownership for Camel Development with Red Hat JBoss Fuse clearly defined?
<--- Score

29. What tools and roadmaps did you use for getting through the Define phase?
<--- Score

30. Is Camel Development with Red Hat JBoss Fuse currently on schedule according to the plan?
<--- Score

31. For all array references, is each subscript value within the defined bounds of the corresponding dimension?
<--- Score

32. Are audit criteria, scope, frequency and methods defined?
<--- Score

33. How do senior leaders promote an environment that fosters and requires legal and ethical behavior?
<--- Score

34. In what way can we redefine the criteria of choice in our category in our favor, as Method introduced style and design to cleaning and Virgin America returned glamor to flying?
<--- Score

35. What defines Best in Class?
<--- Score

36. How will variation in the actual durations of each activity be dealt with to ensure that the expected Camel Development with Red Hat JBoss Fuse results are met?
<--- Score

37. What Organizational Structure is Required?
<--- Score

38. Are different versions of process maps needed to account for the different types of inputs?
<--- Score

39. How would you define the culture here?
<--- Score

40. What are CASE tools?
<--- Score

41. Have all of the relationships been defined properly?
<--- Score

42. What is expected of implementations in order to claim conformance – i.e., what are the requirements?
<--- Score

43. What key business process output measure(s) does Camel Development with Red Hat JBoss Fuse leverage and how?
<--- Score

44. Has a high-level ‘as is’ process map been completed, verified and validated?
<--- Score

45. What are the compelling business reasons for embarking on Camel Development with Red Hat JBoss Fuse?
<--- Score

46. Are task requirements clearly defined?
<--- Score

47. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?
<--- Score

48. What are the capability levels defined in SPICE?
<--- Score

49. What critical content must be communicated – who, what, when, where, and how?
<--- Score

50. Is there a Camel Development with Red Hat JBoss Fuse management charter, including business case, problem and goal statements, scope, milestones, roles and responsibilities, communication plan?
<--- Score

51. Will team members perform Camel Development with Red Hat JBoss Fuse work when assigned and in a timely fashion?
<--- Score

52. Are there any variables with similar names (VOLT and VOLTS, for example)?
<--- Score

53. Has a team charter been developed and communicated?
<--- Score

54. How is the team tracking and documenting its work?
<--- Score

55. Structure definitions match across procedures?
<--- Score

56. Is the team adequately staffed with the desired cross-functionality? If not, what additional resources are available to the team?
<--- Score

57. How do you keep key subject matter experts in the loop?
<--- Score

58. How does the Camel Development with Red Hat JBoss Fuse manager ensure against scope creep?
<--- Score

59. Do you have a test case that represents a valid equilateral triangle?
<--- Score

60. Are approval levels defined for contracts and supplements to contracts?
<--- Score

61. In conducting formal technical reviews to assess test strategy & test cases, who watches the watchers?
<--- Score

62. Do you have a test case that represents a valid isosceles triangle?
<--- Score

63. For object-oriented languages, are all inheritance requirements met in the implementing class?
<--- Score

64. Are customer(s) identified and segmented according to their different needs and requirements?
<--- Score

65. What constraints exist that might impact the team?
<--- Score

66. To what level of detail will you capture the requirements, if at all?
<--- Score

67. When is the estimated completion date?
<--- Score

68. What are the rough order estimates on cost savings/opportunities that Camel Development with Red Hat JBoss Fuse brings?
<--- Score

69. Is data collected and displayed to better understand customer(s) critical needs and requirements?
<--- Score
70. What are the dynamics of the communication plan?
<--- Score

71. What are the boundaries of the scope? What is in bounds and what is not? What is the start point? What is the stop point?
<--- Score

72. Is the team equipped with available and reliable resources?
<--- Score

73. What baselines are required to be defined and managed?
<--- Score

74. Many teams will find that informal modeling sessions around whiteboards will be sufficient, although sometimes more formal modeling sessions, such as Joint Application Design (JAD) strategies or stakeholder interviews, will work best. How will non-functional requirements pertaining to availability, security, performance, and many other factors be addressed?
<--- Score

75. Have specific policy objectives been defined?
<--- Score

76. What sources do you use to gather information for a Camel Development with Red Hat JBoss Fuse study?
<--- Score

77. What are the requirements of internal auditing?
<--- Score

78. Which part of the systems requires most attention?
<--- Score

79. Does the team have regular meetings?
<--- Score

80. Are inheritance requirements met?
<--- Score

81. Is there a critical path to deliver Camel Development with Red Hat JBoss Fuse results?
<--- Score

82. How would one define Camel Development with Red Hat JBoss Fuse leadership?
<--- Score

83. Is the Camel Development with Red Hat JBoss Fuse scope manageable?
<--- Score

84. What specifically is the problem? Where does it occur? When does it occur? What is its extent?
<--- Score

85. Will team members regularly document their Camel Development with Red Hat JBoss Fuse work?
<--- Score

86. What is the minimum educational requirement for potential new hires?
<--- Score

87. Is full participation by members in regularly held team meetings guaranteed?
<--- Score

88. Are customers identified and high impact areas defined?
<--- Score

89. Has the Camel Development with Red Hat JBoss Fuse work been fairly and/or equitably divided and delegated among team members who are qualified and capable to perform the work? Has everyone contributed?
<--- Score

90. When was the Camel Development with Red Hat JBoss Fuse start date?
<--- Score

91. Do the problem and goal statements meet the SMART criteria (specific, measurable, attainable, relevant, and time-bound)?
<--- Score

92. Scope – what should be addressed?
<--- Score

93. Do we all define Camel Development with Red Hat JBoss Fuse in the same way?
<--- Score

94. How often are the team meetings?
<--- Score

95. For a loop controlled by both iteration and a Boolean condition (a searching loop, for example), what are the consequences of loop fall-through?
<--- Score
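Question 95’s searching loop deserves a concrete picture. A minimal sketch (an illustrative helper, not code from this guide) of a loop with both an iteration bound and a Boolean exit, where fall-through is given an explicit meaning:

```python
def find_index(items, target):
    """Linear search: a loop controlled by iteration AND a Boolean condition."""
    for i, item in enumerate(items):
        if item == target:
            return i   # exit via the Boolean condition: target found
    return -1          # loop fall-through: iteration exhausted, no match

print(find_index([3, 5, 7], 5))   # → 1
print(find_index([3, 5, 7], 9))   # → -1
```

The consequence of fall-through here is the -1 sentinel; a loop whose fall-through path is left implicit is a common source of defects.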

96. Do you have a test case in which one side has a zero value?
<--- Score

97. What components require additional testing or review?
<--- Score

98. Do you have a test case in which all sides are zero (0, 0, 0)?
<--- Score

99. What are the purposes, goals and requirements of the system?
<--- Score

100. Is the team sponsored by a champion or business leader?
<--- Score

101. Is Camel Development with Red Hat JBoss Fuse Required?
<--- Score

102. When are meeting minutes sent out? Who is on the distribution list?
<--- Score

103. What customer feedback methods were used to solicit their input?
<--- Score

104. If substitutes have been appointed, have they been briefed on the Camel Development with Red Hat JBoss Fuse goals and received regular communications as to the progress to date?
<--- Score

105. Do you have a test case in which one side has a negative value?
<--- Score
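Questions 59, 62, 96, 98 and 105 echo the classic triangle test-design exercise. A minimal classifier (an illustrative sketch, not code from this guide) that such test cases would exercise:

```python
def classify_triangle(a, b, c):
    """Classify three side lengths, rejecting degenerate inputs first."""
    if min(a, b, c) <= 0:
        return "invalid"       # zero or negative side (questions 96, 98, 105)
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"       # violates the triangle inequality
    if a == b == c:
        return "equilateral"   # question 59's valid equilateral case
    if a == b or b == c or a == c:
        return "isosceles"     # question 62's valid isosceles case
    return "scalene"

print(classify_triangle(3, 3, 3))   # → equilateral
print(classify_triangle(0, 0, 0))   # → invalid
```

The noninteger test case from Define question 2 (2.5, 3.5, 5.5) probes the same function with float sides.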

106. How and when will the baselines be defined?
<--- Score

107. Is the team formed and are team leaders (Coaches and Management Leads) assigned?
<--- Score

108. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score

109. Do the requirements that we’ve gathered and the models that demonstrate them constitute a full and accurate representation of what we want?
<--- Score

110. Are business processes mapped?
<--- Score

111. Are improvement team members fully trained on Camel Development with Red Hat JBoss Fuse?
<--- Score

112. Is a fully trained team formed, supported, and committed to work on the Camel Development with Red Hat JBoss Fuse improvements?
<--- Score

113. Are roles and responsibilities formally defined?
<--- Score

114. Does method X generate fewer test cases than the base test suite?
<--- Score

115. In what way can we redefine the criteria of choice clients have in our category in our favor?
<--- Score

116. Has a project plan, Gantt chart, or similar been developed/completed?
<--- Score

117. Are team charters developed?
<--- Score

118. How will the Camel Development with Red Hat JBoss Fuse team and the organization measure complete success of Camel Development with Red Hat JBoss Fuse?
<--- Score

119. Global variable definitions consistent across modules?
<--- Score

120. What are the Roles and Responsibilities for each team member and its leadership? Where is this documented?
<--- Score

121. What is the scope of the assessment?
<--- Score

122. Who defines (or who defined) the rules and roles?
<--- Score

123. How can the value of Camel Development with Red Hat JBoss Fuse be defined?
<--- Score

124. Are there different segments of customers?
<--- Score

125. Have all basic functions of Camel Development with Red Hat JBoss Fuse been defined?
<--- Score

126. Are Required Metrics Defined?
<--- Score

127. Has/have the customer(s) been identified?
<--- Score

Add up total points for this section:


_ _ _ _ _ = To t a l p o i n t s f o r t h i s s e c t i o n

Divided by: ______ (number of


statements answered) = ______
Average score for this section

Transfer your score to the Camel
Development with Red Hat JBoss Fuse
Index at the beginning of the Self-Assessment.
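The section arithmetic described above (total points divided by the number of statements answered) can be sketched as a small helper. This is an illustrative sketch only: the function name and the use of None to mark an unanswered statement are assumptions, not part of the Self-Assessment.

```python
def section_average(scores):
    # scores: the 1-5 ratings for this section; None marks an unanswered statement
    answered = [s for s in scores if s is not None]
    if not answered:
        return 0.0
    # Total points divided by the number of statements answered
    return sum(answered) / len(answered)

print(section_average([5, 4, 3, None, 2]))  # 14 points / 4 answered = 3.5
```

Note that dividing by the number of statements answered, rather than the number of statements in the section, keeps skipped questions from dragging the average down.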

SELF-ASSESSMENT SECTION
START
CRITERION #3: MEASURE:

INTENT: Gather the correct data.


Measure the current performance and
evolution of the situation.

In my belief, the answer to this


question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. What are your key Camel Development with


Red Hat JBoss Fuse organizational performance
measures, including key short and longer-term
financial measures?
<--- Score

2. How frequently do we track measures?


<--- Score

3. Are there any easy-to-implement alternatives
to Camel Development with Red Hat JBoss Fuse?
Sometimes other solutions are available that do not
require the cost implications of a full-blown project.
<--- Score

4. Are there any loop bypasses because of entry conditions?


<--- Score

5. Can We Measure the Return on Analysis?


<--- Score

6. Are we taking our company in the direction of
'better and revenue' or 'cheaper and cost'?
<--- Score

7. How will effects be measured?


<--- Score

8. What measurements are being captured?


<--- Score

9. Why do measures/indicators matter?


<--- Score

10. Was a data collection plan established?


<--- Score

11. Have stakeholder concerns been obtained and
analyzed to help identify and define potential
barriers?
<--- Score

12. How will success or failure be measured?


<--- Score

13. How is Knowledge Management Measured?

<--- Score

14. What is an unallowable cost?


<--- Score

15. What is quality cost?


<--- Score

16. Is a solid data collection plan established that


includes measurement systems analysis?
<--- Score

17. Which customers can’t participate in our market


because they lack skills, wealth, or convenient access
to existing solutions?
<--- Score

18. What to measure and why?


<--- Score

19. Is it possible that, because of the conditions


upon entry, a loop will never execute?
<--- Score
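The concern behind questions 4 and 19 — a loop whose entry condition can fail on the first test, so the body never executes — can be made concrete with a minimal sketch. The function and values below are hypothetical, chosen only to illustrate the inspection point.

```python
def sum_from(items, start):
    # If start >= len(items), the entry condition fails immediately
    # and the loop body never executes -- a "loop bypass".
    total = 0
    i = start
    while i < len(items):
        total += items[i]
        i += 1
    return total

print(sum_from([1, 2, 3], 5))  # entry condition false on the first test -> 0
```

During review, the question to ask is whether returning 0 (or skipping the body entirely) is the intended behavior for such inputs, or a silent defect.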

20. Are priorities and opportunities deployed


to your suppliers, partners, and collaborators to
ensure organizational alignment?
<--- Score

21. How will the process owner verify


improvement in present and future sigma levels,
process capabilities?
<--- Score

22. What has the team done to assure the stability and
accuracy of the measurement process?

<--- Score

23. What particular quality tools did the team find


helpful in establishing measurements?
<--- Score

24. What are the different diagrams defined in


UML?
<--- Score

25. What potential environmental factors impact


the Camel Development with Red Hat JBoss Fuse
effort?
<--- Score

26. Are the measurements objective?


<--- Score

27. What are the key indicators that you will measure,
analyze, and track?
<--- Score

28. What are the agreed upon definitions of the high


impact areas, defect(s), unit(s), and opportunities that
will figure into the process capability metrics?
<--- Score

29. How could principles be more precisely


measured or valued?
<--- Score

30. Have the types of risks that may impact Camel


Development with Red Hat JBoss Fuse been identified
and analyzed?
<--- Score

31. Does the practice systematically track and analyze
outcomes for accountability and quality
improvement?
<--- Score

32. What is the total cost related to deploying


Camel Development with Red Hat JBoss Fuse,
including any consulting or professional services?
<--- Score

33. What are the uncertainties surrounding


estimates of impact?
<--- Score

34. How frequently do you track Camel


Development with Red Hat JBoss Fuse measures?
<--- Score

35. Which methods and measures do you use to


determine workforce engagement and workforce
satisfaction?
<--- Score

36. What are the uses of arrow diagram?


<--- Score

37. Customer Measures: How Do Customers See Us?


<--- Score

38. What are my customers' expectations and
measures?
<--- Score

39. What was the cost of the investment?


<--- Score

40. What are the measures of software quality?
<--- Score

41. Who participated in the data collection for


measurements?
<--- Score

42. Is data collection planned and executed?


<--- Score

43. Why Measure?


<--- Score

44. Which customers can't participate in our Camel
Development with Red Hat JBoss Fuse domain
because they lack skills, wealth, or convenient
access to existing solutions?
<--- Score

45. Are you taking your company in the direction of
'better and revenue' or 'cheaper and cost'?
<--- Score

46. Is it possible to estimate the impact of


unanticipated complexity such as wrong or failed
assumptions, feedback, etc. on proposed reforms?
<--- Score

47. Are the units of measure consistent?


<--- Score

48. How do we measure bugs at delivery time?


<--- Score
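One common way to answer question 48 is defect density: defects found per thousand delivered lines of code (KLOC). The choice of this particular metric, and the figures used, are illustrative assumptions, not a recommendation from the Self-Assessment.

```python
def defect_density(defects_found, delivered_loc):
    # Defects per thousand delivered lines of code (KLOC)
    return defects_found / (delivered_loc / 1000.0)

print(defect_density(12, 48000))  # 12 defects in 48 KLOC -> 0.25 per KLOC
```

Tracked release over release, a figure like this supports trend questions elsewhere in this section ("How frequently do we track measures?").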

49. What is the cost of poor quality as supported


by the teams analysis?

<--- Score

50. Is the cause of the bug reproduced in another


part of the program?
<--- Score

51. What do the charts tell us in terms of variation?


<--- Score

52. When is Knowledge Management Measured?


<--- Score

53. Do we effectively measure and reward individual


and team performance?
<--- Score

54. What charts has the team used to display the


components of variation in the process?
<--- Score

55. What are the uses of control charts?


<--- Score

56. What data was collected (past, present, future/


ongoing)?
<--- Score

57. Are there measurements based on task


performance?
<--- Score

58. Can you afford to lock your business into a


rigid long-term project where the cost of change
grows exponentially?
<--- Score

59. Meeting the challenge: are missed Camel
Development with Red Hat JBoss Fuse
opportunities costing us money?
<--- Score

60. How will you measure your Camel Development


with Red Hat JBoss Fuse effectiveness?
<--- Score

61. Is this an issue for analysis or intuition?


<--- Score

62. How will measures be used to manage and adapt?


<--- Score

63. How do Agile projects prioritize work?


<--- Score

64. How to measure lifecycle phases?


<--- Score

65. How will your organization measure success?


<--- Score

66. Does Camel Development with Red Hat JBoss


Fuse analysis isolate the fundamental causes of
problems?
<--- Score

67. Does the Camel Development with Red Hat


JBoss Fuse task fit the client’s priorities?
<--- Score

68. How do you identify and analyze stakeholders and


their interests?
<--- Score

69. What quality tools were used to get through
the analyze phase?
<--- Score

70. How can we measure the performance?


<--- Score

71. Have you tested your analysis and design?


<--- Score

72. Does the cost to fix bugs grow with time?


<--- Score

73. How do senior leaders create a focus on action
to accomplish the organization's objectives and
improve performance?
<--- Score

74. How do we test OOD models (class and


sequence diagrams)?
<--- Score

75. What evidence is there and what is measured?


<--- Score

76. How is progress measured?


<--- Score

77. What about Camel Development with Red Hat


JBoss Fuse Analysis of results?
<--- Score

78. Is there a Performance Baseline?


<--- Score

79. Will We Aggregate Measures across Priorities?
<--- Score

80. Is performance measured?


<--- Score

81. How are measurements made?


<--- Score

82. Are losses documented, analyzed, and remedial


processes developed to prevent future losses?
<--- Score

83. Who should receive measurement reports?


<--- Score

84. Why do the measurements/indicators matter?


<--- Score

85. What is the right balance of time and resources


between investigation, analysis, and discussion
and dissemination?
<--- Score

86. Do we aggressively reward and promote the


people who have the biggest impact on creating
excellent Camel Development with Red Hat JBoss
Fuse services/products?
<--- Score

87. Have all non-recommended alternatives been


analyzed in sufficient detail?
<--- Score

88. How Will We Measure Success?


<--- Score

89. What methods are feasible and acceptable to
estimate the impact of reforms?
<--- Score

90. Why should we expend time and effort to


implement measurement?
<--- Score

91. Is the solution cost-effective?


<--- Score

92. Have you found any ‘ground fruit’ or ‘low-


hanging fruit’ for immediate remedies to the gap in
performance?
<--- Score

93. What key measures identified indicate the


performance of the business process?
<--- Score

94. What will be measured?


<--- Score

95. Is the amount of rework impacting the cost and


schedule?
<--- Score

96. What is measured?


<--- Score

97. Is key measure data collection planned


and executed, process variation displayed and
communicated and performance baselined?
<--- Score

98. How to measure variability?
<--- Score

99. Are key measures identified and agreed upon?


<--- Score

100. How do we do risk analysis of rare, cascading,


catastrophic events?
<--- Score

101. How do you measure success?


<--- Score

102. What measurements are possible, practicable


and meaningful?
<--- Score

103. Are control charts being used or needed?


<--- Score

104. Have changes been properly/adequately


analyzed for effect?
<--- Score

105. What are the different errors for which defect


prevention analysis is required?
<--- Score

106. What are the key input variables? What are


the key process variables? What are the key output
variables?
<--- Score

107. Is the amount of rework impacting cost or


schedule?
<--- Score

108. Is data collected on key measures that were
identified?
<--- Score

109. Are process variation components displayed/


communicated using suitable charts, graphs, plots?
<--- Score

110. What are measures?


<--- Score

111. Are Acceptance Tests specified by the
customer and analyst to test that the overall
system is functioning as required (do developers
build the right system)?
<--- Score

112. Does Camel Development with Red Hat JBoss


Fuse systematically track and analyze outcomes for
accountability and quality improvement?
<--- Score

113. Does Camel Development with Red Hat


JBoss Fuse analysis show the relationships among
important Camel Development with Red Hat JBoss
Fuse factors?
<--- Score

114. How do we focus on what is right -not who is


right?
<--- Score

115. How can you measure Camel Development with


Red Hat JBoss Fuse in a systematic way?
<--- Score

116. Which Stakeholder Characteristics Are Analyzed?
<--- Score

117. What should be measured?


<--- Score

118. Why identify and analyze stakeholders and their


interests?
<--- Score

119. What are the types and number of measures to


use?
<--- Score

120. Is Process Variation Displayed/Communicated?


<--- Score

121. Will any special training be provided for


control chart interpretation?
<--- Score

122. How large is the gap between current


performance and the customer-specified (goal)
performance?
<--- Score

123. Among the Camel Development with Red
Hat JBoss Fuse product and service costs to
be estimated, which is considered hardest to
estimate?
<--- Score

124. Is long term and short term variability accounted


for?
<--- Score

125. Do staff have the necessary skills to collect,
analyze, and report data?
<--- Score

126. Do we need less effort for testing because of
greater reuse of design patterns?
<--- Score

127. Where is it measured?


<--- Score

128. What Relevant Entities could be measured?


<--- Score

129. How is the value delivered by Camel


Development with Red Hat JBoss Fuse being
measured?
<--- Score

130. Are high impact defects defined and identified in


the business process?
<--- Score

131. What are the costs of reform?


<--- Score

132. How are you going to measure success?


<--- Score

133. Can we do Camel Development with Red Hat


JBoss Fuse without complex (expensive) analysis?
<--- Score

Add up total points for this section:


_____ = Total points for this section

Divided by: ______ (number of
statements answered) = ______
Average score for this section

Transfer your score to the Camel
Development with Red Hat JBoss Fuse
Index at the beginning of the Self-Assessment.

SELF-ASSESSMENT SECTION
START
CRITERION #4: ANALYZE:

INTENT: Analyze causes, assumptions


and hypotheses.

In my belief, the answer to this


question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. Word processors, databases, custom tools: what
will be purchased, and what needs to be written?
<--- Score

2. What tools were used to narrow the list of possible


causes?
<--- Score

3. Did any additional data need to be collected?


<--- Score

4. When conducting a business process
reengineering study, what should we look for
when trying to identify business processes to
change?
<--- Score

5. Are there any comparisons between variables


having different datatypes, such as comparing a
character string to an address, date, or number?
<--- Score

6. How was the detailed process map generated,


verified, and validated?
<--- Score

7. What are the disruptive Camel Development


with Red Hat JBoss Fuse technologies that enable
our organization to radically change our business
processes?
<--- Score

8. What are the different process model views?


<--- Score

9. Do the attributes (e.g., datatype and size) of


each parameter match the attributes of each
corresponding argument?
<--- Score

10. How to identify the expected output?


<--- Score

11. When you are identifying the potential


technical strategy(s) you have several process
factors that you should address. As with initial
scoping how much detail you go into when

documenting the architecture, the views that
you create, and your approach to modeling
are important considerations. Furthermore,
will you be considering one or more candidate
architectures and what is your overall delivery
strategy?
<--- Score

12. Was a detailed process map created to amplify


critical steps of the ‘as is’ business process?
<--- Score

13. What are the best opportunities for value


improvement?
<--- Score

14. How is the way you, as the leader, think and process
information affecting your organizational culture?
<--- Score

15. Have the problem and goal statements been


updated to reflect the additional knowledge gained
from the analyze phase?
<--- Score

16. Are there any computations using variables


having inconsistent (such as nonarithmetic)
datatypes?
<--- Score

17. Identify an operational issue in your
organization. For example, could a particular task
be done more quickly or more efficiently?
<--- Score

18. Are there any computations using variables

having the same datatype but different lengths?
<--- Score

19. If a data structure is referenced in multiple


procedures or subroutines, is the structure defined
identically in each procedure?
<--- Score

20. How does an application behave as the data it


processes increases in size?
<--- Score

21. Was a cause-and-effect diagram used to explore


the different types of causes (or sources of variation)?
<--- Score

22. How often will data be collected for measures?


<--- Score

23. Is the supplier's process defined and
controlled?
<--- Score

24. What should the next improvement project be


that is related to the process?
<--- Score

25. Think about some of the processes you
undertake within your organization. Which do you
own?
<--- Score

26. Does job training on the documented
procedures need to be part of the process team's
education and training?
<--- Score

27. Can we add value to the current Camel
Development with Red Hat JBoss Fuse
decision-making process (largely qualitative)
by incorporating uncertainty modeling (more
quantitative)?
<--- Score

28. How do you measure the Operational


performance of your key work systems and
processes, including productivity, cycle time,
and other appropriate measures of process
effectiveness, efficiency, and innovation?
<--- Score

29. What were the financial benefits resulting from


any ‘ground fruit or low-hanging fruit’ (quick fixes)?
<--- Score

30. How does the organization define, manage, and


improve its Camel Development with Red Hat JBoss
Fuse processes?
<--- Score

31. What is the most recent process yield (or sigma


calculation)?
<--- Score
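Question 31's "process yield (or sigma calculation)" is typically grounded in two figures: first-pass yield and defects per million opportunities (DPMO), the latter being the usual input to a sigma-level lookup table. The sketch below, with hypothetical function names and figures, shows both calculations under those common Six Sigma conventions.

```python
def first_pass_yield(units_without_defects, units_processed):
    # Fraction of units that pass the process with no rework
    return units_without_defects / units_processed

def dpmo(defects, units, opportunities_per_unit):
    # Defects per million opportunities, the input to a sigma-level lookup
    return defects * 1_000_000 / (units * opportunities_per_unit)

print(first_pass_yield(950, 1000))  # 0.95
print(dpmo(25, 1000, 5))            # 5000.0
```

Counting opportunities per unit consistently matters here: inflating the opportunity count makes DPMO look better without any real process change.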

32. Do our leaders quickly bounce back from


setbacks?
<--- Score

33. What controls do we have in place to protect data?


<--- Score

34. What are the different process maturity levels?

<--- Score

35. Is the gap/opportunity displayed and


communicated in financial terms?
<--- Score

36. What process should we select for improvement?


<--- Score

37. What are the different risks associated with a


software process?
<--- Score

38. How do you use Camel Development with Red


Hat JBoss Fuse data and information to support
organizational decision making and innovation?
<--- Score

39. Does the organization have a distinct quality
program that supports continuous process
improvement?
<--- Score

40. When a memory area has alias names with


differing attributes, does the data value in this
area have the correct attributes when referenced
via one of these names?
<--- Score

41. What does the data say about the performance of


the business process?
<--- Score

42. What are our Camel Development with Red Hat


JBoss Fuse Processes?
<--- Score

43. What were the crucial ‘moments of truth’ on the
process map?
<--- Score

44. Is the process severely broken such that a re-


design is necessary?
<--- Score

45. What conclusions were drawn from the team’s


data collection and analysis? How did the team reach
these conclusions?
<--- Score

46. What are the best software metrics for


discerning Agile (vs. non-Agile) process effects on
teams’ artifacts?
<--- Score

47. What did the team gain from developing a sub-


process map?
<--- Score

48. What successful thing are we doing today that


may be blinding us to new growth opportunities?
<--- Score

49. How should sensitive data be handled?


<--- Score

50. What is the cost of poor quality as supported by


the team’s analysis?
<--- Score

51. Were there any improvement opportunities


identified from the process analysis?

<--- Score

52. What tools were used to generate the list of


possible causes?
<--- Score

53. Does the process performance meet the
customer's requirements?
<--- Score

54. How Do We Develop a High-Yield (HY) Process?


<--- Score

55. What are the critical software process issues?


<--- Score

56. Is the performance gap determined?


<--- Score

57. Do you, as a leader, bounce back quickly from


setbacks?
<--- Score

58. Have any additional benefits been identified that


will result from closing all or most of the gaps?
<--- Score

59. Are data and process analysis, root cause analysis,
and quantification of the gap/opportunity in place?
<--- Score

60. What are the different levels of software


process models?
<--- Score

61. What does the testing process look like?

<--- Score

62. Record-keeping requirements flow from the
records needed as inputs, outputs, controls, and
for transformation of a Camel Development
with Red Hat JBoss Fuse process. Ask yourself:
are the records needed as inputs to the Camel
Development with Red Hat JBoss Fuse process
available?
<--- Score

63. Where is the data coming from to measure


compliance?
<--- Score

64. What are the revised rough estimates of the


financial savings/opportunity for Camel Development
with Red Hat JBoss Fuse improvements?
<--- Score

65. Think about the functions involved in your


Camel Development with Red Hat JBoss Fuse
project. what processes flow from these functions?
<--- Score

66. Is the Camel Development with Red Hat JBoss


Fuse process severely broken such that a re-design is
necessary?
<--- Score

67. How do mission and objectives affect the


Camel Development with Red Hat JBoss Fuse
processes of our organization?
<--- Score

68. Why is defect prevention crucial to the

software process?
<--- Score

69. Do your employees have the opportunity to do


what they do best everyday?
<--- Score

70. Were Pareto charts (or similar) used to portray the


‘heavy hitters’ (or key sources of variation)?
<--- Score

71. An organizationally feasible system request
is one that considers the mission, goals and
objectives of the organization. Key questions are:
is the solution request practical, and will it solve a
problem or take advantage of an opportunity to
achieve company goals?
<--- Score

72. What other organizational variables, such as


reward systems or communication systems, affect
the performance of this Camel Development with
Red Hat JBoss Fuse process?
<--- Score

73. Are there any textual or grammatical errors in
the output information?
<--- Score

74. Are the outputs of the program meaningful,


nonabusive, and devoid of computer gibberish?
<--- Score

75. What are the revised rough order estimates


of the financial savings/opportunity for the
improvement project?

<--- Score

76. What quality tools were used to get through the


analyze phase?
<--- Score

77. Is the datatype of the target variable of an


assignment smaller than the datatype or result of
the right-hand expression?
<--- Score
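Question 77's narrowing-assignment check (a target variable smaller than the right-hand expression) can be illustrated with a short sketch. Python itself widens integers automatically, so the hypothetical helper below emulates an 8-bit signed target such as a C `char` to show the silent truncation the inspection question is hunting for.

```python
def to_int8(value):
    # Emulate storing `value` in an 8-bit signed variable:
    # keep the low byte, then reinterpret it as signed.
    truncated = value & 0xFF
    return truncated - 256 if truncated >= 128 else truncated

print(to_int8(300))  # 300 does not fit in 8 bits; the stored value is 44
```

The defect pattern is that the assignment compiles and runs; only the value is wrong, which is why a checklist question is needed to catch it.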

78. Did any value-added analysis or ‘lean thinking’


take place to identify some of the gaps shown on the
‘as is’ process map?
<--- Score

79. Are we following a good process?


<--- Score

80. A compounding model resolution with


available relevant data can often provide insight
towards a solution methodology; which Camel
Development with Red Hat JBoss Fuse models,
tools and techniques are necessary?
<--- Score

81. What key inputs and outputs are being


measured on an ongoing basis?
<--- Score

82. What other jobs or tasks affect the


performance of the steps in the Camel
Development with Red Hat JBoss Fuse process?
<--- Score

83. What kind of crime could a potential new hire

have committed that would not only not disqualify
him/her from being hired by our organization, but
would actually indicate that he/she might be a
particularly good fit?
<--- Score

84. How do we promote understanding that


opportunity for improvement is not criticism of
the status quo, or the people who created the
status quo?
<--- Score

85. Is each variable assigned the correct length


and datatype?
<--- Score

86. Does the installer create folders, icons, shortcuts,
files, database entries, and registry entries?
<--- Score

87. Were any designed experiments used to generate


additional insight into the data analysis?
<--- Score

88. Are gaps between current performance and the


goal performance identified?
<--- Score

89. What are your current levels and trends in key


measures or indicators of Camel Development
with Red Hat JBoss Fuse product and process
performance that are important to and
directly serve your customers? how do these
results compare with the performance of your
competitors and other organizations with similar
offerings?

<--- Score

90. How will input, process, and output variables


be checked to detect for sub-optimal conditions?
<--- Score

91. What were the financial benefits resulting from


any ground fruit or low-hanging fruit (quick fixes)?
<--- Score

92. What are your current levels and trends in


key Camel Development with Red Hat JBoss Fuse
measures or indicators of product and process
performance that are important to and directly
serve your customers?
<--- Score

93. Who is the process owner?


<--- Score

Add up total points for this section:


_____ = Total points for this section

Divided by: ______ (number of


statements answered) = ______
Average score for this section

Transfer your score to the Camel
Development with Red Hat JBoss Fuse
Index at the beginning of the Self-Assessment.

SELF-ASSESSMENT SECTION
START
CRITERION #5: IMPROVE:

INTENT: Develop a practical solution.


Innovate, establish and test the
solution and measure the results.

In my belief, the answer to this


question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. What tools do you use once you have decided


on a Camel Development with Red Hat JBoss
Fuse strategy and more importantly how do you
choose?
<--- Score

2. Is product being developed at a rate to be


completed within budget?
<--- Score

3. How do Web Operators communicate with
Developers?
<--- Score

4. Risk events: what are the things that could go


wrong?
<--- Score

5. What contract applies, what are its terms, and


what decisions have been made?
<--- Score

6. What's the difference between Agile
Development and Lean UX?
<--- Score

7. How and for what purpose do you use the


results?
<--- Score

8. Are the intellectual rights, formats/interfaces,


and OTS components for the resulting works
clearly described?
<--- Score

9. How can skill-level changes improve Camel


Development with Red Hat JBoss Fuse?
<--- Score

10. Who will be using the results of the measurement


activities?
<--- Score

11. Imagine a scenario where you engage a


software group to build a critical software system.
Do you think you could provide every last detail

the developers need to know right off the bat?
<--- Score

12. Is the developer efficient enough to meet


current commitments?
<--- Score

13. What resources are required for the improvement


effort?
<--- Score

14. What current systems have to be understood


and/or changed?
<--- Score

15. How will you know when it's improved?


<--- Score

16. Why improve in the first place?


<--- Score

17. Are there any nonexhaustive decisions?


<--- Score

18. Is the solution technically practical?


<--- Score

19. How to Improve?


<--- Score

20. Have any additional benefits been identified


that will result from closing all or most of the
gaps?
<--- Score

21. How is the development team organized?

<--- Score

22. How can agile software development be


utilised when the development is done in several
different locations instead of one site?
<--- Score

23. Complexity: an appropriate framework for


development?
<--- Score

24. How Agile are Industrial Software


Development Practices?
<--- Score

25. For expressions containing more than one


Boolean operator, are the assumptions about
the order of evaluation and the precedence of
operators correct?
<--- Score
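Question 25's concern — whether assumptions about evaluation order and operator precedence hold in expressions with more than one Boolean operator — can be made concrete with a short sketch. The expression and values are hypothetical, and Python's rule (`not` binds tighter than `and`, which binds tighter than `or`) stands in for whatever language is under review.

```python
a, b, c = True, False, True

# Without parentheses, Python parses this as ((not a) and b) or c,
# NOT as not (a and (b or c)) -- a common source of wrong assumptions.
implicit = not a and b or c
explicit = ((not a) and b) or c
print(implicit, explicit)  # True True
```

During inspection, making the intended grouping explicit with parentheses removes the need to remember the precedence table at all.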

26. How large is the system that is being


developed?
<--- Score

27. How Do We Link Measurement and Risk?


<--- Score

28. Is our organization clear about the relationship


between agile software development and
DevOps?
<--- Score

29. What can we do to improve?


<--- Score

30. How can a conceptual agile framework be
developed?
<--- Score

31. What are the characteristics of software risks?


<--- Score

32. Is Internet-speed software development


different?
<--- Score

33. How can we improve performance?


<--- Score

34. How do you develop requirements for agile


software development?
<--- Score

35. How do you use other indicators, such as


workforce retention, absenteeism, grievances,
safety, and productivity, to assess and improve
workforce engagement?
<--- Score

36. How do we keep improving Camel Development


with Red Hat JBoss Fuse?
<--- Score

37. For decision problems, how do you develop a


decision statement?
<--- Score

38. How do we go about Comparing Camel


Development with Red Hat JBoss Fuse approaches/
solutions?
<--- Score

39. Is Supporting Camel Development with Red
Hat JBoss Fuse documentation required?
<--- Score

40. What are the implications of this decision 10


minutes, 10 months, and 10 years from now?
<--- Score

41. How can the balance between tacit and explicit


knowledge and their diffusion be found in agile
software development when there are several
parties involved?
<--- Score

42. How important is the completion of a


recognized college or graduate-level degree
program in the hiring decision?
<--- Score

43. Are default attributes understood?


<--- Score

44. What software development activity required


the most rework?
<--- Score

45. If the project is using Agile Development or


Agile Acquisition, how many staff on the Scrum
Team are identified as testers?
<--- Score

46. How do the Camel Development with Red Hat


JBoss Fuse results compare with the performance
of your competitors and other organizations with
similar offerings?

<--- Score

47. Do we get business results?


<--- Score

48. If all attributes of a variable are not explicitly


stated in the declaration, are the defaults well
understood?
<--- Score

49. What is the team’s contingency plan for potential


problems occurring in implementation?
<--- Score

50. How do you manage and improve your Camel


Development with Red Hat JBoss Fuse work
systems to deliver customer value and achieve
organizational success and sustainability?
<--- Score

51. Is operator precedence understood?


<--- Score

52. What is the implementation plan?


<--- Score

53. Automated or manual tests: will automated
tests be developed?
<--- Score

54. What tools were used to evaluate the potential


solutions?
<--- Score

55. How do we Improve Camel Development with Red


Hat JBoss Fuse service perception, and satisfaction?

<--- Score

56. What is the magnitude of the improvements?


<--- Score

57. What should a proof of concept or pilot


accomplish?
<--- Score

58. What went well, what should change, what can


improve?
<--- Score

59. Management buy-in is a concern. Many


program managers are worried that upper-level
management would ask for progress reports and
productivity metrics that would be hard to gather
in an Agile work environment. Management
ignorance of Agile methodologies is also a worry.
Will Agile advantages be able to overcome
the well-known existing problems in software
development?
<--- Score

60. How does one decide if what you built is high


quality without testing it?
<--- Score

61. What was the quality of the initial


development effort?
<--- Score

62. What improvements have been achieved?


<--- Score

63. How will we know that a change is improvement?

<--- Score

64. And the results?


<--- Score

65. Was the program easy to understand?


<--- Score

66. How can a given credibility measure be


optimized given constrained resources?
<--- Score

67. What kind of results?


<--- Score

68. What, if any, is the difference between Lean and
Agile Software Development?
<--- Score

69. How good are the designers and programmers


in the development team?
<--- Score

70. Are we Assessing Camel Development with Red


Hat JBoss Fuse and Risk?
<--- Score

71. What actually has to improve and by how


much?
<--- Score

72. How significant is the improvement in the eyes of


the end user?
<--- Score

73. How do you measure progress and evaluate

training effectiveness?
<--- Score

74. In the past few months, what is the smallest


change we have made that has had the biggest
positive result? What was it about that small change
that produced the large return?
<--- Score

75. Could Agile Manifesto and agile methods be a


good starting point for the corporate venture to
start their development effort towards their own,
efficient agile in-house software development
method?
<--- Score

76. What error proofing will be done to address some


of the discrepancies observed in the ‘as is’ process?
<--- Score

77. What to do with the results or outcomes of


measurements?
<--- Score

78. What lessons, if any, from a pilot were


incorporated into the design of the full-scale solution?
<--- Score

79. What is the risk?


<--- Score

80. Have we developed requirements for agile


software development?
<--- Score

81. What changes need to be made to agile

development today?
<--- Score

82. What technologies are available to support


system development?
<--- Score

83. At what point will vulnerability assessments be


performed once Camel Development with Red Hat
JBoss Fuse is put into production (e.g., ongoing
Risk Management after implementation)?
<--- Score

84. Are there any non-exhaustive decisions?


<--- Score

85. How do we measure risk?


<--- Score

86. Are there cultural or organizational issues that


may affect the system development?
<--- Score

87. How can we improve Camel Development with


Red Hat JBoss Fuse?
<--- Score

88. Is there documentation that will support the


successful operation of the improvement?
<--- Score

89. How do you improve workforce health,


safety, and security? What are your performance
measures and improvement goals for each
of these workforce needs and what are any
significant differences in these factors and

performance measures or targets for different
workplace environments?
<--- Score

90. What type of test material would best serve my


development, integration, or testing needs?
<--- Score

91. As corporate ventures usually go to new


business areas and work with new technologies,
they are most likely unable to utilise existing
commercial or parent corporation’s in-house
development methods. Could Agile Manifesto
and agile methods be a good starting point for
the corporate venture to start their development
effort towards their own, efficient agile in-house
software development method?
<--- Score

92. Does the goal represent a desired result that can


be measured?
<--- Score

93. To what extent does management recognize


Camel Development with Red Hat JBoss Fuse as a tool
to increase the results?
<--- Score

94. What evaluation strategy is needed and what


needs to be done to assure its implementation and
use?
<--- Score

95. Does the way in which the compiler evaluates


Boolean expressions affect the program?
<--- Score
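Question 95 refers to short-circuit evaluation: in most languages the right-hand operand of a logical AND/OR is skipped when the left-hand operand already decides the result, and programs sometimes depend on that behavior. A minimal illustrative sketch in Python (whose `and` operator also short-circuits); the function names are hypothetical, not from this guide:

```python
def risky(x):
    # Dividing by x raises ZeroDivisionError when x == 0,
    # so this must only run behind a guard.
    return 10 / x > 1

def safe_check(x):
    # Short-circuit AND: when x == 0 the left operand is False,
    # so risky(x) is never evaluated and no error is raised.
    return x != 0 and risky(x)

print(safe_check(0))  # False, and no exception
print(safe_check(5))  # True
```

If an evaluator did not short-circuit, safe_check(0) would crash instead of returning False, which is exactly the kind of behavioral dependence the question asks about.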

96. Risk factors: what are the characteristics of
Camel Development with Red Hat JBoss Fuse that
make it risky?
<--- Score

97. Do we know the difference between lean and


agile software development?
<--- Score

98. Can the solution be designed and


implemented within an acceptable time period?
<--- Score

99. How do we measure improved Camel


Development with Red Hat JBoss Fuse service
perception, and satisfaction?
<--- Score

100. How will you know that you have improved?


<--- Score

101. How are credibility measures best applied for


decisions with uncertainty?
<--- Score

102. How do we decide how much to remunerate


an employee?
<--- Score

103. How could a more enhanced framework be


developed?
<--- Score

104. How does the team improve its work?


<--- Score

105. How do you improve your likelihood of success?
<--- Score

106. What are reactive risk strategies?


<--- Score

107. Compiler evaluation of Boolean expressions


understood?
<--- Score

108. How do we improve productivity?


<--- Score

109. Who will be responsible for documenting


the Camel Development with Red Hat JBoss Fuse
requirements in detail?
<--- Score

110. Who are the people involved in developing


and implementing Camel Development with Red
Hat JBoss Fuse?
<--- Score

111. Is the measure understandable to a variety of


people?
<--- Score

112. Who will be responsible for making the decisions


to include or exclude requested changes once Camel
Development with Red Hat JBoss Fuse is underway?
<--- Score

113. Goal statement/identification – What is the
goal or target for the improvement team’s project?
<--- Score

114. What is the best online tool for Agile
development using Kanban?
<--- Score

115. What does the ‘should be’ process map/design


look like?
<--- Score

116. How will you measure the results?


<--- Score

117. What is metrics evaluation?


<--- Score

118. Can working in an agile mode assist a


corporate venture in achieving good results early,
in starting business, and in bringing income for
the parent company?
<--- Score

119. How can we fix actual and perceived problems


uncovered in ethnographic investigations of Agile
software development teams?
<--- Score

120. If you could go back in time five years, what


decision would you make differently? What is your
best guess as to what decision you’re making today
that you might regret five years from now?
<--- Score

121. What are our metrics to use to measure


the performance of a team using agile software
development methodology?
<--- Score

122. What tools were most useful during the improve
phase?
<--- Score

123. What tools were used to tap into the creativity


and encourage ‘outside the box’ thinking?
<--- Score

124. Is there a high likelihood that any


recommendations will achieve their intended
results?
<--- Score

125. What were the underlying assumptions on the


cost-benefit analysis?
<--- Score

126. What are the sources of risk?


<--- Score

127. What is Camel Development with Red Hat JBoss


Fuse’s impact on utilizing the best solution(s)?
<--- Score

128. Who controls the risk?


<--- Score

129. For estimation problems, how do you develop


an estimation statement?
<--- Score

130. What communications are necessary to support


the implementation of the solution?
<--- Score

131. What is quality improvement?
<--- Score

132. Do we cover the five essential competencies
(Communication, Collaboration, Innovation,
Adaptability, and Leadership) that improve an
organization’s ability to leverage the new Camel
Development with Red Hat JBoss Fuse in a volatile
global economy?
<--- Score

133. What do we want to improve?


<--- Score

134. Who controls key decisions that will be made?


<--- Score

135. What needs improvement?


<--- Score

136. So what do your developers do differently in


agile?
<--- Score

137. Can research really be relegated to a series of


steps that when performed in sequence result in a
new product?
<--- Score

138. What type of system is being developed?


<--- Score

Add up total points for this section:

_____ = Total points for this section

Divided by: ______ (number of
statements answered) = ______
Average score for this section

Transfer your score to the Camel
Development with Red Hat JBoss Fuse
Index at the beginning of the Self-
Assessment.
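The scoring arithmetic above (total points divided by the number of statements answered) can be sketched as follows; the function name is a hypothetical illustration:

```python
def section_average(scores):
    # Unanswered statements (None) are excluded before dividing,
    # matching "number of statements answered".
    answered = [s for s in scores if s is not None]
    return sum(answered) / len(answered)

print(section_average([5, 4, 3, None, 5]))  # 4.25
```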

SELF-ASSESSMENT SECTION
START
CRITERION #6: CONTROL:

INTENT: Implement the prac tical


solution. Maintain the performance and
correct possible complications.

In my belief, the answer to this


question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. How will input, process, and output variables be
checked to detect sub-optimal conditions?
<--- Score

2. What is the recommended frequency of auditing?


<--- Score

3. What quality tools were useful in the control


phase?
<--- Score

4. What other data formats, standards & interfaces
are proposed for use?
<--- Score

5. Do the decisions we make today help people and


the planet tomorrow?
<--- Score

6. Is reporting being used or needed?


<--- Score

7. What are the major test plan elements?


<--- Score

8. How will the process owner verify improvement in


present and future sigma levels, process capabilities?
<--- Score

9. Have new or revised work instructions resulted?


<--- Score

10. How can we best use all of our knowledge


repositories to enhance learning and sharing?
<--- Score

11. Does job training on the documented procedures


need to be part of the process team’s education and
training?
<--- Score

12. What are your results for key measures


or indicators of the accomplishment of your
Camel Development with Red Hat JBoss Fuse
strategy and action plans, including building and
strengthening core competencies?

<--- Score

13. How do you select, collect, align, and integrate


Camel Development with Red Hat JBoss Fuse data
and information for tracking daily operations and
overall organizational performance, including
progress relative to strategic objectives and action
plans?
<--- Score

14. Is there documentation that will support the


successful operation of the improvement?
<--- Score

15. What should we measure to verify efficiency


gains?
<--- Score

16. Is there a standardized process?


<--- Score

17. Were the planned controls working?


<--- Score

18. Are qualified staffs assigned according to plan?


<--- Score

19. What is your quality control system?


<--- Score

20. What is quality planning?


<--- Score

21. Can application/algorithms scale to handle


increased data requirements?
<--- Score

22. How might the organization capture best practices
and lessons learned so as to leverage improvements
across the business?
<--- Score

23. Will existing staff require re-training, for example,


to learn new business processes?
<--- Score

24. Is there a test plan?


<--- Score

25. Are operating procedures consistent?


<--- Score

26. How do our controls stack up?


<--- Score

27. Is the planned software productivity rate


realistic?
<--- Score

28. What is scale and why manage it?


<--- Score

29. Validation: comparing program outcomes
against user expectations. Are we building the
right product?
<--- Score

30. How do you scale Agile to large (500-5000


person) teams?
<--- Score

31. Are there documented procedures?

<--- Score

32. Should assertions be part of the standard?


<--- Score

33. Where do ideas that reach policy makers and


planners as proposals for Camel Development
with Red Hat JBoss Fuse strengthening and reform
actually originate?
<--- Score

34. Against what alternative is success being


measured?
<--- Score

35. Does a troubleshooting guide exist or is it needed?


<--- Score

36. What are the known security controls?


<--- Score

37. How do controls support value?


<--- Score

38. How do you encourage people to take control


and responsibility?
<--- Score

39. If there currently is no plan, will a plan be


developed?
<--- Score

40. How do you know when the software will be
finished if there’s no up-front plan?
<--- Score

41. How will control chart readings and control
chart limits be checked to effectively monitor
performance?
<--- Score
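Question 41 concerns control chart limits. One common convention, a Shewhart-style 3-sigma chart (used here only as an illustrative assumption, not prescribed by this guide), derives the limits from the mean and standard deviation of in-control readings:

```python
import statistics

def control_limits(samples):
    # Lower and upper control limits at mean +/- 3 sigma,
    # computed from a run of in-control readings.
    mean = statistics.mean(samples)
    sigma = statistics.pstdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

lcl, ucl = control_limits([10, 12, 11, 9, 10, 11, 10, 11])
print(lcl < 10 < ucl)  # True: readings near the mean are in control
```

A reading outside [lcl, ucl] is the "out-of-control" condition that a response plan (question 71 of the previous section) would be triggered by.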

42. How does your workforce performance


management system support high-performance
work and workforce engagement; consider
workforce compensation, reward, recognition, and
incentive practices; and reinforce a customer and
business focus and achievement of your action
plans?
<--- Score

43. How likely is the current Camel Development


with Red Hat JBoss Fuse plan to come in on
schedule or on budget?
<--- Score

44. What are you planning to complete today?


<--- Score

45. What’s the best design framework for a Camel
Development with Red Hat JBoss Fuse organization
now that, in a post-industrial age, the top-down
command-and-control model is no longer relevant?
<--- Score

46. Who has control over resources?


<--- Score

47. What Data Formats, Standards & Interfaces are


proposed for use?
<--- Score

48. How will report readings be checked to effectively
monitor performance?
<--- Score

49. Does the response plan contain a definite closed


loop continual improvement scheme (e.g., plan-do-
check-act)?
<--- Score

50. Is there a recommended audit plan for routine


surveillance inspections of the DMAIC projects
gains?
<--- Score

51. Are new process steps, standards, and


documentation ingrained into normal operations?
<--- Score

52. What can you control?


<--- Score

53. Do we monitor the Camel Development with


Red Hat JBoss Fuse decisions made and fine tune
them as they evolve?
<--- Score

54. Do you monitor the effectiveness of your


Camel Development with Red Hat JBoss Fuse
activities?
<--- Score

55. What is our theory of human motivation, and


how does our compensation plan fit with that
view?
<--- Score

56. How will the day-to-day responsibilities for
monitoring and continual improvement be
transferred from the improvement team to the
process owner?
<--- Score

57. Are pertinent alerts monitored, analyzed and


distributed to appropriate personnel?
<--- Score

58. How do we enable market innovation while


controlling security and privacy?
<--- Score

59. Did the planning process consider


decentralized interoperable data storage/access
vs. a centralized fused data warehouse approach?
<--- Score

60. Implementation Planning- is a pilot needed to


test the changes before a full roll out occurs?
<--- Score

61. In the case of a Camel Development with
Red Hat JBoss Fuse project, the criteria for the
audit derive from implementation objectives.
An audit of a Camel Development with Red Hat
JBoss Fuse project involves assessing whether the
recommendations outlined for implementation
have been met. In other words, can we track that
any Camel Development with Red Hat JBoss
Fuse project is implemented as planned, and is it
working?
<--- Score

62. Are controls in place and consistently applied?

<--- Score

63. Is there a recommended audit plan for routine


surveillance inspections of Camel Development with
Red Hat JBoss Fuse’s gains?
<--- Score

64. Does Camel Development with Red Hat JBoss Fuse


appropriately measure and monitor risk?
<--- Score

65. Who controls critical resources?


<--- Score

66. Will any special training be provided for results


interpretation?
<--- Score

67. What other areas of the organization might benefit


from the Camel Development with Red Hat JBoss Fuse
team’s improvements, knowledge, and learning?
<--- Score

68. What standards should/could be used?


<--- Score

69. What is the control/monitoring plan?


<--- Score

70. Has the improved process and its steps been


standardized?
<--- Score

71. Is a response plan in place for when the input,


process, or output measures indicate an ‘out-of-
control’ condition?

<--- Score

72. Were the planned controls in place?


<--- Score

73. Verification: comparing program outcomes
against a specification. Are we building the
product right?
<--- Score

74. What Can We Learn From a Theory of


Complexity?
<--- Score

75. What should the next improvement project be


that is related to Camel Development with Red Hat
JBoss Fuse?
<--- Score

76. What do we stand for--and what are we


against?
<--- Score

77. What are we attempting to measure/monitor?


<--- Score

78. Is new knowledge gained imbedded in the


response plan?
<--- Score

79. What other systems, operations, processes, and


infrastructures (hiring practices, staffing, training,
incentives/rewards, metrics/dashboards/scorecards,
etc.) need updates, additions, changes, or deletions
in order to facilitate knowledge transfer and
improvements?

<--- Score

80. Is there a Camel Development with Red Hat


JBoss Fuse Communication plan covering who
needs to get what information when?
<--- Score

81. What key inputs and outputs are being measured


on an ongoing basis?
<--- Score

82. Are documented procedures clear and easy to


follow for the operators?
<--- Score

83. Is knowledge gained on process shared and


institutionalized?
<--- Score

84. Is effort being expended according to plan?


<--- Score

85. How will new or emerging customer needs/


requirements be checked/communicated to orient
the process toward meeting the new specifications
and continually reducing variation?
<--- Score

86. Is there a documented and implemented


monitoring plan?
<--- Score

87. The test for a planning assumption is: will the


plan fail if the assumption is not true?
<--- Score

88. How will the process owner and team be able to
hold the gains?
<--- Score

89. What does it mean to scale agile solution


delivery?
<--- Score

90. Does the Camel Development with Red Hat JBoss


Fuse performance meet the customer’s requirements?
<--- Score

91. What are the key elements of your Camel


Development with Red Hat JBoss Fuse
performance improvement system, including
your evaluation, organizational learning, and
innovation processes?
<--- Score

92. Can we learn from other industries?


<--- Score

93. Who will be in control?


<--- Score

94. Is there a transfer of ownership and knowledge
to the process owner and process team tasked with
the responsibilities?
<--- Score

95. What are the critical parameters to watch?


<--- Score

96. Is a response plan established and deployed?


<--- Score

97. Why is change control necessary?
<--- Score

98. Available as planned?


<--- Score

99. What is the control/monitoring plan?


<--- Score

100. Is there a control plan in place for sustaining


improvements (short and long-term)?
<--- Score

101. Do you have a test case that represents a valid


scalene triangle?
<--- Score
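Question 101 alludes to the classic triangle-testing exercise (Myers): a valid scalene test case needs all three sides different and the triangle inequality satisfied. A sketch of the classifier and the two easy-to-miss cases (names are illustrative, not from this guide):

```python
def classify(a, b, c):
    # Reject non-triangles first: non-positive sides or a
    # violated triangle inequality.
    if min(a, b, c) <= 0 or a + b <= c or b + c <= a or a + c <= b:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

print(classify(3, 4, 5))  # scalene: all sides differ, inequality holds
print(classify(1, 2, 3))  # not a triangle: degenerate despite distinct sides
```

A test suite that only tries (1, 2, 3) never exercises a *valid* scalene triangle, which is the trap the question points at.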

102. Do the Camel Development with Red Hat


JBoss Fuse decisions we make today help people
and the planet tomorrow?
<--- Score

103. How do disciplined agile teams work at scale?


<--- Score

104. What does the Test Plan show?


<--- Score

105. Are suggested corrective/restorative actions


indicated on the response plan for known causes to
problems that might surface?
<--- Score

106. Does a separate CM plan exist?


<--- Score

107. How do you take a methodology, like agile
development, that basically evolved in small
groups and then scale it up so that it works
on projects with hundreds of developers and
thousands of users?
<--- Score

108. Does the model reflect the real world


problem?
<--- Score

109. Who sets the Camel Development with Red


Hat JBoss Fuse standards?
<--- Score

110. What is your theory of human motivation, and


how does your compensation plan fit with that view?
<--- Score

111. Who is the Camel Development with Red Hat


JBoss Fuse process owner?
<--- Score

112. Is the planned impact of the leveraged


technology being realized?
<--- Score

113. What should we measure to verify effectiveness


gains?
<--- Score

114. How will the process owner and team be able


to hold the gains?
<--- Score

Add up total points for this section:

_____ = Total points for this section

Divided by: ______ (number of
statements answered) = ______
Average score for this section

Transfer your score to the Camel
Development with Red Hat JBoss Fuse
Index at the beginning of the Self-
Assessment.

SELF-ASSESSMENT SECTION
START
CRITERION #7: SUSTAIN:

INTENT: Retain the benefits.

In my belief, the answer to this


question is clearly defined:

5 Strongly Agree

4 Agree

3 Neutral

2 Disagree

1 Strongly Disagree

1. Do/end statements match?


<--- Score

2. Parameter and argument attributes match?


<--- Score

3. What next bug might be introduced by the fix I’m
about to make?
<--- Score

4. How formal should testing be?


<--- Score

5. Constants passed as arguments?
<--- Score

6. Initialization consistent with storage class?


<--- Score

7. How to generate meaningful scenarios in


practice?
<--- Score

8. How does the effort to install/deploy an


application increase as the installation base
grows?
<--- Score

9. Verification: Are we building the product right?


<--- Score

10. How will the test suite be delivered/used (e.g.,


web based, downloadable)?
<--- Score

11. Test passed?


<--- Score

12. What are the different software testing tactics?


<--- Score

13. What are the basic objectives of inspections?


<--- Score

14. Are functionality and quality attributes


orthogonal?
<--- Score

15. Correct lengths, types, and storage classes
assigned?
<--- Score

16. Based storage attributes correct?


<--- Score

17. Will each loop terminate?


<--- Score
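Question 17 ("Will each loop terminate?") is usually checked by finding a value that strictly decreases toward the exit condition on every iteration. A small illustrative sketch (the function name is hypothetical):

```python
def countdown(n):
    # n strictly decreases on every pass, so the guard n > 0
    # must eventually fail and the loop terminates.
    steps = 0
    while n > 0:
        n -= 1
        steps += 1
    return steps

print(countdown(5))  # 5
print(countdown(0))  # 0 -- the loop body never runs
```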

18. Any variables with similar names?


<--- Score

19. Who has authority?


<--- Score

20. Is UML used as intended by its designers?


<--- Score

21. Cluster (Integration) Testing: Why is it


Different?
<--- Score

22. End-of-file conditions handled?


<--- Score

23. Does the system return some type of


immediate acknowledgment to all inputs?
<--- Score

24. What is quality management?


<--- Score

25. What should be tested?


<--- Score

26. Input checked for validity?

<--- Score

27. Would it be easy for you to modify this


program?
<--- Score

28. File attributes correct?


<--- Score

29. Validation: Are we building the right product?


<--- Score

30. What is the Return on Investment?


<--- Score

31. Is a system failure an aspect of availability, an


aspect of security, or an aspect of usability?
<--- Score

32. Files closed after use?


<--- Score

33. What are the benefits of ISO 9000 verification?


<--- Score

34. Are possible loop fall-throughs correct?


<--- Score

35. How will you validate your term product?


<--- Score

36. What types of faults are we looking for?


<--- Score

37. What is to be achieved by the quality practices?


<--- Score

38. Why Test?
<--- Score

39. What is external failure?


<--- Score

40. Is there a trust indenture?


<--- Score

41. What are the different types of software tests?


<--- Score

42. Code that is assigned to one person?


<--- Score

43. What stage of tests (unit test, integration test,


system test) are you are performing?
<--- Score

44. The amount of code that can be written in 4 to


40 hours?
<--- Score

45. How is class testing different from


conventional testing?
<--- Score

46. Why the urgency?


<--- Score

47. Why test software for correctness?


<--- Score

48. I/O errors handled?


<--- Score

49. All variables declared?
<--- Score

50. What are function-oriented metrics?


<--- Score

51. What are the reasons for the necessity of object
orientation?
<--- Score

52. If the assessors discover an actual intruder or
an intruder’s footprints within the network, should
testing stop?
<--- Score

53. Where was the error made?


<--- Score

54. Who Tests the Software?


<--- Score

55. Who are the different inspection participants?


<--- Score

56. Arrays and strings initialized properly?


<--- Score

57. Does the system contain an excessive number


of options, or options that are unlikely to be used?
<--- Score

58. Does the test path tour the simple path


directly?
<--- Score

59. Test scalability by installing the system on 10K
desktops?
<--- Score

60. What are installation tests?


<--- Score

61. Attributes of arguments transmitted to called


modules equal to attributes of parameters?
<--- Score

62. Can you do better?


<--- Score

63. What licenses, permits or other authority does


the Company already have?
<--- Score

64. Might any particular values, or value


combinations behave differently?
<--- Score

65. Is a coherent and workable technical


community strategy laid out?
<--- Score

66. Any references to parameters not associated


with current point of entry?
<--- Score

67. Adopt: What technologies are available for use


unmodified?
<--- Score

68. Does it meet the overall objectives?


<--- Score

69. What would you like to achieve during this
project?
<--- Score

70. What if I use different combinations of the


types float and int?
<--- Score

71. Off-by-one iteration errors?


<--- Score
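Question 71's off-by-one iteration errors typically show up at loop bounds. A minimal sketch contrasting a correct bound with the buggy variant (function names are illustrative):

```python
def sum_first_n(values, n):
    # Correct: range(n) visits indices 0..n-1, exactly n items.
    total = 0
    for i in range(n):
        total += values[i]
    return total

def sum_first_n_buggy(values, n):
    # Off-by-one: range(n - 1) stops one item early.
    total = 0
    for i in range(n - 1):
        total += values[i]
    return total

data = [10, 20, 30]
print(sum_first_n(data, 3))        # 60
print(sum_first_n_buggy(data, 3))  # 30 -- the last element was skipped
```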

72. How do we test software in real-time systems?


<--- Score

73. If you just found the 73rd defect in your 50,000


LOC program, do you feel good about it?
<--- Score

74. Open statements correct?


<--- Score

75. Does the program accept blank as an answer?


<--- Score

76. What are regression tests?


<--- Score

77. Buffer size matches record size?


<--- Score

78. What are the qualities team leaders should
possess?
<--- Score

79. Any unreferenced variables in the cross-reference
listing?
<--- Score

80. How do you find things?


<--- Score

81. How will we test more in less time?


<--- Score

82. Why are we testing?


<--- Score

83. Who better to do all that testing than the


people doing the actual coding?
<--- Score

84. What are the steps implied by statistical quality


assurance?
<--- Score

85. Number of input parameters equal to number


of arguments?
<--- Score
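Questions 85 and 121 ask whether call sites supply as many arguments as the callee declares parameters. In Python this can be spot-checked mechanically with the standard `inspect` module; the `send` function below is a made-up example:

```python
import inspect

def send(route, payload, timeout):
    # A stand-in callee with three declared parameters.
    return f"{route}:{payload}:{timeout}"

# Inspection answers the checklist question without reading the body:
params = inspect.signature(send).parameters
print(len(params))  # 3 -- every call must supply exactly these
print(send("direct:start", "hello", 30))
```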

86. What if the order of arguments is reversed?


<--- Score

87. Input-only arguments altered?


<--- Score

88. Multiway branches exceeded?


<--- Score

89. What is Independent Verification and


Validation (IV&V)?
<--- Score

90. What is cleanroom software engineering?
<--- Score

91. Record and structure attributes match?


<--- Score

92. Has each user interface been tailored to


the intelligence, educational background, and
environmental pressures of the end user?
<--- Score

93. How do we test a real-time software?


<--- Score

94. What is a Good Test?


<--- Score

95. What are the different dimensions of quality?


<--- Score

96. Who is authorized to conduct the assessment?


<--- Score

97. Would you be proud to have written this


program?
<--- Score

98. Can we do better?


<--- Score

99. Do you have the materials (e.g., source code)


and are all materials properly marked?
<--- Score

100. What licenses, permits or other authority
must the Company have in order to conduct
business?
<--- Score

101. How could the error have been detected


earlier?
<--- Score

102. Why do we test this?


<--- Score

103. Do you want anyone to be able to use the


software for any purpose, including creating
divergent incompatible proprietary versions and
proprietary modules inside larger proprietary
programs?
<--- Score

104. Number, attributes, and order of arguments


to built-in functions correct?
<--- Score

105. Do you have the necessary copyright-related


rights?
<--- Score

106. Do you have permission to release to the


public?
<--- Score

107. Is the program easy to use?


<--- Score

108. Files opened before use?


<--- Score

109. Off-by-one errors in indexing or subscripting
operations?
<--- Score

110. Do we test how easily an external attacker


or malicious insider could successfully attack a
system?
<--- Score

111. What is an agile team?


<--- Score

112. When are we done testing?


<--- Score

113. Testing Object-Oriented Applications: Why is


it Different?
<--- Score

114. Describe the corrective actions that relate


to your area of responsibility. Who should be
assigned responsibility for each corrective action?
<--- Score

115. What things are dependent upon others?


<--- Score

116. What are the pre-requisites for employees?


<--- Score

117. What is most important for your


organization?
<--- Score

118. What should be the qualities of assessment


team members?

<--- Score

119. Format specification matches I/O statement?


<--- Score

120. What happens if an input Vector is null?


<--- Score
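Question 120 (a null input Vector) generalizes to any language: a null/None input is a distinct equivalence class from an empty collection, and both deserve a test. A Python sketch using None in place of Java's null (function name is illustrative):

```python
def total_length(items):
    # None is rejected explicitly; an empty list is a valid input
    # and simply sums to zero.
    if items is None:
        raise ValueError("items must not be None")
    return sum(len(s) for s in items)

print(total_length([]))           # 0
print(total_length(["ab", "c"]))  # 3
```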

121. Number of arguments transmitted to called


modules equal to number of parameters?
<--- Score

122. When should testing begin?


<--- Score

123. How will we know if an implementation


conforms?
<--- Score

124. Units system of arguments transmitted


to called modules equal to units system of
parameters?
<--- Score

125. What kind of collaboration do you expect?


<--- Score

126. Is this a specification fault or a software fault?


<--- Score

127. Was the low-level design visible and


reasonable?
<--- Score

128. Any warning or informational messages?


<--- Score

129. Present in the input?
<--- Score

130. What changes?


<--- Score

131. How will you validate your software product?


<--- Score

132. Parameter and argument units system match?


<--- Score

133. What are the drawbacks of the waterfall model?


<--- Score

134. What is the maturity date?


<--- Score

135. What was done incorrectly?


<--- Score

136. Are there any existing relevant COTS and


GOTS capabilities, including OSS and OGOTS?
<--- Score

137. Was the high-level design visible and


reasonable?
<--- Score

Add up total points for this section:

_____ = Total points for this section

Divided by: ______ (number of
statements answered) = ______
Average score for this section

Transfer your score to the Camel
Development with Red Hat JBoss Fuse
Index at the beginning of the Self-
Assessment.

Index
ability 25, 88
accept 114
acceptable 51, 84
Acceptance 18, 53
access 2, 4, 43, 46, 98
accomplish 3, 49, 79
according 27, 31, 93, 101
account 5, 28
accounted 54
accuracy 43
accurate 36
achieve 3, 67, 78, 87, 114
achieved 15, 79, 110
achieving 86
across 30, 37, 50, 94
action 49, 92-93, 96, 118
actions 13, 103, 118
activities 20, 73, 97
activity 24, 28, 77
actual 28, 86, 112, 115
actually 29, 69, 80, 95
addition 4
additional 30, 35, 58, 60, 65, 69, 74
additions 100
address 17, 59, 81
addressed 32, 34
addressing 18, 25
adequately 30, 52
advantage 67
advantages 79
advise 4
affect 14, 66-68, 82-83
affecting 7, 19, 60
afford 47
against 30, 94-95, 100
Aggregate 50
agreed 44, 52
alerts 98
algorithms 93
aligned 19

alignment 43
alleged 1
allocation 19
already 113
altered 115
although 32
Amazon 5
America 27
amount 51-52, 111
amplify 60
analysis 6-7, 42-43, 46, 48-50, 52-53, 55, 64-65, 68-69, 87
analyst 53
analyze 2, 44-45, 48-49, 53-55, 58, 60, 68
analyzed 42, 44, 50, 52, 54, 98
another 5, 47
answer 6-7, 12, 23, 41, 58, 72, 91, 107, 114
answered 21, 38, 56, 70, 89, 105, 120
answering 6
anyone 25, 117
appear 1
applicable 7
applied 84, 98
applies 73
appointed 36
approach 60, 98
approaches 76
approval 30
Architect 3
Architects 3
argument 59, 107, 120
arguments 108, 113, 115, 117, 119
around 32
Arrays 112
artifacts 64
asking 1, 3
aspect 110
aspects 16
assertions 95
assess 13, 30, 76
assessing 80, 98
assessment 4, 20, 38, 116, 118
assessors 112
assign 18

assigned 29, 36, 69, 93, 109, 111, 118
assignment 68
assist 86
assistant 3
associated 63, 113
assumption 101
assurance 115
assure 43, 83
attack 118
attacker 118
attainable 34
attempted 25
attempting 100
attendance 36
attended 36
attention 7, 33
attributes 59, 63, 77-78, 107-110, 113, 116-117
auditing 19, 33, 91
auspices 4
author 1
authority 109, 113, 116
authorized 116
automated 78
available 14, 17, 30, 32, 42, 66, 68, 82, 103, 113
Average 7, 21, 38, 56, 70, 89, 105, 120
background 6, 116
balance 50, 77
barriers 42
Baseline 49
baselined 51
baselines 32, 36
basically 104
because 42-43, 46, 55
before 4, 25, 98, 117
beginning 2, 10, 21, 38, 56, 70, 89, 105, 121
behave 61, 113
behavior 27
belief 6, 12, 23, 41, 58, 72, 91, 107
benefit 1, 18, 20, 99
benefits 17, 62, 65, 70, 74, 107, 110
better 3, 31, 42, 46, 113, 115-116
between 50, 54, 59, 69, 73, 75, 77, 80, 84
biggest 50, 81

blinding 64
Blokdyk 4
Boolean 35, 75, 83, 85
bought 5
bounce 62, 65
boundaries 25, 32
bounds 27, 32
branches 115
briefed 36
bringing 86
brings 31
broken 64, 66
budget 72, 96
Buffer 114
building 13, 92, 94, 100, 108, 110
built-in 117
business 1, 3, 5-6, 15, 17, 19, 21, 23-25, 28-29, 35-36, 47, 51,
55, 59-60, 63, 78, 83, 86, 94, 96, 117
button 5
buy-in 79
bypasses 42
called 113, 119
candidate 60
cannot 16
capability 13, 29, 44
capable 3, 34
capacity 13
capture 31, 94
captured 42
cascading 52
category 27, 37
caused 1
causes 48, 58, 61, 65, 103
causing 19
certain 16
chaired 4
challenge 3, 48
Champagne 4
champion 35
change 12, 23, 47, 59, 79, 81, 103
changed 23, 74
changes 15, 52, 73, 81, 85, 98, 100, 120
changing 16, 25

character 59
charter 29-30
charters 37
charts 47, 52-53, 67
cheaper 42, 46
checked 70, 91, 96-97, 101, 109
checklist 4
choice 27, 37
choose 6-7, 72
circumvent 13
claimed 1
classes 108
cleaning 27
cleanroom 116
clearly 6-7, 12, 23-24, 27, 29, 41, 58, 72-73, 91, 107
client 4-5, 48
clients 37
closed 21, 97, 110
closely 5
closing 65, 74
Cluster 109
Coaches 25, 36
coding 115
coherent 113
collect 55, 93
collected 24, 31, 47, 53, 58, 61
collection 42-43, 46, 51, 64
college 77
coming 66
command 96
commercial 83
committed 37, 69
community 113
companies 1, 4
company 3, 42, 46, 67, 86, 113, 117
compare 69, 77
Comparing 59, 76, 94, 100
comparison 6
compelling 29
Compiler 83, 85
complete 1, 6, 20, 37, 96
completed 7, 25-26, 29, 37, 72

completion 24, 26, 31, 77
complex 3, 55
Complexity 46, 75, 100
compliance 66
components 35, 47, 53, 73
comprehend 14
compute 7
computer 14, 67
concept 79
conceptual 76
concern 79
concerns 16, 42
condition 35, 99
conditions 42-43, 70, 91, 109
conduct 116-117
conducting 30, 59
confirm 7
Conform 17
conforms 119
conquering 25
consider 13, 19, 96, 98
considered 13, 17, 54
considers 67
consistent 37, 46, 94, 108
Constants 108
constitute 36
consultant 3
consulting 45
Contact 3
contacted 4
contain 14, 97, 112
contained 1
containing 4, 75
contains 4
content 29
Contents 1-2
continual 5, 97-98
continuous 63
contract 73
contracts 30
control 2, 47, 52, 54, 91, 93, 95-97, 99, 102-103
controlled 35, 61
controls 14, 62, 66, 87-88, 93-95, 98-100
convenient 43, 46
convey 1
Copyright 1
corporate 81, 83, 86
correct 41, 63, 69, 75, 91, 108-110, 114, 117
corrective 103, 118
correspond 5
costing 48
course 23
covering 101
create 5, 13-14, 49, 60, 69
created 14, 60, 69
creating 3, 50, 117
creativity 87
crisis 13
criteria 5, 18, 27, 34, 37, 98
CRITERION 2, 12, 23, 26, 41, 58, 72, 91, 107
critical 29, 31, 33, 60, 65, 73, 99, 102
criticism 69
crucial 64, 66
crystal 7
cultural 82
culture 28, 60
current 26, 41, 54, 62, 69-70, 74, 96, 113
currently 27, 95
custom 12, 58
customer 5, 16, 24-25, 31, 35, 38, 45, 53, 78, 96, 101-102
customers 1, 16, 26, 34, 38, 43, 45-46, 65, 69-70
damage 1
dashboards 100
database 69
databases 58
datatype 59, 61, 68-69
datatypes 59-60
day-to-day 98
deadlines 17
decide 79, 84
decided 72
decision 63, 76-77, 86
decisions 73-74, 82, 84-85, 88, 92, 97, 103
declared 112
dedicated 3
deeper 7
deepest 4
Default 77
defaults 78
defect 14, 18, 44, 52, 66, 114
defects 20, 55
deferred 16
define 2, 23, 25, 27-28, 33-34, 42, 62
defined 6-7, 12, 14-16, 23-24, 26-30, 32, 34, 36-38, 41, 44,
55, 58, 61, 72, 91, 107
defines 14, 27, 38
Defining 3
definite 97
degree 77
delegated 34
deletions 100
deliver 16, 33, 78
delivered 55, 108
delivery 16, 46, 60, 102
department 3
dependent 118
deploy 108
deployed 43, 102
deploying 45
derive 98
Describe 21, 118
described 1, 73
describing 26
design 1, 4-5, 27, 32, 49, 55, 81, 86, 96, 119-120
designed 3, 5, 69, 84
designers 80, 109
designing 3
desired 30, 83
desktops 113
detail 31, 50, 59, 73, 85
detailed 59-60
detect 70, 91
detected 117
detecting 26
detection 26
determine 5-6, 45
determined 65
develop 65, 72, 76, 87
developed 4-5, 26, 30, 37, 50, 72, 75-76, 78, 81, 84, 88, 95
developer 74
developers 53, 73-74, 88, 104
developing 64, 85
devoid 67
DevOps 75
diagram 45, 61
diagrams 44, 49
difference 73, 80, 84
different 3, 18, 28-29, 31, 38, 44, 52, 59, 61-63, 65, 75-76,
83, 108-109, 111-112, 114, 116, 118
differing 63
difficult 16
diffusion 77
dimension 27
dimensions 116
direction 23, 42, 46
directly 1, 69-70, 112
Disagree 6, 12, 23, 41, 58, 72, 91, 107
discerning 64
discover 112
discussion 50
display 47
displayed 31, 51, 53-54, 63
disqualify 69
disruptive 59
distinct 63
divergent 117
Divided 21, 34, 38, 56, 70, 88, 105, 120
document 5, 33
documented 37, 50, 61, 92, 94, 101
documents 3
domain 46
drawbacks 120
durations 28
during 16, 23, 87, 114
dynamics 32
earlier 117
easily 118
economy 88
editorial 1
education 3, 19, 61, 92
effect 52
effective 15
effects 42, 64
efficiency 62, 93
efficient 74, 81, 83
effort 44, 51, 74, 79, 81, 83, 101, 108
efforts 25, 55
electronic 1
elements 5-6, 92, 102
embarking 29
emerging 15, 20, 101
employee 84
employees 67, 118
empower 3
enable 59, 98
encourage 87, 95
engage 73
engagement 45, 76, 96
enhance 92
enhanced 84
enough 3, 74
ensure 28, 30, 43
Entities 55
entity 1
entries 69
equipment 16
equipped 32
equitably 34
errors 20, 26, 52, 67, 111, 114, 118
essential 88
establish 72
estimate 46, 51, 54
estimated 24, 31, 54
estimates 31, 45, 66-67
estimation 87
ethical 27
evaluate 78, 80
evaluates 83
evaluation 75, 83, 85-86, 102
events 52, 73
eventually 16, 20
everyday 67
everyone 26, 34
evidence 7, 49
evolution 41
evolve 97
evolved 104
example 2, 8, 16, 29, 35, 60, 94
examples 3-5
exceeded 115
excellence 3
excellent 50
excessive 112
exclude 85
execute 43
executed 46, 51
executive 3
existing 5-6, 43, 46, 79, 83, 94, 120
expect 119
expected 17, 28, 59
expend 51
expended 101
expensive 55
Expert 4
experts 30
explained 5
explicit 18, 77
explicitly 78
explore 61
expression 68
extent 6, 33, 83
external 25, 111, 118
facilitate 6, 16, 100
facility 5-6
facing 13
factors 32, 44, 53, 59, 82, 84
failed 46
failure 42, 110-111
fairly 34
fashion 1, 29
faults 110
feasible 51, 67
feedback 2, 5, 24, 35, 46
figure 44
finalized 8
financial 41, 62-63, 66-67, 70
finished 95
flying 27
folders 69
follow 5, 101
followed 26
following 6, 68
footprints 112
for--and 100
forget 4
formal 30, 32, 107
formally 24, 37
format 5-6, 119
formats 73, 92, 96
formed 36-37
formula 7
Formulate 23
fosters 27
framework 75-76, 84, 96
frequency 27, 91
frequently 41, 45
fulfill 14
full-blown 42
full-scale 81
functional 32
functions 38, 66, 117
future 3, 43, 47, 50, 92
gained 60, 100-101
gather 7, 32, 41, 79
gathered 36
generate 37, 65, 69, 108
generated 59
Gerardus 4
getting 27
gibberish 67
glamor 27
global 37, 88
graphs 4, 53
gratitude 4
greater 55
grievances 76
ground 51, 62, 70
groups 104
growth 64
guaranteed 34
guidance 1
handle 93
handled 64, 109, 111
happen 15
happens 3, 5, 16, 119
hardest 54
having 59-61
health 82
helpful 44
helping 3
highest 26
high-level 25, 28, 120
High-Yield 65
hiring 77, 100
hitters 67
humans 3
hundreds 104
hypotheses 58
identified 1, 18, 20, 31, 34, 38, 44, 51-53, 55, 64-65, 69, 74,
77
identify 6, 12, 18, 20-21, 42, 48, 54, 59-60, 68
ignorance 79
Imagine 73
imbedded 100
immediate 51, 109
impact 31, 34, 44-46, 50-51, 55, 87, 104
impacting 51-52
implement 13, 51, 91
implicit 18
implied 115
important 16, 53, 60, 69-70, 77, 118
improve 2, 5-6, 49, 62, 72-76, 78-80, 82, 84-85, 87-88
improved 74, 84, 99
improving 76
incentive 96
incentives 100
incident 19
include 85
Included 2, 4
includes 43
including 13, 25-26, 29, 41, 45, 62, 92-93, 102, 117, 120
income 86
increase 20, 83, 108
increased 93
increases 61
indenture 111
in-depth 7
indexing 118
indicate 51, 69, 99
indicated 103
indicators 20, 42, 44, 50, 69-70, 76, 92
indirectly 1
individual 1, 47
Industrial 75
industries 102
informal 32
ingrained 97
in-house 81, 83
initial 14, 59, 79
initiative 6
Innovate 72
innovation 62-63, 88, 98, 102
Input-only 115
inputs 26, 28, 66, 68, 101, 109
inside 117
insider 118
insight 68-69
inspection 112
install 108
installer 69
installing 113
instead 75
integrate 93
intended 1, 87, 109
INTENT 12, 23, 41, 58, 72, 91, 107
intention 1
interests 48, 54
interface 116
interfaces 73, 92, 96
internal 1, 25, 33
interpret 6-7
interviews 32
introduced 27, 107
intruder 112
intruders 112
intuition 48
invaluable 2, 4-5
investment 20, 45, 110
involved 16, 66, 77, 85
involves 98
isolate 48
isosceles 31
issues 14, 65, 82
iteration 35, 114
itself 1, 17
judgment 1
Kanban 86
knowledge 6, 25, 42, 47, 60, 77, 92, 99-102
languages 31
largely 62
larger 117
leader 15, 35, 60, 65
leaders 26-27, 36, 49, 62, 114
leadership 33, 37, 88
learned 94
learning 92, 99, 102
length 69
lengths 61, 108
lessons 81, 94
levels 13, 29-30, 43, 62, 65, 69-70, 92
leverage 28, 88, 94
leveraged 25, 104
liability 1
licenses 113, 116
lifecycle 48
likelihood 85, 87
likely 83, 96
limited 6
limits 96
linked 24
listed 1
listing 115
locations 75
longer 96
long-term 47, 103
looking 110
losses 50
low-level 119
machine 19
magnitude 79
Maintain 91
makers 95
making 15, 63, 85-86
malicious 118
manage 48, 62, 78, 94
manageable 33
managed 3, 32
management 1, 5-6, 16, 18, 24-25, 29, 36, 42, 47, 79, 82-83, 96,
109
manager 3, 6, 14, 24, 30
managers 79
Manifesto 81, 83
manner 14, 21
Manual 78
mapped 36
marked 116
market 43, 98
marketer 3
matches 114, 119
material 83
materials 1, 116
matter 30, 42, 50
maturity 62, 120
meaningful 52, 67, 108
measurable 25, 34
measure 2, 6, 18-19, 28, 37, 41-44, 46-53, 55, 62, 66, 72, 80,
82, 84-86, 93, 99-100, 104
measured 19, 42, 44, 47, 49-51, 54-55, 68, 83, 95, 101
measures 41, 45-46, 48, 50-54, 61-62, 69-70, 82-84, 92, 99
mechanical 1
meeting 35, 48, 101
meetings 33-34, 36
member 37
members 25, 29, 33-34, 36, 118
memory 19, 63
messages 14, 119
method 27, 37, 81, 83
methods 27, 35, 45, 51, 81, 83
metrics 38, 44, 64, 79, 86, 100, 112
milestones 29
minimum 33
minutes 35, 77
missed 48
mission 66-67
modeling 32, 60, 62
models 26, 36, 49, 65, 68
modify 110
module 16
modules 37, 113, 117, 119
moments 64
monetary 18
monitor 96-97, 99-100
monitored 98
monitoring 98-99, 101, 103
months 77, 81
motivation 97, 104
multiple 61
Multiway 115
narrow 58
nearest 7
necessary 55, 64, 66, 68, 87, 103, 117
necessity 112
needed 12-13, 16, 21, 28, 52, 66, 83, 92, 95, 98
negative 36
neither 1
network 112
Neutral 6, 12, 23, 41, 58, 72, 91, 107
nonabusive 67
non-Agile 64
noninteger 23
normal 97
Notice 1
number 21, 26, 38, 54, 56, 59, 70, 88, 105, 112, 115, 117,
119-120, 122
object 112
objective 3, 44
objectives 15-16, 19, 23-24, 32, 49, 66-67, 93, 98, 108, 113
observed 81
obstacles 13
obtained 24, 42
obviously 7
occurring 78
occurs 13, 98
Off-by-one 114, 118
offerings 69, 77
one-time 3
ongoing 47, 68, 82, 101
online 5, 86
opened 117
operating 94
operation 82, 93
operations 6, 93, 97, 100, 118
Operator 75, 78
operators 73, 75, 101
optimized 80
options 14, 112
organized 74
orient 101
originate 95
orthogonal 108
others 118
otherwise 1
outcome 7
outcomes 45, 53, 81, 94, 100
outlined 98
output 28, 52, 59, 67, 70, 91, 99
outputs 26, 66-68, 101
outside 87
overall 6-7, 19, 53, 60, 93, 113
overcome 79
ownership 27, 102
Parameter 59, 107, 120
parameters 102, 113, 115, 119
parent 83, 86
Pareto 67
particular 44, 60, 113
parties 77
partners 16, 43
passed 108
patterns 55
people 3, 18, 50, 69, 85, 92, 95, 103, 115
perceived 86
perception 78, 84
perform 18, 25, 29, 34
performed 82, 88
performing 111
period 84
permission 1, 117
permits 113, 116
permitted 1
person 1, 94, 111
personnel 18, 98
pertaining 32
pertinent 98
phases 48
planet 92, 103
planned 46, 51, 93-94, 98, 100, 103-104
planners 95
planning 4, 93, 96, 98, 101
Planning- 98
points 21, 38, 55, 70, 88, 104-105, 120
policy 32, 95
portray 67
positive 81
posses 114
possible 26, 43, 46, 52, 58, 65, 91, 110
potential 13, 33, 42, 44, 59, 68, 78
practical 67, 72, 74, 91
practice 45, 108
practices 1, 5, 75, 94, 96, 100, 110
precaution 1
precedence 75, 78
precisely 44
predefined 26
present 43, 47, 92, 120
preserve 36
pressures 116
prevent 15, 19-20, 50
prevented 19
prevention 14, 18, 52, 66
prevents 15
previous 25
principles 14, 44
printing 4
priorities 43, 48, 50
prioritize 48
privacy 24, 98
problem 12-16, 18-20, 23, 25, 29, 33-34, 60, 67, 104
problems 13-19, 21, 48, 76, 78-79, 86-87, 103
procedure 61
procedures 5, 30, 61, 92, 94, 101
process 1, 3, 5, 25-26, 28-29, 43-44, 47, 51-55, 59-70, 81,
86, 91-93, 97-99, 101-102, 104
processes 36, 50, 59, 61-63, 66, 94, 100, 102
processors 58
produced 81
product 1, 5, 54, 69-70, 72, 88, 94, 100, 108, 110, 120
production 82
products 1, 16-17, 50
program 13, 16, 47, 63, 67, 77, 79-80, 83, 94, 100, 110, 114,
116-117
programs 117
progress 36, 49, 79-80, 93
project 3-4, 12, 18-19, 21, 25, 37, 42, 47, 61, 66-67, 77, 85, 98, 100,
114
projects 48, 97, 104
promote 27, 50, 69
proofing 81
properly 5, 26, 28, 52, 112, 116
proposals 95
proposed 13, 46, 92, 96
protect 62
provide 13, 68, 73
provided 4, 7, 54, 99
public 117
publisher 1
purchase 4-5
purchased 5, 17, 58
purpose 2, 5, 73, 117
purposes 35
qualified 34, 93
qualities 114, 118
quality 1, 5, 43-46, 49, 53, 63-64, 68, 79, 88, 91, 93, 108-110, 115-
116
question 6-7, 12, 23, 41, 58, 72, 91, 107
questions 3-4, 6, 67
quickly 6, 60, 62, 65
radically 59
rapidly 16
reactive 85
readings 96-97
realistic 94
realized 104
really 3, 88
real-time 114, 116
reasonable 119-120
reasons 29, 112
receive 24, 50
received 36
recent 62
recently 5
recognize 2, 12-14, 18, 83
recognized 13, 15, 77
recognizes 14
Record 114, 116
recording 1
records 16, 66
redefine 27, 37
re-design 64, 66
reducing 101
referenced 61, 63
references 27, 113, 122
reflect 60, 104
reform 55, 95
reforms 13, 46, 51
regarding 24
registry 69
regression 114
regret 86
regular 33, 36
regularly 33-34, 36
reinforce 96
relate 118
related 45, 61, 100
relation 14
relative 93
release 117
relegated 88
relevant 5, 34, 55, 68, 96, 120
reliable 32
remedial 50
remedies 51
remunerate 84
rephrased 5
report 18, 55, 97
reported 21
reporting 92
reports 20-21, 50, 79
represent 83
represents 30-31, 103
reproduced 1, 47
request 67
requested 1, 85
require 35, 42, 94
required 16, 28, 32, 35, 38, 52-53, 74, 77
requires 27, 33
research 3, 88
reserved 1
resolution 68
resource 4
resources 2, 4, 17, 30, 32, 50, 74, 80, 96, 99
respect 1
responded 7
response 13, 97, 99-100, 102-103
result 65, 68, 74, 81, 83, 88
resulted 92
resulting 62, 70, 73
results 28, 33, 49, 69, 72-73, 77-78, 80-81, 83, 86-87, 92, 99
Retain 107
retention 76
Return 42, 81, 109-110
returned 27
revenue 42, 46
reversed 115
review 5-6, 35
reviewed 26
reviews 5, 30
revised 66-67, 92
reward 47, 50, 67, 96
rewards 100
rework 20, 51-52, 77
right-hand 68
rights 1, 73, 117
roadmaps 27
routine 97, 99
safety 76, 82
savings 31, 66-67
scalene 103
scaling 14
scenario 73
scenarios 108
schedule 27, 51-52, 96
scheme 97
science 14
scoping 59
Scorecard 2, 7-9
scorecards 100
Scores 9
scoring 5
searching 35
second 7
section 7, 21, 38, 55-56, 70, 88-89, 104-105, 120
security 19, 24, 32, 82, 95, 98, 110
segmented 31
segments 38
select 63, 93
sellers 1
senior 27, 49
sensitive 64
separate 103
sequence 49, 88
series 6, 88
service 1-5, 54, 78, 84
services 1, 4, 45, 50
sessions 32
setbacks 62, 65
several 4, 59, 75, 77
severely 64, 66
shared 101
sharing 92
should 3, 17, 19, 25, 29, 34, 50-51, 54, 59, 61, 63-64, 79, 86, 93,
95, 99-100, 104, 107, 109, 112, 114, 118-119
similar 25, 29, 37, 67, 69, 77, 109
simple 112
simply 4-5
single-use 3
situation 20, 41
skills 13, 43, 46, 55
smaller 19, 68
smallest 13, 81
software 14, 16, 18, 46, 63-65, 67, 73, 75-77, 79-81, 83-84,
86, 94-95, 108, 111-112, 114, 116-117, 119-120
solicit 35
solution 51, 67-68, 72, 74, 81, 84, 87, 91, 102
solutions 42-43, 46, 76, 78
Someone 3
Sometimes 32, 42
source 116
sources 32, 61, 67, 87
special 54, 99
specific 16, 20, 25, 32, 34
specified 53
specifying 23
sponsor 16
sponsored 35
sponsors 16
stability 43
staffed 30
staffing 13, 100
staffs 93
standard 3, 95
standards 1, 5-6, 92, 96-97, 99, 104
started 4
starting 6, 81, 83, 86
stated 78
statement 6, 14, 76, 85, 87, 119
statements 7, 21, 29, 34, 38, 56, 60, 70, 89, 105, 107, 114, 120
status 69
storage 98, 108-109
strategic 93
strategies 32, 85
strategy 19, 30, 59-60, 72, 83, 92, 113
string 59
strings 112
Strongly 6, 12, 23, 41, 58, 72, 91, 107
structure 28, 30, 61, 116
subject 4, 30
submit 5
submitted 5
subroutine 16
subscript 27
subset 13, 26
success 13, 18, 37, 42, 48, 50, 52, 55, 78, 85, 95
successful 64, 82, 93
sufficient 32, 50
suggested 103
suitable 53
suppliers 26, 43, 61
support 3, 63, 82, 87, 93, 95-96
supported 37, 46, 64
Supporting 77
surface 103
Surveys 4
SUSTAIN 2, 107
sustaining 103
Switch 16
symptom 12
system 5-6, 16, 35, 53, 67, 73, 75, 82, 88, 93, 96, 102, 109-113,
118-120
systematic 53
systems 33, 43, 62, 67, 74, 78, 100, 114
tactics 108
tailored 116
taking 42, 46
talking 3
target 25, 68, 85
targets 83
tasked 102
technical 30, 59, 113
techniques 68
technology 104
templates 3
terminate 16, 20, 109
tested 15, 49, 109
testers 77
testing 20, 23, 25-26, 35, 55, 65, 79, 83, 107-109, 111-112, 115,
118-119
textual 67
thankful 4
theory 97, 100, 104
there's 95
things 73, 115, 118
thinking 68, 87
thousands 104
through 27, 49, 68
throughout 1
time-bound 34
timely 21, 29
tomorrow 92, 103
Toolkits 3
top-down 96
toward 101
towards 68, 81, 83
tracking 30, 93
trademark 1
trademarks 1
trained 26, 36-37
training 13, 18, 54, 61, 81, 92, 99-100
Transfer 7, 21, 38, 56, 70, 89, 100, 102, 105, 121
translated 25
trends 20, 69-70
triangle 30-31, 103
trying 3, 59
unable 83
uncovered 86
underlying 87
understand 31, 80
understood 74, 77-78, 85
undertake 61
underway 85
Unless 3
unlikely 112
unmodified 113
updated 60
updates 100
up-front 95
upgrade 16
urgency 111
usability 24, 110
useful 87, 91
usefully 6, 13
usually 83
utilise 83
utilised 75
utilizing 87
validate 110, 120
validated 25-26, 29, 59
Validation 94, 110, 115
validity 109
valuable 3
valued 44
values 23, 113
variable 37, 68-69, 78
variables 29, 52, 59-60, 67, 70, 91, 109, 112, 114
variation 12, 28, 47, 51, 53-54, 61, 67, 101
variety 85
Vector 119
vendors 17
venture 81, 83, 86
ventures 83
verified 25-26, 29, 59
verify 43, 92-93, 104
Version 122
versions 28-29, 117
Virgin 27
visible 119-120
volatile 88
warehouse 98
warning 119
warranty 1
watchers 31
watches 30
wealth 43, 46
well-known 79
whether 3, 98
within 3, 27, 61, 72, 84, 112
without 1, 7, 55, 79
workable 113
workforce 13, 45, 76, 82, 96
working 86, 93, 98
workplace 83
worried 79
writing 5
written 1, 17, 21, 58, 111, 116
yourself 66