The State of Measurement & Analysis 2008: Applications in Support of High Maturity Practices

 

Welcome to your personalized questionnaire. Our hope is that your participation will help your organization judge its effectiveness relative to the successes and challenges reported by others throughout the industry. As always, any information that could identify you or your organization will be held in strict confidence by the SEI.

Be sure to save your personalized URL if you have not already done so. It is https://seir.sei.cmu.edu/feedback/HighMaturity2008.asp?ID=c3203. You or your designee may return to that URL at any time to continue completing the questionnaire, and you may save your work as you go. There are separate Save buttons for each section of the questionnaire. Please be sure that you are fully finished before you press the Submit button at the end of the questionnaire.

Thank you once again for your help with this important activity. And, please feel free to contact us at sei-analysis@sei.cmu.edu if you have any difficulty with the questionnaire.

 

I. About Yourself & Your Organization

 

  • By “organization” we mean an entity within which projects or similar work efforts are organized under common management and policies.
  • Such an organization may develop, maintain, or acquire software or software-intensive systems, or provide related services.
  • When thinking about your organization, please answer for the organizational unit that you actually work in or support -- not for the larger corporation or other entity of which it may be a part.
  • For the purposes of this survey, please answer from the perspective of the organizational unit that most recently had a CMMI-based appraisal.

 

1. Which of the following best describes the role you play in your organization? (Please select one)
Executive or senior manager
Middle manager (e.g., program or product line)
Project manager
Project engineer or other technical staff
Process or quality engineer
Measurement specialist
Other (Please describe briefly)

 

2. How is your organization best described? (Please select one)
Commercial off-the-shelf (e.g., shrink-wrap or custom installation of enterprise solutions such as SAP or Oracle)
Contracted new development (e.g., for use in particular product lines or other novel point solutions)
In-house or proprietary development or maintenance
Defense contractor
Other government contractor
Department of Defense or military organization
Other government agency
Other (Please describe briefly)

 

3. What is the primary focus of your organization’s work? (Please select one)
Product or system development
Maintenance or sustainment
Acquisition
Service provision
Other (Please describe briefly)

 

4. What kinds of engineering are major parts of your organization’s work? (Please select as many as apply)
Software engineering
Systems engineering
Hardware engineering
Design engineering
Test engineering
Other (Please describe briefly)

 

5. How would you best describe your involvement with measurement and analysis? (Please select one)
I am a provider of measurement-based information
I am a user (consumer) of measurement-based information
I am both a provider and user (consumer) of measurement-based information
Other (Please describe briefly)

 

6. In what country is your organization primarily located?

 

7. Approximately how many full-time employees in your organization work predominantly in software, hardware or systems engineering (e.g., development, maintenance, acquisition or provision of related services)? (Please select one)

 

8. To the best of your knowledge, what is the current maturity level of your organization? (Please select one)
CMMI Maturity Level 3 or lower
Close to Maturity Level 4
CMMI Maturity Level 4
CMMI Maturity Level 5
Don't know
 

 

II. Measurement-Related Training & Staffing

 

1. What best characterizes the measurement-related training that your organization requires for its employees? (Please select one for each)
Executive and senior managers
Middle managers (e.g., program or product line)
Project managers
Project engineers and other technical staff
Process or quality engineers

 

2. What kind of specialized training, if any, does your organization require for its employees who have responsibilities for process performance modeling? (Please select one for each)
Process performance model builders & maintainers (e.g., Six Sigma black belts or other measurement specialists)
Coaches & mentors who assist the model builders and maintainers (e.g., Six Sigma master black belts)
Those who collect and manage the baseline data (e.g., Six Sigma green belts, other project engineers or EPG members)
Users of the models

 

3. In what ways does your organization ensure that its process performance model builders and maintainers are properly trained? (Please select as many as apply)
Training developed and delivered internally within the organization
Purchase of training materials that are developed elsewhere but delivered internally
Contracts with external training services
Conferences and symposiums
We hire the right people in the first place
Other (Please describe briefly)

 

4. What, if any, other types of measurement-related training or mentoring does your organization provide? (Please describe briefly)

 

5. Approximately how many people in your organization work with process performance baselines and models as part of their explicitly assigned work efforts? (Please specify a number for each ... or type DK if you don’t know)
     Those who collect and manage the baseline data (e.g., Six Sigma green belts, other project engineers or EPG members)
     Those who build and maintain the models (e.g., Six Sigma black belts or other measurement specialists)
     Those who mentor or coach the model builders and maintainers (e.g., Six Sigma master black belts)
     Those who use the model results to inform their decision making

 

6. Approximately how many people build or maintain the models and baselines as their primary work assignments? (Please specify a number for each ... or type DK if you don’t know)
     The builders and maintainers
     Their mentors or coaches

 

7. How well do the people who create your organization’s process performance models and baselines understand the following aspects of CMMI’s intent? (Please select one for each)
The CMMI definition of a process performance model
The CMMI definition of a process performance baseline
The circumstances when process performance baselines are useful
The circumstances when process performance models are useful

 

8. How well do the managers in your organization who use process performance model results understand the results that they use? (Please select one)
Extremely well
Very well
Moderately well
To some extent
Hardly at all
Don't know

 

9. How often are qualified, well-prepared people available to work on process performance modeling in your organization when you need them (i.e., people with sufficient measurement-related knowledge, competence, and statistical sophistication)? (Please select one)
Almost always (Greater than or equal to 80%)
Frequently (Greater than or equal to 60% but less than 80%)
About half of the time (Greater than 40% but less than 60%)
Occasionally (Greater than 20% but less than or equal to 40%)
Rarely if ever (Less than or equal to 20%)

 

10. Does your organization provide its employees with promotion or financial incentives tied to the deployment and adoption of measurement and analysis (e.g., via Six Sigma belt programs)? (Please select as many as apply)
No
Yes … for executive and senior managers
       ... for middle managers (e.g., program or product line)
       ... for project managers
       ... for project engineers and other technical staff
       ... for others (Please describe briefly)
Don’t know
 

 

III. Alignment, Coordination & Infrastructure

 

1. How would you characterize the involvement of various potential stakeholders in setting goals and deciding on plans of action for measurement and analysis in your organization? (Please select one for each)
Customers
Executive and senior managers
Middle managers (e.g., program or product line)
Project managers
Project engineers and other technical staff
Process and quality engineers
Measurement specialists
Others (Please describe briefly)

 

2. Which of the following best describes how work on measurement and analysis is staffed in your organization? (Please select one)
An organization-wide, division-level, or similar corporate support group (e.g., an engineering process, quality assurance, or measurement group)
Separate groups or individuals in different projects or other organizational units (e.g., project, product team or similar work groups)
A few key people (or one person) in the organization who are measurement experts
Other (Please describe briefly)

 

3. How much automated support is available for measurement-related activities in your organization? (Please select one for each)
Data collection (e.g., on-line forms with "tickler" reminders, time stamped activity logs, static or dynamic analyses of call graphs or run-time behavior)
Commercial work flow automation that supports data collection
Data management (e.g., relational or distributed database packages, open database connectivity, tools for data integrity, verification, or validation)
Spreadsheet add-ons for basic statistical analysis
Commercial statistical packages that support more advanced analyses
Customized spreadsheets for routine analyses (e.g., for defect phase containment)
Commercial software for report preparation (e.g., graphing packages or other presentation quality results)
Other (Please describe briefly)

 

4. How often does your organization do the following with the data it collects? (Please select one for each)
Check for out-of-range or other illegal values in the recorded data
Evaluate the number and distribution of missing data
Ensure that missing data are not inadvertently treated as zero values
Check for precision and accuracy of the data
Estimate measurement error statistically
Check for inconsistent interpretations of measurement definitions
Check for consistency/reliability of measurement results and procedures across time and reporting units
Check for consistency of classification decisions based on the same information (otherwise known as inter-coder reliability)
Analyze & address the reasons for unusual patterns in the data distributions (e.g., outliers, skewness, or other aspects of non-normal distributions)
Analyze & address the reasons for unusual or unanticipated relationships between two or more measures
Automate data quality/integrity checks for ease of collecting consistent data
Other (Please describe briefly)
 

 

IV. Use of Process Performance Models & Baselines

 

1. For what operational purposes are process performance models and baselines routinely used in your project and organizational product development, maintenance, or acquisition activities? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Defining the organization's standard processes (e.g., implementing the entire process asset library in a simulation model)
Composing projects’ defined processes (e.g., modeling trade-offs in cost, schedule and quality to select among alternative subprocesses or compositions for a given project)
Risk management
Project planning
Project monitoring and corrective actions
Identifying opportunities for process or technology improvement
Evaluating process or technology improvements
Other (Please describe briefly)
None of the above
Don't know

 

2. Where else in the organization are process performance models and baselines used? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Corporate planning and strategy
Portfolio planning
Responses to requests for proposals or tender offers (e.g., cost and achievability modeling)
Marketing
Technology R&D
Business operations
Supply chain management
Human resources management
Other (Please describe briefly)
None of the above
Don't know

 

3. Which of the following product quality and project performance outcomes are routinely predicted with process performance models in your organization? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Delivered defects
Type or severity of defects
Product quality attributes (e.g., mean time to failure, design complexity, maintainability, interoperability, portability, usability, reliability, reusability, or durability)
Quality of services provided (e.g., IT ticket resolution time)
Cost and schedule duration
Work product size
Accuracy of estimates (e.g., cost, schedule, product size or effort)
ROI of process improvement or related financial performance
Customer satisfaction
Other (Please describe briefly)
None of the above
Don't know

 

4. Which of the following (often interim) process performance outcomes are routinely predicted with process performance models in your organization? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Escaped defects (e.g., as predicted by defect phase containment models)
Cost of quality and poor quality (e.g., rework)
Estimates at completion (i.e., performed periodically throughout the project)
Requirements volatility or growth
Effectiveness or efficiency of inspection or test coverage
Practitioner adherence to defined processes
Other (Please describe briefly)
None of the above
Don't know

 

5. Which of the following processes and activities are routinely modeled in your organization? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Project planning and estimation
Requirements engineering
Product architecture
Software design and coding
Process documentation
Quality control processes
Systems engineering processes
Hardware engineering processes
Acquisition or supplier processes
Other (Please describe briefly)
None of the above
Don't know

 

6. How much emphasis does your organization place upon the following in its process performance modeling? (Please select one for each)
Accounting for uncertainty and variability in predictive factors and predicted outcomes
Factors that are under management or technical control
Other product, contractual or organizational characteristics, resources or constraints
Segmenting or otherwise accounting for uncontrollable factors
Factors that are tied to detailed subprocesses
Factors that are tied to larger, more broadly defined organizational processes
Other (Please describe briefly)

 

7. To what degree are your organization’s process performance models used for the following purposes? (Please select one for each)
Predict final project outcomes
Predict interim outcomes during project execution (e.g., connecting “upstream” with “downstream” activities)
Model the variation of factors and understand the predicted range or variation of the predicted outcomes
Enable “what-if” analysis for project planning, dynamic re-planning and problem resolution during project execution
Enable projects to achieve mid-course corrections to ensure project success
Other (Please describe briefly)
 

 

V. Other Analytic Methods & Techniques

 

1. To what extent are the following statistical methods used in your organization’s process performance modeling? (Please select one for each)
Regression analysis predicting continuous outcomes (e.g., bivariate or multivariate linear regression or non-linear regression)
Regression analysis predicting categorical outcomes (e.g., logistic regression or loglinear models)
Analysis of variance (e.g., ANOVA, ANCOVA or MANOVA)
Attribute SPC charts (e.g., c, u, p, or np)
Individual point SPC charts (e.g., ImR or XmR)
Continuous SPC charts (e.g., XbarR or XbarS)
Design of experiments
Other (Please describe briefly)

 

2. Which of the following visual display techniques are used to communicate the results of your organization’s analyses of process performance baselines? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Box plots
Histograms
Scatter plots or multivariate charting
Pareto charts, pie charts or bar charts
Mosaic charts for categorical data
Other (Please describe briefly)
None of the above
Don't know

 

3. Which of the following other modeling, simulation, and optimization approaches are used in your organization’s process performance modeling? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Monte Carlo simulation
Discrete event simulation for process modeling
Markov or Petri-net models
Probabilistic modeling
Neural networks
Optimization
Other (Please describe briefly)
None of the above
Don't know

 

4. Which of these decision techniques are used in your organization? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
Analytic Hierarchy Process (AHP)
Real options
Conjoint analysis
Wideband Delphi
Weighted multi-criteria methods (e.g., QFD or Pugh)
Decision trees
Other (Please describe briefly)
None of the above
Don't know
 

 

VI. Challenges & Value Added

 

1. Following are a series of statements about the kinds of technical challenges that projects sometimes face. How well do they describe your organization? (Please select one for each)
Initial project requirements are not well defined
Requirements change significantly throughout the life of the projects
There is little or no precedent for the kind of work we are doing
Significant constraints are placed on product quality attributes (e.g., reliability, scalability, security, or supportability)
The size of the development effort is large
The technology needed for the projects is not mature
There are extensive needs for interoperability with other systems
Insufficient resources (e.g., people or funding) are available to support the projects
Insufficient skills and subject matter expertise are available to support the projects
Other (Please describe briefly)

 

2. Following are a few statements about the possible effects of using process performance modeling. To what extent do they describe what your organization has experienced? (Please select one for each)
Better project performance (e.g., more accurate estimation, reduced cost, shorter cycle time or higher productivity)
Better product quality (e.g., fewer defects or improved customer satisfaction)
Fewer project failures
Better tactical decisions about the adoption or improvement of work processes and technologies
Better strategic decision making (e.g., about business growth or profitability)
Other (Please describe briefly)

 

3. How often are process performance model predictions used to inform decision making in your organization’s status and milestone reviews? (Please select one)
Almost always
Frequently
About half the time
Occasionally
Rarely if ever
Don't know

 

4. Overall, how useful have process performance models been for your organization? (Please select one)
Extremely valuable -- we couldn’t do our work properly without them
Very valuable -- we have obtained much useful information from them
Mixed value -- we have obtained useful information on occasion
Little or no value
It’s been harmful, not helpful
Don't know
 

 

VII. Barriers & Facilitators of Effective Measurement & Analysis

 

1. Which, if any, of the following have been major obstacles during your organization’s journey to high maturity? (Please select as many as apply ... or be sure to check ‘None of the above’ if appropriate)
We focused only on final project outcomes rather than interim outcomes
We didn’t collect data frequently enough to help projects make mid-course corrections
We failed to collect enough contextual information for proper segmentation and stratification
We failed to achieve enough consistency in our measures to aggregate and disaggregate them properly across the organization
We failed to sufficiently align and prioritize our measurement and analysis practices with our business and technical goals and objectives
We’ve encountered resistance to collecting new or additional data after achieving maturity level 3
Our management thought that process performance modeling would be an expensive overhead function rather than an essential part of project work
We spent too much time creating reports for management review instead of doing thorough analysis
We emphasized statistics more than domain knowledge and ended up with ineffective models
We didn’t provide sufficient mentoring and coaching for the individuals responsible for developing the models
Our process performance modelers don’t have sufficient access to people with statistical expertise
Other (Please describe briefly)
None of the above
Don't know

 

2. Following are a series of statements that are made in some organizations about the use of process performance modeling. How well do they describe your organization? (Please select one for each)
We have trouble doing process performance modeling because it takes too long to accumulate enough historical data
Doing process performance modeling has become an accepted way of doing business here
We make our decisions about the models we build without sufficient participation by management or other important stakeholders
We have trouble convincing management about the value of doing process performance modeling
The messenger has been shot for delivering bad news based on process performance model predictions
We thought we knew what was driving process performance, but process performance modeling has taught us otherwise
Our managers want to know when things are off-track
Our managers are less willing to fund new work when the outcome is uncertain
We use data mining when similar but not identical electronic records exist
We do real time sampling of current processes when historical data are not available
We create our baselines from paper records for previously unmeasured attributes

 

3. What have been the greatest barriers faced by your organization during its journey to high maturity? What have you done to overcome them? (Please describe fully)
 

 

In Conclusion

 

  Is there anything else that you would like to tell us about the use or usefulness of measurement and analysis in your organization? (Please describe fully)
 

Thank you very much for your time and effort!

Before pressing the Submit button:

  • Be sure to print a copy if you want one.
  • If you wish to save a copy on your local computer, please be sure that you have saved everything first, then use your browser's File Save As... command. When doing so, do not change the default file type.

Please press the Submit button only when you have completed the questionnaire fully to your satisfaction.

 


  Copyright 2008, Carnegie Mellon University.