This document discusses implementing Technical Performance Measures (TPM) on projects. It begins by outlining learning objectives related to the role and requirements of TPM. It then explains why TPM are needed in addition to earned value measures, which capture only cost and schedule, in order to measure technical progress as well. The document shows how TPM can be defined and measured for work breakdown structure elements and used to track and reduce risk over time. It emphasizes that integrating TPM with cost and schedule measures gives program management the performance information needed to deliver projects on time, on budget, and meeting technical requirements.
1. PMI EVM Community of Practice
IPM 2011
CPM-500(D) : Implementing Technical Performance
Measures
Glen B. Alleman
DoD Programs
glen.alleman@niwotridge.com
+1 303 241 9633
2. Learning Objectives
TLO #9: The student will understand the role of Technical Performance Measurement
(TPM) in the project office.
ELO #1: The student will recognize the policy requirements for Technical Performance
Measures.
ELO #2: The student will recognize the role of Integrated Baseline Reviews in confirming the
entire technical scope of work has been planned.
ELO #3: The student will recognize the role of the WBS in supporting Technical Performance
Measure requirements.
TLO #9: The student will understand the scope of DCMA’s (or other) TPM software
management tool implementation.
ELO #1: The student will recognize the benefits and challenges of Technical Performance
Measure implementation.
ELO #2: The student will recognize the use of control limit charts to track Technical
Performance Measure metrics.
ELO #3: The student will understand the methodology and approach used to show the
effect of Technical Performance Measure on Earned Value.
2/66
4. Increasing the Probability of
Program Success Means …
Building A Credible Performance Measurement Baseline
[Diagram: the building blocks of a credible PMB – Risk, Cost, IMP/IMS, PMB, SOW, WBS, and TPM]
This is actually harder than it looks!
5. A Core Problem With Earned
Value
Measures Of Progress Must Be In Units
Meaningful To The Stakeholders
Earned Value measures performance in units of “money” (BCWS, BCWP, ACWP).
We need another measure of progress in units of TIME.
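To make the distinction concrete, here is a minimal sketch with invented numbers and the standard EVM formulas. The "earned schedule" conversion at the end is one common way to express progress in units of time; it is an illustration, not part of the original slide.

```python
# Illustrative only: invented numbers, standard EVM definitions.
# BCWS = budgeted cost of work scheduled, BCWP = budgeted cost of work
# performed (earned value), ACWP = actual cost of work performed.
bcws, bcwp, acwp = 100_000.0, 80_000.0, 95_000.0

cv = bcwp - acwp   # cost variance, in dollars
sv = bcwp - bcws   # schedule variance -- ALSO in dollars, not in time

# "Earned schedule" converts BCWP back into TIME by asking when this much
# work was planned to be complete (linear interpolation between periods).
planned_cum = [0.0, 30_000.0, 60_000.0, 100_000.0]  # cumulative BCWS, months 0..3

def earned_schedule(ev: float, profile: list) -> float:
    for t in range(1, len(profile)):
        if profile[t] >= ev:
            return (t - 1) + (ev - profile[t - 1]) / (profile[t] - profile[t - 1])
    return float(len(profile) - 1)

es = earned_schedule(bcwp, planned_cum)
print(f"CV = {cv:+,.0f} $, SV = {sv:+,.0f} $, ES = {es:.1f} of 3.0 months")
```

Here the schedule variance of −20,000 dollars tells a stakeholder nothing about lateness; the earned-schedule figure (2.5 months of planned work done at month 3) does.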
6. Doing This Starts With Some Guidance
Systems engineering uses technical performance
measurements to balance cost, schedule, and performance
throughout the life cycle. Technical performance
measurements compare actual versus planned technical
development and design. They also report the degree to
which system requirements are met in terms of performance,
cost, schedule, and progress in implementing risk handling.
Performance metrics are traceable to user–defined
capabilities.
― Defense Acquisition Guide
(https://dag.dau.mil/Pages/Default.aspx)
In The End ― It’s All About Systems Engineering
7. Just A Reminder Of The …
Primary Elements of Earned Value
[Diagram: the three primary elements – Cost, Schedule, and Technical Performance – with margins annotated: funding margin for under performance; over cost or under performance; over cost or over schedule; schedule margin for an over target baseline (OTB); over schedule or under performing; schedule margin for underperformance or schedule extension]
8. This Has All Been Said Before.
We Just Weren’t Listening…
… the basic tenets of the process are the need for
seamless management tools, that support an
integrated approach … and “proactive
identification and management of risk” for critical
cost, schedule, and technical performance
parameters.
― Secretary of Defense Perry memo, May 1995; TPM Handbook, 1984
Why Is This Hard To Understand?
We seem to be focused on EV reporting, not the use of EV to
manage the program.
Getting the CPR out the door is the end of Program Planning
and Control’s efforts, not the beginning.
9. The Gap Seems To Start With A
Common Problem
Many Times, The Information from Cost, Schedule, Technical
Performance, and Risk Management Gets Mixed Up When We
Try to Put Them Together
10. The NDIA EVM Intent Guide Says
Notice the inclusion of Technical along with
Cost and Schedule
That’s the next step in generating Value from Earned Value
EV MUST include the Technical Performance Measures
11. Back To Our Technical
Performance Measures
Technical Performance Measures do what
they say,
Measure the Technical Performance
of the product or service produced by the
program.
12. The real question?
How fast can we safely go?
Yes, the Units of Measure are MPH
13. Measure of Effectiveness (MoE)
The operational measures of success that are closely related to the
achievements of the mission or operational objectives evaluated in
the operational environment, under a specific set of conditions.
Measures of Effectiveness …
Are stated in units meaningful to the buyer,
Focus on capabilities independent of any
technical implementation,
Are connected to the mission success.
MoEs Belong to the End User
“Technical Measurement,” INCOSE–TP–2003–020–01
14. Measure of Performance (MoP)
Measures that characterize physical or functional attributes
relating to the system operation, measured or estimated
under specific conditions.
Measures of Performance are …
Attributes that assure the system has the
capability to perform,
Assessment of the system to assure it meets
design requirements to satisfy the MoE.
MoPs belong to the Program – Developed by the Systems
Engineer, Measured by CAMs, and Analyzed by PP&C
“Technical Measurement,” INCOSE–TP–2003–020–01
15. Key Performance Parameters (KPP)
Represent the capabilities and characteristics so
significant that failure to meet them can be cause for
reevaluation, reassessing, or termination of the program
Key Performance Parameters …
Have a threshold or objective value,
Characterize the major drivers of performance,
Are considered Critical to Customer (CTC).
The acquirer defines the KPPs during the operational
concept development – KPPs say what DONE looks like
“Technical Measurement,” INCOSE–TP–2003–020–01
16. Technical Performance Measures (TPM)
Attributes that determine how well a system or system
element is satisfying or expected to satisfy a technical
requirement or goal
Technical Performance Measures …
Assess design progress,
Define compliance to performance requirements,
Identify technical risk,
Are limited to critical thresholds,
Include projected performance.
“Technical Measurement,” INCOSE–TP–2003–020–01
17. Dependencies Between These Measures
The Acquirer defines the needs and capabilities in terms of operational scenarios; the Supplier defines physical solutions that meet the needs of the stakeholders.

Mission Need → MoE → MoP → TPM, with KPPs spanning them.

MoE – operational measures of success related to the achievement of the mission or operational objective being evaluated.
MoP – measures that characterize physical or functional attributes relating to the system operation.
TPM – measures used to assess design progress, compliance to performance requirements, and technical risks.

“Coming to Grips with Measures of Effectiveness,” N. Sproles, Systems Engineering, Volume 3, Number 1, pp. 50–58
18. “Measures” of Technical Measures
Attribute – Description
Achieved to Date – Measured technical progress or estimate of progress
Current Estimate – Value of a technical parameter that is predicted to be achieved
Milestone – Point in time when an evaluation of a measure is accomplished
Planned Value – Predicted value of the technical parameter
Planned Performance Profile – Profile representing the project’s time-phased demonstration of a technical parameter
Tolerance Band – Management alert limits
Threshold – Limiting acceptable value of a technical parameter
Variances – Demonstrated technical variance; predicted technical variance
INCOSE Systems Engineering Handbook
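These attributes map naturally onto a small data structure. A sketch follows; the field names are my own, the attribute list comes from the table above, and the 25.0 kg / 25.5 kg values anticipate the airframe weight example later in the deck.

```python
from dataclasses import dataclass

# Hypothetical field names; the attributes follow the INCOSE handbook list.
@dataclass
class TpmAssessment:
    milestone: str           # point in time where the measure is evaluated
    planned_value: float     # predicted value of the technical parameter
    achieved_to_date: float  # measured technical progress at this milestone
    current_estimate: float  # value predicted to be achieved at completion
    threshold: float         # limiting acceptable value

    @property
    def demonstrated_variance(self) -> float:
        return self.achieved_to_date - self.planned_value

    @property
    def predicted_variance(self) -> float:
        return self.current_estimate - self.planned_value

pdr = TpmAssessment("PDR", planned_value=25.0, achieved_to_date=25.5,
                    current_estimate=24.0, threshold=26.0)
print(pdr.demonstrated_variance, pdr.predicted_variance)  # 0.5 -1.0
```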
19. A Familiar Graphic of TPMs
[Chart: a typical TPM profile over time (program maturity) – planned profile, planned value, current estimate, achieved to date, variance, upper and lower limits, tolerance band, threshold, and milestones; example parameter: Mean Time Between Failures]
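The alert logic behind such a chart can be sketched as a simple classification against the tolerance band and threshold. The logic is assumed and generic, and the band values are invented; a parameter like weight, where exceeding the threshold is unacceptable, is used here.

```python
def classify(value: float, lower: float, upper: float, threshold: float) -> str:
    """Management-alert status for one TPM observation, for a parameter
    where exceeding the threshold is unacceptable (e.g. vehicle weight)."""
    if value > threshold:
        return "RED"     # beyond the limiting acceptable value
    if not (lower <= value <= upper):
        return "YELLOW"  # outside the tolerance band -- management alert
    return "GREEN"       # inside the tolerance band

# Invented band: 23-26 kg tolerance, 28 kg threshold
for weight in (25.5, 26.7, 28.4):
    print(weight, classify(weight, 23.0, 26.0, 28.0))
```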
20. TPMs from an Actual Program
Chandra X–Ray Telescope
21. What Does A Real Technical
Performance Measure Look Like?
Not that bagels weren’t interesting in Lessons 1 and 2, but let’s get ready to look at a flying machine.
22. The WBS for a UAV
TPMs Start With The WBS
1.1 Air Vehicle
1.1.1 Sensor Platform
1.1.2 Airframe
1.1.3 Propulsion
1.1.4 On Board Comm
1.1.5 Auxiliary Equipment
1.1.6 Survivability Modules
1.1.7 Electronic Warfare Module
1.1.8 On Board Application & System SW
1.3 Mission Control / Ground Station SW
1.3.1 Signal Processing SW
1.3.2 Station Display
1.3.3 Operating System
1.3.4 ROE Simulations
1.3.5 Mission Commands
23. What Do We Need To Know About
This Program Through TPMs
What WBS elements represent the TPMs?
What Work Packages produce these WBS elements?
Where do these Work Packages live in the IMS?
What are the Earned Value baseline values for these
Work Packages?
How are we going to measure all these variables?
What does the curve look like for these
measurements?
24. Verifying Each TPM
For each program event: the decision question, and the evidence that we’re in compliance.

CA – Do we know what we promised to deliver, now that we’ve won? With our submitted ROM, what are the values we need to get through the Integrated Baseline Review? How do we measure weight for each program event?

SFR – Can we proceed into preliminary design? The contributors to the vehicle weight are confirmed and the upper limits defined in the product architecture and the requirements flow-down database (DOORS) into a model.

SRR – Can we proceed into the System Development and Demonstration (SDD) phase? Do we know all drivers of vehicle weight? Can we bound their upper limits? Can the subsystem owners be successful within these constraints using a high-fidelity model?

PDR – Can we start detailed design, and meet the stated performance requirements within cost, schedule, risk, and other constraints? Does each subsystem designer have the target component weight and some confidence they can stay below the upper bound? Can this be verified in some tangible way, either through prior examples or a lab model?

CDR – Can the system proceed to fabrication, demonstration, and test, within cost, schedule, risk, and other system constraints? Do we know all we need to know to start fabrication of the first articles of the flight vehicle? Some type of example, maybe a prototype, is used to verify we’re inside the lines.

TRR – Is the system ready to proceed into formal test? Does the assembled vehicle fall within the weight range limits for 1st flight – will this thing get off the ground?
25. Design Model
TPM Trends & Responses
[Chart: Technical Performance Measure – Vehicle Weight across program events CA, SFR, SRR, PDR, CDR, TRR. Planned values step down from 28 kg through 26 kg and 25 kg to 23 kg as the basis of measurement matures: ROM in proposal, bench-scale model measurement, detailed design model, prototype measurement, flight 1st article. At each step: EV taken, planned values met, tolerances kept, etc.]
Dr. Falk chart – modified
26. The Assessment Of Weight As A
Function Of Time
At Contract Award there is a proposal-grade estimate of
vehicle weight.
At System Functional Review, the Concept of Operations is
validated for the weight.
At System Requirements Review the weight targets are
flowed down to the subsystems components.
At PDR the CAD model starts the verification process.
At CDR actual measurements are needed to verify all
models.
At Test Readiness Review we need to know how much
fuel to put on board for the 1st flight test.
27. The WBS for a UAV Airframe Weight TPM
1.1.2 Airframe: the planned weight at PDR is 25.0 kg; the actual weight is 25.5 kg. Close to plan! So we are doing okay, right?

                CA       SFR      SRR      PDR      CDR      TRR
Planned Value   28.0 kg  27.0 kg  26.0 kg  25.0 kg  24.0 kg  23.0 kg
Actual Value    30.4 kg  29.0 kg  27.5 kg  25.5 kg

Assessed Risk to TRR: Moderate (>2.0 kg off target); Low (1–2 kg off target); Low (1–2 kg off target); Very Low (less than 1.0 kg off target).

Planned Method: ROM; “similar to” estimate; program-unique design model; program-unique design model with validated data; actual measurement of bench-test components; actual measurement of prototype airframe.

Actual Method: ROM; ROM; ROM; “similar to” estimate. Here’s the Problem – the actual basis of measure lags the planned basis.
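The table’s point can be checked mechanically: the weight variance looks small at PDR, but the basis of measurement has not matured as planned. A sketch using the slide’s values, with the method labels abbreviated:

```python
# Values from the airframe weight TPM table; method names abbreviated.
events         = ["CA", "SFR", "SRR", "PDR"]
planned_kg     = [28.0, 27.0, 26.0, 25.0]
actual_kg      = [30.4, 29.0, 27.5, 25.5]
planned_method = ["ROM", "similar-to estimate", "design model", "validated design model"]
actual_method  = ["ROM", "ROM", "ROM", "similar-to estimate"]

for e, p, a, pm, am in zip(events, planned_kg, actual_kg, planned_method, actual_method):
    lag = "  <-- basis of measure lags the plan" if am != pm else ""
    print(f"{e:>3}: variance {a - p:+.1f} kg (planned via {pm}, actual via {am}){lag}")
```

The 0.5 kg variance at PDR is within tolerance, but it was produced by a ROM-grade estimate rather than the validated design model the plan called for, so the confidence in that number is low.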
28. Raison d'être for Technical Performance Measures
The real purpose of Technical Performance Measures is to reduce Programmatic and Technical RISK.
[Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, and TPM]
29. Buying Down Risk with TPMs
“Buying down” risk is planned in the IMS. MoE, MoP, and KPP are defined in the work package for the critical measure – weight.

[Chart: risk burn-down for Risk CEV-037, “Loss of Critical Functions During Descent.” Risk score (0–24) versus time (31 Mar 2005 through 1 Jul 2011), stepping down through planned mitigations: correlate the analytical model; conduct focused splinter review; develop analytical model; conduct force-and-moment wind tunnel tests; conduct Block 1 wind tunnel tests; conduct wind tunnel testing; flight application of spacecraft TPS; CEV Block 5 wind tunnel testing; in-flight development tests; damaged TPS flight test. Legend: planned risk level (solid = linked, hollow = unlinked, filled = complete). Annotations: weight risk reduced from RED to YELLOW; weight confirmed ready to fly – GREEN at this point.]

If we can’t verify we’ve succeeded, then the risk did not get reduced. The risk may have gotten worse.
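The rule the slide states (no verification, no risk reduction) can be sketched as follows; the logic is assumed and the scores invented, with task names echoing the burn-down chart.

```python
# Planned post-mitigation risk scores on the burn-down (invented numbers).
planned_after = {
    "block 1 wind tunnel test": 18,
    "prototype bench test": 12,
    "in-flight development test": 6,
}

def buy_down(task: str, verified: bool, score: int) -> int:
    """Risk comes down only when the mitigation's success is verified."""
    if verified:
        return min(score, planned_after[task])
    return score  # unverified: the risk did not get reduced

score = 22
score = buy_down("block 1 wind tunnel test", True, score)   # 18
score = buy_down("prototype bench test", False, score)      # still 18
print("risk score:", score)
```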
30. Increasing the Probability of
Success with Risk Management
Going outside the TPM limits always means cost and schedule impacts.
“Coloring Inside the Lines” means knowing how to keep the program GREEN, or at least stay close to GREEN.
So much for our strategy of winning through technical dominance.
31. Connecting the EV Variables
Integrating Cost, Schedule, and Technical Performance assures Program Management has the needed performance information to deliver on‒time, on‒budget, and on‒specification.

Cost + Schedule = Conventional Earned Value.
Cost + Schedule + Technical Performance Measures = the performance information Program Management needs.

Cost Baseline:
– The Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around.
– Probabilistic cost estimating uses past performance and cost risk modeling.
– Labor, materiel, and other direct costs are accounted for in Work Packages.
– Risk adjustments are made for all elements of cost.

Technical Performance:
– Earned Value is diluted by missing technical performance.
– Earned Value is diluted by postponed features.
– Earned Value is diluted by non-compliant quality.
– All these dilutions require adjustments to the Estimate at Complete (EAC) and the To Complete Performance Index (TCPI).

Schedule Baseline:
– Requirements are decomposed into physical deliverables.
– Deliverables are produced through Work Packages.
– Work Packages are assigned to an accountable manager.
– Work Packages are sequenced to form the highest value stream with the lowest technical and programmatic risk.
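One way to make the “dilution” adjustment concrete is to discount BCWP by a technical-compliance ratio before computing the EAC and TCPI. This formula is an illustration, not a mandated method; the 25.0/25.5 ratio reuses the airframe weight example, and all dollar figures are invented.

```python
bac = 1_000_000.0     # budget at completion
bcwp_raw = 400_000.0  # earned value before technical adjustment
acwp = 420_000.0      # actual cost of work performed

# Illustrative compliance ratio: planned / actual weight, capped at 1.0
tpm_compliance = min(25.0 / 25.5, 1.0)
bcwp = bcwp_raw * tpm_compliance           # technically-adjusted earned value

cpi = bcwp / acwp
eac = bac / cpi                            # simple CPI-based estimate at complete
tcpi = (bac - bcwp) / (bac - acwp)         # to-complete performance index
print(f"adjusted BCWP = {bcwp:,.0f}, EAC = {eac:,.0f}, TCPI = {tcpi:.3f}")
```

A compliance ratio below 1.0 lowers the CPI and raises both the EAC and the TCPI, which is exactly the slide’s point: cost and schedule measures alone understate the work remaining.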
32. TPM Checklist
MoE:
– Traceable to needs, goals, objectives, and risks.
– Defined with associated KPPs.
– Each MoE independent from the others.
– Each MoE independent of any technical solution.
– Address the required KPPs.

MoP:
– Traceable to applicable MoEs, KPPs, system-level performance requirements, and risks.
– Focused on technical risks and supports trades between alternative solutions.
– Provide insight into system performance.
– Decomposed, budgeted, and allocated to system elements.
– Assigned an “owner,” the CAM and Technical Manager.

TPM:
– Traceable to applicable MoPs, system element performance requirements, objectives, risks, and WBS elements.
– Further decomposed, budgeted, and allocated to lower-level system elements in the WBS and IMS.
– Assigned an owner, the CAM and Work Package Manager.
– Sources of measure identified and processes for generating the measures defined.
– Integrated into the program’s IMS as part of the exit criteria for the Work Package.
33. Did We Accomplish the Learning
Objectives?
TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office.
– ELO #1 (policy requirements for TPM): Policies and supporting guidance, with links and reference numbers provided.
– ELO #2 (role of IBRs in confirming the entire technical scope of work has been planned): This is the first place where cost, schedule, and technical performance come together – in the Integrated Master Schedule (IMS).
– ELO #3 (role of the WBS in supporting TPM requirements): TPMs are first located in the WBS.

TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool implementation.
– ELO #1 (benefits and challenges of TPM implementation): Progress is measured in units of physical percent complete. TPMs are those units.
– ELO #2 (use of control limit charts to track TPM metrics): We’ve seen notional and actual charts.
– ELO #3 (methodology and approach used to show the effect of TPMs on earned value): The example of our “flying machine” connects the dots for TPMs, risk, cost, and schedule.
35. Backup Materials
Knowledge is of two kinds. We know a
subject ourselves, or we know where
we can find information on it
— Samuel Johnson
36. Many Sources for Connecting the Dots
OMB Circular A–11, Section 300
GAO Report 06–250
DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
Interim Defense Acquisition Guidebook (DAG) 6/15/09
Systems Engineering Plan (SEP) Preparation Guide 4/08
WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
Guide for Integrating SE into DOD Acquisition Contracts 12/06
Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
Standard for Application and Management of the SE Process (IEEE 1220), 6.8.1.5
Processes for Engineering a System (ANSI/EIA–632)
Capability Maturity Model Integration (CMMI®)
NASA EVM Guide NPG 9501.3
37. Office of Management and
Budget
Circular No. A–11, Section 300
Planning, Budgeting, Acquisition and Management
of Capital Assets
Section 300–5
– Performance–based acquisition management
– Based on EVMS standard
– Measure progress towards milestones
• Cost
• Capability to meet specified requirements
• Timeliness
• Quality
38. Need: Accurate Performance
Measurement
GAO Report 06–250, “Information Technology: Improve the Accuracy and Reliability of Investment Information” – Findings and Recommendations:
2. If EVM is not implemented effectively, decisions are based on inaccurate and potentially misleading information.
3. Agencies are not measuring actual versus expected performance in meeting IT performance goals.
39. DOD Guides:
Technical Performance
Department of Defense Guidelines for Technical Performance Measures
DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
Interim Defense Acquisition Guidebook (DAG) 6/15/09
Systems Engineering Plan (SEP) Preparation Guide 4/08
WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
Integrated Master Plan (IMP) & Integrated Master Schedule Preparation &
Use Guide (IMS) 10/21/05
Guide for Integrating SE into DOD Acquisition Contracts (Integ SE) 12/06
Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
40. DoD: TPMs in Technical Baselines and Reviews
[Matrix: which DoD policy or guide (POL, DAG, SEP, WBS, IMP/IMS, Integrated SE, DAPS) addresses each topic]
Technical Baselines:
– Functional (SFR)
– Allocated (PDR)
– Product (CDR)
Event-driven timing
Success criteria of technical reviews
Entry and exit criteria for technical reviews
Assess technical maturity
41. DoD: TPMs in Integrated Plans
[Matrix: which DoD policy or guide (POL, DAG, SEP, WBS, IMP/IMS, Integrated SE, DAPS) addresses each topic]
Integrated SEP with: IMP/IMS, TPMs, EVM
Integrated WBS with: Requirements Specification, Statement of Work, IMP/IMS/EVMS
Link risk management, technical reviews, TPMs, EVM, WBS, IMS
42. Guidance in Standards, Models,
and Defense Acquisition Guide
Processes for Engineering a System (ANSI/EIA–632)
Standard for Application and Management of the SE
Process (IEEE 1220)
Capability Maturity Model Integration (CMMI®)
– CMMI for Development, Version 1.2
– CMMI for Acquisition, Version 1.2
– Using CMMI to Improve Earned Value Management,
2002
Guide to the Project Management Institute Body of
Knowledge (PMBOK Guide®), 4th Edition
43. Technical Performance
Measures (TPM)
More Sources

IEEE 1220: 6.8.1.5 – Performance-based progress measurement. TPMs are key to progressively assess technical progress. Establish dates for checking progress and for meeting full conformance to requirements.

EIA–632: Glossary – Predict future value of key technical parameters of the end system based on current assessments. The planned value profile is time-phased achievement: achievement to date, and the technical milestone where TPM evaluation is reported.

CMMI for Development – Requirements Development, Specific Practice (SP) 3.3, Analyze Requirements. Typical work product: TPMs. Subpractice: identify TPMs that will be tracked during development.
44. PMBOK® Guide
10.5.1.1 Project Management Plan
Performance Measurement Baseline:
– Typically integrates scope, schedule, and cost
parameters of a project
– May also include technical and quality parameters
45. PMBOK® Guide
8.3.5.4 Work Performance Measurements
Used to produce project activity metrics
Evaluate actual progress as compared to planned
progress
Include, but are not limited to:
– Planned vs. actual technical performance
– Planned vs. actual schedule performance, and
– Planned vs. actual cost performance.
46. TPMs in DAG and DAPS
Defense Acquisition Guide
Performance measurement of WBS elements, using
objective measures:
– Essential for EVM and Technical Assessment activities
Use TPMs and Critical Technical Parameters (CTP) to
report progress in achieving milestones
DAPS
Use TPMs to determine whether % completion metrics
accurately reflect quantitative technical progress and
quality toward meeting Key Performance Parameters
(KPP) and Critical Technical Parameters
47. TPMs in DAG
Compare the actual versus planned technical
development and design
Report progress in the degree to which system
performance requirements are met.
Plan is defined in terms of:
– Expected performance at specific points
• Defined in the WBS and IMS
– Methods of measurement at those points
– Variation limits for corrective action.
48. PMBOK® Guide
11.6.2.4 Technical Performance Measurement
Compares technical accomplishments… to … project
management plan’s schedule of technical
achievement
Requires definition of objective quantifiable
measures of technical performance which can be
used to compare actual results against targets.
Might include weight, transaction times, number of
delivered defects, storage capacity etc.
Deviation, such as demonstrating more or less
functionality than planned at a milestone…forecast
degree of success in achieving the project’s scope.
49. CMMI–ACQ
Acquisition Technical Management
SP 1.3 Conduct Technical Reviews
Typical supplier deliverables
Progress reports and process, product, and
service level measurements
TPMs
50. SMS Shall:
Monitor Progress Against the Plan
4.2.12.2 Monitoring
– Contractor SHALL monitor progress against plan to
validate, approve, and maintain each baseline and
functional architecture
4.2.12.2.2 Required Product Attributes
– Each documented assessment includes:
– TPMs, metrics
– Metrics and technical parameters for tracking that
are critical indicators of technical progress and
achievement
51. NASA EVM Guide:
Technical Performance
• NASA EVM Guide NPG 9501.3
– 4.5 Technical Performance Requirements (TPR): When TPRs are used, appropriate and relevant metrics must be defined in the solicitation.
– Appendix A.7, 14.1 TPR: Compares expected performance and physical characteristics with contractually specified values. Basis for reporting established milestones and progress toward meeting technical requirements.
52. Derivation and Flow Down of TPMs
Document/Baseline → IMS, EVM Parameter
IMP, Functional Baseline → Measures of Effectiveness (MoE)
IMP, WBS, Functional Baseline → Measures of Performance (MoP)
IMP, Allocated Baseline → Technical Performance Measures
IMS → TPM Milestones and Planned Values
Work Packages → TPM % Complete Criteria
See next chart for linkage of technical baselines to technical reviews
53. Interesting Attributes of TPMs
Achieved to Date (sounds like EV)
Current Estimate (sounds like EAC/ETC)
Milestone
Planned (target) value (sounds like PV)
Planned performance profile (sounds like a PMB)
Tolerance band (sounds like reporting
thresholds)
Threshold (yep, just what we thought)
Variance (sounds like variance!)