THIS OPEN SOURCE AGREEMENT ("AGREEMENT") DEFINES THE RIGHTS OF USE, REPRODUCTION, DISTRIBUTION, MODIFICATION AND REDISTRIBUTION OF CERTAIN COMPUTER SOFTWARE ORIGINALLY RELEASED BY THE UNITED STATES GOVERNMENT AS REPRESENTED BY THE GOVERNMENT AGENCY LISTED BELOW ("GOVERNMENT AGENCY"). THE UNITED STATES GOVERNMENT, AS REPRESENTED BY GOVERNMENT AGENCY, IS AN INTENDED THIRD-PARTY BENEFICIARY OF ALL SUBSEQUENT DISTRIBUTIONS OR REDISTRIBUTIONS OF THE SUBJECT SOFTWARE. ANYONE WHO USES, REPRODUCES, DISTRIBUTES, MODIFIES OR REDISTRIBUTES THE SUBJECT SOFTWARE, AS DEFINED HEREIN, OR ANY PART THEREOF, IS, BY THAT ACTION, ACCEPTING IN FULL THE RESPONSIBILITIES AND OBLIGATIONS CONTAINED IN THIS AGREEMENT.
Government Agency: NASA Marshall Space Flight Center
Government Agency Original Software Designation: _________________________
Government Agency Original Software Title: TRL Calculator AFRL_NASA Version
User Registration Requested. Please Visit http://________________________
Government Agency Point of Contact for Original Software: ED01/James W. Bilbro
1. DEFINITIONS

A. "Contributor" means Government Agency, as the developer of the Original Software, and any entity that makes a Modification.

B. "Covered Patents" mean patent claims licensable by a Contributor that are necessarily infringed by the use or sale of its Modification alone or when combined with the Subject Software.

C. "Display" means the showing of a copy of the Subject Software, either directly or by means of an image, or any other device.

D. "Distribution" means conveyance or transfer of the Subject Software, regardless of means, to another.

E. "Larger Work" means computer software that combines Subject Software, or portions thereof, with software separate from the Subject Software that is not governed by the terms of this Agreement.

F. "Modification" means any alteration of, including addition to or deletion from, the substance or structure of either the Original Software or Subject Software, and includes derivative works, as that term is defined in the Copyright Statute, 17 USC 101. However, the act of including Subject Software as part of a Larger Work does not in and of itself constitute a Modification.

G. "Original Software" means the computer software first released under this Agreement by Government Agency with Government Agency designation NASA Marshall Space Flight Center and entitled TRL Calculator AFRL_NASA Version 3beta, including source code, object code and accompanying documentation, if any.
NASA OPEN SOURCE AGREEMENT, VERSION 1.3
Version History

Version                                   Developer        Date       Description
TRL Calculator Version 2_2 NASA Variant   W. Nolte: AFRL   6/6/2007   Modification of AFRL Calculator
TRL Calculator AFRL_NASA Version 3beta                     1/21/2009  Modified stand-alone version released
TRL Calculator Ver B1.beta                                 1/28/2009  Modification allows for questions to be added at the discretion of the user and removes links to AD2
Key Points
- If heritage equipment is being used outside of the architecture and operational environment for which it was originally designed, it is at most a Level 5 until analysis and/or test warrants its increase.
- If it is within the experience base, it is engineering development; if it is outside of the experience base, it is technology development.
- It is extremely difficult to know a priori which is which. The only recourse available is to apply strong, up-front systems engineering.
- This tool is intended to be an aid to that process by providing a systematic assessment of component, subsystem and system maturity, as well as insight into the cost, schedule and risk associated with development.
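The heritage ground rule above can be expressed as a simple check. The sketch below is illustrative only; the function and parameter names are assumptions and are not part of the calculator:

```python
def assessed_trl(claimed_trl: int, new_architecture: bool, new_environment: bool,
                 justified_by_analysis_or_test: bool) -> int:
    """Apply the heritage ground rule: equipment reused outside its original
    architecture or operational environment is capped at TRL 5 until
    analysis and/or test warrants an increase."""
    if (new_architecture or new_environment) and not justified_by_analysis_or_test:
        return min(claimed_trl, 5)
    return claimed_trl

# A heritage unit claimed at TRL 9, flown in a new environment without
# supporting analysis or test, drops to 5:
print(assessed_trl(9, new_architecture=False, new_environment=True,
                   justified_by_analysis_or_test=False))  # → 5
```

Note that the cap never raises a low claimed TRL; equipment already below 5 keeps its claimed level.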
Background Material
Introduction
The idea that technology immaturity can have a significant impact on the ability to deliver satisfactory products on time and within cost has existed for many years. But how do you know that the technology you need is immature? For that matter, how do you know that you need any technology at all? No manager begins a program/project with the idea that they will have to rely on immature technology in order to meet requirements, and in fact most managers will want to avoid the use of any technology at all. But what is technology? Technology development is typically associated with the application of scientific knowledge to do something completely new or in a completely new way, and as such is to be avoided at all cost in the development of any program/project that has deliverables at fixed cost and schedule. Perhaps a more useful definition would be one that defines technology development as development that lies outside of our experience base, i.e. in the region of "unknown unknowns" where you don't know what you don't know. It is in this area where all too often program/projects have been caught unaware, resulting in cost overruns, schedule slips and even cancellations or failures. This is in large part due to a lack of up-front investment in the time and effort to understand what the requirements are and what is required to meet them. In other words, a lack of up-front systems engineering directed toward understanding whether or not requirements can be met within the available resources (i.e. the availability of the technology needed to meet requirements at a level of maturity that can be developed to a point where it can be incorporated within the cost, schedule and risk constraints). This is particularly true when dealing with heritage systems. Systems engineering is often short-circuited when dealing with heritage systems because it is believed that such systems have already been proven. However, a heritage system incorporated into a different architecture and operating in a different environment from those for which it was designed may well require modifications that fall outside of the realm of experience. In which case, this too should be considered technology development. In fact, the ground rule for the technological assessment processes accompanying these tools is that the maturity of heritage systems is automatically dropped to TRL 5 until sufficient analysis is done to justify a higher level.
In order to understand whether or not technology development is required, and to subsequently quantify the associated cost, schedule and risk, it is necessary to systematically assess the maturity of each system, sub-system or component in terms of the architecture and operational environment. It is then necessary to assess what is required in the way of development to advance the maturity to a point where it can be successfully incorporated within cost, schedule and performance constraints. The TRL Calculator and the AD2 Calculator were developed to provide a simple, standardized, systematic method for assessing the maturity of systems and the cost, schedule and risk associated with development and infusion. They are not magic bullets and they can be gamed, but they also provide a means of comparing apples to apples and permit everyone to be on the same page, or to at least have the wherewithal to get on the same page.
The TRL Calculator can be used at the very early (concept) stage of the program/project to provide a baseline maturity assessment. The AD2 Calculator can be used at the same time to provide a quick look at the tall tent pole issues. Subsequent use of the AD2 Calculator can provide the basis for the Technology Development Plan required for delivery at SRR by NPR 7120.5d, and to measure progress in the development. The TRL Calculator can also be used to measure progress and to provide the basis for the Technology Readiness Assessment Report required for delivery at PDR by NPR 7120.5d.
Since my retirement from NASA I have integrated the two calculators into a single workbook and added a number of other features, including the ability to save and recall projects in the TRL Calculator and to add additional questions and categories in the AD2 Calculator. The integrated calculator is available on request.
I am greatly indebted to Bill Nolte of AFRL for his willingness to modify his original TRL Calculator, and to Uwe Hueter, SAIC, and Vyga Kulpa, MSFC, for their part in developing the requirements for using the calculator in the assessment of the Ares project. I am also greatly indebted to John Cole for his initial coding of the AD2 Calculator. John Kelly of NASA Headquarters provided critical advice and support in the area of software assessment. Bob Duffe of Ames Research Center developed the …
Instructions
Instructions for the TRL Calculator Ver B1.1 beta
February 4, 2009

Double click on the document to open for reading.

N.B. The tabs are hidden for a reason; please use the buttons to navigate throughout the workbook, and close the workbook using the Close Calculator buttons!

Thanks - JB
Introduction:
Technology Assessment of a complex system requires the assessment of all of its systems, sub-systems and components, including those elements that are thought to be mature because of past operational use. It is comprised of two parts: first, determining the current maturity through the use of the Technology Readiness Level (TRL) Calculator, and then determining what is required to advance that maturity in terms of cost, schedule and risk through the use of the Advancement Degree of Difficulty (AD2) Calculator. This calculator deals with TRLs.
The TRL Calculator includes questions for hardware and software and an edited version of questions for Manufacturing Readiness Levels (MRLs) (this is not the same set as the full MRL tool). It includes the ability to capture and save project data as a function of the product Work Breakdown Structure.
The TRL Calculator saves copies of the evaluation, but those copies cannot be edited. If a new evaluation is required at a later date, all of the questions must be answered anew and the data saved with an annotation to the title that it is the 2nd evaluation (Widget & Widget II). If this annotation is not made, it will replace the old data with the new and the earlier information will be lost.
The calculator saves evaluation data (answers to the questions) according to the name of what is being evaluated (system name, subsystem name, component name, etc.). All of the data is saved under the project name and stays in the active part of the calculator until either the project is deleted or saved. Once it is saved, the data is no longer in the active part of the calculator. Data is sorted according to the WBS number entered for each element. Elements do not have to be entered in the order of the WBS. The calculator will sort the data into a WBS hierarchy when that option is selected. When the project itself is saved, the project name is recorded in the project index. The projects can be recalled and their data examined, replaced or augmented.
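The WBS sorting behavior described above (elements entered in any order, then sorted into the WBS hierarchy) can be sketched as follows. This is a hypothetical illustration, not the workbook's actual code; the function name, field names and sample data are assumptions:

```python
# Sort saved elements into WBS hierarchy order regardless of entry order.
# Keys like "1.10" must sort numerically per segment, not as plain strings
# (string sorting would wrongly place "1.10" before "1.2").
def wbs_key(wbs: str) -> tuple:
    return tuple(int(part) for part in wbs.split("."))

elements = [
    {"wbs": "1.10", "name": "Sunshade"},
    {"wbs": "1.2",  "name": "Primary Mirror"},
    {"wbs": "1.2.1", "name": "Actuator"},
]
for e in sorted(elements, key=lambda e: wbs_key(e["wbs"])):
    print(e["wbs"], e["name"])
# → 1.2 Primary Mirror, then 1.2.1 Actuator, then 1.10 Sunshade
```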
EXAMPLE
WBS 11: Primary Mirror — Program: NGST; Evaluator: Bilbro; Date: 1/16/08; Safety Critical? n/a
TRL 3 (% complete)
- Critical functions/components of the concept/application identified? — 100
- Subsystem or component analytical predictions made? — 70
- Subsystem or component performance assessed by Modeling and Simulation? — 0
- Preliminary key parameters performance metrics established? — 90
- Laboratory tests and test environments established? — 90
- Laboratory test support equipment and facilities completed for component/proof-of-concept testing? — 10
- Component acquisition/fabrication completed? — 10
- Component tests completed? — 0
- Analysis of test results completed establishing key performance metrics for components/subsystems? — 0
- Analytical verification of critical functions from proof-of-concept made? — 0
Level 3 Comments: Industry/government survey indicates that mirror fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested at cryogenic temperatures in order to select the optimum material for fabrication at room temperature and operation at 30 K.
MRL 3 (% complete)
- Preliminary design of components/subsystem/systems to be manufactured exists? — 60
- Basic manufacturing requirements identified? — 60
- Current manufacturability concepts assessed? — 60
- Modifications required to existing manufacturing concepts? — 0
- New manufacturing concepts required? — 0
- Have requirements for new materials, components, skills and facilities been identified? — 0
- Preliminary process flow identified? — 40
- Required manufacturing concepts identified? — 40
Level 3 Comments: Industry/government survey indicates that mirror fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested at cryogenic temperatures in order to select the optimum material for fabrication at room temperature and operation at 30 K.
WBS 12 — Program: NGST; Evaluator: Bilbro; Date: 1/16/08
TRL 2 (% complete)
- A concept formulated? — 100
- Basic scientific principles underpinning concept identified? — 100
- Preliminary analytical studies confirm basic concept? — 30
- Application identified? — 100
- Preliminary design solution identified? — 40
- Preliminary system studies show application to be feasible? — 100
- Preliminary performance predictions made? — 100
- Modeling & Simulation used to further refine performance predictions and confirm benefits? — 40
- Benefits formulated? — 100
- Research & development approach formulated? — 100
- Preliminary definition of laboratory tests and test environments established? — 100
- Concept/application feasibility & benefits reported in scientific journals/conference proceedings/technical reports? — 40
Level 2 Comments: 2 design solutions have been identified and 3 more are in process. Progress will be documented in SPIE Astronomy meeting proceedings.
TRL 3 (% complete)
- Critical functions/components of the concept/application identified? — 100
- Subsystem or component analytical predictions made? — 30
- Subsystem or component performance assessed by Modeling and Simulation? — 0
- Preliminary key parameters performance metrics established? — 40
- Laboratory tests and test environments established? — 90
- Laboratory test support equipment and facilities completed for component/proof-of-concept testing? — 10
- Component acquisition/fabrication completed? — 10
- Component tests completed? — 0
- Analysis of test results completed establishing key performance metrics for components/subsystems? — 0
- Analytical verification of critical functions from proof-of-concept made? — 0
- Analytical and experimental proof-of-concept documented? — 0
Level 3 Comments: Industry/government survey indicates that actuator fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested at cryogenic temperatures in order to select the optimum material for fabrication at room temperature and operation at 30 K.
MRL 3 (% complete)
- Preliminary design of components/subsystem/systems to be manufactured exists? — 30
- Basic manufacturing requirements identified? — 30
- Current manufacturability concepts assessed? — 40
- Modifications required to existing manufacturing concepts? — 0
- New manufacturing concepts required? — 0
- Have requirements for new materials, components, skills and facilities been identified? — 0
- Preliminary process flow identified? — 20
- Required manufacturing concepts identified? — 20
Level 3 Comments: Industry/government survey indicates that actuator fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested at cryogenic temperatures in order to select the optimum material for fabrication at room temperature and operation at 30 K.
WBS 13: Sunshade — Program: NGST; Evaluator: Bilbro; Date: 1/16/08; Safety Critical? n/a
TRL 2 (% complete)
- A concept formulated? — 100
- Basic scientific principles underpinning concept identified? — 100
- Preliminary analytical studies confirm basic concept? — 50
- Application identified? — 100
- Preliminary design solution identified? — 60
- Preliminary system studies show application to be feasible? — 100
- Preliminary performance predictions made? — 100
- Modeling & Simulation used to further refine performance predictions and confirm benefits? — 60
- Benefits formulated? — 100
- Research & development approach formulated? — 100
- Preliminary definition of laboratory tests and test environments established? — 100
- Concept/application feasibility & benefits reported in scientific journals/conference proceedings/technical reports? — 50
Level 2 Comments: Modeling & simulation expected to be completed in 6 weeks. Results to be presented at SPIE Astronomy Conference in July.
TRL 3 (% complete)
- Critical functions/components of the concept/application identified? — 100
- Subsystem or component analytical predictions made? — 40
- Subsystem or component performance assessed by Modeling and Simulation? — 20
- Preliminary key parameters performance metrics established? — 40
- Laboratory tests and test environments established? — 90
- Laboratory test support equipment and facilities completed for component/proof-of-concept testing? — 10
- Component acquisition/fabrication completed? — 10
- Component tests completed? — 0
- Analysis of test results completed establishing key performance metrics for components/subsystems? — 0
- Analytical verification of critical functions from proof-of-concept made? — 0
- Analytical and experimental proof-of-concept documented? — 0
MRL 3 (% complete)
- Preliminary design of components/subsystem/systems to be manufactured exists? — 30
- Basic manufacturing requirements identified? — 30
- Current manufacturability concepts assessed? — 40
- Modifications required to existing manufacturing concepts? — 0
- New manufacturing concepts required? — 0
- Have requirements for new materials, components, skills and facilities been identified? — 0
- Preliminary process flow identified? — 20
- Required manufacturing concepts identified? — 20
Level 3 Comments: Industry/government survey indicates that sunshade fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested.
WBS 10: Spacecraft Bus — Program: NGST; Evaluator: Bilbro; Date: 1/16/08; Safety Critical? n/a
TRL 4 (% complete)
- Concept/application translated into detailed system/subsystem/component level breadboard design? — 80
- Preliminary definition of operational environment completed? — 80
- Laboratory tests and test environments defined for breadboard testing? — 70
- Pre-test predictions of breadboard performance in a laboratory environment assessed by Modeling and Simulation? — 70
- Key parameter performance metrics established for breadboard laboratory tests? — 100
- Laboratory test support equipment and facilities completed for breadboard testing? — 100
- System/subsystem/component level breadboard fabrication completed? — 10
- Breadboard tests completed? — 0
- Analysis of test results completed verifying performance relative to predictions? — 0
- Preliminary system requirements for end user's application defined? — 0
- Critical test environments and performance predictions defined relative to the preliminary definition of the operating environment? — 40
- Relevant test environment defined? — 40
- Breadboard performance results verifying analytical predictions and definition of relevant operational environment documented? — 0
Level 4 Comments: 3 weeks to breadboard design complete. Preliminary operational environment due for completion in 1 week. All other activities to be complete within 2 months.
MRL 4 (% complete)
- Manufacturing requirements (including testing) finalized? — 40
- Machine/Tooling requirements identified? — 70
- Has training/certification been identified for all skills required (particularly new skills)? — 30
- Material requirements identified? — 70
- Producibility assessment initialized? — 60
- Machinery/tooling modifications breadboarded? — 60
- Key manufacturing processes identified? — 80
- New machinery/tooling breadboarded? — 80
- Metrology requirements identified? — 50
- Key metrology components breadboarded? — 80
- Key analytical tool requirements identified? — 80
- Key analytical tools breadboarded? — 70
- Key manufacturing processes assessed in laboratory? — 70
- Mitigation strategies identified to address manufacturability/producibility shortfalls? — 70
- All manufacturing processes identified? — 50
ASSESSMENT DETAILS

If you wish to view the details of a given level(s), check the appropriate boxes at right and click "Paste". The entire page can then be saved to a new work sheet by clicking "Save All". The levels achieved and the appropriate project information are saved to the WBS summary at the same time.
Technology Assessment Process

START: Assess systems, subsystems and components per the hierarchical product breakdown of the WBS.
1. Produce the Baseline Technological Maturity Assessment for SRR and the Technology Readiness Assessment Report for PDR.
2. Identify all components, subsystems and systems that are at lower TRLs than required by the program.
3. Perform AD2 on all identified components, subsystems and systems that are below the requisite maturity level.
- You have not documented any task accomplishment at this TRL or above.
- One or more tasks have been accomplished at this TRL or above, but there are enough tasks undone so that you cannot claim achievement of this TRL.
- You have accomplished many of the tasks required for this TRL, and you may be able to justify achievement of this TRL depending on which tasks are still undone.
- You can justify a claim that your technology program has achieved this TRL.
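The status descriptions above drive the calculator's color display via set points (the settings sheet shows a Green set point of 100% and an adjustable Yellow set point defaulting to 50%). The sketch below is one plausible reading of that logic, not the workbook's actual code; the function name, status labels, and the averaging rule are assumptions:

```python
GREEN_SET_POINT = 100   # all tasks at the level must be complete
YELLOW_SET_POINT = 50   # adjustable in the workbook; 50% is the default shown

def level_status(percent_complete: list) -> str:
    """Classify one readiness level from its task %-complete entries
    (hypothetical mapping of the four status descriptions to colors)."""
    avg = sum(percent_complete) / len(percent_complete)
    if all(p >= GREEN_SET_POINT for p in percent_complete):
        return "green"    # level achieved
    if avg >= YELLOW_SET_POINT:
        return "yellow"   # many tasks done; achievement may be arguable
    if any(p > 0 for p in percent_complete):
        return "red"      # some work done, level not achieved
    return "white"        # no documented accomplishment

# TRL 3 example entries from the Primary Mirror sheet (average 37%):
print(level_status([100, 70, 0, 90, 90, 10, 10, 0, 0, 0]))  # → red
```

Raising or lowering YELLOW_SET_POINT shifts only the red/yellow boundary; green always requires every task at 100%, matching the note that "Display of Green requires all questions at that level to be checked."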
This calculator is a further modification of the calculator developed by Mr. William Nolte of the Air Force Research Laboratory and modified by Mr. James W. Bilbro when he was Assistant Director for Technology/Chief Technologist, George C. Marshall Space Flight Center (MSFC), AL. This calculator integrates the AD2 Calculator with the aforementioned TRL Calculator into a single entity. The original AD2 Calculator was developed by Mr. Bilbro while at MSFC and was coded by Mr. John Cole, also of MSFC. Questions regarding this version should be directed to Mr. James W. Bilbro, JB Consulting International, 4017 Panorama Drive SE, Huntsville, AL 35801, Telephone (256) 655-6273, E-mail jbci@bellsouth.net.
The original calculator was designed and developed by Mr. William L. Nolte, AFRL/SNOX, 2241 Avionics Circle, WPAFB OH 45433-7302, Telephone (937) 255-4202 Ext. 4040, E-mail William.Nolte@wpafb.af.mil.
Revision History

Version 1.0 — Beta release. First numbered version. Configuration baseline. Version 1.0 introduces version numbering as a form of configuration management. In this version, we have "swatted" some bugs discovered in pre-release testing. This hardware calculator is the companion to Software TRL Calculator v1.0.

Version 1.1 (15-Aug-02) — Color codes for technical and program questions added. Color code sheet added. Added hidden table to compute numeric values for TRL and PRT. "Yellow" and "Green" levels achieved are displayed in the calculator's TRL and PRT title bars. Added DT&E and OT&E to AFRL Commentary sheet.

Version 1.11 — The spreadsheet computes each section TRL/PRT separately. You can weight each section according to relative importance. The red, yellow, and green graphical display is still unweighted, but the overall TRL/PRT weighted geometric mean is also given. Updated TRL definitions and descriptions IAW 5000.2-R dated April 5, 2002.

Later versions — Separate sheet for TRL and PRT (Cal Verity). Fix weighted TRL jump to 9 when 0 selected (Cal Verity). Standard format for hardware and software, TRL and PRT, using both % Complete and Weighted TRL/PRT features. Adjustable point where % complete adds to count. Alternate versions with and without PRT created. Separates documentation from calculators to eliminate repetition. Added some background material to documentation. Major revision: beta test release of items included in version 2.1. Major revision: MRL added; aggregate overall TRL computed; allows selection of Hardware, Software or Both; allows selection of TRL, PRL, and/or MRL; allows for adding or deleting questions from TRL computation; allows user to hide unused questions and blank rows; lets user assume completion of TRL 1 through 3; top level view added; questions organized by TRL, not category; display of Green requires all questions at that level to be checked; weighted TRL no longer computed; summary sheet added to display results; wrote MRL Definitions and MRL Background sheet; approved for public release. Rearranged questions and reorganized display to meet needs of NASA (see release notes for NASA Variant for details); added capability to store data as a function of WBS; rearranged and further modified questions and displays; added capability to store and recall project data.

21-Mar-08 — Integrated TRL and AD2 Calculators into a single workbook. Added additional blank questions for the TRL calculator and a TRL AD2 Project Status Calculator. Created stand-alone TRL and AD2 Calculators from the integrated version.
TRL Calculator
Title bars: Technology Readiness Level Achieved (Hardware); Technology Readiness Level Achieved (Software); Manufacturing Readiness Level Achieved (Hardware). Level scale: 1 through 9. Controls: Hide blank rows; Change yellow set point below. Green set point: 100%. Yellow set point: 50%.
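The green and yellow set points drive the calculator's color display. One plausible reading of the rules stated in the sheet and release notes ("Display of Green requires all questions at that level to be checked"; the yellow set point is adjustable) is sketched below. The function names and the yellow rule are assumptions, not the workbook's exact formulas:

```python
def level_color(pct_complete, green_set=1.00, yellow_set=0.50):
    """Color one readiness level from its questions' % Complete values.

    Green requires every question at or above the green set point
    (default 100%); yellow requires every question at or above the
    adjustable yellow set point (default 50%); otherwise red.
    """
    if all(p >= green_set for p in pct_complete):
        return "green"
    if all(p >= yellow_set for p in pct_complete):
        return "yellow"
    return "red"

def trl_achieved(levels):
    """Highest level with an unbroken run of green from level 1 up."""
    achieved = 0
    for lvl, answers in enumerate(levels, start=1):
        if level_color(answers) != "green":
            break
        achieved = lvl
    return achieved

# Levels 1 and 2 fully complete; one level-3 question at 60%.
print(trl_achieved([[1.0, 1.0], [1.0, 1.0], [0.6, 1.0]]))  # → 2
```

Requiring an unbroken run of green levels mirrors the sheet's option to "assume completion of TRL 1 through 3" for mature starting points.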
Do you want to include Hardware TRL?
Do you want to include Software TRL?
Do you want to include Manufacturing MRL?
HW/SW/Mfg
% Complete
Level 1 Comments:
Level 2 Comments:
Level 3 Comments:
Level 4 Comments:
Key parameter performance metrics established for integrated component laboratory tests?
Laboratory test support equipment and computing environment completed for integrated component testing?
System/subsystem/component level coding completed?
Integrated component tests completed?
Analysis of test results completed verifying performance relative to predictions?
Preliminary system requirements defined for end users' application?
Critical test environments and performance predictions defined relative to the preliminary definition of the operating environment?
Relevant test environment defined?
Integrated component performance results verifying analytical predictions and definition of relevant operational environment documented?
Integrated component tests completed for reused code?
Integrated component performance results verifying analytical predictions and definition of relevant operational environment documented?
Manufacturing requirements (including testing) finalized?
Machine/Tooling requirements identified?
Has training/certification been identified for all skills required (particularly new skills)?
Material requirements identified?
Producibility assessment initiated?
Machinery/tooling modifications breadboarded?
Key manufacturing processes identified?
New machinery/tooling breadboarded?
Metrology requirements identified?
Key metrology components breadboarded?
Key analytical tool requirements identified?
Key analytical tools breadboarded?
Key manufacturing processes assessed in laboratory?
Mitigation strategies identified to address manufacturability / producibility shortfalls?
All Manufacturing Processes Identified?
Level 5 Comments:
Level 6 Comments:
Level 7 Comments:
Successful flight demonstration documented?
Hardware interfaces baselined?
Design addresses all critical scaling issues?
Modeling and simulation used to predict performance in the operational environment?
Facilities, computing environment available to support prototype and qualification testing of operational software?
Fully integrated software model or scaled prototype system coded that adequately addresses all critical scaling issues and component and hardware interfaces?
All software testing/V&V specified in software development plan completed and results documented?
All performance specifications verified by test or analysis?
Fully integrated prototype software successfully demonstrated in operational environment?
All final acceptance testing plans/procedures/criteria have been baselined?
Intermediate draft of required software documentation completed?
Successful operational demonstration documented?
Materials, processes, methods, and design techniques baselined?
Most maintainability, reliability, and supportability data available?
Manufacturing processes baselined?
Production planning complete?
Process tooling and inspection / test equipment demonstrated?
Machines and tooling demonstrated in pre-production environment?
Integration facilities ready & available?
Prototype system built on "soft" tooling?
Prototype improves to pre-production quality?
Ready for Low Rate Initial Production (LRIP)?
Level 8 Comments:
Level 9 Comments:
Software fully integrated and operated in the operational environment?
Software performance analyzed, verified and documented as meeting operational requirements?
All required software documentation completed? (Minimum NASA software documentation requirements are described in NPR 7150.2. Center/project may elaborate, tailor or augment.)
Sustaining software engineering support in place?
Software operations, maintenance and retirement procedures finalized and documented?
Design stable?
Production operating at desired levels?
Planned product improvement program in place for future acquisitions?
All manufacturing processes controlled to appropriate quality level?
*Development Terminology
Proof of Concept: (TRL 3) Analytical and experimental demonstration of hardware/software concepts that may or may not be incorporated into subsequent development and/or operational units.
Breadboard: (TRL 4) A low fidelity unit that demonstrates function only, without respect to form or fit in the case of hardware, or platform in the case of software. It often uses commercial and/or ad hoc components and is not intended to provide definitive information regarding operational performance.
Developmental Model / Developmental Test Model: (TRL 4) Any of a series of units built to evaluate various aspects of form, fit, function or any combination thereof. In general these units may have some high fidelity aspects but overall will be in the breadboard category.
Brassboard: (TRL 5-TRL 6) A mid-fidelity functional unit that typically tries to make use of as much operational hardware/software as possible and begins to address scaling issues associated with the operational system. It does not have the engineering pedigree in all aspects, but is structured to be able to operate in simulated operational environments in order to assess performance of critical functions.
Mass Model: (TRL 5) Nonfunctional hardware that demonstrates form and/or fit for use in interface testing, handling, and modal anchoring.
Subscale Model: (TRL 5-TRL 7) Hardware demonstrated in subscale to reduce cost and address critical aspects of the final system. If done at a scale that is adequate to address final system performance issues, it may become the prototype.
Proof Model: (TRL 6) Hardware built for functional validation up to the breaking point, usually associated with fluid system over-pressure, vibration, force loads, environmental extremes, and other mechanical stresses.
Prototype Unit: (TRL 6-TRL 7) The prototype unit demonstrates form (shape and interfaces), fit (must be at a scale to adequately address critical full size issues), and function (full performance capability) of the final hardware. It can be considered the first Engineering Model. It does not have the engineering pedigree or data to support its use in environments outside of a controlled laboratory environment, except for instances where a specific environment is required to enable the functional operation, including in-space. It is, to the maximum extent possible, identical to flight hardware/software and is built to test the …
Engineering Model: (TRL 6-TRL 8) A full scale high-fidelity unit that demonstrates critical aspects of the engineering processes involved in the development of the operational unit. It demonstrates function, form, fit or any combination thereof at a scale that is deemed to be representative of the final product operating in its operational environment. Engineering test units are intended to closely resemble the final product (hardware/software) to the maximum extent possible and are built and tested so as to establish confidence that the design will function in the expected environments. In some cases, the engineering unit will become the protoflight or final product, assuming proper traceability has been exercised over the …
Flight Qualification Unit: (TRL 8) Flight hardware that is tested to the levels that demonstrate the desired margins, particularly for exposing fatigue stress, typically 20-30%. Sometimes this means testing to failure. This unit is never flown.
Key overtest levels are usually +6 dB above maximum expected for 3 minutes in all axes for shock, acoustic, and vibration; thermal vacuum 10 °C beyond acceptance for 6 cycles; and 1.25 times static load for unmanned systems.
Protoflight Unit: (TRL 8-TRL 9) Hardware built for the flight mission that includes the lessons learned from the Engineering Model but where no Qualification model was built, to reduce cost. It is however tested to enhanced environmental acceptance levels, and it becomes the mission flight article. A higher risk tolerance is accepted as a tradeoff. Key protoflight overtest levels are usually +3 dB for shock, vibration, and acoustic, and 5 °C beyond acceptance levels.
Flight Qualified Unit: (TRL 8-TRL 9) Actual flight hardware/software that has been through acceptance testing. Acceptance test levels are designed to demonstrate flight-worthiness and to screen for infant failures without degrading performance. The levels are typically less than anticipated levels.
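The dB overtest margins quoted above are multiplicative factors on the test levels. For power-like quantities such as random-vibration spectral density, a dB margin scales the level by 10^(dB/10); for amplitude-like quantities, by 10^(dB/20). Which convention a given specification intends is an assumption to verify against the governing test standard; a small converter for illustration:

```python
def db_to_factor(db, power_quantity=True):
    """Convert a dB test margin to a multiplicative level factor.

    Power-like quantities (e.g., acceleration spectral density in
    g^2/Hz) scale as 10**(dB/10); amplitude-like quantities scale as
    10**(dB/20). The choice of convention here is an assumption;
    check the governing test standard for a real program.
    """
    return 10 ** (db / (10 if power_quantity else 20))

print(round(db_to_factor(6.0), 2))                        # → 3.98
print(round(db_to_factor(6.0, power_quantity=False), 2))  # → 2.0
```

So a +6 dB qualification margin roughly quadruples a spectral-density level but only doubles an amplitude level, which is why the convention matters.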
Flight Proven: (TRL 9) Hardware/software that is identical to hardware/software that has been successfully operated in a space mission.
Environmental Definitions:
Laboratory Environment: An environment that does not address in any manner the environment to be encountered by the system, subsystem or component (hardware or software) during its intended operation. Tests in a laboratory environment are solely for the purpose of demonstrating the underlying principles of technical performance (functions) without respect to the impact of environment.
Relevant Environment: Not all systems, subsystems and/or components need to be operated in the operational environment in order to satisfactorily address performance margin requirements. Consequently, the relevant environment is the specific subset of the operational environment that is required to demonstrate critical at-risk aspects of the final product performance in an operational environment.
Operational Environment: The environment in which the final product will be operated. In the case of spaceflight hardware/software, it is space. In the case of ground-based or airborne systems that are not directed toward space flight, it will be the environments defined by the scope of operations. For software, the environment will be defined by the operational platform and software operating system.
Additional Definitions:
Mission Configuration: The final architecture/system design of the product that will be used in the operational environment. If the product is a subsystem/component, then it is embedded in the actual system in the actual configuration used in operation.
Verification: Demonstration by test that a device meets its functional and environmental requirements (i.e., was it built right?).
Validation: Determination that a device was built in accordance with the totality of its prescribed requirements by any appropriate method. Commonly uses a verification matrix of requirement and method of verification (i.e., did I build the right thing?).
Part: Single piece or joined pieces, impaired or destroyed if disassembled, e.g., a resistor.
Subassembly or Component: Two or more parts capable of disassembly or replacement, e.g., a populated printed circuit board.
Assembly or Unit: A complete and separate lowest-level functional item, e.g., a valve.
Subsystem: Assembly of functionally related and interconnected units, e.g., electrical power subsystem.
System: The composite equipment, methods, and facilities to perform an operational role.
Segment: The constellation of systems, segments, software, ground support, and other attributes required for an integrated constellation of systems.
Hardware: Practical application is identified but is speculative; no experimental proof or detailed analysis is available to support the conjecture.
Software: Invention begins; practical application is identified but is speculative; no experimental proof or detailed analysis is available to support the conjecture. Basic properties of algorithms, representations & concepts defined. Basic principles coded. Experiments performed with synthetic data.
Analytical and/or experimental critical function or characteristic proof-of-concept.
Hardware: Analytical studies place the technology in an appropriate context, and laboratory demonstrations, modeling and simulation validate analytical prediction.
Software: Development of limited functionality to validate critical properties and predictions using non-integrated software components.
A low fidelity system/component breadboard is built and operated to demonstrate basic functionality, and critical test environments and associated performance predictions are defined relative to the final operating environment.
Key, functionally critical, software components are integrated, and functionally validated, to establish interoperability and begin architecture development. Relevant Environments defined and performance in this environment predicted.
Component and/or breadboard validation in relevant environment.
Hardware: A mid-level fidelity system/component brassboard is built and operated to demonstrate overall performance in a simulated operational environment with realistic support elements that demonstrates overall performance in critical areas. Performance predictions are made for subsequent development phases.
Software: End-to-end software elements implemented and interfaced with existing systems/simulations conforming to target environment. End-to-end software system, tested in relevant environment, meeting predicted performance. Operational environment performance predicted. Prototype implementations developed.
System/subsystem model or prototype demonstration in a relevant environment.
Hardware: A high-fidelity system/component prototype that adequately addresses all critical scaling issues is built and operated in a relevant environment to demonstrate operations under critical environmental conditions.
Prototype implementations of the software demonstrated on full-scale realistic problems. Partially integrated with existing hardware/software systems. Limited documentation available. Engineering feasibility fully demonstrated.
Software: Prototype software exists having all key functionality available for demonstration and test. Well integrated with operational hardware/software systems, demonstrating operational feasibility. Most software bugs removed. Limited documentation available.
Hardware: A high fidelity engineering unit that adequately addresses all critical scaling issues is built and operated in a relevant environment to demonstrate performance in the actual operational environment and platform (ground, airborne or space).
Actual system completed and flight qualified through test and demonstration
The final product in its final configuration is successfully demonstrated through test and analysis for its intended operational environment and platform (ground, airborne or space).
All software has been thoroughly debugged and fully integrated with all operational hardware and software systems. All user documentation, training documentation, and maintenance documentation completed. All functionality successfully demonstrated in simulated operational scenarios. V&V completed.
Actual system flight proven through successful mission operations.
Hardware: The final product is successfully operated in an actual mission.
Software: All software has been thoroughly debugged and fully integrated with all operational hardware/software systems. All documentation has been completed. Sustaining software engineering support is in place. System has been successfully operated in the operational environment.
No corresponding MRL
No corresponding MRL
Manufacturing Concepts Identified. Assessment of current manufacturability concepts or producibility needs for key breadboard components.
Laboratory Manufacturing Process Demonstration. Key processes identified and assessed in lab. Mitigation strategies identified to address manufacturing/producibility shortfalls. Cost as an independent variable (CAIV) targets set and initial cost drivers identified.
Documented test performance demonstrating agreement with analytical predictions. Documented definition of relevant environment.
Manufacturing Process Development. Trade studies and lab experiments define key manufacturing processes and sigma levels needed to satisfy CAIV targets. Initial assessment of assembly needs conducted. Process, tooling, inspection, and test equipment in development. Significant engineering and design changes. Quality and reliability levels not yet established. Tooling and machines demonstrated in lab. Physical and functional interfaces have not been completely defined.
Documented test performance demonstrating agreement with analytical predictions. Documented definition of scaling requirements.
Critical Manufacturing Processes Prototyped. Critical manufacturing processes prototyped, targets for improved yield established. Process and tooling mature. Frequent design changes still occur. Investments in machining and tooling identified. Quality and reliability levels identified. Design to cost goals identified.
Documented test performance demonstrating agreement with analytical predictions.
Prototype Manufacturing System. Prototype system built on soft tooling, initial sigma levels established. Ready for low rate initial production (LRIP). Design changes decrease significantly. Process tooling and inspection and test equipment demonstrated in production environment. Manufacturing processes generally well understood. Machines and tooling proven. Materials initially demonstrated in production, and manufacturing process and procedures initially demonstrated. Design to cost goals validated.
Documented test performance demonstrating agreement with analytical predictions.
Manufacturing Process Maturity Demonstration. Manufacturing processes demonstrate acceptable yield and producibility levels for pilot line, LRIP, or similar item production. All design requirements satisfied. Manufacturing process well understood and controlled to 4-sigma or appropriate quality level. Minimal investment in machine and tooling - machines and tooling should have completed demonstration in production environment. All materials are in production and readily available. Cost estimates <125% of cost goals (e.g., design to cost goals met for LRIP).
Manufacturing Processes Proven. Manufacturing line operating at desired initial sigma level. Stable production. Design stable, few or no design changes. All manufacturing processes controlled to six-sigma or appropriate quality level. Affordability issues built into initial production and evolutionary acquisition milestones. Cost estimates <110% cost goals or meet cost goals (e.g., design to cost goals met).
Systematic Assessment of the Program/Project Impacts of Technological Advancement and Insertion Revision A
This is a revision of a white paper originally written during my tenure at the George C. Marshall Space Flight Center
Technology Maturity Assessment Report Template
TBD Project Technology Readiness Assessment Report
Project Manager: A B C D E E1 F G H
===============================================================
Class A - Human Rated Software Systems
Applies to all space flight software subsystems (ground and flight) developed and/or operated by or for NASA to support human activity in space and that interact with NASA human space flight systems. Space flight system design and associated risks to humans are evaluated over the program's life cycle, including design, development, fabrication, processing, maintenance, launch, recovery, and final disposal. Examples of Class A software for human rated space flight include but are not limited to: guidance; navigation and control; life support systems; crew escape; automated rendezvous and docking; failure detection, isolation and recovery; and mission operations.
Class B - Non-Human Space Rated Software Systems
Flight and ground software that must perform reliably in order to accomplish primary mission objectives. Examples of Class B software for non-human (robotic) spaceflight include, but are not limited to, propulsion systems; power systems; guidance navigation and control; fault protection; thermal systems; command and control ground systems; planetary surface operations; hazard prevention; primary instruments; or other subsystems that could cause the loss of science return from multiple instruments.
Class C - Mission Support Software
Flight or ground software that is necessary for the science return from a single (non-critical) instrument or is used to analyze or process mission data, or other software for which a defect could adversely impact attainment of some secondary mission objectives or cause operational problems for which potential work-arounds exist.
Examples of Class C software include, but are not limited to, software that supports prelaunch integration and test, mission data processing and analysis, analysis software used in trend analysis and calibration of flight engineering parameters, primary/major science data collection and distribution systems, major Center facilities, data acquisition and control systems, aeronautic applications, or software employed by network operations and control (which is redundant with systems used at tracking complexes). Class C software must be developed carefully, but validation and verification effort is generally less intensive than for Class B.
Class D - Analysis and Distribution Software
Non-space flight software. Software developed to perform science data collection, storage, and distribution; or perform engineering and hardware data analysis. A defect in Class D software may cause rework but has no direct impact on mission objectives or system safety. Examples of Class D software include, but are not limited to, software tools; analysis tools; and science data collection and distribution systems.
Class E - Development Support Software
Non-space flight software. Software developed to explore a design concept; or support software or hardware development functions such as requirements management, design, test and integration, configuration management, documentation, or perform science analysis. A defect in …
===========================================================================
If the software to be developed or acquired meets the criteria identified on the left of the table below, then the corresponding software assurance level of effort on the right shall be assigned. If the software meets the criteria of more than one level, the highest level shall be assigned.
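The selection rule in the paragraph above (assign the level of effort matching the met criteria; if several match, take the highest) can be sketched as a small helper. The criterion-to-effort mapping passed in is hypothetical; the real mapping is the X-mark table in the original workbook:

```python
# Order of assurance effort, highest first (from the table's legend).
EFFORT_ORDER = ["Full", "High", "Medium", "Low", "Not Applicable"]

def assigned_effort(met_criteria, criterion_to_effort):
    """Pick the software assurance level of effort per the stated rule.

    `criterion_to_effort` is a hypothetical dict mapping each criterion
    the software meets to a level of effort. If several criteria apply,
    the highest level of effort wins.
    """
    efforts = [criterion_to_effort[c] for c in met_criteria]
    if not efforts:
        return "Not Applicable"
    return min(efforts, key=EFFORT_ORDER.index)

print(assigned_effort(
    ["partial mission failure", ">20 work-years"],
    {"partial mission failure": "High", ">20 work-years": "Medium"},
))  # → High
```

The criterion names here are invented placeholders; in practice each would be one row of the table (safety criticality, resource investment, or dollar impact).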
Software Assurance Level of Effort
NPR 7150.2 Software Class: A, B, C, D, E, E1. Levels of effort: Full, High, Medium, Low, Not Applicable.
Software Safety Criticality. Potential for:
Catastrophic Mission Failure: loss of vehicle, or total inability to meet mission objectives.
Partial Mission Failure: inability to meet one or more mission objectives.
Potential for waste of resource investment:
Greater than 200 work-years on software.
Greater than 100 work-years on software.
Greater than 20 work-years on software.
Greater than 4 work-years on software.
Less than 4 work-years on software.
Potential for impact to equipment, facility, or environment:
Greater than $100M.
Greater than $20M.
Greater than $2M.
Less than $2M.
Project Manager:
===================================================================
Software shall be classified as safety-critical if it meets at least one of the following criteria:
1. Resides in a safety-critical system (as determined by a hazard analysis) AND at least one of the following apply:
a. Causes or contributes to a hazard.
b. Provides control or mitigation for hazards.
c. Controls safety-critical functions.
d. Processes safety-critical commands or data (see Note 1 below).
e. Detects and reports, or takes corrective action, if the system reaches a specific hazardous state.
f. Mitigates damage if a hazard occurs.
g. Resides on the same system (processor) as safety-critical software (see Note 2 below).
2. Processes data or analyzes trends that lead directly to safety decisions (e.g., determining when to turn power off to a wind tunnel to prevent system destruction).
3. Provides full or partial verification or validation of safety-critical systems, including hardware or software subsystems.
Note 1: If data is used to make safety decisions (either by a human or the system), then the data is safety-critical, as is all the software that acquires, processes, and transmits the data. However, data that may provide safety information but is not required for safety or hazard control (such as engineering telemetry) is not safety-critical.
Note 2: Non-safety-critical software residing with safety-critical software is a concern because it may fail in such a way as to disable or impair the functioning of the safety-critical software. Methods to separate the code, such as partitioning, can be used to limit the software defined as safety-critical. If such methods are used, then the isolation method is safety-critical, but the isolated non-critical code is not.
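The classification rule above reduces to a three-way disjunction. A boolean sketch, with all parameter names invented for illustration (the hazard analysis itself, of course, is the real work):

```python
def is_safety_critical(in_safety_critical_system, subcriteria_met,
                       informs_safety_decisions, verifies_safety_critical):
    """Boolean form of the safety-criticality classification rule.

    `in_safety_critical_system`: criterion 1's gating condition, from a
    hazard analysis. `subcriteria_met`: booleans for sub-criteria a-g.
    `informs_safety_decisions`: criterion 2. `verifies_safety_critical`:
    criterion 3. All names are illustrative, not from the source.
    """
    return ((in_safety_critical_system and any(subcriteria_met))
            or informs_safety_decisions
            or verifies_safety_critical)

# In a safety-critical system, with sub-criterion (b) met:
print(is_safety_critical(True, [False, True] + [False] * 5, False, False))
```

Note that criterion 1 is conjunctive: residing in a safety-critical system is not sufficient on its own unless one of sub-criteria a-g also applies.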
Contact Information
James W. Bilbro JB Consulting International 4017 Panorama Drive SE, Huntsville, AL 35801 Telephone: 256-655-6273 Fax: 866-235-8953 E-mail: jbci@bellsouth.net
Current TRL
Project Status
TRL
Actual system flight proven through successful mission operations
Level
Chaos
AD2
Requires new development outside of any existing experience base. No viable approaches exist that can be pursued with any degree of confidence. Basic research in key areas needed before feasible approaches can be defined.
Risk 90+%
Actual system completed and flight qualified through test and demonstration
Requires new development where similarity to existing experience base can be defined only in the broadest sense. Multiple development routes must be pursued.
Requires new development but similarity to existing experience is sufficient to warrant comparison in only a subset of critical areas. Multiple development routes must be pursued.
Requires new development but similarity to existing experience is sufficient to warrant comparison on only a subset of critical areas. Dual development approaches should be pursued in order to achieve a moderate degree of confidence for success. (Desired performance can be achieved in subsequent block upgrades with a high degree of confidence.)
Requires new development but similarity to existing experience is sufficient to warrant comparison in all critical areas. Dual development approaches should be pursued to provide a high degree of confidence for success.
Requires new development but similarity to existing experience is sufficient to warrant comparison across the board. A single development approach can be taken with a high degree of confidence for success.
Requires new development well within the experience base. A single development approach is adequate.
Exists but requires major modifications. A single development approach is adequate.
80%
Unknown Unknowns
70%
50%
Known Unknowns
40%
3: Analytical and/or experimental critical function or characteristic proof-of-concept.
2: Technology concept or application formulated.
Well Understood
Well Understood
Exists with no or only minor modifications being required. A single development approach is adequate.
0%