
Information items to improve Integration Readiness Levels evaluation

Abstract

Systems integration is critical to the engineering of successful complex products and systems. The Technology Readiness Levels (TRL) scale assists in making decisions about the infusion of new technologies into complex systems, but it has difficulty representing the integration between technologies in a system. The Integration Readiness Levels (IRL) scale aims to address this limitation. A good practice for objective technology readiness assessments is to rely on evidence-based documentation. The TRL scale specifies the evidence required for each level and the appropriate types of documents to collect this evidence, while the IRL scale proposes only the necessary evidence for each level. This research proposes a set of information items to document the required evidence for each level of the IRL scale. The results present the main stages of the investigation and the successful test of the proposal in a case study of spacecraft developed in the binational China-Brazil Earth Resources Satellite (CBERS) program. This article contributes to research on integration readiness assessments, aims to support professionals who carry out evidence-based evaluations and, consequently, may contribute to improving the accuracy of decision making in systems integration when using this scale.

Key words
Aerospace engineering; complex systems; integration readiness levels; systems engineering; system integration; technology readiness levels

INTRODUCTION

Complex products and systems (CoPS) are highly customised, high-cost and engineering-intensive goods (Hobday 1998), commonly produced in unique projects or small batches, which places their production emphasis on design, project management and systems engineering. Systems integration is a central element of the systems engineering development effort (Sage & Lynch 1998).

Systems integration is often assumed to happen in the latter phases of a system development life cycle, but it should be planned in the earlier phases of the project (Sage & Lynch 1998). The technical work of integration must identify, define, design, analyse, select and verify the interfaces of the system (Sage & Lynch 1998).

A spacecraft is an example of a complex system. The China-Brazil Earth Resources Satellite (CBERS) program started an unprecedented technical and scientific partnership between Brazil and China in the space sector. It enabled Brazil to join the select group of countries that can design and manufacture remote sensing cameras for space applications (INPE 2019). The images generated by these satellites are used in applications such as the control of deforestation and burning in the Brazilian Legal Amazon, the monitoring of water resources, agricultural areas, urban growth and land occupation, education, and many others (INPE 2019). In this program, five satellites have been successfully launched and operated: CBERS-01 in 1999, CBERS-02 in 2003, CBERS-02B in 2007, CBERS-04 in 2014 and CBERS-04A in 2019; CBERS-03 had an unsuccessful launch in 2013 (INPE 2019). Technological changes implemented in successive satellites, such as new remote sensing cameras and new subsystems, required systems engineering decisions about the introduction of new technologies and their integration into the system architecture. Figure 1 illustrates the CBERS-04 satellite.

Figure 1
CBERS-4 satellite illustration (INPE 2019).

According to Mankins (2009), Technology Readiness Level (TRL) is an interdisciplinary scale that supports the evaluation and communication of the development of new technologies. TRL aids decision making in the development of complex systems regarding the introduction of new technologies (Crawley et al. 2016, Mankins 2009). When a technology is not ready, its introduction into a developing system can lead to deviations in project performance, budget and schedule (GAO 1999, Mankins 2009). Since its inception in the 1970s at the National Aeronautics and Space Administration (NASA) (Mankins 2009), the TRL scale has been adopted by many other organisations around the world, not only in the space sector but also in defence, energy and other complex systems industries (Tomaschek et al. 2016). Despite its decades of use and an increasing number of practitioners, the TRL scale presents limitations (Olechowski et al. 2015), mainly related to the representation of technology integration in a system, the maturity of interfaces, modifications in the system and the overall maturity of the system (Tomaschek et al. 2016).

Indeed, the integration of technologies into a system deserves much attention in a system development project, because research shows that many failures of space systems originated in their integration (Sauser et al. 2009). Reuse of hardware and software is a trend to reduce space mission costs (Wertz 2011). Organisations dealing with complex systems may develop their technologies internally or outsource their development (Chagas Junior et al. 2017), which requires appropriate strategies for integrating these technologies.

In order to address these limitations, researchers at the Stevens Institute of Technology proposed two new scales (Sauser et al. 2006): Integration Readiness Levels (IRL) and System Readiness Levels (SRL). The IRL scale aims to represent the integration readiness between technologies in a system through a nine-level scale, while the SRL seeks to represent overall system readiness by mathematically combining the TRL and IRL assessments. Both scales have been researched and practised in the aerospace, defence, and oil and gas sectors. Their identified applications comprise systems engineering (Ramirez-Marquez & Sauser 2009, Sauser et al. 2008), project management (Magnaye et al. 2014) and technology planning (Baiocco et al. 2015) activities.

Kujawski (2013) criticised the SRL scale for taking the product of two ordinal numbers. The System Readiness Assessment (SRA) Engineering Handbook provided clarifications on this mathematical operation (ISRACOI 2019). Yasseri (2013, 2016) proposed an alternative definition and calculation of the SRL and responded to Kujawski's (2013) objections.
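
To make the operation under discussion concrete, the following is a minimal sketch of the matrix-based SRL computation commonly attributed to Sauser et al.; the technology values are invented for illustration, and the normalisation and averaging details are assumptions that vary between publications, so the ISRACOI handbook remains the reference for the sanctioned procedure.

```python
import numpy as np

# Hypothetical TRLs of three technologies on the 1-9 ordinal scale.
trl = np.array([7, 5, 8])

# Symmetric IRL matrix: irl[i, j] is the integration readiness between
# technologies i and j; the diagonal is set to 9 by convention.
irl = np.array([
    [9, 4, 6],
    [4, 9, 3],
    [6, 3, 9],
])

# Normalise both scales to [0, 1] before combining them.
trl_n = trl / 9.0
irl_n = irl / 9.0

# Per-technology SRL: matrix product divided by the number of
# integrations (including the self-integration) of each technology.
srl_per_technology = (irl_n @ trl_n) / np.count_nonzero(irl, axis=1)

# Composite SRL: the mean of the per-technology values, in [0, 1].
composite_srl = srl_per_technology.mean()
print(srl_per_technology, composite_srl)
```

Because both input scales are ordinal, the resulting numbers should be read as indicators rather than ratio measurements, which is precisely the point raised by Kujawski (2013).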

The IRL scale has evolved (Jesus & Chagas Junior 2018) from its original version, which focused on data integration, towards covering generic interface types, based on the TRL scale (Austin & York 2015). Table I shows the current version of the IRL scale (Austin & York 2015). The lower levels reflect the definition of the system architecture and its interfaces. The intermediate levels can represent the activities to verify the functionality of the interfaces. Finally, the higher levels may represent the activities to verify the actual performance of the integrated system.

Table I
Integration Readiness Levels scale (Austin & York 2015).

Technology Readiness Assessment (TRA) is a process to determine the readiness of a new technology and should be conducted many times during the technology and system life cycle (Mankins 2009). In the past, TRL assessments were often informal and inconsistent (Frerking & Beauchamp 2016). Nolte et al. (2003) developed questionnaires to support the TRL assessment. Even when quantitative methods were used, the evaluation could be subjective (Cornford & Sarsfield 2004). A rigorous TRA process requires clear evidence that a technology has reached the intended level of readiness, documenting evidence such as images and test results (Mankins 2009). Despite its decades of use, there is an opportunity to make the TRL scale more precise and objective (Tomaschek et al. 2016).
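
As an illustration only, the sketch below assumes a simplified checklist mechanism in the spirit of the Nolte et al. (2003) calculator: a level is credited when enough of its supporting questions are answered positively and every lower level has already been credited. The questions, grouping and the 100% threshold are hypothetical and do not reproduce the content of the actual tool.

```python
# Illustrative checklist-style readiness evaluation; the questions and
# the threshold are hypothetical, not those of the Nolte et al. calculator.

def assess_level(answers_per_level, threshold=1.0):
    """Return the highest level whose checklist is satisfied, requiring
    every lower level to be satisfied as well."""
    achieved = 0
    for level in sorted(answers_per_level):
        answers = answers_per_level[level]
        completion = sum(answers) / len(answers) if answers else 0.0
        if completion >= threshold:
            achieved = level
        else:
            break
    return achieved

# Example: evidence fully documented up to level 4, partial at level 5.
answers = {1: [True], 2: [True, True], 3: [True],
           4: [True, True], 5: [True, False]}
print(assess_level(answers))  # -> 4
```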

Defining the necessary evidence and supporting information for each level of readiness is essential for an appropriate methodology for assessing technological readiness (GAO 2016). ISO 16290 (2013) defines TRLs and their evaluation criteria, providing a summary table of the TRL scale with three columns. The first column describes each level, the second column presents the milestone achieved, and the third column indicates what kind of work should be documented. The documentation proposed in this standard covers three assessment areas (ECSS 2017): element definition, performance requirements, and verification and validation (V&V) status. Thus, a set of documents such as the project description, the definition of performance requirements, the test plan and test reports should support the TRL assessment.

The terms verification and validation are often used with very different meanings (Sellers et al. 2009b). Sellers et al. (2009a) recapitulated NASA definitions, which were subsequently updated: verification is “proof of compliance with specifications”, and validation is “the process of showing proof that the product accomplishes the intended purpose based on stakeholder expectations and the Concept of Operations” (NASA 2017). Verification applies at various hierarchical levels of the system, from component to system (Sellers et al. 2009a), and can be performed at different stages of the product life cycle (NASA 2017). It can employ a variety of methods, including test, demonstration, analysis, simulation and modelling, or inspection, while validation typically focuses on the highest level of the system and uses test and demonstration (Sellers et al. 2009a). Validation refers to the concept of operations, and its testing is performed under realistic or simulated conditions on final products to determine the effectiveness and suitability of the product for use in mission operations by typical users (NASA 2017). Validation can also be performed at each stage of development using models, not just on delivery using final products (NASA 2017).

As shown by Durand-Carrier & Loureiro (2013) to illustrate the content of ISO 16290:2013, the following definitions apply to TRL 5. The first column defines TRL 5 as component and/or breadboard critical function verification in a relevant environment. In the second column, the milestone is achieved when the critical functions of the element are identified and the associated relevant environment is defined, and breadboards, not at full scale, are built to verify the performance through testing in the relevant environment, subject to scaling effects. Finally, the third column presents the work whose accomplishment must be documented, listed below and sketched schematically after the list:

  • Preliminary definition of performance requirements and the relevant environment;

  • Identification and analysis of the element's critical functions;

  • Preliminary design of the element, supported by appropriate models for the critical functions verification;

  • Critical function test plan. Analysis of scaling effects;

  • Breadboard definition for the critical function verification;

  • Breadboard test reports.
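
Purely to show how the three-column structure above can be captured for record keeping, the sketch below encodes the TRL 5 entry as data; the field names are descriptive labels chosen here, not identifiers defined by ISO 16290:2013.

```python
# Illustrative encoding of the ISO 16290:2013 entry for TRL 5; the key
# names are labels chosen for this sketch, not terms from the standard.
trl5_entry = {
    "level": 5,
    "definition": ("Component and/or breadboard critical function "
                   "verification in a relevant environment"),
    "milestone": ("Critical functions of the element identified and the "
                  "associated relevant environment defined; breadboards, "
                  "not at full scale, built and tested in the relevant "
                  "environment, subject to scaling effects"),
    "documented_work": [
        "Preliminary definition of performance requirements and the relevant environment",
        "Identification and analysis of the element's critical functions",
        "Preliminary design of the element, supported by appropriate models",
        "Critical function test plan and analysis of scaling effects",
        "Breadboard definition for the critical function verification",
        "Breadboard test reports",
    ],
}

# A simple completeness check over the documented work items.
collected = {"Breadboard test reports"}
missing = [item for item in trl5_entry["documented_work"] if item not in collected]
print(f"{len(missing)} of {len(trl5_entry['documented_work'])} items still to be documented")
```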

Regarding the evaluation of the IRL scale, Sauser et al. (2010) proposed decision criteria to support the assessment, through a research methodology based on document analysis and interviews. The document analysis relied on standards, articles and other documents related to systems engineering and procurement processes. Interviews and discussions with experts in systems engineering, project management and procurement management were used to evaluate the most relevant decision criteria for each IRL. The most pertinent decision criteria were incorporated into the evidence description list (Austin & York 2015) shown in Table I.

The structure of the IRL scale could be improved (Jesus & Chagas Junior 2017) based on how the TRL structure is established in ISO 16290:2013, which defines the evidence required for each level of the scale and the appropriate types of documents to collect this evidence. The IRL scale proposes a list of required evidence for each level, as shown in the last column of Table I, but it does not suggest a set of documents to guide the assessment record.

Although systems engineering practices and their associated documentation vary between organisations, there are international standardisation efforts aimed at serving a wide range of users, creating standards that are sufficiently universal without compromising their practical usefulness.

The international standard ISO/IEC/IEEE 15288:2015 establishes a basis of process descriptions for the life cycle of a system. Instead of describing a general life cycle, this standard allows practitioners to combine the processes into the life cycle that is most appropriate for their organisation, considering its particularities. It presents the purpose, outcomes, examples of information items and other prescriptions for each of the defined system life cycle processes.

The international standard ISO/IEC/IEEE 15289:2017 specifies the purpose and content of information items (documentation) for the life cycle of a system, according to each life cycle process. According to this standard, an information item is a set of information that is produced for human use and is separately identifiable.

Using both standards, ISO/IEC/IEEE 15288:2015 and ISO/IEC/IEEE 15289:2017, one can derive a list of information items to document specific systems engineering activities of interest.

This research aims to propose a set of information items, based on international standards, to complement the definition of the IRL scale. The practical objective is to help practitioners to collect and document the necessary evidence, thus contributing to a more objective assessment.

MATERIALS AND METHODS

The research methodology consisted of four stages. The first stage was to identify the system life cycle processes that best represent the work performed in each IRL, using ISO/IEC/IEEE 15288:2015. The second stage was to select the information items from ISO/IEC/IEEE 15289:2017 that correspond to the identified life cycle processes and best represent the necessary evidence for each IRL. In the third stage, these results were combined to propose a set of information items to collect the necessary evidence for each IRL.

The last stage was to test the proposed set with empirical data from the CBERS remote sensing satellite program.

This article is based on the research developed by the authors at INPE from 2017 (Jesus & Chagas Junior 2017) to 2019 (Jesus 2019).

RESULTS

The first step in the preparation of the proposed list was to analyse which life cycle processes best represent the work performed in each IRL, by comparing the definitions of each level of the IRL scale with the purposes and expected outcomes of the life cycle processes defined by ISO/IEC/IEEE 15288. Table II presents the results of this step. Correlations marked with “X” indicate that the expected outcomes of the life cycle process correspond to the essential work performed in the IRL. Correlations marked with “O”, as explained later, indicate that the corresponding life cycle process produces results that serve as supporting information items for that IRL.

Table II
System life cycle processes selected for each Integration Readiness Level.

Regarding the results in Table II, the life cycle process selected for IRL 1 aims to define stakeholder needs and requirements, producing the concept of operations document (Blanchard & Fabrycky 2006), which expresses high-level integration concepts. For IRL 2, the selected life cycle process produces the system requirements (Blanchard & Fabrycky 2006) and the system description containing its primary interfaces. The architecture definition process represents the work of IRL 3 and establishes the system architecture, which identifies and defines the initial description of the interfaces. These initial interface descriptions may be refined in the design definition process.

Moreover, IRLs 4 to 7 present milestones achieved in design evaluation (Blanchard & Fabrycky 2006, NASA 2017) related to the development of interfaces, which comprises verification and validation activities. The verification process was chosen for this IRL range because it precedes the validation process and follows the logic by which ISO 16290 uses verification to track TRL achievements. Although these IRLs represent tests under different conditions, they can share the same set of information items related to verification activities, such as test procedures and reports.

Furthermore, the integration process relates to IRL 8, aiming to assemble and integrate the realised system. Finally, the operation process covers using the realised system and tracking its performance, matching the IRL 9 achievements.
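
As a reading aid only, the correspondence described in the preceding paragraphs can be written down as a simple lookup, sketched below in Python; Table II remains the authoritative mapping, and the supporting (“O”) correlations are omitted.

```python
# Sketch of the IRL-to-life-cycle-process correspondence described in the
# text; process names follow ISO/IEC/IEEE 15288:2015.
IRL_TO_PROCESS = {
    1: "Stakeholder needs and requirements definition",
    2: "System requirements definition",
    3: "Architecture definition",
    4: "Verification",
    5: "Verification",
    6: "Verification",
    7: "Verification",
    8: "Integration",
    9: "Operation",
}

def process_for(irl_level):
    """Return the essential life cycle process for a given IRL."""
    return IRL_TO_PROCESS[irl_level]

print(process_for(6))  # -> "Verification"
```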

Using the system life cycle processes selected from ISO/IEC/IEEE 15288:2015, it was possible to find in ISO/IEC/IEEE 15289:2017 a group of information items recommended to document the outputs of each of these life cycle processes, as well as a complete definition of each information item. This group was then filtered to select the information items most relevant for collecting the necessary evidence for the IRL. Table III shows these results.

Table III
Information items selected for each system life cycle process.

Finally, taking into account the system life cycle processes selected for each Integration Readiness Level (Table II) and the information items chosen for each life cycle process (Table III), a set of information items was proposed to document the required evidence for each level of the IRL scale. Table IV presents these results; its second column can be read as a supplementary column to the definition of the IRL scale in Table I.

Table IV
The proposed set of information items for each Integration Readiness Level.

Based on the patterns of required documentation for the TRL scale found in ISO 16290:2013, the set of information items for the IRL scale provides not only the items to collect the essential evidence for each IRL, as classified in Table II, but also supporting items to create a complete set of documents. For example, the set of items to document IRLs 4 to 9 includes the description of the system architecture and the description of the interfaces as a means of recording the architecture and interfaces that were used in the IRL assessment. Similarly, the verification procedure supplements the verification report, and the assembly procedure relates to the integration and test report. Indeed, ISO/IEC/IEEE 15288:2015 adds to the definition of verification that the characteristics to be verified comprise the product, the description of its design and the associated requirements. Moreover, the verification requirements shall be included in the verification report, per ISO/IEC/IEEE 15289:2017.

The information items proposed for IRL 8, especially the integration and test report, could be used as evidence for system commissioning. A system-level validation process could use the information items for IRL 8 and IRL 9. If the organisation is interested in using the validation process to assess IRLs 4 to 7, the validation procedure and validation report information items shall be used.

The set of information items was designed to be used in addition to the current IRL scale. ISO/IEC/IEEE 15289:2017 provides complete definitions of the information items and can support professionals in applying them. The content to be included in these documents shall be customised for each IRL using the ‘Evidence description’ column of the IRL definitions in Table I.
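
Under the assumption that such a set is handled as a simple checklist, the sketch below illustrates how an assessor might record which of the proposed information items have been collected for a given level. The item lists are an approximate reconstruction from the prose in this section, not a verbatim copy of Table IV, and ISO/IEC/IEEE 15289:2017 remains the reference for each item's content.

```python
# Illustrative, partial reconstruction of the proposed evidence set;
# see Table IV for the authoritative list.
EVIDENCE_ITEMS = {
    1: ["Concept of operations"],
    2: ["System requirements", "System description"],
    3: ["System architecture description", "Interface description"],
    **{level: ["System architecture description", "Interface description",
               "Verification procedure", "Verification report"]
       for level in (4, 5, 6, 7)},
    8: ["System architecture description", "Interface description",
        "Assembly procedure", "Integration and test report"],
    9: ["System architecture description", "Interface description",
        "Operation report"],
}

def missing_evidence(irl_level, collected):
    """List the proposed information items not yet collected for a level."""
    return [item for item in EVIDENCE_ITEMS[irl_level] if item not in collected]

# Example: an assessment targeting IRL 5 with only the architecture
# documents at hand still lacks the verification items.
print(missing_evidence(5, {"System architecture description",
                           "Interface description"}))
```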

The next stage of the research was to carry out a case study with satellites of the CBERS program to test the set of information items with empirical data.

The first activity in this stage was to analyse the documents that define the CBERS project documentation standard (PMI 2013) in order to find which types of documents were equivalent to the information items from Table III. Equivalent documents were found for all the proposed information items. For some information items, more than one type of CBERS document was needed to cover the full scope of the proposed information item. Table V shows the correspondence between CBERS document types and information items.

Table V
Document types found in the case study.

The second activity consisted of analysing the case study documents and verifying whether their content was equivalent to what is expected for the information items described in ISO/IEC/IEEE 15289:2017. The results were successful.

The authors note that, after carrying out this case study, the results in Table V were used to support IRL assessments, which were mainly interested in evaluating the IRL of CBERS satellite interfaces at milestones between development stages. The documents for IRL 1 and 2 are produced in the early stages of the project, apply to the primary satellite interfaces and were easy to find. The interface description showed which interfaces had reached IRL 3, and the system architecture description was instrumental in understanding the system and mapping its multiple interfaces. The verification reports were decisive in determining IRLs 4 to 7. The integration and test report for IRL 8 is an important project document, showing that system integration is complete. The operation reports for IRL 9 provided evidence for the validation of critical interfaces, such as satellite control, operation and data downlink. Verification procedures and assembly procedures were not essential for these evaluations, but they would be relevant to compose a more accurate and formal assessment.

DISCUSSION

As both activities in the last stage produced successful results, the set of information items proposed in this research was successfully tested in the case study. Regarding practical benefits, the authors found that the proposed set made it easier to find the evidence documents needed for IRL assessments, which is one of the expected benefits.

The results shown in Table II may help practitioners to better understand the relationships between the IRL scale and their systems development processes and timelines. Although the V&V terms are often used with different meanings, the verification process was selected to represent IRLs 4 to 7. Table III shows the selected information items, while Table IV presents the complete set to support IRL evaluation. The latter includes items to collect essential evidence as well as supporting information, in order to create a complete set of documents for each IRL assessment.

This paper contributes to the literature on the IRL scale by improving its evaluation process based on best practices of the TRL scale. The contribution consists of advancing previous studies on the required evidence for the IRL scale and proposing a list of information items that follows the patterns observed in the TRL scale. A good practice for objective technology readiness assessments is to rely on evidence-based documentation (GAO 2016). Furthermore, practitioners have recognised the need to improve the objectivity of technology readiness assessments (Tomaschek et al. 2016).

For practice, this article can support practitioners in performing evidence-based IRL assessments, since the use of the list of information items can make it easier to find the necessary evidence and can make the evaluation more objective by serving as a basis for recording that evidence. Since the proposed documentation list is based on consolidated international systems engineering standards developed by ISO, IEC and IEEE, practitioners may benefit from terms and definitions that have already been widely discussed in order to suit a broad range of organisations. This research can, therefore, contribute to improving the accuracy of decision making in systems integration by making the evaluation of the IRL scale more objective.

CONCLUSION

This article presented the main steps taken to propose a set of information items for the IRL scale, intended to support practitioners in performing more objective assessments. It also presented the successful overall results of the case study.

A limitation is that this study was applied to only one system development project. However, the fact that the study is based on international systems engineering practices should minimise the impact of this limitation, since the life cycle processes and information items used in the proposed list are generally accepted definitions.

A suggestion for future studies is to test the proposed set in other projects and sectors in order to validate the proposal with a broader set of IRL scale practitioners.

REFERENCES

  • AUSTIN MF & YORK DM. 2015. System Readiness Assessment (SRA) an illustrative example. Procedia Comput Sci 44: 486-496.
  • BAIOCCO P, RAMUSAT G, SIRBI A, BOUILLY T, LAVELLE F, CARDONE T, FISCHER H & APPEL S. 2015. System driven technology selection for future European launch systems. Acta Astronaut 107: 301-316.
  • BLANCHARD BS & FABRYCKY WJ. 2006. Systems engineering and analysis, 4th ed., Upper Saddle River, USA: Prentice-Hall, 804 p.
  • CHAGAS JUNIOR MF, LEITE DES & JESUS GT. 2017. “Coupled processes” as dynamic capabilities in systems integration. RAE-Revista Adm Empres 57(3): 245-257.
  • CORNFORD SL & SARSFIELD L. 2004. Quantitative methods for maturing and infusing advanced spacecraft technology. In: 2004 IEEE Aerospace Conference Proceedings, p. 663-681.
  • CRAWLEY E, CAMERON B & SELVA D. 2016. System architecture: strategy and product development for complex systems. Harlow, UK: Pearson, 1st ed., 465 p.
  • DURAND-CARRIER F & LOUREIRO G. 2013. The new ISO standard on TRL. In: Proceedings of the International Astronautical Congress, IAC, Beijing, China: International Astronautical Federation, IAF, 64th ed., p. 7826-7829.
  • ECSS - EUROPEAN COOPERATION FOR SPACE STANDARDIZATION. 2017. ECSS-E-HB-11A - Technology readiness level (TRL) guidelines. Available at: www.ecss.nl. Accessed on: 2017-05-05.
  • FRERKING MA & BEAUCHAMP PM. 2016. JPL technology readiness assessment guideline. In: IEEE Aerospace Conference Proceedings, 10 p.
  • GAO - GOVERNMENT ACCOUNTABILITY OFFICE. 2016. GAO Technology Readiness Assessment Guide: Best Practices for Evaluating the Readiness of Technology for Use in Acquisition Programs and Projects - Exposure Draft, 147 p.
  • GAO - GOVERNMENT ACCOUNTABILITY OFFICE. 1999. Better Management of Technology Development Can Improve Weapon System Outcomes. Available at: http://www.gao.gov/products/GAO/NSIAD-99-162. Accessed on: 2017-05-04.
  • HOBDAY M. 1998. Product complexity, innovation and industrial organisation. Res Policy 26(6): 689-710.
  • IEEE - INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, IEC - INTERNATIONAL ELECTROTECHNICAL COMMISSION & ISO - INTERNATIONAL ORGANIZATION FOR STANDARDIZATION. 2015. ISO/IEC/IEEE 15288-2015 Systems and software engineering: System life cycle processes. Available at: https://ieeexplore.ieee.org/document/7106435. Accessed on: 2017-05-31.
  • IEEE - INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, IEC - INTERNATIONAL ELECTROTECHNICAL COMMISSION & ISO - INTERNATIONAL ORGANIZATION FOR STANDARDIZATION. 2017. ISO/IEC/IEEE 15289-2017 Systems and software engineering: Content of life-cycle information items (documentation). Available at: https://ieeexplore.ieee.org/document/7942151. Accessed on: 2017-08-01.
  • INPE - INSTITUTO NACIONAL DE PESQUISAS ESPACIAIS. 2019. CBERS - China-Brazil Earth Resources Satellite Program. Available at: http://www.cbers.inpe.br/. Accessed on: 2019-05-16.
  • ISO - INTERNATIONAL ORGANIZATION FOR STANDARDIZATION. 2013. ISO 16290:2013 Space systems - Definition of the Technology Readiness Levels (TRLs) and their criteria of assessment. Available at: www.iso.org. Accessed on: 2017-01-05.
  • ISRACOI - INTERNATIONAL SYSTEMS READINESS ASSESSMENT COMMUNITY OF INTEREST. 2019. System Readiness Assessment (SRA) Engineering Handbook V2.1. Available at: http://www.isracoi.org/Library/Home. Accessed on: 2019-12-18.
  • JESUS GT. 2019. Avaliação da maturidade de integração entre elementos tecnológicos a partir de visões de arquitetura de sistemas espaciais. Instituto Nacional de Pesquisas Espaciais - INPE. Available at: http://urlib.net/8JMKD3MGP3W34R/3SNEMA2. Accessed on: 2019-04-23.
  • JESUS GT & CHAGAS JUNIOR MF. 2018. Integration Readiness Levels evaluation and systems architecture: A literature review. Int J Adv Eng Res Sci 5(4): 73-84.
  • JESUS GT & CHAGAS JUNIOR MF. 2017. A importância de práticas de verificação e validação no processo de avaliação de métricas de maturidade. In: WETE - Workshop em Engenharia e Tecnologia Espaciais, São José dos Campos, Brasil: INPE, 8th ed., 8 p.
  • KUJAWSKI E. 2013. Analysis and critique of the system readiness level. IEEE Trans Syst Man Cybern Part A Systems Humans 43(4): 979-987.
  • MAGNAYE R, SAUSER B, PATANAKUL P, NOWICKI D & RANDALL W. 2014. Earned readiness management for scheduling, monitoring and evaluating the development of complex product systems. Int J Proj Manag 32(7): 1246-1259.
  • MANKINS JC. 2009. Technology readiness assessments: A retrospective. Acta Astronaut 65(9-10): 1216-1223.
  • NASA - NATIONAL AERONAUTICS AND SPACE ADMINISTRATION. 2017. NASA Systems Engineering Handbook (SP-2016-6105) Rev 2. Available at: https://www.nasa.gov/feature/release-of-revision-to-the-nasa-systems-engineering-handbook-sp-2016-6105-rev-2. Accessed on: 2017-05-29.
  • NOLTE WL, KENNEDY BM & DZIEGIEL RJJ. 2003. Technology Readiness Level Calculator. In: NDIA Annual Systems Engineering Conference Proceedings, San Diego, USA: NDIA, 6th ed., 16 p.
  • OLECHOWSKI AL, EPPINGER SD & JOGLEKAR NR. 2015. Technology readiness levels at 40: A study of state-of-the-art use, challenges, and opportunities. In: Portland International Conference on Management of Engineering and Technology (PICMET), Portland, USA: IEEE, 28 p.
  • PMI - PROJECT MANAGEMENT INSTITUTE. 2013. Um Guia do Conhecimento em Gerenciamento de Projetos (Guia PMBOK), 5th ed., Newtown Square, USA: PMI, 595 p.
  • RAMIREZ-MARQUEZ JE & SAUSER BJ. 2009. System development planning via system maturity optimization. IEEE Trans Eng Manag 56(3): 533-548.
  • SAGE AP & LYNCH CL. 1998. Systems integration and architecting: An overview of principles, practices, and perspectives. Syst Eng 1(3): 176-227.
  • SAUSER B, GOVE R, FORBES E & RAMIREZ-MARQUEZ JE. 2010. Integration maturity metrics: Development of an integration readiness level. Inf Knowl Syst Manag 9(1): 17-46.
  • SAUSER BJ, RAMIREZ-MARQUEZ JE, HENRY D & DIMARZIO D. 2008. A system maturity index for the systems engineering life cycle. Int J Ind Syst Eng 3(6): 673-691.
  • SAUSER BJ, REILLY RR & SHENHAR AJ. 2009. Why projects fail? How contingency theory can provide new insights - A comparative analysis of NASA’s Mars Climate Orbiter loss. Int J Proj Manag 27(7): 665-679.
  • SAUSER BJ, VERMA D, RAMIREZ-MARQUEZ J & GOVE R. 2006. From TRL to SRL: The concept of systems readiness levels. In: Conference on Systems Engineering Research, Los Angeles, USA: Stevens Institute of Technology, 4th ed., 10 p.
  • SELLERS JJ, DUREN RM, ARCENEAUX WH & GAMBLE JR EB. 2009a. Verification and Validation. In: Larson WJ, Kirkpatrick D, Sellers JJ, Thomas LD & Verma D (Eds), Applied Space Systems Engineering. Hoboken, USA: McGraw-Hill, 1st ed., p. 385-475.
  • SELLERS JJ, LARSON W, KIRKPATRICK D & WHITE J. 2009b. Redefining Space System Verification and Validation. In: US Air Force T&E Days 2009, 12 p.
  • TOMASCHEK K, OLECHOWSKI AL, EPPINGER SD & JOGLEKAR NR. 2016. A Survey of Technology Readiness Level Users. In: INCOSE International Symposium (IS 2016), INCOSE Proceedings, Edinburgh, UK, 26th ed., p. 2101-2117.
  • WERTZ JR. 2011. Reducing space mission cost and schedule. In: Wertz JR et al. (Eds), Space Mission Engineering: The New SMAD. El Segundo, USA: Microcosm Press, 2nd ed., p. 355-396.
  • YASSERI S. 2013. Subsea system readiness level assessment. Underw Technol 31(2): 77-92.
  • YASSERI S. 2016. A measure of subsea systems’ readiness level. Underw Technol 33(4): 215-228.

Publication Dates

  • Publication in this collection
    02 Dec 2020
  • Date of issue
    2020

History

  • Received
    12 June 2019
  • Accepted
    21 Jan 2020