Chapter 12

Technical analysis and test and evaluation

12.1      Proponents of OTS acquisition have highlighted that the selection of developmental products adds complexity and uncertainty to an acquisition project, thereby increasing the risk of problems emerging during the procurement process. The committee observes that this view has not only led to the current preference for OTS acquisition expressed by many in government, Defence and the media; it may well have given rise to the increasing practice of manufacturers claiming that products are OTS when in fact they turn out to be developmental. Witnesses presented numerous cases in which the expectation that a procurement activity was OTS led Defence to believe that a product was more mature, or an outcome more predictable, than experience (or an experienced review) would indicate. The conspiracy of optimism, referred to by a number of witnesses, appears to have led Defence to undervalue the role that developmental test and evaluation can play in the early stages of the acquisition cycle in identifying and analysing risk in a quantifiable and defensible manner.

12.2      The committee also notes observations made by the Helmsman Institute suggesting that the complexity of some Defence projects was so high that they were 'placed at risk of never delivering the required capability, and failing to work'.[1] This has proven true even for some projects presented as OTS, such as the MU90, where the complexity of integration across a number of platforms was compounded further by a decision to constrain phasing to line up with other projects (for example, JASSM on the AP3). It has also proven to be the case where other purchasers withdrew, leaving Australia holding more of the risk. There is a moral hazard faced by industry and CDG in that both parties have an incentive to support the view that a particular technological reach, or level of integration with other weapons systems, is achievable.

12.3      The committee notes that this conspiracy of optimism may have tended to crowd out or ignore dissenting voices that could alert Defence to the true extent of capability, technology, integration and certification (hence cost and schedule) risk represented by a proposed project.

12.4      In the previous chapter, the committee referred to the importance of Defence being a knowledge-based organisation: of having a deep understanding of the products it intends to purchase; and of the critical importance of having the right people able to ask the right questions. It is particularly important to note that the problems experienced by some projects were due to an underestimation of the amount of developmental work required. This lack of knowledge about the technical maturity of a capability raises questions about the analysis undertaken of proposed projects, and highlights to the committee the absence of early test and evaluation conducted by qualified ADF T&E practitioners as part of a structured risk identification process. In this chapter, the committee uses test and evaluation as a means of assessing the quality of the analysis underpinning Defence's capability development process.

Off-the-shelf purchase

12.5      According to Pappas, technical risk accounts for more than 50 per cent of post-approval slippage in projects approved after the Kinnaird review.[2] Many analysts, advisers and Defence and industry personnel familiar with defence procurement recognise that purchasing off-the-shelf can reduce the risk of things going wrong. Usually, the costs are known and the performance is proven.[3] The government endorses these views. The 2009 White Paper and Defence's procurement manual make clear that off-the-shelf solutions to Defence's capability requirements 'will be the benchmark against which a rigorous cost-benefit analysis of the military effects and schedule aspects of all proposals will be undertaken'.[4] As noted in the previous chapter, this discipline, while necessary to limit the developmental risks of service wish lists, reduces the opportunity for local industry to grow engineers through developmental activities. It has also, unfortunately, been used as a rationale to limit the same development of skills within Defence, both at the high-tech end of capability assessment and for operational and sustainment activities. The longer-term effect is that Defence has fewer qualified people able to test and evaluate thoroughly the information provided by industry early in the procurement process, especially where it is not all that it is marketed to be (that is, the system is still really developmental, or the level of integration sought with other platforms may in fact be difficult to achieve).

12.6      It does not automatically follow that MOTS requires sustainment to be outsourced either domestically or overseas. Such decisions should depend on the normal costs and benefits, local industry capacity, and any strategic needs for self-reliance. Regardless, there must be in place within Defence a cadre of technical skill to manage properly both procurement and sustainment with assured continuity, integrated organisationally under single line accountability, drawing on a superior skill base supported with career paths, and without the risk of complete dependency on suppliers.

Possible secondary risks

12.7      Although recognised as an effective way to reduce risk, purchasing OTS may introduce secondary risks that need to be assessed, treated, monitored and reviewed. Miller Costello and Co noted that a MOTS procurement can be 'either a model for risk management or it can disguise risk and lead government into painfully bad decisions'.[5] So while OTS may initially be the preferred option, it may also pose significant risks that involve:

12.8      With regard to the last dot point, the committee notes that OTS products may be purchased under terms that preclude any ADF-unique modifications, which may further reduce the opportunities for Defence to grow and sustain skill sets such as engineering, certification, T&E and R&D, and increase dependency on overseas suppliers.

12.9      Other witnesses similarly underscored the caution Defence needs to exercise when purchasing an OTS product.[6]

12.10         Defence also noted the limitations of an OTS purchase. It acknowledged that while OTS equipment minimises procurement risk, such equipment would 'not always meet the needed long-term capability requirement'. It stated further that an OTS product may not readily integrate with other capabilities in service; may not always be available; may not suit Australia's geographic and strategic circumstances; and/or may not be available in a timeframe that allows Australia to avoid gaps in its defence capability.[7]

12.11         Despite this awareness of possible technical complications associated with OTS, decisions have been taken on such purchases that clearly indicate no robust consideration was given to these risks. Indeed, one of the identified causes of problems in defence procurement has been the underlying assumptions about products purchased off-the-shelf. The Helmsman Institute noted:

A number of projects started with the assumption that as a product was being offered as an existing design by a supplier, that the product was 'Off the Shelf'. The approach that was then applied assumed that the product could move into mass production immediately. Helmsman believes that true 'Off the Shelf' approaches can only be used if the products have achieved a high volume production rate and are in service in significant numbers in military service, and will only have limited customisation to fit local regulatory requirements.

All other projects need to assume that high levels of testing and evaluation will be required for testing and acceptance given the ADF regulatory environment. Helmsman believes that some of the highest complexity added to projects was that created by 'First of Type' or 'Early in Type' products being treated as 'Off the Shelf'. The unplanned need for substantial certification, systems integration, design and modification created additional complexity in stakeholder management, cultural clashes and journey complexity.[8]

12.12        Accordingly, even an apparently straightforward purchase requires a deep knowledge of the product.[9] Thus, with OTS products, Defence needs the capability to exercise a rigorous test and evaluation regime in order to understand fully the maturity of the capability it intends to acquire. Clearly then, for customised purchases and developmental projects the need for sound and comprehensive analysis is even greater. For example, the committee has referred to the conspiracy of optimism, where both industry and the customer are drawn toward leading-edge technology. The danger is that 'an ambitious set of specifications' could be locked in before the associated risks are properly identified and understood.[10] Mr Bruce Green noted that going beyond the 'leading edge to the bleeding edge of technology is a recipe for disaster as these types of projects just bleed money'.[11]

Analysis—test and evaluation

12.13         Kinnaird fully appreciated the central role of analysis in defence procurement practices. In his view, there must be:

12.14         Mortimer also recommended that 'any decisions to move beyond the requirements of an off-the-shelf solution must be based on a rigorous cost-benefit analysis of the additional capability sought against the cost and risk of doing so'. He stated that this analysis must be clearly communicated to government so that it is informed for decision-making purposes.[13] For projects that are not genuine MOTS, Pappas similarly recommended that 'improving technical risk management practices would help reduce schedule and cost escalation'. Specifically, this would involve:

12.15         Many witnesses underscored the importance of good quality and 'systematic independent analysis'.[15] In this regard, the ANAO noted that 'International experience shows that adopting a systems engineering approach in concert with program management of a high order offers the greatest likelihood of success for the delivery of complex and large scale projects, including Defence major capital acquisitions'.[16] It explained:

Systems engineering involves the orderly process of bringing complicated systems into being through an integrated set of phased processes covering user requirements definition, system design, development and production, and operational system support.[17]

12.16         People familiar with the complexity of defence acquisitions appreciate the critical role of test and evaluation (T&E) activities in providing information about risk and empirical data to validate models and simulations.[18] Defence similarly recognises T&E as an integral part of the systems engineering process for identifying and reducing technical risk in the acquisition of defence equipment, though this recognition is on paper and not necessarily in practice.[19] The focus on OTS, however, has led many to believe that Defence only requires a T&E capability at the end of the process: that is, operational T&E as part of introduction into service. What numerous Defence projects have shown, however, is that Defence must sustain, develop and employ personnel with experience in developmental T&E in order to conduct pre-contract analysis with rigour.

Early testing

12.17         T&E is a process that can be applied at the initial feasibility stage of a project and continues through to its delivery into service. Clearly, T&E at the feasibility stage helps ensure that a capability will operate as intended and can be produced in line with cost, schedule, and quality targets.[20] This observation about the importance of early analysis is based on wide project experience.[21] For example, with regard to technology risk, the GAO noted:

When technology risks are not managed early, an acquisition program can run into difficulties in later phases. Having a feasible, stable preliminary design for a weapons program early in the acquisition process is also important in lessening risk...by demonstrating that a product's design can meet customer requirements, as well as cost, schedule, and reliability targets.[22]

12.18         Supporting the contention that the customer cannot leave all design activities to the manufacturer, the GAO found that in recent years programs that held critical design reviews reported higher levels of design knowledge. The committee notes the Haddon-Cave Review (UK), which found that a critical design review is only of value if the stakeholders involved (including the customer) have the necessary qualifications and design/certification experience to understand and challenge the information presented to them.

12.19         Witnesses similarly referred to the value of early research and development.[23] The Australian Business Defence Industry Unit noted that in order to avoid problems, a project must be set on the right course from the start. It suggested that 80 per cent of problems occur in the first 20 per cent of a project's life.[24]

12.20         Air Marshal Harvey agreed with the proposition that there is a case for conducting detailed technical risk analysis of a proposed capability at an earlier stage. He indicated that Defence does so, though not initially in a formal technical risk analysis sense. According to Air Marshal Harvey, technical risk analysis supports both first pass and second pass and forms part of the capability gate review board.[25] He made clear that DSTO follows 'a very rigorous process' for its technical assessments for first and second pass.[26]

12.21         The committee notes that the Air Marshal was referring to the process and not the capacity to analyse relevant risks (technology, integration, capability and certification). The committee has already noted the difference between process and application: that is, what the manuals prescribe and what actually happens, as noted on many occasions by the ANAO. Also, Air Marshal Harvey referred only to DSTO advice. The visit to the Aerospace Operational Support Group at RAAF Edinburgh highlighted to the committee that Defence has other centres of expertise that should be utilised more effectively early in the procurement process to identify the full range of risks presented by a particular solution.

12.22         The committee notes that the capacity to conduct developmental T&E draws on the same skill set as that needed to conduct effective risk identification and analysis. Proponents of OTS acquisition rightly point out that buying from the original equipment manufacturer does not require the ADF to have a developmental T&E capability—that it is industry's job to provide people to run that part of the process. But without a developmental T&E capability, Defence cannot assess the veracity of what it is being told or shown, either in absolute terms or within the certification and training frameworks required by Australia. The number of products accepted as OTS when they were in fact developmental correlates strongly with situations where T&E expertise was not available, not engaged or not listened to.

Implementation

12.23         Ultimately, under the current process, the Chief Defence Scientist is responsible for the provision of technical risk assessments, technical risk certifications, the development of project S&T plans and the provision of other S&T support as required.[27] As one of the fundamental documents that support first pass approval, the Technical Risk Assessment (TRA) forms part of the Capability Proposal First Pass and needs to be in place.[28]

12.24         The Project Science and Technology Advisor, a DSTO officer, prepares the TRA for second pass approval. It is intended to allow Defence to advise government on the areas and levels of technical risk of the options being proposed for acquisition. The Chief Defence Scientist signs off on the Technical Risk Certification, which is included in the ministerial or cabinet submission.

12.25         There is no doubt that the procurement system should be sufficiently robust to ensure that information on the readiness of a platform for operational service is known. But, as noted in chapter 5, one of the problems with risk management is the lack of awareness or the unresponsiveness of some personnel to emerging risk. Evidence suggested that despite Defence's recognition of the importance of test and evaluation, it does not pay sufficient attention to this most important aspect of risk management.

12.26         For example, notwithstanding Kinnaird's recommendation for small amounts of early up-front investment to quantify and minimise risk in complex projects, some witnesses were concerned that this was not happening.[29] In his review of the latest Major Projects Review, Air Commodore (retired) Bushell stated:

...the primary cause of project risk lies in the operational and technical areas of the project, and that these (largely potential and manageable) risks demand a very different approach, an approach requiring skills and competencies different from commercial (contract terms and conditions) management. Effective capability management requires that all capability functions—operational, systems and equipment engineering, test and acceptance functions and support requirements, including their associated risks, must come under tight Project and Systems Engineering management, and that commercial management must be constrained to contract management that supports project management objectives.[30]

12.27         In his view, the difficulties that are endemic throughout Defence's major projects indicate that 'the DSTO's capability development, test and acceptance and technical risk assessment and management input have not been adequate'. According to Air Commodore Bushell, such tasks 'were historically, and still are, a natural extension of the fundamental responsibility of the Capability Managers for raising, training and sustaining force'. He argued that DSTO has a role to play, but 'it is one that supports the Capability Managers, not replacing or double-guessing them'.[31]

12.28         In the previous chapter, the committee highlighted the overall shortage of skilled engineers in the area of defence procurement, especially in the Services, and most notably the hollowing out of such skills in the Navy. This shortage has serious implications for test and evaluation.

12.29         Air Marshal Brown gave the AEW&C as 'a classic example' of where there was inadequate T&E. He named two core things that were not done correctly on that project. The first was the contractor's decision to use emulators instead of real equipment in the systems integration lab. He explained that this decision meant that 'a lot of the integration problems, instead of occurring inside the lab, occurred when we built the aeroplane'. According to Air Marshal Brown:

That decision was objected to by the Commonwealth quite strenuously at the time, but it was taken on a cost basis by the contractor. He decided that that was one way to save money, and they were confident in their design.[32]

12.30         He then referred to the AEW&C program's six-month development, test and evaluation program. He informed the committee that:

If you benchmark that against any other similar sort of highly developmental program, you will find that most people allow about three years. Guess what? That is about the time that it has taken us to do...My view of that program is that we have lost time but we are going to end up with the capability we contracted for.[33]

12.31         During its visit to South Australia, the committee learnt of another example of inadequate T&E. The committee was told that Defence believed the MRTT to be effectively an OTS purchase, with all indications in the tender process pointing to a purchase with a proven performance record for each of the major systems involved. Late in the program, however, the Australian test team needed to be boosted in numbers in order to obtain the data required to have the aircraft accepted into service. The committee also heard that Defence did not articulate clearly enough the Air Force's certification requirements in the contract. Finally, Defence did not manage its observation of the overseas tests at all well, resulting in a gap in its understanding of the tests. Defence did not give sufficient priority to early investment in developmental T&E qualified staff on the resident project team.

12.32         The committee's findings on T&E in defence acquisition projects are consistent with its findings on the broader issue of risk management in Defence's procurement of major capital equipment. Defence believes that its procedures are appropriate and that up-front analysis followed by systematic test and evaluation activities should prevent unexpected major technical difficulties surfacing later in a product's build. The type of problems that emerged with the Super Seasprite, Landing Watercraft, Wedgetail, Tiger, the MRTT and the MRH-90 Helicopter suggest otherwise.[34]

12.33         If in fact DSTO is solely responsible for technical risk analysis, as has been asserted, then the committee suggests that Defence fails to understand the full gamut of technical risk analysis and management from project inception to completion. If in fact CDG is no longer required to fund a preview evaluation by a qualified developmental test team, the committee's concern is amplified. The difference in quality between risk analysis from a CDG officer without relevant experience who is following a 'more thorough checklist of questions' and that provided by a subject matter expert drawing on experience seems to be lost on Defence.

12.34         The committee has considered the underlying causes of the discrepancy between written guidelines and procedures and the implementation of sound risk management practices. The same causes are evident in Defence's T&E regime—non-compliance with policy and guidelines, and unawareness of, or unresponsiveness to, risk. As an example, the ANAO found that in a number of cases the description of technical risk for project proposals did not provide sufficient guidance for decision-makers, or provide confidence that an adequate risk assessment had been conducted.[35] The committee has already referred to the observation made by Pappas that DSTO's technical risk assessments were not always paid the respect they warranted.[36] As the examples in chapter 2 clearly attest, the same observation can be applied to risks assessed by other Defence T&E personnel.

12.35         The lessons to be learned from recent projects underscore the need for improvement in test and evaluation. These observations have particular relevance for defence projects still in the early stages of capability development, especially the need for up-front investment in research and analysis.

Resourcing test and evaluation

12.36         The committee notes that Kinnaird found that greater resources needed to be allocated to conduct comprehensive and rigorous T&E programs as part of project funding.[37] In this regard, the committee highlights a stark message that came out of its site visits to South Australia:

An organisation cannot support high technical capability without the ability to test it. If it does complex things, it should set requirements but, importantly, it must understand the skill set it needs to validate those requirements.

12.37         Dr Davies stressed a recurring theme of this inquiry: that what is needed is improved quality of analysis rather than a greater quantity of process and information.[38] He also acknowledged that it takes a long time to grow that analytical capability. In his view, Defence might, in the first instance, have to rely on external contractors with expertise, such as the RAND Corporation and Access Economics, and use this expertise at least until an in-house analytic capability can be built up.[39]

12.38         In this regard, the committee notes the challenges facing the capability managers in developing this level of expertise, which to date exists in a formal sense only for the aerospace domain. For example, Service chiefs are responsible for initial officer training and specialist training (engineer, pilot, etc.) and for the 2–3 years of operational experience that follows. Each individual T&E practitioner then requires a further year of masters-level full-time training at a cost of around $1 million. After training, there is normally a period of 1–2 years of supervised T&E conduct and involvement in the ADF airworthiness and certification systems before an individual would be deemed competent to support DMO in a project role away from the test centre. Thus T&E personnel need to enter the training pipeline several years in advance of a project's need. This capacity therefore has to exist ahead of the project but, given the high cost of training, should be an integral part of a consolidated capability procurement and sustainment team under the direct control of the capability manager, in line with the committee's preferred organisational model.
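To make the lead time implied by these figures concrete, the following minimal sketch (in Python) simply sums the pipeline stages quoted above. The stage durations and the $1 million masters-year cost are those given in this paragraph; the cadre size of ten is a hypothetical figure for illustration only, not drawn from evidence.

# Illustrative arithmetic only: sums the training-pipeline stages quoted
# in paragraph 12.38. Cadre size below is hypothetical.

stages_years = {
    'post-specialist operational experience': (2, 3),  # years, quoted range
    'masters-level T&E training': (1, 1),              # one full-time year
    'supervised T&E practice': (1, 2),                 # years, quoted range
}

lead_time_min = sum(lo for lo, hi in stages_years.values())
lead_time_max = sum(hi for lo, hi in stages_years.values())
print(f'Pipeline lead time per practitioner: {lead_time_min}-{lead_time_max} years')

masters_cost_per_head = 1_000_000   # ~$1 million quoted above
cadre_size = 10                     # hypothetical cadre for illustration
print(f'Masters-year training cost for cadre: ~${cadre_size * masters_cost_per_head:,}')

Run as written, the sketch prints a lead time of 4–6 years per practitioner and roughly $10 million in masters-year training alone for a cadre of ten, which is the arithmetic behind the committee's observation that this capacity must exist well ahead of any project that needs it.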

Long-standing concern

12.39         In its report on materiel acquisition and management in Defence, tabled in March 2003, the committee expressed a lack of confidence in Defence's 'capacity or will to address T&E concerns seriously'. At that time, Defence was preparing a revised T&E policy. The committee was particularly keen to ensure that the policy would be fully integrated (planned and funded) with the capability development process; provide for T&E to be carried out in an independent fashion; and embed a 'cradle to grave' philosophy.[40]  

12.40         Five years later, in its 2008 T&E Roadmap, Defence highlighted a raft of shortcomings in its T&E regime, pointing to a need for greater funding, improved training, and better attraction and retention of skilled and experienced personnel. They included:

12.41         The Roadmap indicated that steps would be taken to address these findings.

12.42         Vice Admiral Jones, the sponsor for T&E, recognised that the Roadmap was 'quite a significant document', though he noted that there were 'a lot of utopian views in it and a bit of nirvana'.[42] This observation appears to be at odds with the clear articulation, in previous reviews, reports and witness statements, of the need for a robust T&E capability in Defence, which lends weight to the recommendations of the 2008 T&E Roadmap. Vice Admiral Jones referred to work done since the publication of the Roadmap, which has resulted in:

12.43         In particular, he referred to the early test planning directorate, a group of six individuals, who specialise in writing test concept documents which they write in conjunction with the relevant T&E organisations. Group Captain Keith Joiner, Director General Test and Evaluation, explained further:

We are tightening that journey of discovery process there, so we have introduced a large number of additional questions into the test concept document writing guide as a result of some of the experiences we have had bringing into service military off-the-shelf and commercial off-the-shelf. That is delivered annually to the T&E principals, so it gets input from all domains, not just land and joint.[44]

12.44         The committee notes with concern that there appears to be a significant investment in form and process but not necessarily in the professional qualifications and work opportunities to gain relevant experience that will—over time—lead to real capacity to identify and analyse risk prior to contract signature.

12.45         Mr King accepted that at one time Defence 'did too much trusting and not enough verification' but was also of the view that Defence had 'moved on quite a distance from there'. Even so, he thought there was a role 'for improved analysis and testing of claims of maturity'.[45] Based on the committee's 2003 report, the 2008 Roadmap and more recent evidence, the committee is not convinced that Defence is moving quickly or decisively enough to address the matters raised in 2003 and 2008.

12.46         For example, the committee understands that ADTEO largely coordinates or conducts operational T&E for the land domain and coordinates some joint OT&E activities. However, the committee is also advised that its staff have no capacity for developmental T&E and that the organisation plays no role in the management of the ADF's only developmental T&E agencies—the Aircraft Research and Development Unit and the Aircraft Maintenance and Flight Trials Unit. These units operate under the ADF airworthiness regulations maintained by Air Force. If this is the case, then the committee believes that significant rationalisation of both T&E policy and practice is required.

12.47         Moreover, looking ahead to where Defence needed to go, Vice Admiral Jones indicated that there would be a defence manual 'which all the services and DMO and DSTO and CDG have to sign up to'. He explained that in the manual 'we actually have to start to chart where we are going to go with the workforce and how we are actually going to grow and sustain the workforce'.[46] According to the Vice Admiral, standardisation and professionalisation remains an area where 'there is a lot of work that we have to do'. He stated:

At the moment we are in this situation where we have started to really get a much greater appreciation across the board of the importance of T&E, so that has been a big change probably in the last five years and we are seeing the value of that objective data for our decision making. But what we have to...have is a sustainable path for our workforce, and we see this next iteration and development of a manual as an opportunity whereby we can tease some of those issues out and then actually have some goals to set for ourselves to get to where we need to be.[47]

12.48         The committee notes that Defence was reviewing and developing a T&E concept paper and policy in 2003 and that its T&E Roadmap was produced in 2008. Now, Defence is still talking about producing a manual—that is about process. In this regard, the Haddon-Cave Review into the loss of the RAF Nimrod aircraft has some salutary advice for Defence:

The instinctive reaction of many governmental organisations to problems is the creation of more complexity, not less, and the 'bolting on' of more process, procedures, boards, committees, working parties, etc rather than stripping away the excess and getting down to the essential elements. The net result for the MOD was, unfortunately, an increasingly complicated safety and airworthiness system which was accompanied by a significant weakening of airworthiness oversight and culture during the period leading up to the loss of XV230 in September 2006. Over the past decade, responsibility for risk and risk management has been divided, dissipated and dispersed. Risk has effectively been 'orphaned' by being made part of an extended family, with everyone involved but no-one responsible.[48]

12.49         In the committee's view, while the people who own the process are talking about manuals, those with the responsibility and competence are not being heard.

Conclusion

12.50         Defence would have the committee believe that the organisation has an integrated and effective T&E regime operating throughout the capability life cycle to minimise the chance of unexpected technical difficulties arising. The T&E activities are meant to ensure the delivery of a fully functioning platform with safety-critical systems meeting all requirements. In practice, however, the failings identified in some major projects stem from poor quality or inadequate analysis. The committee reinforces the message that early investment in analysis is an indispensable component of an acquisition. The Service Chiefs in particular, as the ultimate users of an acquisition, must have personnel with the skills and experience to stipulate, from the early stages of the capability development cycle, the test and evaluation activities required before they will accept an asset into service. Hence the committee's concern in principle about the real responsibility of the capability manager.

Recommendation

12.51         The committee recommends that the government make a long-term commitment to building technical competence in the ADF by requiring Defence to create the opportunities for the development of relevant experience.

Recommendation

12.52         The committee recommends that capability managers require their developmental T&E practitioners to be equal stakeholders with DSTO in the pre-first pass risk analysis and, specifically, to conduct the pre-contract evaluation so that they are aware of risks before committing to the project.

12.53         Given that the capability to conduct this T&E and analysis needs to be extant prior to the commencement of any given project, the committee is concerned that cost pressures will lead individual services and projects to degrade this capability over time.

Recommendation

12.54         The committee recommends:

Recommendation

12.55         The committee recommends that Defence build on the capability already extant in aerospace to identify training and experience requirements for operators and engineers in the land and maritime domains, and apply these to ADTEO. Capability managers will need to invest in a comparable level of training to enable their personnel to conduct (or at least participate in) developmental testing. The intention is to provide a base of expertise from which Defence can draw as a smart customer during the first pass stage and to assist in the acceptance testing of capability.

Recommendation

12.56         The committee recommends that Defence mandate a default position of engaging specialist T&E personnel pre-first pass, during the project and on acceptance, in order to stay abreast of potential or realised risk and its subsequent management. This requirement should also apply to MOTS/COTS acquisitions.
