This review of the 2015-16 Major Projects Report also includes analysis of ANAO Audit Report No. 11 (2016-17), Tiger – Army’s Armed Reconnaissance Helicopter.
As part of its review, the JCPAA held one public hearing in Canberra on 31 March 2017 with representatives from the ANAO and Defence. A list of witnesses is at Appendix C. A list of submissions to the inquiry is provided at Appendix B.
The JCPAA inquiry focused on the following key issues:
Principles underpinning the Major Projects Report
Transparency across the project life cycle
Guidelines around project removal
Implementation of Committee recommendations
Capability performance reporting
Capability performance analysis
Risk management and use of spreadsheets
Collins Class Submarine Reliability and Sustainability
Principles underpinning the Major Projects Report
Qualified audit findings
The Auditor-General made a qualified audit finding in the MPR. The Auditor-General outlines the bases for the qualified conclusion in his letter to the Presiding Officers. In part, it states that the ARH Tiger Helicopters (hereafter ‘Tiger’) Project Data Summary Sheets have: ‘not been prepared on the basis of the Guidelines’, as:
Funding for caveats, capability deficiencies and obsolescence issues needs to be provided separately to the acquisition project, and this cost is unable to be identified
The Project Maturity Score given to the Tiger project ‘does not accurately or completely represent the project’s maturity as at 30 June 2016’
While the Project Financial Assurance Statement for the Tiger states that sufficient funding is available to complete the acquisition, the ANAO notes the statement does not address significant caveats, capability deficiencies and obsolescence issues identified when Final Operational Capability (FOC) was declared in April 2016. Funding to fix these issues would need to be provided separately to the acquisition project – an amount that is unable to be quantified.
Further, material inconsistencies were identified in the forecast information for the Project Data Summary Sheets (PDSSs) of the Tiger project, and the LHD Landing Craft.
Accordingly, the Auditor-General stated: ‘I believe the evidence I have obtained is sufficient and appropriate to provide a basis for my qualified conclusion.’
At the public hearing, the ANAO elaborated on its findings on the Tiger helicopters:
The key concern around Tigers was that in order of appearance through the PDSSs was the statement of whether or not there were sufficient funding to complete the project; and secondly, that the maturity scores of the project were not representative of what was required within the guidelines. Then there were also notes on other areas that were included within the PDSSs – which we believe were materially misstated – which included the representation of capability and one other area.
As the Auditor-General noted, ‘we cannot provide assurance that the statements – with respect to its state against the budget – can be relied upon, because the nature of the work required under the project has not all been completed.’
Defence addressed the qualified audit conclusion, stating:
We would prefer not to have a qualified audit report. …the work that the Auditor-General has done is an example of how we might get a percentile differential, and we are happy to have that discussion with them.
In response the Auditor-General observed:
I think it is desirable not to have a qualified audit report. What we work with the Department to do is come to a consistent view so that you can have assurance on the information that is put before you. With respect to the Tiger, the issue for us there is probably the more substantive one, particularly around capability and maturity. Within the caveats, particularly within the caveats relating to the rate of effort to maturity and capability scores, because the rate-of-effort numbers – compared to what was expected of the aircraft – are so significantly lower, that is a key point… …for Tiger that is the key point with regard to those maturity scores and the capability assessment.
Defence stated that, while there was a difference of opinion between Defence and the ANAO on the Tiger project, the maturity scores assigned to the Tiger project were set in accordance with the MPR guidelines.
Examining the history of previous Major Projects Reports, there have been several qualified audit findings. Most of these occurred early in the development and refinement of the MPR process. During the public hearing, the ANAO characterised these qualified findings as follows:
The first four MPRs were qualified on the basis of what was then called base date dollars. It was more about a question of whether that information that was required by the guidelines could actually be provided or not for the various projects that were included. Eventually, the Committee decided that there was a better representation if we moved away from displaying base date dollars within the MPR. This is the first time that we have had a disagreement over the presentation of the PDSSs.
Transparency across the project life cycle
The objective of the MPR is to ‘improve the accountability and transparency of Defence acquisitions for the benefit of Parliament and other stakeholders’. At the public hearing, the ANAO noted that developing a document like the MPR was ‘something that you work at and puzzle at over time’, that ‘a sensible conversation and debate will often help progress it, perhaps by incremental steps’, and ‘there is a value in the dialogue.’ The Auditor-General noted the underlying information that led Defence to one conclusion and the ANAO to another (the qualified audit finding) was available to all, and that the debate was more about the conclusion drawn from that information, than the information itself.
Defence agreed, stating:
…you actually have two outstanding issues amongst the probably thousands of issues that were in the report. The auditor qualified that; the Secretary provided his response. Both of them are transparent. …from a public accountability perspective, you are seeing exactly what’s going on.
Another key principle underpinning the Committee’s review of the MPR was retaining transparency over projects not yet complete that had been given FOC status by service chiefs. In the case of the Tiger, FOC was declared by the Chief of Army in April 2016. However, the MPR notes the Tiger Project is not yet complete, as FOC was declared with nine caveats outstanding.
At the public hearing Defence stated: ‘broadly, those caveats have now been resolved, and most of them did not apply specifically to the project aspects; they were broader, fundamental inputs to capability that needed to be managed’. (Individual issues relating to the Tiger are considered below).
In responding to Defence’s claims that caveats had been resolved, the Auditor-General replied:
At the point where we signed off on the audit opinion we had not been provided with evidence that any of the caveats had been removed at that stage. I am not aware that they have been removed since that time. The FOC, with caveats, is still what is in place at the moment, as far as we are aware.
The Committee asked whether it was common for a project to be given FOC with caveats, with Defence replying that it was ‘a rare event’, and that the Wedgetail aircraft was accepted and put into FOC in the last five years with ‘six partially complete Final Operational Capability requirements’.
Guidelines around project removal
The section of the MPR guidelines relevant to project removal states:
1.9 The removal of projects from the MPR is based on achievement of Final Operational Capability (FOC) or on a post-Final Materiel Release (FMR) risk assessment of the timely achievement of FOC and subject to the following criteria:
(a) the outstanding deliverables post-FMR, against the relevant Materiel Acquisition Agreement (MAA) and/or Joint Project Directive (JPD);
(b) the remaining schedule post-FMR, against the relevant MAA and/or JPD;
(c) the remaining budget post-FMR, against the relevant MAA and/or JPD;
(d) the remaining project risks and issues; and
(e) the Capability Manager’s assessment, including overall risk rating and the extent to which this risk rating relates to the Capability Acquisition and Sustainment Group (CASG’s) responsibilities.
1.10 All projects selected for removal from the MPR will be proposed by Defence based on the above criteria, and provided to the JCPAA for endorsement, by the ANAO by 31 August in the year to which the MPR relates.
1.11 Once projects have met the exit criteria, they should be removed from the PDSSs and information included within Defence’s section of the MPR in the subsequent year.
At the public hearing, the Committee asked Defence whether the treatment of Tiger in the MPR fulfilled the intent or principles of the MPR guidelines. Defence replied that it was ‘comfortable with the words in the MPR’, and:
The areas where the ANAO have indicated they are uncomfortable stretch into the sustainment space. In that space the words in the MPR fully align with the guidelines of the JCPAA. We followed those directions precisely and that has led to some friction as the ANAO have sought to get greater granularity of the sustainment aspects of Tiger.
The Auditor-General stated that ‘disagreement’ was:
…largely around whether the work that we do under the MPR stops when FOC is declared, irrespective of whether there is more work that will continue post-FOC, as is the case with Tiger, or whether the work continues until the Committee is satisfied that it is finished with the project, effectively. And, if there is disagreement over that type of definition, …the Committee should make it clear to all parties involved in the process what its expectation is, because, at the end of the day, this report is prepared for the purposes of the Committee… …If there is a misunderstanding or disagreement over it, that should be resolved.
Implementation of Committee recommendations
First Principles Review
JCPAA Report 458 recommended that Defence publish the outcomes from Recommendation 2.11 of the First Principles Review as soon as practicable and that a summary of this information be included in the next Major Projects Report.
The First Principles Review recommended that ‘significant investment’ be made ‘to develop an operational framework which comprehensively explains how the organisation operates and the roles and responsibilities within it’.
Defence’s response to JCPAA Report 458 agreed to the Committee’s recommendation, and stated that the outcome of First Principles Review recommendation 2.11 would be published in March 2017. As to why the outcome had not yet been published Defence replied:
We had expected to have that completed by the end of March. It has not been completed as yet. It is in the program to be completed between now and the end of July, the two-year period for the implementation of first principles…
…It is a very complex recommendation. It involves a whole stack of changes that are happening across the Department. It includes changes to the capability life cycle, which is fundamental to the acquisition, sustainment and disposal of materiel and equipment.
The Committee expects Defence to provide this information as soon as possible.
Capability performance reporting
Expected capability delivery for each project is reported by Defence in the PDSS for each project. The outcome is then depicted in a pie chart. The pie chart provides a percentage breakdown of materiel capability delivery performance, representing Defence’s expected capability delivery, and outlines any areas of concern, using a ‘traffic light’ system. Accurate reporting on expected or future capability has been an issue of discussion across MPR reviews.
In 2014 the Committee recommended a more objective method be applied to capability performance reporting, noting Defence had no systems in place to track inputs to capability. At the public hearing, the Committee asked Defence what had happened in the past three years to address this issue. Responding on notice, Defence advised that it had disagreed with the Committee recommendation as at the time Defence had limited ability to ‘have full visibility of all Fundamental Inputs to Capability’ and ‘due to the intrinsic difficulty in objectively measuring the capability provided compared to that originally sought.’
The official response to the JCPAA recommendation from Defence and the ANAO in 2014 was as follows:
As highlighted in the March 2014 hearing, there is no system that universally tracks the inputs to capability by [Defence] and hence no easily auditable representation of information.
However, [Defence] has suggested including the key deliverables which constitute Initial Materiel Release and Final Materiel Release as stipulated in the Materiel Acquisition Agreements for each project, which we consider is appropriate at this time, noting that there can be sensitivities in regard to disclosing information about elements of capability.
The draft 2014-15 MPR Guidelines have been updated accordingly, with a new table (Section 4.2).
The ANAO reported that ‘Defence has not developed this measure, reporting that the difficulties relate to the varied nature of projects being managed, the inherent subjectivity of the content, and the lack of a system that tracks at a sufficient level of detail the progress of inputs to capability.’
Capability performance analysis
The ANAO reported that, while it is unable to formally review and verify Defence’s reporting on capability, it nonetheless attempts to assess capability based on the information available.
Notwithstanding the lack of progress by Defence in improving capability performance analysis and reporting, the ANAO has added a new level of analysis that may provide perspective to assist the Parliament in understanding capability performance. The 2015-16 MPR provides two worked examples, as discussed below, of projects that Defence reports as achieving 100 per cent capability despite being at vastly different stages of development.
Figure 3.1 below shows the current level of Defence analysis for the AWD Ships project.
Figure 3.1: AWD Ships Materiel Capability Delivery Performance
Source: 2015-16 Major Projects Report, p. 165.
Table 3.1 constitutes ANAO analysis of the same project, using a different methodology.
Table 3.1: Capability Delivery Progress Assessment - AWD Ships
All three Hobart class Ship Systems with up to Category 5 (sea acceptance) trials, testing and certification completed – No ships have been delivered to date
Initial sustainment arrangements in place to support IOC – Initial sustainment arrangements are not yet in place
Training of the Hobart Class Systems for the commissioning crew to support IOC – Training has not been completed
All sustainment arrangements in place to provide materiel support to the Hobart Class – Final sustainment arrangements are not in place
Total (per cent)
Source: 2015-16 Major Projects Report, p. 14
The ANAO states that Defence’s capability performance reporting in the PDSS has ‘an element of uncertainty’, and Defence’s pie chart states that ‘capability assessments and forecast dates are excluded from the scope of the review’.
Figure 3.2 below shows the current level of Defence analysis for the Bushmaster Vehicles project.
Figure 3.2: Bushmaster Vehicles Materiel Capability Delivery Performance
Source: 2015-16 Major Projects Report, p. 295.
Table 3.2 constitutes ANAO analysis of the same project, using a different methodology.
Table 3.2: Capability Delivery - Bushmaster Vehicles
Commencement of delivery of full rate production for Production Period 1 (PP1) vehicles – All PP1 vehicles have been completed
Completion of vehicle deliveries for all five production periods as detailed in Section 1.1; FMR is scheduled for September 2016 – Only eight vehicles were outstanding at 30 June 2016
Total (per cent)
Source: 2015-16 Major Projects Report, p. 15.
Using its own methodology, the ANAO has also developed a summary in the MPR, including all projects alongside the data provided by Defence:
Figure 3.3: Capability Delivery
Source: 2015-16 Major Projects Report, p. 15.
The ANAO provided context on the tables depicted above at the public hearing:
…we have adjusted the tables by having a look at the data within each of the PDSSs and looking at the anomalies that we saw across the five projects that had the largest numbers of deliverables. Removing those has taken us from 26,000 deliverables across the total PDSSs down to just 426 for the remaining 21 projects. If we look at those 21 projects at table 8, we see 44 per cent of all deliverables in the current MPR as being delivered, with 54 outstanding.
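The adjustment described above amounts to filtering out the handful of projects whose very large deliverable counts dominate the totals, then recomputing the completion percentage over the remainder. A minimal sketch of that arithmetic, with all project names and counts invented for illustration (the ANAO's actual data and selection criteria are in the MPR itself):

```python
# Sketch of an ANAO-style adjustment: drop the few projects whose
# deliverable counts dwarf the rest, then recompute the share of
# deliverables completed across the remaining projects.
# All project names and counts below are invented for illustration.

def completion_rate(projects, exclude=()):
    """Percentage of deliverables completed across projects not excluded."""
    kept = [p for p in projects if p["name"] not in exclude]
    total = sum(p["total"] for p in kept)
    done = sum(p["completed"] for p in kept)
    return 100.0 * done / total

projects = [
    {"name": "Project A", "total": 12000, "completed": 11000},  # outlier
    {"name": "Project B", "total": 9000, "completed": 8000},    # outlier
    {"name": "Project C", "total": 120, "completed": 50},
    {"name": "Project D", "total": 80, "completed": 40},
    {"name": "Project E", "total": 60, "completed": 20},
]

raw = completion_rate(projects)
adjusted = completion_rate(projects, exclude=("Project A", "Project B"))
print(f"Raw: {raw:.1f}%  Adjusted: {adjusted:.1f}%")
```

The point of the adjustment is visible in the sketch: a raw percentage computed over all deliverables mostly reflects the outlier projects, while the adjusted figure reflects progress on the remaining, more typical projects.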
Project Maturity Scores
Project Maturity Scores are present in the MPR to provide ‘quantification, in a simple and communicable manner, of the relative maturity of capital investment projects as they progress through the capability development and acquisition life cycle.’
The scores report current project status against a benchmark, assigning a score from 1 to 10, ranging from the beginning of a project to its completion.
Project Maturity Scores have been a topic of ongoing discussion between the ANAO and Defence for the last four years. The ANAO indicated in the MPR that it had previously identified inconsistency in the application of Project Maturity Scores as an issue. The ANAO noted incidences of inconsistency have reduced across recent reviews, but that the policy guidance underpinning Project Maturity Scores should be reviewed for internal consistency and to ensure they maintain a relationship to Defence’s contemporary business.
In reviewing the 2014-15 Major Projects Report, the Committee of the previous Parliament recommended:
To ensure consistency with project level risk information and to improve reliability, the Committee recommends that the Department of Defence review the procedure for development of expected capability estimates for future Major Projects Reports.
Further, the Committee recommended that Defence ‘work with the [ANAO] to review and revisit Defence’s policy regarding Project Maturity Scores in time for the approach to be implemented in the next [MPR]’. The ANAO reports that an updated policy has not been provided to the ANAO for input.
Defence agreed with the recommendation, stating that the ‘Capability Estimates and Maturity Score Graphs will be reviewed to align with the new Capability Life Cycle and Smart Buyer processes’, and that Defence had engaged a contractor to ‘develop a more appropriate methodology to support the presentation of the Maturity Score Graphs’.
The ANAO states that ‘Defence’s project maturity framework is not appropriately structured to assign project maturity progress through the project life cycle, particularly within the acquisition phase, which is predominantly the longest and most expensive component.’
At the public hearing, Mr William Divall of Defence noted project maturity scores were a ‘work in progress’, stating:
We are reforming the whole system that we operate under, so the things I mentioned before – capability life cycle, new approach to engagement with industry, new approach to smart buyer components, establishing centres of expertise across the organisation.
At this point in time we are looking at those but have not landed on a position with the ANAO. We have a contractor who is helping us with the maturity assessment component. I think we are unlikely to get it in the next [MPR]… …We do want to make incremental improvements in the information that is provided to the Committee, but we do not want to lose something that is valuable and… …easy to read and compare.
Defence stated that ‘antiquated’ Defence IT systems were also an underlying problem and were subject to a major review. Further, Defence stated:
We need to get to a single unified system of accountability and reporting inside the organisation. We are developing that, but it also takes time to make sure our IT systems can manage a new reporting mechanism and that I can draw directly from the finance system, I can draw directly from our risk management systems, I can draw directly from our OPP schedules, so that we have a single unified view. I understand the frustrations of the auditors in seeing differences. That is part of the part we are trying to resolve.
Defence stated that work on project maturity scores was ‘a high priority within the organisation’ but that there were ‘some absolute and fundamental dependencies that need to be put in place before we can actually improve the maturity model’. Further, Defence noted project complexity was an additional issue:
Having a single maturity score that measures a project – that may well be for a major platform versus another project that may be for something different; it might be satellite communications or it might be something else that is in the [MPR] – is a complex issue.
Mr Kim Gillis of Defence asked whether there should be:
…consensus about whether a broad, fit-for-purpose assessment is something that should be in the maturity scores. Or should it be a separate component which may then moderate the maturity scores? I do not want qualified audit reports, and the Auditor-General does not want qualified audit reports. I think we need to be working together to get to a conclusion that provides the Committee and Government with some clarity about what the issues are.
The Auditor-General agreed that it was worthwhile to attempt to come to a consensus on project maturity scores, starting from the basis of what the objective of the score is, and then moving onward from there.
The Committee asked Defence about similar reporting regimes in other jurisdictions, and sought further feedback on these models. Defence noted that the United Kingdom used a similar methodology in its reporting regime, and stated that it would source further information on the guidelines behind the methodology.
Schedule slippage
The ANAO has monitored schedule slippage (the amount of time lost over the course of a project) across multiple MPRs. It has found that total schedule slippage has trended downward over the last three years. However, the reduction in 2015-16 is primarily a result of projects with accumulated slippage exiting the review.
The ANAO noted that project classification functions as a general indicator of procurement difficulty. Projects are classified as Military Off-The-Shelf (MOTS), Australianised MOTS, or Developmental. Projects that are more developmental in nature are more likely to experience schedule slippage, while MOTS projects are less likely to experience slippage. The ANAO suggested that the scope and complexity of Australianised MOTS and Developmental projects are underestimated by Defence.
Further, the ANAO noted that ‘projects which experience schedule slippage may result in platforms being delivered with significant obsolescence issues, for example ARH Tiger Helicopters’.
Tiger was sold as an off-the-shelf project, but was more like a Developmental project, resulting in unrealistic expectations around the timeliness of capability delivery. Defence replied that this was one of the ‘fundamental lessons’ Defence learnt from Tiger.
Asked about the current state of slippage noted in the MPR, Defence responded that over time there had been a reduction in projects added to the ‘projects of concern’ list, and that large and problematic projects were exiting the MPR.
The Committee observed that MOTS equipment was less likely to experience slippage than projects categorised as ‘Developmental’. Defence agreed, noting slippage on materiel bought off a production line was ‘almost negligible’. Defence also noted that developmental Australian projects such as the Bushmaster and Wedgetail delivered high quality capability, but that the original schedules for these projects ‘were far too optimistic’.
Defence advised that ‘there have been a number of occasions when that exact debate about how mature the capability is, has occurred…’ and the Defence Science and Technology Group and Chief Defence Scientist were now required to provide a technical readiness level including a detailed analysis of capability maturity.
The Committee asked whether Defence used a clear or official definition of Australianised MOTS compared to MOTS, or off-the shelf. Defence stated that its Capability Development Manual contained more detail, and that:
OTS solutions may either be Military Off-The-Shelf (MOTS), or Commercial Off-The-Shelf (COTS). There is no individual definition for MOTS, COTS or Australianised OTS/COTS/MOTS. However, the policy explains these variations of OTS, including ‘Australianisation’ to meet Australian Defence Force operational requirements.
Contingency budgets
Major projects are provided with contingency budgets. According to Defence, project contingency budgets ‘provide adequate budget to cover the inherent risk of the in-scope work of the project’. PDSSs are required to include a statement regarding the application of contingency funds during the year. In the 2015-16 period, seven projects had contingency funds applied. The ANAO examined these statements, and found the clarity of the relationship between contingency application and identified risks had improved. Out of the 25 project offices that had a formal contingency allocation, only four projects (Growler, Bushmaster Vehicles, Additional Chinook and BMS) did not align their contingency log with their risk log as required by the Project Risk Management Manual (PRMM) version 2.4.
A total of 23 of 25 project offices applied contingency as required by PRMM 2.4, while two project offices (HATS and Maritime Comms) used the method outlined in a previous version of PRMM. Further, the ANAO found that three out of the 26 Major Projects did not update their Risk Management Plan in line with PRMM 2.4.
Risk management and use of spreadsheets
The ANAO noted that there were exclusions in its review, as Defence systems still do not provide complete and accurate evidence, in part due to the continued use of potentially inaccurate spreadsheets.
The ANAO noted that several project offices still used spreadsheets as a risk management tool. The ANAO characterises this practice as ‘high risk’, as spreadsheets ‘lack formalised change/version control and reporting, increasing the risk of error.’
Of the 26 project offices, 15 used spreadsheets as the primary risk management tool, with 11 using a software program called ‘Predict!’ The ANAO indicated that it was important for Defence to ensure its risk management systems and processes are used appropriately, especially for higher cost developmental projects.
Defence stated that the ‘intellectual rigor’ applied to risk management was more important than the tool used. The Auditor-General noted that the comment made by the ANAO in the MPR related to auditability, not usefulness in risk management: ‘From an audit perspective, because the control settings around spreadsheets are not very strong, it is a less auditable tool… …than some of the other ones.’
Defence noted that there were equal interests in audit and direct management perspectives, and that project managers should continue to manage risk in a way that is ‘consistent and transparent’. The Auditor-General responded that the ANAO was interested in bringing Defence’s risk management framework into the scope of its audit work, but that this would not be included if it did not add value to the process. Defence replied that it would be happy to share its risk management approach with the ANAO.
Cost Per Flying Hour
The ANAO reported that cost per flying hour is ‘a metric widely used by military services for use in contract management and performance comparisons across aviation platforms’. In considering the Tiger helicopters project, the ANAO reported in the MPR that there was no consistent calculation methodology that supported the calculations provided by Defence. While the PDSS reported a cost per flying hour of $39,825 in 2013-14, calculations provided to the ANAO for the purposes of Audit Report No. 11 (2016-17), Tiger – Army’s Armed Reconnaissance Helicopter, indicated a cost per flying hour of $43,026 for the same period.
The Committee asked about the discrepancy between these two figures, with Defence advising:
[The discrepancy arises] when we remove those elements that do not pertain specifically to aircraft support and flying – for example, we remove the costs of upgrading Centre DIC training devices and we reduce the money that is transferred to other groups within Defence to provide IT support or facility support. That has led to a difference of perspective around the cost of ownership.
Defence noted the cost of ownership of the Tiger was ‘high’, because:
This is one of those programs where our systems program office has been outsourced to Airbus…
…That cost – 130 people fully burdened – of about $25 to $26 million of that total investment is included in that cost of ownership; however, when you compare with other products like our Black Hawks or Chinooks – where we have a traditional internal systems program office, those costs are not included. That naturally inflates the cost of ownership.
The ANAO reiterated that Defence should have a ‘defined, consistent methodology… …so comparative views can be taken into account’.
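The sensitivity of this metric to methodology can be illustrated with a simple sketch: cost per flying hour is total sustainment cost divided by hours flown, so the result depends entirely on which cost elements are counted. All cost categories and figures below are hypothetical, not Defence's actual ledger; the exclusions loosely mirror the kinds of elements Defence described removing (training devices, IT and facility support).

```python
# Illustrative cost-per-flying-hour calculation. All figures and cost
# categories are hypothetical; the point is that the metric depends
# entirely on which cost elements are included in the numerator.

def cost_per_flying_hour(costs, flying_hours, exclude=()):
    """Sum of included costs divided by hours flown."""
    included = {k: v for k, v in costs.items() if k not in exclude}
    return sum(included.values()) / flying_hours

# Hypothetical annual sustainment costs ($m) for an aviation fleet.
costs = {
    "maintenance": 95.0,
    "spares": 40.0,
    "outsourced_spo": 26.0,      # e.g. an outsourced systems program office
    "training_devices": 10.0,
    "it_and_facilities": 8.0,
}
hours = 4000  # hypothetical annual flying hours

all_in = cost_per_flying_hour(costs, hours) * 1e6
narrow = cost_per_flying_hour(
    costs, hours, exclude=("training_devices", "it_and_facilities")) * 1e6

print(f"All costs included: ${all_in:,.0f} per flying hour")
print(f"Aircraft-only view: ${narrow:,.0f} per flying hour")
```

With no agreed rule on inclusions, two reasonable analysts produce two different figures from the same underlying costs, which is the ANAO's case for a defined, consistent methodology.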
ARH Tiger Helicopters
The Committee included consideration of ANAO Audit Report No. 11 (2016-17), Tiger – Army’s Armed Reconnaissance Helicopter in its review of the Defence MPR. The audit made several concerning findings about the capability of the Tiger:
As at April 2016 the Tiger had 76 capability deficiencies, 60 of which were deemed by Defence to be critical. Other limitations relate to shipborne operations, interoperability and communications, airworthiness, and the roof mounted sight.
Sustainment costs have exceeded the original contract value.
The cost per flying hour in June 2016 was $30,335 compared to a target of $20,000.
On average 3.5 aircraft of an operating fleet of 16 were serviceable in 2015 against a target of 12.
Chief of Army declared Final Operational Capability (FOC) in April 2016 with nine caveats, some of which are being managed by the Tiger sustainment organisation, and others managed by the Capability Manager. The April 2016 declaration was 82 months behind the original FOC forecast.
In terms of funding to support Tiger into the future, the PDSS for Tiger stated that funding to address caveats would be provided through sustainment or ‘other means’. The Committee sought clarification on this point, with Defence stating:
The funding streams to support Tiger into the future are through the Army sustainment budget and also through the Army Capability Assurance Program [CAP]. Specifically that really focuses on investing to solve any obsolescence issues that will affect Tiger in future years.
…Obsolescence is funded right now because all technology systems suffer obsolescence. Therefore they have to be funded through some means, either through the Army sustainment budget or through a CAP program. There is a CAP program that will be taken through our normal process to gate zero by the end of this year to identify where those funds will come from to deal with the obsolescence issues on Tiger.
Defence advised that Tiger had been in the sustainment phase since December 2004, and that sustainment commenced at the point of delivery of the first aircraft. Defence clarified the status of the Tiger project, stating:
…it is in the in-service phase, and sustainment is one part of the in-service aspect of operating this system. Once again, that involves a larger number of fundamental inputs to capability to actually deliver the capability.
The PDSS for Tiger stated that the Tiger sustainment organisation ‘had also experienced issues with staff turnover and retention’.
Additionally, the Tiger PDSS stated:
Defence and industry engineering capacity is constrained with the potential to affect capability. Defence and industry are closely managing Tiger engineering priorities. This issue is being managed by the Tiger sustainment organisation. Constrained Defence and industry engineering capacity has the potential to affect capability.
As to how the Tiger project had been directly affected by both increased staff turnover and constrained engineering capacity, Defence observed:
During the transition from the extended project acquisition period to the in-service team arrangements, Commonwealth staff numbers were reduced accordingly. Remaining Commonwealth positions (Military and Australian Public Servants) have always been challenging and problematic to fill, due to the limited availability of people with the required technical and specialist skill sets. Many of these positions were filled by using Industry service providers for packages of work over defined periods. Due to the number of rectification programs and new capability enhancements, work has always been programmed to meet Army’s highest priorities and safety requirements. Some lower priority activities were delivered later than planned.
Collins Class Submarine Reliability and Sustainability
As noted in the MPR, there has been a persistent defect in the Special Forces Exit and Re-Entry on HMAS Dechaineaux requiring further investigation and minor redesign, affecting materiel release of capability.
In response to Committee queries as to whether sea trials had been conducted to determine whether the defect had been fixed, Defence noted that trials had been undertaken, and that the capability had been safely demonstrated.
LHD Landing Craft
The ANAO provided an update on a project of interest to the Committee of the previous Parliament – the LHD Landing Craft. The 2014-15 MPR reported 100 per cent materiel capability performance. However, during the public hearing into the 2014-15 MPR, the Committee was advised that trials to test the landing craft’s ability to transport an M1A1 Main Battle Tank were required to be conducted prior to the achievement of Final Operational Capability.
Subsequent trials in May 2016 were unsuccessful, as transporting an M1A1 requires the landing craft to operate in an overload state. The PDSS cited ‘safety reasons’ for the cancellation of the trial, and stated that trials are now scheduled for ‘early 2017’.
Further, the PDSS for the LHD Landing Craft noted a 99 per cent materiel capability delivery performance. Defence stated that the one percentage point differential relates to the landing craft (not the LHDs themselves), specifically the requirement that the landing craft be able to carry the Abrams tank.
The ANAO stated that, ‘in consideration of the unsuccessful trials, the PDSS depicts that one per cent of capability for the LHD Landing Craft is [under threat, and] …empirical evidence to support the estimated 99 per cent was not available during the review.’
Defence noted that all other tests had been conducted, and the final test of the Abrams tank was meant to take place ‘in the first half of this year.’ In explaining why a one per cent score had been given, Mr Kim Gillis of Defence stated:
It is a judgement call as to the importance versus the number of vehicles that we have, the different types, what is the percentage. Our call was one per cent because it was such a small capability. I do not want to get into capability decisions from the Chief of Army’s perspective on the number of times we would carry an Abrams tank in a watercraft and try to land it. I cannot contemplate an environment where that would happen. So in a capability judgement sense, yes, it is an issue. But we would not normally be loading and discharging Abrams tanks, which are extremely heavy, because there are not many places in the world you can actually discharge them from a watercraft. That was the broad discussion we had.
At the public hearing, the Auditor-General stated:
…we did not receive any evidence from Defence which justified that that particular component was worth one per cent… …we do not have a number. The capability seems to be a reasonably important capability for the landing craft and, without any evidence as to why it was one per cent, we just do not know whether that is an adequate representation of the level of capability delivered.
25 October 2017