Chapter 10

Contest of ideas and independent advice

10.1      The various individuals and groups involved in the capability development process provide numerous opportunities for project assumptions and proposals to be reviewed and tested. Firstly, the authors of the White Paper draw on a range of skills and experience to identify the capabilities needed to protect the national interest. The group responsible for initiating capability proposals, generally the capability managers, is able to consider risk as part of their proposal at the conception stage and, as the end-users, throughout the capability development cycle. There are many others in ideal positions to question and test the analysis and assumptions underpinning proposals. They include the CDG, the Stakeholders Group, the DSTO, DMO, the Capability Investment and Resources Division, the Central Agencies, the Options Review Committee (now the Project Initiation and Review Board), the Capability Gate Reviews, the Defence Capability Committee, the Defence Capability and Investment Committee, the Service Chiefs and Group Heads and finally the CDF and Secretary, who clear submissions in preparation for the government's consideration. Defence industry also has a critically important role in capability development and is discussed in chapters 13 and 14.

10.2      Only recently, however, the minister expressed concern about what he believed was a 'lack of contestability of view'.[1] In this chapter, the committee continues its consideration of how Defence manages contestability so that any submission to government on capability development reflects a robust and thorough consideration of recommended proposals.

Defence's quality assurance framework

10.3      In 2008, Mortimer was of the view that DMO needed 'an internal project review mechanism to identify and fix problems earlier than presently occurs'.[2] In response, Defence noted that DMO had a number of mechanisms in place designed to provide an independent review function and to assist project teams where necessary. For example, the organisation's Assurance Boards formed a key element of the DMO's corporate governance framework. They were intended to provide independent assurance and advice on the adequacy of governance frameworks for each equipment acquisition and through-life support activity; and on issues and risks involving schedule, cost, capability and sustainability. According to Defence, the composition of the board 'ensured that they were in a position to cast a 'fresh set of eyes' across a project or sustainment activity'. DMO staff were encouraged to use the Assurance Boards as 'sounding boards' and to seek guidance and feedback from them on any aspect of concern involving a project or sustainment activity.

10.4      The Assurance Boards have since undergone change. In evidence, Defence referred to its new rigorous series of internal quality processes and committees, working groups, stakeholder groups and gate reviews. They examined each project's capability, cost, schedule and risks in detail to ensure that each project was positioned to deliver as required. Defence explained its procurement quality assurance processes:

To confirm options for Government consideration at First or Second pass, Defence projects must pass through a number of internal quality assurance processes. These processes, which include internal committees, assess and test advice from Capability Managers and Defence Groups. The quality assurance processes ensure that a robust and compelling case can be developed for capability proposals before they are put to Government for consideration at First and Second pass. In doing so, Defence stakeholder views are drawn together to ensure critical interdependencies are acknowledged and addressed.[3]

10.5      In its preliminary report, the committee described the process whereby stakeholders and committees review documents including those dealing with risk. The committee also noted that the Secretary and CDF approve the submission that goes to the minister or Cabinet for approval to move the project through to the next phase. An important aspect of this review process involves gate reviews, now under the purview of the Independent Project Performance Office (IPPO).

Independent Project Performance Office

10.6      In 2008, Mortimer recommended the establishment of an Independent Project Performance Office (IPPO). Defence agreed to the recommendation and established such an office, which commenced operation on 1 July 2011. Located within the DMO, its functions are to:

10.7      The IPPO is also responsible for the conduct of the annual gate reviews of all major defence capital acquisition projects. In preparation for a gate review board meeting, the office conducts an evaluation of the project to ensure that issues have been identified and are brought to the board's attention for investigation.[5] The committee is not certain that the Independent Project Performance Office should be located in the DMO.

Gate reviews

10.8      The use of gate reviews is not new and the benefits to project delivery are well recognised.[6] Gateway reviews are an important project assurance tool designed to improve the on-time and on-budget delivery of major projects. The Department of Finance and Administration's Guidance on the Gateway Review Process—A Project Assurance Methodology for the Australian Government suggests that gate reviews support 'project teams by providing them with an independent information resource that can add value to their project (for example, by the early identification of issues that may need to be addressed)'. They do so by 'providing an arm's length assessment of a project at critical stages of a project's life'.[7] The guide stipulates that members of a review team should not have worked on the project under review, except in the capacity of a gate reviewer at previous gates.[8] As experienced peer reviewers not associated with the project, they are intended to assess the project against its specified objectives at a particular stage in the project's lifecycle. They provide early identification of areas that may require corrective action and validation that a project is ready to progress successfully to the next stage.[9]

10.9      The guide notes that a gate review may take place at the needs phase of a project, before its start-up stage; such a review takes the form of a broad strategic review that informs decision-making and confirms alignment with the intended outcomes.

Gate reviews in Defence's procurement processes

10.10         Gate reviews now form part of Defence's risk control framework designed to enable the early identification of potential problems. In 2008, Defence started to implement gate reviews to supplement the 'red team' and 'deep dive' reviews that had been operating previously.[10] The key function of the gate reviews is to test, review and clear capability proposals and supporting documentation.[11] They are intended to ensure that the advice provided to Defence and government on the health and outlook of major projects is of high quality and reliable. Gate reviews may also be used as a diagnostic tool to assess potential projects of concern and projects that have triggered early warning signals.[12]

10.11         One witness noted that during 2011 much was made of 'new' management controls such as gate reviews and projects of concern. In his view these measures were 'all good' but not new: some were repackaging and improvement of previous work that did the same thing. He explained:

Gate Reviews are a new name for what had been called 'Deep-Dives'...The process is used at major firms to get to the truth of why a program is failing in some way.[13]

10.12         Mr King informed the committee that a team within the IPPO, which is 'independent of the line management', would provide a brief to the gate review board. He cited contributions from teams from the land systems division, the maritime systems division and the aerospace division that collect 'a lot of background information'. They undertake field work and may also talk to industry if it is involved in a problem.[14]

10.13         In the 2009-10 Major Projects Review, DMO indicated that:

The GRAB [Gate Review Assurance Board] process is a proactive activity that has led to early identification, intervention and resolution of risks and issues across numerous projects in DMO. Given the success of this methodology, the GRAB process will be extended to all major projects.[15]

10.14         A year later, the Major Projects Review (MPR) recorded that the gate reviews were mandatory for major projects at six specified milestones—DCP entry, Options Review Committee consideration, first pass approval, second pass approval, contract solicitation and contract negotiation.[16] The number of gate reviews undertaken currently indicates that these mandated reviews are still a work in progress.

10.15         Previously known as Capability Development Boards, the gate review boards normally comprise five people with a range of skills and experience to make sure that resourcing, budgeting and operational capability matters are covered.[17] According to the ANAO, the chair, who is a general manager not in the line control of that project, may be assisted by the general manager who is in the line control. Board membership is tailored according to the specific issues confronting a project (business case, project management, commercial, engineering, stakeholder etc). Members include senior DMO 'line management, relevant people with key skill sets from other parts of the DMO, and at least two external independent members with extensive Defence or commercial experience'.[18] The capability manager or representative is invited to each of the gate reviews.[19] The inclusion of two external members was intended to strengthen the board. DMO explained:

Board meetings provide a forum for robust, pluralistic discussion that injects a strategic perspective, filters optimism, analyses issues, recommends actions and assists the project to resolve those issues.[20]

10.16         In October 2011, Mr King informed the committee that there were somewhere between 18 and 24 independent members. He had been encouraged by 'the almost volunteer status' of the people coming forward who had been involved in big difficult projects.[21]

10.17         The gate review enables DMO 'senior executives to consider the readiness of a project to proceed to the next stage before they commit any further resources or enter into any new formal undertakings'.[22] According to DMO, if the board is not convinced of a project's maturity or readiness to progress to the next stage of its lifecycle, the project must address those risks and issues before proceeding and a further board review may be required before progression to the next stage.[23] Air Marshal Harvey informed the committee that Defence call it a 'gate review' because unless all required documents are in place the project does not progress to the next stage.[24]

10.18         An independent member of the gate reviews informed the committee that the boards report to the CEO DMO, but that the independent members have access to the CEO. He gave the example of where they may disagree with the chair of the board, in which case 'we have a right to put down our separate view'. To his mind, being able to do so underpinned their independence.[25]

10.19         In May 2011, the minister directed that the gate review program be expanded to include all major projects at least annually.[26] The 2009–10 Major Projects Review recorded that 20 projects had been subjected to a gate review. A year later, approximately 50 projects had undergone gate reviews. According to Mr King:

Given the sheer volume of work involved in review of where the project is and where its risk register is and what risks it is mitigating, it would be impossible to do it more often than once a year.[27]

10.20         He was satisfied with how the gate review program was progressing and gave the example of the reviews that happen prior to contract negotiation where 'we actually go through the negotiation strategy and the trade-offs and so on'.[28] Reviews, which often take three to four days, may also take place at other times, depending on the results of the project risk assessment.[29] He did not want the quality of the gate reviews to deteriorate, explaining:

What is critical is that they get to the heart of issues, whether they are technical, commercial or resource. We have to be able to see what the real issues are and address them. We must not allow it to fall into just a process.[30]

10.21         The ANAO noted that it was expected to take a number of years for the results of gate reviews to flow through.

10.22         Dr Thomson commented on the gate reviews, noting that having people from the previous generation of project managers on the boards would undoubtedly add value to the process.[31] The committee was also told during its site visit to South Australia and Western Australia that gate reviews were a positive activity: that they allowed for contestability and provided an opportunity to bring a wider corporate view.

10.23         On 13 June 2012, six independent members of the gate reviews appeared before the committee and were unanimous in their views that the boards were 'not a tick and flick' exercise. Based on their experience, the reviews do a lot more than form part of Defence's assurance framework. They also help to improve performance through the guidance and advice provided informally and formally and through 'the dissemination of better practice with the line management as well as within projects'.[32] Mr Irving also referred to their mentoring role whereby project team members gain not only the skills but experience to help them better understand the business they are in.[33] The independent members warned, however, against unrealistic expectations—'gate reviews can help improve the prospects of success and help in formulating a workable project, but they cannot prevent things going wrong with projects'.[34]  

10.24         One witness closely involved in defence procurement argued that in order to ensure gate reviews are effective tools, three basic rules should be observed:

10.25         According to the witness, the limitation on gate reviews as a system across all defence capital projects is 'finding and engaging enough expert greybeards'—'hardened project people'—who have developed from 'a culture of accumulating domain knowledge and ongoing practice of tradecraft'.[36] While he supported the new arrangements relating to gate reviews, he did not agree with insisting that they 'be performed regularly on every project (rather than just troublesome ones)'. The witness also cautioned against attempting to do too many new gate reviews, which would likely 'take things backwards because there are insufficient deeply experienced people in Australia with the necessary time (and security clearances) to review 200 projects each year.' He took the view that regular mandated gate reviews would go too far, add to bureaucracy and divert scarce resources from better use. It would increase DMO staff numbers to more than needed without necessarily raising quality.[37] He explained that it would be:

better to use those scarce resources on the in-difficulty projects only, after an initial 'cut' or 'scan' of all projects likely to be facing difficulty. If resources are smeared too broadly, it will affect depth and quality which would make Gate Reviews of every project a 'make work, tick-the-box' exercise under the control of an army of generalists.[38]

10.26         The Australian Association for Maritime Affairs suggested having a 'Team B' approach to 'reduce the risk of project planning going off on politicians', senior officers' or even project team members' individual "frolics"'. By a 'Team B', it meant having 'a small team of suitably qualified personnel administratively outside the main project structure and tasked with shadowing and checking the assumptions and decisions of the primary project team at key stages in the project's development'.[39] The committee believes that gate reviews, if properly constituted and resourced, would be similar in function to a 'Team B'.

10.27         It should be noted that the ANAO recently published an audit report on the gate reviews which found overall that they were making a positive contribution to improving Defence's performance in acquiring major defence capital equipment. The ANAO examined in detail three gate reviews, which revealed some weaknesses in this quality assurance framework.

10.28         For example, the August 2011 gate review of the M113 Upgrade exposed limitations to this quality assurance mechanism. The ANAO noted that the project office sought to narrow the scope of the review. It explained:

IPPO informed the project office that [that] was not appropriate but, in the event, the available documentation suggests that the Gate Review focused on the schedule-related aspects of a Contract Change Proposal.[40]

10.29         This restriction on the focus of the gate review through documents prepared for it raises critical questions about the degree of influence that the project office is able to exert over the review board and the potential to weaken its independence.

10.30         Also, the committee notes that the Department of Finance and Administration's Guidance on the Gateway Review Process stipulates that members of a review team should not have worked on the project under review, except in the capacity of a gate reviewer at previous gates.[41] In this context, the ANAO found that the M113 gate review was chaired by the Division Head responsible for the project, and comprised one external member and three other DMO officers internal to the division in which the project resided, two of whom delegated their attendance to more junior staff.[42] Unfortunately this departure from accepted procedure was not an isolated incident. The ANAO referred to a significant number of gate reviews being chaired by individuals with line management responsibility for the project under review.[43]

10.31         Indeed, a very worrying trend noted in the ANAO's audit of gate review boards was the growing tendency for gate reviews to be chaired by a manager with a strong connection to the project. According to the ANAO this trend had increased in recent times:

During the first year DMO conducted Gate Reviews (July 2009–June 2010), 33 per cent of Gate Reviews of ACAT I and II projects were chaired by a manager with some responsibility or accountability for the project under review. During the second year (July 2010–June 2011) this increased to 42 per cent. During the first six months of IPPO's management of all Gate Reviews this increased further to 50 per cent.[44]

10.32         This practice of the chair of a gate review having some responsibility or accountability for a project contravenes the very principle underpinning such reviews. It is also in breach of DMO policy. The ANAO's findings underscore the importance of Defence ensuring that policies and guidance provided in manuals are implemented properly. In its audit report, the ANAO noted that the latest policy amendment, approved on 3 May 2012, strengthened the requirement for the independence of the chair.[45] In light of the committee's findings on non-compliance, this new policy amendment is clearly not enough (see chapter 6).

10.33         Dr Neumann cited the ASLAV enhancement project as an example of a gate review that missed a critical aspect of the project. He informed the committee that the technical risk advice provided to the board did not refer to the solution they were reviewing. The engineering and scientific advice before them said the solution was not high risk when in fact the one they were reviewing was risky.[46] He explained that the ASLAV project evolved from quite a simple pragmatic solution 'to a further stretch, and neither management nor documentation actually caught up to get the cross-correlation'.[47] The ANAO found that the 2010 gate review of the ASLAV placed a heavy focus on one particular aspect—schedule—which may have inhibited attention being given to other potential risks, such as technical risk.[48]

10.34         It is notable that the gate reviews for both the M113 and ASLAV focused on schedule and not technical risk. The two case studies not only highlight the limitations of gate reviews but underline the critical importance of having technical and subject matter experts involved and, importantly, listened to throughout an acquisition project. With regard to gate reviews, the ANAO also drew attention to:

...the absence of a mechanism to follow up action items, the risk of developing an over-optimistic perception that the Gate Review program will identify and address all problems, and the resource impact on project staff and senior management.[49]

10.35         The committee has referred to comments about the resource intensive aspect of gate reviews, especially securing the services of independent members with the necessary project background and experience. It has also noted the danger of relying too heavily on gate reviews to fix problems and of devaluing the whole concept of gate reviews. With regard to ensuring that the recommendations of the review boards are implemented, the committee endorses the ANAO's recommendation that 'defence ensures that a control mechanism is deployed to monitor the status and completion of actions recommended by Gate Review Assurance Boards and agreed by the relevant executive'.[50] 

Committee view

10.36         In the committee's view, gate review boards have the potential to provide the robust review and contestability necessary to reveal deficiencies in the procurement process and to identify potential problems. To be effective, the boards must, however, have the time, resources, authority and skills to be able to reassess and re-evaluate the soundness of the analysis and the assumptions underpinning the project and its progress to date.

10.37         The committee supports the addition of two external independent members to each gate review board but would stress that they must have the authority to ensure that their findings and recommendations carry weight. It also recognises, as noted by a witness, that a major difficulty would be securing the services of hardened project people to be part of a gate review team. Even so, the committee believes that the benefits would more than compensate for the effort required to locate and entice these experienced experts to join the boards.

10.38         Conscious of overdoing gate reviews to the extent that they lose their potency, the committee urges Defence to ensure that gate reviews become and remain powerful quality assurance measures. Compliance with policy such as maintaining the independence of the chair is also critical to the credibility and ultimate success of the reviews. The committee is also aware of unrealistic expectations and of the potential for Defence to rely too heavily on gate reviews to identify problems and their solutions. In this regard the committee stresses that gate reviews are intended to be part of a quality assurance framework and should not be relied on to compensate for shortcomings in project management. Defence must ensure that project management teams take full responsibility for the performance of their project.

10.39         In this report, the committee has referred to numerous instances of non-compliance with policy or guidelines. The gate review examples cited by the ANAO throw into sharp relief how genuine, sound initiatives can be rendered useless by a management structure that cannot or will not exert authority. Surely, this latest clear disregard of policy whereby the independence of gate review chairs was compromised justifies the committee's scepticism about the effectiveness of other recent initiatives such as project charters, MAAs, and the Project Initiation and Review Board.

10.40         The committee is aware, however, of the great value of gate reviews when used properly and, although disappointed in the shortcomings exposed in the ANAO report, believes that located within the appropriate management structure they have potential to become an effective quality assurance mechanism.

Independent advice and contestability

10.41         On 7 October 2011, Mr King stated that the acquisition process was now 'far more structured'. He explained that a capability manager's focus (or the organisation's) was on capability, schedule and costs in that order. In his view, capability managers were 'not always as well informed as they might be about the various risks of the projects' they want undertaken. According to Mr King, these potential blind spots are offset by the advice of the CEO DMO on cost, schedule and risks associated with that project and the contribution of the DSTO on technical risk.[51] Mr King stated that the DMO in a programmatic way, and the DSTO in a technical sense, 'play the role of devil's advocate'.[52] Similarly, the scrutiny by the Capability Investment and Resources Division is supposed to provide contestability by weighing up the risks to determine whether the project is the right one on which to spend that amount of money.[53] Mr King concluded:

In many ways, my experience is that the level of contestability, debate, analysis that goes on in project progression at the moment is greater than any time I know.[54]

10.42         In the following section, the committee considers DMO, DSTO and Capability Investment and Resources Division's provision of independent advice and the contribution they make to minimising the risk of 'things going wrong'.

Defence Materiel Organisation

10.43         DMO is responsible and accountable for developing military equipment cost and schedule estimates (including the associated risk assessments), and for developing and implementing the Acquisition and Support Implementation Strategy. It is also responsible for:

10.44         According to Defence's procurement handbook, the CEO DMO provides independent advice to government on the cost, schedule, risk and commercial aspects of Major System acquisitions.[56] Mr King explained:

Throughout the CDG management phase...DMO is inputting on cost, schedule, not technical risk but capability risk and programmatic risk. When we go to second pass, for example, I am still required by secretary, CDF and government to provide an independent assurance on cost, schedule and risk associated with the program.[57]

10.45         It should be noted that the White Paper suggested that the DMO 'must strengthen its capacity to provide independent advice on the cost, risk, schedule and acquisition strategies for major capital equipment'.[58] This observation is particularly significant as Mortimer made a similar recommendation. In its response to Mortimer, Defence agreed that CEO DMO must be in a position to provide advice to government on the cost, schedule, risk and commercial aspects of all major capital equipment acquisitions. It stated, however, that 'it would not be appropriate for DMO to make coordination comments on Defence cabinet submissions because, for procurement matters, DMO is intimately involved in preparing these submissions'.[59]

10.46         Defence's attitude contradicts the Mortimer principle and effectively negates the element of contestability which depends on independence for its effectiveness. The committee is uncertain about the meaning of DMO's independence in this context and how it can register any concerns it may have about an acquisition project.

Defence Science and Technology Organisation

10.47         One of DSTO's major responsibilities is to provide advice throughout the planning and development phases of defence acquisition programs including advice on all aspects of technical risk and risk mitigation strategies.[60] DSTO's Risk Assessment Handbook makes clear that the Chief Defence Scientist is responsible for providing independent advice to government on technical risk for all acquisition decisions.[61] As discussed in chapter 12, the Chief Defence Scientist is required to provide 'an independent assessment of the level of technical risk presented by a project at each Government consideration'. This assessment occurs primarily at first and second pass approval, but also when a submission is made to government seeking an amended project approval, such as a Real Cost Increase for a change in scope.[62]

10.48         Air Marshal Harvey stressed the importance of the independence of the Chief Defence Scientist's assessment. He informed the committee that DSTO's technical risk analysis and technical risk certification signed off by the Chief Defence Scientist is the independent advice to government.

10.49         In this regard, the committee recalls Pappas' observation about DSTO's advice not receiving the respect it deserves (see paragraphs 8.49–8.50). The committee also notes that many of the schedule delays in projects have not been due to underlying technology problems but integration and certification issues. In the light of DSTO evidence that its personnel have limited experience in these areas, the committee is concerned that CCDG does not appear to recognise this flaw in the process it has approved.

Capability Investment and Resources Division

10.50         The Capability Development Handbook states clearly that the Capability Investment and Resources Division (CIR Division) provides 'independent analysis and review of capability proposals and related costs, including the overall balance of investment in current and future capability, major investment proposals and priorities'. For example, the Cost Analysis Branch looks at the project's costings. Headed by the First Assistant Secretary Capability Investment and Resources, the division develops the draft ministerial and cabinet submission and presents it for DCC consideration.[63]

10.51         On a number of occasions, Air Marshal Harvey referred to the independence of the CIR Division. He informed the committee that the division provides independent scrutiny—'a complete, independent look at what is required'.[64] It considers submissions from the perspective of what government requires in terms of level of evidence to progress the submission and considers whether there are any gaps in the argument. According to Air Marshal Harvey, those in the division:

...are independent of the sponsor and of the capability manager, but their view is to make sure it is a sound case put forward to government and we have addressed all the issues that are required.[65]

10.52         General Hurley reinforced this view. He stated that the CIR Division 'subjects capability proposals to rigorous independent scrutiny, covering capability, scope, schedule, risk and cost'.[66] Furthermore, its advice 'comes as a separate submission' to the DCIC:

So you have the CDG's submission and then we have a separate view that asks the hard questions about issues that are in that document.[67]

10.53         It should be noted that Defence's supplementary submission stated that the Division 'does not contest for its own sake'. It is responsible for drafting initial, first and second pass cabinet submissions 'to support Government's consideration of a project'.[68]

10.54         In the previous chapter, the committee referred to the FDA as a mechanism for injecting contestability in the capability development process and providing independent advice on proposed major defence acquisitions. According to Air Marshal Harvey, who previously worked in the FDA, the CIR Division, which resides in the CDG, performs broadly the same tasks as the FDA. He noted in particular that the division provides that independent scrutiny. He recognised that contestability 'is good' and said:

We try to do that in a non-confrontational way.[69]

10.55         Air Marshal Harvey explained further that the CIR Division was 'actually larger now than when FDA finished', with about 73 people, and has 'a very strong independent voice'.[70] General Hurley also noted that the division is 'considerably larger, more comprehensive and more penetrating in its advice' and has a cost-analysis and assessment branch.[71] He emphasised that the CIR Division reports directly to the positions of CDF and Secretary.[72] Likewise, the Acting Secretary of Defence, Mr Simon Lewis, informed the committee that:

CIR Division is larger than the old FDA with a higher work rate and output. It analyses the projects from their inception and provides objective critique at each capability development milestone. This advice is provided independently of the Chief of the Capability Development Group. What is clear is that the work of CIR in producing agendum for these milestones has repeatedly tested assumptions and materially improved the quality of submissions going forward for Government consideration.[73]

10.56         Mr Lewis held that the CIR Division also performs a role in 'stewarding the proposal through Government consideration' which ensures that the 'contestability is grounded in the realities of obtaining central agency and Government approval'. He asserted that the level of central agency scrutiny of projects is 'far greater' than during the FDA days.[74]

10.57         Dr Brabin-Smith acknowledged that the CIR Division did some of the former FDA's work but that it did not have the broad remit of the old FDA. A number of witnesses were especially concerned that the old FDA's role of bringing rigour, which comes with scientific method, to the evaluation of Defence proposals was missing.[75]

10.58         It should also be noted that on 9 August 2011, the minister announced that the CIR Division's capacity to provide internal contestability would be strengthened by separating it from the CDG and having it report directly to the newly created position of Associate Secretary (Capability).[76] According to Air Marshal Harvey, the CIR Division would then sit parallel to CDG but underneath the Associate Secretary (Capability). He explained that the division's work 'would be in the same mode but to the associate secretary' rather than direct to the CCDG.[77]

10.59         Clearly, the minister saw the need to strengthen internal contestability by splitting the division from the CCDG and having it report to the Associate Secretary (Capability). As noted earlier, however, this appointment is no longer proceeding.[78] In this regard, it should be noted that Mr Simon Lewis informed the committee that refinements were underway to strengthen the CIR Division through processes including the Capability Development Improvement Program to further professionalise division staff.[79]

Central agencies

10.60         The Department of the Prime Minister and Cabinet, the Department of Finance and Deregulation and Treasury are known collectively as the central agencies. They play an important part in the consideration and approval of capability proposals and provide an additional level of scrutiny and advice from a whole-of-government perspective.

10.61         For example, the Department of Finance and Deregulation must agree on the detailed acquisition and operating costs and financial risk assessment for each first and second pass submission.[80] The department noted, however, that, to form an opinion on the veracity of cost and risk estimates, its staff rely heavily on the technical risk assessment provided by DSTO; the evidence made available to support the Defence cost model; and other Defence capability development documentation.

10.62         In August 2011, the minister announced that Defence would work with the three central agencies to ensure they play a greater role at earlier stages of significant projects and that their specialist advice on cost, risk and alignment with Government policy is an integral component of the recommendations made to Government.[81]

10.63         Air Marshal Harvey also referred to the role of the central agencies which, in his view, compared with the days of the FDA, 'have a much stronger role as well in terms of contestability, in terms of looking at our proposals as they go through'.[82] According to Air Marshal Harvey, the organisations were brought together 'to achieve contestability' but also to avoid the barriers between them as proposals are developed.[83] The current CCDG told the committee that CDG 'probably need to engage central agencies better than we have up to this point', particularly in terms of when they scrutinise cost models, so 'that we can actually understand the risk and have factored in the risks'.[84]

10.64         Dr Brabin-Smith argued that objective analysis and contestability are central to the processes of public sector decision-making and an integral part of the machinery of government. He noted that the Department of Finance and Deregulation provides such impartial analysis 'when it runs its not-always-friendly ruler over new policy proposals from other departments'. In his opinion, however, Finance's capacity to examine defence spending proposals was limited. He noted that 'the sheer volume of analysis would require a significant and specialised workforce', and besides, 'the more important arguments for defence spending are strategic in nature, and cannot readily be reduced to an economic (or surrogate economic) basis or comparison'.[85] Accepting these limitations, the committee supports the independent scrutiny that the Department of Finance and Deregulation provides.

Effectiveness of internal contestability

10.65         The committee notes the use of the terms 'independent advice' and 'independent scrutiny' in respect of the DMO, DSTO, CIR Division and the central agencies. The committee, however, questions the extent to which they are able to play the role of 'devil's advocate'. The committee raised these concerns at its hearing in June 2012 with Defence officials and conducted a roundtable discussion with defence analysts to consider how to strengthen contestability within the procurement decision-making process. Apart from Defence's assurances that these organisations and agencies provide independent review and advice, there was no concrete evidence that they do in practice expose proposals to rigorous scrutiny and questioning.

10.66         For example, Dr Brabin-Smith stated that the mechanisms for internal debate and resolution of issues were inadequate—that there needed to be a better mechanism for contestability which has 'to be seen to be working'.[86] Dr Davies similarly argued that Defence needs greater levels of internal contestability to help ensure higher levels of confidence in decision-making. Air Commodore (retired) Bushell supported the view that there is 'a real need for an independent review of capability development and acquisition decisions'.[87]

10.67         The committee also notes the role of external defence analysts as another means of strengthening contestability, yet the engagement of the various think tanks and industry tends to follow a decision, not inform it.

One Defence view 

10.68         A number of internal groups and committees also consider the progress of a project at key stages in its development: the Capability Development Stakeholder Group, the Options Review Committee (replaced by the Project Initiation and Review Board) and the Defence Capability Committee (DCC). The procurement handbook states that this system of higher level Defence committees 'is designed to provide a corporate view (i.e. "One Defence") on capability proposals before they are submitted to government for approval'. The outcome of these committees is a 'One Defence' proposal for government approval.[88] It should be noted that the Defence Capability Development Handbook makes clear that the DCC reviews the draft ministerial or cabinet submission to provide assurance that the proposal recommends capability options that are consistent with strategic guidance and government direction, and are viable, cost effective, and within scope and budget. It also makes sure that the submission presents a 'One Defence' view.[89] While government is entitled to, and indeed should receive, advice from the Secretary and CDF that is well considered, coherent and authoritative, it is important that their advice on procurement is complete and has been informed by all groups involved in developing the proposals, especially those with reservations.

10.69         One witness, however, questioned the management theme, 'One Defence', which was introduced in 2009. According to the witness, this approach:

...reduced diversity of views from senior leaders into just one view and advice to government...In such an environment, it is likely that risks may not always be identified or discussed and information can be inadequate. It certainly can affect the morale and commitment of many leaders.[90]

10.70         In his assessment, the 'One Defence' voice reduces accountability because senior executives and star-ranked line managers are able to blur the lines of responsibility as they defer to the 'One Defence' party line imposed upon them.[91] He concluded:

'One Defence' is arguably the problem. A diversity of views would be of more value.[92]

10.71         Dr Brabin-Smith was of the view that it was appropriate for Defence to speak with one voice, 'provided that what that one voice says has been arrived at via a process of thorough contestability and lots of frank and fearless advice, carefully listened to within Defence'.[93] Dr Davies agreed that Defence speaking with one voice was 'fine' if that voice 'communicates the degree of uncertainty, the credible dissenting positions within the department that were put forward during the discussion and the degree of risk that has been assessed'.[94] For him, the communication of the rigour of the process and the areas of uncertainty was the really important matter.[95]

10.72         As noted previously, Dr Black found that there were too many internal committees while contestability of advice within Defence had been diluted.[96] The committee understands that achieving the right balance between integration and contestability of views would make for a sound decision-making process. Evidence before the committee, however, suggests that the so-called independent role of DMO, DSTO and the CIR Division does not provide the contestability required to ensure that the 'One Defence' view presented to government represents robust and thoroughly debated proposals.


10.73         There is no doubt that Defence needs greater levels of contestability to ensure that assumptions are thoroughly tested and decisions well informed. The committee acknowledges that Defence has a quality assurance framework that is designed to provide internal contestability and external scrutiny. Yet problems such as mistaking a developmental project for a genuine off-the-shelf product indicate that this internal filter and the gate reviews have not worked as well as they should. Surely a robust risk process would, at some stage before second pass approval, have corrected false assumptions. The check and review management process, if implemented properly, should test the veracity of assumptions underpinning assessments on costs, schedule, technology, capability and workforce requirements. Clearly, to date, there have been significant breakdowns in this area.

10.74         A number of witnesses strongly supported Defence's revamped gate reviews, which are an improvement on their predecessors, especially the inclusion of two independent experts. The committee, however, does not want to see the contribution of gate reviews rendered ineffective because of a fundamentally flawed management structure. The committee underlines the importance of Defence ensuring that the members of the gate review boards have the relevant skills, knowledge and competencies to scrutinise the proposals before them effectively. The committee would like to see the independence of the external members guaranteed and their ability to provide genuine contestability assured. It would also like to see concrete measures taken to ensure that the implementation of recommendations made by the review boards is monitored, recorded and reported to the relevant capability manager, CCDG and CEO DMO.

10.75         While the committee understands that, if used properly, gate reviews are an important project assurance tool, it recognises that they have their limitations and should not be regarded or relied on by Defence to compensate for failings in its management structure. Gate reviews should be retained as part of a tighter, more streamlined acquisition process and an important quality assurance tool but they must adhere to the principles of objectivity and impartiality and bring the required specialist knowledge and experience to the review. 

10.76         In this regard, gate reviews should be overseen by an authority that can exert its independence and authority to ensure that the reviews remain at arm's length from the influence of those with a vested interest in the project under consideration. The contraventions identified by ANAO require Defence to look carefully at ways to safeguard the integrity of these reviews. A new policy amendment designed to strengthen the independence of the chair is not enough. The committee has demonstrated repeatedly that changes to policy and to manuals (process) do not always work.


10.77         The committee notes concern about the gate reviews losing their potency and simply becoming part of the process if overused. The committee believes an annual gate review for major projects would add value but recognises that the format and/or structure may need to be scaled to suit project scope/cost. The committee recommends that full gate reviews be:


10.78         In light of revelations about breaches of policy such as chairs of boards having line management responsibility and of misunderstandings stemming from the documentation provided to the gate review boards, the committee recommends further that the Independent Project Performance Office (IPPO):

To ensure that the IPPO has the authority and resources to discharge its functions, the committee further recommends that Defence consider carefully whether the functions of the Office should be located in CDG or another agency.


10.79         With regard to ensuring that the recommendations of the review boards are implemented, the committee endorses the ANAO's recommendation that 'Defence ensures that a control mechanism be deployed to monitor the status and completion of actions recommended by Gate Review Assurance Boards and agreed by the relevant executive'.[97]

10.80         The committee also noted the important role of bodies such as the DMO, DSTO, CIR Division and central agencies in providing independent advice. It was concerned, however, about Defence's interpretation of 'independent advice' and the ability of the various internal groups to probe behind the documentation presented to them and to test the underpinning analysis and assumptions. In light of the 'One Defence' view, the committee is concerned that their independent voices may not be heard. Considering that the gate reviews report to the CEO DMO, it is imperative that DMO's advice is contained in the submissions that go to cabinet. Otherwise, the benefit of both the gate reviews and DMO's advice and perspective may be lost.

10.81         In this regard, the committee is not convinced that the so-called independence of the DMO and DSTO is genuine and would like to see measures taken to strengthen that independence and their ability to register concerns in submissions to the DCIC and government. Despite Defence's assurances that the CIR Division provides greater scrutiny than the FDA, such assurances were not supported by tangible evidence. Moreover, given that a number of analysts were of a contrary view and supported the re-instatement of an FDA-type function, the committee is concerned that the concept of robust contestability and independent advice promoted by Defence has not been adequately demonstrated. The history of poor project performance suggests that DMO, DSTO, the CIR Division and the central agencies have not fulfilled the devil's advocate role effectively.


10.82         The committee recommends that the minister review, update and reinstate the Ministerial Directive to CEO DMO. The directive is intended to set boundaries and expectations and establish clear accountability for achievement of Defence capital acquisition programs. It should include the requirement that CEO DMO provides independent advice to the minister in DMO's specialist area of major capital projects.


10.83         The committee recommends that the government should again look carefully at making DMO a statutorily independent agency, as previously recommended by Kinnaird and Mortimer but rejected by Defence and government. The CEO's salary should be set by the Remuneration Tribunal and, as stipulated in the previous recommendation, direct access to the minister should be restored pursuant to re-instatement of a ministerial directive which has fallen into disuse. The intention behind this recommendation is to find a better way to guarantee DMO's independence and assist it to provide frank advice to government, have its functions and responsibilities spelt out in legislation, and allow it more latitude to employ specialist personnel.


10.84         The committee recommends that the minister consider how best to ensure that DSTO's specialist advice on technical risk associated with Defence's major capability developments is conveyed to government in a clear and accurate way. The Ministerial Directive to CEO DMO may serve as a model.


10.85         The committee recommends that the Technical Risk Assessments and Technical Risk Certifications (currently presented to the Defence Capability Committee and the Defence Capability and Investment Committee) should be a joint activity overseen by the relevant Service T&E agency head and the Chief Defence Scientist. In light of past underestimation of technical risk, the intention would be to review past experiences and current documentation to determine how risk assessments could be better presented to non-technical experts to minimise the opportunity for risk assessments to be misinterpreted. The reporting structure also needs to be transparent such that assessments cannot be ignored without justification to the key decision-makers (e.g. minister).

10.86         Another important consideration is whether the capability managers, members of the gate review boards and the various committees, and the relevant personnel in DMO, DSTO and CIR Division have the appropriate skills, experience, resources and corporate knowledge to conduct and interpret analysis and/or provide advice. The committee considers these matters in the following chapters.
