4. Evaluation and implementation of lessons learned

Committee conclusions and recommendations

4.1
The Commonwealth Grants Rules and Guidelines (CGRGs) outline that grants administration involves all granting processes and activities—programme design, grant selection process, development and administration of grant agreements and programme evaluation activities.1 The CGRGs also include, as one of the key principles for grants administration, an outcomes orientation.2 The Committee strongly supports the ANAO’s assertion that the timely and effective delivery of an evaluation strategy is important to achieving planned programme outcomes.3
4.2
Evaluation design should be considered as part of the design of a grant programme (this should be clearly included in guidance materials and expected in practice). Through the ANAO’s audit coverage in this area, the Auditor-General has observed that departments may develop evaluation frameworks quite late in the grant programme lifecycle (and sometimes, not at all).4 The Committee emphasises the need for departments to take a ‘strong early focus’ on formulating an evaluation strategy.5
4.3
The Committee recognises that departments, through well-planned and effective implementation of evaluation strategies, can also identify opportunities to improve programme design and implementation in a timely manner.6 In order for the full benefits to be realised, the Committee considers it important that departments develop implementation plans to ensure follow-through on evaluation findings and recommendations.
4.4
Outside formal programme evaluation frameworks, departments can capture considerable additional benefits by acting on learnings from their own experience, as well as that of others. As highlighted in this Inquiry, similar issues in public grants administration continue to arise across different granting departments, as well as within individual departments. The Committee stresses the importance of departments actively collecting, sharing and acting upon lessons learned, particularly as two departments that appeared before the Committee were found by the ANAO not to have made sufficient progress on the lessons learned from previous audits, including fully implementing past recommendations.
4.5
The Committee encourages departments to continue to establish and strengthen grants ‘communities of practice’, as well as develop other model guidance documents (which can be tailored to individual programmes across a department) to share and implement learnings. However, the Committee notes that while it is prudent to disseminate learnings from audits to relevant programme areas and staff across a department,7 it is not enough simply to do so. A supporting culture, led by the Executive cohort, along with complementary expertise, quality controls and monitoring at important points across a grant’s lifecycle, is required. Such arrangements would ensure that prior grants administration issues become a focus of monitoring across the grants lifecycle, so that an assessment can be made as to whether processes are on track or changes are required.
4.6
As outlined, the design and implementation of grant programmes are enhanced through an early focus on programme evaluation. Furthermore, the Committee notes that for programmes with short implementation timeframes, good governance (including up-front planning and risk management), coupled with realistic advice to Ministers, also mitigates the shortcomings that continue to arise in departments’ design and later implementation of grants programmes in this circumstance. This may be appropriately addressed within the framework of the partnership agreements referred to in Recommendation 1 of this report, but in addition, good administrative practice must require identification and communication of the risks associated with each programme. The administering area is responsible not only for identifying and recording the risks within its department’s processes, but also for making these clear to relevant decision-makers. While advice on these risks may not change implementation timeframe decisions, decisions are then made ‘consciously’, within the context of the evidence.8
4.7
In line with the evidence reviewed in this chapter, the Committee strongly endorses the four recommendations contained in the ANAO’s cross-portfolio report on the Delivery and Evaluation of Grant Programmes,9 which include that departments: provide more realistic advice to Ministers on programme delivery timeframes during new programme design; and develop implementation plans to follow through on evaluation findings and recommendations.
4.8
The Committee also wishes to stress the importance of granting departments implementing the lessons learned across all of the areas reviewed in this Inquiry. As such, the Committee strongly supports the seven recommendations in the three ANAO reports that were subject to a public hearing. The recommendations address continuing deficiencies in the approach and processes adopted by the respective departments to grants administration. The Committee acknowledges the respective departments’ progress against these recommendations, and recognises the current backdrop of the whole-of-government Grants Initiative. Within this context, full implementation is essential if continuing shortcomings are to be rectified.

Recommendation 3

4.9
The Committee recommends that the Department of the Environment and Energy, the Attorney-General’s Department and the Department of the Prime Minister and Cabinet respectively, report back to the Committee on progress against the Auditor-General’s recommendations from each of the following individual audit reports: Award of Funding under the 20 Million Trees Programme (No. 4 of 2016–17); The Design of, and Award of Funding under, the Living Safe Together Grants Programme (No. 12 of 2016–17); and Indigenous Advancement Strategy (No. 35 of 2016–17). The advice to the Committee should clearly indicate whether full implementation of the respective recommendations has been achieved or provide a plan of activities and date of expected achievement, and where applicable, any barriers to implementation.

Review of the evidence

4.10
This section reviews ANAO Report No. 25 (2015–16) Delivery and Evaluation of Grant Programmes, a cross-portfolio audit,10 as well as the evidence presented before the Committee on the development and implementation of evaluation strategies and the timely implementation of lessons learned by the:
Department of the Environment and Energy (DEE)—for the Award of Funding under the 20 Million Trees Programme (20 Million Trees);
Attorney-General’s Department (AGD)—for the Living Safe Together Grants Programme (LST); and
Department of the Prime Minister and Cabinet (PM&C)—for the Indigenous Advancement Strategy (IAS).
4.11
The evidence review of the IAS includes some broader public administration matters, in particular PM&C’s establishment and implementation of the IAS.

Delivery and Evaluation of Grant Programmes (cross-portfolio)

4.12
As part of the audit background, the ANAO acknowledged that performance audits of individual grant programmes have increasingly focussed on the processes by which funds have been awarded. In part, this focus was in response to the types of concerns raised by key stakeholders such as Parliamentarians and Parliamentary Committees. To balance coverage across the lifecycle of grants administration, the objective of the audit was to assess the effectiveness of three departments’:11
management of the delivery of projects awarded funding for three programmes where the ANAO had previously audited the award of grant funding; and
development and implementation of evaluation strategies for each of those programmes.12
4.13
Box 4.1 below outlines the key audit findings. The audit’s recommendations are provided in full in Appendix B of this report.

Box 4.1:   Key audit findings

Management of grant agreements—delivery, oversight, payment and reporting

Advice to government on the design of the three programmes had not identified timeframes ‘as being unrealistic’.13 Only the Energy Efficiency Information Grants programme was delivered in accordance with the planned timeframe.
While a sound overarching framework was in place for each department to monitor project progress and completion, the administration of this framework was inadequate, including delays in obtaining reports and, in some cases, reports not being obtained at all. Consequently, there were ‘inadequacies’ in each department’s oversight of progress and completion of the funded projects.14
Further, grant funds were paid significantly in advance of need for all three programmes. The approaches taken were ‘inconsistent with the obligation under the financial framework to make proper (efficient, effective, economical and ethical) use of resources’.15
During the audit, errors in DSS’s website reporting of grant funding locations remained, notwithstanding that DSS had previously accepted, and ‘purportedly implemented’, an ANAO recommendation that the department correct its approach to website reporting.16

Programme evaluation—conducting evaluation and evidence of desired behavioural change

Two of the three audited departments (DIIS and DSS) put evaluation plans in place. Further, contrary to advice from DIRD in response to an earlier ANAO audit on Liveable Cities,17 the ANAO found that while DIRD had conducted some work towards planning evaluation activity for Liveable Cities, ‘the plan was not finalised and no evaluation work was or is now planned to be undertaken’.18
While evaluation activities that were planned were undertaken, the departments did not give adequate attention to acting on evaluation findings and recommendations.19
The ANAO found that each of the three programmes ‘had limited success in demonstrably influencing the behaviour of entities other than the funding recipient’.20
4.14
As part of this Inquiry, DSS submitted to the Committee its progress against the ANAO’s recommendations.21 The department had agreed to three of the four recommendations, noting Recommendation No. 4.22 DSS informed the Committee that it has made ‘significant improvements’ to its grants administration. DSS highlighted that the Program Delivery Model (PDM), which is being used to build the administrative processes of the grants Hubs,23 is central to its approach to addressing the ANAO’s recommendations.

Award of Funding under the 20 Million Trees Programme–DEE

4.15
The ANAO’s audit coverage of DEE’s grants administration has been significant over recent years. Three previous audit reports identified a number of weaknesses in DEE’s grant assessment and selection processes. The audits were: Administration of the Biodiversity Fund Program (2014–15); Administration of the Strengthening Basin Communities Program (2013–14); and Administration of the Smart Grid, Smart City Program (2013–14). The respective recommendations were aimed at strengthening the department’s assessment and selection processes, including that the department:
… design clear eligibility criteria; establish clear eligibility requirements in grant guidelines; assess all eligibility criteria in all grants rounds conducted; retain sufficient documentation to evidence eligibility and merit assessments; and implement probity arrangements.24
4.16
The ANAO’s findings in the 20 Million Trees audit indicated that ‘limited progress has been made by the department to respond to earlier ANAO findings’.25
4.17
The Committee received evidence that the department’s Grants Administration Framework includes a Lessons Learned Register (as part of the framework’s Evaluation phase). Line areas are encouraged to ‘populate’ the register with issues and guidance for reference in administering future programmes.26 DEE advised, however, that the function of this register will need to be considered with reference to the department’s transition to the Business Grants Hub.27
4.18
The department was asked during the public Inquiry what assurances it could provide to the Committee that the ‘necessary improvements’ to its grants administration have been or are being made.28 The department acknowledged that there are some ‘common elements’ across the current audit and previous reports that are yet to be addressed, which has in turn ‘formed the basis’ of DEE’s response to the 20 Million Trees audit.29
4.19
The department advised that through its business improvement project it is seeking a more ‘integrated’ and ‘systemic’ approach to responding to audit findings. DEE submitted that the project identifies key improvements to ‘optimise the performance of program governance and risk management frameworks to better support the department’s program delivery’.30 The project deliverables are as follows:
guidance material, tools and an improved governing structure to support consistent, accountable and transparent decision making at all levels of program management
a renewed program board to govern and provide strategic oversight across all biodiversity-related programs
principles of proportionality to guide program design, project management, reporting and audits and a consistent framework to support the acquittal of grants going forward.31
4.20
The integration of this work across DEE is overseen by the Departmental Governance and Performance Committee.32
4.21
Since December 2015, there has been a Chief Risk Officer in the department. The CRO (who is embedded as a permanent adviser to the department’s Governance and Performance Committee)33 is overseeing and working closely with the relevant areas on the business improvement project.34 The CRO provides independent advice on the risks (current and future) faced by the department and is responsible for ensuring that all staff understand ‘how to effectively manage risk and that risk can be taken within sensible boundaries’.35 DEE advised that as part of the role, the CRO is also responsible for ensuring that ‘learnings from previous audit reports are going to be taken forward and shared across the whole department and into other business areas’.36
4.22
Consistent with DEE’s advice to the Committee, the ANAO commented during the public hearing that the department has implemented a focus at the ‘highest levels of the organisation’ on addressing its ongoing grants administration issues.37 The ANAO highlighted that the challenge for DEE will be to:
… keep pushing down into the organisation—the success—and managing the risks against some things that are repeatedly happening.38
4.23
During the public hearing, the department raised that machinery-of-government changes over the years have brought with them the challenge of achieving ‘consistency across principles and practices’.39 The Committee heard that one of the ANAO’s recent reports into machinery-of-government changes40 identifies the ‘integration of different business processes’ as a risk during such changes, which, along with different IT and HR systems, ‘creates challenges for agencies to make the integrations quick, smooth and effective’.41 However, the ANAO did highlight that the standardisation of administrative frameworks and rules assists in these circumstances. The work of central agencies, such as the Department of Finance, in establishing standardised processes and rules such as the CGRGs, ‘reduces the risk of difference’ during machinery-of-government changes.42
4.24
The measurement of one of the 20 Million Trees programme objectives—that is, 20 million trees and associated understorey by 202043—was raised during the public hearing, including the challenges associated with determining whether a tree has survived over several years into the future.44 The department advised that a tree is considered ‘a plant that will get to two metres high’ and that, for longer-term/larger grants, applicants are given three years to plant the trees (recognising the impact of seasonal variation over several years).45
4.25
DEE also outlined that through the programme guidelines it assesses applicants’ capability to maintain the trees planted. For example, if a Landcare group was involved, the department ‘would have a high degree of confidence’ that those trees planted would be successfully maintained. As part of the acquittal, or end of a project, the department also requires:
… evidence from the recipient that they have planted those trees; that might be visual evidence, or it might be a site visit.46
4.26
Further, the department submitted that successful 20 Million Trees applicants are required to detail their project plans in the Monitoring Evaluation Reporting and Improvement Tool (MERIT). Reporting via MERIT can include: the impact of the project; the biodiversity and natural heritage conservation activities delivered; and the collection of project-specific information about natural resource management.47
4.27
At the programme level, the 20 Million Trees Monitoring, Evaluation and Reporting Plan was developed in the initial programme design phase. The plan identifies the need for a mid-term and an end-of-term evaluation. DEE advised that as part of delivering on this commitment, a mid-programme performance update is currently being developed and due for release in 2017.48

Living Safe Together Grants Programme–AGD

4.28
Consistent with the Committee’s recommendation in Report No. 452,49 the ANAO included in the scope of the audit an assessment of whether AGD had applied the lessons it advised the Committee it had learned as part of its response to the ANAO’s audit of the Safer Streets50 programme.51
4.29
The ANAO concluded that:
Whilst some progress has been made, there remains considerable scope for improvements in AGD’s administration of grant programmes. This is reflected by the continuing deficiencies in AGD’s approach to assessing the eligibility and merit of grant applications, as well as in the advice to decision-makers about those applications that should be funded and those that should be rejected.52
4.30
Prior to the ANAO tabling the Safer Streets audit, and during the assessment of the LST applications, the Countering Violent Extremism (CVE) Branch Head was informed that the audit was ‘critical of many of the processes and/or procedures that were followed’ (and identified the senior executives within the department who could provide ‘further insights’).53
4.31
During the public hearing, AGD advised it had ‘sought to socialise some of the recommendations’ from the Safer Streets audit throughout the department, and advised the relevant people at a ‘high level’ about the issues identified.54 However, the draft report was not disseminated as AGD was of the understanding that ‘it was inappropriate to disseminate it broadly’.55
4.32
The ANAO acknowledged that there is a ‘balancing act for accountable authorities’ in managing the confidentiality provisions of the Auditor-General Act 1997 (breach of which carries quite serious consequences) and the sharing of audit findings, so that recommendations are implemented and the best and most timely response to the lessons identified occurs.56 The ANAO surmised that while AGD’s statements regarding improvements, such as implementing ‘communities of practice’ (refer to Chapter 2 for a discussion of AGD’s Grants Community of Practice57), accorded with the ANAO’s ‘observations’ about where departments are performing better, these practices were not evident at the time of the LST audit.58
4.33
Consequently, while the merit assessment work for LST was ‘strong’, the ANAO commented that there was a ‘similar failing’ across both the Safer Streets and LST programmes.59 Contrary to the design of both programmes, the merit assessment work was not used to ‘clearly identify’ the most meritorious applicants for funding—particularly when providing advice to Ministers for funding approval decisions.60
4.34
AGD’s approach to programme evaluation was an area of interest during the public hearing. The department advised that it developed the evaluation plan by considering the LST programme objectives and the principles in section 1061 of the CGRGs.62 AGD noted that the evaluation of the LST programme will be disseminated to senior departmental staff and to the Grants Community of Practice, with evaluation findings to inform the department’s design of future grants programmes.63
4.35
The department discussed that, in addition to the ‘acquittal-type evaluation’, another important aspect is to determine whether the programme’s ‘substantive outcomes’ have been achieved.64 AGD recognises that a national approach to evaluating CVE programmes is important, particularly as it is a new social policy area with corresponding government activity. The department submitted that:
As part of the COAG CVE Taskforce, a Research and Evaluation Working Group was established to develop a national CVE evaluation framework. The Australia-New Zealand Counter-Terrorism Committee’s CVE Sub-Committee (CVESC) is currently developing the National CVE Evaluation Framework and Guide (the Guide). This framework sets out best practice and provides guidance on how jurisdictions can build the evidence base for evaluating the effectiveness of CVE programs in a more consistent manner.65

Indigenous Advancement Strategy–PM&C

4.36
The ANAO’s audit into the IAS assessed whether PM&C had effectively established and implemented the strategy to achieve the Government’s outcomes.66 As part of the audit, and to form a conclusion against the audit objective, the ANAO examined the establishment and implementation of the IAS, as well as the performance framework, including evaluation.
4.37
During the public hearing, PM&C’s shortcomings in implementing its community engagement strategy were discussed, including its failure to keep records of meetings undertaken.67 The ANAO reported that while the department developed a consultation strategy, the approach outlined in the strategy was not fully implemented.68 Of the 82 public submissions the ANAO received from stakeholders during the audit, 27 per cent indicated that there had been a ‘lack of consultation between the department and the Indigenous communities’.69 Further, 12 per cent of the 114 applicants interviewed said that they had been consulted prior to the IAS commencing, while 69 per cent advised that they had not.70
4.38
In responding to this matter during the Inquiry, the department acknowledged that ‘with the benefit of hindsight, it is clear that implementation of the IAS would have been strengthened with more effective communication and engagement with stakeholders affected by the changes’.71 PM&C advised that since the audit, the department had distributed more than $2.1 billion to over 6800 projects and that at this stage the department ‘was not getting irritation from the ground about lack of consultation’.72
4.39
The department also contextualised the challenges of community engagement for the IAS, including key aspects of the IAS’ formation:
The IAS was the product of 150 programmes, from nine separate government departments, being collapsed into one overarching programme (and five sub-programmes).
PM&C inherited a regional network of around 600 staff, which at that stage was spread over approximately 110 locations in Australia, including remote areas.73
4.40
The department reiterated that while the IAS was forming, regional network staff were constantly engaged with Indigenous communities. Also, PM&C pointed to the approximately 80 stakeholder sessions74 it conducted around the country with organisations, communities and individuals in the lead-up to the grant round (during August and September 2014).75
4.41
The department further submitted that it ‘acknowledges the importance of co-design’ and as an example, informed the Committee that it recently met with representatives from the Building Better Lives for Ourselves programme and committed ‘to conducting a session with interested participants on the workings of the IAS Guidelines’.76
4.42
During the public hearing, PM&C was also asked to expand on part of its opening statement that ‘there were developments that arose that were beyond expectations and planning’.77 The discussion on this point included the following issues:
Management of risk.78
The department acknowledged that it was ‘not as prepared’ as it should have been for the number and complexity of the applications. Nearly 2400 organisations applied for funding.79 One application could contain multiple projects, for example ‘employment services, domestic violence services, a night patrol bus and economic development initiative’.80
The ‘contest’ between timeliness and thoroughness,81 noting that the period between the IAS’ approval and implementation (July 2014) was seven weeks.
The department acknowledged the challenge and noted that funding results were announced three months later than originally planned (March 2015, as opposed to December 2014).82 However, within that period PM&C ‘ensured transition arrangements were in place for every agreement’.83
4.43
The ANAO raised, in both its submission to the Inquiry and during the public hearing, the influence of ‘expedited timeframes’ on the ‘shortcomings’ in departments’ implementation of grants programmes.84 The ANAO highlighted the importance of good governance and planning in this circumstance, particularly ‘up-front planning and up-front risk management’,85 observing that:
Tight timeframes require that entities provide realistic advice to Ministers on program delivery risks when new grant programmes are being designed, as well as improving the identification and management of risk.86
4.44
In terms of the IAS, the audit identified that PM&C ‘did not advise the Minister of the risks associated with establishing the Strategy within a short timeframe’.87 During the public hearing, the department highlighted that implementing the IAS was a ‘big change for the market’ and ‘a big challenge’ for the department to deal with corporately.88 PM&C briefed the responsible Minister that:
… it would be a challenge [but did not] brief to say it was not possible… we indicated challenges. But we implemented in line with the time frame as required.89
4.45
The department went on to outline that, while it did not brief the government on detailed risks,90 PM&C identified 11 risks to the implementation of the IAS, and ‘implemented’ or ‘partially implemented’ 27 of the 30 risk treatments.91 The importance of full and frank advice to decision makers was noted by members of the Committee during the public hearing.92
4.46
With regard to the department’s approach to measuring performance for the IAS, the ANAO found that while the department established arrangements to monitor service providers’ performance, specified targets for providers were not in place. For example, 1100 projects are required to ‘report against a mandatory indicator for number and proportion of Indigenous people employed in delivery of the project’.93 However, for 832 of those projects, a target number or proportion of Indigenous people was not specified.94 Furthermore, the audit found that monitoring arrangements were not ‘always risk based’.95
4.47
The ANAO found that while the department drafted an evaluation strategy in 2014, it was not formalised. The draft evaluation strategy was presented to the Indigenous Affairs Reform Implementation Project Board. However, further refinements were requested. Subsequent work was deferred until the funding round results were collected. Consequently, in July 2015 the Program Management Board agreed to the establishment of an evaluation register. The audit outlined that the evaluation and analysis activities that ensued focussed on:
legacy program evaluations;
advice on the development of performance indicators; and
data collection to support the funding round assessment.96
4.48
The main constraints on the evaluation activities were the absence of a formalised strategy and of funding to implement it.
4.49
Throughout the Inquiry, PM&C was not able to point the Committee to an approved, overall evaluation framework, nearly three years into the $4.8 billion strategy. In May 2016, the Minister ‘agreed to an evaluation approach and a budget [$3.5 million (GST exclusive)] to conduct evaluations of Strategy projects in 2016–17’.97 The report further outlined that, as at May 2016, nine evaluation activities were underway or planned, and 13 activities were approved and funded for 2016–17. The activities included impact analysis, case studies, reviews, wellbeing studies, as well as outcomes measurement development and testing.98 In the public hearing, the department advised the Committee that ‘… there are two strands of this work: there is the performance monitoring of our day-to-day contracts, and there is the evaluation. We have made a significant investment on those two strands of work over the last 12 to 18 months’.99
4.50
The Committee was also informed that a key focus of the department is to:
… strengthen the IAS performance framework, our key performance indicators, performance monitoring and reporting processes, and program evaluation.100
4.51
Specific to evaluating the IAS, the department advised that a longer-term programme of evaluations is being developed. PM&C pointed to the Minister for Indigenous Affairs’ announcement in February 2017, committing $40 million over four years to strengthen IAS evaluation activities,101 as well as the Prime Minister’s announcement of a further $10 million for research into Indigenous policy and its implementation.102 Through this, PM&C will be:
… better able to assess where our investment needs to be focused in the future, and ensure the IAS continues to deliver positive outcomes for Indigenous Australians.103
4.52
At the public hearing, the department was asked to provide details on what formal processes were in place to apply lessons learned from the IAS—that is, informing staff of these lessons and supporting staff to apply those learnings to other programmes across PM&C.104 The department advised that it is ‘developing a full implementation plan around the findings of the audit’.105 Box 4.2 outlines a number of the specific activities the department cited as part of addressing and applying the lessons learned.

Box 4.2:   Applying the lessons from the IAS

Engaged with a specialist provider to assist in understanding and analysing the performance data it has captured for its ‘day-to-day’ contracts.
Developed an ‘activity recording framework’ for grants, so that categorisation of grants can occur.
Implemented the Assessment Management Office, which is a key part of the department’s response to the record keeping issues highlighted in the audit (further details on the office and its functions are provided in Chapter 2).
Revised the department’s grant operating model and assessment manual.
Developed a reporting solution for grants (on top of the existing grant system purchased from the Department of Social Services) for ‘better granularity’ in grants reporting.106
4.53
PM&C was also asked how it was sharing findings and best practice with other departments.107 PM&C advised that:
[it has] a formal relationship with the Department of Social Services. We purchased the grant system from them, so we have formal governance arrangements in place with them. We certainly have discussed our audit findings and the impact on our grant system.
There are at least four governance forums related to that hub work that every granting agency across the Commonwealth sits on. We certainly talk about the findings from individual audit reports and more general audit reports on grants and how that impacts upon the building of grant systems and the business processes in place for grants.108
4.54
The ANAO’s report made four recommendations. Two of these recommendations, Recommendation No. 1 and Recommendation No. 4,109 relate to the review of evidence in this chapter. The department agreed to both recommendations and, as part of this Inquiry, provided the Committee with an update on progress.110
Senator Dean Smith
Chair
16 August 2017

  • 1
    ANAO, Audit Report No. 25 (2015–16), Delivery and Evaluation of Grant Programmes, p. 15.
  • 2
    Department of Finance, Commonwealth Grants Rules and Guidelines (CGRGs), p. 25.
  • 3
    ANAO, Audit Report No. 25 (2015–16), p. 15.
  • 4
    ANAO, Submission 7, p. 2.
  • 5
    ANAO, Implementing Better Practice Grants Administration, p. 92; and ANAO, Audit Report No. 25 (2015–16), Delivery and Evaluation of Grant Programmes, p. 15.
  • 6
    ANAO, Audit Report No. 25 (2015–16), p. 15.
  • 7
    Noting the balance departments need to ensure when managing the confidentiality obligations of the Auditor-General Act 1997, and effectively sharing audit findings, prior to an audit’s publication.
  • 8
    Senator Dean Smith, Chair, Joint Committee of Public Accounts and Audit (JCPAA), Committee Hansard, Canberra, 24 March 2017, p. 8.
  • 9
    ANAO Report No. 25 (2015–16), Delivery and Evaluation of Grant Programmes, p. 9.
  • 10
    Cross-portfolio audits provide the ANAO with the opportunity to assess the same criteria across a number of different departments and programme areas. In this case, the audit resulted in four recommendations aimed at all Commonwealth departments, not just those that were subject to the audit. Note: this cross-portfolio report was not the subject of any Committee public hearings.
  • 11
    The three departments and their respective grant programmes are the: Department of Industry, Innovation and Science (DIIS)—Energy Efficiency Information Grants programme; Department of Social Services (DSS)—Supported Accommodation Innovation Fund; and Department of Infrastructure and Regional Development (DIRD)—Liveable Cities programme.
  • 12
    ANAO, Audit Report No. 25 (2015–16), p. 7.
  • 13
    ANAO, Audit Report No. 25 (2015–16), p. 18.
  • 14
    ANAO, Audit Report No. 25 (2015–16), p. 22.
  • 15
    ANAO, Audit Report No. 25 (2015–16), p. 25.
  • 16
    ANAO, Audit Report No. 41 (2012–13), The Award of Grants Under the Supported Accommodation Innovation Fund, Recommendation No. 3, p. 20.
  • 17
    ANAO, Audit Report No. 1 (2013–14), Design and Implementation of the Liveable Cities Program, p. 135.
  • 18
    ANAO, Audit Report No. 25 (2015–16), p. 32.
  • 19
    ANAO, Audit Report No. 25 (2015–16), p. 34.
  • 20
    ANAO, Audit Report No. 25 (2015–16), p. 37.
  • 21
    Refer to DSS, Submission 3, pp. 5–6, for the department’s progress against each recommendation.
  • 22
    ANAO, Audit Report No. 25 (2015–16), p. 11.
  • 23
    Refer to the Grants Overview Section in Chapter 1 for further details of the Program Delivery Model and grants Hubs.
  • 24
    Refer to ANAO Report No. 4 (2016–17), Award of Funding under the 20 Million Trees Programme, Table 1.3, p. 20.
  • 25
    ANAO, Report No. 4 (2016–17), p. 9.
  • 26
    DEE, Submission 2, p. 14.
  • 27
    DEE, Submission 2, p. 14.
  • 28
    Senator Dean Smith, JCPAA, Committee Hansard, Canberra, 24 March 2017, p. 13.
  • 29
    Mr Matthew Dadswell, Assistant Secretary, Department of the Environment and Energy (DEE), Committee Hansard, Canberra, 24 March 2017, p. 13.
  • 30
    DEE, Submission 2, p. 3.
  • 31
    DEE, Submission 2, p. 3.
  • 32
    DEE, Submission 2, p. 3.
  • 33
    The Governance and Performance Committee is accountable to the department’s Executive Board for a number of matters, including advice and oversight of governance and risk, as well as advice and oversight of evaluation and performance reporting. Refer to DEE, Submission 2.1, p. 3.
  • 34
    Ms Kylie Jonasson, First Assistant Secretary, Department of the Environment and Energy (DEE), Committee Hansard, Canberra, 24 March 2017, p. 13.
  • 35
    DEE, Submission 2.1, Answer to Question on Notice, p. 3.
  • 36
    Ms Jonasson, DEE, Committee Hansard, Canberra, 24 March 2017, p. 13.
  • 37
    Ms Rona Mellor, Acting Auditor-General, Australian National Audit Office (ANAO), Committee Hansard, p. 13.
  • 38
    Ms Mellor, ANAO, Committee Hansard, Canberra, 24 March 2017, p. 13.
  • 39
    Ms Jonasson, DEE, Committee Hansard, Canberra, 24 March 2017, p. 16.
  • 40
    ANAO, Report No. 3 (2016–17), Machinery of Government Changes.
  • 41
    Ms Mellor, ANAO, Committee Hansard, Canberra, 24 March 2017, p. 16.
  • 42
    Ms Mellor, ANAO, Committee Hansard, Canberra, 24 March 2017, pp. 16–17.
  • 43
    For the four programme objectives, refer to ANAO Report No. 4 (2016–17), Award of Funding under the 20 Million Trees Programme, p. 15.
  • 44
    Ms Madeleine King MP, Joint Committee of Public Accounts and Audit (JCPAA), Committee Hansard, Canberra, 24 March 2017, p. 17.
  • 45
    Mr Dadswell, DEE, Committee Hansard, Canberra, 24 March 2017, p. 17.
  • 46
    Mr Dadswell, DEE, Committee Hansard, Canberra, 24 March 2017, p. 17.
  • 47
    DEE, Submission 2, p. 11.
  • 48
    DEE, Submission 2, p. 12.
  • 49
    Joint Committee of Public Accounts and Audit, Report 452: Natural Disaster Recovery; Centrelink Telephone Services; and Safer Streets Program, December 2015, p. 67.
  • 50
    ANAO, Report No. 41 (2014–15), The Award of Funding under the Safer Streets Programme.
  • 51
    ANAO, Report No. 12 (2016–17), The Design of, and Award of Funding under, the Living Safe Together Grants Programme, p. 7.
  • 52
    ANAO, Report No. 12 (2016–17), p. 8.
  • 53
    ANAO, Report No. 12 (2016–17), p. 7.
  • 54
    Ms Katherine Jones, Deputy Secretary, Attorney-General’s Department (AGD), Committee Hansard, Canberra, 24 March 2017, p. 25.
  • 55
    Ms Jones, AGD, Committee Hansard, Canberra, 24 March 2017, p. 25.
  • 56
    Ms Mellor, ANAO, Committee Hansard, Canberra, 24 March 2017, p. 26.
  • 57
    Created in October 2015, the AGD’s Grants Community of Practice was implemented to improve the governance culture and provide a forum for best practice grants administration, including sharing knowledge from audit and evaluation findings. Refer to AGD, Submission 4, p. 7.
  • 58
    Ms Mellor, ANAO, Committee Hansard, Canberra, 24 March 2017, p. 26.
  • 59
    Mr Brian Boyd, Executive Director, Australian National Audit Office (ANAO), Committee Hansard, Canberra, 24 March 2017, p. 26.
  • 60
    Mr Boyd, ANAO, Committee Hansard, Canberra, 24 March 2017, p. 26.
  • 61
    CGRGs, Guidance on Key Principles—section 10: An Outcomes Orientation, pp. 25–27.
  • 62
    AGD, Submission 4, p. 7.
  • 63
    AGD, Submission 4, p. 7.
  • 64
    Ms Jones, AGD, Committee Hansard, Canberra, 24 March 2017, p. 27.
  • 65
    AGD, Submission 4, p. 6.
  • 66
    ANAO, Audit Report No. 35 (2016–17), Indigenous Advancement Strategy, pp. 15–17. The report outlines that the IAS’ outcome is ‘to improve results for Indigenous Australians, with a particular focus on the Government’s priorities’ including by ensuring that children go to school, adults go to work and Indigenous business is fostered.
  • 67
    ANAO, Audit Report No. 35 (2016–17), p. 23.
  • 68
    ANAO, Audit Report No. 35 (2016–17), p. 22.
  • 69
    ANAO, Audit Report No. 35 (2016–17), p. 23.
  • 70
    ANAO, Audit Report No. 35 (2016–17), p. 23.
  • 71
    PM&C Opening Statement, Committee Hansard, Canberra, 24 March 2017, p. 2.
  • 72
    Mr Andrew Tongue, Associate Secretary, Department of the Prime Minister and Cabinet (PM&C), Committee Hansard, Canberra, 24 March 2017, p. 4.
  • 73
    Mr Tongue, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 4.
  • 74
    Ms Susan Black, First Assistant Secretary, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 5.
  • 75
    ANAO, Audit Report No. 35 (2016–17), p. 23.
  • 76
    PM&C, Submission 1.1, p. 3.
  • 77
    PM&C Opening Statement, Committee Hansard, Canberra, 24 March 2017, p. 2.
  • 78
    Senator Dean Smith, JCPAA, Committee Hansard, Canberra, 24 March 2017, p. 6.
  • 79
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 6.
  • 80
    Mr Tongue, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 7.
  • 81
    Senator Dean Smith, JCPAA, Committee Hansard, Canberra, 24 March 2017, p. 6.
  • 82
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 6.
  • 83
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 6.
  • 84
    ANAO, Submission 7, p. 1.
  • 85
    Ms Mellor, ANAO, Committee Hansard, Canberra, 24 March 2017, p. 7.
  • 86
    ANAO, Submission 7, p. 1.
  • 87
    ANAO, Audit Report No. 35 (2016–17), p. 26.
  • 88
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 7.
  • 89
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 7.
  • 90
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 7.
  • 91
    ANAO, Audit Report No. 35 (2016–17), p. 9; and Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 7.
  • 92
    Committee Hansard, Canberra, 24 March 2017, pp. 7–8.
  • 93
    ANAO, Report No. 35 (2016–17), p. 54.
  • 94
    ANAO, Report No. 35 (2016–17), p. 54.
  • 95
    ANAO, Report No. 35 (2016–17), p. 53.
  • 96
    ANAO, Report No. 35 (2016–17), p. 64.
  • 97
    ANAO, Report No. 35 (2016–17), p. 63.
  • 98
    ANAO, Report No. 35 (2016–17), p. 64.
  • 99
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 9.
  • 100
    PM&C, Submission 1, p. 2.
  • 101
    Minister for Indigenous Affairs, $10m a year to strengthen IAS evaluation, 3 February 2017, <https://ministers.pmc.gov.au/scullion/2017/10m-year-strengthen-ias-evaluation> [accessed 1 June 2017].
  • 102
    PM&C, Submission 1, p. 3.
  • 103
    PM&C, Submission 1, p. 3.
  • 104
    Ms Nicolle Flint MP, Joint Committee of Public Accounts and Audit, Committee Hansard, Canberra, 24 March 2017, p. 9.
  • 105
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 10.
  • 106
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, pp. 9–10.
  • 107
    Ms Flint MP, JCPAA, Committee Hansard, Canberra, 24 March 2017, p. 10.
  • 108
    Ms Black, PM&C, Committee Hansard, Canberra, 24 March 2017, p. 10.
  • 109
    ANAO, Report No. 35 (2016–17), p. 11.
  • 110
    Refer to PM&C, Submission 1, pp. 2–3, for details on the department’s progress against the two recommendations.
