Chapter 4 Performance reporting for national funding agreements
This chapter examines the performance reporting requirements for national
funding agreements. Firstly, it explores the need for cultural change
across all levels of government to adapt to the reporting requirements of an
outcomes-focused framework. The chapter then looks at the factors that must be
considered in order to provide viable, relevant data. This leads to a
discussion of data quality and collection. Finally, the chapter examines some
of the improvements that are underway to address the issues identified.
A strong performance reporting framework is essential to provide
transparency, accountability and scrutiny of the Intergovernmental Agreement on
Federal Financial Relations (IGA FFR). These areas are covered further in Chapter 5.
Accountability and performance reporting
The IGA FFR is outcomes-based, promising a ‘rigorous focus on the
achievement of outcomes – that is, mutual agreement on what objectives, outcomes
and outputs improve the well-being of Australians’.
While an outcomes approach focuses on the achievement of objectives and
provides better value for money and flexibility, witnesses warned that it also
demands increased accountability. The Commonwealth Auditor‑General
advised the Committee that performance measurement was central to
accountability, explaining that parliament cannot be confident national funding
agreements are achieving the required outcomes without successful reporting:
Without clear, adequate and consistent reporting against
meaningful performance measures, the Parliament is constrained in its ability
to understand and assess how Commonwealth funding is contributing to the
achievement of value-for-money outcomes in areas covered by national funding
agreements.
The Council of Australian Governments Reform Council (CRC) told the
Committee that, to ensure accountability, a strong reporting framework was
necessary, including objectives, outcomes and performance indicators:
The [CRC] must be able to assess the jurisdictions’ progress
over time in the areas covered by the national agreements, and it therefore
must have access to adequate and reliable information and data to inform its
assessments.
The Committee heard that there are several issues that need to be addressed
in order to improve accountability and the reporting framework for national
funding agreements, including:
- cultural change;
- setting reporting objectives, including clearly defined outcomes and key
performance indicators;
- data quality;
- data collections; and
- meaningful interpretation of data.
Cultural change
Witnesses warned that achieving the benefits of an outcomes-based
framework requires significant cultural change at all levels of government. The
CRC identified the need for extensive cultural change, reminding the Committee
that the reform of federal financial relations ‘challenge[s] conventional
practices’ and will necessitate ‘[g]enuine cooperation, a commitment to
outcomes, respect for roles and responsibilities, and real accountability’.
The Committee asked whether or not government departments and agencies
had accepted the need for cultural change and made the necessary adjustments.
The Victorian Government maintained that the changes had been accepted and
embraced at ministerial and central agency level but admitted that there were
problems at line agency level:
The cultural challenge that we have is one where some people
in some of the relevant line agencies …are taking a while to absorb what is a
really marked conceptual shift. …The challenge that we have is persuading some
of those who for many years in line agencies and both levels of government have
been dealing with these very prescriptive SPPs to realise that the world has
changed.
The Committee asked what the barriers were to achieving this cultural
shift. Echoing the Victorian Government’s comments, the Queensland Government cautioned
that an inputs approach was entrenched in many government departments and this mind-set
would take some time to adjust:
It is simply a matter of getting people to change the way
they think. In this case what Commonwealth line agencies require from state
agencies under a particular agreement is an issue that can be resolved in time,
but it is very hard to change overnight people’s views and expectations about
what needs to be done under a particular agreement.
The Queensland Auditor-General suggested that it is more difficult to
measure outcomes than inputs, and that this explained the reluctance of
departments and agencies to change:
Everyone is comfortable with controlling inputs because you
can measure them; you can see them. People are less comfortable with outcomes
because they are a bit more difficult to describe and to measure.
The Productivity Commission (PC) elaborated on this concept, proposing
that agencies were not comfortable being held accountable for outcomes, which
are less easy to control than inputs:
Outcomes can be affected by a lot of external contextual
factors; an agency can say, ‘I don’t control the unemployment rate, and the
unemployment rate is actually a major factor in homelessness, so you can’t hold
me responsible for that high-level outcome.’
The Committee asked what steps are being taken by the various parties to
promote cultural change. The Queensland Government admitted that it is still
working through solutions to address this issue but told the Committee that it
is implementing a range of methods to encourage and support the necessary
cultural change:
We are looking at a few mechanisms, but mostly in the area of
what we can do in terms of training material and documentation. For instance,
we are looking at guidance, practitioner’s toolkits and that kind of material
at a fairly technical level.
The Department of the Prime Minister and Cabinet (PM&C) confirmed
that cultural change was a long-term goal that required ‘an ongoing educative
exercise’ and acknowledged that the federal departments responsible for implementing
the IGA FFR had ‘underestimated the amount of work that we would need to do to
change the culture’.
Setting reporting objectives
To lay the foundation for reliable performance reporting, clear reporting
objectives have to be established by precisely defining the outcomes required
for each individual national funding agreement. Once objectives are clear, key
performance indicators can be developed to measure and assess outcomes.
Clearly defined outcomes
The Commonwealth Auditor-General told the Committee that outcomes must
be clearly defined and that this required ‘clarity around the policy
objectives’ of national funding agreements. This advice was
reiterated by state Auditors-General and academics.
The Queensland Auditor‑General told the Committee:
…it is knowing what you want and what you want to use it for
which is important.
The Queensland Auditor-General went on to provide the example of the
Building the Education Revolution (BER), telling the Committee that lack of
clarity around policy outcomes hampered performance reporting for this national
funding agreement:
I am still bemused as to whether the stimulus package [BER]
was about spending the money quick or actually achieving some outcomes by way
of school buildings. If I look at the agreement, there is not a clear sense as
to should the money have just been spent and spent quickly,…or should it have
been spent well to actually deliver some buildings and some capacity.
While agreeing with the need to specify measurable outcomes, the
NSW Auditor-General cautioned that this can be a difficult task in itself.
He explained that outcomes must be neither too broad nor too specific:
…if the outcomes are too broad it becomes very difficult to
measure whether they have been effective. For example, if they say, ‘Here’s a
bucket of money – the outcome we want is better health,’ it is quite difficult
to measure whether that has been achieved. But if they say, ‘Here’s a bucket of
money – we want you to buy this number of syringes,’ that is pretty pointless as
well.
The Queensland Auditor-General identified another difficulty: given the
multijurisdictional nature of many national funding agreements, it is difficult
to articulate outcomes and specify measurements:
So there will be a number of factors that are impacting on
the achievement of the outcome, not all of which will be controlled by either
the state or the Commonwealth…
Key performance indicators
Effective performance measurement is facilitated by the development of
relevant key performance indicators (KPIs). The Commonwealth Auditor‑General
advised the Committee that the shift to outcomes measurement ‘requires
performance indicators that link directly to outcomes’.
Likewise, the CRC told the Committee that its ability to provide useful
performance reporting depends on developing relevant KPIs that are linked to
objectives and outcomes.
The Committee asked the PC what process is used to develop the KPIs. The
PC informed the Committee that the high level indicators were developed by the Council
of Australian Governments’ (COAG) working groups and endorsed by COAG.
These indicators were ‘fairly broad’ so the PC consulted with Ministerial
Council data subcommittees, PC review working groups, data providers and the
CRC to develop more specific indicators. The PC advised that the
indicators are continually revised on advice from the CRC and stakeholder
consultation is repeated with each revision.
Despite the attempts to align KPIs with outcomes and objectives,
witnesses repeatedly claimed that KPIs are inadequate. The Queensland Government
told the Committee that a number of KPIs are ‘not relevant’.
In its submission, the NSW Government argued that KPIs are often not well
connected to objectives and outcomes:
Performance indicators should be well connected to, and
provide comprehensive coverage of, the high level objectives and outcomes of
agreements. Currently this is not the case for many NAs and NPs.
Specifically, witnesses were highly critical of the profusion of KPIs. A
number of witnesses pointed to the National Healthcare Agreement as an example
of the unwieldy overuse of KPIs. The NSW Government told the Committee that
although 70 indicators may provide a useful overview of the health system, such
a number is neither sustainable nor meaningful. Further, there is
insufficient data available to measure the indicators and the CRC could only
‘meaningfully report against 25 of the 70 indicators’ in 2010.
The NSW Government concedes that data could be provided for more of the
indicators but warns that the cost of achieving such improvements ‘cannot be
justified’. Ultimately, the NSW Government argues for rationalisation of KPIs:
Fewer, more meaningful indicators across the spectrum of
agreements will facilitate a sharper focus on what really matters, and make it
easier for the public to understand the performance of their governments.
Another example, provided by the Queensland Government, is the National
Agreement for Indigenous Reform. This Agreement has 27 indicators, but the
Queensland Government has useful data for only 14 of the KPIs, and questions
whether it should set up programs to satisfy the other 13 indicators or
‘focus on the areas for which we do have data and time series data.’
The Treasury (Treasury) acknowledged that this is an area that needs
attention and indicated that the Heads of Treasuries (HoTs) Review has addressed
the issue. Treasury told the Committee that a balance of KPIs is required that
will ‘give the best analysis of the outcome you are trying to achieve whilst
keeping [the number of KPIs] to the minimum you need to do that’.
The Queensland Government, among others, drew the Committee’s attention
to the Conceptual Framework for Performance Reporting endorsed by COAG
in February 2011 which provides a general guide for developing KPIs and
addresses many of the issues raised by witnesses.
Data quality and collection
Witnesses repeatedly stressed the inadequacy of the data available to
assess performance under national funding agreements. The CRC highlighted its
frustration over this ongoing issue and told the Committee its ability to
‘meaningfully report on the achievement of outcomes year‑on‑year is
significantly constrained by the availability and quality of nationally
comparable data’. The CRC said bluntly
that ‘accountability and transparency fail’ without quality data.
National Disability Services (NDS), like many other witnesses, supported
the move to an outcomes focus but warned that currently available
data was inadequate to measure outcomes.
The Committee heard that a number of areas need to be addressed to
improve the availability and collection of data and ensure effective
performance reporting, including:
- timeliness;
- national comparability;
- longitudinal data;
- generic data collections; and
- meaningful interpretation of data.
In its submission, the CRC identified the timeliness of data as one of
the areas that required attention. This concern was shared
by others, including NDS, which provided the Committee with a relevant example of
the problems caused by time gaps in data collection. NDS informed the Committee
that the CRC had been forced to draw on data from the 2003 Survey of Disability,
Ageing and Carers (SDAC) for its latest report, which suggests that the
findings do not accurately reflect the current situation:
…a six-year gap between collections makes tracking the
effectiveness of the Agreement in relation to a range of key indicators,
including employment participation, almost impossible.
In his testimony to the Committee, the Chief Executive of NDS elaborated
on the distortion caused by the lack of adequate, timely data in this area:
I think the standout case there is figures on workforce
participation and employment. These are available at present only through one
source, which is the Survey of Disability, Aging and Carers, and that occurs
every six years. ... In that area where the government has such a strong focus
on increasing workforce participation, where workforce participation is such a
key driver of the economy, it seems to me ludicrous to be relying on figures in
this area that are six years out of date.
The Committee pursued this matter with the Australian Bureau of
Statistics (ABS) and was told that, due to policy changes in this area, the
SDAC will be expanded and will be run every three years.
Another issue of concern to the CRC is the availability of nationally
comparable data. The CRC explained to the
Committee that, while it supports the need for state and territory flexibility,
in the interests of transparency and accountability there is a need to be able
to compare data across governments. The challenge for the
CRC is that different governments set different targets to achieve the outcomes
of national funding agreements.
For example, the CRC told the Committee that, with regard to the
Literacy and Numeracy National Partnership, variations across jurisdictions
include:
- the proportion of participating schools and students, and the criteria for
selecting participating schools;
- the domains, year levels, size of student cohort, student characteristics and
sectors for measurement; and
- the calculation of targets and the methodologies for establishing baselines
and the total number of targets.
The Committee asked the ABS what steps have been taken to improve the
comparability of data across the nation. The ABS explained that the
Commonwealth processes had to be clarified before the issue could be taken up
with the states and territories. Now that those processes
are established at the Commonwealth level, the states and territories are being
engaged and attempts made to collect comparable data across jurisdictions:
There is a whole range of differences in the systems in
states and territories that we are sorting out in that process as well so that
we can get common measurement not so much in the way the services and systems
work differently but in what the outcome differences are.
The CRC also alerted the Committee to the need for data to be built up
over time to provide both meaningful comparisons across jurisdictions and
longitudinal comparisons of changes and trends, a point made by PM&C as
well. The CRC advised that, as it has been operating for nearly three years,
solid foundations have been put in place and this aspect of data collection is
improving.
Generic data collections
To streamline data collection and remove some of the burden being placed
on line departments and service delivery agencies, the Committee heard that it
would be useful if generic data collections could be developed to satisfy the
requirements of various reporting frameworks. Dr Baker, Chief Executive Officer
of NDS suggested that the PC principle ‘one report, many uses’ would
significantly relieve the reporting burden on smaller agencies:
The picture from the point of view of service providers is
that they are often feeding into multiple data collections, some of which
overlap and not all of which include meaningful data items. It seems to me that
there would be sense in…auditing that and producing a reduced but more
meaningful consolidated set which may have multiple purposes.
The Committee questioned if it was feasible to develop a uniform set of
KPIs that would satisfy the reporting requirements of different sectors.
Dr Baker was unsure but pointed to similar work being done to rationalise
quality compliance systems and suggested this may provide a model:
There are attempts at present to try to ... look at the different
quality systems to which organisations have to comply and do a cross-check. So,
if they have complied with one quality accreditation system, then they may have
complied, in effect, with 90 per cent of another requirement. The same
principle could apply to data as well. There will be some data required that is
distinct to one particular program or sector and there will be some that will
be common to many.
Meaningful interpretation of data to assess the quality of outcomes was
also of concern. The Committee heard that the focus on narrow or quantitative
data could make it difficult to assess the quality of the larger outcomes being
achieved. For example, NDS explained to the Committee that while public policy
outcomes stress the participation of people with a disability ‘in all domains
of life, not just the economy but civil society’ such an outcome is very
difficult to measure:
…our capacity to measure in a meaningful way what
participation in non-economic terms means is not easy.
The Committee pursued this issue with a number of witnesses and asked
what methods were being implemented to ensure meaningful interpretation of the
data collected. In particular, the Committee wanted to know what was being done
to supplement the facts and figures of quantitative data with more sophisticated
qualitative information about the quality of outcomes. PM&C recognised the
need to assess ‘quality of life’ outcomes but maintained that this could be
done without resorting to direct qualitative methods as Mr Hazlehurst, First
Assistant Secretary, explained:
It is in those spaces where the data challenges of course
become the most challenging. It is still data. I do not think we are likely to
end up in a situation where it would go as far as qualitative data in the form
of things like reports from focus groups. However, there is plenty of data that
is collected, including of course by the ABS, which is survey data which goes
to people’s level of satisfaction with either services that they have received,
their quality of life, the amenity of their local neighbourhood, their feeling
of empowerment in the workplace,…things like that where it is not about things
that you can see as easily, it is more about people’s perceptions.
While acknowledging the validity of collecting qualitative data,
PM&C informed the Committee that doing so is not cost effective:
The obvious thing to say about that is that those require
intensive, expensive, nationally collected surveys in order to come up with an
accurate representation of, if you like, the pattern of outcomes across the
country that one requires for administering a set of arrangements like this.
For a particular one-off purpose we are talking about national data collections
that have to be replicated over and over again…
The Committee questioned if the focus on facts and figures would blur
the distinction between outputs and outcomes and inhibit the cultural
transition needed to ensure the quality of the outcomes of the IGA FFR.
PM&C held that quantitative data was the only means of measuring many
outcomes and cited the example of reducing infant mortality where a figure
indicates the quality of the outcome.
To clarify the issue, PM&C used the example of the National
Education Agreement where the emphasis is on measuring student results:
The sorts of measures in there are not measures of how many students
receive X, Y and Z. …They are actually measures of achievement of outcomes by
those students. So…wherever possible, the desire has been to shift to those
measures that are not to do with counting the number of things that have been
delivered to people.
The Committee asked how to ensure that both qualitative and quantitative
results were being considered in the analysis of data to guarantee that quality
outcomes were being achieved. Treasury advised that the issue must be addressed
when KPIs are being formulated at the beginning of the process:
You would expect these qualitative issues to be considered in
the policy design up front and thought about in terms of the policy that is
formulated and then looking to get an appropriate expression of the outcomes of
that and then indicators to measure it.
The Committee’s questioning on qualitative results related to the
question of how to effectively assess the overall quality of outcomes in a
sector. The Committee also questioned whether sectoral assessments alone were
sufficient due to the large scale of the spending, and hence whether a more
integrated measure or limited set of high level indicators of national outcomes
might be needed. Members of the Committee asked the PC whether ‘development’ or
‘wellbeing’ indicators at a national level were being used to assess the
outcomes for national funding agreements—such as the ABS’s Measures of
Australia’s Progress program—and whether data used for national funding agreements was
being cross linked to these types of initiatives.
The PC told the Committee that it had not been requested to use this
type of complementary measure of welfare. Further, the PC representative made a
point of detail, saying that he was cautious of using indices due to the
difficulty of interpreting them:
I know how to interpret a specific indicator. I know what the
numerator and denominator were. I can work out the data quality issues and the
context and I can make an informed assessment. When you start putting an index
together, you have to be very careful because the debate then becomes about
what is included in the index, what you left out of the index and how you weighted
the different components of the index.
The Committee is aware that, as with the implementation process, the
performance reporting framework has been reviewed by the HoTs and the CRC and
acknowledges that these bodies have made a number of recommendations to address
the issues identified in this chapter. Additionally, witnesses drew the
Committee’s attention to a range of improvements that are underway, including:
- the Conceptual Framework for Performance Reporting;
- the whole-of-government data integration project;
- CRC reports and recommendations; and
- Steering Committee for the Review of Government Service Provision analysis
and reforms.
Conceptual Framework for Performance Reporting
The Queensland Government drew the Committee’s attention to the Conceptual
Framework for Performance Reporting endorsed by COAG in February 2011 which
provides a general guide for developing KPIs. The guidelines set out
clear steps to ensure KPIs are linked to objectives and outcomes, and are
meaningful, timely and comparable. The guidelines are to be
used in conjunction with the IGA FFR and the Federal Finances Circular 2010/01,
Developing National Partnerships.
Whole-of-government data integration project
The ABS advised the Committee that a whole-of-government data
integration project has been established to facilitate the collection and
comparability of data across the country. The project has been
initiated by portfolio secretaries and a governance board chaired by the
Australian Statistician has been set up.
The ABS told the Committee that the project will allow all information
held by Commonwealth agencies to be interrogated for statistical and research
purposes. However, the ABS assured
the Committee that steps have been taken to ensure transparency and
accountability, including legislative accountability:
It is high powered in terms of liberating the data for the
sorts of purposes that…this committee is interested in, but it also has very
strict controls around privacy and confidentiality. It can only be done for
statistical and research purposes.
COAG Reform Council reports and recommendations
The CRC is the key accountability body for COAG under the IGA FFR
and is tasked with reporting on performance for all National Agreements (NAs)
and National Partnerships (NPs). The Committee was told
that the CRC plays a ‘pivotal role’ in providing
transparency and accountability for the COAG reform agenda. The Queensland
Auditor-General spoke of the ‘rigour’ that the CRC is bringing to performance
reporting.
In its second annual report on the reform agenda, the CRC found that
overall performance reporting is improving and is simpler, standardised, and
more transparent. However, the CRC
continues to be concerned over data quality including ‘data availability,
comparability, timeliness, frequency of collection, accuracy and the ability to
disaggregate data’. The CRC has recommended
that these issues be addressed, particularly with regard to NPs.
Questioned on the seeming slowness of reform with regard to the performance
reporting framework and data quality and collection, the CRC cautioned that
reform in this area takes time. The CRC said that data
development is complex and expensive, explaining that service delivery data,
drawn from administrative data, is essential for monitoring NAs.
Administrative data is largely collected by state and territory governments and
changes and improvements involve considerable negotiation:
…[administrative data systems] were set in place at the state
level for the state government’s purposes or even for the service providers’
purposes, and you are trying to aggregate administrative data not only to, say,
the school system of a jurisdiction but then to jurisdictions across Australia
so that it is comparable. It takes a long time to agree how they are going to
define certain indicators and how they are going to collect the data. It
changes computer systems. It changes administrative systems.
The Business Council of Australia (BCA) was critical of COAG’s slow
response rate to CRC reports and recommendations.
The Committee asked the CRC if its recommendations were being responded to in a
timely manner by COAG. The CRC assured the Committee that COAG is taking action
on its recommendations. The CRC advised the
Committee that many of its recommendations have been simple reforms not
requiring COAG approval and have been directly taken up by data development
committees across jurisdictions.
The CRC informed the Committee that, in response to a range of
recommendations concerning the performance reporting framework from the CRC and
others, COAG had initiated the Heads of Treasuries Review (HoTs Review). More
recently, COAG has announced a review to specifically assess the performance
framework of each individual NA ‘to ensure progress is measured and all
jurisdictions are clearly accountable to the public and COAG for their efforts’.
The Conceptual Framework for Performance Reporting, mentioned earlier,
details the process to be followed for this review. The Committee asked what
timeframe was in place for this review and was told that the review will be
completed by the middle of 2012.
Treasury and PM&C updated the Committee on the
progress of this review. The performance framework for each agreement is being
examined to ensure that there are clear links between outcomes and indicators
and that indicators are sound:
…ensuring that there is no ambiguity in indicators in how
they relate to the outcomes so that from the lay person’s or public’s point of
view there is a clear understanding; if you have an indicator, if you see a
movement or change in the data over time, you know what that is trying to tell
you.
Treasury told the Committee that data is being reviewed to ensure its
veracity and frequency and that it is measurable over time.
Treasury advised that gaps in data are being identified and the opportunity
taken to assess whether or not such data can be collected cost effectively:
If there is an absence of data in an area, you need to look at
the benefit of the data being captured versus the cost of actually doing it.
In its submission to the Committee, the Secretariat for the Steering
Committee for the Review of Government Service Provision elaborated on this
principle:
- the benefits of new data collections or improvements to collections and
reporting must be reasonably expected to outweigh the associated costs to
service providers, data agencies, reporting agencies and agencies required to
respond to reports.
Steering Committee for the Review of Government Service Provision
Supported by a Secretariat within the Productivity Commission, the Steering
Committee for the Review of Government Service Provision provides support to
the CRC through a range of reports. Specifically, under the
IGA FFR, the Steering Committee collects and collates the performance data for all
NAs, and a number of NPs, for the CRC. The Secretariat advised
the Committee that the Steering Committee also has a role in assessing the
quality of data collected for performance reporting:
…the data providers provide a data quality statement
according to the Australian Bureau of Statistics’ data quality framework. The
Steering Committee then summarises that information and adds some of its own
commentary to that in what are called ‘comments on data quality’.
The Secretariat is optimistic that there is overall improvement in data
systems including in the quality, availability and timeliness of data.
The Secretariat identified both the HoTs Review and the current review of the
performance framework of each individual NA as positive steps and told the
Committee that a lot of work is being done at ground level to improve data
quality and collection.
The Committee asked the Steering Committee Secretariat how advanced the
improvement in data collection was. The Secretariat explained that improvement
varies across the different areas covered by the NAs, depending on the specific
problems associated with different types of data. For example, the National
Indigenous Reform Agreement and the National Disability Agreement present
inherent difficulties that will take some time to resolve. On the other hand, data
quality and collection for the National Healthcare Agreement has made ‘rapid
progress’ due to a concerted effort across jurisdictions:
That has come about through system changes at the
jurisdiction level, where jurisdictions are doing things differently, and
through significant changes by the main collector or manager of the health
data, which is the Australian Institute of Health and Welfare. They have done a
good job of making more data available more quickly.
The Committee asked if the work on improving data quality and collection
is being undertaken formally and coordinated between the Commonwealth and
state/territory governments. The Secretariat advised that the ABS and the
Australian Institute of Health and Welfare, both national data agencies, were
responsible for the data improvement process.
The Committee asked if there was a timeframe for completion of the
improvement process and was told that the timeframe could vary depending on the
type of work that was needed. For example, the Secretariat explained that the
issues regarding data on homelessness for the National Affordable Housing
Agreement presented conceptual problems that would need academic research:
…developing a new methodology for counting the homeless is an
academic piece of work and it is taking academic time frames to be resolved. It
is quite a difficult conceptual issue and you want it done right.
Performance reporting is crucial to understanding the success or
otherwise of COAG priorities and the IGA FFR itself. However, through the
inquiry the Committee found that better performance reporting will require
additional effort and sustained focus. For this to become a reality, there
will be a need to:

- drive cultural change;
- clearly set objectives and outcomes, and develop valid key performance indicators;
- collect data of a higher quality more quickly, while streamlining administrative burdens; and
- ensure meaningful interpretation of the data that is collected.
Each of the above improvement points is discussed below.
Driving cultural change towards full adoption and implementation of the
principles in the IGA FFR is critical to realising the potential benefits it
promises. Achieving deep-seated cultural change will require additional effort
within the Australian Public Service (APS) and across the different levels of
government. It will also require time and concerted effort. The Committee
recognises that the IGA FFR is part of a broader shift in public sector
management, with an emphasis on outcomes and enhanced accountability. Commitment
will be needed to overcome entrenched practices which do not accommodate the
fundamental principles of this new perspective on public administration.
The Committee is satisfied that the cultural change required is well
understood within central agencies and that cultural change is underway.
However, the evidence suggests that personnel on the ground in line departments
and service delivery agencies have not yet grasped, or at least have not
fully embraced, the consequences of the change.
While the Committee accepts that cultural change will take time to
filter down through the various layers of the bureaucracy, it believes that
positive steps can be taken to encourage and support such change. The Committee
notes that some steps are being taken with the development of Federal Finances
Circulars and the Conceptual Framework for Performance Reporting. However,
more can still be done.
The delivery of training to all staff involved in IGA FFR processes and
performance reporting is a key additional step towards the cultural change
required.
The Committee therefore recommends that the Department of the Prime Minister
and Cabinet (PM&C) and the central agencies implement a structured approach to
ensure that all relevant staff receive specific
training to enhance their understanding of the framework and develop the skills
required to meet performance reporting requirements. In addition to dedicated
training, the Committee recommends that relevant broader APS training be
amended to incorporate information on the IGA FFR. For example, training on the
Financial Management and Accountability Act 1997 and general public
sector administration courses should reference and explain the importance of
the IGA FFR and its principles. This is considered important to raise
awareness of the IGA FFR to APS staff generally.
The Committee also recommends that the Commonwealth works to ensure that
other jurisdictions implement a similar approach to training. Up-skilling only
APS staff is insufficient; to realise the full benefits inherent in the IGA FFR,
state and territory officials will need equivalent skills and buy-in.
The Committee considers it essential that clear objectives and outcomes for
national funding agreements be negotiated, agreed and documented – and that
these are supported by valid key performance indicators. The Committee believes
that the apparent lack of clarity surrounding outcomes for different agreements
is seriously undermining the principles of the IGA FFR. If outcomes cannot be
satisfactorily agreed and articulated in a way that provides sufficient clarity
to all parties, this suggests that the practicality of the principles of the
IGA FFR may need to be reconsidered. However, this should not be
interpreted to mean that every aspect of an agreement needs to be quantified or
linked to a numerical KPI for the agreement to be meaningful. Agreement and
clarity of outcomes is about mutual understanding of the end goal, not whether
the end goal can be perfectly broken down into a long list of numerical KPIs.
In this regard, the Committee stresses the need to ensure that serious
consideration is given to the relevance of KPIs, and whether these KPIs are
supported by existing data collections. The Committee notes that the Conceptual
Framework for Performance Reporting is being used to streamline KPIs with
the aim of ensuring that KPIs are measurable, relevant and directly related to
outcomes. The Committee understands that this review is well underway and that
it will rationalise and simplify KPIs. The Committee considers that this
exercise is critical to address the concerns raised over the proliferation and
meaningfulness of indicators.
Despite a potentially improved set of KPIs, the Committee remains concerned
at the ongoing problems with data quality and collection. In many instances
data needs to be collected, compiled and analysed more quickly than is
currently the case. In instances where data is not currently collected, consideration
must be given to the allocation of funds to support its collection. If there is
no data available to measure a KPI and collecting the necessary data would not
be cost effective, the KPI should be removed or amended.
Regarding the reporting burden, national funding agreements should seek
to streamline reporting requirements and consolidate data collections wherever
possible. In essence, the goal should be easier data collection with fewer
‘survey’ forms. Although the Committee is aware that various attempts are
underway to improve data collection approaches, it was not clear that enough
work had been done towards the ‘single report to multiple agencies’ ideal, or towards
compiling core data sets for key national priorities.
The benefits of moving towards the ‘single report to multiple agencies’
ideal are obvious, including minimising the need to reconfigure and repackage
the data collected for each respective reporting requirement. The Committee also
sees potential benefit in developing a core set of standard data requirements for
all reporting which includes key areas of national interest such as Indigenous
affairs and the provision of services to low socio-economic status members of
the community.
The Committee therefore recommends that PM&C, Finance and Treasury report
back to the Committee on work undertaken to move towards the ‘single report to
multiple agencies’ ideal and the potential to develop a core set of standard data
requirements for key areas of national interest.
In this regard the Committee is particularly interested in the
development of the whole‑of‑government data integration project and
urges all jurisdictions to take whatever steps are necessary to ensure its
success.
It is not sufficient simply to collect data; it is also necessary to ensure
meaningful interpretation by decision makers and the community. The Committee is
concerned that the proliferation of KPIs comes at the expense of higher-level
measures or summary indicators that allow decision makers to meaningfully gauge
ultimate outcomes. Despite some suggestions that sectoral or national-level
indicators (either single or multiple) are complex to compile, the Committee
considers it important that a tangible set of indicators is available. The
Committee therefore supports moves towards better links between national
agreement reporting and complementary measures of national outcomes such as
ABS’s Measuring Australia’s Progress initiative, even if further research is
required to achieve this result.
Overall the Committee has identified a range of potential improvements
to the performance reporting framework under the IGA FFR. The Committee
acknowledges that there are many initiatives underway towards improvement, but
is keen to see more action to bring these to fruition and ensure full
implementation. The Committee notes that the CRC has already identified many of
the problems examined in this chapter, but remains concerned at the slow
response by COAG to the CRC's reports and recommendations. The Committee
recommends that COAG take steps to respond to the reports and recommendations
of the CRC in a more timely manner.
The Committee recommends that a structured approach be
developed and implemented by the Department of the Prime Minister and Cabinet
and other central agencies to ensure relevant staff receive specific training
to enhance understanding of the Intergovernmental Agreement on Federal Financial
Relations and develop the skills required to meet outcomes-focused performance
reporting requirements.
The Committee recommends that the Department of the Prime
Minister and Cabinet, in consultation with other central agencies, establish
processes to ensure that there is clarity about the outcomes to be achieved and
that these are clearly reflected in national funding agreements. The Committee
asserts that to underpin the achievement of outcomes, mutual understanding of
the end goal must drive the cultural change, the training and skill
development, and the quality and timeliness of data collection and
publication. At all times, outcomes should be the focus in the development of
all national agreements.
The Committee recommends that the Department of the Prime
Minister and Cabinet, in collaboration with agencies such as the Australian
Public Service Commission, should lead a process to provide training across
the broader Australian Public Service which incorporates information on the Intergovernmental
Agreement on Federal Financial Relations to explain the importance of the Agreement
and its principles.
The Committee recommends that the Commonwealth works through
the Council of Australian Governments to ensure that states and territories
develop and implement a similarly structured approach to foster cultural
change throughout departments and agencies and ensure all staff receive
relevant training to enhance understanding of the framework and develop the
skills required to meet outcomes-focused performance reporting requirements.
The Committee recommends that the Department of the Prime
Minister and Cabinet and central agencies report back to the Committee within
six months on work undertaken to move towards the ‘single report to multiple
agencies’ ideal and the potential to develop a core set of standard data
requirements for key areas of national interest.
The Committee recommends that the Prime Minister, through the
Council of Australian Governments, take steps to respond to the reports and
recommendations of the Council of Australian Governments Reform Council in a
more timely manner.