Chapter 13 - The evaluation of Australia's public diplomacy programs
13.1
DFAT's submission states that it 'delivers quality PD programs which
provide Australian taxpayers with value-for-money and compare well with the
activities of countries with much larger PD budgets'.[1]
In this chapter, the committee examines the mechanisms DFAT uses to gauge the
success or otherwise of its public diplomacy programs.
The department's public diplomacy objectives
13.2
The department's public diplomacy programs are intended to promote 'an
accurate and contemporary view of Australia', manage or rebut negative or
inaccurate perceptions, and build goodwill.[2]
It is against these objectives that DFAT measures the effectiveness of its
public diplomacy.
Tools for evaluating public diplomacy programs
13.3
DFAT's submission identified the various tools it uses to monitor its public
diplomacy programs. They include:
- annual reporting of departmental public diplomacy programs (Senior
Executive Service reviewed);
- exit interviews with participants in the International Media
Visits and International Cultural Visits programs;
- monthly summaries of local press reportage, compiled by IAB; and
- 'modest' opinion surveys to judge the wider impact of public
diplomacy activities.[3]
The annual report
13.4
DFAT regards its annual report as a key accountability instrument that
provides the information necessary to assess its performance in areas such as
public diplomacy. Dr Strahan said:
I know you have said it is not going to be a best seller and
that people will not read these documents closely, but it is a very important
way that we communicate in a formal way with the parliament and the Australian
people to set out what we are doing.[4]
13.5
The committee uses a few examples from the 2005–2006 Annual Report to
illustrate the type of information it provides and how this assists in
assessing DFAT's success in delivering its public diplomacy programs.
Bilateral Councils
13.6
The committee looks first at the section on the foundations, councils
and institutes (FCIs) using the Council for Australian–Arab Relations as an
example. In its Annual Report, DFAT recorded that the Council:
...continued its work to broaden awareness and understanding
between Australia and the Arab world, to promote a greater understanding of
mutual foreign policy interests, and to encourage activities that lead to
mutual economic benefit and promote Australia's image in the Arab world.[5]
13.7
It then listed activities including the launch of a teachers' resource
kit for use in schools in the United Arab Emirates, Qatar and Kuwait; support
provided to a visit to Australia by two Saudi Arabian health officials through
the Young Professionals Exchange Program; and the provision of seed funding to
assist in the development of Deakin University's Arabic Online Learning
program.[6]
13.8
This style of reporting on the activities of the Council is a template used
for the nine FCIs—there is a general mission statement about broadening and
deepening people-to-people links followed by a list of activities which
includes conferences, exhibitions, visits, exchange programs and scholarships designed
to meet these objectives.[7]
Overseas posts
13.9
The annual reporting on public diplomacy activities by the overseas posts
also relied heavily on listing public diplomacy activities such as briefings, the
placement of articles in 'influential newspapers', seminars and conferences without
providing an indication of the extent to which they achieved their objectives.[8]
For example, the report contained information on the joint Indonesia-Australia public
information campaign on illegal fishing. The report stated clearly that the
exercise was designed to 'ensure wide understanding of the issues involved, as
well as an outreach campaign to fishing villages explaining the risks
associated with illegal fishing in Australian waters'.[9]
Although the intention of the project was clear, there was no information on
the effectiveness of the campaign—did it reach members of the target audience,
did they listen to and learn from the message and did it change their views or
actions?
Special visits programs
13.10
The same tendency merely to describe and list activities is also evident
in the section reporting on the visitors' programs. The annual report stated
that the Special Visits Program is the department's 'premier visits program'. It
maintains that the program is carefully targeted and brings to Australia
influential or potentially influential people for meetings and engagements with
Australian government, business and community interests. It lists some of the
26 visits. There is an assumption, but no indication, that these visits were
effective in promoting an accurate and positive perception of Australia.[10]
13.11
It should be noted that there are a few exceptions in the annual report where
the information goes beyond listing or describing an activity to demonstrating
how the activity contributed to the department's public diplomacy objectives.
For example, among the many visits held under the International Visits Program,
DFAT's annual report notes that four senior defence journalists from Malaysia, Thailand,
Cambodia and Indonesia observed the 'Pacific Protector 06'
counter-proliferation exercise managed by the Department of Defence. It records
that subsequent reporting by these journalists provided 'informed coverage of Australia's
contribution to regional security'.[11]
13.12
On its reading of the annual report, the committee found that generally
it provided a comprehensive overview of DFAT's public diplomacy activities. It did
not, however, provide the type of information that would allow the committee to
obtain an insight into the effectiveness of DFAT's public diplomacy programs.
Observations on the annual report
13.13
Witnesses to the inquiry reported the same difficulties in trying to
gain an understanding of the success or otherwise of DFAT's public diplomacy
programs from the annual report. The International Public Affairs Network was
of the view that neither DFAT's submission to the inquiry nor its annual
reports 'contain data to validate the department’s claims or fully analyse its
performance under this reference'.[12]
It stated:
DFAT reporting on its public diplomacy is dominated by lists of
activities rather than outcomes. The emphasis is on activity with no evidence
of evaluation or validation of the impact on target audiences. Many activities
listed are merely attempts to project traditional diplomacy in public. For
example, the DFAT Annual Report 2005-2006 highlights in its overview of public
diplomacy ‘the launch of the Asia–Pacific Partnership on Clean Development and
Climate in Sydney in January 2006, the inaugural ministerial meeting of the
Trilateral Strategic Dialogue in Sydney in March 2006, the launch of a
Government paper on weapons of mass destruction counter-proliferation in
October 2005 and ongoing negotiations for bilateral free trade agreements'.[13]
13.14
Mr Trevor Wilson stated that he had 'pored over all the DFAT annual
reports that you can access on the website' and found 'very little, almost no,
attempt to measure outcomes in public diplomacy, rather than outputs'.[14]
Jacob Townsend also referred to DFAT's method of reporting with its tendency
to 'focus on outputs rather than outcomes in measuring the effectiveness of
public diplomacy activities'.[15]
He used the number of visitors to an Australian cultural exhibition overseas as
an example:
...the real aim or objective of public diplomacy activities is to
shift those visitors’ opinions. The output of a visitor attending might
actually be in direct opposition to the outcome of counter-terrorism. For example,
that visitor could take a dislike to Australia on the basis of what is in the
program...The point is that you need to measure the outcomes, not the outputs. As
far as I have seen, for example, DFAT measurements of public diplomacy activities
are very much on outputs and not outcomes, and that is something to definitely consider.[16]
13.15
Mr Prakash Mirchandani agreed that the main confusion arises from mixing
up outputs and outcomes. He suggested:
There may be the most frenzied activity involved on Australia’s
behalf, with an impressive amount of funding attached to it...yet if all this
does not lead to defined outcomes, it results in really just a ‘feel good’
relationship alone, which is not what we believe public diplomacy is all about.[17]
13.16
Mr Peter White, Executive Director, ANAO, noted that portfolio budget
statements and the annual report make assertions that DFAT has an effective
public diplomacy program. He explained that ANAO's first question to them would
be to 'demonstrate to us how you do that'.[18]
13.17
RMIT expressed similar concerns about the tendency of DFAT's Annual
Report to describe activities which provide a 'snapshot with little discussion
of overarching objectives, no review of progress over time'. It also drew
attention to the limited scope of reporting on public diplomacy.[19]
RMIT stated:
DFAT’s report on public diplomacy activity in 2005-2006 notes a
number of successful initiatives, but confines itself almost entirely to the
activities undertaken by DFAT and through Australian government posts abroad.
This suggests a relatively narrow approach to public diplomacy, with little
inter-agency activity or partnership. There is little discussion of what the
goals of public diplomacy might be, outside reference to supporting the
specific policy goals of government; thereby reducing it to a relatively minor
subset of official diplomacy.[20]
Committee view
13.18
Based on the committee's reading of DFAT's Annual Report and the
comments by a number of witnesses, the committee finds that DFAT's Annual
Report does not provide the information required to actually measure the
effectiveness of its public diplomacy programs. In most cases, the report lists
and describes activities without providing any indication of the direct
outcomes from these activities. There appears to be an untested assumption that
these activities produce positive outcomes. There is no indication in the
Annual Report that DFAT measures the immediate effect of its public diplomacy
programs or the long-term contribution they make to the department's foreign
policy objectives.
13.19
The committee noted previously in chapter 8 that there are many
government departments and agencies involved in public diplomacy activities,
often in partnership with DFAT. DFAT's Annual Report, as observed by RMIT, does
not encompass the broad range of Australia's public diplomacy activities. There
appears to be no reporting or coordinating mechanism that captures all of these
activities, and no overall monitoring of Australia's public diplomacy as a
whole.
13.20
DFAT also informed the committee that it uses a range of other methods,
including internal reviews of public diplomacy activities and surveys, to
evaluate the effectiveness of its public diplomacy programs.
Continuing dialogue and self-assessment on performance
13.21
In response to a direct question about how the department evaluates the
effectiveness of its public diplomacy programs, Dr Strahan said that a lot of DFAT's
evaluation takes place internally. In his view, it was important for an organisation
to be self-critical. He placed great emphasis on the frequent exchanges between
people engaged in public diplomacy activities within DFAT and across
departments and agencies as a means of monitoring and assessing the
effectiveness of programs. According to Dr Strahan, their external evaluation
is 'integrated with the very nature of the work itself and it is incumbent upon
us to always have that dynamic conversation with our partners'. Conversations
take place with the posts and within the department in Canberra.[21]
He explained:
... the very nature of our work means that we are in constant dialogue
with a whole range of other organisations. We run an IDC with other federal
agencies twice per year. That is one communication channel. We are constantly
talking to all of these people. We sit around an enormous table with 30 or so
organisations talking to us about what we are doing and how we can connect with
them. The councils and foundations are a very good example of how we reach out
to external entities, because the boards of all of the foundations, councils
and institutes involve people from outside the department. They are usually
eminent people from a variety of business, academic and cultural fields, so
there we are building in outside opinion, outside ideas and outside evaluation.[22]
13.22
Dr Strahan gave the example of a brainstorming session in 2006 where 'we
all stopped and tried again to get a handle on public diplomacy from a holistic
point of view'. Another session is anticipated in 2007. According to Dr Strahan,
the discussion at these sessions 'feeds directly into our senior executive and
will come back into how we run our work when the senior executive communicates
back with us'. He noted how this process ensures that there is a continuing
conversation about public diplomacy and how resources should be allocated to
it.[23]
He then turned to overseas posts and the measures in place to evaluate their activities:
Our posts are required every year to comment on and report on
the effectiveness of their programs. They have to give us quite concrete
material about what kinds of public diplomacy activities they have implemented,
why and what the results were. We use a variety of measures to try to judge the
effectiveness of those programs, from monitoring local press coverage through
to the direct responses of particular participants in our visit programs and
other activities. One of the functions of my branch is to provide our senior
executive with quarterly assessments of the effectiveness of our programs. We
have an inbuilt cycle of doing this as a matter of our daily work.
...each post has a public diplomacy plan and it has a post plan
for the activities of the embassy in general that will contain a series of
benchmarks and outcomes which should be achieved. Then at the end of each year
there is reporting against those benchmarks and outcomes. Then there is
critical assessment back here in Canberra of the extent to which embassies are
meeting those outcomes. That is then also done at the divisional level here in Canberra.
My area has a series of outcomes which we should be striving to meet and we
have to report against those. And we have just gone through a mid-term phase of
that general reporting process where we will inform the senior executive of the
major issues on our agenda and the major achievements and identify the
challenges which lie ahead.[24]
13.23
It is clear that DFAT has a strong communication network which
facilitates discussion on public diplomacy programs and allows close monitoring
of these activities. However, it is not clear whether the reporting regime and
subsequent discussions within the department also constitute 'critical
assessment' as claimed by Dr Strahan.
Self-assessment as an appropriate way to evaluate public diplomacy programs
13.24
Mr Kirk Coningham was of the view that 'a self-assessment is never a
quality assessment'.[25]
He asked:
Every post has a public diplomacy plan, so who critically
evaluates them? Are they efficient? What do they achieve? Who establishes what
the objectives are? What are the broad international objectives that we want to
achieve? [26]
13.25
Mr John Meert, Group Executive Director, ANAO, believed that DFAT has a
responsibility to assess its public diplomacy programs and that it is
appropriate for DFAT to conduct internal evaluations. He stated plainly that
the normal accountability rests with the agency, adding that it is very
important that it does so because 'that is how you are going to drive
improvements'.[27]
In his view, if an agency is asserting that their program is effective, there
is an expectation that it has 'mechanisms in place to measure that
effectiveness'.[28]
Applying this approach to DFAT, Mr Peter White, ANAO, said that ANAO would
expect DFAT to measure 'the immediate impact of somewhat specific programs'. ANAO
would also want to be satisfied that DFAT:
...have a program in place that measures the long-term changes in
attitude in particular countries; whether DFAT get independent feedback on
their work; and whether they measure the attitudes of target countries. With
the performance indicators we would be trying to see whether they were
adequate, whether or not they set targets.[29]
13.26
Mr Trevor Wilson also had no concerns about the appropriateness of DFAT
conducting in-house evaluation of their activities. He did, however, question
the usefulness of their measurements. As mentioned previously, he could find no
evidence that they focus on outcomes.[30]
Jacob Townsend also noted DFAT's self-assessment, but was concerned that it was
'monitoring mostly outputs, not necessarily matching the strategy to the
outcome of an activity'.[31]
13.27
Indeed, many witnesses disagreed with DFAT's view that its public
diplomacy programs are evaluated.[32]
RMIT was not aware of any systematic evaluation of the effectiveness of current
public diplomacy programs and activities in achieving the objectives of
government.[33]
Media Gurus referred to an absence of quantitative and qualitative surveys.[34]
Asialink also noted that it:
...had difficulty sourcing credible qualitative or quantitative
research on the impact of public diplomacy initiatives. Whilst public opinion
surveys are increasingly becoming available from Australian and international
sources, there is insufficient investment in studying the effectiveness of
alternative public diplomacy strategies and interventions. Such investment
would assist both government and partner agencies in decision making and
resource allocation.[35]
13.28
Ms Jennifer McGregor, Asialink, informed the committee that the absence
of evaluation of public diplomacy activities had 'long been a frustration' for
them. In her view, although public diplomacy was a soft science, hard data in
this area was needed.[36]
Difficulties evaluating public diplomacy
13.29
Most witnesses agreed with the view that there were difficulties in
accurately and systematically evaluating the success or otherwise of a public
diplomacy program.[37]
Mr Greg Nance from the Sports Commission told the committee, 'it is still early
days with sport for development to be able to monitor in hard numbers what it
is that you are doing, because the outcomes are, by definition, longer term and
a little bit different'.[38]
Ms Sara Cowan, DEST, highlighted the same difficulty. She noted that public
diplomacy is not the primary objective for the department but occurs as a
consequence of their work.[39]
13.30
Mr Freeman noted that public relations practitioners had been grappling
for years with ways to measure the effectiveness of public diplomacy
activities. He accepted that there were no easy formulas; that there was a lot
of theory behind the evaluation of public diplomacy and there were many
options. He was not aware of any one foolproof, effective accounting mechanism that
could determine whether a particular result had managed to achieve value for
money. He added, however:
That does not mean that there are not plenty of signposts and
plenty of ways that we can make various assessments...you could certainly use
size of audience, the kind of media coverage you might have been able to
influence, the number of third-party influences you might have brought onboard
and convinced to support your point of view in the host country, and so on. A
lot of this tends to be statistical and anecdotal.
The real dilemma comes when you try to measure the extent to
which you have changed behaviour or thoughts or attitudes. Frankly, even when
you can demonstrate that an attitude has been changed, it is not always easy to
make a direct causal link between what you have been doing and the actual
change. There are often lots of factors at play.[40]
13.31
Even though he believed that it was difficult and sometimes impossible
to draw a causal link between a public diplomacy activity and changes in
perception, he suggested that efforts to measure the effectiveness of public
diplomacy were worthwhile and that there were some good and sensible ways to
measure effects.[41]
13.32
Mr John Meert, ANAO, agreed that from an audit perspective, public diplomacy
is a difficult subject because it deals with 'something which is not
necessarily tangible'.[42]
In his view, however, there was the danger that because evaluation of public
diplomacy was thought to be too difficult it would be deferred.[43]
He argued that 'you have to try to come up with a range of measures that at
least assist you'.[44]
He went on to say that the ANAO would expect agencies to measure performance in
this area given the amount of money spent on public diplomacy.
13.33
Mr Trevor Wilson agreed with the view that evaluation was not easy but
that it could be done.[45]
Dr Yusaku Horiuchi, a political scientist and applied statistician teaching
research methodology at the ANU, endorsed this view. He challenged what he
termed the 'dominant view' that it is difficult if not impossible 'to measure
the impact of public diplomacy'.[46]
He outlined to the committee three ways to measure the effects of public
diplomacy.
13.34
The first, borrowed from a method used in market research and social and
political psychology, is called a 'randomised experiment'. Under this
method, a large group of people is randomly divided into at least two groups,
which participate in a traditional paper-and-pencil survey or in computer-based
polling. One group is exposed to information intended to influence their
opinions on Australia and the other is not. According to Dr Horiuchi, after
this information stimulus is given, the groups can be asked a set of questions
about perceptions, attitudes and images and, if there is a significant
difference between the groups, then conclusions can be drawn about the
influence of the information on the recipient group.[47]
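The logic of such a randomised experiment can be sketched as a simple two-group comparison. The sketch below is illustrative only: the ratings are invented, and Welch's t statistic stands in for whatever significance test an actual survey analysis would apply.

```python
import math
import statistics

# Hypothetical 5-point ratings of Australia from two randomly assigned
# groups: the 'treated' group saw the information stimulus, the control
# group did not. These numbers are not drawn from any actual survey.
treated = [4, 5, 3, 4, 4, 5, 4, 3, 5, 4]
control = [3, 3, 2, 4, 3, 3, 2, 3, 4, 3]

def welch_t(a, b):
    """Welch's t statistic for a difference in group means."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

effect = statistics.mean(treated) - statistics.mean(control)
t = welch_t(treated, control)
print(f"estimated effect of stimulus: {effect:.2f} points (t = {t:.2f})")
```

If the t statistic is large enough to rule out chance, the difference in mean ratings can be attributed to the stimulus, because random assignment makes the two groups comparable in every other respect.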
13.35
The second method is called propensity score matching and is similar to
the randomised experiment. The third method identified by Dr Horiuchi involved
measuring the effects of high-level visits. It is based on a statistical
comparison of attitudes measured with a visit against attitudes measured
without one. Any discernible difference can then be attributed to the
visit.[48]
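A crude sketch of the matching idea behind these methods: each respondent exposed to a high-level visit is paired with the most similar unexposed respondent, and the remaining gap in attitudes is read as the visit's effect. Everything here is hypothetical, including the single 'similarity score', which stands in for a fitted propensity score.

```python
# Hypothetical data: (similarity score, attitude rating) pairs for
# respondents exposed to a high-level visit and for those not exposed.
exposed = [(0.8, 4), (0.6, 5), (0.4, 3)]
unexposed = [(0.75, 3), (0.55, 4), (0.35, 3), (0.2, 2)]

def matched_effect(treated, pool):
    """Average outcome gap between each treated unit and its closest match."""
    gaps = []
    for score, outcome in treated:
        # Nearest-neighbour match on the similarity score.
        match = min(pool, key=lambda u: abs(u[0] - score))
        gaps.append(outcome - match[1])
    return sum(gaps) / len(gaps)

print(f"matched estimate of visit effect: {matched_effect(exposed, unexposed):+.2f}")
# prints: matched estimate of visit effect: +0.67
```

The point of matching on a score is to compare like with like, so that any remaining difference in attitudes is more plausibly due to the visit than to pre-existing differences between the groups.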
Surveys and polls
13.36
Market research methods are an important tool that can be used to
measure changes in behaviour or attitudes. The committee discussed surveys and
opinion polls in chapter 6 as part of its consideration of how well Australia
understands its target audiences. In that regard, the committee observed that
surveys undertaken by DFAT over the past decade were few in number, conducted
on an ad hoc basis and without any long-term objective. An absence of
this type of research means that DFAT does not have benchmarks against which to
measure shifts or changes in attitudes or behaviour toward Australia.
13.37
While overall Australia's public diplomacy programs lack independent and
systematic evaluation, some agencies engaged in Australia's public
diplomacy do conduct tighter evaluation of their programs using methods
such as surveys. It is interesting to note that the agencies that take a
serious approach to evaluating their programs have a clear focus and a strong
economic interest.
Tourism Australia
13.38
Tourism Australia relies heavily on market research and public relations-style
evaluation to formulate its marketing strategies. Mr Cameron-Smith, Manager,
International Operations, Tourism Australia, explained:
We actually run brand tracking surveys...We have an independent
survey that goes to those markets and surveys a sample group of people. There
is campaign recall which asks: ‘What did you do? After seeing that campaign did
you call a travel agent, did you get some literature or did you go online and
are you intending to visit?’ Those measures are then compiled into a summary to
assess the effectiveness of the campaign. That then helps us to be more effective
in terms of our media buy and working with our agency on adapting creative
[campaigns].[49]
13.39
Tourism Australia also uses technology to identify behaviour related to
their internet website in order to determine 'what has been looked for and what
has not been looked for, and adapt the content accordingly'.[50]
Invest Australia
13.40
Invest Australia informed the committee that it has engaged public
relations firms in key markets—France, Germany, the UK and the US. They are
engaged 'to generate positive media coverage about Australia as an investment
destination in targeted markets and to improve knowledge and awareness of its
strengths and advantages in these regions'.[51]
It asserted that each public relations team is performing well and cited as an
example 56 recorded instances of positive media coverage globally in a 7-month
period. Invest Australia was able to report that 'based on previous performance
evaluations, the Return on Investment of PR activity is expected to be a
minimum of 150% of the contract value'.[52]
13.41
When asked how Invest Australia measures the effectiveness of its public
diplomacy expenditure of $1.95 million last year and $2.5 million for financial
year 2006–07, the CEO of Invest Australia, Mr Barry Jones, replied it was one
of the more difficult areas to measure in terms of direct impact. He explained:
Invest Australia’s primary measure in terms of our impact—our
outcomes—is the number of investment projects that we assist to bring to
fruition every year. The ultimate aim, clearly, is to increase investment into Australia
and the ultimate measure is the number of investment projects that are
announced as going ahead in Australia because we contributed in some way. The public
diplomacy efforts and the kinds of awareness raising contributes to investors
becoming aware of Australia in the first place and leading on through the
process of bringing an investment project to fruition, but it is sometimes very
difficult to measure a direct connection, if you like, between that initial
awareness raising [and the eventual investments].[53]
13.42
Invest Australia benchmarks its achievements mainly in the traditional media
sense, measuring exposure in international media and looking at things such as
the number of hits on its website, which 'are partly as a result of people
becoming aware of our website through our advertising'.[54]
The Council on Australia Latin
America Relations
13.43
Mr Wheelahan, of the Council on Australia Latin America Relations, informed
the committee that the Council has a business focus. He explained that it
evaluates its performance against the strategic plan and the business plan,
which are specifically designed to align with the objectives of Austrade. It
sets benchmarks which, according to Mr Wheelahan are 'fairly arbitrary targets
for the increase in Latin American students studying in Australia, simply as a
why-not'. He explained further:
We have established a group of key performance indicators of our
own, as businesses do, and certainly they have been far exceeded. The
universities, TAFEs and ELICOS centres have been far more successful in selling
Australia to Latin American students than we had anticipated. We set
ourselves an objective when we kicked off to get six flights a week through Auckland
to Santiago. We have got there. I will concede it has more to do with Geoff Dixon
[CEO of Qantas] than it has to do with us, but we have kept pressure on them
every inch of the way. We have set objectives of making student, tourist and
business visas from all of the Latin countries much easier to obtain.[55]
13.44
In short, he stated that the Council's measures are 'business measures'.
The key performance indicators include matters such as the numbers of tourists,
numbers of students, numbers of businesses setting up offices in Latin America
and numbers of Australian exporters dealing with Latin America.[56]
13.45
In commenting on the value of audits, Mr Wheelahan noted that they are
expensive and time consuming and care has to be taken to ensure that the
auditor does not simply give you the information you want.[57]
Even so, he acknowledged that business does it 'all the time'.[58]
Radio Australia and Australia Network
13.46
Radio Australia and Australia Network also use surveys and track trends
in behaviour to gauge their success in attracting audiences and, in some
cases, to gain an insight into attitudes toward Australia. Mr Jean-Gabriel Manguy,
Radio Australia, Australian Broadcasting Corporation, explained they use
audience surveys, which they purchase from the bigger players such as the BBC
and the Americans. He explained:
We buy the figures off them and that gives us a sense of how
effective we are in some countries. It is not possible in all countries. In
some countries like China and Vietnam, where that information is controlled, it
is not easy to get figures. In such countries we have other ways of measuring
whether we are successful or not. The internet is a new platform that is a very
good indicator for us. Our accesses last year totalled 18 million to Radio Australia’s
website and half of these come from China. You can see that the Chinese may not
be writing much anymore but they are accessing the website, and that is a new
indicator for us.[59]
13.47
He also cited measures they use in Indonesia where 30 local stations
rebroadcast Radio Australia daily. He noted that there are, during such
sessions, about 100 to 150 calls from listeners and SMSs from listeners to
those stations. He stated:
Clearly there is an interest from the audience to get in touch
and link up with us. For me, it is a new way of broadcasting and I would argue
that it is a very effective way to reach broader audiences in places such as Indonesia.
That is a useful indicator for us. The fact that some of the stations want to
relay us indicates that for them it makes good sense to carry our content
because it is good and credible with their audience.[60]
13.48
As noted previously, DFAT does not appear to use these types of research
tools—surveys, focus groups, questionnaires—in any systematic way that has long-term
objectives. The survey conducted in the Philippines in 1998, mentioned in
chapter 6, shows the potential to measure performance but the failure to follow
up on this activity suggests that DFAT does not employ these evaluation tools as
part of a rigorous and critical self-assessment of its performance in public
diplomacy.[61]
Proposals to improve evaluation
13.49
A number of witnesses put forward proposals for improving the evaluation
process of Australia's public diplomacy programs. Mr Meert noted that 'a lot of
the agencies are stuck at the activity measure' and 'struggling with how to
determine effectiveness'.[62]
He said:
It is easy to measure activity because you can say that X amount
of money was spent on an advertising campaign. It is the next step that, it
seems to me, most countries are struggling with.[63]
13.50
He suggested that a range of indicators are needed to ascertain whether
the activities being undertaken are 'having the desired effect'. He noted that
there are methods available to measure changes in attitudes or perceptions. He suggested,
however, that one indicator 'on its own may not give you the result but a range
of indicators may give you that indication'.[64]
As an example, he cited 'surveys, direct testing of consumer groups or direct
questionnaires as people come through an immigration checkpoint'.[65]
From his position as an auditor, he would be looking at how agencies are
developing these indicators over time.[66]
He also suggested that 'you learn to walk before you run' and proposed that he
would 'stick to the public diplomacy programs and try to build up a capability
in monitoring there before you run off into spin-off public diplomacy impacts'.[67]
13.51
Mr Prakash Mirchandani was of the view that public diplomacy 'which does
not result in measurable public advocacy outcomes on Australia’s behalf is work
only half done'. He suggested that if public diplomacy is successful, one
simple and measurable yardstick of this success would be the active engagement
of influential stakeholders in target countries on Australia’s behalf.[68]
He also proposed a mandatory 'public diplomacy outline (and outcome) attached
to key activities and issues'. In his view, 'this would make subsequent
evaluations much more effective, allow for better coordination of scarce
resources' and 'ultimately place considerable onus on the Heads of Mission to
take a personal and direct interest in PD, in addition to their focus on
bilateral relationships'. He stated further:
While we understand that DFAT does have such mandated activities
in place for its missions, we believe that these are of necessity constrained
by resource limitations, and could well merit a second look. We suggest a
qualitative evaluation of Whole of Government messages in target countries to
specifically measure whether the outcomes initiated by missions, have actually
changed perception about Australian policies in those countries.[69]
13.52
Mr Trevor Wilson suggested that an independent outside evaluation was
another means of gauging the success of a public diplomacy program.[70]
Jacob Townsend also thought that 'some sort of not grand but insulated unit
might be needed to enforce or monitor' the outcome of activities.[71]
Dr Alison Broinowski proposed that an international survey of comparable
countries be undertaken, 'just to see which way world's best practice goes in
our evaluation'.[72]
Mr Kirk Coningham believed that an arm of government—a different form of
machinery—was needed to establish Australia's public diplomacy objectives and
to evaluate critically the posts' public diplomacy plans.[73]
Dr Alan Hawke, Chancellor of the Australian National University and former
High Commissioner to New Zealand, cited the work of ASPI and the Lowy Institute
which, he suggested, do valuable work in measuring the degree of success of
public diplomacy efforts to improve attitudes toward Australia.[74]
Committee view
13.53
The committee acknowledges that evaluating public diplomacy is not easy.
It notes the advice from a number of witnesses that, although difficult, the
evaluation of Australia's public diplomacy programs can and should be done. The
committee agrees with this view. The committee notes the advice from ANAO that
if it were to undertake an audit of DFAT's public diplomacy programs, it would likely
concentrate on the performance indicators the department uses to evaluate the
effectiveness of its programs and how it sets targets.[75]
ANAO would be looking to see whether DFAT has the mechanisms in place to
evaluate its own programs.
Important role for ANAO
13.54
The committee can see a valuable role for the ANAO
in undertaking a performance audit of DFAT's public diplomacy evaluation
activities. Accordingly, the committee requests that the ANAO
conduct a performance audit of DFAT's public diplomacy programs.
Need for performance indicators
13.55
As previously stated, the committee has recommended that tracking
opinions in key target countries toward Australia should be an essential part
of DFAT's public diplomacy. It suggests that this type of data gathering would
also serve as an important performance indicator. It notes the advice, however,
from the ANAO that one indicator 'on its own may not give you the result but a
range of indicators may give you that indication'.[76]
13.56
The committee is of the view that the evidence before it on the
importance of measuring the effects of public diplomacy programs over time or
progress toward public diplomacy objectives is compelling. As already noted,
DFAT does not employ such indicators and, as a matter of urgency, the committee
recommends that DFAT put in place performance indicators that would allow it to
monitor and assess the effectiveness of its public diplomacy programs.
Recommendation 17
13.57
The committee recommends that as a matter of priority, DFAT put in place
specific performance indicators that would allow it to both monitor and assess
the effectiveness of its public diplomacy programs.
An independent, comprehensive review of Australia's public diplomacy
13.58
The International Public Affairs Network suggested that 'a global review
and audit of Australia's public diplomacy is required to fill information gaps,
remove inconsistencies, and assess the outcomes, if any, of DFAT's activities'.
It was of the view that the ANAO was the proper Commonwealth authority to lead
a review and audit.[77]
ANAO has suggested that it is not in fact the appropriate authority to carry
out this type of broad review but that it could conduct an audit. As mentioned
previously, ANAO would be concerned with how DFAT is developing its performance
indicators over time.[78]
Mr Meert told the committee that if he were conducting an audit he would concentrate
on the public diplomacy specific programs first. Mr White added:
If you look at the Foreign Affairs submission, they talk about
quality and quantity indicators, and the relevance of culture and media
activities. That is the sort of measure you want to get to ... That is where we
are going to: how do you measure public perceptions if you have got a program
which aims to change public perceptions?[79]
13.59
The committee notes, however, that in recent years, the governments of
the UK and Canada have commissioned comprehensive reviews of their public
diplomacy programs. In 2005, Foreign Affairs Canada engaged Universalia, a
consulting firm, to evaluate the group of programs that 'projects Canadian
values and culture'. The review was to assess the extent to which the current
set of Canadian programs contributed to the attainment of Canada's foreign
policy objectives as a whole. Universalia was also asked to review the program
mix of other allies and partners.[80]
13.60
In 2005, a review team headed by Lord Carter of Coles conducted a review
of the UK's public diplomacy. The review team examined the effectiveness of the
current public diplomacy activities in delivering outcomes that contributed to
the achievement of the UK government's objectives.
13.61
The United States Government Accountability Office has conducted
numerous comprehensive audits of various aspects of US public diplomacy.[81]
Committee view
13.62
At this stage, the committee is reluctant to recommend an independent,
comprehensive review of Australia's public diplomacy along the lines of those
conducted in Canada or the UK. It believes that this Senate inquiry has
increased the focus on Australia's public diplomacy and started a debate that
was long overdue. Indeed, DFAT has already responded positively to evidence
taken by the committee and is making changes, for example, by working through
the IDC to reach agreement on a definition of public diplomacy.[82]
13.63
If the ANAO agrees to undertake an audit, the results from this audit
will provide clearer guidance on the measures DFAT needs to have in place to be
able to determine the effectiveness of its programs.
13.64
The committee believes that having opened up the debate on Australia's
public diplomacy, it should monitor developments in this area and, allowing
time for the implementation of initiatives, review these developments. To
assist the committee in this regard, the committee makes the following
recommendation.
Recommendation 18
13.65
The committee recommends that, two years after the tabling of this
report, DFAT provide the committee with a report on developments in, and
reforms to, Australia's public diplomacy programs giving particular attention
to the role and functions of the IDC and the way DFAT evaluates the effectiveness
of its public diplomacy activities.