3. University Research Funding

Chapter three discusses university research funding. It introduces the dual funding system used to support university research and identifies ways to simplify and streamline application and assessment processes.

Dual funding system

The Australian Government supports university research through a dual funding system of competitive grants and research block grants. In 2017-18, 99 per cent of Australian Government funding for university research was provided through competitive grants and research block grants.1
Competitive grant funding is awarded to universities to undertake specific research projects. Research funding is awarded to successful applicants following a merit-based expert peer review process.
The Australian Research Council (ARC) and the National Health and Medical Research Council (NHMRC) are largely responsible for the administration of competitive grant funding in Australia. Both research councils provide competitive grant funding for a mix of individual research projects, industry linked research, collaborative centres and research fellowships.
As noted in Chapter 2, there are around 60 different competitive grant programs administered across six Commonwealth portfolios.2
Research block grants are allocated to universities to support the indirect or systemic costs of conducting research. These indirect costs essentially fund research overheads, including equipment and infrastructure, information technology facilities, and researchers’ salaries. Research block grants are allocated according to a performance-based formula which considers income received from competitive grants.
Submissions to the inquiry were largely supportive of the dual funding system, noting that the system is not fundamentally broken. Notwithstanding this support, a number of issues were identified regarding the efficiency and administration of competitive grant application and assessment processes, and research block funding arrangements.

Competitive grants

The Australian National University (ANU) described the current competitive grants landscape. In particular, the ANU highlighted fragmentation and complexity, and the impact of this on university administration, investment and planning.
The 2018 Australian Competitive Grants Register (ACGR) includes 12 active ARC schemes, 34 active NHMRC schemes, and 34 other active schemes (including Rural R&D schemes). These schemes have non-aligned guidelines, submission and assessment processes, and non-coordinated deadlines, imposing significant administrative overhead on Universities. These schemes operate alongside contracts, consulting, international funders, trusts and CRCs [Cooperative Research Centres] (categories 2-4), many of which fund related areas and projects. As a result, there are frequently schemes or funding opportunities announced ad hoc, with short notice for submission of applications (4-6 weeks), and frequently with new guidelines and applications processes. This fragmentation does not allow for sustained, long term investment and planning in research priorities by Universities.3
Universities told the Committee that it is administratively challenging to stay across the different funding requirements and processes, particularly when they are subject to regular change. Additional pressure is placed on universities when they are required to employ specialist staff to manage and oversee grant applications.
To give a sense of the scale of administration involved, the University of Sydney set out the workload and resourcing it dedicates to supporting funding from Australian Government and other grant schemes. It noted:
in 2017, 1,375 domestic and international grant applications were submitted; 70 per cent to federal funding schemes;
eight full-time staff are employed to manage and deliver research development and collaboration initiatives and support services;
six full-time equivalent Research Administration Officers spend on average 32 hours a week processing funding applications (ARC and NHMRC grants represent 30 per cent of this work for most of the year);
between January to April, five of these six staff spend 100 per cent of their time administering ARC and NHMRC grants;
16 casual staff (approximately) are employed for four weeks each year from January to February, and dedicate 35 hours a week to ARC and NHMRC funding applications;
four full-time equivalent Research Administration Officers spend 100 per cent of their time on post-award administration activities, plus a seasonal casual staff member who assists the team during peak periods;
two full-time equivalent Research Reporting and Compliance Officers spend 100 per cent of their time on the monitoring, review and submission of progress and final reports to funding agencies; and  
one full-time equivalent Major Initiative Coordinator spends on average 20 hours per week leading the administration of significant and complex funding applications, focused entirely on ARC and NHMRC, with these hours increasing from January to March.4
Many submissions to the inquiry called for competitive grant processes to be simplified, and suggested a whole-of-government approach to research funding in Australia. That is, streamlining research guidelines, timelines, processes and contract management across all Commonwealth departments and agencies. For example, in its submission, Research Australia called for a review of all research funding programs to ensure as much uniformity as possible.5


Grant application and assessment

Australian universities spend considerable time and effort applying for research funding through competitive grants. Approximately 80 per cent of these applications will be unsuccessful.
There was consistent acknowledgement that the grant application and assessment processes are the source of much inefficiency in the research funding system.
In particular, evidence to the Committee highlighted issues with the design of the application forms, the information and data requested, and the systems used to collect the information. Some of this evidence included:
application forms are too long, and time consuming to complete;
forms are not compatible with other software or programs, and there is inconsistency across form data fields, word and page limits;
the level of detail and information requested is burdensome and often unnecessary; and
there are frequent changes to the prescribed formats of grant applications as well as eligibility criteria.
At the Committee’s public hearing in Melbourne, two dense applications were shown to the Committee as an indication of the volume of information requested in a competitive grant. The Committee was surprised to see the amount of information provided in support of an application, and to hear the amount of time researchers are said to be spending applying for funding. Some of these estimates included 28 to 38 days, 3 to 4 months, and 550 years cumulatively across all proposals for a round.
Evidence regarding the volume and time associated with grant application forms and processes was compelling. The strong message to the Committee was that ‘the issue is not the writing of the actual research proposals, but rather the disproportionate amount of time and effort necessary to complete the supporting elements’.6
For example, Ms Maaike Wienk, Research and Policy Adviser at the Australian Mathematical Sciences Institute told the Committee that:
… nobody questions writing the core of the research proposal, thinking about what research is to be undertaken, how it fits with existing theory and what the method should be. I think the researchers find that a useful exercise.
It’s the other bits. An aspect of it that people resent is having to put in a detailed budget rather than just tick a box. … it is quite painful. You have to calculate salaries, you have to add a multiplier specific to your university and you have to make an assessment of travel costs three years in advance. All those sorts of exercises, when in actual fact the budget that you apply for you never receive. Those are the kinds of things.7
Similarly, Professor Frank Gannon, Fellow of the Australian Academy of Health and Medical Sciences (AAHMS), noted the complexity of the paperwork. Professor Gannon shared his experience of Australia’s grant funding system compared with other countries:
… I’ve run these in all of Europe and in Ireland. When I came to Australia, I found it’s incredibly complex here. There’s the attention to minutiae required to be addressed and, if you do not address them, you’re excluded. It’s extraordinary. It’s counterproductive. It’s a burden. There’s nothing wrong with it, but, as has been said in more than one case, there’s a difference between what’s legally right and what’s right. I think we’ve got ourselves in Australia into a quagmire of making sure that we address the paperwork rather than the problems we’re really trying to address.8
It is important to emphasise that evidence to the inquiry questioned the assumption that completing grant applications is wasted time and effort. Rather, submissions asserted that the process of applying for a competitive grant – regardless of the outcome – is a valuable one. As noted earlier, this is because it allows researchers to test ideas, to work together and innovate, and to develop good research proposals.
The issue for this inquiry to consider, however, is one of balance. As put to the Committee, the ‘benefits are not generally sufficient to justify the commitment of time, resources and effort where the overall levels of competitive funding and the success rate are too low.’9

Timing of grant applications

Currently, most deadlines for competitive grant funding are scheduled for early in a new year—February and March. The Committee heard that this timing may disadvantage some researchers, particularly women and those researchers with children who may have family commitments over the Christmas and New Year period.
Dr MaryAnne Aitken, Executive Director at La Trobe University, raised this issue at the Committee’s Melbourne public hearing:
… we are running the Athena SWAN [Scientific Women’s Academic Network] SAGE [Science in Australia Gender Equity] program for gender equity in research. We’ve had focus groups and surveys and all sorts of things. One of the most common issues that we hear about is the fact that the major grant rounds are in February and March, and if you have children on summer holidays for six weeks it ruins your summer holidays. We hear this over and over again. I think that having different timing would be good.10
Revising the timing of annual grant cycles was supported by other witnesses at the Melbourne round table. It was acknowledged, however, that this is a difficult issue that would require further consultation with stakeholders, particularly given potential clashes with other deadlines and processes across the research sector.11

Post-award management

As with grant application processes, the Committee heard that post-award management processes are unnecessarily onerous. This is particularly apparent when variations to contracts are required.
Evidence to the Committee highlighted that universities consider the level of detail required to satisfy public sector accountability requirements far outweighs the changes being made. For example, in its submission, the University of Wollongong said:
In many cases variation documentation and processes required by funding agencies are overly onerous, with no nett benefit to researcher or funder. The types of variations required for approval by the funder could be drastically reduced using a risk-based approach.12
The ANU also recommended funding agencies take a risk-based approach to ongoing project management and variations.13
It was suggested that the post-award management of funding be streamlined in the same way as recommended for the grant management system—that is, by adopting a more uniform approach. For example, Mrs Lyn McBriarty, Strategic Advisor, Office of the Senior Deputy Vice-Chancellor at the University of Newcastle, told the Committee:
I would extend the opportunities to streamline beyond the application process to the actual management of the applications and the contracts. There are so many different types of contracts, each with little nuances, that you have to be an expert in each one of them, and that’s very time consuming. There are huge opportunities to streamline, both pre- and post-award, to make it much more efficient.14

Improving the application process

Single research management system

There were persistent calls for more compatibility between the ARC and NHMRC, and to consolidate processes across the Commonwealth. Many submissions identified the need for a single whole-of-government system or portal to manage competitive grant funding.
The University of Technology Sydney described the advantage of this consolidation:
The availability of one online government platform to manage the grant lifecycle across multiple grant schemes, from application submission, post-award management to final report submission, would not only create greater efficiency for research institutions but also assist funders to streamline funding programs and access data on research investment.15
Similarly, the University of Notre Dame suggested a central system and provided further detail regarding a single research management system:
… this could involve a GrantsConnect research management system which might be utilised by ARC, NHMRC, GRDC [Grains Research and Development Corporation] and all government departments or agencies. Government agencies could also harmonise standard templates for contracts for public funding, including the wording of terms and conditions of those contracts. Furthermore, an agreed date and format for annual reporting for all government schemes would be more efficient and save money. Greater consensus and consistency in reporting to government would be welcomed within the university sector.16
The wider use of GrantConnect and other existing systems such as ORCID (Open Researcher and Contributor ID) and the ARC’s Research Management System (RMS) was identified as a viable way to streamline and consolidate fragmented processes.
GrantConnect is the Australian Government’s central online facility that publishes information about upcoming and current competitive grant opportunities, grant guidelines and grants awarded.17 It is currently used by the ARC and NHMRC.
ORCID is a global online researcher identification system that records the professional information of researchers.18 Professor Kathryn McGrath, Deputy Vice-Chancellor, Research at the University of Technology Sydney explained the value of ORCID in providing relevant, current researcher information through a single source:
The researcher is only having to provide their ORCID number and the government then has immediate access to a whole suite of information. If the government and funding agencies started to use ORCID, that would drive the researchers to provide better and better source information. There would be more assurance that we are getting a complete story for each of our researchers that have been named on funding applications. Then it is all sides of the sector: the individuals are ensuring their source data is appropriate; the universities can ensure that they utilise ORCID as well, in a whole range of different things; and the government is massively increasing the amount of information they have readily available through a single source point.19
The Committee was strongly encouraged to consider ORCID as an option for identifying and obtaining current researcher information, rather than requiring researchers to provide this information for each grant application. Evidence to the inquiry suggested that universal use of ORCID may also prevent data entry errors, the use of out-of-date publication lists, and possible compliance issues such as incorrect headings or fonts.20
Furthermore, the use of ORCID may facilitate auto-population of data through compatible platforms, and better support the movement of researchers and their data between institutions and funding agencies.21
The ARC’s RMS is a purpose-built information technology system that enables the electronic creation, submission and assessment of grant applications, and the post-award management of progress reports. The system has received ‘international attention as a best practice system’.22 It is the view of the ARC—and of other submissions to this inquiry—that such a system could be used to support ‘better research grants administration across government’.23
Universal use of online systems such as GrantConnect, ORCID and RMS for all grant funding applications and management was acknowledged as a positive step to reduce inefficiency, duplication and administrative burden on researchers and research institutions. Building automatic compliance checks into application systems was also recommended as a means to improve efficiency; the compliance checking undertaken by the ARC’s System to Evaluate Excellence of Research (SEER) was cited as an example of this process.24
The Committee sees value in using existing systems to streamline and consolidate application processes. It also sees value in accessing data and information that is already used and maintained by researchers and research institutions—that is, making use of what is already available and known to the research sector. Exploring options to link to these data systems is recommended.

Two-stage process

Two-stage processes were identified as a potential way to reduce the time and effort associated with completing and assessing lengthy grant applications. They also allow applications to be assessed on the quality and strength of the research idea in the first stage, before other factors are considered in stage two. This approach was strongly supported in evidence received by the Committee.
While there are variations on the two-stage process, generally the first stage calls for an Expression of Interest (EOI) comprising a short research proposal and possibly other brief supporting information. The EOI is designed to be a truncated application, with submitters suggesting anywhere from one to 20 pages in length.
The second stage is only open by invitation to those with a competitive proposal assessed at the first stage. At this second stage, a full application is provided.
Two-stage processes are already used by some funding programs in Australia, including the Cooperative Research Centres, ARC Centres of Excellence and the Australian Renewable Energy Agency. They are also used in some international schemes, including by the European Commission, the Leverhulme Trust in the United Kingdom (UK), and, in New Zealand, the Health Research Council and the Marsden Fund.25
All were considered good examples of two-stage processes that could be adopted more widely.

Other application processes

Some other suggestions identified for improving the application and funding processes included:
A rolling deadline or continuous application process that allows grant funding applications to be submitted and assessed at any time. Greater flexibility around when applications can be submitted ‘may increase success rates for researchers who can contribute more effective time, data, and coherency in their proposals’.26 It may also benefit researchers who are away on field work when application deadlines fall.27
A single application process whereby fundable but not funded applications stay in the mix for subsequent funding rounds. This process acknowledges the ‘near miss’ grants that are highly rated and fundable yet miss out because the ‘funding pool can only extend “so far” down the list of fundable projects’.28
The introduction of ‘revise and resubmit’ categories for high quality applications that are considered close but ‘not ranked quite highly enough’ to be funded in a particular round. These ‘revise and resubmit’ applications are then considered in the next round of funding.29
The Committee notes that some of these processes are already in place across some competitive grant schemes.

Flexible funding

The Australian Academy of the Humanities (AAH) noted that there are efficiencies to be gained from taking a more flexible approach to the way in which grant programs allocate their funding—in particular, by differentiating between low and high cost research, rather than favouring the latter. The AAH argued that there is a strong case for allowing flexibility in funding thresholds to cater for different funding requirements across disciplines.30
Similarly, the Australian Association of Philosophy (AAP) observed that ‘it makes sense to have grant bodies that are specific to very different families of disciplines’.31 In its submission, the AAP noted that ‘Many excellent projects in the Humanities simply do not cost enough to qualify for ARC funding and therefore receive no support at all’.32
The AAP argued that better outcomes in the humanities and social sciences (HASS) might flow from competitive grant schemes ‘funding more projects, each of which costs less than the current minimum thresholds for Discovery and Linkage grants’.33 The AAP highlighted a two-stream funding model used by the Social Sciences and Humanities Research Council of Canada, which offers funding for both low and high cost research projects.
Dr Nicholas White illustrated the same point in his submission, noting that in the field of chemistry ‘Discovery Project applications that request less than $100,000 a year are not taken seriously by reviewers and so are almost never funded.’34
Dr White suggested two tiers of funding within the same scheme to accommodate researchers who may not need a large amount of money. Smaller grants could:
… be requested with a short application and high success rate, while more ambitious programmes that want larger sums of money… need a more detailed application and have a lower success rate.35
As an alternative, Dr White suggests that university block funding could be altered to ensure that researchers are directly funded to pay for basic equipment and consumables.36
The Committee appreciates the needs of researchers will differ across disciplines, and acknowledges that different levels of funding are required to support these needs. In supporting flexibility within the research funding system, the Committee recommends that consideration be given to funding smaller scale projects.

Committee Comment

The Committee supports the diversity of research funding opportunities inherent in the current system. It agrees that such arrangements are necessary to support Australia’s broad research interests, and to provide a flexible system that is responsive to new and emerging research priorities.
Evidence to the inquiry highlighted that each of the 119 funding initiatives across the research sector has its own processes and arrangements. There is little doubt that this landscape is administratively challenging, particularly for universities, and comes at significant cost in time, money and frustration.
The Committee supports the need to better coordinate research funding efforts, especially the administrative processes associated with applying for competitive grants. It recommends reform of three key areas – management of research grants, the application and assessment process, and the information and documentation to support an application. It is envisaged that such reforms will reduce the time spent by researchers preparing voluminous applications by making use of information that is already available, restricting information to that which will inform a decision, and keeping relevant forms, guidelines and contracts as uniform as possible.

Recommendation 1

To reduce the administrative burden on research agencies and streamline grant funding processes, comprehensive reform is needed across three key areas—research management, application and assessment, and grant funding documentation. In particular, the Committee recommends:
Single online research management system
the introduction of a single whole-of-government online system for all Commonwealth grant applications and post-award management; and
that, where possible, this online system link to existing systems and databases, such as ORCID (Open Researcher and Contributor ID), to prepopulate necessary data fields.
Two-stage application process
a two-stage grant application process be introduced across Commonwealth funding schemes to reduce the amount of information provided in support of applications, comprising:
Stage one: a competitive component including an expression of interest designed to give weight to the research proposal, and prepopulated information regarding track record and research capacity; and
Stage two: a non-competitive component for successful stage one applicants to provide a full proposal and budget.
Grant funding documentation
that only information and documentation that will inform a decision be provided across grant funding schemes; and
the standardisation of grant funding documentation, including application forms, guidelines and contracts, across the Commonwealth.

Recommendation 2

The Committee recommends that the Australian Government introduce a risk-based approach to post-award variations that would allow universities to make minor or simple variations without seeking approval from funding agencies.

Recommendation 3

The Committee recommends that grant funding be available to support smaller scale research projects across disciplines.

Peer review

Peer review was overwhelmingly supported in evidence to the Committee. In particular, peer review was described as the ‘bedrock of all academic research’37, a ‘hallmark and driver of high quality’38, and as ensuring ‘merit and value’.39
In its submission to the inquiry, the ARC noted that peer review represents international best practice, and highlighted the commitment of the international research community to this process. The ARC also noted that peer review is fundamental to the agency, its risk management and to the accountability of publicly funded research.40
Notwithstanding the support for peer review, submissions did identify some shortcomings with this process, including:
it is a burden on universities and peer reviewers;
there is inconsistency among peer reviewers and a lack of institutional memory from year to year;
it is open to cronyism, conflicts of interest, and biases related to gender, age, career stage, and discipline of researcher; and
there is limited or unhelpful feedback on research proposals.
The role of peer reviewers and feedback provided on grant applications is discussed further below.

Peer reviewers

The current peer review system was described as ‘labour intensive’41, ‘untenably strained’42 and undertaken to ‘different standards’43. In its written submission, Macquarie University noted:
While peer review is the gold standard for evaluating the merit of research, it is very time and effort intensive and causes enormous workloads for researchers around the globe. It is often difficult for funding agencies to source enough leading researchers to meet the need for peer review, which causes delays and overburdens those researchers who contribute reliably to the peer review ecosystem.44
The Committee heard that more needs to be done to support the peer review system. This is to ensure that peer reviewers are equipped for the task, and to reduce the likelihood of other factors influencing merit-based decisions. For example, Professor James McCluskey, Deputy Vice-Chancellor, Research at the University of Melbourne said:
Could I be unequivocal and say that, at all cost, we must preserve the peer review system. We can improve it. We could do more training of our grant assessors. We could coach them in what we consider to be important, how to make balanced and objective judgements, and have training around unconscious bias. I think these are terribly important and non-trivial. We don’t do a particularly good job of that at the moment. Where the peer review system has fallen into disrespect, it’s often because the overload has led to sourcing of assessors who are not fit to assess. They’re either too junior or they’re out of their disciplines or, as happens occasionally, they’re from somewhere outside of Australia and they don’t understand our system.45
The University of Notre Dame also noted a lack of diversity on assessment panels which can arise from cost and time challenges that prevent participation, particularly of Western Australian researchers. This lack of diversity was considered to lead to lost opportunities for innovation in assessing applications, and difficulties in assessing interdisciplinary research projects. In response, the University of Notre Dame called for new mechanisms to ensure greater diversity and inclusion within the assessment processes, such as rotation systems, video conferences, and better incentives and financial support for travel.46


Feedback

Feedback on competitive grant applications is of enormous benefit to both individuals and institutions.47 It not only allows individual researchers to improve and further develop their research ideas and proposals, but also allows universities to make better judgements about whether to submit grant applications in the first place.
This point was made by Murdoch University in its submission to the inquiry, which noted:
If universities are to submit competitive grant proposals for consideration, then it is critical that they receive detailed feedback on the assessment of applications to be able to refine internal processes.
The administrative load for granting agencies is reduced when the number of applications is reduced. It is in their interests, therefore, to improve feedback that will allow universities to more accurately assess the likelihood of success of internally submitted proposals.48
Professor Adrian Manning asserts that feedback is one of the best and simplest ways to improve the quality of research applications, and also a way to ensure the best ideas are funded.49 Yet many submissions to the inquiry noted a lack of transparency in the assessment process and a lack of consistent and constructive feedback overall.
For example, in their submission to the inquiry, Dr Mathew Lewsey and his colleagues described a lack of transparency with the ARC assessment process. In particular, they noted:
… a great lack of transparency around how individual applications are scored. The specific comments from the General Assessors performing the first review are given to the applicants, who are then invited to respond through a rejoinder, but the associated scores are never disclosed. After the funding announcement (success/failure), the final scores of the application are coarsely given in terms of broad percentile bands… but no comments or feedback are provided from the final assessment or the discussion at the College of Experts.50
Dr Lewsey suggests that this lack of feedback prevents applicants from making informed decisions about the next steps for their research proposals, and creates the impression that the system is unfair. His submission is one of several that call for accurate scores and feedback to be provided to all grant applicants, citing the European Union Framework Programme as a good example of this process.51
It was suggested that improving the training of peer reviewers may also assist in providing improved feedback for applicants and support the process of internal peer review.52

Lottery system

In their joint submission to the inquiry, Professors Adrian Barnett and Philip Clarke challenged the merits of the peer review process and introduced the idea of a lottery system to fund research, based on the following principles:
Researchers prepare short applications that outline how their proposed research addresses the goals of the funding scheme, potentially including national research priorities.
Applications are screened for eligibility and to check that the researchers are qualified and have some basic experience.
The available funding is allocated randomly amongst the remaining applications. Applications could be stratified to award more funding to help fix current issues of under-representation, such as more funding for women and younger researchers.53
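The three principles above amount to a simple procedure: screen for eligibility, optionally reserve places for under-represented groups, then draw the remaining grants at random. A minimal sketch of that procedure is below; the function name, data fields and quota structure are illustrative assumptions, not part of the submission.

```python
import random

def lottery_allocation(applications, n_grants, strata_quotas=None, seed=0):
    """Sketch of the stratified funding lottery described in the submission.

    applications: list of dicts with 'id', 'eligible' (bool) and 'stratum' keys.
    strata_quotas: optional dict mapping a stratum to a number of reserved
    grants, e.g. {'early_career': 2}, to address under-representation.
    All names and parameters here are illustrative, not from the source.
    """
    # Steps 1-2: screen out applications that fail the eligibility check.
    pool = [a for a in applications if a['eligible']]
    rng = random.Random(seed)  # fixed seed only so this sketch is reproducible
    winners = []

    # Optional stratification: draw reserved grants from each stratum first.
    for stratum, quota in (strata_quotas or {}).items():
        subpool = [a for a in pool if a['stratum'] == stratum]
        picks = rng.sample(subpool, min(quota, len(subpool)))
        winners.extend(picks)
        pool = [a for a in pool if a not in picks]

    # Step 3: allocate the remaining grants uniformly at random.
    remaining = n_grants - len(winners)
    winners.extend(rng.sample(pool, min(remaining, len(pool))))
    return [a['id'] for a in winners]
```

Note that the design choice of drawing stratum quotas before the general draw is one way to implement the authors' suggestion of awarding "more funding" to under-represented groups; guaranteeing a minimum rather than an exclusive share.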
The authors note that, while a ‘seemingly radical idea’, lotteries are supported by scientific evidence.54 This evidence includes research which found little association between peer review scores and subsequent success—for example, in published papers and patents.
At the Committee’s public hearing in Brisbane, Professor Barnett told the Committee:
I think a lot of what we are doing here when judging grants is not necessarily peer review, it is peer preview; we are trying to predict which grants are going to give the best return. In the national institutes of health in the US, they looked at their data, where they looked at all the projects that were funded—from the ones that everybody thought were amazing to the ones that got it just above the funding line, and all the ones in between—and there was very little correlation between those scores and the impacts of those grants. And the conclusion was, science is wonderfully hard to predict, and it’s very hard to predict where you’re going to get the biggest returns.55
Professors Barnett and Clarke assert that any policy change should be based on scientific evidence. They highlight the element of randomness already present in the current research funding system, particularly when deciding between highly competitive grant applications. They also claim that the introduction of a lottery system would reduce the time and effort spent applying for research grants, prevent some of the perceived biases in the current system, and increase diversity in research.56
While the Committee understands that the idea of a lottery might have some appeal, it supports the principles of open competition and expert peer review within the grant funding system. It also acknowledges that peer review is strongly supported by the research sector as best practice and is the best process for determining the allocation of grant funding.

Recommendation 4

The Committee recommends that the peer review system be maintained to support competitive grant funding in Australia.
The Committee recommends that the peer review process be strengthened by:
providing detailed information and constructive feedback to unsuccessful applicants; and
ensuring that the training of peer reviewers is of the highest standard.

Research block grants

The second component of the dual funding system is research block grants. The most consistent issue in written submissions and evidence at public hearings regarding block grant funding was that it does not cover the indirect costs of conducting government funded research. As stated by Universities Australia:
For many years there has been a significant gap between the full cost of research and the block grants provided by Government. Each new grant scheme that does not support the indirect costs of research makes this issue worse – to the point that in some cases, institutions may not be able to adequately support their researchers to undertake competitive grant research. On average, each dollar of competitive grants requires supporting, flexible expenditure of 85 cents.57
Monash University provided a specific example of this shortfall, noting the amount of block funding it received to support competitive grants:
In 2018, each $100 of Category 1 funding acquired by Monash University was accompanied by $23.58 in block funding, which is less than half the amount required to cover the costs of implementation of this research.58
Similarly, in 2011, the University of Melbourne estimated that it spent $1.71 for every dollar of competitive research income.59 It notes that block grant funding has remained static at around 22 cents per dollar of competitive research income. This is considered to be substantially lower than for universities in the UK and the United States (US), where indirect costs are met at 80 per cent and 50-60 per cent respectively for competitive grants.60
Two key issues with the current research block funding arrangements were identified. First, the arrangements create a perverse system that effectively costs successful universities when they win competitive grants. Second, they require universities to increasingly cover these indirect costs through other university revenue, such as general university funds and international student fees.
In its submission to the inquiry, the Association of Australian Medical Research Institutes (AAMRI) highlights the first issue that arises when indirect funding does not cover research costs:
Inadequate financial support for systemic costs of research creates a perverse system which effectively punishes excellence as the more successful a research organisation is at receiving competitive peer reviewed grants, the greater the financial gap they must bridge each year.61
A similar view was expressed by QIMR Berghofer Medical Research Institute when it highlighted a $10 million gap between grant funding and the ‘true cost’ of research. It noted that ‘the more successful a research institute is in terms of winning competitive funding, the greater the funding deficit that the institute must make up from other sources’.62
The second issue, the trend of universities covering these indirect costs through other means, was canvassed by the National Tertiary Education Union (NTEU). It highlighted the vulnerability of university research to this type of funding:
The problem of relying on general university funds to pay for over half of our universities research efforts however, is that research is not only affected by policy or funding changes directly targeted at university research… but is also affected by policy changes related to Commonwealth Supported Places as well as other policy changes such as changes to visa requirements that might directly impact on international student enrolments.63
Ms Victoria Hocking, Executive Manager, Research at Macquarie University drew this latter point out further, noting that the more universities dip into revenue from international students to cover the indirect costs of research, the less they can invest in the international student experience and in attracting students to Australia, which ultimately affects overall university revenue.64
Not surprisingly, there were consistent calls from universities and research institutions to increase block grants to cover the full cost of research.
Others, however, called for more systemic changes. For example, Swinburne University of Technology suggested that the current approach to dual funding be reviewed to bring it into line with other economies such as the UK and the US.65 The Group of Eight made a similar recommendation, calling for ‘serious consideration’ of a full economic cost model as operated in the UK and US, under which a percentage of the indirect and direct costs of research are delivered together.66

Recommendation 5

The Committee recommends that the administration of research block grants be reviewed to provide more timely and adequate support for the indirect costs of research.

Funding inconsistency

Some submissions to the inquiry called for greater parity under the dual funding system between universities and other research institutions. For example, it was noted that for competitive grant and block grant funding purposes, medical research institutes (MRIs) are treated differently to universities and are prevented from applying. This was considered by some to be arbitrary, given that the same work is treated in two different ways based on the location of the researcher.67
This point was explained by Professor Tony Cunningham, President of the AAMRI, when discussing interdisciplinary research within medical research institutes:
Many of our engineers and biostatisticians would normally be able to apply to ARC if they were situated on a university campus. If they’re situated in a medical research institute, they can’t do that. So I think there’s a level of inequity there. It’s important in that it influences Australia’s competitiveness in this area. We want the best people to get our research grants and put out the best research, and there shouldn’t be a restriction on the site where people are. We’ve just had an agreement on this with the cooperative research centre funding. That is now open to medical research institutes as it is to CSIRO and universities.68
In its written submission, the University of Melbourne sets out the inconsistencies in block grant funding between universities and other research providers.69 Professor James McCluskey, Vice-Chancellor, Research at the University of Melbourne told the Committee that these inconsistencies may lead to competitive tensions, and called for a simpler set of arrangements for block funding:
What this does is to fracture the research community because they receive their funding support in different ways, which creates competitive tensions whenever there’s a funding shortage or a shift in the policy settings and leads to language around an uneven playing field—‘Why are you different?’ So it would make a lot of sense if we could have a single way of providing indirect costs to research providers.70
Alphacrucis College emphasised the potential for greater competition when calling for ARC grants and other government funding opportunities to be opened to private providers. In particular, the College argued:
Extending eligibility to apply for ARC grants to all institutions which have the capacity to support research would be cost neutral for the government as the total pool of funds would not change, only the allocation between institutions. Besides equity and improving the efficiency of the system through greater competition, the research carried out at private providers tends to be user-focused and more in line with national research priorities.71
At the Committee’s Sydney public hearing, Professor Paul Oslington, Dean of Business at Alphacrucis College, asserted that the competitive grants system should include a threshold of administrative capacity, with free competition in the relevant schemes for those who have passed that threshold. Specifically, Professor Oslington said:
I think it should be that any institution that has a demonstrable research capacity, a capacity to administer a grant, a prima facie active research program or an administrative capacity supported grant should be allowed to apply to the relevant scheme. Then, once you’re over that threshold, it should be free competition: may the best project and researcher win.72
The different arrangements applied to universities and research agencies were said not only to restrict competition but also to hamper industry collaboration. This is particularly apparent when other research institutions or industry bodies are required to partner with universities to be eligible for funding, which may lead to additional administrative burden as well as unnecessary partnerships.
For example, the AAMRI and the Australian Nuclear Science and Technology Organisation (ANSTO) highlighted the inefficiency that arises when a university partner is required to be eligible for an ARC Linkage Grant.73 In its written submission, ANSTO said:
At times, ANSTO’s ability to contribute to this collaborative research effort is hindered by its ineligibility to apply directly to the Australian Research Council (ARC) Linkage Grant Scheme.
… In cases where the industry participant requires access to ANSTO’s landmark and national research infrastructure to complete the project, the university partner often acts as an essentially superfluous intermediary.74
ANSTO suggests that amending the eligibility criteria would ‘make it easier for industry to access landmark and national research infrastructure managed by PFRAs [publicly funded research agencies], diversifying the user base and maximising the benefit derived.’75 In addition, ANSTO asserts that such an amendment would enable PFRAs to lead more collaborative industry-focused research and increase opportunities to strengthen cohesion across the research sector.76
In Brisbane, Professor Frank Gannon from the AAHMS raised the risk of gaming that arises when MRIs are required to partner with a university to access ARC funding:
Medical research institutes know how to play the game. If you appoint somebody with 20 per cent of a university position, they’re eligible. They’re not going to go to the university and they’re not going to do anything with the university, but they put their money through that and then they’re eligible. I think that’s not a great idea.77
Professor Gannon suggested that by changing the eligibility criteria to include MRIs there would be better applicants and better competition.78


Similar views were expressed by those in the TAFE sector, who noted two ways in which TAFE research is not supported. The first is exclusion from existing research funds, and the second is an absence of programs explicitly designed and targeted towards research in TAFE.79
The Committee was told that TAFEs are not considered part of the research sector in Australia. This is because TAFEs do not fall under the definition of a publicly funded research organisation. They may only participate in research if they partner with a university. It was thought that this was a missed opportunity, particularly given the strong links between TAFE and industry and its capacity to undertake applied research.80
As noted earlier, the Committee supports the principles of open competition and excellence in research. It also supports simplified funding arrangements.

Recommendation 6

The Committee recommends equitable access to and open competition for all research providers, including TAFE, regardless of their links to or partnerships with universities, to ensure that research is assessed on merit.

1. Department of Industry, Innovation and Science, Submission 42, p. 5.
2. Department of Education and Training, 2018 Australian Competitive Grants Register, <www.education.gov.au/news/2018-australian-competitive-grants-register-acgr-now-available> accessed 8 October 2018.
3. Australian National University, Submission 31, p. 1.
4. University of Sydney, Submission 87, p. 6.
5. Research Australia, Submission 53, p. 8.
6. Australian Mathematical Sciences Institute, Submission 94, p. 1.
7. Ms Maaike Wienk, Committee Hansard, Melbourne, 6 August 2018, pp. 23-24.
8. Professor Frank Gannon, Committee Hansard, Brisbane, 30 July 2018, p. 20.
9. Queensland University of Technology, Submission 34, p. 3.
10. Dr MaryAnne Aitken, Committee Hansard, Melbourne, 6 August 2018, p. 16.
11. See Committee Hansard, Melbourne, 6 August 2018, pp. 16-18.
12. University of Wollongong, Submission 50, pp. 6-7.
13. Australian National University, Submission 31, p. 3.
14. Mrs Lyn McBriarty, Committee Hansard, Sydney, 7 August 2018, p. 2.
15. University of Technology Sydney, Submission 85, p. 4.
16. University of Notre Dame, Submission 13, p. 4.
17. See Department of Finance, GrantConnect, <www.finance.gov.au/resource-management/grants/grantconnect/> accessed 8 October 2018.
18. See ORCID, <https://orcid.org/> accessed 8 October 2018.
19. Professor Kathryn McGrath, Committee Hansard, Sydney, 7 August 2018, pp. 7-8.
20. Regional Universities Network, Submission 16, p. 2.
21. University of Technology Sydney, Submission 85, p. 5.
22. Australian Research Council, Submission 46, p. 9.
23. Australian Research Council, Submission 46, p. 9.
24. University of New South Wales, Submission 62, p. 3.
25. Deans of Arts, Social Sciences and Humanities, Submission 21, pp. [3-4].
26. Ecological Society of Australia, Submission 66, p. 3.
27. Dr Bek Christensen, Committee Hansard, Brisbane, 30 July 2018, p. 27.
28. University of Wollongong, Submission 50, p. 4.
29. University of New South Wales, Submission 62, p. 2.
30. Australian Academy of the Humanities, Submission 60, p. [3].
31. Australasian Association of Philosophy, Submission 58, p. [5].
32. Australasian Association of Philosophy, Submission 58, p. [4].
33. Australasian Association of Philosophy, Submission 58, p. [4].
34. Dr Nicholas White, Submission 3, p. 1.
35. Dr Nicholas White, Submission 3, p. 1.
36. Dr Nicholas White, Submission 3, p. 1.
37. Australian Research Council, Submission 46, p. 10.
38. Charles Sturt University, Submission 32, p. 3.
39. Deakin University, Submission 10, p. 2.
40. Australian Research Council, Submission 46, p. 10.
41. Innovative Research Universities, Submission 90, p. 6.
42. The University of Western Australia, Submission 9, p. [2].
43. Ecological Society of Australia, Submission 66, p. 4.
44. Macquarie University, Submission 59, p. 3.
45. Professor James McCluskey, Committee Hansard, Melbourne, 6 August 2018, p. 14.
46. University of Notre Dame, Submission 13, p. 3.
47. Professor Kathryn McGrath, Committee Hansard, Sydney, 7 August 2018, pp. 16-17.
48. Murdoch University, Submission 28, p. [3].
49. Professor Adrian Manning, Submission 97, p. 1.
50. Dr Mathew Lewsey, Submission 64, p. [2].
51. Dr Mathew Lewsey, Submission 64, p. [3].
52. Regional Universities Network, Submission 16, p. [3].
53. Professors Adrian Barnett and Philip Clarke, Submission 33, p. 2.
54. Professors Adrian Barnett and Philip Clarke, Submission 33, p. 2.
55. Professor Adrian Barnett, Committee Hansard, Brisbane, 30 July 2018, p. 5.
56. Professors Adrian Barnett and Philip Clarke, Submission 33, pp. 1-3 and Submission 33.1, pp. 1-2.
57. Universities Australia, Submission 27, p. 11.
58. Monash University – School of Biological Sciences, Submission 7, p. 2.
59. University of Melbourne, Submission 81, p. 10.
60. Swinburne University of Technology, Submission 75, p. 4.
61. Association of Australian Medical Research Institutes, Submission 36, p. 3.
62. QIMR Berghofer, Submission 22, p. 1.
63. National Tertiary Education Union, Submission 93, p. 8.
64. Ms Victoria Hocking, Committee Hansard, Sydney, 7 August 2018, p. 20.
65. Swinburne University of Technology, Submission 75, p. 4.
66. The Group of Eight, Submission 91, p. [6].
67. Association of Australian Medical Research Institutes, Submission 36, p. 3.
68. Professor Tony Cunningham, Committee Hansard, Sydney, 7 August 2018, p. 22.
69. University of Melbourne, Submission 81, pp. 10-11.
70. Professor James McCluskey, Committee Hansard, Melbourne, 6 August 2018, p. 18.
71. Alphacrucis College, Submission 6, p. [3].
72. Professor Paul Oslington, Committee Hansard, Sydney, 7 August 2018, p. 25.
73. Association of Australian Medical Research Institutes, Submission 36, pp. 8-10.
74. Australian Nuclear Science and Technology Organisation, Submission 54, p. 2.
75. Australian Nuclear Science and Technology Organisation, Submission 54, p. 2.
76. Australian Nuclear Science and Technology Organisation, Submission 54, p. 2.
77. Professor Frank Gannon, Committee Hansard, Brisbane, 30 July 2018, pp. 22-23.
78. Professor Frank Gannon, Committee Hansard, Brisbane, 30 July 2018, p. 22.
79. Victorian TAFE Association, Submission 40, p. 5.
80. Ms Francesca Beddie and Ms Linda Simon, Submission 5, pp. [1-2].
