This chapter sets out the Committee’s findings in relation to the Department of Home Affairs’ implementation of recommendations from Australian National Audit Office (ANAO) Report No. 43 (2017‑18), Domestic Passenger Screening – Follow‑Up. It comprises the following sections:
Committee conclusions and recommendations
Review of evidence
Committee conclusions and recommendations
The Department of Home Affairs (the department) undertakes domestic passenger screening to prevent prohibited items such as weapons and explosives from being carried onto aircraft. The detection and control of prohibited items requires the use of specialised equipment and screening personnel at 62 security controlled airports across Australia. Passenger screening needs to be conducted efficiently and must meet legislated and regulatory requirements.
The ANAO tabled Auditor‑General Report No. 5 (2016‑17), Passenger Security Screening at Domestic Airports, on 31 August 2016. This performance audit made five recommendations aimed at improving the regulatory performance of the department (then the Department of Infrastructure and Regional Development). The audit found that the department:
…was unable to provide assurance that passenger screening was effective, or to what extent screening authorities had complied with the Regulations, due to poor data and inadequate records. The ANAO also found that the Department did not have meaningful passenger screening performance targets or enforcement strategies and did not direct resources to areas with a higher risk of non‑compliance.
The ANAO undertook a follow‑up performance audit in 2017‑18 due to the significance of the findings and recommendations of the previous audit, and the response from the department ‘advising that a number of initiatives to address the shortcomings identified were already underway’. The follow‑up audit was tabled on 29 May 2018 and focused on the extent to which the Department of Home Affairs (into which the Office of Transport Security was moved on 20 December 2017) had implemented the five recommendations from the previous report. The follow‑up audit examined the department’s compliance monitoring program, learning and development framework, and performance monitoring and reporting arrangements.
The follow‑up audit found that as at March 2018, the department had implemented one and partially implemented four of the five recommendations from the previous audit. In its response to the follow‑up audit findings, the department noted that it had made progress despite the audit being initiated less than 12 months after the previous audit and the department responding to a significant aviation security event in July 2017.
The follow‑up audit made three recommendations and concluded that although the department had made progress, it was not yet ‘well placed to provide assurance that passenger screening is effective and that screening authorities comply with the Regulations’. The department agreed with all three recommendations in the follow‑up audit, but disagreed with the ANAO’s conclusion:
I do not agree with comments made in the report about the Department’s inability to provide assurance that screening authorities comply with the regulations. As acknowledged in the report, the Department measures the compliance of screening authorities with the regulations on a regular basis and these activities assure us that the authorities are compliant. Where they are not, further compliance action is taken.
At the public hearing, the department further explained that it had a comprehensive compliance regime and that the ‘issues identified by the ANAO are but one narrow slice of performance measures that could be looked at’. The department stated that the ‘way that the audit focused that finding was very narrow on the type of measures they were looking at. It doesn’t take into account the full spectrum of our compliance program’.
The ANAO responded that the follow‑up audit focused on the department’s progress in implementing the five recommendations from the previous audit. As four of the recommendations were not implemented, the ANAO could not conclude that the department was ‘now well placed to be able to provide that evidence or information about their performance’.
The Committee agrees that follow‑up audits are important for holding entities to account and examining whether entities respond appropriately to audit findings and agreed recommendations. The Committee notes that preventing prohibited weapons and explosives from being carried onto aircraft is a matter of public interest and safety, and is concerned that the ANAO’s follow‑up audit concluded that the department is still ‘not yet well placed to provide assurance that passenger screening is effective and that screening authorities comply with the Regulations’.
The Committee recommends that the Australian National Audit Office consider conducting a follow‑up audit in 2019‑20, focusing on the department’s implementation of the eight agreed ANAO recommendations and considering the department’s entire compliance program.
The Committee is concerned that both the previous audit and the follow‑up audit identified issues regarding the quality of data used by the department. The department has acknowledged that ‘the ease of use of the system is not perfect’ and that in response to challenges with data, it has developed ‘arrangements to effectively work around some of those challenges’.
The Committee notes that the department has commenced producing two main analytical products to inform compliance activities and will undertake an upgrade of the Regulatory Management System (RMS) in the first quarter of 2019. The follow‑up audit outlined that the department has made improvements to the quality and reliability of data in the RMS; however, issues remained, including inconsistent and incomplete data and constraints on data extraction and reporting. The department obtained approval to develop a data product suite in November 2017 to address issues regarding analysis and reporting across systems; however, the ANAO reported that as at March 2018, the department had not developed an associated plan.
The Committee considers that the use of baseline data – data which measures conditions before a project or initiative is commenced for the purposes of later comparisons – is important in measuring the outcomes of projects and initiatives. Following the findings of both audit reports, the Committee expects that the department will prioritise improvements to data quality, and measure and report on the improvements.
The Committee recommends that the Department of Home Affairs:
gather an appropriate baseline data set prior to the implementation of the Regulatory Management System upgrade in the first quarter of 2019;
monitor and review the outcomes of the upgrade against the baseline data set, particularly whether data quality is improving; and
either prioritise an additional upgrade to the Regulatory Management System in order to establish an efficient and effective reporting function, or prioritise the development of the data product suite (as was approved on 10 November 2017) in order to address the limitations to data quality and performance reporting identified in the ANAO report.
The Committee is concerned that the department has not provided detailed information regarding milestones and timeframes for the implementation of the compliance and enforcement framework. Further, the department has stated that the three‑year timeframe may be impacted by 12 to 24 months if the department decides to proceed with the development of a centralised case management system. Considering the importance of compliance and enforcement in providing assurance that passenger screening is effective, the Committee recommends the department provide more information regarding the implementation of the compliance and enforcement framework.
The Committee recommends that the Department of Home Affairs report back to the Committee on progress in the implementation of the compliance and enforcement framework in detail, including objectives, timeframes, milestones, lines of responsibility, performance measures, evaluation and reporting arrangements, and further information on the centralised case management system.
The Committee notes that the department has commenced delivery of a learning and development program which involves staff undertaking qualifications in a Certificate IV in Government and a Diploma of Government. The department has outlined that staff will undertake training for core capabilities, specialist capabilities, and system test training. However, the Committee is concerned that the information provided by the department does not wholly address whether staff will receive training in the use of the Regulatory Management System to improve data quality, as was the original intention of Recommendation No. 4 of ANAO Report No. 5 (2016‑17).
The Committee recommends that the Department of Home Affairs report back to the Committee on training undertaken by staff in relation to the Regulatory Management System, and whether improvements to data quality, reliability and accuracy are being measured and achieved.
The Committee notes that the department commenced delivering training as part of the learning and development program in February 2018, that all staff will be qualified by the start of 2019, and that all staff will undertake an annual re‑accreditation process. At the public hearing, the department stated that it was satisfied that the available baseline data will enable monitoring and evaluation of the effectiveness of the training.
The Auditor‑General explained that the previous audit identified that the department had not monitored and collected data on staff attendance at training. Further, the Auditor‑General had previously suggested that the department use the training needs analysis completed in 2017 as baseline data for the purposes of monitoring and evaluating the learning and development framework.
The Committee is concerned that while the department has established and made available data points for the monitoring and evaluation strategy, the data by itself is not enough to address concerns or identify emerging risks without adequate monitoring and evaluation activities.
The Committee considers that the department has the opportunity to strengthen the available baseline data for the purposes of measuring the effectiveness of the learning and development framework.
The Committee recommends that in relation to the learning and development framework, the Department of Home Affairs:
establish a formal monitoring mechanism to provide assurance that all ongoing and new operational staff have undertaken the required qualifications and re‑accreditation; and
consider using the results of the training needs analysis from February 2017 as baseline data in the monitoring and evaluation of the learning and development framework.
The Committee notes the long‑standing delays in the department’s implementation of performance measures for passenger screening. The Committee also notes that the ANAO reported that full implementation was expected by February 2019; however, the department later advised that there was no expected implementation date. The Committee is concerned that there is no longer a set implementation date and agrees that, in order to achieve the expected benefits, implementation of accepted recommendations should be timely and in line with the intended outcomes.
The Committee recommends that the Department of Home Affairs report back to the Committee on the implementation of performance measures for passenger screening, including:
the outcomes of the trial implementation period of performance measures with industry participants; and
the department’s target date for full implementation of performance measures.
Review of evidence
This section reviews the evidence received by the Committee regarding the department’s compliance monitoring, learning and development, and performance monitoring and reporting.
Data analysis function
The previous audit found that the department ‘had limited information about the compliance history of individual industry participants and lacked reliable data and analysis to identify systemic issues, or compare performance across industry participants’. The ANAO recommended that the department establish a data analysis function to capture accurate and reliable compliance data in order to identify trends in non‑compliance.
The department consequently established a data analysis function that delivered a single analysis of compliance information in July 2017. The analysis provided a ‘breakdown of compliance activities conducted for each category of airport and aircraft operator, the number of findings by type for each state, and the number of findings per security mitigation category’.
The ANAO found that the July 2017 analysis ‘did not identify trends in non‑compliance or identify industry participants that required a higher level of support to comply with the regulated requirements’. The follow‑up audit recommended that in implementing Recommendation No.2 from the previous audit:
The Department should ensure that its approach delivers a meaningful analysis of passenger screening compliance activities and outcomes. The analysis should: be capable of accurately identifying non‑compliance trends; generate results that are used to inform the development of the risk and compliance prioritisation ratings; and be incorporated into subsequent compliance monitoring programs.
The ANAO also noted concerns regarding the quality of compliance data. The ANAO examined the compliance data extracted from the Regulatory Management System (RMS) and found that inconsistent and incomplete data was still being entered into the system. Further, the retrieval of data from the RMS for the purposes of analysis and reporting was limited by the use of free text fields, ‘which do not facilitate easy retrieval for reporting and analysis’.
On 10 November 2017, the department obtained approval to develop a data product suite in order to address issues within the current systems, including:
the number of disparate business systems used to store data;
the quality and continuity of data sets;
limited data extraction and reporting capabilities; and
different data structures and definitions used across the various business systems in use.
As at March 2018, the department had not yet progressed with a plan outlining how to develop the data product suite, improve the systems’ ability to provide meaningful analysis of compliance activity and outcomes, and enable comprehensive reporting.
At the public hearing, the department noted that the RMS remains in use; however, the department is planning an upgrade of the RMS in the first quarter of 2019. The upgrade will focus on simplifying data entry and improving data quality by completely revising the user interface.
National Compliance Plan
The previous audit found that the department ‘was unable to assess the risks of non‑compliance or identify systemic non‑compliance and apply resources where the risk is higher’, due to ineffective processes in collecting, storing and retrieving data for the purposes of evaluating individual industry participant performance. The ANAO recommended that in order for the department to focus compliance resourcing on areas with a higher risk of non‑compliance, the outcomes of compliance activities should be incorporated into future compliance plans.
The single analysis delivered in July 2017 was not used in the development of the National Compliance Plan (NCP) for 2017‑18. The department instead developed the NCP by extracting other compliance data from the RMS and Microsoft Excel regional office work plans. The ANAO concluded that, as a result, the NCP was not informed by an analysis of trends in non‑compliance and did not identify industry participants exposed to a higher risk of non‑compliance.
In response to the follow‑up audit recommendation, the department stated:
The Department has established a dedicated team responsible for producing analytical products that inform compliance activities. Analysis is informing both the annual National Compliance Plan (NCP) and the targeting of industry participants to ensure that compliance activities are directed to the highest risks. The NCP is updated, and program delivery adjusted, based on a monthly review of compliance activity results, non‑compliance trends, and threat and risk information.
At the public hearing, the department further outlined that ‘we are now in the position, following the follow‑up audit, to implement the type of analytical products that feed into the national compliance program which were recommended by the ANAO’. The department explained that two main analytical products are informing the NCP in 2018‑19: one product is the analysis of covert system test results, which aims to identify systemic issues, such as at particular screening points or access points at airports. The department confirmed that this is an expanding program of approximately 1000 covert system tests to be conducted over 2018‑19.
The follow‑up audit outlined that the department ‘directs compliance resources to categories of airports and airline operators that present a higher security risk’. The department determines the risk and compliance prioritisation ratings of categories of airline operators and airports, and resources and compliance activities are directed based on these ratings. However, the department does not identify individual industry participants – individual airports and airline operators – where the risk of non‑compliance is higher, and does not direct compliance resources accordingly.
At the public hearing, the department outlined that the other main analytical product that informed the NCP in 2018‑19 was the analysis of industry participant compliance history. This analysis examines individual industry participants and trends in their compliance history, assessing whether compliance is improving or deteriorating.
Compliance and enforcement framework
The previous audit found that the department ‘did not have an enforcement policy and did not utilise the full suite of enforcement actions available to it’. Consequently, the previous audit suggested the department develop a clear enforcement strategy.
Following the previous audit, the department provided training to all transport security inspectors to improve consistency in the issuing of non‑compliance notices and observations. The department also began to ‘make greater use of additional enforcement options such as infringement notices’.
The follow‑up audit found that the department developed a strategic plan for 2017‑20, a compliance and enforcement framework, and an implementation plan. The strategic plan broadly outlines the department’s compliance approach and indicates that the department ‘intends to build its capacity to undertake enforcement action over the course of the next three years’.
The ANAO outlined that the implementation plan included components such as the management of enforcement activities, the proposed delivery model, the establishment of an enforcement team, and suggested training for personnel focused on enforcement and investigative skills. The implementation plan also identified that the department should provide clear guidance indicating the various enforcement options and the escalation process to assist in the management of non‑compliance. At the time of the follow‑up audit, the implementation plan was not yet approved.
At the public hearing, the department indicated that the three‑year timeframe for the implementation of the compliance and enforcement framework included an initial focus on the learning and development framework. This was to ensure that personnel are capable of ‘effective decision‑making, to collect evidence effectively and to have consistency and lawful decision‑making by our inspectors’. Training for staff commenced in February 2018 and included staff enforcement capability training.
The department argued that the absence of a finalised implementation plan does not mean ‘that we aren’t doing very effective and strong compliance already. We are very active in this space’.
The department further explained that the implementation plan outlined ‘that the core elements of the enforcement capability will be implemented within 12 months’. The core elements include completing staff enforcement capability training and starting enforcement trials, and developing enforcement procedures and supporting materials. In December 2017, the department also commenced a limited infringement notice trial which has now been expanded nationally. The department explained that ‘the trial enables application of staff enforcement training, with outcomes informing the development of enforcement procedures’.
However, the timeframe for the implementation of the framework may be extended by 12 to 24 months if the department decides to develop a centralised case management system.
Learning and development
The previous audit found that departmental staff were not trained in using the Regulatory Management System (RMS), which resulted in significant issues with the completeness and accuracy of the data captured in the RMS. The ANAO recommended that the department ‘conduct a training needs analysis for users of the RMS system, deliver appropriate training and monitor its effectiveness’.
The department initially advised that the training needs analysis would be completed by December 2016. However, the department decided to undertake a broader training needs analysis than the ANAO had recommended, and completed this analysis in February 2017. The department commenced consolidating the recommendations from the training needs analysis into a learning and development framework in June 2017. The department began delivering the training as per the learning and development framework in February 2018. Tailored specialist modules, including training on the RMS, were scheduled for delivery from June 2018.
At the public hearing, the department explained that all operational staff had commenced ‘either a nationally recognised qualification in cert IV in government or a diploma in government’ in February 2018. This training includes core capabilities, specialist capabilities, and system test training, and all staff will complete the training program by November 2018 with assessments to take place throughout December 2018. IT system processes and ensuring the integrity of data entry are included in this training program.
As part of the learning and development framework, all staff are required to complete training in core capabilities, and managers are required to complete leadership modules that include performance management, continuous improvement, and coaching and developing others. Staff will undertake re‑accreditation and assessment on an annual basis. The core capabilities included in the training are:
communication and representation;
principles of decision‑making;
administrative law and legislation;
analysis and critical thinking; and
judgement and evidence‑based decision‑making.
In addition to the core capabilities, there are two specialist capability streams which have been tailored to be role‑specific to departmental staff, as outlined in Table 3.1.
Table 3.1: Role‑specific capability streams for staff
First stream:
Preparing documentation and writing reports
Analysis and judgement
Second stream:
Preparing documentation and writing reports
Conducting audits and inspections
Acting on non‑compliance
Foundation and advanced training courses for the covert system test function
Source: Department of Home Affairs, Submission 8.1, pp. 2–3.
The follow‑up audit found that the department’s learning and development framework ‘broadly aligns with six of the seven principles for better practice in learning and development identified by the Australian Public Service Commission’. However, the learning and development framework did not include a monitoring and evaluation component. The ANAO found that the department was developing a monitoring and evaluation component, but as at March 2018 it was not yet finalised.
The monitoring and evaluation strategy was finalised in August 2018 and will commence from January 2019. These timeframes were agreed to allow for the implementation of the certificate IV qualification training program for operational staff in 2018. From January 2019, the department will ‘start monitoring those data points across our systems to see if we’re getting a lift in performance both in terms of data quality and in terms of outcomes’.
The department continuously examines the quality of the work being produced through the division’s internal operations governance and quality assurance process. This involves ongoing activities such as assessing whether appropriate procedures were followed. The department explained that:
… setting January as the formal start point, that doesn’t mean that we’re not looking at the data now; it means in terms of technically doing the performance evaluation of the L&D framework, we’re starting the evaluation once the L&D implementation is complete.
At the public hearing, the Committee queried what baseline data the department was using to evaluate the performance of the delivery of the training. The department confirmed that it was satisfied that the baseline data that it had access to will enable it to monitor and evaluate the effectiveness of the training, and outlined:
…we’re looking at our internal audit process and quality assurance process. We’re looking at the actual data in our IT system, the regulatory management system itself, so we can see if we’re having quality control issues with the data and if that’s improving through time. We’re looking at our staff and development performance agreements and performance reviews, so those processes with staff, and also looking at training evaluation reports coming out of the L&D process.
Baseline data measures conditions before a project or initiative is commenced for the purposes of later comparisons. In response to the department’s explanation of the learning and development framework, the Auditor‑General noted that the department’s implementation of a structured framework was a positive step. However, in terms of baseline data, the Auditor‑General stated that:
I don’t think there would be much historic data to rely on, because there wasn’t any system for collecting that data or identifying who was trained or who wasn’t, according to the evidence that we saw in the previous audit.
The follow‑up audit noted that, in order to evaluate the extent to which the objectives of the framework are being met, the department may use the results of the training needs analysis as baseline data. The department’s response to the recommendation in the previous audit (Recommendation No. 4) outlined that the training needs analysis would include an assessment of the skills, knowledge and capabilities of staff against their roles, which would be used to address gaps.
Performance monitoring and reporting
The previous ANAO audit found that the department ‘did not have appropriate performance measures in place to determine whether passenger screening is effective’. As a result, the ANAO recommended that the department develop practical, achievable and measurable performance measures in consultation with stakeholders.
Since 2002, four of the 10 reviews conducted into aviation security have specifically noted the need for the department to develop and implement performance measures, including ANAO Report No. 26 (2002‑03), Aviation Security in Australia. The department has commenced the process of developing performance measures on several occasions since 2009.
In October 2017, draft performance measures were approved by the department and circulated to industry stakeholders. The ANAO examined the performance measures developed by the department and found that the performance data required was already being generated by most industry participants. Further, the reporting frequency and timeframes of the proposed measures were considered ‘practical and achievable from an industry perspective’.
The follow‑up ANAO audit explained that the department was due to commence the implementation of performance measures in March 2018 by initially undertaking a pilot program with a selection of industry participants. Wider implementation was to occur from 1 July 2018 as part of the department’s Enhanced Mandatory Reporting Project. The implementation of performance measures was to be finalised by February 2019. As the department had developed but not yet implemented performance measures, the follow‑up audit recommended that:
In implementing Recommendation No.3 from the previous audit, the Department should ensure that performance measures are established in a timely manner alongside an effective monitoring and review mechanism to provide assurance that the performance measures developed for passenger screening are practical, achievable and measurable.
At the public hearing, the department outlined that the trial period was underway: the pilot commenced in May 2018 and is due to be completed in October 2018. In response to the Committee’s query regarding when the department would achieve full implementation, the department responded:
We’ll have to assess the outcomes of the trial data with airports following the conclusion of the trial in October. Because we are relying on the provision of data from airport operators, we’ll have to agree with them on the timings that are practical in terms of settling the measures, and to establish a plan going forward from there.
When the Committee further queried if there was an expected implementation date, the department responded that:
I don’t think one has been set. We’re doing the trial to identify these issues, as we have said, and then we’ll have to take those into account in setting a reasonable path forward, given that we will be requiring industry to provide this information.
In response to the department’s explanation, the Auditor‑General noted that the ANAO report outlined the original target implementation milestones for performance measures. The Auditor‑General explained that the advice given by the department at the public hearing was different to the ANAO report and ‘seems to have a longer time frame associated with it’.
Following completion of the trial in October 2018, the department plans to undertake analysis of the results and commence further industry consultation during the first quarter of 2019. Subject to regulatory approval, the department ‘will then develop regulatory amendments to require the mandatory provision of screening performance data’.
The department explained that the trial period for performance measures involves airports of varying sizes, system maturity, and data collection methods. The findings from the trial have highlighted that due to the range of industry participants and the variety in the systems used, ‘there are implementation issues with consistency of data and in individual airports’ ability to provide data in a consistent way’.
The department also outlined that this trial period was ‘an important aspect of testing that this data is in fact practical and achievable and measurable, as set out by the Audit Office’. The department is planning to assess whether the performance measures are practical, achievable and measurable, by considering:
what data received through the trial provided valid measurement of passenger screening performance; and
the industry’s ability to provide the Department with the data on a continual and consistent basis.
The previous audit noted limitations to the department’s performance reporting, including that the Regulatory Management System (RMS) did not have a reporting function. The department initially proposed to address this issue by endorsing 15 operational reports; six of these reports were classified as ‘high priority’ and would have cost $160 000 to develop. Instead, the department chose to build the reporting function in‑house.
The in‑house reporting function involves the manual extraction of data from the RMS. As at December 2017, the RMS could not produce ‘accurate standard reports such as industry participant contact details, and compliance histories of individual industry participants’. The follow‑up audit concluded that the department has improved its capacity to provide reports to meet stakeholders’ business needs. However, the production of these reports is resource intensive due to the requirement to manually extract data from the RMS and Microsoft Excel based regional office work plans.
Senator Dean Smith
1 April 2019