Data, privacy and surveillance
This chapter explores how data and privacy breaches can occur in relation to AI and ADM, and cautions against excessive monitoring and surveillance.
Data and privacy
Data collection
4.2 The collection of worker data is a necessary part of business operations. From names, addresses, photos and bank details to health histories, criminal records and trade union memberships, organisations handle considerable amounts of sensitive and personal information about their workers.
4.3 Big data is used to build, test, and operate AI and ADM. As highlighted by DEWR, data can be used to ‘create a feedback loop where adoption of AI systems necessitates greater collection of employee data’. Figure 4.1 shows how big data fits into the cycle of innovation in workplace technologies, including AI and ADM.
Figure 4.1 Cycle of innovation in workplace technologies

Source: DEWR, Submission 3, p. 11.
4.4 With the increasing digitisation of workplaces, including work from home arrangements, worker data is often collected without workers’ prior knowledge or consent. Dr Lisa Heap, Senior Researcher, Centre for Future Work, revealed that when workers are asked to consent, they are often asked to sign extremely broad waivers. These waivers allow employers to use worker data in any way they deem fit for business purposes. Software providers rely on these waivers to access and use the data that employers input into these systems, often without workers’ or employers’ awareness.
4.5 The Victorian Trades Hall Council outlined that excessive data collection is also occurring through biometric testing, 24/7 monitoring, and invasive machine learning techniques. This is being used to infer information about the ‘most intimate parts of workers’ lives’. Concerns were also raised about the validity of collecting sensitive biometric data, including blood samples, facial features, fingerprinting, and other personal characteristics.
4.6 Dr Lisa Heap further explained that even legitimate data collection can be misused. For example, worker data is being collected for employer-offered health and wellbeing services, yet it is unclear how this data is being disclosed beyond these services.
4.7The QUT Centre for Decent Work and Industry submitted that work from home arrangements, along with the use of personal devices by workers, ‘blurs the distinction between work and non-work activities and space’. Workers are not able to adequately preserve the ‘public/private distinctions between work and non-work time’. Excessive data collection regarding workers’ personal lives is now commonplace, calling into question whether there is a genuine link to business function.
4.8 Mr Robert Potter, National Secretary, Australian Services Union, also explained that excessive monitoring of work from home arrangements could undermine the protection of children at home. Where workplace tracking data is accessed by poor actors, including perpetrators of domestic violence, workers would be exposed to serious breaches and harm. Excessive monitoring and surveillance in these instances jeopardises the safety not only of workers, but also of their families.
4.9 As highlighted by the Finance Sector Union, data collected by organisations and fed into AI does not capture contextual information about workers, which is required to draw sound and fair conclusions about workers’ productivity and habits. Excessive monitoring, such as keystroke monitoring, used as a ‘proxy’ for worker productivity does not tell ‘a complete story’. Workers often receive unfair and unfavourable evaluations that endanger their employment because of these practices.
4.10 The Victorian Trades Hall Council also explained how technologies may be used to ‘pre-emptively define which workers are most likely to “rock the boat”’:
employers have been caught using these technologies to discover the following about potential or current workers in their organisation:
- Their likelihood of falling ill and needing to take above average sick days.
- Their likelihood of becoming pregnant.
- Their sleep, fitness, and wellbeing habits.
- The structure of their personalities, including tendencies for disobedience.
- Their work satisfaction.
- Their likelihood of wanting to stay in the company long term.
4.11 The Victorian Trades Hall Council further noted that it is unacceptable when the inferences drawn from the data collected provide the basis for adverse management decisions, including decisions to unfairly ‘eliminate annual raises for [workers], expel them from the candidate selection process, or otherwise push them out of the organisation’.
Disclosure and breach of privacy
4.12 Even though worker consent is required prior to the collection of data, there are no limits on how organisations may use this worker data once held. While the Australian Privacy Principles, contained within the Privacy Act, provide best practice guidance for the handling of worker data, application of the principles is discretionary and unenforceable for most Australian organisations. This is because of the exemptions under the Privacy Act that apply to small and medium-sized enterprises (SMEs) and to worker data.
4.13Workers, and most employers, must look to the Fair Work Act to see what rules apply to the protection of worker data. Protections under the Fair Work Act, however, are limited to record keeping obligations, rather than privacy protections.
4.14The gap between the frameworks under the Privacy Act and Fair Work Act means workers have limited control over how their data is being used once collected. There is also limited recourse at the Commonwealth level when workers’ privacy is breached because of use and disclosure by organisations.
4.15Mr Robert Potter, National Secretary, Australian Services Union, illustrated how worker data can be breached, exposing workers to significant harms:
we're seeing a spike in cyberhacking and where people’s whereabouts on a daily basis and their pattern of work and their locations could be linked to some pretty poor actors out there in the world. Quite often we see employers wanting to track workers by GPS to try and put some level of monitoring and performance measures around that. Because that is location based, if there was a data breach, we fear that those people and their whereabouts could be tracked. Poor actors, for example, would know when they’re not home.
4.16 The AI Threat Landscape Report, referenced by Basic Rights Queensland, found that 98 per cent of IT leaders believed their AI models were crucial to success, but 77 per cent had already experienced data breaches. These breaches included:
- exposure of personal data leading to identity theft, fraud, and other malicious uses
- re-identification of individuals from anonymous data by correlating data points
- disclosure of sensitive information leading to embarrassment, discrimination or financial harm
- monitoring and surveillance through cameras or social media without consent
- secondary use of data for targeted advertising, profiling or influencing political opinions
- generation of scripts to scrape data for malicious purposes.
4.17 While data breaches are becoming more common, trading of worker data often occurs through lawful data disclosures. Organisations are being encouraged to capture excessive worker data regardless of its connection to business function. Problematically, organisations often do not fully appreciate how this data is being used. Mr Rupert Walsh, Chief Technology Officer, InnovateGPT Group, told the Committee that many organisations do not have sufficient policies or experience to effectively protect worker data. He also highlighted that these data disclosures are being made to overseas companies. Reliance on AI and ADM owned by companies based in the UK, US and China exposes Australian workers to the risk of their data ‘falling into the “wrong hands”’.
4.18 The lifespan of data is also getting shorter as the market and economy become more dynamic and volatile. This means much of the data, including worker data, is obsolete by the time it produces outcomes. Secondary use of worker data for training AI and ADM calls into question whether these technologies are actually capable of producing accurate and up-to-date outputs.
4.19 Treating workers as ‘data subjects’ overlooks the real harms that can occur if their data is misused or disclosed to unscrupulous groups. Organisations should exercise greater caution because ‘once there is a privacy breach, it is too late to act; the repair is not possible’.
Monitoring and surveillance
4.20As noted above, monitoring and surveillance is one method of data collection that has intensified with the digital transformation. Various surveys suggest 60–70 per cent of workplaces use digital surveillance to collect worker and workplace data.
4.21Ms Nicole McPherson, National Assistant Secretary, Finance Sector Union, shared her view that workers are under surveillance every minute of the working day:
Basically every moment from when they arrive at work, swipe their card to get in the building and log onto their computer to when they leave at the end of the day. Every step that they take at work is largely monitored. We see specifically log-on and log-off times monitored. We see things like the time spent on calls or time spent on tasks, recording of all internal and external interactions, keystrokes logging, and logging of how they interact on programs like Teams.
4.22Figure 4.2 identifies that monitoring and surveillance is carried out through a broad range of techniques and technologies:
Figure 4.2 Types of workplace surveillance and monitoring techniques

Source: DEWR, Submission 3, p. 8.
4.23According to the Victorian Trades Hall Council, monitoring and surveillance can be legitimate, context-dependent or illegitimate.
- Legitimate surveillance observes processes, including trends on general operations, protection of property and assets, ensuring WHS, and evaluation of production processes. Data is not attributable to specific workers.
- Context-dependent surveillance observes performance of teams or workers. It can be used positively for feedback or training or negatively to influence unwarranted disciplinary action.
- Illegitimate surveillance observes people and is used to make inferences about workers and their personal idiosyncrasies outside the scope of their job functions. This includes monitoring of their personal life, attitudes towards management, industrial relations engagement, medical history, and health and lifestyle.
4.24 Legitimate monitoring and surveillance, carried out lawfully and with sufficient oversight, serves an important function in ensuring that business is conducted effectively, efficiently, and safely. The Committee heard, however, that excessive monitoring and surveillance is leading to significant harms to workers, businesses, customers and the public more broadly.
Work health and safety
4.25 Safe Work Australia maintains that safety risk management obligations apply regardless of whether the risk is caused by physical hazards or technology:
the approach to managing risks and the health and safety in your workplace arising from new ways, new software or whatever, will be the very same as when we manage the risks of guarding machinery or other risks and hazards in the workplace. The key message we want to give today is that the fundamentals of safety risk management are designed to deal with change.
4.26 There are significant WHS implications arising from excessive monitoring and surveillance of workplaces. The Committee heard that this can amount to extreme micromanagement and have devastating consequences.
4.27Dr Fiona Macdonald, Acting Director, Centre for Future Work, used home-care work as an example of how AI-enabled excessive monitoring and surveillance amounted to micromanagement for workers:
Home-care workers are required to clock-on and clock-off at the beginning and end of tasks as well as at the beginning and end of a job. Their locations are tracked from place to place and their journeys from one job to another are tracked to make sure they stick with certain routes. AI is used in those circumstances to send messages to managers or supervisors, who can contact them when they think the worker has done something wrong, so there is that constant surveillance and immediate contact.
4.28The QUT Centre for Decent Work and Industry also outlined how these technologies track workers’ every move, in and out of work to ‘optimise worker productivity’. This includes automatically taking screenshots of remote workers’ computers, keystroke tracking, voice and facial expression monitoring, recording of conversations, and even sleep habit tracking.
4.29The Australian Services Union noted that this is creating workplaces characterised by suspicion and worry, especially when they involve ‘wearable monitors, pervasive surveillance systems, and tracking software’. This environment is directly contributing to WHS hazards, including increased rates of ‘strain and injuries’, and ‘exhaustion and mental stress’ among workers. These are manifesting as repetitive stress injuries, sleep difficulties, and depression and anxiety.
4.30 For instance, the Victorian Trades Hall Council identified a blue-collar worksite subjected to ‘excessive use of surveillance’. This resulted in a ‘highly dysfunctional and negative’ work environment which saw three workers placed on suicide watch, with one even taking their own life.
KPIs
4.31 Organisations can use AI systems to set intensified and unachievable key performance indicators (KPIs), which are then imposed on workers whose performance is tracked against them. This can lead to significant physical and psychosocial harms in the workplace if KPIs are unsustainable or unrealistic.
4.32 Mr Bernie Smith, NSW Branch Secretary and Treasurer, Shop, Distributive and Allied Employees Association, shared that, in intensifying work to meet unreasonable KPIs, organisations are attempting to ‘remove every bit of downtime between breaks’ and ‘discourage normal human interaction’ between workers.
4.33 Further, some organisations are using monitoring and surveillance technology in ways that encourage risk-taking behaviour, as demonstrated by the Toll Transport case. In this case, Toll Transport installed real-time monitoring cameras inside vehicles to identify workers who may be distracted or fatigued. Despite limited studies on the effects on health, infrared beams were shone into the eyes of drivers for up to 12 consecutive hours. This monitoring and surveillance technology was focused on ‘constantly assessing driver behaviour and productivity’ at the expense of worker and road safety.
4.34 In the case of delivery drivers, the Committee heard that pressure to meet intensified KPIs is leading them to take unacceptable risks, including ‘speeding, ignoring road signs, and driving in poor weather conditions’. These KPIs are ‘contributing to long work hours, fewer breaks and driving when fatigued’. In extreme cases, this is leading to workplace deaths. According to the Transport Workers Union, since 2018, 18 gig-economy transport workers have died because of these conditions:
workers are entirely managed through artificial intelligence … the outcome of AI used in this way in the transport industry is that people die. When AI pressure to deliver at unrealistic speed leads to dangerous practices on our roads, people die.
4.35 AI systems are also leading managers to draw incorrect inferences about workers’ performance and customer sentiment from surveillance material. As the Finance Sector Union explained, if a customer makes a comment to a worker such as ‘unfortunately, the weather isn’t great’ during a recorded conversation, the technology picks up the word ‘unfortunately’ and rates the interaction as negative. The worker is then held accountable for this ‘negative’ interaction.
4.36 Workers internalise negative outcomes instead of attributing their inability to meet unreasonable KPIs, or to overcome poor performance evaluations, to AI and ADM. Mr Oscar Kaspi-Cruchett, Researcher, Politics, Victorian Trades Hall Council, told the Committee that workers often do not recognise that the KPIs were unattainable in the first place:
If they get a low score from these systems, it triggers shame, hurts their self-esteem and creates anxiety. Workers may have the idea that, ‘No, I’m being overworked; I’m being asked to do too much without any increase in my pay’, but these technologies are used to avoid them reaching that conclusion, and instead they blame themselves.
4.37Excessive monitoring and surveillance to track workers’ attainment of KPIs can create unsafe workplaces. Regardless of organisations’ intentions, stakeholders submitted that these ‘profit-seeking’ and cost-saving tactics undermine the health and safety of workers, and directly contradict employers’ obligations to eliminate or minimise such hazards.
Dignity and autonomy
4.38Excessive monitoring and surveillance can reduce dignity and job control for workers. The Australian Industrial Transformation Institute explained that, ‘dignity at work is undermined when technology cannot be overruled and those subjected to surveillance are denied the right to comment/correct/present their side’.
4.39 According to the Queensland Nurses and Midwives’ Union, workers at the Commonwealth Bank of Australia were forced to take leave from their roles because of covert excessive monitoring and surveillance. A desk booking system was used to determine whether workers were productive enough. If a worker was deemed unproductive in this AI-informed assessment, they were intimidated into taking leave.
4.40 Reducing workers’ ability to control the way they carry out their jobs limits their autonomy and their ability to exercise expert judgement. Mr Simon Mitchell, Queensland Nurses and Midwives’ Union, and councillor at the Queensland branch of the Australian Nursing and Midwifery Federation, highlighted how low job control in healthcare creates significant risks for clinicians and patients:
if you're looking at a patient, it's the ability to very rapidly assess a patient who may be deteriorating very rapidly and asymmetrically—that is, a small disease process can have a devastating effect on them because of comorbidities. So it's important that we have a health culture that allows not so much command and control but that collegiality and flexibility for nurses to be able to quickly adapt to that environment. A big issue for us, particularly in big health institutions, is managerialism—the control by management over clinicians, to the point where clinicians cannot or don't feel confident to be able to actually carry out their roles to full capacity.
4.41 Where management is completely automated through systems, as in gig-economy jobs, WHS issues and concerns about job control can be left unaddressed. For example, Mr Utsav Bhattarai, a gig-economy transport worker, told the Committee:
I raised a safety issue through the app, an issue that concerned my well-being on the job. The response I received was automated, irrelevant and entirely disconnected from the reality of my situation. It was as if I was shouting into a void, hoping for help but not receiving any—only receiving an echo of a system that doesn’t listen.
Bullying, discrimination, and inappropriate conduct
4.42Excessive monitoring and surveillance technologies can enable bullying, harassment, discrimination, and other inappropriate conduct in the workplace. Marginalised cohorts who are overexposed to workplace discrimination are ‘more likely to feel the effects of surveillance’. This includes women, multicultural communities, First Nations people, young people, and people with disabilities.
4.43The Victorian Trades Hall Council shared an example of a young female copywriter who reached out to the Young Workers Centre to report how monitoring and surveillance was being used by her manager to sexually harass and bully her team:
[she] was required to be on zoom every hour she was working, even while she was on the phone or writing material as part of her work, so her employer could watch her work. Her manager separated her out into breakout ‘rooms’ and used the rooms to sexually harass and bully her. She also overheard him bullying other staff members, and witnessed him punching a wall in his home office.
4.44 While the onus of providing safe workplaces rests with employers, work from home and remote work arrangements can undermine this. Stakeholders stated that these arrangements have opened more ‘online channels which provide new opportunities for sexual harassment’ by colleagues and managers.
4.45 Basic Rights Queensland shared research which outlined the rates of workplace technology-facilitated sexual harassment (WTFSH). This was defined as ‘unwelcome and/or threatening sexual conduct using mobile, online and other digital technologies within a workplace context’. The research found that:
- 1 in 7 survey respondents had perpetrated WTFSH, with 24 per cent of men and 7 per cent of women reporting perpetration
- 1 in 4 respondents who perpetrated WTFSH claimed they were motivated by negative feelings, including wanting to annoy, humiliate, and frighten their victims
- 45 per cent of respondents who perpetrated WTFSH worked in male dominated industries
- 39 per cent of respondents who perpetrated WTFSH never had reports or complaints made against them.
4.46 The research also identified that persistent sexist and discriminatory attitudes at work reinforced sexual harassment and low levels of reporting by victims.
4.47 Surveillance is often more ‘concentrated’ in low-paid, ‘feminised’ industries, including retail, customer service, hospitality, and caring sectors. Global research found that in the private sector, women experience higher rates of surveillance, with non-unionised women 52 per cent more likely to experience surveillance at the workplace. Women workers also have more sensitive personal data collected by their employers which could exacerbate harms. This includes pregnancy history or experiences of family violence.
4.48 Excessive monitoring and surveillance can also amount to bullying, intimidation, and extreme invasions of privacy. This occurred in 2017 when workers took industrial action at a mine in Queensland. During this period, dubbed ‘Project Zuckerberg’ by the employer and its security firm, workers were subjected to a ‘full scale intelligence operation’, which included:
- charter planes and ‘operative’ agents engaged to track workers
- workers’ homes being encircled by unidentified security agents wearing body cameras
- workers being filmed in their homes and out with their families
- reading of internal emails to identify incriminating information about workers.
4.49 The above example also highlights how some organisations are using excessive monitoring and surveillance to thwart workers’ lawful right to organise. For example, Amazon has reportedly engaged in a wide range of activities to discourage union activity, including:
- hiring staff for global surveillance of union activity across all sites
- using AI-driven heat maps to forecast the probability that certain worksites will unionise
- forcing workers to attend and watch anti-union meetings and propaganda
- banning discussions about unionisation on the shop floor
- threatening to withhold wage increases if workers unionise
- placing postal boxes for collection of votes by workers seeking to unionise under constant surveillance.
4.50 The harm caused by excessive monitoring and surveillance is compounded when it targets specific workers or occurs without their awareness. Professor Paul Henman, Chief Investigator, Australian Research Council Centre of Excellence for Automated Decision Making and Society, explained:
An important and growing aspect of contemporary surveillance is what Lyon calls ‘surveillance as social sorting’ – whereby digital surveillance sorts or classifies employees and customers into different categories to be treated differently and different levels of surveillance and monitoring [applied].
4.51Research by the QUT Centre for Decent Work and Industry found that ‘all but one’ of the popular monitoring technologies ‘offered covert monitoring to ensure that employees were “unaware” that they were being monitored’.
4.52 While technology can and does act as a ‘disincentive’ to workers engaging in bullying and sexual harassment, these technologies can also enable and worsen this conduct if used inappropriately.
Committee comment
4.53The Committee recognises that the collection of worker data can be a necessary and legitimate part of business operations, and that big data—which can involve worker data—is used to develop AI and ADM. However, with the rapid digital transformation of workplaces, employers are arguably engaging in excessive data collection.
4.54Further, the Committee heard about significant regulatory gaps for the protection of workers’ data. For example, while worker consent is required prior to the collection of worker data, there are no limits on how organisations may use this data once provided and stored.
4.55The Fair Work Act is limited to protections around record keeping obligations, rather than privacy. It is the Committee’s view that certain disclosures should be prohibited, as well as the sale to third parties of workers’ personal data and any data collected in connection to work or undertaken during employment.
4.56The Committee also supports strengthening dispute resolution processes to better deal with workplace disputes in relation to privacy matters. As the Privacy Act is not well equipped to deal with such workplace disputes, it is recommended that the Fair Work Act establish—within the Fair Work Commission—a focus on dispute resolution processes for complaints about non-compliance with privacy obligations.
4.57 Protecting data and privacy will remain a primary concern for government, particularly as cyber-attacks lead to more frequent and damaging data breaches—for example, the Medibank and Optus data breaches (2022) and the Latitude data breach (2023). The 2023–2030 Australian Cyber Security Strategy demonstrates that shared efforts across government, industry, business and the community are imperative to protect Australian data. The risk of data and privacy breaches will increase if existing regulatory gaps remain regarding the use and handling of worker data. Australia requires robust and comprehensive protections for worker data across all sectors.
4.58The Committee is conscious of the impacts relating to the collection and handling of data through technology-enabled monitoring and surveillance. While acknowledging that monitoring and surveillance can be carried out for legitimate and lawful purposes, the Committee is concerned about illegitimate or excessive surveillance leading to significant harms to workers, their families, businesses, customers and society more broadly. The Committee believes that meaningful consultation and transparency with workers on the use of surveillance measures and data used by AI systems in the workplace is necessary.
4.59 With the increasing digitisation of workplaces—from work from home arrangements to phone recordings—there is a surge in worker data being collected by employers. This can have significant impacts on workers. Excessive monitoring and surveillance of workplaces can negatively affect WHS. Technology used to track workers’ attainment of KPIs can create unsafe workplaces. This can lead to significant physical and psychosocial harms. It can reduce dignity and job control for workers, leading to mental health impacts. Excessive monitoring and surveillance technologies can also facilitate bullying, harassment, discrimination, and other inappropriate conduct in the workplace. The evidence suggests this is even more pronounced for marginalised cohorts.
4.60Regardless of whether existing WHS frameworks are poised to respond to complex technology-related hazards, employers need more support to safely deploy AI and ADM. The Committee recommends the development of a Code of Practice, with accessible guidance for deployers and users. Realising the benefits of AI and ADM must not displace the paramount responsibility of employers to secure the health and safety of their workers.
4.61The Australian Government’s newly enacted right to disconnect legislation acknowledges that workers should be allowed to enjoy a balanced life outside of work. However, increasingly intrusive monitoring and surveillance techniques call into question whether this balance can be realised for all workers. Harmonised action across all states and territories is critical to ensuring that highly monitored workplaces are not undermining the privacy and dignity of workers.
4.62The Fair Work Act prohibits employers from taking an adverse action against an employee or a prospective employee because of a protected attribute, like political opinion, gender identity, family or carer responsibilities, and pregnancy. The Committee underscores that employers should not use technological surveillance in relation to an employee’s protected attribute.
4.63The Committee recommends that the Australian Government review the Privacy Act 1988 (Cth) and Fair Work Act 2009 (Cth) to protect workers, their data, and privacy by:
- banning high-risk uses of worker data, including disclosures to technology developers
- prohibiting the sale to third parties of workers’ personal data and any data collected in connection to work or undertaken during employment
- requiring meaningful consultation and transparency with workers on the use of surveillance measures and data used by AI systems in the workplace
- empowering the Fair Work Commission to manage the dispute resolution process for complaints relating to breaches of workers’ privacy obligations.
4.64The Committee recommends that the Australian Government:
- work with states and territories to develop greater consistency and better protections against excessive and unreasonable surveillance in the workplace
- explicitly prohibit employers from using technological surveillance in relation to an employee’s protected attribute.
4.65The Committee recommends that the Australian Government work with Safe Work Australia to develop a Code of Practice that identifies and addresses specific work health and safety risks associated with AI and ADM. This includes establishing limits on the use of AI and ADM in workplaces to mitigate psychosocial risks.