Automated systems increasingly make employment decisions, determining who gains a job and who loses one.
Ponder this for a minute: would you still have your job if an algorithm had called the shots during the hiring process? Your financial history, health records, social media connections, union membership, fitness tracker data, shopping habits, and leisure activities – all of these could impact your employment prospects, or even determine your fate in the workplace. Sounds absurd, right? Well, it may not be as far-fetched as you think.
The rise of "management-by-algorithm," or automated decision-making, is turning out to be a global phenomenon. Data from multiple sources is increasingly being used in HR processes, with little regulation to protect against misuse. Outside Europe, there are minimal restrictions on how companies can collect and utilize workers' personal data, making it crucial for trade unions to address this regulatory gap and put workers' data rights on the agenda.
Consider this: could an algorithm prevent you from getting hired, demote you, or even fire you? You're leaving a digital footprint everywhere you go. From social media profiles and search histories to customer service calls, medical records, and location data, you're constantly sharing your personal data – sometimes willingly, sometimes without even realizing it. This data is often used as currency, fueling the growth of "free" digital services.
Algorithms and artificial intelligence are increasingly being deployed in HR and productivity planning. Companies are jumping on the bandwagon, automating everything from job applicant sorting to productivity measurement, employee mood testing, and even analysis of what motivates you. This raises serious concerns about surveillance, manipulation, and algorithmic control in the workplace.
Union action is vital in this regard. We need to organize, ally with like-minded organizations, and demand a fair share of the data wealth. Regulatory gaps must be filled, and workers' data rights must be established on various levels, from collective agreements to national and international legislation and conventions. We must mobilize the ILO, the UN's Human Rights Council, national governments, social partners, and companies themselves to protect workers' fundamental rights.
Are you being managed by an algorithm? UNI Global Union warns that "management-by-algorithm is spreading" and that data concentration is putting companies into an unacceptable position of economic, digital, social, and even political power[1]. The union advocates for collective ownership of data, ethical AI, and workers' data rights as key issues for unions.
UNI Global Union is actively working on these issues around the world, discussing how unions themselves can leverage datasets for their members' benefit. It is speaking out against the monopolization of data ownership and asking whether data should be made a commons, a public good accessible to all. It has also published the Top Ten Principles of Workers' Data Privacy and Protection and the Top Ten Principles of Ethical AI, outlining the essential demands to prevent an Orwellian future in which workers are subjected to algorithmic decision-making beyond human control and insight.
Unions around the world must address these issues, as technologies continue to evolve at a breakneck pace. We must be clear about our ethical demands on technology and ensure that people are not excluded from the labor market by an algorithm that supposedly no one controls. Collective ownership of data, ethical AI, and workers' data rights are key to creating a digital world of work that is empowering, inclusive, and open to all[1].
[1] Dan Blackburn, ICTUR (@Blackburn_ICTUR). (2018, November 15). "Are you managed by an algorithm? @CjColclough tells IUR that 'management by algorithm is spreading' and warns that data concentration is 'putting companies into an unacceptable position of economic, digital, social and even political power'." https://t.co/JpfcQuaw7j pic.twitter.com/R7omLDh6AT

Enrichment data: Thanks to robust data protection laws in Europe, organizations must handle personal data lawfully, transparently, and securely in HR processes such as recruitment, evaluation, and termination. The General Data Protection Regulation (GDPR) restricts automated decision-making that produces legal or similarly significant effects unless specific safeguards are in place. The EU AI Act imposes additional requirements on AI systems used in HR for recruitment, evaluation, and termination. Unions and worker organizations support strong data protection and transparency requirements, emphasizing the right to an explanation, human override, and non-discrimination. Regulators scrutinize the use of AI in HR, and initiatives such as the EDPB's Support Pool of Experts are designed to help data protection authorities enforce the law more effectively. Outside the EU, countries are developing their own frameworks, but none yet match the GDPR's comprehensive coverage.
- The question arises whether one's job would survive in a world where algorithms dictate hiring decisions.
- Data from various sources is increasingly being utilized in HR processes, with minimal regulations to prevent misuse.
- Algorithms can potentially prevent hiring, demote, or even fire individuals based on their digital footprint.
- Companies are automating HR and productivity planning, using algorithms for job applicant sorting, productivity measurement, and even mood testing.
- This raises concerns about surveillance, manipulation, and algorithmic control in the workplace, necessitating union action.
- Unions must advocate for a fair share of data wealth, fill regulatory gaps, and establish workers' data rights on various levels.
- UNI Global Union strongly advocates for collective data ownership, ethical AI, and workers' data rights as crucial issues.
- Unions worldwide must address the impact of technologies on workers' rights, ensuring that algorithms don't exclude individuals from the labor market.
- While Europe has robust data protection laws covering HR processes, countries outside Europe are developing their own frameworks, none of which yet match the GDPR's comprehensive coverage.