Data and Algorithms at Work: The Case for Worker Technology Rights

Annette Bernhardt, Reem Suleiman, and Lisa Kresge

Overview

Across the country, employers are increasingly using data and algorithms in ways that stand to have profound consequences for wages, working conditions, race and gender equity, and worker power. How employers use these digital technologies is not always obvious or even visible to workers or policymakers.[1] For example, hiring software from the company HireVue scores job applicants based on their tone of voice and word choices captured during video interviews.[2] Algorithms are being used to predict whether workers will quit, become pregnant, or try to organize a union, affecting employers’ decisions about job assignment and promotion.[3] Call center technologies are analyzing customer calls and nudging workers in real time to adjust their behavior.[4] And grocery platforms like Instacart are monitoring workers and calculating metrics on their speed as they fill shopping lists.[5]

In these and many other examples, business operations and decisions are informed by near-constant collection and analysis of worker data. This trend toward data-driven workplaces has been exacerbated by the COVID-19 pandemic, with workers experiencing more invasive forms of monitoring, both inside the workplace (such as tracking social distancing behaviors) and in remote workers’ homes (such as keystroke tracking).[6] And Amazon’s warehouse and delivery workers bore the brunt of skyrocketing demand for delivered goods, with constant surveillance and productivity tracking software pushing the pace of work to an alarming rate and putting workers’ health at risk.[7]

As a country we are finally talking about consumers and their technology rights, whether it’s about the data that social media companies are gathering and selling or the manipulation of elections via fake news postings. New policy responses are also starting to emerge. Consumer data privacy bills are proliferating at the state and federal levels, localities are banning the use of facial recognition technologies, and civil liberties groups are suing social network platforms over discrimination in ads targeted by race, gender, and age. The tech sector itself is engaging in debates about the ethics and regulation of artificial intelligence.

By contrast, despite the proliferation of “future of work” conferences and white papers, there has been almost complete silence in policy discussions when it comes to workers and their technology rights.[8] This, despite the fact that workers currently have very little say about what data is collected on them, how employers are combining that data with algorithms to make decisions about them, and how these systems impact their jobs and livelihoods.

The almost complete lack of regulation means that there are strong incentives for employers to use digital technologies at will, in ways that can directly or indirectly harm workers. Similarly, developers are largely free to sell untested and faulty systems based on dubious science, exacerbating the potential harms against workers.[9] Those harms can take the form of work intensification and speed-up; deskilling and automation; hazardous working conditions; growth in contingent work; loss of autonomy and privacy; discrimination; and suppression of the right to organize. Of particular concern is that workers of color, women, and immigrants can face direct discrimination via systemic biases embedded in these technologies, and are also most likely to work in occupations at the front lines of experimentation with artificial intelligence. A future where workers labor in digital sweatshops, micro-managed with no autonomy and under constant pressure, is not too difficult to imagine.[10] This is already the reality for some workers.

In short, it’s time to recognize that workers have important and legitimate interests regarding the use of data and algorithms, just as consumers do. The discussion of technology rights needs to be extended into the workplace, explicitly confronting the fundamental imbalance in power between workers and the firms they work for—whether as employees, subcontracted workers, or independent contractors. Will data-driven technologies be used to benefit workers and enable them to thrive in their jobs? Or will technology be used to oppressively control labor, deskill jobs, suppress the right to organize, and reinforce race and gender inequality?

Public policy has a pivotal role to play in answering these questions. Technology is not inherently good or bad, but neither is it neutral; the role of workplace regulation is to ensure that technologies serve and respond to workers’ interests and to prevent negative impacts. Regulation is all the more important because employers themselves often do not understand the systems they are using. What we need, then, is a new set of 21st century labor standards establishing worker rights and employer responsibilities for the data-driven workplace. These standards should be established both in public policy, which is our focus here, and in collective bargaining agreements in unionized workplaces.

The goal of this report is to give policymakers and other stakeholders an understanding of trends in the data-driven workplace and a framework of the technology rights that workers need and deserve. In Part I, we describe data-based technologies, how they are being used in a wide range of industries, and the potential harms for workers.[11] We then lay out in Part II a new set of policy principles that give workers rights with respect to their data; hold employers responsible for any harms caused by their systems; regulate how employers use algorithms and electronic monitoring; ensure the right to organize around technology; guard against discrimination; and establish a strong enforcement regime.

We view these technology rights and protections as the bedrock upon which to build an economy that works for everyone. Ultimately, the goal is that workers fully participate in decisions over which technologies are developed, how they are used in the workplace, and how the resulting productivity gains are shared. This participation need not and should not be anti-innovation, because workers have a wealth of knowledge and experience to bring to the table. Dehumanization and automation are not the only path. With strong worker protections in place, new technology can be put in the service of creating a vibrant and productive economy built on living wage jobs, safe workplaces, and race and gender equity.

 

Part I. The New Workplace Technologies

The revolution in big data and artificial intelligence of the past two decades has yielded a wide array of tools that employers can use to capture and analyze worker data, electronically monitor their workers, and manage them using algorithms.[12] Of course, data analytics applied to work processes is not new; for example, Taylorism and scientific management formed the linchpin of mass industrialization.[13] But today, we are seeing employers develop new business models and methods of worker control and productivity management based on digital systems that have the potential to substantially affect working conditions, job quality, and race and gender equity.

It is important to understand that the data-driven workplace is an emerging trend; we are just at the beginning of both the development and the adoption of these digital technologies. Moreover, the lack of regulatory oversight has turned workplaces into sites of experimentation with these systems, many of which are hidden from workers, policymakers, and researchers. That said, below we give a brief overview of data-driven technologies being developed for and deployed in workplaces, provide examples of applications in a range of industries, and identify potentially harmful impacts on workers. We draw on interviews with technology and labor experts, including workers, as well as technology vendor materials and extensive secondary research conducted by us and others.

 

A brief overview of data and algorithms

Data-driven technologies can range from the mundane (such as resume-parsing technologies that identify keywords and skills) to the incredibly complex (such as computer vision-based detection of human activities). Here we give a simple review of these technologies and how they are used.

Worker data collection: Employers can collect an extensive array of data about workers. Some of it is gathered in the workplace, such as computer activity, location in the building, customer ratings, bathroom use, coworker interactions, and smartphone app interactions. Other data is bought from third parties, like social media activity, credit reports, driving history, and consumer activity. Some of this data, such as criminal background checks, has been collected by employers for decades. More recently, as new wearable sensors have become available, employers have partnered with technology vendors and wellness programs to collect more personal biometric and health and wellness data. Methods of data collection range from directly soliciting data from workers (and customers) through surveys, to mining the internet, to embedding microphones in worker badges. Employers may collect worker data themselves, but they may also contract with third-party firms to do so; an entire ecosystem is emerging of businesses engaged in collecting, processing, and selling worker data. New technologies continue to be developed at a rapid pace, expanding the range of worker data that can be captured.[14]

Electronic monitoring: Electronic monitoring is a particularly invasive form of data collection that entails extensive, and often continuous, monitoring of worker behaviors and actions. While not new, electronic monitoring has become more common with the development of passive data collection technologies such as sensors embedded in workplace equipment, devices, and wearables (e.g., wristbands) that can capture a wide range of data on worker location, activities, and interactions with coworkers. Likewise, systems that log keystrokes and capture screenshots enable employers to monitor computer and internet activity. Employers also use GPS technologies embedded in vehicles or in workers’ personal smartphones to monitor their presence on job sites and track their locations while out in the field. More recently, sophisticated monitoring systems based on advances in computer vision and human detection are being used to analyze, in real time, video captured by workplace cameras.[15]

Algorithms: An algorithm, in its simplest form, is a set of rules in computer programming code for solving a problem or performing a task based on input data. Computers are able to complete tasks independently by following the instructions outlined by the algorithm. The simple version of an algorithm is like a cookbook recipe: the algorithm is simply following a set of commands dictated by the programmer for how to transform the ingredients (data) into a meal (an employer objective). But recent advancements in artificial intelligence research have resulted in much more complex algorithms. The more advanced versions of these algorithms accomplish tasks and make decisions by mimicking human capacities to reason, learn, and recognize visual objects, text, and speech. The key point to understand is that algorithms transform input data into technological outputs, which can take the form of everything from promotion recommendations and instructions for delivery drivers, to chatbots and semi-autonomous service robots that complete job tasks.[16]
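
To make the recipe analogy concrete, here is a minimal sketch of a simple rule-based algorithm of the kind described above: hard-coded commands that transform worker data into an employment decision. The field names and thresholds are hypothetical, invented purely for illustration.

```python
# A hypothetical "recipe" algorithm: fixed rules, written by a programmer,
# that turn input data about a worker into an output the employer acts on.
def recommend_for_promotion(worker):
    """Apply hard-coded promotion rules to one worker's data."""
    return (
        worker["tenure_years"] >= 2
        and worker["performance_score"] >= 4.0
        and worker["disciplinary_actions"] == 0
    )

workers = [
    {"name": "A", "tenure_years": 3, "performance_score": 4.2, "disciplinary_actions": 0},
    {"name": "B", "tenure_years": 1, "performance_score": 4.8, "disciplinary_actions": 0},
]
for w in workers:
    print(w["name"], recommend_for_promotion(w))  # A True, B False
```

A machine-learning system differs in that the rules themselves are inferred from training data rather than written out by hand, which is part of what makes its outputs harder to inspect and explain.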

Workplace applications: Employers use data collection, electronic monitoring, and algorithms for a wide range of functions and purposes in the workplace, including:

  • Human resource analytics, such as hiring, performance evaluation, and on-the-job training. Hiring software is an especially important example, because employers are increasingly using it to partially or even wholly automate the recruitment, screening, and evaluation of job candidates—with substantial risk of bias and discrimination.[17]
  • Algorithmic management, such as workforce scheduling, coordination, and direction of worker activities. Productivity management systems are an especially important example, where employers use electronic monitoring and algorithms to closely track workers’ productivity, set quotas, and make consequential decisions such as discipline or firing based on performance metrics.[18]
  • Task automation, where some or even all tasks making up a job are automated using data-driven technologies. Examples are computer analysis of security surveillance footage, semi-autonomous service robots, and self-driving cars. One of the most common scenarios is partial task automation, where employers use technology to augment (but not replace) workers’ jobs, such as in the use of customer service chatbots in the retail industry.[19]

It is important to understand that data-driven technologies are, in the end, creatures of their creators and users. Humans make decisions about the objectives, design, and implementation of these systems.[20] In the workplace, employers decide if, when, and how to use electronic monitoring; which performance metrics to use; which management decisions or functions to automate; and whether to continue using productivity systems that are potentially harmful to workers’ bodies.[21]

 

Industry examples of workplace applications

In what follows, we give concrete examples of how data-driven technologies are being used in a wide range of workplaces. This is not a comprehensive inventory. Our goal is to illustrate the diverse ways that employers are using data collection, electronic monitoring, and algorithmic management, with a focus on industries that often pay low wages and depend on the labor of workers of color, women, and immigrants.

Call centers

While call center employers have monitored workers for decades, basic audio recordings of calls are increasingly being replaced by much more advanced monitoring and performance management systems.

Remote monitoring: Remote working in the pandemic has both highlighted the use of existing technologies to monitor workers and accelerated the adoption of new technologies. For example, Teleperformance, a company that provides remote call center services, uses webcams with a computer vision system that monitors workers at their computers and attempts to detect whether they are following company policies. If the system detects a work rule violation (such as non-work use of a mobile phone), it can send real-time notifications to a manager who can intervene and address the issue with the worker immediately. Multiple studies have documented the negative stress-related health effects of this intense level of electronic monitoring.[22]

Worker guidance and performance scoring: One technology vendor, Cogito, designs technology systems intended to help call centers improve customer service and efficiency. Its system monitors, records, and analyzes conversations and other interactions between call center employees and customers. Based on an analysis of customer sentiment and call center worker behavior, the system provides real-time behavioral guidance to workers on a computer dashboard, coaching them to express more empathy, pace the call more efficiently, or exude more confidence and professionalism. Supervisors also have access to a dashboard that notifies them of problematic situations and provides a “customer experience score” based on the worker’s performance metrics such as call efficiency, sales conversions, and customer churn.[23]
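
As a rough illustration of how such real-time guidance might work, consider a single metric, speaking pace, computed over a short rolling window. Everything below (the thresholds, the messages, the function itself) is our invention for illustration, not Cogito’s actual method, which analyzes many more signals with proprietary models.

```python
# Toy real-time nudge: if the agent's words-per-minute over the last window
# falls outside a target band, show a coaching message on the dashboard.
# Thresholds and messages are hypothetical.
def pace_nudge(words_in_window, window_seconds):
    wpm = words_in_window / (window_seconds / 60)  # words per minute
    if wpm > 180:
        return "Slow down"
    if wpm < 110:
        return "Pick up the pace"
    return None  # pace is within the target band; no nudge

print(pace_nudge(48, 15))  # 192 wpm -> "Slow down"
```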

Warehouses and distribution centers

Warehouses and distribution centers have been early adopters of electronic monitoring and algorithmic management tools to manage inventory and staff.

Productivity monitoring: The warehouse industry is at the forefront of adopting automated labor management systems designed to increase worker speed and decrease error rates. These systems often rely on a granular level of electronic monitoring to set productivity quotas. Data collected from handheld or wearable product barcode scanners enable firms to track workers’ scan rates, errors, and lag time between scans (which can result in workers being penalized for too much time “off task”). These systems can also send performance notifications to workers nudging them to increase their pace or accuracy. In some systems, productivity scores can be shown in real time on video-game “leaderboards,” pitting workers against each other. Managers can monitor workers and receive reports on their productivity metrics. The systems can even send automated notices to human resources to fire workers for repeated low productivity scores.[24]
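
The arithmetic behind such systems can be starkly simple. The sketch below, using fabricated data and thresholds, reduces one worker’s scan timestamps to a units-per-hour rate and a longest time-off-task gap of the sort described above; real vendor systems are more elaborate but rest on the same logic.

```python
from datetime import datetime, timedelta

# One worker's barcode-scan timestamps over part of a shift (fabricated).
scans = [
    datetime(2021, 5, 3, 9, 0),
    datetime(2021, 5, 3, 9, 1),
    datetime(2021, 5, 3, 9, 14),  # a 13-minute gap between scans
    datetime(2021, 5, 3, 9, 15),
]

elapsed_hours = (scans[-1] - scans[0]).total_seconds() / 3600
scan_rate = (len(scans) - 1) / elapsed_hours                # units per hour
longest_gap = max(b - a for a, b in zip(scans, scans[1:]))  # "time off task"

QUOTA = 100                             # hypothetical units-per-hour quota
OFF_TASK_LIMIT = timedelta(minutes=10)  # hypothetical off-task threshold

if scan_rate < QUOTA or longest_gap > OFF_TASK_LIMIT:
    print(f"Flag worker: {scan_rate:.0f} units/hr, longest gap {longest_gap}")
```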

Task direction systems: Another type of warehouse technology focuses on directing worker tasks, especially picking products to fulfill orders. Two examples of these systems are voice-directed systems and autonomous mobile robot picking carts (also known as “lead-me” carts). Both systems use algorithms that perform a variety of tasks, from analyzing warehouse workflow to assigning tasks and optimal picking routes to individual workers. “Lead-me” carts direct workers from one warehouse location to another, setting the pace of work and instructing the worker on what product and quantity of items to pick at each stop. Voice-directed systems provide workers with verbal step-by-step instructions on how to navigate the warehouse and which items to pick. Workers wear headsets with microphones and carry mobile devices equipped with speech recognition systems that enable workers to receive directions and verbally confirm task completion to the system. Both systems enable a granular level of monitoring of worker activities and provide managers with extensive data analytics on worker performance.[25]

Home care

As the U.S. population ages and demand for home care services for elderly and disabled people continues to grow, new technologies designed to monitor and manage home care workers are proliferating.

Electronic visit verification systems: In an effort to prevent fraud, the federal 21st Century Cures Act of 2016 included a provision requiring states to implement a system of electronic visit verification (EVV) for home care services reimbursed under Medicaid to ensure that services are actually rendered to those who qualify for home care assistance. The Cures Act requires that EVV systems provide a means to verify the date, time, location, and type of service provided, as well as the individuals providing and receiving the service. However, EVV implementation varies widely across states and in its degree of invasiveness for workers. In California, the home care worker is only required to enter relevant visit data into an online portal. Other states issue handheld devices, which the worker uses to clock in and out and record service data during the home care visit. Some states require workers to install an app on their smartphones that tracks their location in real time. In the most invasive version of EVV, states may also opt to include biometric recognition systems, such as facial recognition or fingerprints, to verify the identity of the home care worker or recipient.[26]
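
For reference, the Cures Act’s verification requirements amount to six data points per visit. The sketch below renders them as a simple record; the field names are ours, not any state’s. Note that nothing in the statute’s list requires continuous GPS tracking or biometrics; as the text describes, those are implementation choices that some states layer on top.

```python
from dataclasses import dataclass
from datetime import date, time

@dataclass
class EVVRecord:
    service_type: str   # type of service performed
    provider_id: str    # individual providing the service
    recipient_id: str   # individual receiving the service
    service_date: date  # date of the service
    time_in: time       # time the service begins
    time_out: time      # time the service ends
    location: str       # location of service delivery

visit = EVVRecord("personal care", "P-102", "R-881",
                  date(2021, 5, 3), time(9, 0), time(11, 30),
                  "recipient home")
print(visit)
```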

Home care apps: Two types of home care platforms—or apps—are increasingly being used in the industry: (1) on-demand platforms that manage the labor and payment transaction between a care provider and customer, and (2) marketplace platforms that provide a listing of available workers to individual households who then employ workers directly. On marketplace platforms, such as Care.com, clients can view worker profiles to find and select service providers. Worker profiles display performance metrics based on data compiled by the platform, such as customer request response times and customer ratings (which have the potential to perpetuate discrimination against people of color and immigrants in hiring and wage offers).[27] These ratings have a significant impact on which workers will be featured in customers’ searches, and therefore on their likelihood of finding work.

Retail and grocery

In addition to using technology to have customers do their own check-out, the retail industry is also at the forefront of using technology to collect data, monitor, and manage workers.[28] Key examples include:

Background checks and social media monitoring: Large retailers often deploy hiring technologies to help process large volumes of job applications. One company, HireRight, offers services tailored to the retail industry. In addition to standard checks for criminal records, immigration status, and other background screenings, HireRight maintains a retail theft database of employer reports of employee shoplifting, theft, or fraud—including alleged thefts that never resulted in legal action. Despite multiple legal challenges, retail theft databases remain legal. Criminal background checks, meanwhile, are often plagued with errors. Moreover, given the well-documented racial bias in the criminal justice system, even accurate background checks can perpetuate racial discrimination and labor market exclusion. HireRight also recently developed a partnership with a technology vendor specializing in data mining job candidates’ personal social media accounts, to predict the risk that job candidates may be whistleblowers. The same strategies have also been used to identify worker organizing activities.[29]

Workforce scheduling systems: Over the past decade, many retailers have adopted scheduling optimization systems. These systems draw on a variety of data to predict customer demand, make decisions about the most efficient workforce schedule, and generate schedules that can adjust in real time as new data becomes available. Some systems, such as Percolata, use computer vision and algorithms to monitor and measure in-store customer traffic and worker activities. The Percolata system then estimates sales productivity scores for each worker and creates schedules based on those scores. Scheduling optimization systems can be programmed to incorporate worker preferences or to prevent back-to-back (“clopening”) or erratic schedules. However, these capabilities are often not fully enabled by managers and programmers, which can result in highly variable, unpredictable, and discordant schedules for workers.[30]
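
As one example of the kind of guardrail these systems can support but that managers often leave disabled, the sketch below checks a schedule for clopenings. The 11-hour rest floor is a hypothetical parameter, not drawn from any particular system or law.

```python
from datetime import datetime, timedelta

MIN_REST = timedelta(hours=11)  # hypothetical minimum rest between shifts

def has_clopening(shifts):
    """shifts: list of (start, end) datetimes sorted by start time."""
    return any(
        nxt_start - end < MIN_REST
        for (_, end), (nxt_start, _) in zip(shifts, shifts[1:])
    )

week = [
    (datetime(2021, 5, 3, 14, 0), datetime(2021, 5, 3, 23, 0)),  # Mon close
    (datetime(2021, 5, 4, 6, 0), datetime(2021, 5, 4, 14, 0)),   # Tue open
]
print(has_clopening(week))  # True: only 7 hours of rest between shifts
```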

Grocery delivery apps: One of the most substantial technological changes in the grocery industry over the past few years has been the introduction of order fulfillment and food delivery platforms. One of the largest, Instacart, allows customers to monitor and communicate with workers as they shop for and scan each item on the customers’ grocery list, receive notifications of estimated delivery times, and rate workers’ performance. The platform also tracks and generates metrics on the workers’ accuracy, speed in fulfilling orders, degree to which they follow scripted language in chat conversations with customers, as well as their customer ratings. Workers receive regular notifications regarding their performance and are penalized for not meeting speed and quality metrics, which can result in firing or removal from the platform.[31] Another grocery platform app, Shipt, translates performance metrics into an “effort-based” pay algorithm that obscures how pay is calculated and has been shown to distribute pay inequitably among workers.[32] It is important to note that grocery stores are themselves also adopting monitoring technologies (e.g., barcode scanners, computer vision systems, etc.) to evaluate and score the performance of their in-house workers.

Janitorial and security services

The building services industry is increasingly adopting workforce management systems that rely on cloud-based platforms and mobile apps to manage and track workers such as janitors and security guards.

Janitorial services: Many janitorial companies have turned to platform-based systems to manage their workers. These systems serve a wide range of functions, from letting workers view pay stubs and check work hours to handling time-off requests and training modules. Some systems enable workers to clock in and out for shifts, submit maintenance reports, and exchange notifications with supervisors. More advanced systems rely on algorithms to optimize cleaning routes and assign job tasks to workers, and then require workers to scan QR codes to verify they’ve completed a task. Others may include GPS to track workers’ presence on a job site, detect rule violations (e.g., late check-ins), and send alerts to managers. GPS-based monitoring systems can easily extend employers’ ability to monitor workers well beyond the workplace and work activities.[33]

Building security: Building security companies are deploying platform-based management systems similar to those used in the janitorial industry. Many of the functions are the same (e.g., human resources features, job task verification and monitoring). However, some security guard management systems also allow workers to report incidents by uploading time-stamped photos (with geolocation) or notes from their phone. More advanced systems rely on complex algorithms to analyze data collected through CCTV video cameras and building sensors and automate decisions about when to deploy frontline security guards. Some of these systems are designed to classify objects in the video stream (such as firearms) while other systems use facial recognition systems to identify potential shoplifters. This raises questions of responsibility and accountability, given that these systems are not error-proof: will workers be blamed when the systems make a mistake?[34]

Transportation

Employers in the transportation industry use a wide range of technologies to monitor, manage, and direct workers who drive passengers or deliver goods.

Driver monitoring: Truck and delivery fleet drivers are subject to extensive electronic monitoring. For example, sensors in trucks track everything from location, braking and acceleration patterns to lane changes, speed, and seatbelt use. Additionally, dash cams and audio recording technologies monitor and collect data on driver activities in the truck cab. Increasingly, these data streams are further analyzed using computer vision systems along with facial analysis and object recognition techniques to identify driver fatigue or driver distractions, such as texting or eating while driving. These systems enable fleet managers to exert control over workers by setting quantified metrics to evaluate driver performance and challenge workers’ accounts of driving conditions.[35]
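
A single telematics signal of the kind described can be computed very simply. The sketch below flags harsh-braking events from speed samples taken one second apart; the deceleration threshold is hypothetical, and production systems fuse accelerometer, camera, and GPS streams rather than speed alone.

```python
def harsh_braking_events(speeds_mph, threshold_mph_per_s=7.0):
    """Count sample-to-sample decelerations exceeding the threshold."""
    return sum(
        1 for prev, cur in zip(speeds_mph, speeds_mph[1:])
        if prev - cur > threshold_mph_per_s
    )

# Five one-second speed samples (fabricated): two harsh-braking events.
print(harsh_braking_events([55, 54, 45, 30, 29]))  # 2
```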

Transportation platforms: Transportation platform companies such as Uber and Lyft offer on-demand services to customers by dispatching drivers (typically misclassified as independent contractors) to pick-up and drop-off locations and coordinating communications through their apps. Not only do the platforms handle payment between the parties; they also set the price of the service and receive a percentage of the transaction. The platform app enables the companies to manage workers from afar, directing their activities, sending them notifications, and monitoring and collecting data on their behaviors. Moreover, the companies use incentives and penalties to shape worker decisions (e.g., when and for how long to drive). Drivers can be penalized for canceling or declining dispatches or for poor customer ratings, which in some cases can result in deactivation from the platform.[36]

Restaurants

Although the restaurant industry has experimented with robots and other types of automation, customers still largely prefer human servers. Therefore, restaurants have turned to technologies that cater to customers and monitor staff performance.

Self-service ordering: Restaurants are increasingly installing tabletop tablets that allow customers to browse menus, self-order food, and pay at the table. Some systems connect the tabletop ordering system with wearables, such as watches, that enable staff and managers to receive real-time notifications of customer requests or complaints. At the end of the meal, these systems can also prompt customers to fill out a satisfaction survey to rate their experience and their server. Some systems translate customer ratings into a score that restaurants can use to evaluate servers, effectively shifting managerial evaluation to the customer. Research has shown that relying on customer ratings for worker performance metrics can facilitate harassment and perpetuate discrimination.[37]

Performance monitoring: Another emerging technology in the restaurant industry is the use of electronic monitoring to analyze workers’ job performance. For example, the company Presto has developed a computer vision system that analyzes video data streams to automatically classify objects and human activities (and therefore flag, for example, long wait times for food or untidy waiting areas). The system uses this analysis to generate scores of likely customer experience. Based on these scores, the system can send real-time notifications to staff so that they can address issues immediately, as well as individual performance reports to managers. The company offers a similar product to monitor fast-food workers as they process orders for drive-thru customers; this system purports to identify worker errors and evaluate job performance. Not surprisingly, computer vision systems that classify human activities can easily produce inaccurate, unfair, or biased analyses, which, when coupled with algorithmic assessments of worker performance, can negatively impact workers.[38]

Hotels

The hotel industry has increasingly adopted a suite of technologies to monitor and manage front-line workers, especially housekeepers.

Worker safety: The hotel industry has begun to introduce “panic buttons” to protect hotel workers from sexual assault and harassment (largely as a result of legislation supported by unions and requirements of collective bargaining agreements). Panic buttons are devices that housekeepers and other isolated workers carry with them while working; when activated, they notify security or emergency personnel of the worker’s precise location. The buttons rely on technologies such as Wi-Fi and GPS and can vary from simple devices that transmit a signal only when activated to more complex systems that enable continuous real-time location tracking. These features can be used by employers for purposes other than worker safety, such as collecting data on workers’ location that can be used to evaluate job performance. When these systems are not strictly regulated, they potentially expose workers to data privacy and security risks.[39]

Service optimization: Hotels are increasingly adopting service optimization systems that automate task prioritization and delegation. These systems are designed to achieve a specified management objective, such as room cleaning order or personalized VIP service. When guests check out of their room or request services, the system automatically delegates the task to a worker based on criteria such as their proximity or workload. Through a smartphone or tablet, workers receive notifications and an ordered task list, which can change in real time throughout the day. Managers can also access the system to communicate with workers, manually delegate tasks, and monitor workers’ activities. These systems can lead to incoherent task prioritization, unrealistic productivity expectations, and work intensification for jobs that are already physically demanding and prone to injuries.[40]

Health care

The COVID pandemic has prompted a profound revolution in health care with the expanded use of telehealth, but other technologies impacting workers have been introduced as well.[41]

Hand-hygiene monitoring: Hospitals are increasingly using automated hand-hygiene compliance monitoring (HHCM) systems to monitor workers’ handwashing behaviors. The most advanced systems use sensors and wearables (e.g., badges) to link soap or sanitizer use with workers’ entry into or exit from rooms, or their proximity to patients. Some systems alert workers in real time (via color-coded lights, wristband vibrations, etc.) if the system detects non-compliance with handwashing protocols (e.g., did not wash hands long enough). Alternatively, some systems provide positive feedback to workers on their compliance. HHCM systems can allow managers to view data in real time and generate handwashing performance reports at the department, team, or individual level. But studies have questioned the accuracy of some of these systems and raised concerns about their validity in measuring compliance, which could result in unfairly disciplining workers.[42]

Service robots: Health care industry adoption of semi-autonomous robots is on the rise and appears to be accelerating with the COVID pandemic. For example, workers use delivery robots, or “smart carts,” to transport materials (e.g., linens, meals, lab specimens) to other workers. Floor cleaning robots vacuum or scrub floors along a preset route programmed by workers, who also monitor and support their operation. Semi-autonomous robots rely on a variety of technologies—such as Wi-Fi, cameras, lasers, infrared and ultrasonic sensors, and GPS—to navigate hospital corridors and avoid human and nonhuman obstacles. Unlike algorithmic systems that monitor and make decisions about workers, service robots rely on algorithms to navigate their physical environment and work alongside workers. Importantly, hospital staff must be trained on how to work around robots and support their functioning in the complex hospital environment. This raises questions of responsibility and accountability, given that workers often take the blame for automation failures.[43]

Construction

The construction industry has incorporated technologies that can monitor workers’ locations on job sites and scan for safety hazards to prevent injuries on the job.

Location monitoring: The construction industry is increasingly adopting workforce management systems that rely on geofencing and geolocation technologies. Geofencing software works by setting a virtual boundary around an area using GPS coordinates and detects when a mobile device crosses that boundary. These systems operate through apps installed on workers’ mobile phones that can tap into the phone’s GPS function and automatically clock workers in and out as they enter and exit the job site. Construction companies can also use these systems to track travel times between job sites or location histories of where workers traveled throughout the day. Managers can access a dashboard with real-time tracking data and receive alerts, such as workers clocking in outside of a designated job site.[44]
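
The core geofencing logic is straightforward: compare each GPS fix against a circle around the job site and fire a clock event when the boundary is crossed. The sketch below, with made-up coordinates and radius, shows the idea.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in meters."""
    r = 6_371_000  # Earth's mean radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

SITE = (37.8044, -122.2712)  # hypothetical job-site center
RADIUS_M = 150               # hypothetical geofence radius

was_inside = False
for fix in [(37.8100, -122.2800), (37.8045, -122.2713)]:  # simulated GPS fixes
    now_inside = haversine_m(*fix, *SITE) <= RADIUS_M
    if now_inside and not was_inside:
        print("clock in at", fix)   # worker entered the geofence
    elif was_inside and not now_inside:
        print("clock out at", fix)  # worker left the geofence
    was_inside = now_inside
```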

Safety monitoring: Safety monitoring systems are gaining momentum in the construction industry. Construction firms are increasingly using computer vision and complex algorithms to analyze video footage and classify whether workers are compliant with safety protocols (e.g., wearing proper personal protective equipment). Some companies have adapted these systems to detect workers’ compliance with COVID-19 protocols, like social distancing or mask wearing, and then send real-time alerts to workers and managers. Other companies have designed systems that focus on preventing accidents, for instance, by tracking workers as they walk through job sites and predicting in real time whether their trajectory places them at risk of being hit by heavy machinery. If the system determines a likely accident, it will alert the worker through vibrations on a wristband and disable the equipment to avoid possible injury.[45]

Public sector

In an effort to streamline and improve access to governmental services, and to manage an uncertain budgetary environment, the public sector is adopting new technologies that have important implications for its workers and their jobs—including teachers, social workers, and customer service agents.

Automated benefit application support: Government agencies handle a high volume of customer contacts, most commonly for benefit program applications and for inquiries (e.g., benefit eligibility). In the past, workers reviewed paper applications or computer forms. However, agencies are increasingly adopting a variety of technologies to keep up with the growing volume of benefit applications and inquiries. For example, some agencies have turned to chatbots or virtual assistants that use natural language processing technology (similar to that found in Apple’s Siri or Amazon’s Alexa) to answer simple questions or help people navigate applications. Other systems automatically process and review digital benefit program applications entered through phone apps or websites. Due to the large volume of work, these systems have typically not reduced jobs, but instead have resulted in workers handling more complex calls the system is unable to navigate. This shift can lead to work intensification and burnout, particularly if training is inadequate and workload measures do not reflect the changing level of complexity.[46]

Automated decision-making tools: Some government agencies have adopted or piloted technologies that automate decision-making for social services. Agencies are adopting these systems for two reasons: (1) to address concerns about bias or inefficiency in human decision-making, and (2) to help prioritize large caseloads when there is limited staffing. For example, agencies have adopted decision-making algorithms to identify priorities for investigations, such as responses to child protective services reports or domestic violence calls. While these technologies may replace some of the decisions previously made by humans, they can also free up social workers’ time, allowing them to focus on directly working with families. Many of these systems have received attention from scholars and advocates concerned about algorithmic harms against the public, especially in low-income communities and communities of color. However, a growing body of research also points to potential risks that these systems can pose for workers, such as loss of discretion in decision-making and being held responsible for negative outcomes for clients.[47]

A note on COVID

In some industries, the coronavirus pandemic has accelerated the adoption of data-driven technologies. An obvious example is electronic monitoring of social distancing behaviors to prevent the spread of the virus. Relatedly, some companies added new features to existing worker management software, such as time-clock apps with “touchless” facial recognition features. Another example comes from the sudden and significant shift to remote work, which prompted increased use of webcams and other tracking software to monitor workers’ productivity more closely while working from home. Many restaurants and retailers have added delivery or curbside pick-up options, using third-party online ordering and delivery apps. And when shelter-in-place orders relaxed and hiring started again, many employers turned to virtual recruiting technologies, such as video interviews and algorithmic systems, to parse through applications and rank job applicants.[48] It is too early to assess how much of this technology adoption will become permanent, but the pandemic clearly introduced many employers to the power of data and algorithms.[49]

 

Potential harms for workers

Currently, much of the policy discussion about data rights is focused on privacy concerns, in part because the discussion has centered on consumers. But with the advent of powerful technologies such as facial recognition, and of flawed systems built on faulty data and pseudoscience, there is growing understanding that the potential harms of new technologies extend far beyond privacy. This is very much the case for workers, given the diverse set of hiring, management, and monitoring tools based on data and algorithms reviewed above.

We are only beginning to understand the full range of possible negative impacts on workers. Note that these harms are not inevitable; data-driven technologies can also be used to help workers, make them safer, reduce monotony, and improve their work lives. But first and foremost, the goal of public policy should be to prevent harms to workers, which include but are not limited to the following:

Discrimination

So far, the best-documented harm for workers from data-driven technologies is discrimination based on race, gender, age, disability, and other categories, especially in hiring software.[50] The classic scenario is a hiring algorithm that is trained to look for job candidate characteristics that match a company’s current workforce, inevitably replicating the demographics—often white and male—of that workforce. But importantly, women and workers of color may also be disproportionately subject to harms from data-driven technologies because of the occupations where they work, especially low-wage jobs like warehousing and call centers where experimentation with invasive monitoring or algorithmic management is more likely.
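
To see how this replication can happen without any protected attribute ever appearing in the data, consider the toy sketch below: a “model” that scores candidates by how closely a resume feature matches the incumbent workforce will favor whoever resembles that workforce, with innocuous-looking features acting as demographic proxies. All data here is fabricated.

```python
# Hobby listed on incumbent employees' resumes (fabricated training data).
incumbents = ["lacrosse", "lacrosse", "golf", "chess"]

def similarity_score(candidate_hobby):
    """Fraction of current employees who share the candidate's hobby."""
    return incumbents.count(candidate_hobby) / len(incumbents)

for hobby in ["lacrosse", "step team"]:
    print(hobby, similarity_score(hobby))
# lacrosse 0.5, step team 0.0: candidates who resemble the existing
# workforce rank higher, reproducing its demographics via proxies.
```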

Work intensification and health and safety harms

One of the key applications of data-driven technologies is to monitor and manage worker productivity, which is not harmful in and of itself. But when an employer uses technology to minutely track and relentlessly push workers to achieve greater productivity, the negative effects can quickly make themselves felt. Work intensification can have direct impacts on workers’ physical health and safety, as evidenced in the high injury rates that have been documented in Amazon’s warehouses.[51] Moreover, electronic monitoring to closely track workers’ every move can significantly affect their stress levels and mental health.[52] Extensive research has also linked job-related stress to ulcers, cardiovascular disorders, and other negative physical health consequences for workers.[53]

Deskilling and job loss

Data-driven technologies can be used to routinize jobs and break them into discrete simplified tasks, accompanied by measuring and monitoring of performance. While the employer’s main goal may be to increase efficiency, the result for workers can be deskilling of their jobs, narrowing the scope of their work, and increasing repetition.[54] The downstream consequences can be significant, in the form of lower wages, less access to training (since the job has been deskilled), and decreased job mobility. Depending on the industry, task standardization can then in turn also lead to partial or wholesale automation of those jobs, since the data gathered in real time on workers performing each task can then be used to train robots or algorithms to eventually take over. For example, chatbots learn by example as they listen in on call center agent calls, and algorithms to be used in autonomous vehicles learn from hours of monitoring truck drivers.[55]

Lower wages and less economic mobility

Data-driven technologies can affect workers’ wages through multiple routes.[56] Some can be direct—for example, when a job candidate is disqualified by an automated hiring system using criteria that are not obviously related to job performance and/or that tend to disfavor workers from marginalized groups. Wage theft is another direct example, as when timekeeping software automatically deducts breaks (even if workers aren’t able to take them), or when intense productivity quotas discourage workers from taking the paid rest breaks they are legally entitled to.[57] Other times the effects on wages can be more indirect. For example, when a job is deskilled and routinized by advanced technologies, it is effectively turned into a dead-end job. In a similar vein, an algorithmic management system may make recommendations to an employer about job assignments or promotions in ways that hurt the long-term career mobility of a worker. Data-driven technologies can also indirectly serve as a gatekeeper to the labor market, if qualified workers have limited tech literacy or lack access to broadband internet.[58]

Contingent work

As new technologies enable remote monitoring and management of workers, the incentive for employers to outsource previously in-house jobs to subcontractors, staffing agencies, or platform-based work is high—and with it, the increased likelihood of misclassifying workers as independent contractors. A key reason that employers outsource is to avoid bearing the full costs of employing workers directly, such as having to pay the minimum wage, carry workers’ compensation, and provide health insurance and retirement benefits. Meanwhile, workers who depend on platform-based income are excluded from workplace protections and bear the brunt of job insecurity.[59]

Suppression of the right to organize

There are growing reports that employers are using surveillance technologies to identify workers who are trying to organize a union, as well as predictive algorithms (that data-mine social media) to identify workers who might try to organize one.[60] Likewise, companies that design hiring systems can incorporate methods to screen out workers who are likely to be sympathetic to unions.[61] Such attempts to identify organizing activity are in and of themselves an intrusion on the right to organize, but especially so when employers then take steps to stop the organizing or forestall it by firing or otherwise intimidating workers.[62]

Loss of privacy

Workers have significant privacy concerns in their workplaces. Electronic monitoring, for example, can easily stray outside of the workplace, via systems that scan social media activity or apps downloaded on workers’ phones that access GPS location data regardless of whether they are on the job.[63] The risk is that this type of intrusive surveillance uncovers information about workers (e.g., their religion or sexuality) that is intensely private and not at all relevant to work performance. It may reveal a worker’s disability or other sensitive or legally protected information. Such intrusions into workers’ personal lives are especially likely for the growing number of people who are working remotely from their homes, given the broad data capture that is enabled by time clocking software or wearables that collect and use biometric data.[64]

Loss of autonomy and dignity

Finally, workers stand to lose their autonomy and dignity when data-driven technologies are used to micromanage and monitor every activity and remove all room for discretion on the job. While not as immediate or concrete as some of the harms discussed above, the danger of dehumanization at work in the era of artificial intelligence is very real, and already being reported by workers.[65] A visceral example is the potential public humiliation from having one’s productivity score compared to that of other workers on leaderboards.[66] But ultimately this is about lost opportunities. Workers want and deserve to have agency in troubleshooting and innovating best practices and learning new skills; the quashing of that very human desire is part of what’s at stake in the debate about new technology.

 

Part II. A Framework for Worker Technology Rights

The emerging suite of data-driven technologies in the workplace raises critical questions. Will these technologies be used to benefit and empower workers, help them thrive in their jobs, and bring greater equity to the workplace? Or will they be used to deskill workers, extract ever more labor, increase race and gender inequality, and suppress the right to organize? Who is going to be at the table when these decisions are made, and in particular what role will workers themselves have? In other words, who is going to govern technology? And what values will we as a society choose to prioritize in that governance?

 

The regulatory vacuum

The cornerstone of governing workplace technologies will be laws and regulations (and collective bargaining agreements in unionized workplaces). But currently, employers are introducing untested data-driven technologies with almost no regulation or oversight. Workers largely do not have the right to know what data is being gathered on them or whether it’s being sold or shared with others. They don’t have the right to review or correct the data. Employers aren’t required to notify workers about electronic monitoring or algorithms that they’re basing decisions on, and workers don’t have the right to challenge those decisions. And currently, there are virtually no meaningful guardrails on which technologies employers can use and how they use them.

The United States lags significantly behind the European Union in regulating data-driven technologies. For example, the EU has already passed a wide-ranging data privacy law and is in the process of drafting a comprehensive artificial intelligence law.[67] In the U.S., only a few scattered data privacy laws have been passed at the state level, all focused on consumers. And while recently we’ve seen a plethora of privacy bills emerge at the federal level, the timeline to actual passage will be long.[68]

Meanwhile, a slew of legal analyses of existing employment and labor laws have concluded that they are wholly inadequate to the task of protecting workers in the data-driven workplace.[69] In some cases, new laws will need to be written from scratch to, for example, establish a general right to worker privacy or establish guardrails on the use of algorithms.[70] Similarly, employers’ electronic monitoring of workers is largely unregulated in federal law. Some states have scattered privacy protections for some workers, but these are typically focused on specific types of data (e.g., biometrics) or simply institute a weak notice and consent model (e.g., when employers monitor worker communications).[71] In other cases, existing laws need substantial updating for the data-driven workplace. This is the case for anti-discrimination laws if they are to meet the challenge of addressing discriminatory harms stemming from algorithmic hiring and promotion tools.[72] Similarly, our health and safety laws do not have sufficient standards to protect workers from the psychological stress, repetitive motion, and fatigue-related injuries that can result from productivity monitoring systems.[73]

 

Towards a policy framework

In short, we need a new set of 21st century labor standards establishing worker rights and employer responsibilities for the data-driven workplace. For the majority of workers who are not members of unions, the profound asymmetry of power in the U.S. workplace means they have little to no say over the policies and decisions that affect them in their day-to-day work lives.[74] In particular, notions of consent to new technologies or the ability to find better conditions elsewhere are not meaningful or available to low-wage workers, women, and workers of color, who face a labor market that is often dominated by employers competing on the basis of cutting labor costs.[75] Employment and labor laws have long attempted to balance this asymmetry of power by instituting baseline labor standards and giving workers a mechanism for voice; those laws need to be strengthened and updated for the 21st century workplace and its technologies.

In what follows, we outline a set of policy principles that can help build a robust regulatory regime. The principles lay out a vision for labor standards that give workers rights with respect to their data; hold employers responsible for harms caused by their systems; regulate the ways in which employers monitor workers and use algorithms; ensure the right to organize around technology; guard against discrimination; and establish a strong enforcement regime for worker recourse.

These principles are intended to inform policymakers and worker advocates developing legislation at the federal, state, and local levels. The principles draw on proposals and policy concepts developed by lawyers, academics, and worker advocates in the U.S., Europe, and elsewhere.[76] They include regulations of the technologies themselves as well as rules about when, how, and for what purpose employers use them in the workplace.

Importantly, we argue that new labor standards for digital technologies should first and foremost be embedded in employment and labor laws. Consumer-focused laws are insufficient for fully protecting workers because they are largely focused on privacy—and as described above, workers’ concerns about new technologies extend far beyond privacy to include impacts on wages, health and safety, working conditions, job stability, and race and gender equity.[77]

 

Principles

 

1. Goals and Scope

The rapid pace of innovation in the use of data collection, electronic monitoring, and algorithms affects every stage of the employment lifecycle and requires broad, ambitious standards set in law. Full coverage of both workers and employers should be the governing principle, as should attention to the full range of potential harms for workers. Specifically:

  • New rights and protections should be established to ensure worker dignity and welfare in the use of data-driven technologies in the workplace. These standards should be established in employment and labor laws. They should give workers agency over new technologies, promote health and safety, protect the right to organize, and guard against discrimination and other negative impacts.
  • All workers deserve protection. New rights and protections should cover all workers, including employees, independent contractors, job applicants, and remote workers. Representatives from unions or other worker organizations should be able to access these rights and protections on behalf of workers.
  • All employers should be held to these standards. Employers’ obligations should also apply to their labor subcontractors, as well as to vendors that provide technology or technology services.
  • All employment-related decisions that are made or assisted by data-driven technologies should be regulated. Employers make a wide range of decisions based on digital technologies. These decisions should be regulated whenever they impact workers, including effects on earnings, benefits, hours, and work schedules; race and gender equity; hiring, firing, promotion, discipline, and performance evaluation; job assignments, job content, and productivity requirements; workplace health and safety; and the right to organize.

 

2. Disclosure

Full disclosure and transparency are prerequisites for effective regulation. But currently, the biggest obstacle to regulating data-driven technologies is that their use is largely hidden from both policymakers and workers. Without disclosure, job applicants won’t know why a hiring algorithm rejected their resume; truck drivers won’t know when they are being tracked by GPS; and workers won’t realize their health plan data is being sold. Therefore:

  • Employers should provide notice to workers in a clear and accessible way regarding all data-driven technologies in the workplace. Notices should include an understandable description of the technology, the types of data being collected, and the rights and protections available to workers. Employers should also be required to file notices with the relevant regulatory agencies (i.e., those enforcing wage and hour, health and safety, and anti-discrimination laws).
  • Additional notification should be required when electronic monitoring is being used. This should include a description of which activities will be monitored, the method of monitoring, the data that will be gathered, the times and places where the monitoring will occur, and the purpose for monitoring and why it is necessary. Notice should also document how employment-related decisions could be affected.
  • Additional notification should be required when algorithms are being used that affect workers’ jobs or working conditions. This should include an accessible description of the algorithm, its purpose, the data it draws on, the type of outputs it generates, and how the employer will use those outputs in their decision making.

 

3. Worker Data

Employers can collect or buy vast amounts of data on their workers, and share it or sell it without restriction. It’s not realistic to expect workers to police that data collection themselves. Just like consumers, workers deserve legal standards on employers’ collection and use of their data, as well as more control over their personal information:

  • Employers should only collect worker data when it is necessary and essential for workers to do their jobs. Employers should minimize their collection of worker data, which should be defined broadly to include personal identity information, biometric and health information, any data related to workplace activities (including productivity data and algorithmic inferences), and online information including social media activity.[78] Unlimited collection of their data unnecessarily exposes workers to risk, including data breaches and employers’ misuse of personal information.
  • Workers should have the right to access, correct, and download their data. Workers should receive all relevant information regarding their data, including why and how it was collected, if it was inferred about the worker, and whether it was used to inform an employment-related decision, including hiring. Employers should be responsible for timely correction of any inaccurate data.
  • Worker data should be safeguarded and protected from misuse. In particular, employers should not be allowed to sell or license worker data to third parties under any circumstances; otherwise, the incentives to violate worker privacy by selling worker data for monetary gain are too high. Individual workers’ biometric and other health data should never be shared unless required by law.

4. Use of Electronic Monitoring

Electronic monitoring is a highly invasive technology because it allows for real-time and continuous capture of worker activities and behavior. As a result, the potential for misuse of electronic monitoring by employers is high: for example, in violating workers’ privacy, in using biased or incomplete monitoring evidence to discipline someone, or in pushing the pace of work to the point of injuries. Therefore:

  • Employers should only use electronic monitoring for narrow purposes that do not harm workers. Electronic monitoring should only be used if strictly necessary to enable core business tasks, to protect the safety of workers, or when needed to comply with legal obligations. To minimize potential exposure and harm to workers, monitoring should affect the smallest number of workers possible, should collect the least amount of data necessary, and should be the least invasive means for accomplishing its purpose.[79] Productivity monitoring in particular should be subject to higher scrutiny and reviewed by regulatory agencies overseeing workplace health and safety to ensure it is not used to speed up work to dangerous levels.[80]
  • Employers should respect workers’ privacy in using electronic monitoring. Intrusive surveillance in the workplace, especially by audio and video, can capture information about workers that is private and not relevant to performance. Workers should not be monitored in the breakroom, sensitive areas like the restroom, or off duty. Any GPS or other tracking devices should be disabled when the worker is off the job.
  • Electronic monitoring should not use high-risk technologies such as facial recognition.[81] Some new monitoring technologies are too risky to introduce in the workplace; for example, facial-recognition systems have been documented to have high error rates and racial bias.[82] Employers should be prohibited from incorporating unproven, questionable, or particularly invasive technologies into their electronic monitoring systems.
  • Electronic monitoring should not be used as a substitute for human decision making. Even in the best cases, electronic monitoring systems can only capture a partial picture of a given event or set of actions; in the worst cases, that picture is misleading or wrong. Employers should therefore be prohibited from relying exclusively or even mainly on data from electronic monitoring when making consequential decisions like hiring, firing, discipline, or promotion. Instead, employers should be required to conduct independent, human-driven assessments of workers based on other information sources.
  • Workers should be given full documentation when an employer makes a consequential decision informed by electronic monitoring. Workers should also be able to challenge that decision.

 

5. Use of Algorithms

The explosion in algorithmic management tools creates significant risk for workers; many of these technologies are opaque, untested, and being used by employers with little attention to or understanding of their potential harms for workers. The stakes for workers are simply too high when decisions like hiring and firing are being made about their lives. Therefore:

  • Employers should not use algorithms that harm workers’ health, safety, and wellbeing. Employers should be responsible for ensuring that any employment-related decisions assisted by an algorithm are fair, reasonable, and do not harm workers, in part by conducting an impact assessment prior to adoption of the algorithm. Productivity algorithms in particular should be subject to higher scrutiny and reviewed by regulatory agencies overseeing workplace health and safety for potential harms.
  • Employers should not use algorithms to make irrelevant or unfair predictions about workers. The marketplace has seen a spate of pernicious “snake oil” algorithms making what turn out to be unsubstantiated predictions about workers.[83] Employers should be prohibited from making predictions or inferences about a worker’s traits and behaviors that are unrelated to their job responsibilities. Similarly, employers should not be able to use algorithms to predict or make judgments about a worker’s emotion, personality, or health.
  • Employers should not use high-risk algorithmic technologies such as facial recognition or expression analysis. Employers should be prohibited from using algorithms that incorporate unproven, questionable, or particularly invasive technologies.
  • Algorithms should not be used as a substitute for human decision making. The growing complexity of algorithmic systems means that even their developers may not understand how they arrive at conclusions—let alone the employers deploying these systems.[84] Employers should therefore be prohibited from relying exclusively or even mainly on algorithms when making consequential decisions like hiring, firing, discipline, or promotion. Instead, humans should have a substantial and meaningful role in the decision, drawing on other sources of information. Human decision makers should be trained to understand what a particular algorithm does and the limitations of its output.
  • Workers should be given full documentation when an employer makes a consequential decision assisted by an algorithm. Workers should also be able to challenge that decision.

6. Discrimination

Growing evidence suggests that data-driven technologies carry significant risks of discriminating against workers on the basis of race, gender, age, disability, and other characteristics. The “black box” nature of many of these technologies—and their use for consequential decisions such as hiring and promotion—means that regulatory scrutiny needs to be especially high. The following is adapted from “Civil Rights Principles for Hiring Assessment Technologies,” expanded to the full range of workplace applications:[85]

  • Data-driven technologies should not discriminate against workers based on protected characteristics. Policymakers should make clear that anti-discrimination laws apply to all workplace data-driven technologies. In particular, the use of data-driven technologies with a disparate impact should trigger the same level of scrutiny as any other discriminatory employment practice.
  • Removing protected characteristics from data-driven technologies should not give employers a free pass. The fact that an employer does not use protected characteristics such as race or gender in its algorithm or data system does not mean that the technology cannot have a disparate impact. Employers should still be required to test for disparate impacts and mitigate any harms (a simplified illustration of such a test follows this list).
  • Policymakers should update existing regulations on worker assessment tools. Data-driven technologies in worker assessment tools should only measure traits that have a logical and explainable relationship to the job at hand. They should not use mere correlation to make judgments, inferences, or predictions about a worker’s or job applicant’s ability to perform the job.
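
To make the testing principle above concrete, here is a minimal sketch of one common first-pass screen, the EEOC’s “four-fifths rule” (29 C.F.R. § 1607.4(D)), applied to hypothetical outcomes from a hiring algorithm. The group labels and counts are invented for illustration; passing this heuristic does not establish the absence of disparate impact, and a real audit requires statistical and legal expertise well beyond a check like this.

```python
# Hypothetical example only: screens the outcomes of an algorithmic
# resume screen for adverse impact using the EEOC "four-fifths rule."
# All group names and counts below are invented for illustration.

# {group: (applicants, selected)}
outcomes = {
    "group_a": (200, 60),
    "group_b": (180, 27),
}

# Selection rate = share of each group's applicants who were selected.
rates = {group: selected / applicants
         for group, (applicants, selected) in outcomes.items()}
highest_rate = max(rates.values())

FOUR_FIFTHS = 0.8  # rule-of-thumb threshold from 29 C.F.R. 1607.4(D)

for group, rate in rates.items():
    # Impact ratio compares each group's rate to the most-selected group.
    impact_ratio = rate / highest_rate
    status = ("potential adverse impact" if impact_ratio < FOUR_FIFTHS
              else "no flag")
    print(f"{group}: selection rate {rate:.2f}, "
          f"impact ratio {impact_ratio:.2f} -> {status}")
```

A screen like this is only a detection step; the principles above also require employers to mitigate any disparities it surfaces.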

 

7. Organizing and Bargaining

Across the country, especially in low-wage industries, workers are increasingly voicing their frustration with excessive monitoring and algorithmic management in their workplaces. They should be able to organize around these issues without retaliation, and, when represented by unions, be able to bargain over them. Specifically:

  • Labor organizations should have the right to bargain over employers’ use of data-driven technologies. Federal labor law requires employers to bargain with worker representatives over the terms and conditions of employment. Data collection, electronic monitoring, and algorithmic management all impact the terms and conditions of employment. Unions should have access to the information necessary to fully understand the nature, scope, and effects of data-driven technologies used by the employer, and the employer should be required to bargain in good faith over them.[86]
  • Even when they are not represented by a union, workers should have the right to organize around the use of data-driven technologies in their workplace. When workers protest a company’s collection of their data, question the decisions made about them by algorithms, or seek to learn more about data practices, labor laws should be understood to protect this collective activity.
  • Employers should not use digital technologies to identify, monitor, or punish workers for organizing. Monitoring workers who are engaging in organizing activities has long been held to violate the law for its chilling effects. Employers should not engage in surveillance of workers when they are meeting with union representatives or discussing workplace problems. Efforts to screen workers using electronic monitoring or predictive algorithms for their sympathy with unions should also be recognized as illegal.

 

8. Impact Assessments

The novel and inscrutable nature of many data-driven technologies means that their impacts on workers are not self-evident. But waiting to discover harms after an algorithm or data system has already been implemented is not fair to workers. These technologies should be thoroughly vetted and made safe for the workplace before they are introduced. Specifically:

  • Data-driven technologies should be continuously evaluated and harms mitigated. Employers should be required to audit their technologies by conducting rigorous impact assessments, both prior to implementation and throughout the lifecycle of the technology.[87] They should be required to address any risks that are identified and be held legally liable for any harms caused by their technologies. Employers should also be required to submit impact assessments to the relevant regulatory agencies, which should have the right to halt the use of harmful systems.
  • Impact assessments should evaluate the full range of potential harms to workers. These include discrimination, harms to mental and physical health and safety, loss of privacy, and negative economic impacts.
  • Workers should have a role in impact assessments and have the ability to challenge them. Workers have significant and useful knowledge about a company’s production processes and how technology actually works on the ground. They (and their unions) should be consulted at all stages of an impact assessment and be able to review and give feedback. They should also be able to dispute the final assessment with the relevant regulatory agencies.

 

9. Enforcement

Enforcement is the lifeblood of laws and regulations; without it, the promise of legal rights is hollow. This is especially the case when it comes to the use of data-driven technologies, where the asymmetry of power and information between workers and employers is pronounced and where the incentives for employers to misuse opaque technologies are strong. Specifically:

  • Regulatory agencies should play a strong role in enforcing workplace technology standards. Workers should be able to submit complaints about employer noncompliance to regulatory agencies (i.e., those enforcing wage and hour, health and safety, and anti-discrimination laws). In turn, those agencies should respond to each complaint, apply penalties when warranted, and initiate workplace-wide investigations when needed. Regulatory agencies should also have the authority to proactively audit employers’ use of data-driven technologies. When technologies are found to harm workers, agencies should have the authority to require that employers mitigate the harms or halt the use of systems that can’t be made safe.
  • Regulatory agencies should have the authority to establish additional rules and standards. This allows the agencies to respond to rapid developments in existing and new technologies introduced in the workplace.
  • Workers should have a private right of action to sue employers for any violations of their technology rights and protections. The right for workers to sue their employers is a central pillar of robust enforcement, allowing them to control their own case and complementing agency enforcement efforts. Employers should also be prohibited from retaliating against workers for enforcing their rights.

 

The Path Ahead

In this report, we have argued that the arrival of data-driven technologies in the workplace poses significant risks to workers and requires the creation of a new set of labor standards in employment and labor laws. These new standards must be bold, comprehensive, and continuously updated to respond to the rapidly changing terrain of workplace technologies and the potential harms that workers face from them.

But while worker data rights and protections are critical, they alone will not be enough. For example, workers should receive the training needed to grow with their jobs and participate fully in technological change. Government staff need the skills and adequate resources to provide oversight and enforcement. Public R&D funding should be leveraged and increased to incentivize the development of technology that benefits people and the planet. The public sector itself must become a model for accountable technology adoption.[88] And the U.S. must build out a robust governance regime to regulate the designers, developers, and producers of new workplace technology. Above all, workers and their communities—especially low-income communities, women, immigrants, and communities of color—must be included in the development of that governance regime; their knowledge and experiences will be the keystone to ensuring that innovation truly contributes to the social good.

 

Acknowledgments

This report has been deeply informed by an inspiring community of researchers, lawyers, journalists, and advocates focused on documenting and analyzing the impact of new technologies in the workplace. We are especially grateful to the workers, unions, and other worker organizations who shared their experiences with us, and to the participants of a working group in California that contributed significant expertise to the development of the policy principles presented here. We thank the multiple reviewers of this report for their invaluable feedback. This report was generously supported by grants from The James Irvine Foundation and the Ford Foundation. Finally, we want to give a special thanks to Emlyn Bottomley for his invaluable contributions to the initial stages of this project and to our team. All errors of fact or interpretation remain our own.

This report is licensed to the public under a non-exclusive Creative Commons Attribution 4.0 International (CC BY 4.0) license, https://creativecommons.org/licenses/by/4.0/.

 

Endnotes

[1] Throughout this report we use the terms “digital technologies” and “data-driven technologies” interchangeably when referencing the wide range of technologies that gather and transform data into outputs such as rankings, predictions, decisions, and machine-based actions.

[2] For a description of HireVue’s hiring technology system, see Aspan (2020). HireVue’s initial system used expression analysis technologies to assess job candidates, which the company has since discontinued due to growing concern from technologists, academics, and civil rights advocates about the system’s validity (see Maurer 2021b).

[3] For information about “flight risk” algorithms that attempt to predict whether workers will quit, see Liu (2019). Also, Hao (2020) describes a tech vendor that attempts to predict whether job candidates are likely to quit before they are hired. See Zarya (2016) regarding employer efforts to predict pregnancy and Kessler (2020) regarding employers attempting to predict union organizing.

[4] See Dzieza (2020) and Simonite (2018) regarding call center analysis technologies.

[5] See Bhuiyan (2020) for more information about how grocery platforms, such as Instacart, operate.

[6] See McCallum (2021), Nguyen (2020), and UC Berkeley Labor Center (2020).

[7] See Kantor, Weise, and Ashford (2021).

[8] Greenhouse (2019) discusses the absence of workers in “future of work” discussions. Also, Gupta, Lerner, and McCartin (2018) point out that “future of work” discussions often assume a narrow vision of technological change and a future of work without workers, which leads to policy solutions focused on ensuring basic survival; they argue that the discussions should focus instead on the future of workers and how innovative technologies can contribute to a humane and sustainable future. Townsend (2021) argues that data protection policies need to recognize all aspects of individuals’ lives and extend consumer data rights to workers.

[9] For a discussion of “AI snake oil,” see Narayanan (2019), and see Slaughter (2021) for an overview of how algorithmic systems can cause harm.

[10] For an overview of how algorithmic systems can be used to control workers, see Kellogg, Valentine, and Christin (2020). Also, see Cappelli (2020) for a discussion of the recent shift toward Taylorist management models enabled by extensive worker monitoring and algorithms designed to control workers.

[11] For other excellent research overviews, see Adler-Bell and Miller (2018), Bogen and Rieke (2018), Milner and Traub (2021), Nguyen (2021), Scherer and Brown (2021), UNI Global (n.d.), and Zickuhr (2021).

[12] For a deeper treatment of data collection, electronic monitoring, and use of algorithms in the workplace, see Briône (n.d.) and Kresge (2020). Also, see Bogen and Rieke (2018) and Rieke et al. (2021) for an overview of data and algorithms in the hiring process.

[13] See O’Connor (2016) for a description of Taylorism and scientific management. See Davenport (2018) for a description of the transition from business (workplace) data analytics to artificial intelligence.

[14] For an overview of data collection in the workplace, see Adler-Bell and Miller (2018) and Kresge (2020).

[15] See Ajunwa (2018), Ajunwa, Crawford, and Schultz (2017), Ciocchetti (2011), Mateescu and Nguyen (2019), and Zickuhr (2021).

[16] For a basic description of algorithms and artificial intelligence, see Schatsky, Muraskin, and Gurumurthy (2015). See The Royal Society (2017) for an overview on machine learning algorithms.

[17] For an overview of technologies used in the hiring process, see Bogen and Rieke (2018).

[18] See Ajunwa (2018) for an overview of productivity monitoring systems in the workplace. Kaplan (2015) describes various workplace monitoring systems used to manage workers. Also, see Kellogg, Valentine, and Christin (2020) for a discussion on the use of algorithms to discipline workers. Amazon has received the most attention for their productivity management systems based on algorithms; for example, see Lecher (2019). Another notable example of algorithmic management driven by productivity metrics is Instacart; see Griesbach et al. (2019) and Bhuiyan (2020).

[19] For a comparison of work augmentation and automation strategies using digital technologies, see Miller (2018). Trollope (2018) describes different levels of call center augmentation and automation. Sicular and Aron (2019) describe different scenarios of workplace augmentation.

[20] See Luca, Kleinberg, and Mullainathan (2016) and Frankowski (2019) for a discussion about the role of humans in shaping the objectives and design of algorithmic systems.

[21] See Cappelli (2020) regarding management strategies and choices that can influence how they use technologies in the workplace.

[22] For a discussion of Teleperformance’s webcam monitoring system, see Solon (2021). Teleperformance is a global call center company that provides both outsourced and U.S.-based call support for many major U.S. companies and specializes in “work-at-home” programs (see Teleperformance, 2018). See Doellgast and O’Brady (2020) regarding worker stress resulting from intense call monitoring practices.

[23] See Cogito Corporation (2020), Dzieza (2020), and Simonite (2018) for more information about Cogito’s call performance monitoring and guidance system.

[24] For an overview of labor management systems and productivity monitoring in the warehouse industry, see McCrea (2020) and Dzieza (2020). See Vincent (2019) for more information about gamification in warehouses and Lecher (2019) regarding automated firing.

[25] Gutelius and Theodore (2019) provide a comprehensive overview of technological change in the warehouse industry and Overstreet (2019) provides a brief overview of warehouse robots and order picking technologies. For a detailed overview of a lead-me cart task direction system, see 6 River Systems (https://6river.com/directed-picking/) and for a description of a voice-picking system, see Lucas (https://www.lucasware.com/voice-picking-introduction/).

[26] For more information on EVV, see Cunningham (2019) and Metcalf (2018).

[27] See Ticona, Mateescu, and Rosenblat (2018) for an overview of home care platforms. A growing body of research points to customer ratings as a source of bias, which, when used to inform management decisions or automate decisions about workers on platforms, can serve as a significant source of discrimination. See Rosenblat et al. (2016) and Dzieza (2015) for a discussion of customer ratings.

[28] For overviews of technological change in the retail and grocery industries, see Carré et al. (2020) and Benner et al. (2020).

[29] See Clifford and Silver-Greenberg (2013) for information about retail theft databases. For more information about HireRight and their retail theft database, see HireRight (2018; 2020). Nelson (2019) details criminal background check errors and their effects on job seekers. Berfield (2015) describes how Walmart monitors workers’ social media accounts to identify organizing efforts.

[30] O’Connor (2016) describes the worker performance features of Percolata’s scheduling system. Also, see Tanaka et al. (2016) for a full description of the Percolata scheduling system in their patent application. Percolata has since incorporated surveillance cameras and computer vision into their system to measure in-store shopping traffic; see https://www.percolata.com/. For more information on the effects of scheduling optimization systems, see Kantor (2014) and Gleason and Lambert (2014).

[31] Bhuiyan (2020) details Instacart’s system of performance metrics.

[32] For more on algorithmically facilitated pay inequities at Shipt, see Lyons (2020).

[33] For examples, see Team Software (https://teamsoftware.com/software/lighthouse/cleaning-workforce-management-software/), Janitorial Manager (https://www.janitorialmanager.com/), and allGeo (https://www.allgeo.com/apps/mobile-employee-gps-time-clock).

[34] For an overview of technological changes in the security industry, see Lasky (2019) and Mahmood (2019). See Knight (2021) and Lazzaro (2021) regarding errors in computer vision systems and Lever (2017) regarding privacy concerns with these systems. For examples of some of these systems, see Guardso (https://www.guardso.com/guard-tour-system?tab=reporting) and Slivertrac Software (https://www.silvertracsoftware.com/automated-security-guard-management).

[35] Peterson (2019) describes sensors and driver monitoring and scoring systems. See Clinton (2019) for an overview of recent technological developments in dash cam driver monitoring systems. Many of the same sensor monitoring technologies are also used to enable self-driving trucks; see Viscelli (2018). Also, see Levy (2015) for an analysis of the use of monitoring systems to exert control over truck drivers.

[36] See Ticona, Mateescu, and Rosenblat (2018) for an overview of transportation network platforms, and Rosenblat (2016) and Scheiber (2017) for a description of how these companies use algorithms to manage workers.

[37] For an overview of tabletop tablets, see O’Connor (2019). See O’Donovan (2018) and Rainey (2018) for information about servers’ experiences with these systems.

[38] Matsakis (2019) describes the Presto restaurant monitoring system. See Knight (2021) and Lazzaro (2021) regarding bias and errors in computer vision systems.

[39] See Eidelson (2017) and Jacobs (2018) for information on the history and status of panic button legislation and collective bargaining. See Lindzon (2020) for a discussion on the privacy and security risks posed by panic button systems and the variation in policies designed to protect workers.

[40] For an overview of hotel service optimization systems, see Hotel Tech Report (2021). Escobar (2020) describes how many hotel technology vendors have adapted their systems to increase the ability for employers to monitor and scrutinize workers’ compliance with cleaning protocols in the context of COVID-19. Reyes (2018) describes the effects of these systems on workers.

[41] For an overview of technological change in the healthcare industry, see Litwin (2020).

[42] For an overview of the different types of hand-hygiene monitoring systems and their accuracy in measuring compliance, see Dyer (2015). See Lorenzi (2021) for a review of various systems in the context of COVID-19 and Schencker (2019) for a discussion of workers’ experiences with these systems in hospitals. For examples of systems with real-time color-coded alerts, see Ecolab (https://www.ecolab.com/solutions/hand-hygiene-compliance-monitoring) and BioVigil (https://biovigil.com/).

[43] See Litwin (2020), Simon (2015), and Lerman (2020) for descriptions of service robots in hospitals. See Cresswell and Sheikh (2020) for a discussion about the effectiveness of hospital cleaning robots and Elish (2016) regarding humans taking the blame for automation failures.

[44] See Harris (2017) and Maria and Burger (2016) for an overview of these systems along with some examples.

[45] For examples, see Oliver (2020) and Woyke (2018).

[46] See Condon (2019), State of Ohio (2018), and Chaney (2020) for information about automated benefit application systems.

[47] For more information on automated decision-making systems in the public sector, see Chouldechova et al. (2018), Eubanks (2018), and Hurley (2018).

[48] See Maurer (2021b) and Patton (2021).

[49] For examples of new applications of workplace technology for COVID-19, see Negrón (forthcoming), Nguyen (2020), Rodriguez and Windwehr (2020), and UC Berkeley Labor Center (2020).

[50] See Barocas and Selbst (2016) for a discussion about how data-driven systems can have a disparate impact. Ajunwa (2020), Bogen and Rieke (2018), and Kim (2017) describe bias in hiring algorithms and how they can result in discrimination.

[51] See Jabsky and Obernauer (2019) and Ockenfels-Martinez and Boparai (2021) for documentation of health impacts at Amazon warehouses.

[52] For example, research conducted by Doellgast and O’Brady (2020) found that intensive electronic monitoring in call centers was associated with higher levels of stress among workers.

[53] For evidence on the link between job-related stress and health outcomes, see Nieuwenhuijsen, Bruinvels, and Frings-Dresen (2010).

[54] For example, see Ikeler (2016) and Levy and Barocas (2018) for a discussion of the use of clienteling software in the retail industry to deskill sales jobs.

[55] See TuSimple (2019) and Plus (2021) for a description of how truck drivers train algorithms designed to enable self-driving trucks and Sandu (2019) for a discussion on using call monitoring data collected from call center workers to train chatbots.

[56] For a broader treatment of the effects of technological change on employment and wages, see Acemoglu and Restrepo (2019).

[57] For example, see Tippett, Alexander, and Eigen (2017) regarding how scheduling software can enable wage theft.

[58] See Gonzales (2016) and Townsend (2020).

[59] For an overview of “fissuring” or business models based on outsourcing and contracting, see Weil (2019). See Rogers (2020) for some examples of fissuring and a description of the legal context that encourages fissuring.

[60] Berfield (2015) describes social media monitoring practices used by a union avoidance consultant for Walmart.

[61] See Kessler (2020). Also, see Newman (2017).

[62] For a discussion of the legal implications of electronic monitoring for labor organizing, see Garden (2018).

[63] For example, Madden et al. (2017) describe how data-driven systems can harm low-income communities, including how social media data mining can exclude low-income groups from employment.

[64] See Ajunwa (2018) and De Stefano and Taes (2021).

[65] For example, see Hanley and Hubbard (2020) and Milner and Traub (2021).

[66] See Lopez (2011) for a description of Disneyland’s use of leaderboards to motivate workers and Brodkin (2019) regarding a similar system used by Amazon.

[67] The EU passed broad data privacy legislation, the “General Data Protection Regulation,” in 2016; see European Parliament and Council of the European Union (2016). More recently, the European Commission released a proposed regulation on artificial intelligence; see European Commission (2021).

[68] For an overview of active state consumer privacy bills, see Klosowski (2021).

[69] For legal analysis of the current gaps in protecting workers from data-driven technologies, see Ajunwa et al. (2017), Bales and Stone (2020), Barocas and Selbst (2016), Bodie (2021), Hirsch (2020), Kim (2017), Richardson (forthcoming), Rogers (2020), and Scherer and Brown (2021).

[70] For example, see Wachter and Mittelstadt (2019).

[71] For more detail on the patchwork of state privacy protections, see Ajunwa et al. (2017).

[72] For a discussion on the difficulty of addressing discrimination and privacy issues created by workplace technologies, see Bodie and Kim (2021).

[73] See Scherer and Brown (2021) for a detailed analysis of worker health and safety impacts from monitoring systems.

[74] See Gamble (2019) and Milner and Traub (2021) for a discussion of the asymmetry of power these systems present for workers.

[75] See Pasquale (2021) for a more general discussion about the limits of a consent model vis-à-vis digital technologies.

[76] These principles draw upon a large body of work, including ACLU (2020), Ajunwa et al. (2017), Adler-Bell and Miller (2018), the Algorithmic Accountability Act of 2019 (U.S. Congress 2019), Barocas and Selbst (2016), Block and Sachs (2019), Ciocchetti (2011), Colclough (2020), De Stefano (2021; 2018), European Commission (2021), European Parliament and Council of the European Union (2016), Georgetown Law Center on Privacy & Technology (2019), Milner and Traub (2021), Ockenfels-Martinez and Boparai (2021), Reisman et al. (2018), Rieke et al. (2021), Scherer and Brown (2021), Slaughter (2021), Trades Union Congress (2020), Tutt (2017), and UNI Global (n.d.).

[77] For a more general questioning of the privacy framework, see Morozov (2021). Also, see Tisné (2020) regarding the limitations of using an individual data privacy framework for regulating the collective harms that arise from data-driven systems.

[78] Biometric data in particular will require heightened protections; see Ajunwa, Crawford, and Ford (2016) for a detailed analysis.

[79] See Bottomley (2020) for examples of the data minimization principle in public policy.

[80] For a detailed analysis of the connection between productivity monitoring and health and safety outcomes in the context of warehousing, see Ockenfels-Martinez and Boparai (2021).

[81] A growing number of jurisdictions in the United States have placed bans on the use of facial recognition technology, particularly in the public sector. See Conger et al. (2019) and Simonite (2020).

[82] Researchers have documented significant race and gender disparities and inaccuracies in the use of facial recognition technology; see Buolamwini and Gebru (2018) and Raji et al. (2020).

[83] See Narayanan (2019).

[84] See Burrell (2016) regarding opacity in machine learning algorithms and Edwards and Veale (2017) for a discussion of the challenges of explaining algorithmic decisions.

[85] Many of the concepts here are adapted from principles published by The Leadership Conference on Civil and Human Rights (2020). The authors also thank Professor Pauline Kim of Washington University in St. Louis for her generosity in sharing her expertise on discrimination law for this section.

[86] See Bodie et al. (2017) and Rogers (2020) for a legal discussion on collective bargaining in relation to workplace data and technology decisions.

[87] See Moss et al. (2021) and Reisman et al. (2018) for a framework on algorithmic impact assessments, geared towards the public sector. In the EU, algorithmic impact assessments have been legislated in the form of Data Protection Impact Assessments (DPIAs) under the General Data Protection Regulation (GDPR); see legal analysis from Kaminski and Malgieri (2021).

[88] For a deeper study and analysis of algorithmic accountability in the public sector, see Ada Lovelace Institute (2021).

 

References

Acemoglu, Daron, and Pascual Restrepo. 2019. “Automation and New Tasks: How Technology Displaces and Reinstates Labor.” Journal of Economic Perspectives 33 (2): 3–30. https://doi.org/10.1257/jep.33.2.3.

Ada Lovelace Institute, AI Now, and Open Government Partnership. 2021. “Algorithmic Accountability for the Public Sector.” https://www.opengovpartnership.org/documents/algorithmic-accountability-public-sector/.

Adler-Bell, Sam, and Michelle Miller. 2018. “The Datafication of Employment.” The Century Foundation, December 19, 2018. https://tcf.org/content/report/datafication-employment-surveillance-capitalism-shaping-workers-futures-without-knowledge/?agreed=1.

Ajunwa, Ifeoma. 2018. “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law.” Saint Louis University Law Journal 63 (21): 21–54. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247286.

Ajunwa, Ifeoma. 2020. “The Paradox of Automation as Anti-Bias Intervention.” Cardozo Law Review 41 (5): 1671–1742. http://cardozolawreview.com/the-paradox-of-automation-as-anti-bias-intervention/.

Ajunwa, Ifeoma, Kate Crawford, and Joel S. Ford. 2016. “Health and Big Data: An Ethical Framework for Health Information Collection by Corporate Wellness Programs.” Journal of Law, Medicine and Ethics 44 (3): 474–80. https://doi.org/10.1177/1073110516667943.

Ajunwa, Ifeoma, Kate Crawford, and Jason Schultz. 2017. “Limitless Worker Surveillance.” California Law Review 105 (3): 735–776. https://www.californialawreview.org/print/3-limitless-worker-surveillance/.

American Civil Liberties Union. 2020. “Through the Keyhole: Privacy in the Workplace, an Endangered Right.” https://www.aclu.org/print/node/22540.

Aspan, Maria. 2020. “Siri, Did I Ace the Interview? A.I. Is Transforming the Job Interview—and Everything After.” Fortune, January 20, 2020. https://fortune.com/longform/hr-technology-ai-hiring-recruitment/.

Bales, Richard A., and Katherine V. W. Stone. 2020. “The Invisible Web at Work: Artificial Intelligence and Electronic Surveillance in the Workplace.” Berkeley Journal of Employment & Labor Law 41 (1): 1–60. https://lawcat.berkeley.edu/record/1181483?ln=en.

Barocas, Solon, and Andrew D. Selbst. 2016. “Big Data’s Disparate Impact.” California Law Review 104: 671–732. http://dx.doi.org/10.2139/ssrn.2477899.

Benner, Chris, Sarah Mason, Françoise Carré, and Chris Tilly. 2020. “Delivering Insecurity: E-Commerce and the Future of Work in Food Retail.” UC Berkeley Labor Center & Working Partnerships USA. https://laborcenter.berkeley.edu/delivering-insecurity/.

Berfield, Susan. 2015. “How Walmart Keeps an Eye on Its Massive Workforce: The Retail Giant Is Always Watching.” Bloomberg BusinessWeek, November 24, 2015. https://www.bloomberg.com/features/2015-walmart-union-surveillance/.

Bhuiyan, Johana. 2020. “Instacart Shoppers Say They Face Unforgiving Metrics: ‘It’s a Very Easy Job to Lose.’” Los Angeles Times, August 27, 2020. https://www.latimes.com/business/technology/story/2020-08-27/shopping-for-instacart-metrics.

Block, Sharon, and Benjamin Sachs. 2019. “Clean Slate for Worker Power: Building a Just Economy and Democracy.” https://uploads-ssl.webflow.com/5fa42ded15984eaa002a7ef2/5fa42ded15984e5a8f2a8064_CleanSlate_Report_FORWEB.pdf.

Bodie, Matthew T. 2021. “The Law of Employee Data: Privacy, Property, Governance.” Indiana Law Journal 97: 1–68. https://scholarship.law.slu.edu/cgi/viewcontent.cgi?article=1607&context=faculty.

Bodie, Matthew T., Miriam A. Cherry, Marcia L. McCormick, and Jintong Tang. 2017. “The Law and Policy of People Analytics.” University of Colorado Law Review 88: 961–1042.

Bogen, Miranda, and Aaron Rieke. 2018. “Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias.” Upturn. https://www.upturn.org/reports/2018/hiring-algorithms/.

Bottomley, Emlyn. 2020. “Data and Algorithms in the Workplace: An Overview of Current Public Policy Strategies.” https://laborcenter.berkeley.edu/data-and-algorithms-in-the-workplace-an-overview-of-current-public-policy-strategies/.

Brodkin, Jon. 2019. “Amazon Made Video Games for Its Workers to Reduce Tedium of Warehouse Jobs.” Ars Technica, May 22, 2019. https://arstechnica.com/information-technology/2019/05/amazon-gamifies-its-warehouse-work-like-tetris-but-with-real-boxes/.

Briône, Patrick. n.d. “Algorithmic Management – A Trade Union Guide.” UNI Global Union Professionals & Managers. https://uniglobalunion.org/sites/default/files/files/news/uni_pm_algorithmic_management_guide_en.pdf.

Buolamwini, Joy, and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” In Proceedings of the Conference on Fairness, Accountability and Transparency (FAT* 2018), February 23–24, 2018. https://proceedings.mlr.press/v81/buolamwini18a.html.

Burrell, Jenna. 2016. “How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms.” Big Data and Society 3 (1): 1–12. https://doi.org/10.1177/2053951715622512.

Cappelli, Peter. 2020. “Stop Overengineering People Management.” Harvard Business Review, 2020. https://hbr.org/2020/09/stop-overengineering-people-management.

Carré, Francoise, Chris Tilly, Chris Benner, and Sarah Mason. 2020. “Change and Uncertainty, Not Apocalypse: Technological Change and Store-Based Retail.” UC Berkeley Labor Center & Working Partnerships USA. https://laborcenter.berkeley.edu/change-and-uncertainty-not-apocalypse-technological-change-and-store-based-retail/.

Chaney, Sarah. 2020. “Amazon, Google Help States as Coronavirus Boosts Unemployment Claims; Newer Technology Addresses Challenges Including Busy Phone Lines and Website Crashes.” Wall Street Journal (Online), May 12, 2020. https://www.wsj.com/articles/amazon-google-help-states-as-coronavirus-boosts-unemployment-claims-11589275801.

Chouldechova, Alexandra, Emily Putnam-Hornstein, Diana Benavides-Prado, Oleksandr Fialko, and Rhema Vaithianathan. 2018. “A Case Study of Algorithm-Assisted Decision Making in Child Maltreatment Hotline Screening Decisions.” Proceedings of Machine Learning Research Conference on Fairness, Accountability, and Transparency 81: 1–15. http://proceedings.mlr.press/v81/chouldechova18a.html.

Ciocchetti, Corey A. 2011. “The Eavesdropping Employer: A Twenty-First Century Framework for Employee Monitoring.” American Business Law Journal 48 (2): 285–369. https://onlinelibrary.wiley.com/doi/full/10.1111/j.1744-1714.2011.01116.x.

Clifford, Stephanie, and Jessica Silver-Greenberg. 2013. “Retailers Track Employee Thefts in Vast Databases.” New York Times, April 2, 2013. https://www.nytimes.com/2013/04/03/business/retailers-use-databases-to-track-worker-thefts.html.

Clinton, Paul. 2019. “Smarter Video Telematics Wave Arrives.” Automotive Fleet, March 19, 2019. https://www.automotive-fleet.com/327438/wave-of-smarter-video-telematics-solutions-arrives.

Cogito Corporation. 2020. “Augmented Intelligence in the Contact Center: The Why, What, and How.” https://cogitocorp.com/wp-content/uploads/2020/12/WP-4Ws-Augmented-Intelligence-r3.1.pdf.

Colclough, Christina. 2020. “Workers’ Rights: Negotiating and Co-Governing Digital Systems at Work.” Social Europe, September 3, 2020. https://socialeurope.eu/workers-rights-negotiating-and-co-governing-digital-systems-at-work.

Condon, Stephanie. 2019. “From Ohio’s ‘Baby Bot’ to Driver’s Ed in Delaware: How States Are Using AI.” ZDNet, October 15, 2019. https://www.zdnet.com/article/from-ohios-baby-bot-to-drivers-ed-in-delaware-how-states-are-using-ai/.

Conger, Kate, Richard Fausset, and Serge F. Kovaleski. 2019. “San Francisco Bans Facial Recognition Technology.” The New York Times, May 14, 2019. https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html.

Cresswell, Kathrin, and Aziz Sheikh. 2020. “Can Disinfection Robots Reduce the Risk of Transmission of SARS-CoV-2 in Health Care and Educational Settings?” Journal of Medical Internet Research 22 (9): 9–11. https://doi.org/10.2196/20896.

Cunningham, John. 2019. “Utilizing Electronic Visit Verification (EVV) in Home Care Visits.” Healthcare IT Today. January 22, 2019. https://www.healthcareittoday.com/2019/01/22/utilizing-electronic-visit-verification-evv-in-home-care-visits/.

Davenport, Thomas H. 2018. “From Analytics to Artificial Intelligence.” Journal of Business Analytics 1 (2): 73–80. https://doi.org/10.1080/2573234x.2018.1543535.

De Stefano, Valerio. 2018. “Negotiating the Algorithm: Automation, Artificial Intelligence, and Labour Protection.” Employment Working Paper No. 246. Geneva: International Labour Office. https://www.ilo.org/wcmsp5/groups/public/---ed_emp/---emp_policy/documents/publication/wcms_634157.pdf.

De Stefano, Valerio. 2021. “The EU Proposed Regulation on AI: A Threat to Labour Protection?” April 16, 2021. http://regulatingforglobalization.com/2021/04/16/the-eu-proposed-regulation-on-ai-a-threat-to-labour-protection/.

De Stefano, Valerio, and Simon Taes. 2021. “Algorithmic Management and Collective Bargaining.” Foresight Brief, May 2021. https://www.etui.org/publications/algorithmic-management-and-collective-bargaining.

Doellgast, Virginia, and Sean O’Brady. 2020. “Making Call Center Jobs Better: The Relationship between Management Practices and Worker Stress.” ILR School, Cornell University and DeGroote School of Business, McMaster University. https://hdl.handle.net/1813/74307.

Dyer, Jan. 2015. “Hand Hygiene Compliance Monitoring Provides Benefits, Challenges.” Infection Control Today, December 6, 2015. https://www.infectioncontroltoday.com/view/hand-hygiene-compliance-monitoring-provides-benefits-challenges.

Dzieza, Josh. 2020. “Robots Aren’t Taking Our Jobs — They’re Becoming Our Bosses.” The Verge, February 27, 2020. https://www.theverge.com/2020/2/27/21155254/automation-robots-unemployment-jobs-vs-human-google-amazon.

Edwards, Lilian, and Michael Veale. 2017. “Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For.” Duke Law & Technology Review 16 (1): 1–84.

Eidelson, Josh. 2017. “Hotels Add ‘Panic Buttons’ to Protect Housekeepers from Guests.” Bloomberg, December 13, 2017. https://www.bloomberg.com/news/articles/2017-12-13/hotels-add-panic-buttons-to-protect-housekeepers-from-guests.

Elish, Madeleine Clare. 2016. “Letting Autopilot Off the Hook. Why Do We Blame Humans When Automation Fails?” Slate, June 16, 2016. https://slate.com/technology/2016/06/why-do-blame-humans-when-automation-fails.html.

Escobar, Michal Christie. 2020. “2020: A Year of Change for Housekeeping.” Hospitality Technology, November 12, 2020. https://hospitalitytech.com/2020-year-change-housekeeping.

Eubanks, Virginia. 2018. “A Child Abuse Prediction Model Fails Poor Families.” Wired, January 15, 2018. https://www.wired.com/story/excerpt-from-automating-inequality/.

European Commission. 2021. “Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts.” https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1623335154975&uri=CELEX%3A52021PC0206.

European Parliament and Council of the European Union. 2016. General Data Protection Regulation (GDPR), Regulation 2016/679. https://gdpr-info.eu/.

Frankowski, Dan. 2019. “Humans Choose, AI Does Not.” Fiddler, May 8, 2019. https://blog.fiddler.ai/2019/05/humans-choose-ai-does-not/.

Gamble, Joelle. 2019. “The Inequalities of Workplace Surveillance: When Bosses Watch over Our Every Move, the Data They Collect Ends up Making Us Even More Unequal.” The Nation, June 3, 2019. https://www.thenation.com/article/archive/worker-surveillance-big-data/.

Garden, Charlotte. 2018. “Labor Organizing in the Age of Surveillance.” St. Louis University Law Journal 63 (55): 55–68. https://digitalcommons.law.seattleu.edu/cgi/viewcontent.cgi?article=1817&context=faculty.

Georgetown Law Center on Privacy & Technology. 2019. “The Worker Privacy Act: Discussion Draft.” https://drive.google.com/file/d/1Mi1JTezFbmTdJg2Fbp_MreFuSTWQ5QmK/view.

Gleason, Carrie, and Susan J Lambert. 2014. “Uncertainty by the Hour.” Open Society Foundations’ Future of Work Project. https://static.opensocietyfoundations.org/misc/future-of-work/just-in-time-workforce-technologies-and-low-wage-workers.pdf.

Gonzales, Amy. 2016. “The Contemporary US Digital Divide: From Initial Access to Technology Maintenance.” Information Communication and Society 19 (2): 234–48. https://doi.org/10.1080/1369118X.2015.1050438.

Greenhouse, Steven. 2019. “Where Are the Workers When We Talk About the Future of Work?” The American Prospect, October 22, 2019. https://prospect.org/labor/where-are-the-workers-when-we-talk-about-the-future-of-work/.

Griesbach, Kathleen, Adam Reich, Luke Elliott-Negri, and Ruth Milkman. 2019. “Algorithmic Control in Platform Food Delivery Work.” Socius: Sociological Research for a Dynamic World 5: 1–15. https://doi.org/10.1177/2378023119870041.

Gupta, Sarita, Stephen Lerner, and Joseph A. McCartin. 2018. “It’s Not the ‘Future of Work,’ It’s the Future of Workers That’s in Doubt.” The American Prospect, August 31, 2018. https://prospect.org/article/its-not-future-work-its-future-workers-doubt.

Gutelius, Beth, and Nik Theodore. 2019. “The Future of Warehouse Work: Technological Change in the U.S. Logistics Industry.” UC Berkeley Labor Center & Working Partnerships USA. http://laborcenter.berkeley.edu/future-of-warehouse-work/.

Hanley, Daniel A., and Sally Hubbard. 2020. “Eyes Everywhere: Amazon’s Surveillance Infrastructure and Revitalizing Worker Power.” https://www.openmarketsinstitute.org/publications/eyes-everywhere-amazons-surveillance-infrastructure-and-revitalizing-worker-power.

Hao, Karen. 2020. “An AI Hiring Firm Says It Can Predict Job Hopping Based on Your Interviews.” MIT Technology Review, July 24, 2020. https://www.technologyreview.com/2020/07/24/1005602/ai-hiring-promises-bias-free-job-hopping-prediction/.

Harris, Kim. 2017. “Mobile Tracking Apps Are Revolutionizing Construction — It’s Time to Get on Board.” Construction Executive, August 28, 2017. http://constructionexec.com/article/mobile-tracking-apps-are-revolutionizing-construction-its-time-to-get-on-board.

HireRight. 2018. “NRMA: The Retail Theft Database Overview.” http://www.geninfo.com/EXTRAS/industry-specific-solutions/retail/retail-industry-whitepapers/NRMA_Overview.pdf?v=00000091.

HireRight. 2020. “HireRight Introduces Social Media Screening through Partnership with Fama Technologies.” March 31, 2020. https://www.hireright.com/news/press-release/hireright-introduces-social-media-screening.

Hirsch, Jeffrey M. 2020. “Future Work.” University of Illinois Law Review 3: 889–958. https://illinoislawreview.org/wp-content/uploads/2020/06/Hirsch.pdf.

Hotel Tech Report. 2021. “Hotel Housekeeping Departments Depend on Great Software,” May 27, 2021. https://hoteltechreport.com/news/hotel-housekeeping.

Hurley, Dan. 2018. “Can an Algorithm Tell When Kids Are in Danger?” The New York Times, January 2, 2018. https://www.nytimes.com/2018/01/02/magazine/can-an-algorithm-tell-when-kids-are-in-danger.html.

Ikeler, Peter. 2016. “Deskilling Emotional Labour: Evidence from Department Store Retail.” Work, Employment and Society 30 (6): 966–83. https://doi.org/10.1177/0950017015609031.

Jabsky, Marina, and Charlene Obernauer. 2019. “Time off Task: Pressure, Pain, and Productivity at Amazon.” New York, NY. https://nycosh.org/resource/amazon-workers-report/.

Jacobs, Julia. 2018. “Hotels See Panic Buttons as a #MeToo Solution for Workers. Guest Bans? Not So Fast.” The New York Times, November 11, 2018. https://www.nytimes.com/2018/11/11/us/panic-buttons-hotel-me-too.html.

Kaminski, Margot E., and Gianclaudio Malgieri. 2021. “Algorithmic Impact Assessments under the GDPR: Producing Multi-Layered Explanations.” International Data Privacy Law 11 (2): 125–44. https://doi.org/10.1093/idpl/ipaa020.

Kantor, Jodi. 2014. “Working Anything but 9 to 5: Scheduling Technology Leaves Low-Income Parents with Hours of Chaos.” The New York Times, August 13, 2014. https://www.nytimes.com/interactive/2014/08/13/us/starbucks-workers-scheduling-hours.html.

Kantor, Jodi, Karen Weise, and Grace Ashford. 2021. “Inside Amazon’s Employment Machine.” The New York Times, June 15, 2021. https://www.nytimes.com/interactive/2021/06/15/us/amazon-workers.html.

Kaplan, Esther. 2015. “The Spy Who Fired Me: The Human Costs of Workplace Monitoring.” Harper’s Magazine, March 2015. https://harpers.org/archive/2015/03/the-spy-who-fired-me/6/.

Kellogg, Katherine C., Melissa A. Valentine, and Angèle Christin. 2020. “Algorithms at Work: The New Contested Terrain of Control.” Academy of Management Annals 14 (1): 366–410. https://doi.org/10.5465/annals.2018.0174.

Kessler, Sarah. 2020. “Companies Are Using Employee Survey Data to Predict — and Squash — Union Organizing.” OneZero, July 30, 2020. https://onezero.medium.com/companies-are-using-employee-survey-data-to-predict-and-squash-union-organizing-a7e28a8c2158.

Kim, Pauline T. 2017. “Data-Driven Discrimination at Work.” William & Mary Law Review 58 (3): 857–936. https://scholarship.law.wm.edu/wmlr/vol58/iss3/4/.

Kim, Pauline T., and Matthew T. Bodie. 2021. “Artificial Intelligence and the Challenges of Workplace Discrimination and Privacy.” Journal of Labor and Employment Law 35 (2): 289–315. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3929066.

Klosowski, Thorin. 2021. “The State of Consumer Data Privacy Laws in the US (And Why It Matters).” NYT Wirecutter, September 6, 2021. https://www.nytimes.com/wirecutter/blog/state-of-privacy-laws-in-us/.

Knight, Will. 2021. “The Foundations of AI are Riddled with Errors.” Wired, August 31, 2021. https://www.wired.com/story/foundations-ai-riddled-errors/.

Kresge, Lisa. 2020. “Data and Algorithms in the Workplace: A Primer on New Technologies.” UC Berkeley Labor Center. https://laborcenter.berkeley.edu/working-paper-data-and-algorithms-in-the-workplace-a-primer-on-new-technologies/.

Lasky, Steve. 2019. “Guardians of Disruption Drive Technology Forward: The Landscape Is Changing as Traditional Guard Companies Embrace New Service Models.” Security Infowatch, October 15, 2019. https://www.securityinfowatch.com/security-executives/protective-operations-guard-services/article/21109971/guardians-of-disruption-drive-technology-forward.

Lazzaro, Sage. 2021. “How Computer Vision Works — and Why It’s Plagued by Bias.” Venture Beat, August 11, 2021. https://venturebeat.com/2021/08/11/how-computer-vision-works-and-why-its-plagued-by-bias/.

The Leadership Conference on Civil and Human Rights. 2020. “Civil Rights Principles for Hiring Assessment Technologies.” https://civilrights.org/resource/civil-rights-principles-for-hiring-assessment-technologies/.

Lecher, Colin. 2019. “How Amazon Automatically Tracks and Fires Warehouse Workers for ‘Productivity.’” The Verge, April 25, 2019. https://www.theverge.com/2019/4/25/18516004/amazon-warehouse-fulfillment-centers-productivity-firing-terminations.

Lerman, Rachel. 2020. “Robot Cleaners Are Coming, This Time to Wipe up Your Coronavirus Germs.” Washington Post, September 8, 2020. https://www.washingtonpost.com/technology/2020/09/08/robot-cleaners-surge-pandemic/.

Lever, Rob. 2017. “Privacy Fears over Artificial Intelligence as Crimestopper.” Science X Phys.Org, November 12, 2017. https://phys.org/news/2017-11-privacy-artificial-intelligence-crimestopper.html.

Levy, Karen E.C. 2015. “The Contexts of Control: Information, Power, and Truck-Driving Work.” Information Society 31 (2): 160–74. https://doi.org/10.1080/01972243.2015.998105.

Levy, Karen, and Solon Barocas. 2018. “Refractive Surveillance: Monitoring Customers to Manage Workers.” International Journal of Communication 12: 1166–88. https://ijoc.org/index.php/ijoc/article/viewFile/7041/2302.

Lindzon, Jared. 2020. “Security Flaws Threaten ‘Panic Buttons’ Meant to Protect Hotel Workers.” Fast Company, February 6, 2020. https://www.fastcompany.com/90458034/security-flaws-threaten-panic-buttons-meant-to-protect-hotel-workers.

Litwin, Adam Seth. 2020. “Technological Change in Health Care Delivery: Its Drivers and Consequences for Work and Workers.” UC Berkeley Labor Center & Working Partnerships USA. https://laborcenter.berkeley.edu/technological-change-in-health-care-delivery/.

Liu, Jennifer. 2019. “This Algorithm Can Predict When Workers Are about to Quit—Here’s How.” CNBC, September 10, 2019. https://www.cnbc.com/2019/09/10/this-algorithm-can-predict-when-workers-are-about-to-quitheres-how.html.

Lopez, Steve. 2011. “Disney’s ‘Electronic Whip’: Anaheim Laundry Workers Monitored by Giant Screens Aim to Keep Productivity High as They Worry about Paying More for Healthcare.” Los Angeles Times, October 19, 2011. https://www.latimes.com/health/la-xpm-2011-oct-19-la-me-1019-lopez-disney-20111018-story.html.

Lorenzi, Neal. 2021. “Automated Hand-Hygiene System Evolution Continues: Data Collection Expands While COVID-19 Presents New Challenges.” Health Facilities Management, February 11, 2021. https://www.hfmmagazine.com/articles/4112-automated-hand-hygiene-system-evolution-continues.

Luca, Michael, Jon Kleinberg, and Sendhil Mullainathan. 2016. “Algorithms Need Managers, Too: Know How to Get the Most out of Your Predictive Tools.” Harvard Business Review, 2016. https://hbr.org/2016/01/algorithms-need-managers-too.

Lyons, Kim. 2020. “Some Shipt Workers Report Seeing Lower Pay under New Effort-based Model.” The Verge, October 16, 2020. https://www.theverge.com/2020/10/16/21519298/shipt-workers-lower-pay-algorithm-target-shopping.

Madden, Mary, Michele Gilman, Karen Levy, and Alice Marwick. 2017. “Privacy, Poverty, and Big Data: A Matrix of Vulnerabilities for Poor Americans.” Washington University Law Review 95 (1): 53–125. https://openscholarship.wustl.edu/cgi/viewcontent.cgi?article=6265&context=law_lawreview.

Mahmood, Khurram. 2019. “Four Ways Computer Vision Is Transforming Physical Security.” Forbes, September 23, 2019. https://www.forbes.com/sites/forbestechcouncil/2019/09/23/four-ways-computer-vision-is-transforming-physical-security/?sh=689a63a15846.

Maria, Gitanjali, and Rachel Burger. 2016. “6 Outstanding Geofencing Tools to Use on Your Construction Site.” Capterra, December 1, 2016. https://blog.capterra.com/6-outstanding-geofencing-tools-to-use-on-your-construction-site/.

Mateescu, Alexandra, and Aiha Nguyen. 2019. “Explainer: Workplace Monitoring & Surveillance.” Data & Society. https://datasociety.net/library/explainer-workplace-monitoring-surveillance/.

Matsakis, Louise. 2019. “At an Outback Steakhouse Franchise, Surveillance Blooms.” Wired, October 19, 2019. https://www.wired.com/story/outback-steakhouse-presto-vision-surveillance/.

Maurer, Roy. 2021a. “2021 Recruiting Trends Shaped by the Pandemic.” Society for Human Resource Management (SHRM), February 1, 2021. https://www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/2021-recruiting-trends-shaped-by-covid-19.aspx.

Maurer, Roy. 2021b. “HireVue Discontinues Facial Analysis Screening. Decision Reflects Re-Examination of AI Hiring Tools.” Society for Human Resource Management (SHRM), February 3, 2021. https://www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/hirevue-discontinues-facial-analysis-screening.aspx.

McCallum, Jamie K. 2021. “Remote Controlled Workers.” The American Prospect, February 24, 2021. https://prospect.org/labor/remote-controlled-workers-digital-surveillance/.

McCrea, Bridget. 2020. “Labor Management Systems (LMS): The New Age of Employee Engagement.” Logistics Management, June 3, 2020. https://www.logisticsmgmt.com/article/labor_management_systesm_lmsthe_new_age_of_employee_engagement.

Metcalf, Jacob. 2018. “When Verification Is Also Surveillance: EVV Devices Could Intrusively Track Medicaid Recipients.” Data & Society Points. February 27, 2018. https://points.datasociety.net/when-verification-is-also-surveillance-21edb6c12cc9.

Miller, Steven M. 2018. “AI: Augmentation, more so than Automation.” Asian Management Insights 5 (1): 1–20. https://ink.library.smu.edu.sg/ami/83/.

Milner, Yeshimabeit, and Amy Traub. 2021. “Data Capitalism + Algorithmic Racism.” Data for Black Lives and Demos. https://www.demos.org/research/data-capitalism-and-algorithmic-racism.

Morozov, Evgeny. 2021. “Privacy Activists Are Winning Fights with Tech Giants. Why Does Victory Feel Hollow?” The Guardian, May 15, 2021. https://www.theguardian.com/commentisfree/2021/may/15/privacy-activists-fight-big-tech.

Moss, Emanuel, Elizabeth Anne Watkins, Ranjit Singh, Madeleine Clare Elish, and Jacob Metcalf. 2021. “Assembling Accountability: Algorithmic Impact Assessment for the Public Interest.” Data & Society. https://datasociety.net/wp-content/uploads/2021/06/Assembling-Accountability.pdf.

Narayanan, Arvind. 2019. “How to Recognize AI Snake Oil.” Arthur Miller Lecture on Science and Ethics. Massachusetts Institute of Technology Program in Science, Technology, and Society. https://www.cs.princeton.edu/~arvindn/talks/MIT-STS-AI-snakeoil.pdf.

Negrón, Wilneida. Forthcoming. “‘Little Tech’ Is Coming for Low-Wage Workers: A Framework for Reclaiming and Building Worker Power.” Coworker.

Nelson, Ariel. 2019. “Broken Records Redux: How Errors by Criminal Background Check Companies Continue to Harm Consumers Seeking Jobs and Housing.” National Consumer Law Center (NCLC). https://www.nclc.org/issues/rpt-broken-records-redux.html.

Newman, Nathan. 2017. “Reengineering Workplace Bargaining: How Big Data Drives Lower Wages and How Reframing Labor Law Can Restore Information Equality in the Workplace.” University of Cincinnati Law Review 85: 693–760. https://heinonline.org/HOL/P?h=hein.journals/ucinlr85&i=713.

Nguyen, Aiha. 2021. “The Constant Boss: Labor Under Digital Surveillance.” Data & Society. https://datasociety.net/library/the-constant-boss/.

Nguyen, Aiha. 2020. “On the Clock and at Home: Post-COVID-19 Employee Monitoring in the Workplace.” People & Strategy Journal, Summer 2020. https://www.shrm.org/executive/resources/people-strategy-journal/summer2020/Pages/feature-nguyen.aspx/.

Nieuwenhuijsen, K., D. Bruinvels, and M. Frings-Dresen. 2010. “Psychosocial Work Environment and Stress-Related Disorders, a Systematic Review.” Occupational Medicine 60 (4): 277–86. https://doi.org/10.1093/occmed/kqq081.

Ockenfels-Martinez, Martha, and Sukhdip Purewal Boparai. 2021. “The Public Health Crisis Hidden in Amazon Warehouses.” Oakland, CA: Human Impact Partners and Warehouse Workers Resource Center. https://humanimpact.org/hipprojects/amazon/.

O’Connor, Sarah. 2016. “When Your Boss Is an Algorithm.” Financial Times, September 7, 2016. https://www.ft.com/content/88fdc58e-754f-11e6-b60a-de4532d5ea35.

O’Connor, Shelby. 2019. “Tabletop Tech.” Food Service and Hospitality Magazine, September 27, 2019. https://www.foodserviceandhospitality.com/tabletop-tech/.

O’Donovan, Caroline. 2018. “An Invisible Rating System at Your Favorite Chain Restaurant Is Costing Your Server.” BuzzFeed News, June 21, 2018. https://www.buzzfeednews.com/article/carolineodonovan/ziosk-presto-tabletop-tablet-restaurant-rating-servers.

Oliver, Dean. 2020. “Construction Safety Startup Raises $4M to Monitor Machinery Blind Spots.” This Is Construction, June 4, 2020. https://www.thisisconstruction.com.au/news-articles/construction-safety-startup-raises-4m-to-monitor-machinery-blind-spots.

Overstreet, Kim. 2019. “Collaborative Robots in the E-Commerce Supply Chain.” Automation World, April 29, 2019. https://www.automationworld.com/factory/robotics/article/13319773/collaborative-robots-in-the-ecommerce-supply-chain.

Pasquale, Frank. 2021. “Licensure as Data Governance: Moving toward an Industrial Policy for Artificial Intelligence.” Knight First Amendment Institute at Columbia University, September 28, 2021. https://knightcolumbia.org/content/licensure-as-data-governance.

Patton, Carol. 2021. “What Will Recruitment Look like after COVID?” Human Resource Executive, March 31, 2021. https://hrexecutive.com/what-will-recruitment-look-like-after-covid/.

Peterson, Hayley. 2019. “Amazon Is Tracking Delivery Workers’ Every Move with an App That Assigns Them Scores Based on Their Driving.” Business Insider, December 18, 2019. https://www.businessinsider.com/amazon-scores-delivery-workers-driving-skills-using-tracking-app-2019-12.

Plus. 2021. “Autonomous Trucking Company Plus Will Use AI and Billions of Miles of Data to Train Self-Driving Semis.” VentureBeat, April 15, 2021. https://venturebeat.com/2021/04/15/autonomous-trucking-company-plus-will-use-ai-and-billions-of-miles-of-data-to-train-self-driving-semis/.

Raghavan, Manish, Solon Barocas, Jon Kleinberg, and Karen Levy. 2020. “Mitigating Bias in Algorithmic Hiring: Evaluating Claims and Practices.” In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, January 27–30, 2020. https://doi.org/10.1145/3351095.3372828.

Raji, Inioluwa Deborah, Timnit Gebru, Margaret Mitchell, Joy Buolamwini, et al. 2020. “Saving Face: Investigating the Ethical Concerns of Facial Recognition Auditing.” Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, February 2020. https://doi.org/10.1145/3375627.3375820.

Rainey, Clint. 2018. “The Trouble with Tablets: How Self-Pay Platforms in Restaurants Are Wreaking Havoc on Servers.” Grub Street, July 2018. https://www.grubstreet.com/2018/07/restaurant-tablets-server-complaints.html.

Reisman, Dillon, Jason Schultz, Kate Crawford, and Meredith Whittaker. 2018. “Algorithmic Impact Assessments: A Practical Framework for Public Agency Accountability.” AI Now Institute. https://ainowinstitute.org/aiareport2018.pdf.

Reyes, Juliana Feliciano. 2018. “Hotel Housekeeping on Demand: Marriott Cleaners Say This App Makes Their Job Harder.” The Philadelphia Inquirer, July 2, 2018. https://www.inquirer.com/philly/news/hotel-housekeepers-schedules-app-marriott-union-hotsos-20180702.html.

Richardson, Rashida. Forthcoming. “Defining and Demystifying Automated Decision Systems.” Maryland Law Review. Posted March 24, 2021. https://ssrn.com/abstract=3811708.

Rieke, Aaron, Urmilla Janardan, Mingwei Hsu, and Natasha Duarte. 2021. “Essential Work: Analyzing the Hiring Technologies of Large Hourly Employers.” Upturn. https://www.upturn.org/reports/2021/essential-work/.

Rodriguez, Katitza, and Svea Windwehr. 2020. “Workplace Surveillance in Times of Corona.” Electronic Frontier Foundation, September 10, 2020. https://www.eff.org/deeplinks/2020/09/workplace-surveillance-times-corona.

Rogers, Brishen. 2020. “The Law and Political Economy of Workplace Technological Change.” Harvard Civil Rights-Civil Liberties Law Review 55: 1–53. https://harvardcrcl.org/wp-content/uploads/sites/10/2020/10/Rogers.pdf.

Rosenblat, Alex. 2016. “The Truth about How Uber’s App Manages Drivers.” Harvard Business Review, April 6, 2016. https://hbr.org/2016/04/the-truth-about-how-ubers-app-manages-drivers.

Rosenblat, Alex, Solon Barocas, Karen Levy, and Tim Hwang. 2016. “Discriminating Tastes: Customer Ratings as Vehicles for Bias.” Data & Society. https://datasociety.net/pubs/ia/Discriminating_Tastes_Customer_Ratings_as_Vehicles_for_Bias.pdf.

Sandu, Dan. 2019. “A Quick Guide for Effective Chatbot Training in Customer Service.” Chatbots Life, March 4, 2019. https://chatbotslife.com/a-quick-guide-for-effective-chatbot-training-in-customer-service-ad75ed768390.

Schatsky, David, Craig Muraskin, and Ragu Gurumurthy. 2015. “Demystifying Artificial Intelligence.” The Atlantic. https://www.theatlantic.com/sponsored/deloitte-shifts/demystifying-artificial-intelligence/257/.

Scheiber, Noam. 2017. “How Uber Uses Psychological Tricks to Push Its Drivers’ Buttons.” New York Times, April 2, 2017. https://www.nytimes.com/interactive/2017/04/02/technology/uber-drivers-psychological-tricks.html?smid=fb-nytscience&smtyp=cur&_r=0.

Schencker, Lisa. 2019. “Doctors and Nurses Clean Their Hands Only Half as Much as They Should. Now Some Chicago-Area Hospitals Are Having Them Wear Tracking Technology to Keep Tabs.” Chicago Tribune, December 26, 2019.

Scherer, Matt, and Lydia X. Z. Brown. 2021. “Warning: Bossware May Be Hazardous to Your Health.” Center for Democracy & Technology. https://cdt.org/wp-content/uploads/2021/07/2021-07-29-Warning-Bossware-May-Be-Hazardous-To-Your-Health-Final.pdf.

Sicular, Svetlana, and Dave Aron. 2019. “Leverage Augmented Intelligence to Win With AI.” Gartner, Inc. https://www.gartner.com/en/documents/3939714/leverage-augmented-intelligence-to-win-with-ai.

Simon, Matt. 2015. “This Incredible Hospital Robot Is Saving Lives. Also, I Hate It.” Wired, February 10, 2015. https://www.wired.com/2015/02/incredible-hospital-robot-saving-lives-also-hate/amp.

Simonite, Tom. 2020. “Portland’s Face-Recognition Ban Is a New Twist on ‘Smart Cities.’” Wired, September 21, 2020. https://www.wired.com/story/portlands-face-recognition-ban-twist-smart-cities/.

Simonite, Tom. 2018. “This Call May Be Monitored for Tone and Emotion.” Wired, March 19, 2018. https://www.wired.com/story/this-call-may-be-monitored-for-tone-and-emotion/.

Slaughter, Rebecca Kelly, Janice Kopec, and Mohamad Batal. 2021. “Algorithms and Economic Justice: A Taxonomy of Harms and a Path Forward for the Federal Trade Commission.” Digital Future Whitepaper Series. Yale Law School Information Society Project. https://law.yale.edu/sites/default/files/area/center/isp/documents/algorithms_and_economic_justice_master_final.pdf.

Solon, Olivia. 2021. “Big Tech Call Center Workers Face Pressure to Accept Home Surveillance.” NBC News, August 8, 2021. https://www.nbcnews.com/tech/tech-news/big-tech-call-center-workers-face-pressure-accept-home-surveillance-n1276227.

State of Ohio. 2018. “Transforming Delivery of Health & Human Services through Robotics Process Automation.” https://www.nascio.org/wp-content/uploads/2020/09/NASCIO-Awards-2019_State-of-OH-Bots.pdf.

Tanaka, Greg, Zhixin Liu, Garrett Wong, Zhijuan Gao, Ming Liu, Patrick Chung Ting Cho, and Shaun Kurien Benjamin. 2016. Method for determining staffing needs based in part on sensor inputs. US Patent Application 2016/0342929 A1, published 2016. https://www.freepatentsonline.com/y2016/0342929.html.

Teleperformance. 2018. “Who We Are.” https://www.teleperformance.com/en-us/our-locations/united-states.

The Leadership Conference on Civil and Human Rights. 2020. “Civil Rights Principles for Hiring Assessment Technologies.” https://civilrights.org/resource/civil-rights-principles-for-hiring-assessment-technologies/.

The Royal Society. 2017. “Machine Learning: The Power and Promise of Computers That Learn by Example.” https://royalsociety.org/~/media/policy/projects/machine-learning/publications/machine-learning-report.pdf.

Ticona, Julia, Alexandra Mateescu, and Alex Rosenblat. 2018. “Beyond Disruption: How Tech Shapes Labor Across Domestic Work and Ridehailing.” Data & Society. https://datasociety.net/library/beyond-disruption/.

Tippett, Elizabeth, Charlotte S. Alexander, and Zev J. Eigen. 2017. “When Timekeeping Software Undermines Compliance.” Yale Journal of Law and Technology 19 (1): 1–76. https://digitalcommons.law.yale.edu/yjolt/vol19/iss1/1/.

Tisné, Martin. 2020. “The Data Delusion: Protecting Individual Data Isn’t Enough When the Harm Is Collective.” Stanford Cyber Policy Center. https://cyber.fsi.stanford.edu/publication/data-delusion.

Townsend, Phela. 2021. “Data Privacy Is Not Just a Consumer Issue: It’s Also a Labor Rights Issue.” The Century Foundation Next 100, May 14, 2021. https://thenext100.org/data-privacy-is-not-just-a-consumer-issue-its-also-a-labor-rights-issue/.

Townsend, Phela. 2020. “Disconnected: How the Digital Divide Harms Workers and What We Can Do about It.” The Century Foundation Next 100, October 22, 2020. https://thenext100.org/disconnected-how-the-digital-divide-harms-workers-and-what-we-can-do-about-it/.

Trades Union Congress. 2020. “Technology Managing People – The Worker Experience.” November 29, 2020. https://www.tuc.org.uk/sites/default/files/2020-11/Technology_Managing_People_Report_2020_AW_Optimised.pdf.

Trollope, Rowan. 2018. “Five Levels of AI-Driven Contact Center Agent Augmentation.” Medium, August 6, 2018. https://medium.com/@rowantrollope/five-levels-of-contact-center-agent-augmentation-23b8cc5b6473.

TuSimple. 2019. “Big Rig, No Driver: How TuSimple Uses AI to Train Self-Driving Semis.” Wired, April 2019. https://www.wired.com/brandlab/2019/04/big-rig-no-driver-tusimple-uses-ai-train-self-driving-semis/.

Tutt, Andrew. 2017. “An FDA for Algorithms.” Administrative Law Review 69: 83–123. https://doi.org/10.2139/ssrn.2747994.

UC Berkeley Labor Center. 2020. “COVID-19 and Technology at Work.” July 8, 2020. https://laborcenter.berkeley.edu/covid-19-and-technology-at-work/.

UNI Global Union. n.d. “Top 10 Principles for Ethical Artificial Intelligence.” http://www.thefutureworldofwork.org/media/35420/uni_ethical_ai.pdf.

U.S. Congress. 2019. “Algorithmic Accountability Act of 2019.” S. 1108 and H.R. 2231, 116th Congress, 1st Session, introduced April 10, 2019. https://www.congress.gov/bill/116th-congress/senate-bill/1108.

Vincent, James. 2019. “Amazon Turns Warehouse Tasks into Video Games to Make Work ‘Fun.’” The Verge, May 22, 2019. https://www.theverge.com/2019/5/22/18635272/amazon-warehouse-working-conditions-gamification-video-games.

Viscelli, Steve. 2018. “Autonomous Trucks and the Future of the American Trucker.” UC Berkeley Labor Center & Working Partnerships USA. https://laborcenter.berkeley.edu/driverless/.

Wachter, Sandra, and Brent Mittelstadt. 2019. “A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI.” Columbia Business Law Review 2019 (2). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3248829.

Weil, David. 2019. “Understanding the Present and Future of Work in the Fissured Workplace Context.” RSF: The Russell Sage Foundation Journal of the Social Sciences 5 (5): 147–65. https://doi.org/10.7758/rsf.2019.5.5.08.

Woyke, Elizabeth. 2018. “AI Could Help the Construction Industry Work Faster—and Keep Its Workforce Accident-Free.” MIT Technology Review, June 12, 2018. https://www.technologyreview.com/s/611141/ai-could-help-the-construction-industry-work-faster-and-keep-its-workforce-accident-free/.

Zarya, Valentina. 2016. “Employers Are Quietly Using Big Data to Track Employee Pregnancies.” Fortune, February 17, 2016. https://fortune.com/2016/02/17/castlight-pregnancy-data/.

Zickuhr, Kathryn. 2021. “Workplace Surveillance Is Becoming the New Normal for U.S. Workers.” Washington Center for Equitable Growth. https://equitablegrowth.org/research-paper/workplace-surveillance-is-becoming-the-new-normal-for-u-s-workers/.