Pay Algorithms Make Working in the Gig Economy Feel Like ‘Gambling,’ Study Says

Startups behind the so-called gig economy have spent millions of dollars transforming the nature of work from structured hours, pay, and benefits into an ad-hoc world of “independent contractors” whose boss is an algorithm. According to a new paper, haphazard algorithmic pay has turned having a job into something more like gambling.

In 2020, California voters approved Prop 22, which enshrined the ability of ride hail companies to treat their drivers as independent contractors rather than employees. The measure was proposed by Uber, Lyft, and DoorDash in response to a 2019 state law and a state Supreme Court decision that required the companies to reclassify their drivers as employees and pay for health care, unemployment insurance, and other benefits. The companies instead proposed Prop 22, which allows them to keep workers as independent contractors and promises a “minimum wage” that applies only while they are actively ferrying passengers (on average, 40 percent of driver time is spent waiting for customers). Uber, Lyft, and DoorDash spent a combined $200 million on the ballot proposal, making it the most expensive campaign in the state’s history. A judge ruled the law unconstitutional in 2021, following a lawsuit from the Service Employees International Union, and the law is now being contested in appeals court.

According to a study preprint by legal scholar Veena Dubal, Prop 22, along with a similar law passed in Washington, legally enshrines an unfair wage system akin to gambling, run by opaque algorithms, a practice Dubal calls “algorithmic wage discrimination.” This differs from traditional wage discrimination, in which someone is paid a different wage based on their gender, race, or other protected categories; instead, Dubal compares the pay schema of the big ride hail companies to consumer price discrimination, in which people pay different prices for the same products depending on how much companies think they’re willing to pay.
According to Dubal, Uber and Lyft are doing essentially the same thing with wages: paying drivers different rates for the same work based on what the companies think they’ll accept. They do this both through data collection on drivers and through the opacity of the algorithms that determine when drivers get a customer and how much they will be paid. “Algorithmic wage discrimination allows firms to personalize and differentiate wages for workers in ways unknown to them, paying them to behave in ways that the firm desires, perhaps for as little as the system determines that they may be willing to accept,” Dubal writes. The wages are “calculated with ever-changing formulas using granular data on location, individual behavior, demand, supply, and other factors,” she adds.

In a study combining legal analysis with interviews with gig workers, Dubal concludes that Prop 22 has turned working into gambling. From a driver’s point of view, every time they log in to work they are essentially gambling for wages, because the algorithm offers no explanation of why those wages are what they are. One way the companies might do this is by determining how long a driver is willing to wait to be given a fare. Drivers are not paid for idle time, yet the companies’ business model benefits from that idling because it keeps availability high. According to Dubal, “The company’s goal is to keep as many drivers on the road in order to quickly address fluctuations in rider demand,” and the companies are motivated to “elongate the time between sending fares to any one driver, so long as that wait time does not lead the driver to end their shift.”

The ride hail companies use “bonuses” to keep drivers on the road, giving them extra pay if they hit certain quotas. But as Dubal points out, the companies can use their algorithms to make those bonuses difficult to unlock. Moreover, the bonuses are offered inconsistently and in different amounts to each driver, with no explanation from the ride hail companies.
Dubal interviewed a driver named Domingo who said he was close to hitting his bonus number when rides suddenly dried up. “I had 95 out of 96 rides for a $100 bonus,” he told Dubal. “It was ten o’clock at night in a popular area. It took me 45 minutes in a popular area to get that last ride. The algorithm was moving past me to get to people who weren’t close to their bonus.” Neither the driver nor Dubal has evidence that the company did this intentionally, but that is the point: the system operates capriciously, Dubal argues, and companies can simply claim they are doling out customers to drivers efficiently using unexplained variables. In a sense, this is another variation on a verbal sleight of hand a former FTC chairperson described in 2017: if you substitute the word “algorithm” with “a guy named Bob,” decisions that seemed objective and inscrutable suddenly seem unfair and malicious. Drivers also reported “fake surges,” in which they got notifications about surge pricing at a certain location, showed up, and saw the surge pricing end once the lot filled up. A driver who spoke to Dubal said an Uber employee pulled him aside and advised him not to chase surges for this reason.

Dubal said the best way to deal with this issue would be for regulators to consider banning the use of algorithms to set wages altogether, rather than pursuing litigation to demystify the companies’ algorithms, as some have called for in the past. But that’s not what’s happening; rather, the firms have set these practices into law through Prop 22 and laws in other states that permit them to treat their drivers as independent contractors. Dubal notes that this legal reality veers away from how pay has been regulated in the past.
Even in a Supreme Court case that struck down a minimum wage law, the court declared that “A statute requiring an employer to pay in money, to pay at prescribed and regular intervals, to pay the value of the services rendered, even to pay with fair relation to the extent of the benefit obtained from the service, would be understandable.” In contrast, Dubal says, “The worker cannot know what the firm has algorithmically decided their labor is worth, and the technological form of calculation makes each person’s wages different.”

Dubal believes algorithmic wage-setting is harmful because it motivates companies to collect reams of data on drivers, whom the companies fundamentally view as consumers of their technology rather than as employees. “If we ban this kind of payment practice…at least in some context, it will discourage surveillance of work,” Dubal told Motherboard. Surveillance of workers has existed since the industrial revolution, she said, but the type of digital surveillance used for algorithmic wage-setting is on another level entirely. Workers have an asymmetrical power relationship with their employers, so they may sign away their data without realizing what it will be used for: “They might not consider that this employment data about me is actually being retained for future decisions, not just in this workplace, but beyond.”

The use of algorithms, “artificial intelligence,” and data to structure work scheduling and payment pervades the economy, including nurse staffing and janitorial work in hospitals. Dubal suggests we engage in a larger reckoning with these practices. “This is not just about economic harms in the on demand economy or unfairness in the on demand or gig economy, but actually about shifting payment norms,” she told Motherboard.
Over and over again, the drivers Dubal spoke with equated the pay scheme to gambling, and one driver declared that, much like at a casino, “the house always wins.” Some even expressed shame about continuing to do the work for this reason. If there’s any upside, it’s that the ride hail companies’ opacity and unfairness toward drivers have angered many of them and are pushing them to organize. As one driver named Dietrich told Dubal, “[It’s] constant cognitive dissonance. You’re free, but the app controls you.” Uber did not respond to a request for comment, and Lyft did not provide a statement in time for publication.
