CrimProf Blog

Editor: Kevin Cole
Univ. of San Diego School of Law

Friday, November 19, 2021

Diamantis on Algorithms as Employees

Mihailis Diamantis (University of Iowa - College of Law) has posted Algorithms as Employees: Holding Corporations Accountable for Their Digital Workforce on SSRN. Here is the abstract:
The workforce is digitizing. Leading consultancies estimate that algorithmic systems will replace forty-five percent of human-held jobs by 2030. This is a well-documented and alarming trend for the millions of truckers, bankers, and line-workers whose jobs will become obsolete. But now that corporations are using algorithms like employees, a second public threat, one that has received far less attention, is also emerging: a growing corporate accountability gap.

One feature that algorithms share with the human employees they are replacing is their capacity to cause harm. Even today, algorithms discriminate against loan applicants, manipulate stock markets, collude over prices, and cause traffic deaths. Ordinarily, corporate employers would be responsible for these injuries, but the rules for assessing corporate liability arose at a time when only humans could act on behalf of corporations. Those rules apply awkwardly, if at all, to silicon. Some corporations have already discovered this legal loophole and are rapidly automating business functions to limit their own liability risk.

This Article seeks a way to hold corporations accountable for the algorithmic harms of their digital workforce. It draws inspiration from responses to earlier corporate efforts to dodge liability by manipulating the formal boundary defining employment. For more than a century, corporations have sought to jilt victims and immunize themselves by shifting operations from employees to various non-employee laborers, like temps, contractors, and gig workers. Lawmakers and scholars have responded to each of these machinations by developing functional tests that recharacterize some of these workers as employees, thereby closing the corporate accountability gap.

This Article proposes an analogous approach for algorithms: some algorithms should be treated, for liability purposes, as corporate employees. Drawing on existing functional characterizations of employment, the Article defines the concept of an “employed algorithm” as one over which a corporation exercises substantial control and from which it derives substantial benefits. If a corporation employs an algorithm that causes criminal or civil harm, the corporation should be liable just as if the algorithm were a human employee. This would allow plaintiffs and prosecutors to leverage existing, employee-focused liability rules to hold corporations accountable when the digital workforce transgresses.
