Sunday, April 8, 2018
There's an interesting article by Professor Brian Sheppard (Seton Hall) in the March issue of Thomson Reuters' Practice Innovations newsletter that discusses the phenomenon of "skill fade," which occurs when workplace technologies replace human decision-making. By way of example, Professor Sheppard notes that several studies have shown that the use of autopilot technology on commercial flights can lead to a decline in pilot skill. Professor Sheppard then asks whether the same could happen to lawyers as algorithm-driven artificial intelligence is increasingly used to make strategic decisions about a client's case that in the past would have been made by the lawyers themselves. Here's an excerpt:
. . . .
Artificial intelligence—particularly the sort that uses algorithm-powered machine learning—can evaluate, sort, and cull information outside the lawyer's view. For example, existing technology allows cases or other resources to be selected, processed, and used as citations in machine-generated memoranda or contracts. The automated process of creation is almost always hidden. Companies wall it off in the name of intellectual property. But even if there were no walls, lawyers would be unable to understand the decision-making process that led an algorithm to sort information. And states could not set ethical duties at such a challenging level.
However, it is not easy to see how the inscrutability of algorithms creates an ethical problem. Won't lawyers be able to assess whether the technology is producing better outputs than they could have produced without it? Won't they be able to review memoranda and check for errors?
Perhaps not. Automation may lead to a phenomenon known as skill fade.
Skill fade has been observed in occupational fields in which automation has become widespread. For example, numerous empirical studies have shown that autopilot can lead to a decline in pilot skill. Calvin L. Scovel III, the Inspector General of the US Department of Transportation, became so concerned about skill fade that his office reprimanded the Federal Aviation Administration, claiming that the agency no longer knows how many pilots remain capable of manually operating planes. Unfortunately, human error after autopilot failure was a likely cause in the recent crashes of Turkish Airlines Flight 1951 and Asiana Airlines Flight 214.
Skill fade becomes problematic when an automated system fails. If we conceive of system failure as an inability to access or otherwise use automated programs, then system failure in legal practice would be incredibly rare. It might happen during a long-term power outage, internet failure, or something similarly dramatic. While these developments are not impossible—Puerto Rico stands as a continuing example—they are highly improbable. I have been doing legal research since the turn of the millennium, and I haven't looked at pocket parts since my first year of law school.
However, the inability to access technology is not the only type of system failure. Clients could also be harmed when automation evolves to a point that it ceases to perform as well as lawyers. As with the skill fade from widespread use of autopilot, unnoticed degradation is a bigger risk than the system breaking outright. As legal skills fade, we might be unable to gauge whether the outputs of the system are as good as the outputs that we would have created in the pre-automation period.
. . . .
Continue reading here.