French lawmakers made history in the summer of 2019 when they enacted legislation limiting how tech companies can implement analytics in the legal sector. Enacting the legislation was tantamount to admitting that analytics, if not used properly, could corrupt the legal system. At issue is the practice of analyzing judge and attorney behavior as a means of predicting case outcomes.
The question now before us is this: does case prediction take legal analytics too far? The question is pressing because a number of legal tech innovators have already figured out how to do what France wants to avoid. They are able to successfully predict case outcomes through the use of analytics.
Predictions via Deep Learning
The technology behind predicting cases is actually quite fascinating. You design a piece of software capable of harvesting and analyzing large volumes of data from past cases. You also give the software deep learning capabilities built on statistical models. Finally, you feed in the data and let it run.
Software algorithms analyze millions of data points in order to establish trends. The software gradually learns what is most likely to occur in future cases by comparing new data against those past patterns. And when those cases are finally decided, the results are fed back into the system as more data to be analyzed and compared yet again.
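The feedback loop described above can be sketched in miniature. The toy model below simply tallies past outcomes for a given judge and case type and predicts the majority result, feeding each decided case back in as new training data. All names, categories, and outcomes here are hypothetical, and real systems use far more sophisticated statistical models; this only illustrates the observe–predict–feedback cycle.

```python
from collections import Counter, defaultdict

class OutcomePredictor:
    """Toy sketch of outcome prediction: tally past rulings per
    (judge, case_type) pair and predict the majority outcome.
    All data below is hypothetical."""

    def __init__(self):
        # Maps (judge, case_type) -> Counter of observed outcomes.
        self.history = defaultdict(Counter)

    def observe(self, judge, case_type, outcome):
        # Feed a decided case back into the model as new data.
        self.history[(judge, case_type)][outcome] += 1

    def predict(self, judge, case_type):
        # Return the most common past outcome, or None if no
        # comparable cases have been observed yet.
        counts = self.history[(judge, case_type)]
        return counts.most_common(1)[0][0] if counts else None

# Hypothetical past tax-dispute rulings
model = OutcomePredictor()
model.observe("Judge A", "tax", "taxpayer wins")
model.observe("Judge A", "tax", "taxpayer wins")
model.observe("Judge A", "tax", "government wins")
print(model.predict("Judge A", "tax"))  # -> taxpayer wins
```

Even this crude frequency count shows why the data is sensitive: the model's predictions are driven entirely by an individual judge's track record.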
According to Bloomberg Law, one particular company that currently works with Canadian clients dealing with tax disputes enjoys a success rate of about 90%. In other words, their software correctly predicts the outcome of tax disputes 90% of the time.
Attorneys and Judges Are Wary
In order to be that accurate, the software needs data pertaining to both attorneys and judges. It needs to know what actions attorneys take in tax dispute cases. It needs to know how judges rule based on certain kinds of evidence. All of this has attorneys and judges worried, and rightly so.
Neither attorneys nor judges are necessarily comfortable with algorithms tracking their actions and decisions. Some fear liability in the event they act opposite of what software predicts. Others consider it an invasion of their professional privacy. But there is an even bigger concern: the threat of judge shopping.
Judge shopping is already problematic enough in this country. It occurs when attorneys steer cases to specific jurisdictions because they know the judges there will be more favorable to them. It is a practice that runs completely contrary to the ideal that Lady Justice is blind.
Not Good for Citizens Either
Those who side with attorneys and judges recognize that predictive analytics wouldn't be good for citizens either. Our system already struggles with very real inequities in justice between the rich and the poor. Put bluntly, if you are rich enough to afford the best attorneys, you are rich enough to buy a win.
Including data relating to attorneys and judges in predictive analytics algorithms would make the existing gap even wider by giving those with means the ability to predict case outcomes in advance. Don’t like the outcome? Then take the case to a different jurisdiction. Don’t like the attorney? Not a problem. You can afford to hire another one.
Analytics does have plenty of uses in the legal sector, according to the people behind the NuLaw legal case management application. Their application utilizes some of those analytics for things like marketing and case acquisition. But using analytics technologies to predict case outcomes seems to be going at least a bit too far.