Nothing with law and technology combined comes with ease (or “E’s” as I put in the title).  It should come as no surprise, then, that one of the biggest challenges in the technology-assisted review (TAR) space is how to educate a critical mass of lawyers about how to work with newer technologies.

TAR methodologies require some comfort with statistics so that lawyers can validate results and confidently certify the completeness of their review and production.  After two years of pushing hard to get lawyers to accept these tools, I am convinced that we need less improvement in the tools today and more improvement in lawyers’ competency to run TAR projects successfully.
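The statistics involved are not exotic; the core skill is understanding what a random sample can and cannot tell you about the documents you did not review. As an illustration only (the figures and the elusion-style check below are my own hypothetical example, not any particular vendor’s or expert’s workflow), the short Python sketch estimates the rate of responsive documents left behind in the machine-classified “not responsive” pile from a simple random sample, with a confidence interval a lawyer could point to when certifying the review:

```python
# Hypothetical illustration: estimate the "elusion" rate -- responsive
# documents remaining in the discard pile -- from a simple random sample,
# with a Wilson score confidence interval. All numbers here are made up.
import math


def wilson_interval(hits: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion."""
    if n == 0:
        raise ValueError("sample size must be positive")
    p_hat = hits / n
    denom = 1 + z ** 2 / n
    center = (p_hat + z ** 2 / (2 * n)) / denom
    half_width = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2))
    return max(0.0, center - half_width), min(1.0, center + half_width)


# Hypothetical review: 1,500 documents sampled at random from the pile the
# tool marked "not responsive"; reviewers find 12 of them are responsive.
low, high = wilson_interval(hits=12, n=1500)
print(f"Estimated elusion rate: {12 / 1500:.2%} "
      f"(95% CI roughly {low:.2%} to {high:.2%})")
```

Whether that interval is acceptable is a legal judgment, not a statistical one, which is exactly why lawyers, not just technologists, need to be comfortable reading it.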

Educated lawyers will be able to use advanced TAR methodologies and deal better with uninformed adversaries, with potentially Luddite judges who hate discovery disputes, and with the issues that arise from the technology’s bugginess and quirks.  We all know there are times when data doesn’t get processed smoothly or people make mistakes.  I am completing a more detailed position paper on this topic within eDJ Group’s subscription research site, eDiscoveryMatrix.com, but want to share some of its key points in this and a follow-up article.

If the most pressing need to push TAR forward is more education (our first “E”), a logical question to ask is: where should one get educated in this field?

Ordinarily, one would get educated by looking to the experts to hold lawyers’ hands through initial projects while the ideal TAR protocols are developed.  To date, a number of proclaimed experts have spent the last few years touting the virtues of predictive coding and TAR.

Craig Ball recently discussed the education dilemma facing the legal community. One of the problems he pointed out, after identifying many of us by first name, is that we (the “experts”) appear to have biases toward particular approaches and technologies.  Craig says we should model our industry after the milk association and come up with “easy to understand” standards for a confused marketplace.

It’s hard to argue with the wholesomeness of milk, but I see a real problem with this approach.  While we may all be experts to some degree, I would classify us instead as entrepreneurs, each with a limited perspective drawn from trying different approaches and tools.

We understand how poor the current model is:

  • Guessing at key words,
  • Throwing bodies at review, and
  • Producing the curds (to use our milk analogy) that come out.

So the “experts,” armed from our various vantage points with data from sources such as TREC, the Blair and Maron study, and related follow-up research, went out into the field and have been trying to educate our brethren while simultaneously experimenting with new techniques on live projects.  Usually, the “experts” work with only a single preferred tool.

The result is that we are the risk takers in the legal field, willing to take a stand because we already know the model has to change.  However, we need to be careful to recognize that this expertise is still based on a narrow set of experiences across very few different systems.

It’s my personal opinion that this is what Craig was pointing out, and it is why confusing, diverse messages are coming from the experts on the best approaches, workflows, and tools.  To hear the differences of opinion, one need look no further than the excellent panel I put together and moderated at the Carmel Valley eDiscovery Retreat on July 24, 2012, where five other experts and fellow potential milk association members joined me in a discussion of best practices with predictive coding.  Maura Grossman, Tom Gricks, Bennett Borden, Herb Roitblat, and Dave Lewis had a spirited dialogue about different workflows and approaches using analytical review tools, including predictive coding.  The panel’s preferences and biases come out clear as day in this podcast: hear the Predictive Coding Power User Panel from Carmel.

I will continue to discuss the “E’s” of predictive coding in my next post.

eDiscoveryJournal Contributor Karl Schieneman
