Migrated from eDJGroupInc.com. Author: Greg Buckles. Published: 2014-11-16 19:00:00. Format, images and links may no longer function correctly.

One of the fun parts about publishing a new market perspective report is the hours of calls where I get to see what resonated with colleagues like Michael Simon and others. I don’t expect critical readers to agree with all of my conclusions. In fact, I enjoy the counterpoints as much as the congratulations after a long research cycle. One point keeps coming up that is worth extracting from my 26-page report. The report calls out pre-review processing and organization as the current market sweet spot for analytics, mainly because of the strong resistance to actual machine learning (PC/TAR) review methodologies reported in my expert interviews. So is there a real difference between a technician filtering collections based on clustered concepts or rules and an attorney training a predictive coding system? The market definitely sees them differently. Counsel rarely wrangle over the exclusion of lunch notices, Facebook messages or other non-relevant categories, when those exclusions are even disclosed. Why not?

Survey respondents clearly perceive deduplication, conversation consolidation (elimination of prior thread segments) and even near-duplicate elimination as part of ‘normal’ processing these days. Sharp project managers have been using concept clustering, social network visualizations, timeline statistics and other analytics to group, prioritize and kill off the ‘junk’ for over a decade without serious resistance. Even further upstream, parties use relatively arbitrary search terms to collect email and files from archives and indexed repositories without having to disclose the non-relevant items excluded from their early interview collections. I definitely see a double standard being applied, based primarily on marketing terms and the potential loss of review revenue.

My real point here is that counsel should apply the same level of scrutiny and transparency to what happens BEFORE the review as they seem to require when the other side wants to try PC/TAR. The challenge of big data will only grow as more corporations migrate their key ESI to Office 365 and other cloud repositories. All of these systems provide indexed search for collection, and we are starting to see early analytics from the major players and their partners. The easiest place to control the cost and risk of discovery is at the source. This puts pressure on corporate legal and their retained counsel to aggressively exclude, filter and apply narrow relevance criteria. This is where analytics can really deliver value if leveraged in a mature, transparent process. The majority of the market has concerns about using analytics to make relevance decisions without ‘eyes on the docs’, yet parties are already doing exactly that prior to review. A healthy discovery process needs both innovation and critical scrutiny at every step.

Greg Buckles can be reached at Greg@eDJGroupInc.com for offline comment, questions or consulting. His active research topics include analytics, mobile device discovery, the discovery impact of the cloud, Microsoft’s 2013 eDiscovery Center and multi-matter discovery. Recent consulting engagements include managing preservation during enterprise migrations, legacy tape eliminations, retention enablement and many more. Blog market perspectives are personal opinions and should not be interpreted as a professional judgment.
