Migrated from eDJGroupInc.com. Author: Greg Buckles. Published: 2010-02-10 17:00:22

Barry Murphy called out 2010 as "the year of ECA". Certainly those three letters were found on almost every booth, even if the exhibitors did not exactly agree on what they entail. The majority of the processing players seem to consider canned reports and inventory functionality to provide ECA support, but the important trend is that everyone is looking to provide upstream decision support.

We know that attorney review consumes the largest slice of the discovery budget pie. That cost pressure has been the driving force this year behind Legal Process Outsourcing (LPO), which has drawn large filing and patent service providers into the litigation support and review market. But software providers have realized that the fastest ROI lies in managing the volume of what falls within the definition of relevance prior to 'collection'. Thus we have archiving platforms, search appliances and everything else touting themselves as 'ECA Solutions'.

This reminded me of how 'End to End Discovery Solution' dominated the booths at LTNY 2007. Anyone who thinks that eDiscovery has any kind of technology 'Solution' has never actually been involved in large-scale civil litigation. Still, it was good to see products with workflow, dashboards and tracking metrics slowly reach the market. For too long, providers have competed to see who could bolt on the most features to win the RFP without thinking through how users would integrate these different functions. My favorite examples right now are all the review platforms that have added 'concept clustering' as an add-on option by leveraging Content Analyst or OrcaTec to populate a field in their database.
A talented project manager or litsupport analyst can use these fields to add value during culling, review or quality assurance, but that does not meet the market expectation of dynamic conceptual feedback integrated into the iterative cycles of identification through production.

Some of the new, useful ECA features included a dynamic 'phrase frequency' view in Recommind's Axcelerate, logged false positive/negative sampling in several products and a couple of social network visualizations from products like Nuix 3. dtSearch and other index applications have long offered term frequency reports that have been invaluable for creating and negotiating Boolean relevance search criteria. Getting a dynamic list of common phrases is much more specific and yet still digestible by non-technical counsel.

Sampling and QA/QC in general have been woefully neglected outside of dedicated review platforms like Attenex Patterns or early scanning software like IPRO's Premium Scan. Face it, defensibility is not sexy and does not sell product compared to the unqualified promises of massive ROI associated with common-sense de-duplication and filtering. But all of those cost savings come at the risk of sanctions and public chastisement without an effective, documented QA/QC process. So it was nice to see these essential features integrated into the vendors' scenarios. Yet another sign that our industry is slowly, painfully growing up.
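For readers curious what a 'phrase frequency' view boils down to, here is a minimal sketch in Python. To be clear, this is an illustration of the general n-gram counting idea only, not how Axcelerate or dtSearch actually implement it, and the function name and sample documents are invented for the example:

```python
from collections import Counter
import re


def phrase_frequencies(docs, n=2, top=5):
    """Return the `top` most common n-word phrases across a document set."""
    counts = Counter()
    for doc in docs:
        # Crude tokenization: lowercase words and apostrophes only.
        words = re.findall(r"[a-z']+", doc.lower())
        # Slide an n-word window across the document and tally each phrase.
        counts.update(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
    return [(" ".join(gram), c) for gram, c in counts.most_common(top)]


docs = [
    "The merger agreement was signed before the merger agreement deadline.",
    "Counsel reviewed the merger agreement and the escrow terms.",
]
# Phrases like "merger agreement" surface immediately, which is the kind of
# digestible signal counsel can use when negotiating Boolean search terms.
print(phrase_frequencies(docs, n=2, top=3))
```

A real ECA tool would of course work against an index of millions of documents and update these counts dynamically as culling criteria change; the point is simply that a ranked phrase list is far more specific than single-term frequency counts while remaining readable by non-technical counsel.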
