Migrated from eDJGroupInc.com. Author: Greg Buckles. Published: 2012-08-01 10:28:48
Chris La Cour’s second annual Carmel Valley eDiscovery Retreat (CVEDR) has just wrapped up three days of topical, interactive discussions in beautiful, relaxed Monterey, California. Many of us who attended last year’s retreat knew that this event is a perfect opportunity to gently introduce your significant other to the eDiscovery social scene, away from the crazy crowds and business intensity of Legal Tech New York. The Monterey Bay Aquarium, kayaking and other attractions kept them happily occupied while we debated questions like, “Are you obligated to disclose Technology Assisted Review (TAR) or other analytics if they are used to make relevance decisions?” It was great to see many familiar top speakers and providers, but I do hope that more corporations and law firms just getting started realize how valuable these smaller events are to new eDiscovery professionals. CVEDR tries to avoid ‘death by PowerPoint’ and the classic bobble-headed panelists reading from notes in those familiar droning Charlie Brown teacher voices. Instead, the event has four themed tracks, each with five related topic sessions. That’s a lot of content and expertise crammed into two and a half days.
The Monday keynote by the Honorable Nora B. Fischer covered handling eDiscovery disputes before the bench with a new special master program. Craig D. Ball gave us an animated, entertaining keynote on Tuesday full of special effects, scary facts and even his own race-caller interpretation of recent predictive coding cases before the bench. For those of you who have followed my recent blogs on cell phone forensics, Craig added a couple of interesting iPhone ESI issues that I will be testing out and writing about soon.
Moderating the Analytics track sessions kept me pretty busy, but EVERYONE attended the judicial panel on predictive coding and validation moderated by Karl Schieneman. Judge Peck was careful as always to steer clear of any comments on active cases, but his clear and simple observations actually stole some of the wind from my poor panelists for our session “TAR – Are eyes on every item needed?” An impressive fact we learned about Judge Fischer: she personally reviews all disputed discovery items before her bench. Joining the discussion via conference call, Judges Facciola, Maas and Waxse pulled no punches in their opinions on search terms, predictive coding for review and various uses of TAR methods. This was one session that you did not want to miss.
Paraphrased takeaways from the sessions (any mistaken quotes or attributions are my fault):
- I’m not documenting my decisions for myself, but for those who come behind me – Chris Spizzirri, Morris James
- A centralized, flexible eDiscovery platform makes it much easier to record the who, what, when and why of eDiscovery decisions – Bob Rohlf, Exterro
- Communication and close supervision are essential to prevent IT decisions from straying across the boundary of legal decisions – Melissa Frank, eDiscovery counsel
- You cannot understand the value or ROI of eDiscovery technology unless you measure the cost and time of eDiscovery – Deborah Baron, Autonomy, an HP company
- Technology without people and process is not a solution – John Reikes, Kroll Ontrack
- Yes, we keep and use culling/search filters across matters – 90% audience response in the Global Culling session
- Sampling for collection and culling is very different from quality control sampling after review – Michael Wade, Planet Data
- TAR use in eDiscovery is young and you should anticipate questions from opposing counsel on your process and tools – Matthew Nelson, Clearwell, now a part of Symantec
- You may not be obligated to disclose TAR, but non-disclosure can be painful in clawback or missing smoking gun scenarios – one of our fabulous magistrates
- Some of the best documents I’ve found are the ones that I did not know that I was looking for – George Socha
Carefully unattributed:
- The key is to get to the heart of the matter first and fastest
- In my testing, predictive coding systems missed up to 60% of relevant documents compared to prior human reviews
- Every vendor will let you run a proof of concept, so do it
- The problem is that when attorneys read about the low consistency and accuracy of human review, they do not think that it applies to them
These were just some of the memorable points and positions that I could recall. You had to be there to get the full impact of the sessions, and I was working this conference instead of covering it and taking notes. I hope that the gorgeous setting did not discourage you from asking to attend. The content and discourse were excellent and well worth the time.
The monkeys in the title pertain to the hysterical screaming monkey tchotchke toys from ZL Technologies and the ‘Get the eDiscovery Monkey off your back’ monkeys from BIA (which were grabbed up before I could get one). The monkey theme was reinforced by a 5-year-old throwing her beanbag monkey around the San Jose airport on our way home. I have attached my session decks below so that you can see what kinds of questions we tackled, and I encourage you to try more small events like this. The eDJ Group is participating in the Executive Counsel’s Exchange events as well as David Cowen’s breakfast events, so check out one of these coming to your city soon.
Analytics Track Session 1
Analytics Track Session 2
Analytics Track Session 3
Analytics Track Session 4
Analytics Track Session 5
More pictures from Monterey to encourage you for next year: