
The Three-Legged E-Discovery Model

In-house counsel typically meet their e-discovery needs by partnering with a hosting vendor that processes and hosts data for major cases and that generally bills based on the number of gigabytes or files processed. However, the best way to manage costs is to add a third role – a consultant who can suggest alternative technologies and processes to limit overall costs.

The tripartite model recognizes an inherent conflict of interest: hosting vendors maximize their revenue by maximizing the volume of data they process and host. Their revenues are their corporate clients’ costs – literally two sides of the same coin. When bonuses and commissions for hosting vendor personnel are based on the volume hosted, the natural inclination is to recommend a “collect-everything-and-sort-it-out-in-the-review-platform” approach.

The bias toward maximizing the volume of data held in the review platform extends beyond how much data is put into the system; it can also affect how that data is stored. For example, electronic documents such as Word, PowerPoint, and Excel files can contain embedded graphics. Those embedded graphics can be extracted and stored as separate files even though, in many cases, they merely clutter the review database and needlessly inflate the storage space billed by the hosting vendor.
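The storage impact of embedded graphics is easy to verify, because modern Office formats (.docx, .pptx, .xlsx) are ordinary ZIP containers with embedded media stored as separate entries. The sketch below (a minimal illustration using Python’s standard library, with an in-memory stand-in file rather than real case data) totals the bytes of embedded media that a vendor might extract and bill as separate files:

```python
import io
import zipfile

def embedded_media_size(docx_file):
    """Sum the uncompressed size of media embedded in an Office Open XML
    file. These formats are ZIP containers: Word keeps embedded graphics
    under word/media/, PowerPoint under ppt/media/, Excel under xl/media/."""
    with zipfile.ZipFile(docx_file) as zf:
        return sum(i.file_size for i in zf.infolist() if "/media/" in i.filename)

# Build a tiny stand-in .docx in memory for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("word/document.xml", "<w:document/>")          # 13 bytes of "text"
    zf.writestr("word/media/image1.png", b"\x89PNG" + bytes(100))  # 104-byte "image"

print(embedded_media_size(buf))  # 104
```

Run against a real collection, a count like this can show how much of the billed hosting volume is embedded media rather than reviewable text.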

Hosting vendors understandably focus their attention on ways to use their existing systems and may not take the time to learn about new technologies or new pricing models that could lower clients’ costs but also lower their own revenue. Vendors have finite resources to devote to market research, especially if they’ve already made multi-year commitments to pay hefty licensing fees for their current offerings. Furthermore, some software licenses restrict vendors’ ability to share benchmarking data comparing the effectiveness of their primary technology with results obtained from other sources.


Case-Sensitive Searching. We recently conducted a post-action audit of e-discovery for a client and found that the initial selection of files for review in the final hosting platform turned on a key term that was an all-caps acronym, e.g., ACT. The hosting vendor dutifully collected the matching files without knowing or disclosing that the key term search could have specified case sensitivity, which would have excluded many files containing only the lower-case (“act”) or mixed-case (“Act”) versions of the word – files that were never responsive.
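The difference is easy to see in miniature. The sketch below (a hypothetical illustration using Python’s standard regex library – not the vendor’s actual search syntax) shows how a case-insensitive search for an acronym sweeps in documents a case-sensitive search would properly skip:

```python
import re

docs = [
    "The ACT committee approved the budget.",  # responsive: the acronym
    "Congress passed the Clean Water Act.",    # not responsive: mixed case
    "We must act quickly on this matter.",     # not responsive: lower case
]

# Case-insensitive matching (a common platform default) hits all three.
broad = [d for d in docs if re.search(r"\bACT\b", d, re.IGNORECASE)]

# Case-sensitive matching on the all-caps acronym hits only the first.
narrow = [d for d in docs if re.search(r"\bACT\b", d)]

print(len(broad), len(narrow))  # 3 1
```

On a real collection, the gap between the broad and narrow result sets is data that never needed to be collected, hosted, or reviewed.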

Analytics/Predictive Coding. Hosting vendors often tout the efficiency of their analytics and predictive coding technology in eliminating clutter. And they’re right: analytics and predictive coding/Technology Assisted Review are very effective. However, that type of technology can be licensed on terms other than per-gigabyte fees. Concept clustering, social network analysis, and other tools can be used iteratively to cull the non-relevant “noise” documents before the data goes to hosting vendors – and without paying volume-based fees.
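The underlying idea – grouping similar documents so entire clusters of noise can be culled before hosting – can be sketched with a toy near-duplicate filter. The version below (a deliberately simplified illustration, not any commercial product’s algorithm; the documents and the 0.9 threshold are hypothetical) uses bag-of-words cosine similarity from the standard library:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def cull_near_duplicates(docs, threshold=0.9):
    """Keep the first document of each group of near-duplicates."""
    kept, vectors = [], []
    for doc in docs:
        vec = Counter(doc.lower().split())
        if all(cosine(vec, v) < threshold for v in vectors):
            kept.append(doc)
            vectors.append(vec)
    return kept

kept = cull_near_duplicates([
    "draft merger agreement version two",
    "draft merger agreement version two",   # exact duplicate, culled
    "weekly cafeteria menu announcement",   # unrelated, kept
])
print(len(kept))  # 2
```

Commercial clustering tools are far more sophisticated, but the economics are the same: every document culled here never incurs a hosting or review fee.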

Having a knowledgeable consultant who is not compensated primarily based on the volume processed or hosted can result in the selection of more cost-effective e-discovery solutions.


E-Discovery Process Improvement: The After-Action Audit

Process improvement involves an ongoing effort to identify what’s working well and what could be improved. In e-discovery, it’s sometimes painfully obvious when things didn’t work well, e.g., a production deadline is missed or sensitive data is produced. However, it’s not always obvious what could be improved – it’s hard to identify potential improvements if results roughly match what people thought was achievable.

After-action audits can be eye-opening in identifying ways to improve the e-discovery process. However, the term “improve” is rather broad; more specific goals will provide better guidance for the audit. Jeff Carr, long-time legal cost expert, recommends SMART goals – those that are Specific, Measurable, Achievable, Realistic, and Timely, e.g.:

  • Lower outside counsel document review fees by 20%.
  • Lower hosting fees for e-discovery review platform vendors by 30%.
  • Identify cost-effective ways to get early looks at potential discovery from the very start of potential litigation.

One process I find useful is to select a case representative, in complexity and scope, of those the client ordinarily encounters, and to reprocess the same documents using alternative tool sets. Using actual case data has several advantages:

  • Proving scalability of alternative tool sets. Some tools look nice on small, select demo data sets (does “Enron” sound familiar?) but don’t scale well for large collections.
  • Identifying “gotchas” in alternative tools. There can be idiosyncrasies in data sets that cause problems in some tools. Nothing identifies these problems like running actual client data.
  • Validating the original technology. Search and analytics tools that perform similar functions may not produce the same results, e.g., some full-text search software may have problems indexing specific document types. The audit provides a way to identify potential weaknesses.

In an ideal world, there would be production notes detailing the tools used to achieve the original volume reduction and the decisions that were made, along with bills from attorneys, review providers, hosting providers, and software providers. All that detail provides a baseline for comparison.

Audit Deliverables

The audit report should cover:

  • Alternative Techniques. What techniques and tools could have provided the same functionality in terms of eliminating irrelevant content and identifying relevant content, but at a lower cost? For example, social network analysis, key term logic testing, concept clustering, visual similarity, and other functions are available in a variety of software packages that can be provided without per-gigabyte processing fees.
  • Dollar Impact of Alternative Tools. Culling irrelevant content early in the process saves considerable money downstream, e.g., reduced hosting fees, and reduced attorney review time. The report can estimate those savings.
  • Recommended Training. What training should be provided to either make better use of existing tools or to use new tools?
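The dollar-impact estimate in the report is simple arithmetic once a few inputs are fixed. The sketch below models it with hypothetical placeholder rates (the $10/GB/month hosting fee, 5,000 documents per GB, $250/hour review rate, and 50 documents reviewed per hour are illustrative assumptions, not market figures – an actual audit would substitute the client’s billed rates):

```python
def downstream_savings(gb_culled,
                       hosting_fee_per_gb_month=10.0,  # hypothetical rate
                       hosting_months=12,
                       docs_per_gb=5000,               # hypothetical density
                       review_rate_per_hour=250.0,     # hypothetical rate
                       docs_per_hour=50):
    """Estimate downstream savings from culling data before hosting/review."""
    hosting = gb_culled * hosting_fee_per_gb_month * hosting_months
    review = gb_culled * docs_per_gb / docs_per_hour * review_rate_per_hour
    return {"hosting": hosting, "review": review, "total": hosting + review}

est = downstream_savings(gb_culled=100)
print(est["total"])  # 2512000.0
```

Even with conservative inputs, the exercise usually shows that attorney review time, not hosting, dominates the savings – which is why culling early matters so much.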


The direct costs of the after-action audit needn’t be very large when the tools used for auditing are provided on a flat- or no-fee basis, i.e., not charged on a per-GB, per-user, or per-search basis. Most audits can be performed using low-cost cloud storage or existing consultant infrastructure.


Audits needn’t take a long time to complete. Large savings are usually quickly obvious, and useful, actionable data can be available within about a month.
