
E-Discovery Process Improvement: The After-Action Audit

Process improvement is an ongoing effort to identify what’s working well and what could be improved. In e-discovery, it’s sometimes painfully obvious when things didn’t work well, e.g., a production deadline is missed or sensitive data is produced. What could be improved is less obvious – it’s hard to spot potential improvements when results match what people thought was achievable.

After-action audits can be eye-opening in identifying ways to improve the e-discovery process. However, the term “improve” is rather broad; more specific goals will provide better guidance for the audit. Jeff Carr, a long-time legal cost expert, recommends SMART goals – those that are Specific, Measurable, Achievable, Realistic, and Timely – e.g.:

  • Lower outside counsel document review fees by 20%.
  • Lower hosting fees for e-discovery review platform vendors by 30%.
  • Identify cost-effective ways to get early looks at potential discovery from the very start of potential litigation.

One process I find useful is to select a case that is representative, in complexity and scope, of those the client ordinarily encounters, and reprocess the same documents using alternative tool sets. Using actual case data has several advantages:

  • Proving scalability of alternative tool sets. Some tools look nice on small, curated demo data sets (does “Enron” sound familiar?) but don’t scale well for large collections.
  • Identifying “gotchas” in alternative tools. There can be idiosyncrasies in data sets that cause problems in some tools. Nothing identifies these problems like running actual client data.
  • Validating the original technology. Search and analytics tools that perform similar functions may not produce the same results, e.g., some full-text search software may have problems indexing specific document types. The audit provides a way to identify such weaknesses.

In an ideal world, there would be production notes detailing the tools used to achieve the original volume reduction and the decisions that were made, along with bills from attorneys, review providers, hosting providers, and software providers. All that detail provides a baseline for comparison.

Audit Deliverables

The audit report should cover:

  • Alternative Techniques. What techniques and tools could have provided the same functionality in terms of eliminating irrelevant content and identifying relevant content, but at a lower cost? For example, social network analysis, key term logic testing, concept clustering, visual similarity, and other functions are available in a variety of software packages that can be provided without per-gigabyte processing fees.
  • Dollar Impact of Alternative Tools. Culling irrelevant content early in the process saves considerable money downstream, e.g., reduced hosting fees, and reduced attorney review time. The report can estimate those savings.
  • Recommended Training. What training should be provided to either make better use of existing tools or to use new tools?
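The dollar-impact estimate above is simple arithmetic, and a small sketch can make the model concrete. The function below estimates downstream savings from culling irrelevant data before it reaches hosting and review; every rate and volume in it is an illustrative assumption, not actual vendor pricing.

```python
# Hypothetical cost model: estimate downstream savings from culling
# irrelevant data early. All rates and volumes are illustrative
# assumptions, not real vendor or law-firm pricing.

def estimate_savings(collected_gb, cull_rate, hosting_per_gb_month,
                     hosting_months, docs_per_gb, review_cost_per_doc):
    """Return estimated dollars saved by removing culled data
    before it incurs hosting fees and attorney review time."""
    culled_gb = collected_gb * cull_rate
    hosting_saved = culled_gb * hosting_per_gb_month * hosting_months
    review_saved = culled_gb * docs_per_gb * review_cost_per_doc
    return hosting_saved + review_saved

# Example: 500 GB collected, 60% culled early, $15/GB/month hosting
# for 12 months, 3,000 docs per GB, $1 per document reviewed.
savings = estimate_savings(500, 0.60, 15, 12, 3000, 1.00)
print(f"${savings:,.0f}")  # → $954,000
```

Even rough inputs like these make the point the report needs to make: review cost, not hosting, usually dominates, so every gigabyte culled early is multiplied by the per-document review rate downstream.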


The direct costs of the after-action audit needn’t be very large when the tools used for auditing are provided on a flat- or no-fee basis, i.e., not charged on a per-GB, per-user, or per-search basis. Most audits can be performed using low-cost cloud storage or existing consultant infrastructure.
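The point about fee structure can also be shown with two lines of arithmetic. The comparison below contrasts a per-GB tool with a flat-fee license for a modest audit data set; the figures are hypothetical assumptions chosen only to illustrate why flat- or no-fee tooling keeps audit costs small.

```python
# Hypothetical fee comparison for an after-action audit.
# Rates and volumes below are illustrative assumptions.

def per_gb_cost(gb, rate_per_gb):
    """Total cost of a tool charging per gigabyte processed."""
    return gb * rate_per_gb

def flat_cost(monthly_fee, months):
    """Total cost of a flat-fee license over the audit period."""
    return monthly_fee * months

audit_gb = 400
print(per_gb_cost(audit_gb, 25))  # per-GB tool at $25/GB → 10000
print(flat_cost(2000, 2))         # flat license, 2-month audit → 4000
```

Under these assumed numbers, the per-GB tool costs more than double the flat-fee license, and the gap widens with every additional gigabyte of case data reprocessed.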


Audits needn’t take a long time to complete. Large savings are usually quickly obvious, and useful, actionable data can be available within about a month.

Further Reading: