Category Archives: Enterprise eDiscovery

Lawson v. Spirit Aerosystems: Federal Court Blasts “Bloated” ESI Collection That Rendered TAR Ineffective

By John Patzakis

Technology Assisted Review (TAR), when correctly employed, can significantly reduce legal review costs while delivering generally more accurate results than traditional legal review processes. However, the benefits associated with TAR are often undercut by the over-collection and over-inclusion of Electronically Stored Information (ESI) in the TAR process. These challenges played out in spades in the recent decision in Lawson v. Spirit Aerosystems, where a Kansas federal judge issued a detailed ruling outlining the parties’ eDiscovery battles, their use of TAR, and whether further TAR costs should be shifted to the Plaintiff. The ex-CEO of Spirit Aerosystems brought suit accusing Spirit of unlawfully withholding $50 million in retirement benefits over his alleged violation of a non-compete agreement.


The Lawson court outlined two ways in which ESI over-collection can detrimentally impact TAR. First, the more data introduced into the process, the higher the cost and burden. Some practitioners believe it is necessary to over-collect, and subsequently over-include, ESI and let the TAR process sort everything out. Many service providers charge by volume, so there can be economic incentives that conflict with what is best for the end client. In some cases, the significant cost savings realized through TAR are erased by eDiscovery costs associated with overly aggressive ESI inclusion on the front end. Per the judge in Lawson, “the TAR set was unnecessarily voluminous because it consisted of the bloated ESI collection” due to overbroad collection parameters.

The court also outlined how the TAR process is much more effective when the initial set of data has a higher richness (also referred to as “prevalence”) ratio. In other words, the higher the rate of responsive data in the initial data set, the better. It has always been understood that document culling is very important to successful, economical document review, and that includes TAR. As noted by the Lawson court, “the ‘richness’ of the dataset…can also be a key driver of TAR expenses. This is because TAR is not as simple as loading the dataset and pushing a magic button to identify the relevant and responsive documents. Rather, the parties must devote the resources (usually a combination of attorneys and contract reviewers) necessary to ‘educate’ or ‘train’ the predictive algorithm, typically through an ongoing process…” According to the court’s decision, the inefficiencies in the process resulted in an estimated TAR bill of $600,000 for the review of approximately 200 GB of data. That is far too expensive for TAR to be feasible as a standard litigation process, and the problems all started with the “bloated” ESI collection.
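To see why richness matters, some back-of-the-envelope arithmetic helps. The sketch below is purely illustrative, and none of its figures come from the Lawson record: with random sampling, reviewers must examine roughly n/p documents to surface n responsive training examples, so human effort climbs steeply as prevalence (p) falls.

```python
# A minimal sketch of why dataset "richness" (prevalence) drives TAR cost.
# All figures are illustrative assumptions, not numbers from the Lawson record.

TRAINING_EXAMPLES_NEEDED = 2_000   # responsive docs needed to train the model
COST_PER_DOC = 1.00                # blended human review cost per document, USD

for prevalence in (0.50, 0.25, 0.05, 0.01):
    # With random sampling, reviewers examine roughly n/p documents to
    # surface n responsive training examples.
    docs_reviewed = TRAINING_EXAMPLES_NEEDED / prevalence
    print(f"prevalence {prevalence:>4.0%}: review ~{docs_reviewed:>9,.0f} docs"
          f" (~${docs_reviewed * COST_PER_DOC:,.0f})")
```

At 50 percent richness the team reviews about 4,000 documents to gather its training examples; at 1 percent, about 200,000. Culling before TAR moves a matter toward the top of that table.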

To be sure, the volume of ESI is growing exponentially and will only continue to do so. The costs associated with collecting, processing, reviewing, and producing documents in litigation are the source of considerable pain for litigants, including the Plaintiff in Lawson, who will, per the court’s ruling, bear a substantial portion of the TAR bill under the cost-shifting order. The only way to reduce that pain to its minimum is to use all available tools, in all appropriate circumstances and within the bounds of reasonableness and proportionality, to control the volumes of data that enter the discovery pipeline, including TAR.

Ideally, an effective and targeted collection capability enables parties to ultimately process, host, review and produce less ESI. This capability should support pre-collection early case assessment (ECA) to foster cooperation and proportionality in discovery by informing the parties early in the process about where relevant ESI is located and what ESI is significant to the case. And with such benefits comes a much-improved TAR process. X1 Distributed Discovery (X1DD) uniquely fulfills this requirement with its ability to perform early case assessment before collection, instead of after the costly, time-consuming and disruptive collection phase, thereby providing a game-changing new approach to the traditional eDiscovery model. X1DD enables enterprises to quickly and easily search across hundreds of distributed endpoints from a central location. This allows organizations to easily perform unified complex searches across content, metadata, or both and obtain full results in minutes, enabling true pre-collection ECA with live keyword analysis and distributed processing and collection in parallel at the custodian level. This dramatically shortens the identification/collection process by weeks if not months, curtails processing and review costs by not over-collecting data, and provides confidence to the legal team with a highly transparent, consistent and systemized process. And now we know of another key benefit of an effective collection and ECA process: much more accurate and feasible technology assisted review.


Filed under Best Practices, Case Law, Case Study, collection, ECA, eDiscovery, Enterprise eDiscovery, ESI

How to Implement an Effective eDiscovery Search Term Strategy

By Mandi Ross and John Patzakis

A key Federal Rules of Civil Procedure provision that greatly impacts eDiscovery processes is Rule 26(f), which requires the parties’ counsel to “meet and confer” in advance of the pre-trial scheduling conference on key discovery matters, including the preservation, disclosure and exchange of potentially relevant electronically stored information (ESI). With the risks and costs associated with eDiscovery, this early meeting of counsel is a critically important means to manage and control the cost of eDiscovery, and to ensure relevant ESI is preserved.

A very good authority on the Rule 26(f) eDiscovery conference is the “Suggested Protocol for Discovery of Electronically Stored Information,” provided by then Magistrate Judge Paul W. Grimm and his joint bar-court committee. Under Section 8 of the Model Protocol, the topics to be discussed at the Rule 26(f) conference include: “Search methodologies for retrieving or reviewing ESI such as identification of the systems to be searched;” “the use of key word searches, with an agreement on the words or terms to be searched;” “limitations on the time frame of ESI to be searched;” and “limitations on the fields or document types to be searched.”

Optimizing the process of developing keyword searches, however, is no easy task, especially without the right technology and expertise. The typical approach of brainstorming a list of terms that may be relevant and running the search on the dataset to be reviewed results in a wide range of inefficiencies. Negotiations over proper usage of search terms can become onerous and contentious. Judges are often tasked with making determinations regarding the aptness of the methodology, and many are reluctant to do so. Thus, the use of outside expertise leveraging index-in-place technology is beneficial in building an effective and comprehensive search term strategy.

The courts agree. In Victor Stanley v. Creative Pipe, U.S. District Court Judge Paul Grimm explains, “Selection of the appropriate search and information retrieval technique requires careful advance planning by persons qualified to design effective search methodology.”

Building a sound search strategy is akin to constructing a building. First, lay the foundation with a clear understanding of the claims and defenses of the case and the types of documents that will support the legal strategy. Once a solid foundation is built, the structure of language, logical expressions, and metadata are blended as necessary to create an appropriate set of robust Boolean searches. These searches target the retrieval of responsive documents, and routinely achieve reductions of roughly 80 percent in the data volumes to be reviewed.

It’s quite simple: if a document does not contain the defined language, it is unlikely to be relevant. The best way to find the language specific to the claims and defenses is to create a linguistic narrative of the case. This not only helps construct a roadmap for a comprehensive strategy designed to reduce the volume of data, but also creates a thorough categorization system for organizing and prioritizing review. The approach is straightforward, flexible, and adaptive to client objectives, whether during early case assessment, linear or technology-assisted review, or anything in between.

The narrative search approach includes the following steps:

  1. Issue Analysis: Create an unambiguous definition of each issue that characterizes the claims being made and the defenses being offered.
  2. Logical Expression Definition: Define the specific expressions that encapsulate each issue. There may be multiple expressions required to convey the full meaning of the issue.
  3. Component Identification and Expansion: Distill each logical expression into specific components. These components form the basis for the expansion effort, which is the identification of words that convey the same conceptual meaning (synonyms); see the sketch following this list.
  4. Search Strategies: Determine the appropriate parameters to be used for proximity, as well as developing a strategy for searching non-standard, structured data, such as spreadsheets, non-text, or database files.
  5. Test Precision and Recall: In tandem with the case team, review small sample sets to refine the logical expression statements to improve precision and recall.
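To make steps 2 through 4 concrete, here is a small illustrative sketch. The issue, terms, and proximity window are hypothetical, and the w/N connector is dtSearch-style syntax that varies by platform:

```python
# Hypothetical sketch of steps 2-4: expanding issue components and their
# synonyms into Boolean proximity searches. Terms and the proximity window
# are illustrative examples only.

from itertools import product

def expand(components, window=10):
    """Pair every synonym of one component with every synonym of the others,
    joined by a proximity connector (w/N, dtSearch-style syntax)."""
    groups = list(components.values())
    return [
        f" w/{window} ".join(f'"{term}"' for term in combo)
        for combo in product(*groups)
    ]

# Hypothetical issue: solicitation in violation of a non-compete agreement.
issue_components = {
    "act":    ["solicit", "recruit", "poach"],
    "object": ["customer", "client", "account"],
}

for query in expand(issue_components):
    print(query)   # e.g., "solicit" w/10 "customer"
```

Step 5 then runs these candidate searches against sample sets, adjusting terms and proximity windows based on measured precision and recall.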

Putting this process into practice requires the right technology, one that enables its application in real time. The ability to index data in place is a game changer: it gives legal teams early insight into the data and validates search term sampling and testing instantly, without first requiring data collection. This stands in contrast to the outdated, costly, and time-consuming process of manual data collection and subsequent migration into a physical eDiscovery processing platform, which prevents counsel from meaningfully applying search term proportionality without first incurring significant expense and loss of time.

X1 Distributed Discovery enables enterprises to quickly and easily search across thousands of distributed endpoints from a central location. This allows organizations to easily perform unified complex searches across content, metadata, or both, and obtain full results in minutes, enabling true pre-collection search analytics with live keyword analysis and distributed processing and collection in parallel at the custodian level. This dramatically shortens the identification/collection process by weeks if not months, curtails processing and review costs by not over-collecting data, and provides confidence to the legal team with a highly transparent, consistent and systemized process.

This innovative narrative methodology, created by the experts at Prism Litigation Technology and delivered by an experienced consulting team leveraging cutting-edge technology, enriches common search terms by adding layers of linguistic and data science expertise to create a fully defensible, transparent, and cogent approach to eDiscovery. For more on this workflow, please see the white paper: Don’t Stop Believin’: The Staying Power of Search Term Optimization.


Mandi Ross is the CEO of Prism Litigation Technology (www.prismlit.com)

John Patzakis is Chief Legal Officer and Executive Chairman at X1 (www.X1.com)


Filed under Best Practices, ECA, eDiscovery, Enterprise eDiscovery, ESI, Preservation & Collection

Remote ESI Collection and Data Audits in the Time of Social Distancing

By John Patzakis

The vital global effort to contain the COVID-19 pandemic will likely disrupt our lives and workflows for some time. While our personal and business lives will hopefully return to normal soon, the trend of an increasingly remote and distributed workforce is here to stay. This “new normal” will necessitate relying on the latest technology and updated workflows to comply with legal, privacy, and information governance requirements.

From an eDiscovery perspective, the legacy manual collection workflow involving travel, physical access and one-time mass collection of custodian laptops, file servers and email accounts is a non-starter under current travel ban and social distancing policies, and does not scale for the new era of remote and distributed workforces. In addition to the public health constraints, manual collection efforts are expensive, disruptive and time-consuming, as an “overkill” forensic imaging process is often employed, substantially driving up eDiscovery costs.

When it comes to technical approaches, endpoint forensic crawling methods also fall short. Network bandwidth constraints, coupled with the requirement to migrate all endpoint data back to the forensic crawling tool, render the approach ineffective, especially with remote workers needing to VPN into a corporate network. Right now, corporate network bandwidth is at a premium, and the last thing a company needs is its network shut down by inefficient remote forensic tools.

For example, with a forensic crawling tool, to search a custodian’s laptop with 10 gigabytes of email and documents, all 10 gigabytes must be copied and transmitted over the network, where the data is then searched, all of which takes at least several hours per computer. So most organizations choose to force-collect all 10 gigabytes. U.S. ex rel. McBride v. Halliburton Co., 272 F.R.D. 235 (D.D.C. 2011), illustrates this specific pain point well. In McBride, Magistrate Judge John Facciola’s instructive opinion outlines Halliburton’s eDiscovery struggles to collect and process data from remote locations:

“Since the defendants employ persons overseas, this data collection may have to be shipped to the United States, or sent by network connections with finite capacity, which may require several days just to copy and transmit the data from a single custodian . . . (Halliburton) estimates that each custodian averages 15–20 gigabytes of data, and collection can take two to ten days per custodian. The data must then be processed to be rendered searchable by the review tool being used, a process that can overwhelm the computer’s capacity and require that the data be processed by batch, as opposed to all at once.”

Halliburton represented to the court that it spent hundreds of thousands of dollars on eDiscovery for only a few dozen remotely located custodians. The need to force-collect each remote custodian’s entire data set and then sort it out in the expensive eDiscovery processing phase, instead of culling, filtering and searching the data at the point of collection, is what drove up the costs.
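Some rough, purely illustrative arithmetic shows why this model breaks down. The bandwidth figure below is an assumption; the per-custodian volume tracks Halliburton’s own 15–20 gigabyte estimate:

```python
# Back-of-the-envelope math on copying every custodian's full data set over
# a VPN. Bandwidth is an assumption; the data volume tracks Halliburton's
# 15-20 GB per-custodian estimate from McBride.

BITS_PER_GB = 8 * 10**9        # decimal gigabyte, in bits
vpn_mbps = 10                  # assumed effective sustained throughput
custodian_gb = 15              # low end of the 15-20 GB estimate

hours_per_custodian = custodian_gb * BITS_PER_GB / (vpn_mbps * 10**6) / 3600
print(f"~{hours_per_custodian:.1f} hours per custodian")          # ~3.3 hours

# Thirty custodians contending for the same VPN concentrator, serialized:
print(f"~{hours_per_custodian * 30 / 24:.1f} days before processing begins")
```

Searching and culling at the endpoint inverts those numbers: if only a few percent of each custodian’s data is responsive, only that fraction ever crosses the network.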

Solving this collection challenge is X1 Distributed Discovery, which is specially designed to address the challenges presented by remote and distributed workforces. X1 Distributed Discovery (X1DD) enables enterprises to quickly and easily search across thousands of distributed endpoints and data servers from a central location. Legal and compliance teams can easily perform unified complex searches across both unstructured content and metadata, obtaining statistical insight into the data in minutes, and full results with completed collection in hours, instead of days or weeks. The key to X1’s scalability is its unique ability to index and search data in place, enabling a highly detailed and iterative search and analysis, and then collecting only the data responsive to those steps.

X1DD operates on-demand where your data currently resides — on desktops, laptops, servers, or even the cloud — without disruption to business operations and without requiring extensive or complex hardware configurations. After indexing of systems has completed (typically a few hours to a day depending on data volumes), clients and their outside counsel or service provider may then:

  • Conduct Boolean and keyword searches of relevant custodial data sources for ESI, returning search results within minutes by custodian, file type and location.
  • Preview any document in-place, before collection, including any or all documents with search hits.
  • Remotely collect and export responsive ESI from each system directly into a Relativity® or RelativityOne® workspace for processing, analysis and review, or into any other processing or review platform via a standard load file (a minimal load file sketch follows this list). Export text and metadata only, or full native files.
  • Export responsive ESI directly into other analytics engines, e.g. Brainspace®, H5® or any other platform that accepts a standard load file.
  • Conduct iterative “search/analyze/export-into-Relativity” processes as frequently and as many times as desired.
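For readers unfamiliar with the “standard load file” referenced above, the sketch below writes a minimal Concordance-style .dat file of the kind Relativity and most review platforms ingest. The field names and sample record are hypothetical:

```python
# A minimal sketch of a Concordance-style DAT load file, the "standard load
# file" most review platforms (including Relativity) accept. Field names and
# the sample record are hypothetical.

QUALIFIER = "\xfe"   # þ (ASCII 254), the customary text qualifier
DELIM = "\x14"       # ASCII 20, the customary column delimiter

def dat_row(values):
    return DELIM.join(f"{QUALIFIER}{v}{QUALIFIER}" for v in values)

fields = ["DOCID", "CUSTODIAN", "FILENAME", "MD5HASH", "EXTRACTEDTEXT"]
record = ["DOC000001", "jdoe", "q3_forecast.xlsx", "d41d8cd9...", "TEXT\\DOC000001.txt"]

with open("export.dat", "w", encoding="utf-8") as f:
    f.write(dat_row(fields) + "\n")
    f.write(dat_row(record) + "\n")
```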

To learn more about this capability purpose-built for remote eDiscovery collection and data audits, please contact us.


Filed under Best Practices, Case Law, Case Study, ECA, eDiscovery, eDiscovery & Compliance, Enterprise eDiscovery, ESI, Information Governance, Preservation & Collection, Relativity

Court Compels Forensic Imaging of Custodian Computer, Imposes Sanctions Due to Non-Defensible eDiscovery Preservation Process

By John Patzakis

HealthPlan Servs., Inc. v. Dixit, et al., 2019 WL 6910139 (M.D. Fla. Dec. 19, 2019), is an important eDiscovery case addressing what is required and expected of organizations to comply with electronic evidence discovery collection requirements. In this copyright infringement and breach of contract case, a Federal Magistrate Judge granted the plaintiff’s motion to compel immediate inspection of defendant employee Feron Kutsomarkos’s laptop after the defendants failed to properly preserve and collect evidence from her. The motion set forth specific improprieties in the defendants’ ESI preservation process. The Court also granted the plaintiff’s motion for fees, sanctions, and a punitive jury instruction.


There are several key takeaways from this case. Here are the top 5:

  1. Custodian Self-Collection Is Not Defensible

Ms. Kutsomarkos conducted her own search of the emails rather than having an expert or trained IT or legal staff, overseen by her attorney, perform the search. The court found this process not defensible, as the production “should have come from a professional search of the laptop” instead. This is yet another case disapproving of this faulty practice. For instance, another company found itself on the wrong end of a $3 million sanctions penalty for spoliation of evidence because it improperly relied on custodians to search and collect their own data. See GN Netcom, Inc. v. Plantronics, Inc., No. 12-1318-LPS, 2016 U.S. Dist. LEXIS 93299 (D. Del. July 12, 2016). Even with effective monitoring, severe defensibility concerns plague custodian self-collection, with several courts disapproving of the practice due to poor compliance and inconsistency of results. See Green v. Blitz, 2011 WL 806011 (E.D. Tex. Mar. 1, 2011); Nat’l Day Laborer Org. v. U.S. Immigration and Customs Enforcement Agency, 2012 WL 2878130 (S.D.N.Y. July 13, 2012).

  2. Producing Party Expected to Produce Their Own Data in a Defensible Manner

When responding to a litigation discovery request, the producing party is afforded the opportunity to produce its own data. However, the process must be defensible, with a requisite degree of transparency and validation. When an organization does not have a systematic and repeatable process in place, the risks and costs associated with eDiscovery increase exponentially. Good attorneys and the eDiscovery professionals who work with them will not only ensure their client complies with its own eDiscovery requirements, but will also scrutinize the opponent’s process and gain a critical advantage when the opponent fails to meet its obligations.

And that is what happened here. The corporate defendants had no real process other than telling key custodians to search and collect their own data. The eDiscovery-savvy plaintiff counsel filed motions poking large holes in the defendant’s process and won a likely case-deciding ruling. The stakes are high in such litigation matters and it is incumbent upon counsel to have a high degree of eDiscovery competence for both defensive and offensive purposes.

  3. Forensic Imaging Is the Exception, Not the Rule

The court compelled the forensic imaging of a defendant’s laptop, but only as a punitive measure after determining bad faith non-compliance. Section 8c of The Sedona Principles, Third Edition: Best Practices, Recommendations & Principles for Addressing Electronic Document Production, provides that: “Forensic data collection requires intrusive access to desktop, server, laptop, or other hard drives or media storage devices.” While noting the practice is acceptable in some limited circumstances, the commentary cautions that “making a forensic copy of computers is only the first step of an expensive, complex, and difficult process of data analysis . . . it should not be required unless circumstances specifically warrant the additional cost and burden and there is no less burdensome option available.” The duty to preserve evidence, including ESI, extends only to relevant information. Parties that comply with discovery requirements will avoid burdensome and risk-laden forensic imaging.

  4. Metadata Must Be Preserved

Metadata must be produced intact when designated by the requesting party, which is now commonplace. (See Federal Rule of Civil Procedure 34(b)(1)(C).) Metadata is often relevant evidence itself and is also needed for accurate eDiscovery culling, processing and analysis. In her production, counsel for defendant Kutsomarkos provided pdf versions of documents from her laptop. However, the court found that “the pdf files scrubbed the metadata from the documents and that metadata should be available on the hard drives.” There are defensible and very cost-effective ways to collect and preserve metadata. They were not used by the defendants, to their great detriment.
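In general terms, a defensible, metadata-preserving collection follows a pattern like the sketch below: capture the original filesystem timestamps, hash the native file for chain of custody, and keep an append-only audit log. This is a generic illustration with hypothetical paths, not any vendor’s actual implementation:

```python
# A generic sketch of metadata-preserving collection: capture original
# filesystem timestamps, hash the native file for chain of custody, and keep
# an append-only audit log. Paths are hypothetical; this illustrates the
# pattern, not any vendor's actual implementation.

import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

def collect(src: Path, dest_dir: Path, log_path: Path) -> None:
    stat = src.stat()                        # read metadata before copying
    sha256 = hashlib.sha256(src.read_bytes()).hexdigest()
    shutil.copy2(src, dest_dir / src.name)   # copy2 preserves timestamps
    entry = {
        "file": str(src),
        "sha256": sha256,
        "modified_utc": datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
        "collected_utc": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a") as log:         # append-only audit trail
        log.write(json.dumps(entry) + "\n")

collect(Path("laptop/contract_v2.docx"), Path("evidence"), Path("audit.jsonl"))
```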

  5. A Defensible But Streamlined Process Is Optimal

HealthPlan Services is yet another court decision underscoring the importance of a well-designed, cost-effective and defensible eDiscovery collection process. Such a capability is only attainable with the right enterprise technology. With X1 Distributed Discovery (X1DD), parties can perform targeted search and collection of the ESI of hundreds of endpoints over the internal network without disrupting operations. The search results are returned in minutes, not weeks, and thus can be highly granular and iterative, based upon multiple keywords, date ranges, file types, or other parameters. This approach typically reduces eDiscovery collection and processing costs by at least one order of magnitude (90%), bringing much-needed feasibility to enterprise-wide eDiscovery collection that can save organizations millions while improving compliance by maintaining metadata, generating audit logs and establishing chain of custody.

And in line with concepts outlined in HealthPlan Services, X1DD provides a repeatable, verifiable and documented process for the requisite defensibility. For a demonstration or briefing on X1 Distributed Discovery, please contact us.


Filed under Best Practices, Case Law, eDiscovery, Enterprise eDiscovery, ESI, Uncategorized

CaCPA Compliance Requires Effective Investigation and eDiscovery Capabilities

By John Patzakis

The California Consumer Privacy Act (CaCPA), which will be in full force on January 1, 2020, promises to profoundly impact major US and global organizations, requiring the overhaul of their data audit, investigation and information governance processes. The CaCPA requires that an organization know where all personal data of California residents is stored across the enterprise and be able to remove it when required. Many organizations with a global reach will be obligated to comply with both the GDPR and the CaCPA, providing ample justification to bolster their compliance efforts.


According to data security and privacy attorney Patrick Burke, who served as a senior New York State financial regulator overseeing cybersecurity compliance before heading up the data privacy law practice at Phillips Nizer, CaCPA compliance effectively requires a robust digital investigation capability. Burke, speaking in a webinar earlier this month, noted that under the “CaCPA, California residents can request that all data an enterprise holds on them be identified and also be removed. Organizations will be required to establish a capability to respond to such requests. Actual demonstrated compliance will require the ability to search across all data sources in the enterprise for data, including distributed unstructured data located on desktops and file servers.” Burke further noted that organizations must be prepared to produce “electronic evidence to the California AG, which must determine whether there was a violation of CaCPA…as well as evidence of non-violation (for private rights of action) and of a ‘cure’ to the violation.”

The CaCPA contains provisions similar to those of the GDPR; both specify processes and capabilities organizations must have in place to ensure the personal data of EU and California residents is secure, accessible, and identifiable upon request. These common requirements, enumerated below, can only be met through an effective enterprise eDiscovery search capability:

  • Data minimization: Under both the CaCPA and the GDPR, enterprises should collect and retain as little personal data on California residents and EU data subjects as possible. As an example, Patrick Burke, who routinely advises his legal clients on these regulations, notes that unauthorized “data stashes” maintained by employees on distributed unstructured data sources are a key problem, requiring companies to search all endpoints to identify information including European phone numbers, European email address domains and other personally identifiable information (see the sketch following this list).
  • Enforcement of right to be forgotten: An individual’s personal data must be identified and deleted on request.
  • Effective incident response: If there is a compromise of personal data, an organization must have the ability to perform enterprise-wide data searches to determine and report on the extent of such breaches and resulting data compromise within seventy-two (72) hours under the GDPR. There are less stringent, but similar CaCPA requirements.
  • Accountability: Log and provide audit trails for all personal data identification requests and remedial actions.
  • Enterprise-wide data audit: Identify the presence of personal data in all data locations and delete unneeded copies of personal data.
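As a concrete illustration of the data-audit point, the sketch below scans a file share for the kind of “data stashes” Burke describes, flagging likely EU phone numbers and European email domains. The patterns are deliberately rough, hypothetical examples; production PII detection requires far more robust rules:

```python
# A simplified sketch of a data-audit scan for the "data stashes" Burke
# describes: flag files containing likely EU phone numbers or email addresses
# at European domains. The patterns are rough, hypothetical examples only.

import re
from pathlib import Path

EU_PHONE = re.compile(r"\+(?:3\d|4\d)[\d\s().-]{7,13}\d")       # +30..+49 prefixes
EU_EMAIL = re.compile(r"[\w.+-]+@[\w.-]+\.(?:de|fr|it|es|nl|eu)\b", re.I)

def scan(root):
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = EU_PHONE.findall(text) + EU_EMAIL.findall(text)
        if hits:
            # In a real audit, hits would feed a review/remediation queue.
            print(f"{path}: {len(hits)} potential EU personal-data hits")

scan("file_server_share")
```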

Overall, a core requirement of both CaCPA and GDPR compliance is the ability to demonstrate and prove that personal data is being protected, requiring information governance capabilities that allow companies to efficiently produce the documentation and other information necessary to respond to auditors’ requests. Many consultants and other advisors are helping companies establish privacy compliance programs, and are documenting policies and procedures that are being put in place.

However, while policies, procedures and documentation are important, such compliance programs are ultimately hollow without consistent, operational execution and enforcement. CIOs and legal and compliance executives often aspire to implement information governance programs like defensible deletion and data audits to detect risks and remediate non-compliance. Without an actual and scalable technology platform to effectuate these goals, however, those aspirations remain just that. For instance, recent IDG research suggests that approximately 70% of information stored by companies is “dark data”: unstructured, distributed data that can pose significant legal and operational risks.

To achieve GDPR and CaCPA compliance, organizations must ensure that explicit policies and procedures are in place for handling personal information and, just as important, must be able to prove that those policies and procedures are being followed and operationally enforced. What has always been needed is immediate visibility into unstructured, distributed data across the enterprise: the ability to search and report across several thousand endpoints and other unstructured data sources, and to return results within minutes instead of days or weeks. The urgency of CaCPA and GDPR compliance only heightens the need for such an operational capability, delivered by best-practices technology.

A link to the recording of the recent webinar “Effective Incident Response Under GDPR and CaCPA”, is available here.



Filed under CaCPA, compliance, Data Audit, eDiscovery, eDiscovery & Compliance, Enterprise eDiscovery, GDPR, Records Management, Uncategorized