The discovery process is traditionally a reactive one, triggered by imminent litigation.
This didn’t change when Electronically Stored Information (ESI) was included in that process in 2006. And when one looks at the EDRM (Electronic Discovery Reference Model), which originated at that time and continues to be the foundation for the eDiscovery process, it looks pretty straightforward.
A legal hold is sent to custodians, letting them know not to destroy ESI. Then those who have control over that data preserve it in some form. Next, that data is collected and processed into usable formats so it can be reviewed. Finally, review teams look at that information, begin building a case, and produce it in a format stipulated by Department of Justice guidelines.
But this process is by nature a reactive one from the beginning, which often makes the seemingly linear workflow convoluted and confusing, involving multiple stakeholders and data handoffs that can increase data risk.
Without upstream thinking, the process might look like this:
Legal Hold
Legal hold notices are sent out, but without a clear process for how this is done, how the holds are confirmed, and how data custodians are to respond, the result can be unintentional spoliation of ESI, which in turn can lead to delays or court sanctions.
Identification / Preservation
Once data is identified as needing to be preserved, time must be spent locating where that data is housed and determining how to lock it down without spoliation, especially if there isn’t a data map. This often involves an organization’s IT department, which adds a stakeholder who may or may not have a clear understanding of the preservation process as it pertains to the needs of the legal department. There may also be automatic deletion protocols in place, which could cause unintentional spoliation.
Collection
After the data is located and preserved from deletion, it must be collected. Again, if an organization is simply reacting to a triggering event, it may have to hire an outside digital forensics service provider who knows how to extract the needed ESI without spoliation. This adds another stakeholder and data handoff, which can put an organization’s data at risk. Even if an organization has done collections before, new data sources (SaaS tools, social media, collaboration apps, etc.) are continually being created and added, so it may have to hire a digital forensics expert who specializes in those types of data. If that hasn’t been planned for, it can again lead to delays, extra cost, and risk.
Processing
Now that the data has been collected, it must be processed, which often involves yet another service provider. Not only does that add another stakeholder, but the data must leave the organization to be hosted elsewhere, creating more risk and cost.
Review
After the data is processed, it must be sent to a review team. Some service providers offer a first-pass review along with processing, but even then, an organization’s outside counsel will also want their attorneys to review the documents. This creates even more stakeholders, data handoffs, and costs. And without a clear process from the start, the review stage is often where problems with the ESI are finally noticed, which may require going back to do more collections and/or processing, followed by further review.
Production
This step might be viewed as something as easy as pressing a button (it often is), but things can still go wrong. Production isn’t always completed in line with what the requesting party ordered and may have to be redone. Redactions may not be burned in, and Personally Identifiable Information (PII) and other sensitive, privileged, or private information may be leaked. Again, failing to think about these risks from the outset can lead to added risk, cost, and delays.
The thing to remember is that a disjointed process can still lead to a functioning result, and this is where many teams may not realize there is a need to change. They may simply think, “this is how we’ve always done it,” and not want to disrupt things. But with the immense data stores organizations create and the increasing attacks on that data — not only against the organization itself, but against law firms, data storage and hosting facilities, and other third-party vendors — the need to mitigate risk is more complex than ever.
This is why thinking upstream is so important. Organizations can no longer afford to be reactive with this process, but instead have to take on a “when, not if” mentality. This starts with continual and ongoing data stewardship and insight before litigation ever takes place. At the same time, putting processes in place for each stage based on those data insights will streamline things and reduce risk. It can also decrease the number of stakeholders and data handoffs involved, which reduces not only risk but cost as well.
Don’t remain in a reactive mode with your process. Start thinking upstream toward a process that looks more like an arrow and less like a bowl of spaghetti.
Learn more about how IPRO solutions can help your organization think upstream to reduce data handoffs.