[T]he American Bar Association urges courts and lawyers to address the emerging ethical and legal issues related to the usage of artificial intelligence (“AI”) in the practice of law including: (1) bias, explainability, and transparency of automated decisions made by AI; (2) ethical and beneficial usage of AI; and (3) controls and oversight of AI and the vendors that provide AI.
As law firms and legal departments strategize ways to broaden the use of AI across their organizations, eDiscovery professionals can also find guidance in Resolution 112 on how to use their skills and experience to help translate technical details, provide transparency, assist with inquiries to identify potential bias, and manage vendor relationships.
Potential Bias and the Need for Transparency
As noted, the ABA report for Resolution 112 addresses the potential for bias and the need for transparency in the way AI creates its outputs. The potential for biased results is a real concern in numerous contexts, including housing, lending decisions, employment, harassment, and criminal profiling. Users need to understand and address how bias can impact AI results.
EDiscovery professionals have important experience to draw on here. From processing, searching, analyzing, and categorizing large data sets, they certainly understand the concept that “garbage in means garbage out.” Seasoned eDiscovery professionals remember the “black box” concern the industry encountered in the early days of predictive coding and computer-assisted review. Many courts were skeptical of the new technology, and the industry had to develop standards to govern whether and how it should be used.
EDiscovery project managers leading complex reviews using AI are trained to follow a documented, explainable process that can be defended in court. Such experience can form a solid foundation for understanding and explaining how a particular “new” AI tool works, as well as for building a defensible process around the use of that tool in various applications.
The advanced analytic tools available in many eDiscovery software systems allow practitioners to slice and dice data in several ways and at granular levels. The focus in many eDiscovery projects is identifying illegal activity, whether in the context of constructing or defending a claim, responding to a subpoena, conducting an internal investigation, or performing due diligence in a potential merger or acquisition. More specifically, it’s not uncommon for eDiscovery professionals in certain types of practices to be trained to look for evidence of fraud or discrimination. EDiscovery professionals are uncanny in their ability to find needles in haystacks: to locate proverbial “smoking guns” and data that shouldn’t be there. Professionals with these kinds of “detective skills” can be well suited to help make the bias inquiries contemplated by the ABA policy.
Questions to Ask and AI Vendor Engagement
ABA Resolution 112 makes clear that lawyers and other AI users are ultimately responsible for their use of AI. As part of meeting that responsibility, the ABA report recommends questions for law firms and corporate legal departments to ask AI vendors before engaging them.
The appropriate questions will vary, of course, depending on the circumstances. It is significant to note at the outset, however, that many eDiscovery professionals are routinely tasked with vendor relationship and contract management for the software and services that support their functions. In my role as an eDiscovery director, I relied heavily on eDiscovery technologists to help me understand and negotiate technical terms and service level agreements, as well as to manage the day-to-day relationships with service providers. The eDiscovery professional can be an incredible resource to executive leaders in explaining the industry, how pricing models work, the relationships between providers, and numerous other issues that the larger IT department may not recognize.

For example, the eDiscovery team in a large corporation has information about the data and systems that impact the legal department at a level of detail the enterprise IT department likely doesn’t possess; this information could be critical to decisions about the implementation of new AI technology, including potential implications for client-facing tools. Similarly, the eDiscovery group in a law firm has unique knowledge about how data is stored and what systems are involved in managing client and other third-party data, which can make such professionals key stakeholders in an AI technology RFP process. Again, these professionals are well positioned by virtue of their experience to help attorneys provide vendor oversight and controls.
AI technology will continue to be a powerful force for advancement of legal advocacy and the administration of justice, but it must be used carefully and with appropriate human oversight. Law firms and corporate legal departments eager to leverage AI tools in new ways can benefit greatly from the knowledge and versatility of their internal eDiscovery teams. The guidelines set forth in ABA Resolution 112 highlight how the unique skills and experience of eDiscovery professionals will continue to be market differentiators for organizations that leverage them effectively.
Disclaimer: The views represented herein are exclusively the views of the author and do not necessarily represent the views held by the author’s employer, partners, or clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.