Illinois lawyers are increasingly asked to weigh the risks and benefits of artificial intelligence (AI) in their practices. While AI tools are abundant, resources outlining their implementation are limited and may not be objective when tied to a specific platform.
Thankfully, the Illinois Attorney Registration and Disciplinary Commission (ARDC) recently released “The Illinois Attorney’s Guide to Implementing AI” (the “Guide”), providing a practical framework for legal professionals to understand and deploy AI in their practices while honoring the ethical demands of confidentiality, data security, and supervision.
“As artificial intelligence becomes woven into everyday legal work, Illinois lawyers need clear, practical guidance to ensure its use aligns with our ethical duties,” said ARDC Administrator Lea S. Gutierrez.
According to Gutierrez, while the Guide was developed with an eye on solo and small-firm practitioners who may lack institutional support for new technology, it is meant to be a resource for all Illinois lawyers across workplaces. The Guide builds upon the foundation laid by the Illinois Supreme Court’s AI policy and judicial reference sheet, providing the “practical side” of using AI in legal practice.
This blog will unpack key recommendations and resources from the Guide tied directly to the professional responsibilities spelled out in the Illinois Rules of Professional Conduct (the “Rules”).
What is generative AI?
Generative AI (GAI) is based on a newer type of technology, known as the transformer architecture, that can be configured to mimic natural human language.
GAI tools, such as OpenAI’s ChatGPT, Google’s Gemini, and Microsoft’s Copilot, learn to do so by reviewing and recognizing patterns in the datasets they are given access to. Based on this training, GAI tools provide responses (called outputs) to user prompts.
The Guide mentions two variations of GAI tools that lawyers often encounter:
- Large Language Models (LLMs), which can quickly generate, interpret, and summarize human language in response to prompts.
- Vision Models, which can generate realistic images and videos based on descriptions or examples.
In the legal context, GAI assists with tasks such as drafting documents, summarizing transcripts, or conducting research.
For example, a lawyer preparing for an upcoming deposition could upload prior transcripts into a GAI tool and receive a summary, proposed questions to ask, and even conflicting testimony to examine further.
These outputs would be driven by how a lawyer drafts the prompts (or directions) they enter into the tool. Alternatively, a legal-specific GAI tool may already have such functionality built into the application.
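For readers curious about what this looks like behind the scenes, the sketch below shows roughly how a prompt drives a GAI tool’s output when the tool is accessed programmatically rather than through a chat window. It is a minimal, hypothetical illustration that assumes the OpenAI Python SDK; the model name, file name, and prompt wording are the author’s own assumptions, not examples taken from the Guide. Most lawyers would accomplish the same thing by pasting the prompt and transcript into the tool’s chat interface.

```python
# Hypothetical sketch: how a prompt shapes a GAI tool's output.
# Assumes the OpenAI Python SDK with an API key set in the
# OPENAI_API_KEY environment variable; the model, file name, and
# prompt wording are illustrative assumptions, not from the Guide.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

# Load the deposition transcript the lawyer wants summarized
with open("deposition_transcript.txt", encoding="utf-8") as f:
    transcript_text = f.read()

# The prompt (the lawyer's directions) determines what the tool returns
prompt = (
    "Summarize the key testimony in the transcript below, "
    "suggest follow-up deposition questions, and flag any "
    "statements that appear to conflict with one another.\n\n"
    + transcript_text
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
)

# The output: summary, proposed questions, and potential conflicts
print(response.choices[0].message.content)
```

Note that sending an actual client transcript to a public tool in this way implicates the confidentiality and data security concerns discussed below.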
The Guide underscores that GAI’s creativity comes with certain risks, including outputs that may be factually incorrect (“hallucinations”) or that reflect biases in the data the tool was trained on. These risks introduce unique ethical concerns for confidentiality, data security, transparency, and candor to the court that every Illinois lawyer should understand.
Evaluating GAI Tools
Selecting a suitable GAI application is a crucial first step. Lawyers do not need to be AI experts to use GAI effectively, but should approach it with structure and care.
The Guide introduces a flexible, step-by-step framework to help lawyers evaluate GAI tools, understand the data those tools handle, and communicate transparently with clients about their use. While not mandatory, the framework serves as a practical guide lawyers can adapt to their own practices to demonstrate compliance with confidentiality obligations under Rule 1.6 (Confidentiality).
When evaluating GAI tools, the Guide encourages a thorough vetting process, including:
- Assessing whether the GAI provider describes the origin of the data the application is trained on, how frequently this data is updated, and its relevant limitations (e.g., Is the data all Illinois case law and secondary legal sources? Or is it Wikipedia and Reddit content?)
- Requesting detailed documentation on privacy, security, and access controls from the provider.
- Reviewing compliance with regulatory and client confidentiality duties, including where and how data is stored.
- Testing the tool’s outputs on actual legal tasks and confirming the provider’s commitment to continuous updates of security protocols.
- Confirming the provider’s ability to deliver meaningful support, incident response details, and clear data retention and deletion policies.
It is important to distinguish the provider from the tool, a relationship the Guide compares to a car’s make and model. For example, Google is the provider and Gemini is the GAI tool (see Table 1 on page 6 of the Guide).
Many GAI tools are available in multiple versions (e.g., public, consumer, business, enterprise) that address privacy and security concerns to varying degrees (see Table 2 on page 13 of the Guide). However, the Guide encourages attorneys not to make assumptions about a tool’s safeguards but to walk through a checklist (Table 2 of the Guide) to confirm those safeguards before implementing the tool.
Managing client rights
The last section of the Guide helps practitioners communicate their use of GAI tools to clients and memorialize that use. GAI’s integration into legal practice raises key questions about client rights and informed consent.
The Guide advises lawyers, at a minimum, to provide clients notice of the use of GAI in their representation, particularly where it touches on confidential or sensitive matters. It views client notice, client opt-out rights, and informed consent as additional safeguards a lawyer can employ when using GAI tools, not as a method of shifting risk from the lawyer to the client. The lawyer’s due diligence obligations, from tool selection to verification of outputs, remain regardless of the client’s notice or position.
Sample policies and consent forms in the Guide enable Illinois lawyers to document these steps, mitigating risk and fostering client trust. Ultimately, managing disclosure and informed consent upholds duties embedded in Rules 1.2 (Scope of Representation), 1.4 (Communication), and 1.6 (Confidentiality).
Resources for implementing GAI
To support the seamless and ethical adoption of GAI in legal practice, the Guide offers several practical references, including:
- GAI Terms of Use Checklist: Assists lawyers in reviewing provider contracts and GAI tool policies, covering data security, privacy, update protocols, and liability.
- Sample Notice of AI Practices: Provides template language lawyers can use to disclose GAI use to clients to ensure transparency.
- Sample Use of GAI Tools Policy: Sets forth best practices for supervision, review, and error management in AI-generated legal work.
- Sample Informed Client Consent Form: Facilitates informed, written acknowledgement from clients when GAI is employed in their matters.
Each resource is crafted to address the real-world demands of law offices, helping lawyers operationalize ethical standards in a rapidly evolving landscape.
Conclusion
GAI’s promise comes with ethical responsibilities for Illinois lawyers. By understanding how GAI works, using structured criteria for tool selection, respecting client rights, and leveraging tailored resources, attorneys can responsibly tap into the benefits of GAI while upholding the profession’s core ethical commitments.
For ongoing updates on AI ethics and legal tech, consult the news and resources at 2Civility’s news page.
Staying up to date on issues impacting the legal profession is vital to your success. Subscribe here to get the Commission’s weekly news delivered to your inbox.