
Navigating AI Disclosure Rules in New York Courts

Judges are still figuring out the best way to preempt misuse of generative AI (GenAI) in their courts as use of AI technology becomes more commonplace in litigation.

Since Judge Brantley Starr of the U.S. District Court for the Northern District of Texas issued the first standing order on the use of AI in preparing court filings in 2023 (available on the Northern District of Texas’s Rules & Orders page), hundreds of state and federal judges have amended or issued standing orders, general orders, local rules, pretrial orders, and other guidance to address AI use and misuse in their courtrooms. As this landscape continues to evolve, practitioners should understand that judges continue forging their own paths.

Courts throughout New York are grappling with the increasing use of AI-generated legal documents and the integrity and accuracy (or lack thereof) of court filings. Despite growing concern, there is not (yet) any single district-wide or state-wide rule on disclosing the use of AI in court filings. Rather, individual judges have issued standing orders or individual rules that, among other things, mandate disclosure of the use of AI when drafting or preparing submissions, require certification that any AI-generated document has been independently reviewed, and require counsel to confirm that all filings comply with Rule 11 or its state court equivalent. These court-level rules are part of an evolving trend toward mandatory transparency, accountability, and review of AI-generated legal documents to maintain accuracy and integrity in judicial proceedings. Steps must be taken to safeguard against errors, bias, fabrication, and the use of fictitious authority from GenAI systems, while protecting client confidentiality and preserving claims of privilege and protection, where applicable, in court dockets.

Moreover, AI disclosure rules underscore that all final submissions remain the lawyer’s responsibility. It is the lawyer’s duty to ensure all AI-generated outputs are accurate and free from misleading or hallucinated content. Below is a table reflecting relevant individual rules of New York federal and state judges, as well as a relevant local bankruptcy rule for the Southern District of New York.1

Illustrative Disclosure Rules (by judge, with a summary of each judge’s rule)

E.D.N.Y. – Magistrate Judge Arlene R. Lindsay

Magistrate Judge Lindsay’s Individual Practices (C.3) include an AI provision consistent with Rule 11(b) of the Federal Rules of Civil Procedure. It specifically requires that any attorney for a party, or any pro se party, who has used AI in the preparation of any document filed with the court must disclose that AI has been used and must further certify in the document that the person has checked the accuracy of any portion of the document GenAI drafted, including all citations and legal authority.

E.D.N.Y. – Magistrate Judge Lee G. Dunst

In his individual rules (1.B), Magistrate Judge Dunst reminds litigants that “parties are cautioned to ensure that any use of Artificial Intelligence resources in connection with their submissions to the Court still comply with their professional obligations to the Court.”2

S.D.N.Y. Individual Rules & Practices in Civil Cases – Judge Vernon S. Broderick

Judge Broderick’s Civil Rules and Practices (see Section J) provide that any party who “utilizes any [GenAI] tool in the preparation of any documents filed with the Court must disclose that [GenAI] has been used” (emphasis added). Additionally, if GenAI is used “in the drafting of any documents filed with the court,” the party must also include a certification confirming (i) that the party has reviewed and verified the portion(s) drafted by GenAI and (ii) compliance with obligations under Rule 11 of the Federal Rules of Civil Procedure. Judge Broderick’s rules further provide that no certification is required if a GenAI tool is used for research. Judge Broderick’s standing orders also include a link to a sample certification. Parties who do not include this certification may face penalties, including sanctions or “other remedies that the Court deems appropriate.”

S.D.N.Y. – Judge John P. Cronan

Judge Cronan’s Individual Rules (2.E) provide that counsel is responsible for providing the court with complete and accurate representations of the record, the procedural history of the case, and any cited legal authorities. All litigants are responsible for verifying the accuracy of any output produced in whole or in part by an AI tool. Any attorney who signs a filing prepared with the use of an AI tool (including by appearing on the signature block of the filing) must attach to the filing a signed certification (i) stating whether the litigant personally reviewed the filing for accuracy of cited legal authorities and factual assertions and (ii) if so, describing in detail the steps taken to verify the accuracy of all legal authorities and factual assertions the AI tool generated. An attorney who signs an AI-assisted filing but fails to review it for accuracy or to provide the required certification violates this rule. The court may impose sanctions on counsel and/or strike the filing for noncompliance. A model certification may be found on the court’s website.

S.D.N.Y. Bankruptcy Court (Local Rule 9011-1)

Local Bankruptcy Rule 9011-1 refers specifically to “generative artificial intelligence services” and cautions lawyers that GenAI tools “may produce factually or legally inaccurate content.” The rule reminds litigants that they must “review and verify any computer-generated content” to ensure it complies with Federal Rule of Bankruptcy Procedure 9011. The rule imposes no prohibitions, disclosure requirements, or other limitations on the use of GenAI tools. The commentary notes that the rule is based on a local rule adopted by the U.S. District Court for the Eastern District of Texas.

N.Y. Sup. Ct. – Justice Nancy M. Bannon

Section 8 of Justice Bannon’s rules provides that all submissions with respect to a motion, hearing, or trial must include a certification by an attorney or pro se litigant that either no GenAI program was used in the drafting of any affidavit, affirmation, or memorandum of law contained within the submission, or that a GenAI program was used but an attorney (or the self-represented party) reviewed and approved all generated text, including citations, quotations, and legal analysis, for accuracy. Any GenAI program used must be identified, and the documents that include content the program generated must be specified, along with which portions of the documents the program drafted. One certification pertaining to a party’s submission comprised of several such documents shall suffice. To the extent letters are permitted under these part rules, the same requirement applies. Violations may result in sanctions.

N.Y. Sup. Ct. – Justice Peter A. Weinmann 

Justice Weinmann’s individual rules require parties to disclose any use of GenAI in a filing, including naming the program used, identifying which portions of the filing contain GenAI-generated material, and certifying that the GenAI-created work product is accurate.

Chautauqua County Supreme Court – Justice Grace M. Hanlon

Justice Hanlon’s rules require disclosure if GenAI is used to draft any filing and require the filer to attest that all citations have been verified for accuracy. GenAI is defined in Justice Hanlon’s rules as artificial intelligence that is “capable of generating new content (such as images or text) in response to a submitted prompt (such as a query) by learning from a large reference database of examples.”

Kings Co. Sup. Ct. – Judge Aaron D. Maslow3

Section 15 of Judge Maslow’s rules provides that all submissions with respect to a motion must include a certification by an attorney either that no GenAI program was used in the drafting of any affidavit, affirmation, or memorandum of law contained within the submission, or that a GenAI program was used but all generated text, including citations, quotations, and legal analysis, was reviewed for accuracy and approved by an attorney (or the self-represented party). If the certification states that a GenAI program was used, the program must be identified, and the documents that include matter generated by the program must be specified, along with which parts of the documents the program drafted. One certification pertaining to a party’s submission comprised of several such documents shall suffice.

Niagara County Supreme Court – Justice Michael J. Norris

Justice Norris’ local rules require any party who uses GenAI to prepare any document filed with the court to disclose the specific AI tool used and the portions of the filing AI drafted, and to include a certification that the AI work product was “diligently reviewed by a human being for accuracy and applicability.”

These rules reflect active court-level adaptation to the challenges and risks the use of GenAI presents in legal practice, with emphasis on transparency, accuracy, and ethical compliance.

Additionally, even in the absence of formal rules or standing orders, some courts are issuing decisions that remind parties and counsel of the risks of using AI and of the importance of verifying all such submissions, warning that a failure to do so may result in consequences, including sanctions.4


1 Please note that this table does not purport to be exhaustive. These local court rules/orders are subject to change, and counsel and parties should consult the rules of the specific judge(s) before whom they are practicing.

2 See Benjamin v. Costco Wholesale Corp., 2:24-cv-07399 (EDNY 2024) (LGD). In Benjamin, the plaintiff included numerous case citations created by ChatOn, a GenAI tool, in a reply brief. In response to Magistrate Judge Dunst’s order to show cause, the plaintiff’s attorney submitted the authentic cases cited in her opening brief and an attorney declaration certifying that the ChatOn-created cases were false. The court ordered the plaintiff’s attorney to pay a $1,000 penalty, acknowledging that such a fine was on the lower end because of the “mitigating factors” of the attorney’s remorse, one-time use of GenAI in drafting, and voluntary CLE participation. Magistrate Judge Dunst’s full decision can be read here.

3 Earlier this year, the Administrative Board of the Courts sought public comment on a proposal recommended by the Commercial Division Advisory Council (CDAC) to add a new Rule 6(e) to the Rules of the Commercial Division (22 NYCRR § 202.70) regarding the use of GenAI in preparing court documents.

4 See, e.g., Mata v. Avianca, 22-cv-1461 (SDNY 2023) (PKC), where several attorneys and their law firm were sanctioned when they “abandoned their responsibilities” by both submitting “non-existent judicial opinions with fake quotes” a GenAI tool created and then “stand[ing] by the fake opinions after judicial orders called their existence into question.” Judge P. Kevin Castel found “subjective bad faith” due to “shifting and contradictory explanations,” including misleading representations by one of the offending attorneys that “he had done other, meaningful research . . . and did not rely exclusive[ly] on an AI chatbot,” when in reality GenAI “was the only source of his substantive arguments.” The court found violations of both Rule 11 of the Federal Rules of Civil Procedure (because filing papers “without taking the necessary care in their preparation” constitutes an abuse of the judicial system) and Rule 3.3(a)(1) of the New York Rules of Professional Conduct (which prohibits lawyers from making a “false statement of fact or law”). The court also considered whether the lawyers committed a criminal offense under 18 U.S.C. § 505 (which criminalizes forging a signature or seal of a federal judge or court), concluding that no such crime had occurred, but that creating fake opinions “raises similar concerns.” The court ultimately sanctioned the attorneys by imposing a $5,000 fine and ordering that they send copies of the court’s order both to the plaintiff and to all judges who had been falsely identified as the authors of the six fake GenAI-generated cases that the attorneys had submitted to the court.

See also Hall v. Acad. Charter Sch., 2:24-cv-08630 (EDNY 2024) (JMW), where the plaintiff’s counsel included three hallucinated cases in her opposition to the defendant’s motion to partially dismiss the amended complaint. In response to the order to show cause as to why sanctions should not be imposed, plaintiff’s counsel admitted that the brief was “drafted by a clerk who used [GenAI] for legal research” and was never cite-checked by an attorney. Counsel also explained that her husband’s sudden death took a severe toll on her mental and emotional wellbeing and her ability to practice law.

Sympathetic to the attorney’s circumstances, Magistrate Judge Wicks declined to impose any sanctions, especially since he did not find any evidence of bad faith in drafting the brief. Unlike in other cases, this use of GenAI, “while aberrant, appear[ed] to be an isolated occurrence.”

See also Park v. Kim, Appeal No. 22-2057 (2d Cir. 2024), where plaintiff’s counsel submitted a reply brief that contained a hallucinated citation, which she admitted was due to using GenAI. The Second Circuit (Judges Parker, Nathan, and Merriam) stated that “citation in a brief to a non-existent case suggests conduct that falls below the basic obligations of counsel.” The attorney was referred to the Court’s Grievance Panel pursuant to Local Rule 46.2 “for further investigation, and for consideration of a referral to the Committee on Admissions and Grievances,” and, due to her violation of Rule 11, was ordered to “furnish a copy of this decision to her client” and to file a certification attesting that she had done so.