[EDRM’s Editor’s Note: This article was first published here on August 13, 2024, and EDRM is grateful to Rob Robinson, editor and managing director of Trusted Partner ComplexDiscovery, for permission to republish.]
ComplexDiscovery’s Editor’s Note: The rapid evolution of artificial intelligence (AI) technologies has introduced groundbreaking capabilities but also unprecedented challenges, particularly in the realm of digital replicas—often referred to as “deepfakes.” These highly convincing, digitally manipulated images, videos, and audio recordings are becoming increasingly sophisticated, raising complex legal and ethical concerns. The U.S. Copyright Office’s recent report, the first in a planned series on AI and copyright law, delves deeply into these issues, offering critical insights and recommendations. This report underscores the dual nature of digital replicas, which can both empower creative expression and pose significant risks to individuals and institutions alike. By examining existing legal frameworks and identifying key gaps, the Copyright Office lays the groundwork for a potential new federal “digital replica right” to protect against unauthorized use. As AI continues to blur the lines between reality and fiction, this report marks an important milestone in the ongoing effort to safeguard individual rights and uphold the integrity of the creative and informational ecosystems.
U.S. Copyright Office Urges Federal Action on Digital Replicas
The U.S. Copyright Office recently released its first report in a series concerning the intersection of artificial intelligence (AI) and copyright law, focusing on the emerging issue of digital replicas. Digital replicas, often referred to as “deepfakes,” are digitally created or manipulated video, image, or audio recordings that can convincingly but falsely depict individuals. While these technological advancements are impressive, they also bring significant legal and ethical challenges that the Copyright Office aims to address through its comprehensive study and recommendations.
The report primarily highlights the double-edged sword that digital replicas represent. On one hand, they can be used beneficially, such as creating accessibility tools for individuals with limited speech or enabling authorized performances by deceased artists. On the other hand, there are severe risks associated with their misuse, including fraud, impersonation, and the spread of misinformation. The report underscores three key areas of harm: sexually explicit deepfake imagery, fraudulent activities, and the dissemination of misinformation that can undermine political systems and news platforms.
Examining the current legal landscape, the report identifies gaps in both state and federal law protecting against unauthorized digital replicas. It points out that existing laws, such as the Copyright Act, the Lanham Act, the Federal Trade Commission (FTC) Act, and the Communications Act, are insufficient to provide comprehensive protection. For instance, under current copyright law, one’s voice is not copyrightable. The FTC’s authority is limited to commercial misuse, and the Lanham Act requires a showing of consumer confusion for false endorsement claims. Meanwhile, state laws vary widely, with some states like Tennessee, Louisiana, and New York taking specific legislative steps to address digital replicas, albeit with inconsistent rules and exemptions.
To address these shortcomings, the report calls for the creation of a new federal “digital replica right.” This proposed right would protect all individuals, not just public figures, allowing them to license their likeness and secure remedies such as monetary damages and injunctive relief against unauthorized use. The right would cover both commercial and non-commercial uses and include provisions for online service providers, akin to the Digital Millennium Copyright Act’s safe harbor rules.
Several legislative initiatives in Congress align with these recommendations. Notably, the Nurture Originals, Foster Art, and Keep Entertainment Safe Act, known as the NO FAKES Act (S. 4875), along with the No Artificial Intelligence Fake Replicas and Unauthorized Duplications Act (H.R. 6943), aim to establish property rights over an individual’s voice and likeness. These legislative efforts indicate growing recognition of the need for federal regulation in this area.
The report also addresses the limitations of Section 114(b) of the Copyright Act, which allows for the imitation of sounds in sound recordings. The Copyright Office clarifies that Section 114(b) does not preempt state laws on unauthorized digital replicas, noting that the section has a different policy aim. Additionally, the report suggests that the Lanham Act and state right-of-publicity laws are adequate to handle certain types of AI-generated imitations. Future reports in the series will delve into the copyrightability of AI-generated works and the legal implications of training AI models on copyrighted material.
The Copyright Office’s recommendations come in the wake of Executive Order 14110, which calls for policies ensuring the trustworthy development and use of AI. The report emphasizes the urgent need for clarity and uniformity in laws governing digital replicas to protect individuals and maintain the integrity of creative markets.
As digital replicas continue to become more sophisticated, the legal framework governing their use must evolve to keep pace. The Copyright Office’s report is a significant step towards addressing the complex issues at the intersection of AI and copyright law, setting the stage for further legislative and policy advancements.
Read the original release here.
News Sources
- Overview of U.S. Copyright Office Report Regarding Artificial Intelligence and Digital Replicas
- Copyright Office Advocates for Federal ‘Digital Replica’ Law
- US Copyright Office Versus AI Part I
- Client Alert: U.S. Copyright Office Issues “Digital Replica” Report Finding Urgent Need for New Federal Legislation
- US Copyright Office issues first report in AI series, spotlighting deepfakes
Source: ComplexDiscovery OÜ
Assisted by GAI and LLM Technologies per EDRM GAI and LLM Policy.