On December 5, The New York Times Company (the Times) filed a complaint for copyright and trademark infringement against Perplexity AI, Inc. in the U.S. District Court for the Southern District of New York, adding another major lawsuit to the growing wave of litigation against generative artificial intelligence (AI) companies. The Times alleged in its filing that Perplexity engaged in “large-scale, unlawful copying and distribution” of millions of its articles to build its AI-powered “answer engine.” The complaint argued that Perplexity’s products directly substitute for the newspaper’s own content, thereby undermining its business and devaluing its journalism. Perplexity’s conduct “threatens this legacy and impedes the free press’s ability to continue playing its role in supporting an informed citizenry and a healthy democracy,” the Times argued.
The U.S. Supreme Court justices today seemed skeptical of Cox Communications’ argument that it should not be held contributorily liable for failing to terminate the internet access of subscribers accused of infringement, but they had tough questions for both sides in Cox Communications, Inc. v. Sony Music Entertainment, Inc.
The practice of music sampling, which is the integration of pre-recorded sounds into new musical gestures, experienced a golden, unregulated age in the late 1980s that is almost unimaginable today. Major works like Public Enemy’s It Takes a Nation of Millions to Hold Us Back (1988) and De La Soul’s 3 Feet High and Rising (1989) layered dozens of samples on a single track, while massive commercial hits like Tone Lōc’s “Wild Thing” (1988) openly lifted core musical elements.
The U.S. Supreme Court on Monday denied certiorari in Halicki v. Carroll Shelby Licensing, a case in which Denice Shakarian Halicki, widow of the creator of the “Gone in 60 Seconds” film franchise, sought review of a U.S. Court of Appeals for the Ninth Circuit decision that held the car character “Eleanor,” a customized Ford Mustang, was not entitled to copyright protection.
Two weeks after the Trump administration asked the U.S. Supreme Court to stay an interlocutory injunction issued by the U.S. Court of Appeals for the D.C. Circuit in September that allowed Register of Copyrights Shira Perlmutter to return to her post pending her lawsuit against President Donald Trump for allegedly illegally removing her from office, Perlmutter has responded. In her opposition to the application for a stay, filed on Monday, Perlmutter accused the administration of making “an inexcusable mess of Congress’s plans for the governance of its Library.”
The U.S. Supreme Court on Monday granted a motion from the U.S. Solicitor General to participate in oral argument as an amicus in the copyright case between Cox Communications and Sony Music Entertainment. The order allows the government to weigh in during the December 1 hearing on whether an internet service provider (ISP) can be held contributorily liable for copyright infringement committed by its users.
Today, Mrs Justice Joanna Smith DBE of the United Kingdom’s High Court of Justice issued a highly anticipated ruling in Getty Images (US) Inc. v. Stability AI Ltd., a case that was expected to have major implications for determining the liability of generative artificial intelligence (AI) developers under UK intellectual property law. The 205-page decision, which focuses mainly on Getty’s trademark claim while also clarifying important aspects of secondary copyright liability in the AI context, left certain fundamental questions unaddressed, in large part because Getty did not present sufficient evidence to proceed with its claim of primary copyright infringement at trial.
The Trump Administration this week asked the Chief Justice of the U.S. Supreme Court to stay an interlocutory injunction issued by the U.S. Court of Appeals for the D.C. Circuit in September that allowed Register of Copyrights Shira Perlmutter to return to her post pending her lawsuit against President Donald Trump for allegedly illegally removing her from office. The application was submitted to the Chief Justice by Todd Blanche, Acting Librarian of Congress; Paul Perkins, Acting Register of Copyrights; Sergio Gor, Director, White House Presidential Personnel Office; Trent Morse, Deputy Director, White House Presidential Personnel Office; the Executive Office of the President; and Trump. The administration argued that the D.C. appellate court’s injunction represents “another case of improper judicial interference with the President’s power to remove executive officers.”
A New York judge ruled on Monday that OpenAI cannot stop a consolidated, multi-district class action brought against it by dozens of authors for direct copyright infringement by the outputs of its large language model (LLM), ChatGPT. OpenAI argued that the plaintiffs had failed to allege substantial similarity between the works and ChatGPT’s outputs, but Judge Sidney Stein of the U.S. District Court for the Southern District of New York said that “[a] more discerning observer could reasonably conclude that the allegedly infringing outputs are substantially similar to plaintiffs’ copyrighted works.”
The $1.5 billion settlement in Bartz v. Anthropic, recently granted preliminary approval, is the largest copyright settlement in American legal history. That’s impressive, but more important, it shows tech companies must play by the same rules as everyone else. Tech companies regularly ask for special treatment, arguing their innovations are too important to be slowed down by existing laws. But when these companies grow big enough to affect billions of people’s lives, those early shortcuts become serious problems.
Taking their cue from the recent Bartz v. Anthropic saga, the authors of a neuroscience book and professors at the State University of New York filed a class action complaint on October 9 with the U.S. District Court for the Northern District of California, alleging that Apple Inc. committed mass copyright infringement by using pirated books to train its artificial intelligence systems. Plaintiffs Susana Martinez-Conde and Stephen Macknik claimed that Apple built its Apple Intelligence platform, including its OpenELM and Foundation Models, by making unauthorized copies of copyrighted works without permission or compensation.
Dr. Stephen Thaler has taken his fight to get works created by artificial intelligence (AI) machines recognized as copyrightable to the U.S. Supreme Court. In his petition for certiorari, filed October 9 by Ryan Abbott of Brown, Neri, Smith & Khan, Thaler is asking the court to take up the question: “Whether works outputted by an AI system without a direct, traditional authorial contribution by a natural person can be copyrighted.”
In 2025, three federal courts finally confronted a question that had hovered over artificial intelligence for years: can machines legally learn from copyrighted works? Each opinion—Thomson Reuters v. Ross Intelligence, Bartz v. Anthropic, and Kadrey v. Meta Platforms—applied the four-factor fair-use test under 17 U.S.C. §107 to large-scale model training. Together, they form the first real framework for evaluating how copyright interacts with machine learning.
The recent $1.5 billion settlement between a major AI company and authors over copyright infringement represents far more than legal resolution—it marks the dawn of legitimate AI training data markets. This watershed moment signals the beginning of a necessary evolution toward market-based licensing schemes, much like how the music industry adapted to digital distribution by developing fair compensation frameworks for artists.
A number of amici have weighed in this week supporting ROSS Intelligence’s appeal to the U.S. Court of Appeals for the Third Circuit challenging the originality and fair use rulings of the District of Delaware in a copyright infringement case brought by global legal information company, Thomson Reuters. ROSS’s petition for review was granted by the Third Circuit in June.