Section 230 and the Future of Content Moderation | Fenwick & West LLP – JD Supra
The Communications Decency Act of 1996 (CDA) was a landmark law enacted to regulate content on the internet. The purpose of the legislation was to regulate indecent and obscene material online, but it is most relevant today for Section 230, a provision that protects online platforms from liability in a variety of circumstances involving third-party use of their services. While Section 230 is often credited with allowing the internet to flourish in the late 1990s and the early 21st century, it has been the subject of calls for amendment from across the political spectrum as courts and online platforms attempt to fit the law to the modern internet. In particular, a rash of bills in 2020 targeted the law, specifically in the context of immunity for content-moderation decisions, an application that has come under heavier scrutiny as service providers attempt to curb abusive content and critics raise concerns of censorship.
This article addresses the evolving landscape for online platforms seeking to moderate content while limiting litigation risk.
Background: The CDA and Section 230
Shortly after the CDA was enacted, it faced First Amendment challenges to its provisions that prohibited the transmission of obscene or indecent content to minors. The U.S. Supreme Court held the anti-indecency provision of the statute unconstitutional in Reno v. American Civil Liberties Union, but held that provision to be severable from the rest of the law, allowing Section 230 to stand.
Now, Section 230 is the principal legal protection afforded to online platforms from lawsuits over content posted by their users. It contains three provisions specifying when platforms will be immune from suit: first, in Subsection (c)(1) as a publisher; second, in Subsection (c)(2)(A) for the Good Samaritan removal or filtering of content; and third, in Subsection (c)(2)(B) as a provider of the technical means to restrict content.
Subsection (c)(1) states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." It is concerned with liability arising from information provided online, but as stated in Barrett v. Rosenthal, "[l]iability for censoring content is not ordinarily associated with the defendant's status as publisher or speaker."
Subsection (c)(2) provides immunity for moderation or alleged censorship scenarios, stating: "No provider or user of an interactive computer service shall be held liable on account of (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1)."
Courts have interpreted Subsection (c)(1) broadly as providing immunity to online platforms, both from suits over content posted by their users and for their removal of content from their sites. In a key early decision involving allegedly defamatory messages on a message board, Zeran v. America Online, the U.S. Court of Appeals for the Fourth Circuit held that Section 230 "creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service." Thus, lawsuits seeking to hold a service provider liable for its exercise of a publisher's traditional editorial functions, such as deciding whether to publish, withdraw, postpone or alter content, are barred. This immunity is generally not limited to particular causes of action, and because Section 230 preempts inconsistent state law, Subsection (c)(1) is a defense to state tort and contract claims as well as federal lawsuits.
Subsection (c)(1) is not an absolute bar to litigation over third-party content on online platforms, however. In a critical decision denying Section 230 immunity, Fair Housing Council of San Fernando Valley v. Roommates.com, the U.S. Court of Appeals for the Ninth Circuit held that Section 230 did not preempt claims under the Fair Housing Act and state housing discrimination laws where a roommate-matching service required users to answer a questionnaire with criteria such as sex, sexual orientation and whether they would bring children to the household. The Ninth Circuit, noting that Section 230 was not meant to create "a lawless no-man's-land on the Internet," found that the questionnaire was designed to force subscribers to divulge protected characteristics and discriminatory preferences. In other words, the defendant was a developer of an allegedly discriminatory system because it elicited content from users and made use of it in conducting its business based on allegedly illegal criteria. The Ninth Circuit contrasted this with cases in which immunity was upheld, including where websites used "neutral tools" that did absolutely nothing to enhance the defamatory sting of a message, to encourage defamation or to make defamation easier, such as allowing users to filter dating profiles based on voluntary user inputs. Notably, the Ninth Circuit did apply Section 230 immunity to the "additional comments" section of users' profiles, where users were merely encouraged to provide information about themselves; even though the lawsuit pointed to instances where users input race or religious requirements into this section, the Ninth Circuit noted that Roommates.com only passively published these comments as written, which is precisely what Section 230 protects.
Additionally, the Ninth Circuit has held that failure-to-warn claims are not preempted by Section 230. In Doe v. Internet Brands, the plaintiff alleged that two individuals were using a modeling website to pose as talent agents and find, contact and lure targets for a rape scheme. The defendant allegedly knew about these particular individuals and how they were using the website, but failed to warn users about the risk of being victimized. The Ninth Circuit determined the critical question under Subsection (c)(1) to be whether the allegations depended on construing the defendant as a publisher (i.e., whether the claims arose from the defendant's failure to remove content from the website). The Ninth Circuit noted that, in these circumstances, the marginal chilling effect of allowing such a claim to proceed did not warrant turning Section 230 into an "all purpose get-out-of-jail-free card," nor would it discourage "Good Samaritan" filtering of third-party content.
Further, in May 2021, the Ninth Circuit reversed a district court's dismissal based on Section 230 immunity in Lemmon v. Snap. Parents of three boys, ages 17 to 20, killed in a car accident sued the maker of Snapchat over its Speed Filter, an overlay users could add to photos and videos that shows their speed. The parents alleged that one of the boys opened the app shortly before the crash to document their speed (at one point over 123 miles per hour) and that Snap allowed this feature notwithstanding (untrue) rumors that users would receive a reward for reaching over 100 miles per hour in the app. The Ninth Circuit held that the negligent-design claim did not seek to hold Snap liable for its conduct as a publisher or speaker and that "[t]he duty to design a reasonably safe product is fully independent of Snap's role in monitoring or publishing third-party content," so Subsection (c)(1) did not apply. Separately, the Ninth Circuit held Subsection (c)(1) inapplicable because Snap itself designed the Speed Filter and reward system at issue, so the claim did not rely on "information provided by another information content provider." Though the implications of this holding are yet to be seen, the Ninth Circuit attempted to constrain it to true defective-design cases; the allegations did not depend on the content of any messages actually transmitted, so this was not a case of creative pleading designed to circumvent CDA immunity.
The breadth of immunity provided by Section 230 has also been pared back by subsequent legislation. In 2018, largely as a response to Backpage.com prevailing on Section 230 immunity in litigation concerning sex trafficking, the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (FOSTA) was signed into law, amending Section 230 to eliminate platforms' immunity from prosecution under certain state sex trafficking laws. It also eliminated platforms' immunity from civil suits brought by victims of sex trafficking for knowingly promoting and facilitating sex trafficking. Notably, the text of FOSTA states that it does not apply to Subsection (c)(2).
Section 230 and Content Moderation
While Subsection (c)(1) was a paradigm shift in terms of making the internet a unique forum in which content could be hosted and accessed without traditional publisher liability applying to service platforms, Subsection (c)(2) has also been essential in forming the legal landscape for social media and other online spaces. Because both provisions of Subsection (c)(2) concern content removal, it has been particularly relevant in recent years as more people, including politicians and other public figures, participate in online communities. Subsection (c)(2) has not been the deciding factor for many cases to date, but disputes concerning content moderation issues are likely to proliferate.
Several courts have held that Subsection (c)(2) immunizes online platforms from liability for content removal decisions, though whether such claims survive the pleading stage is case-dependent. For example, in 2021 the U.S. Court of Appeals for the Second Circuit applied Subsection (c)(2) to affirm the dismissal at the pleading stage of claims brought against a video-sharing site over the site's removal of the plaintiff's videos promoting sexual orientation change efforts. In Domen v. Vimeo, the court noted that Subsection (c)(2) is a broad provision that forecloses civil liability where providers restrict access to content that they consider objectionable. The Second Circuit found that the plaintiff had not pleaded that Vimeo had acted in bad faith because there were no plausible allegations that Vimeo's actions were anti-competitive conduct or self-serving behavior in the name of content regulation, as opposed to a straightforward consequence of Vimeo's content policies.
Similarly, in Daniels v. Alphabet, the U.S. District Court for the Northern District of California held that Subsection (c)(2)(A) barred nearly all of the plaintiff's claims regarding removal of his videos from YouTube and the alleged restriction of his account, noting that the plaintiff himself acknowledged that the defendants' stated reason for removal was that the videos violated YouTube's Community Guidelines and YouTube's policy on harassment and bullying. The plaintiff's conclusory assertions of bad faith were insufficient to overcome the discretion afforded by Subsection (c)(2)(A). This decision and the ruling in Vimeo demonstrate that the good-faith removal defense can be successfully raised at the pleading stage, though defendants may have more trouble doing so where plaintiffs bring more specific allegations of bad faith.
Conversely, the Ninth Circuit in Enigma Software Group USA v. Malwarebytes held that a security software company was not entitled to immunity under Subsection (c)(2)(B) at the pleading stage where the plaintiff alleged that Malwarebytes's software blocked the installation or use of the plaintiff's security software for anti-competitive purposes. There, the Ninth Circuit found that the complaint plausibly alleged that the companies were direct competitors. It reversed the district court's finding of immunity and remanded the case, holding that the anticompetitive allegations were sufficient to survive dismissal at the pleading stage.
Numerous other cases have dispensed with content moderation or account removal allegations against platforms by applying Subsection (c)(2), often in the social media context and with little analysis of the good-faith requirement. Additionally, several courts have applied Subsection (c)(1) to removal decisions on the theory that removing or withholding content from a platform is a typical publisher decision, which is protected by Subsection (c)(1). Though this approach sidesteps the good-faith analysis built into Subsection (c)(2), there does not appear to be a consistent approach among courts regarding when to apply Subsection (c)(1) to moderation or removal decisions, and it remains to be seen how reliably courts will take this more protective route.
Potential Changes to Section 230
Outside of the courts, content moderation has been hotly contested across the political spectrum. Generally, proposed bills have divided along party lines: Democrats have sought to protect providers' ability to remove hate speech and offensive content while leaving open liability in the anti-discrimination context, and Republicans have sought to impose more First Amendment-like restrictions on what providers can remove.
The Senate Committee on Commerce, Science, and Transportation held a hearing in October 2020 to address Section 230 with executives from Twitter, Facebook and Google present, in which senators addressed issues ranging from political censorship to the spread of misinformation. While Subsection (c)(2) currently protects platforms' decisions to remove, label or restrict the spread of content they deem to be damaging in some way, some senators pressed the companies' representatives to explain the reasoning behind the removal or restriction of various specific posts. Senator Roger Wicker (R-MS), providing the majority opening statement, acknowledged the role Section 230 played in enabling the growth of the internet but also claimed it "has also given these internet platforms the ability to control, stifle, and even censor content in whatever manner meets their respective standards," and that "[t]he time has come for that free pass to end." He also pointed to instances of removal that he characterized as inconsistent or evincing political bias. Senator Maria Cantwell (D-WA), in the minority opening statement, focused on enabling platforms to remove hate speech or misinformation related to health and public safety.
In March 2021, Facebook CEO Mark Zuckerberg argued before the House Committee on Energy and Commerce that Section 230 immunity should be reduced in favor of platforms being required to demonstrate that they have systems in place for identifying unlawful content and removing it. His proposal contemplated a third party that would set standards for what would constitute an adequate system, proportionate to the size of the provider at issue. Additionally, Mr. Zuckerberg advocated for more transparency into how platforms decide to remove harmful but legal content.
Since 2020, numerous bills have been introduced that would further pare back the immunity Section 230 provides to platforms, both for removing and for failing to remove certain categories of third-party content. One example is the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act, introduced by Senators Mark Warner (D-VA), Mazie Hirono (D-HI) and Amy Klobuchar (D-MN). This bill proposes to limit immunity in cases involving, among other things, civil rights or discrimination, antitrust, stalking, harassment, intimidation, international human rights law or wrongful death. It would also make Section 230 an affirmative defense, rather than a pleading-stage immunity, and would make it unavailable to defendants challenging a preliminary injunction. Another example is the Platform Accountability and Consumer Transparency (PACT) Act, which has received some bipartisan support. This bill seeks to set certain requirements for platforms' takedown processes and provides state attorneys general as well as the Federal Trade Commission with certain enforcement authority. Several other bills have been introduced with a similar focus on stripping immunity based on the subject matter of the litigation or the practices of the platform. The Biden Administration has not taken an official position on Section 230.
Conclusion
While Section 230 remains the predominant legal protection for online platforms moderating content in good faith, courts are beginning to engage more regularly with these issues, and recent decisions signal that defendants may have difficulty relying on Subsection (c)(2) immunity to dispose of well-pled suits at the pleading stage. Further, many of the cases discussed above that were dismissed on Subsection (c)(2) grounds might have survived under the newly proposed legislation. Section 230 reform may introduce uncertainty into online platforms' litigation risk, so content providers should remain aware of the shifting landscape for this critical legal protection.