
Section 230

Long title: Protection for 'Good Samaritan' Blocking and Screening of Offensive Material
Nicknames: Section 230
Enacted by: the 104th United States Congress
Effective: February 8, 1996
Acts amended: Communications Act of 1934; Telecommunications Act of 1996
U.S.C. sections created: 47 U.S.C. § 230

In the United States, Section 230 is a section of the Communications Act of 1934 that was enacted as part of the Communications Decency Act of 1996, which is Title V of the Telecommunications Act of 1996, and generally provides immunity for online computer services with respect to third-party content generated by their users. At its core, Section 230(c)(1) provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

Section 230(c)(2) further provides "Good Samaritan" protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."

Section 230 was developed in response to a pair of lawsuits against online discussion platforms in the early 1990s that resulted in different interpretations of whether the service providers should be treated as publishers or, alternatively, as distributors of content created by their users. Its authors, Representatives Christopher Cox and Ron Wyden, believed interactive computer services should be treated as distributors, not liable for the content they distributed, as a means to protect the growing Internet at the time.

Section 230 was enacted as part of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996), formally codified as part of the Communications Act of 1934 at 47 U.S.C. § 230.[a] After passage of the Telecommunications Act, the CDA was challenged in courts and was ruled by the Supreme Court in Reno v. American Civil Liberties Union (1997) to be unconstitutional, though Section 230 was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have validated the constitutionality of Section 230.

Section 230 protections are not limitless: providers must still remove material that is illegal at the federal level, such as content that infringes copyright. In 2018, Section 230 was amended by the Stop Enabling Sex Traffickers Act (FOSTA-SESTA) to require the removal of material violating federal and state sex trafficking laws. In the following years, Section 230's protections came under increased scrutiny on issues of hate speech and ideological bias, reflecting the power technology companies hold over political discussion, and became a major issue during the 2020 United States presidential election, especially with regard to alleged censorship of conservative viewpoints on social media.

Passed when Internet use was just starting to expand in both breadth of services and range of consumers in the United States,[2] Section 230 has frequently been described as a key law that allowed the Internet to develop.[3]

Application and limits

Section 230 has two primary parts, both listed under §230(c) as the "Good Samaritan" portion of the law. Under Section 230(c)(1), as identified above, an information service provider shall not be treated as a "publisher or speaker" of information from another provider. Section 230(c)(2) provides immunity from civil liability for information service providers that remove or restrict content from their services they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected", as long as they act "in good faith" in this action.[4]

In analyzing the availability of the immunity offered by Section 230, courts generally apply a three-prong test. A defendant must satisfy each of the three prongs to gain the benefit of the immunity:[5]

  1. The defendant must be a "provider or user" of an "interactive computer service".
  2. The cause of action asserted by the plaintiff must treat the defendant as the "publisher or speaker" of the harmful information at issue.
  3. The information must be "provided by another information content provider", i.e., the defendant must not be the "information content provider" of the harmful information at issue.

Section 230 immunity is not unlimited. The statute specifically excepts federal criminal liability (§230(e)(1)), electronic privacy violations (§230(e)(4)), and intellectual property claims (§230(e)(2)).[6] Under §230(e)(3), there is also no immunity from state laws that are consistent with Section 230, though inconsistent state criminal laws have been held preempted in cases such as Backpage.com, LLC v. McKenna[7] and Voicenet Communications, Inc. v. Corbett[8] (agreeing that "the plain language of the CDA provides ... immunity from inconsistent state criminal laws").

What constitutes "publishing" under the CDA is somewhat narrowly defined by the courts. The Ninth Circuit held that "Publication involves reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content."[9] Thus, the CDA does not provide immunity with respect to content that an interactive service provider creates or develops entirely by itself.[10][11] CDA immunity also does not bar an action based on promissory estoppel.[12][13]

As of mid-2016, courts had issued conflicting decisions regarding the scope of the intellectual property exclusion set forth in §230(e)(2). For example, in Perfect 10, Inc. v. CCBill, LLC,[14] the Ninth Circuit ruled that the exception for intellectual property law applies only to federal intellectual property claims such as copyright infringement, trademark infringement, and patent infringement, reversing a district court ruling that the exception also applies to state-law right of publicity claims.[15] The Ninth Circuit's decision in Perfect 10 conflicts with conclusions from other courts, including Doe v. Friendfinder. The Friendfinder court specifically discussed and rejected the lower court's reading of "intellectual property law" in CCBill and held that the immunity does not reach state right of publicity claims.[16]

Two bills passed since Section 230 have added further limits to its protections. Under the Digital Millennium Copyright Act of 1998, service providers must comply with additional requirements regarding copyright infringement to maintain safe harbor protections from liability, as defined in the DMCA's Title II, the Online Copyright Infringement Liability Limitation Act.[17] The Stop Enabling Sex Traffickers Act (the FOSTA-SESTA act) of 2018 eliminated the safe harbor for service providers with respect to federal and state sex trafficking laws.

Background and passage

Prior to the Internet, case law was clear that a liability line was drawn between publishers of content and distributors of content; a publisher would be expected to have awareness of material it was publishing and thus should be held liable for any illegal content it published, while a distributor would likely not be aware and thus would be immune. This was established in the 1959 case, Smith v. California,[18] where the Supreme Court ruled that putting liability on the provider (a book store in this case) would have "a collateral effect of inhibiting the freedom of expression, by making the individual the more reluctant to exercise it."[19]

In the early 1990s, the Internet became more widely adopted and created means for users to engage in forums and other user-generated content. While this helped to expand the use of the Internet, it also resulted in a number of legal cases putting service providers at fault for content generated by their users. This concern was raised in legal challenges against CompuServe and Prodigy, early service providers at the time.[20] CompuServe stated it would not attempt to regulate what users posted on its services, while Prodigy employed a team of moderators to validate content. Both companies faced legal challenges related to content posted by their users. In Cubby, Inc. v. CompuServe Inc., CompuServe was found not to be at fault: by allowing all content to go unmoderated, it was a distributor and thus not liable for libelous content posted by users. However, in Stratton Oakmont, Inc. v. Prodigy Services Co., the court concluded that because Prodigy had taken an editorial role with regard to customer content, it was a publisher and was legally responsible for libel committed by its customers.[21][b]

Chris Cox (left) and Ron Wyden, the framers of Section 230

Service providers made their Congresspersons aware of these cases, believing that if followed by other courts across the nation, the cases would stifle the growth of the Internet.[22] United States Representative Christopher Cox (R-CA) had read an article about the two cases and felt the decisions were backwards. "It struck me that if that rule was going to take hold then the internet would become the Wild West and nobody would have any incentive to keep the internet civil," Cox stated.[23]

At the time, Congress was preparing the Communications Decency Act (CDA), part of the omnibus Telecommunications Act of 1996, which was designed to make knowingly sending indecent or obscene material to minors a criminal offense. A version of the CDA, pushed by Senator J. James Exon (D-NE), had passed the Senate.[24] A grassroots effort in the tech industry sought to convince the House of Representatives to challenge Exon's bill. Based on the Stratton Oakmont decision, Congress recognized that requiring service providers to block indecent content would cause them to be treated as publishers in the context of the First Amendment, and thus liable for other content, such as libel, not covered by the existing CDA.[20] Cox and fellow Representative Ron Wyden (D-OR) wrote the House bill's Section 509, titled the Internet Freedom and Family Empowerment Act, designed to override the decision in Stratton Oakmont so that a service provider could moderate content as necessary without having to act as a wholly neutral conduit. The new provision was added to the text of the proposed statute while the CDA was in conference within the House.

The overall Telecommunications Act, with both Exon's CDA and Cox and Wyden's provision, passed both Houses by near-unanimous votes and was signed into law by President Bill Clinton in February 1996.[25] Cox and Wyden's provision, Section 509 of the Telecommunications Act of 1996, became law as a new Section 230 of the Communications Act of 1934. The anti-indecency portion of the CDA was challenged immediately upon passage, resulting in the 1997 Supreme Court case Reno v. American Civil Liberties Union, which ruled that all of the anti-indecency sections of the CDA were unconstitutional but left Section 230, among other provisions of the Act, in place.[26]

Impact

Section 230 has often been called "the 26 words that made the Internet".[2] Its passage and the subsequent legal history supporting its constitutionality have been considered essential to the growth of the Internet through the early part of the 21st century. Coupled with the Digital Millennium Copyright Act (DMCA) of 1998, Section 230 provides Internet service providers safe harbors to operate as intermediaries of content without fear of being liable for that content, as long as they take reasonable steps to delete or prevent access to it. These protections allowed experimental and novel applications on the Internet without fear of legal ramifications, creating the foundations of modern Internet services such as advanced search engines, social media, video streaming, and cloud computing. NERA Economic Consulting estimated in 2017 that Section 230 and the DMCA combined contributed about 425,000 jobs to the U.S. and represented total annual revenue of US$44 billion.[27]

Subsequent history

Early challenges – Zeran v. AOL (1997–2008)

The first major challenge to Section 230 itself was Zeran v. AOL, a 1997 case decided by the Fourth Circuit.[28] The case involved a person who sued America Online (AOL) for failing to remove, in a timely manner, libelous ads posted by AOL users that inappropriately connected his home phone number to the Oklahoma City bombing. The court found for AOL and upheld the constitutionality of Section 230, stating that it "creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service."[29] The court asserted in its ruling that Congress's rationale for Section 230 was to give Internet service providers broad immunity "to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material."[28] In addition, Zeran notes "the amount of information communicated via interactive computer services is ... staggering. The specter of tort liability in an area of such prolific speech would have an obviously chilling effect. It would be impossible for service providers to screen each of their millions of postings for possible problems. Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted. Congress considered the weight of the speech interests implicated and chose to immunize service providers to avoid any such restrictive effect."[28]

This ruling, cementing Section 230's liability protections, has been considered one of the most important decisions affecting the growth of the Internet, allowing websites to incorporate user-generated content without fear of litigation.[30] At the same time, it has led to Section 230 being used as a shield by some website owners, as courts have ruled that Section 230 provides complete immunity for ISPs with regard to torts committed by their users over their systems.[31][32] Through the next decade, most cases involving Section 230 challenges generally fell in favor of service providers, upholding their immunity from liability for third-party content on their sites.[32]

Erosion of Section 230 immunity – Roommates.com (2008–16)

While Section 230 seemed to give near-complete immunity to service providers in its first decade, new case law around 2008 started to find circumstances in which providers could be liable for user content by virtue of being a "publisher or speaker" of that content under §230(c)(1). One of the first such cases was Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008).[33] The case centered on Roommates.com's service matching renters based on profiles they created on its website; the profiles were generated from a mandatory questionnaire that included information about the user's gender and race and their preferred roommates' race. The Fair Housing Council of San Fernando Valley argued that this constituted discrimination in violation of the Fair Housing Act, and asserted that Roommates.com was liable for it. In 2008, the Ninth Circuit, in an en banc decision, ruled against Roommates.com, agreeing that its required profile system made it an information content provider and thus ineligible for the protections of §230(c)(1).[32]

The decision in Roommates.com was considered the most significant deviation from Zeran in how Section 230 was handled in case law.[32][34] Eric Goldman of the Santa Clara University School of Law wrote that while the Ninth Circuit's decision in Roommates.com was tailored to apply to a limited number of websites, he was "fairly confident that lots of duck-biting plaintiffs will try to capitalize on this opinion and they will find some judges who ignore the philosophical statements and instead turn a decision on the opinion's myriad of ambiguities".[32][35] Over the next several years, a number of cases cited the Ninth Circuit's decision in Roommates.com to limit some of the Section 230 immunity available to websites. Law professor Jeff Kosseff of the United States Naval Academy reviewed 27 cases involving Section 230 immunity concerns in 2015–2016, and found that more than half of them had denied the service provider immunity, in contrast to a similar study he had performed covering 2001 to 2002, in which a majority of cases granted the website immunity; Kosseff asserted that the Roommates.com decision was the key factor in this change.[32]

Sex trafficking – Backpage.com and FOSTA-SESTA (2012–17)

Around 2001, a University of Pennsylvania paper warned that "online sexual victimization of American children appears to have reached epidemic proportions" due to the allowances granted by Section 230.[36] Over the next decade, advocates against such exploitation, such as the National Center for Missing and Exploited Children and Cook County Sheriff Tom Dart, pressured major websites to block or remove content related to sex trafficking, leading sites like Facebook, MySpace, and Craigslist to pull such content. Because mainstream sites were blocking this content, those that engaged in or profited from trafficking moved to more obscure sites, leading to the creation of sites like Backpage. In addition to keeping this activity out of the public eye, these new sites worked to obscure what trafficking was going on and who was behind it, limiting law enforcement's ability to take action.[36] Backpage and similar sites quickly came under numerous lawsuits from victims of sex traffickers and exploiters for enabling this crime, but courts continually found in favor of Backpage due to Section 230.[37] Attempts to block Backpage from using credit card services, so as to deny it revenue, were also defeated in the courts in January 2017, as Section 230 allowed the site's actions to stand.[38]

Due to numerous complaints from constituents, Congress began an investigation into Backpage and similar sites in January 2017, finding Backpage complicit in aiding and profiting from illegal sex trafficking.[39] Subsequently, Congress introduced the FOSTA-SESTA bills: the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), introduced in the House of Representatives by Ann Wagner (R-MO) in April 2017, and the Stop Enabling Sex Traffickers Act (SESTA), introduced in the Senate by Rob Portman (R-OH) in August 2017. Combined, the FOSTA-SESTA bills amended Section 230 to remove immunity for service providers facing civil claims or criminal charges related to sex trafficking, targeting services that knowingly facilitate or support such trafficking.[40][41] The bill passed both Houses and was signed into law by President Donald Trump on April 11, 2018.[42][43]

The bills were criticized by pro-free speech and pro-Internet groups as a "disguised internet censorship bill" that weakens Section 230 immunity and places unnecessary burdens on Internet companies and intermediaries that handle user-generated content or communications: service providers would be required to proactively take action against sex trafficking activities and would need a "team of lawyers" to evaluate all possible scenarios under state and federal law, which may be financially unfeasible for smaller companies.[44][45][46][47][48] Critics also argued that FOSTA-SESTA did not distinguish between consensual, legal sex offerings and non-consensual ones, and would threaten websites otherwise engaged in legal offerings of sex work with liability.[39] Online sex workers argued that the bill would harm their safety, as the platforms they used for offering and discussing sexual services in a legal manner (as an alternative to street prostitution) had begun to reduce their services or shut down entirely due to the threat of liability under the bill.[49][50]

Debate on protections for social media (2016–present)

Many social media sites, notably the Big Tech companies of Facebook, Google, and Apple, as well as Twitter, have come under scrutiny as a result of alleged Russian interference in the 2016 United States elections, in which Russian agents allegedly used the sites to spread propaganda and fake news to swing the election in favor of Donald Trump. These platforms were also criticized for not taking action against users who used the sites for harassment and hate speech against others. Shortly after the passage of the FOSTA-SESTA acts, some in Congress recognized that additional changes should be made to Section 230 to require service providers to deal with these bad actors, beyond what Section 230 already provided to them.[51]

In 2020, Supreme Court Justice Clarence Thomas issued a statement respecting the denial of certiorari in Malwarebytes, Inc. v. Enigma Software Group USA, LLC, which referenced Judge Robert Katzmann's dissent in Force v. Facebook. He opined that Section 230 had been interpreted too broadly and could be narrowed or eliminated in a future case, which he urged his colleagues to hear.

Courts have…departed from the most natural reading of the text by giving Internet companies immunity for their own content ... Section 230(c)(1) protects a company from publisher liability only when content is ‘provided by another information content provider.’ Nowhere does this provision protect a company that is itself the information content provider.[52][53]

Consequently, in 2023 the Supreme Court agreed to hear two cases considering whether social media companies can be held liable for "aiding and abetting" acts of international terrorism when their recommender systems promote such content.

Numerous experts have suggested that amending Section 230, rather than repealing it entirely, would be the optimal way to improve it.[54] Google's former fraud czar Shuman Ghosemajumder proposed in 2021 that full protections should apply only to unmonetized content, to align platforms' content moderation efforts with their financial incentives and to encourage the use of better technology to achieve the necessary scale.[55] Researchers Marshall Van Alstyne and Michael D. Smith supported this idea of an additional duty-of-care requirement.[56] However, journalist Martin Baron has argued that most of Section 230 is essential for social media companies to exist at all.[57]

Platform neutrality

Some politicians, including Republican senators Ted Cruz (TX) and Josh Hawley (MO), have accused major social networks of displaying a bias against conservative perspectives when moderating content (such as Twitter suspensions).[58][59][60] In a Fox News op-ed, Cruz argued that section 230 should only apply to providers that are politically "neutral", suggesting that a provider "should be considered to be a liable 'publisher or speaker' of user content if they pick and choose what gets published or spoke."[61] Section 230 does not contain any requirements that moderation decisions be neutral.[61] Hawley alleged that section 230 immunity was a "sweetheart deal between big tech and big government".[62][63]

In December 2018, Republican representative Louie Gohmert introduced the Biased Algorithm Deterrence Act (H.R.492), which would remove all section 230 protections for any provider that used filters or any other type of algorithms to display user content when otherwise not directed by a user.[64][65]

In June 2019, Hawley introduced the Ending Support for Internet Censorship Act (S. 1914), that would remove section 230 protections from companies whose services have more than 30 million active monthly users in the U.S. and more than 300 million worldwide, or have over $500 million in annual global revenue, unless they receive a certification from the majority of the Federal Trade Commission that they do not moderate against any political viewpoint, and have not done so in the past 2 years.[66][67]

There has been both criticism and support of the proposed bill from various points on the political spectrum. A poll of more than 1,000 voters gave Senator Hawley's bill a net favorability rating of 29 points among Republicans (53% favor, 24% oppose) and 26 points among Democrats (46% favor, 20% oppose).[68] Some Republicans feared that by adding FTC oversight, the bill would fuel fears of a big government with excessive oversight powers.[69] Nancy Pelosi, the Democratic Speaker of the House, indicated support for Hawley's approach.[70] Senator Lindsey Graham, chairman of the Senate Judiciary Committee, also indicated support for the same approach, saying "he is considering legislation that would require companies to uphold 'best business practices' to maintain their liability shield, subject to periodic review by federal regulators."[71]

Legal experts have criticized the Republicans' push to make Section 230 encompass platform neutrality. Wyden stated in response to potential law changes that "Section 230 is not about neutrality. Period. Full stop. 230 is all about letting private companies make their own decisions to leave up some content and take other content down."[72] Kosseff has stated that the Republican intentions are based on a "fundamental misunderstanding" of Section 230's purpose, as platform neutrality was not one of the considerations made at the time of passage.[73] Kosseff stated that political neutrality was not the intent of Section 230 according to the framers, but rather making sure providers had the ability to make content-removal judgement without fear of liability.[20] There have been concerns that any attempt to weaken Section 230 could actually cause an increase in censorship when services lose their exemption from liability.[63][74]

Attempts to win damages from tech companies in court for apparent anti-conservative bias, arguing against Section 230 protections, have generally failed. A lawsuit brought by the non-profit Freedom's Watch in 2018 against Google, Facebook, Twitter, and Apple, alleging antitrust violations for using their positions to effect anti-conservative censorship, was dismissed by the D.C. Circuit Court of Appeals in May 2020, with the judges ruling that censorship claims apply only to First Amendment rights blocked by the government, not by private entities.[75]

Hate speech

In the wake of the 2019 shootings in Christchurch, New Zealand; El Paso, Texas; and Dayton, Ohio, questions were raised about Section 230 and liability for online hate speech. In both the Christchurch and El Paso shootings, the perpetrators posted hate-speech manifestos to 8chan, a loosely moderated imageboard known to be favorable to the posting of extreme views. Concerned politicians and citizens called on large tech companies to remove hate speech from the Internet; however, hate speech is generally protected under the First Amendment, and Section 230 removes tech companies' liability for decisions about moderating such content as long as it is not illegal. This has given the appearance that tech companies need not be proactive against hateful content, allowing such content to proliferate online and lead to such incidents.[76][24]

Notable articles on these concerns were published after the El Paso shooting by The New York Times,[76] The Wall Street Journal,[77] and Bloomberg Businessweek,[24] among other outlets, but these were criticized by legal experts including Mike Godwin, Mark Lemley, and David Kaye, as the articles implied that hate speech was protected by Section 230 when it is in fact protected by the First Amendment. The New York Times issued a correction affirming that the First Amendment, not Section 230, protects hate speech.[78][79][80]

Members of Congress have indicated they may pass a law changing how Section 230 applies to hate speech, so as to make tech companies liable for it. Wyden, now a Senator, stated that he intended Section 230 to be both "a sword and a shield" for Internet companies: the "sword" allowing them to remove content they deem inappropriate for their service, and the "shield" helping them keep offensive content off their sites without liability. However, Wyden warned that because tech companies have not been willing to use the sword to remove content, they could be at risk of losing the shield.[76][24] Some have compared Section 230 to the Protection of Lawful Commerce in Arms Act, a law that grants gun manufacturers immunity from certain types of lawsuits when their weapons are used in criminal acts. According to law professor Mary Anne Franks, "They have not only let a lot of bad stuff happen on their platforms, but they've actually decided to profit off of people's bad behavior."[24]

Representative Beto O'Rourke stated his intent, during his 2020 presidential campaign, to introduce sweeping changes to Section 230 that would make Internet companies liable for not proactively taking down hate speech.[81] O'Rourke later dropped out of the race. Fellow candidate and former vice president Joe Biden similarly called for Section 230 protections to be weakened or otherwise "revoked" for "big tech" companies, particularly Facebook, stating in a January 2020 interview with The New York Times that "[Facebook] is not merely an internet company. It is propagating falsehoods they know to be false", and that the U.S. needed to "[set] standards" in the same way that the European Union's General Data Protection Regulation (GDPR) set standards for online privacy.[82][83]

In the aftermath of the Backpage trial and the subsequent passage of FOSTA-SESTA, others have found that Section 230 appears to protect tech companies from liability for content that is otherwise illegal under United States law. Professor Danielle Citron and journalist Benjamin Wittes found that as late as 2018, several groups deemed terrorist organizations by the United States had been able to maintain social media accounts on services run by American companies, despite federal laws that make providing material support to terrorist groups subject to civil and criminal charges.[84] However, case law from the Second Circuit has held that under Section 230, technology companies are generally not liable for civil claims based on terrorism-related content.[85] U.S. Supreme Court Justice Clarence Thomas has stated that Section 230 gives companies too much immunity in these areas, as expressed in several of his dissenting statements on court orders denying certiorari in cases related to Section 230. Thomas believed the Supreme Court needed to review the limits granted by Section 230.[86]

The Supreme Court heard the cases of Gonzalez v. Google LLC and Twitter, Inc. v. Taamneh in the 2022 term. Gonzalez involved Google's liability for YouTube recommendations that appeared to promote recruitment videos for ISIS, which led to the death of a U.S. citizen in a 2015 Paris terrorist attack; Google claimed it was not liable under Section 230's protections.[87] In Taamneh, Twitter had been found potentially liable under the Antiterrorism and Effective Death Penalty Act of 1996 for hosting terrorism-related content from third-party users, despite Section 230's protections.[88] The Supreme Court ruled for both Google and Twitter, asserting that neither company aided or abetted terrorism under existing laws, but did not address the Section 230 question.[89]

Social media algorithms

Many social media sites use in-house algorithmic curation through recommender systems to provide a feed of content to their users based on what each user has previously seen and content similar to it. Such algorithms have been criticized for pushing violent, racist, and misogynist content to users,[90] and for influencing minors to become addicted to social media, affecting their mental health.[91]

Whether Section 230 protects social media firms from liability for what their algorithms produce remains an open question in case law. The Supreme Court considered this question with regard to terrorism content in the aforementioned Gonzalez and Taamneh cases, but neither decision addressed whether Section 230 protects social media firms for the products of their algorithms.[89] In August 2024, the Third Circuit ruled that a lawsuit against TikTok, filed by the parents of a minor who died attempting the "blackout challenge", could proceed; the parents argued that TikTok's algorithmic promotion of the challenge led to their child's death, and the court held that because TikTok curated its algorithm, it is not protected by Section 230.[92] Separately, Ethan Zuckerman and the Knight First Amendment Institute at Columbia filed a lawsuit against Facebook, arguing that if the company claims its algorithm for selecting content on the Facebook feed is protected by Section 230, then users have the right to use third-party tools to customize what the algorithm shows them and block unwanted content.[93]

2020 Department of Justice review

In February 2020, the United States Department of Justice held a workshop related to Section 230 as part of an ongoing antitrust probe into "big tech" companies. Attorney General William Barr said that while Section 230 had been needed to protect the Internet's growth while most companies were not yet established, "No longer are technology companies the underdog upstarts...They have become titans of U.S. industry", and questioned the need for Section 230's broad protections.[94] Barr said that the workshop was not meant to make policy decisions on Section 230 but was part of a "holistic review" of Big Tech, since "not all of the concerns raised about online platforms squarely fall within antitrust", and that the Department of Justice would rather see reform and better incentives for tech companies to improve online content within the scope of Section 230 than change the law directly.[94] Observers of the sessions stated that the talks covered only Big Tech and small sites engaged in revenge porn, harassment, and child sexual abuse, but did not consider much of the intermediate uses of the Internet.[95]

The DOJ issued its four major recommendations to Congress in June 2020 to modify Section 230. These included:[96][97]

  1. Incentivizing platforms to deal with illicit content, including calling out "Bad Samaritans" that solicit illicit activity and removing their immunity, and carving out exemptions in the areas of child abuse, terrorism, and cyber-stalking, as well as when platforms have been notified by courts of illicit material;
  2. Removing protections from civil lawsuits brought by the federal government;
  3. Disallowing Section 230 protections in relationship to antitrust actions on the large Internet platforms; and
  4. Promoting discourse and transparency by defining existing terms in the statute like "otherwise objectionable" and "good faith" with specific language, and requiring platforms to publicly document when they take moderation actions against content unless that may interfere with law enforcement or risk harm to an individual.

Legislation to alter Section 230

In 2020, several bills were introduced through Congress to limit the liability protections that Internet platforms had from Section 230 as a result of events in the preceding years.

EARN IT Act of 2020
In March 2020, a bi-partisan bill known as the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act (S. 3398) was introduced in the Senate, which called for the creation of a 15-member government commission (including administration officials and industry experts) to establish "best practices" for the detection and reporting of child exploitation materials. Internet services would be required to follow these practices; the commission would have the power to penalize those not in compliance, which could include removing their Section 230 protections.[98]
While the bill had bi-partisan support from its sponsors (Lindsey Graham, Josh Hawley, Dianne Feinstein, and Richard Blumenthal) and backing from groups like the National Center for Missing and Exploited Children[99] and the National Center on Sexual Exploitation,[100] the EARN IT Act was criticized by a coalition of 25 organizations,[101][102] as well as by human rights groups including the Electronic Frontier Foundation,[103][104] the American Civil Liberties Union,[105][106] and Human Rights Watch.[107][108] Opponents of the bill argued that the "best practices" would most likely include a backdoor for law enforcement into any encryption used on a site, in addition to the dismantling of Section 230's approach, based on commentary made by members of the federal agencies that would be placed on the commission. For example, Attorney General Barr had extensively argued that the use of end-to-end encryption by online services can obstruct investigations by law enforcement, especially those involving child exploitation, and had pushed for a governmental backdoor into encryption services.[98][109] The senators behind EARN IT stated that there was no intent to bring any such encryption backdoors with the legislation.[110]
Wyden was also critical of the bill, calling it "a transparent and deeply cynical effort by a few well-connected corporations and the Trump administration to use child sexual abuse to their political advantage, the impact to free speech and the security and privacy of every single American be damned."[98][111] Graham stated that the goal of the bill was "to do this in a balanced way that doesn't overly inhibit innovation, but forcibly deals with child exploitation."[112] As an implicit response to EARN IT, Wyden, along with House Representative Anna G. Eshoo, proposed a new bill, the Invest in Child Safety Act, in May 2020, which would give US$5 billion to the Department of Justice to provide additional manpower and tools to address child exploitation directly rather than relying on technology companies to rein in the problem.[113]
The EARN IT Act advanced out of the Senate Judiciary Committee by a unanimous 22-0 vote on July 2, 2020, following an amendment by Lindsey Graham. Graham's amendment removed the legal authority of the proposed federal commission, instead giving a similar authority to each individual state government.[114] The bill was introduced into the House on October 2, 2020.[115]
Limiting Section 230 Immunity to Good Samaritans Act
In June 2020, Hawley and three Republican senators, Marco Rubio, Kelly Loeffler, and Kevin Cramer, called on the FCC to review the protections that Big Tech companies had from Section 230, stating in their letter that "It is time to take a fresh look at Section 230 and to interpret the vague standard of 'good faith' with specific guidelines and direction" due to "a lack of clear rules" and the "judicial expansion" around the statute.[116] Hawley introduced the "Limiting Section 230 Immunity to Good Samaritans Act" in the Senate on June 17, 2020, with co-sponsors Rubio, Mike Braun, and Tom Cotton. The bill would make providers with over 30 million monthly U.S. users and over US$1.5 billion in global revenue liable to lawsuits from users who believed the provider was not uniformly enforcing its content policies; users would be able to seek damages of up to US$5,000 plus lawyers' fees.[117]
Platform Accountability and Consumer Transparency (PACT) Act
A bi-partisan bill introduced by Senators Brian Schatz and John Thune in June 2020, the "Platform Accountability and Consumer Transparency Act" would require Internet platforms to issue public statements on their policies for how they moderate, demonetize, and remove user content, and to publish quarterly reports summarizing their actions and statistics. The bill would also mandate that platforms comply with all court-ordered removals of content deemed illegal within 24 hours. Further, the bill would eliminate platforms' Section 230 protections from federal civil liability in cases brought against them and would enable state attorneys general to enforce actions against platforms. Schatz and Thune characterized their approach as "a scalpel, rather than a jackhammer" in contrast to other options presented to date.[118]
Behavioral Advertising Decisions Are Downgrading Services (BAD ADS) Act
Hawley introduced the Behavioral Advertising Decisions Are Downgrading Services Act in July 2020, which would remove Section 230 protections for larger service providers (30 million users in the U.S. or 300 million globally and with more than US$1.5 billion in annual revenue) if their sites used behavioral advertising, with ads tailored to the users of the sites based on how the users had engaged with the site or where they were located. Hawley had spoken out against such ad practices and had previously tried to add legislation to require service providers to add "do not track" functionality for Internet ads.[119]
Online Freedom and Viewpoint Diversity Act
Republican Senators Lindsey Graham, Roger Wicker, and Marsha Blackburn introduced the Online Freedom and Viewpoint Diversity Act in September 2020. The bill, if passed, would strip Section 230 liability protection from sites that fail to give a reason for actions taken in moderating or restricting content, and would require sites to have an "objectively reasonable belief" that the content violated their terms of service; sites failing these requirements could be penalized. The bill would also replace the vague "otherwise objectionable" term in Section 230(c)(2) with more specific categories, such as "unlawful" material, for which a website would not become liable for taking steps to moderate content.[120]
Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act (SAFE TECH act)
Democratic Senators Mark Warner, Mazie Hirono, and Amy Klobuchar introduced the SAFE TECH Act in February 2021. The bill has multiple parts. It would first alter §230(c)(1) to cover only "speech" rather than "information", making providers liable for illegal speech. It would also remove the Good Samaritan immunity from federal and state civil rights laws, antitrust laws, cyberstalking laws, human rights laws, and civil actions regarding a wrongful death. Further, it would eliminate the immunity for any speech the provider was paid to carry, such as advertising or marketplace listings. Finally, it would force providers to comply with court orders for removal of material related to the prior areas.[121]

Executive Order on Preventing Online Censorship

President Donald Trump, during his first administration, was a major proponent of limiting the protections of technology and media companies under Section 230 due to claims of an anti-conservative bias. In July 2019, Trump held a "Social Media Summit" that he used to criticize how Twitter, Facebook, and Google handled conservative voices on their platforms. During the summit, Trump warned that he would seek "all regulatory and legislative solutions to protect free speech".[122]

The two tweets on May 26, 2020, from President Trump that Twitter had marked "potentially misleading" (inserting the blue warning icon and "Get the facts..." language) that led to the executive order

In late May 2020, President Trump stated that mail-in voting would lead to massive fraud, pushing back in both his public speeches and on his social media accounts against the use of mail-in voting in the upcoming 2020 primary elections due to the COVID-19 pandemic. In a Twitter message on May 26, 2020, he stated, "There is NO WAY (ZERO!) that Mail-In Ballots will be anything less than substantially fraudulent." Shortly after its posting, Twitter moderators marked the message with a "potentially misleading" warning (a process introduced earlier that month primarily in response to misinformation about the COVID-19 pandemic),[123] linking readers to a page on its site that provided analysis and fact-checks of Trump's statement from media sources like CNN and The Washington Post; it was the first time the process had been used on Trump's messages.[124] Jack Dorsey, Twitter's then CEO, defended the moderation, stating that the company was not acting as an "arbitrator of truth" but that "Our intention is to connect the dots of conflicting statements and show the information in dispute so people can judge for themselves."[125] Trump was angered by this and shortly afterwards threatened to take action to "strongly regulate" technology companies, asserting that they were suppressing conservative voices.[126]

Trump signs an executive order on "Preventing Online Censorship" on May 28, 2020.

On May 28, 2020, Trump signed the "Executive Order on Preventing Online Censorship" (EO 13925), an executive order directing regulatory action at Section 230.[127] In a press conference before signing, Trump stated his rationale: "A small handful of social media monopolies controls a vast portion of all public and private communications in the United States. They've had unchecked power to censor, restrict, edit, shape, hide, alter, virtually any form of communication between private citizens and large public audiences."[128] The EO asserts that media companies that edit content, apart from restricting posts that are violent, obscene, or harassing as outlined in the "Good Samaritan" clause §230(c)(2), are "engaged in editorial conduct" and may forfeit the safe-harbor protection granted in §230(c)(1).[129] The EO thus specifically targets the "Good Samaritan" clause and media companies' decisions to remove offensive material "in good faith". Courts have interpreted the "in good faith" portion of the statute based on its plain language; the EO purports to establish conditions under which that good faith may be revoked, such as when media companies have shown bias in how they remove material from their platforms. The goal of the EO is to strip the Section 230 protections from such platforms, leaving them liable for content.[130] Whether a media platform shows bias would be determined through a rulemaking process set by the Federal Communications Commission in consultation with the Commerce Department, the National Telecommunications and Information Administration (NTIA), and the Attorney General, while the Justice Department and state attorneys general would handle disputes related to bias and report them to the Federal Trade Commission, which would determine whether a federal lawsuit should be filed. Additional provisions prevent government agencies from advertising on media company platforms demonstrated to have such bias.[128]


The EO came under intense criticism and legal analysis after its announcement.[131] Senator Wyden stated that the EO was a "mugging of the First Amendment" and that, while there does need to be a thoughtful debate about modern considerations for Section 230, the political spat between Trump and Twitter is not such a consideration.[132] Professor Kate Klonick of St. John's University School of Law in New York considered the EO "political theater" without any weight of authority.[130] The Electronic Frontier Foundation's Aaron Mackey stated that the EO rests on a flawed reading that links sections §230(c)(1) and §230(c)(2), which were not written to be linked and have been treated by case law as independent provisions of the statute, and thus "has no legal merit".[129]

By happenstance, the EO was signed on the same day that riots erupted in Minneapolis, Minnesota, in the wake of the murder of George Floyd, an African-American man who died during an arrest involving four officers of the Minneapolis Police Department. Trump tweeted about his conversation with Minnesota's governor Tim Walz on bringing in the National Guard to stop the riots, but concluded with the statement, "Any difficulty and we will assume control but, when the looting starts, the shooting starts", a phrase attributed to Miami Police Chief Walter E. Headley, who used it in 1967 in reference to violent riots.[133][134] After internal review, Twitter marked the message with a "public interest notice", deeming that it "glorified violence", content it would normally remove for violating the site's terms, but stated to journalists that they "have kept the Tweet on Twitter because it is important that the public still be able to see the Tweet given its relevance to ongoing matters of public importance."[135] Following Twitter's marking of his May 28 tweet, Trump said in another tweet that due to Twitter's actions, "Section 230 should be revoked by Congress. Until then, it will be regulated!"[136]

On June 2, 2020, the Center for Democracy & Technology filed a lawsuit in the United States District Court for the District of Columbia seeking preliminary and permanent injunctions against enforcement of the EO, asserting that the EO created a chilling effect on free speech by putting all hosts of third-party content "on notice that content moderation decisions with which the government disagrees could produce penalties and retributive actions, including stripping them of Section 230's protections".[137]

The Secretary of Commerce, via the NTIA, sent a petition with a proposed rule to the FCC on July 27, 2020, as the first stage of executing the EO.[138][139] On October 15, 2020, FCC chair Ajit Pai stated that, after the Commission had reviewed its authority over Section 230, the FCC would proceed with putting forth proposed rules to clarify the statute.[140] Pai's announcement, which came shortly after Trump again called for Section 230 revisions after asserting that Big Tech was purposely suppressing reporting on leaked documents concerning Hunter Biden, Joe Biden's son, was criticized by the Democratic FCC commissioners Geoffrey Starks and Jessica Rosenworcel and by the tech industry, with Rosenworcel stating, "The FCC has no business being the president's speech police."[141][142]

A second lawsuit against the EO was filed by activist groups including Rock the Vote and Free Press on August 27, 2020, after Twitter had flagged another of Trump's tweets for misinformation related to mail-in voting fraud. The lawsuit argued that if the EO were enforced, Twitter would be unable to fact-check tweets like Trump's as misleading, allowing the President or other government officials to intentionally distribute misinformation to citizens.[143]

President Biden rescinded the EO on May 14, 2021, along with several of Trump's other orders.[144]

Subsequent events

Following the November 2020 election, Trump made numerous claims on his social media accounts contesting the results, including claims of fraud. Twitter and other social media companies marked these posts as potentially misleading, as they had done with his previous posts. In response, Trump threatened to veto the defense spending bill for 2021 if it did not contain language repealing Section 230.[145] Trump made good on his threat, vetoing the spending bill on December 23, 2020, in part for not containing a repeal of Section 230.[146] The House voted to override the veto on December 28, 322–87, sending the bill to the Senate. The Senate similarly voted to override the veto on January 1, 2021, without adding any Section 230 provisions.[147]

During this period, Trump urged Congress to expand the COVID-19 relief payments in the Consolidated Appropriations Act, 2021, which he had signed into law on December 27, 2020, but also stated that Congress should address the Section 230 repeal and other matters left out of the defense bill. Senate majority leader Mitch McConnell stated on December 28 that he would bring legislation later that week combining the expanded COVID-19 relief with legislation dealing with Section 230, as outlined by Trump.[148] Ultimately, no additional legislation was introduced.

In the wake of the United States Capitol attack on January 6, 2021, Pai stated that he would not seek any Section 230 reform before his planned departure from office on January 20, 2021. Pai attributed this mostly to the lack of time to implement such rulemaking before his departure, but also said that he would not "second-guess those decisions" of social media networks, made under Section 230, to block some of Trump's messages from January 6 that contributed to the violence.[149] In the days that followed, Twitter, Facebook, and other social media services blocked or banned Trump's accounts, claiming his speech during and after the riot incited further violence. These actions were supported by some politicians but led to renewed calls by Democratic leaders to reconsider Section 230, as they believed the law had led the companies to fail to take preemptive action against the people who planned and executed the Capitol riots.[150][151] Separately, Trump filed class-action lawsuits against Twitter, Facebook, and YouTube in July 2021 over his bans from the January 2021 period, claiming their actions were unjustifiable and that Section 230 was unconstitutional.[152] Trump's lawsuit against Twitter was dismissed by a federal judge in May 2022, who stated that the suit's First Amendment claims were not actionable against non-government entities like Twitter, though he allowed an amended complaint to be filed.[153]

In March 2021, Facebook's Mark Zuckerberg, Alphabet's Sundar Pichai, and Twitter's Jack Dorsey were asked to testify before the House Committee on Energy and Commerce on the role of social media in promoting extremism and misinformation following the 2020 election, with Section 230 expected to be a topic. Prior to the hearing, Zuckerberg proposed an alternative change to Section 230 compared to previously proposed bills. Zuckerberg stated that it would be costly and impractical for social media companies to police all problematic material, and that it would be better to tie Section 230 liability protection to companies demonstrating that they have mechanisms in place to remove such material once it is identified.[154]

In July 2021, Democratic senators Amy Klobuchar and Ben Ray Luján introduced the Health Misinformation Act, which is intended primarily to combat COVID-19 misinformation. It would add a carveout to Section 230 to make companies liable for the publication of "health misinformation" during a "public health emergency" — as established by the Department of Health and Human Services — if the content is promoted to users via algorithmic decisions.[155]

Justice Against Malicious Algorithms Act

Following Frances Haugen's testimony to Congress related to her whistleblowing on Facebook's internal handling of content, House Democrats Anna Eshoo, Frank Pallone Jr., Mike Doyle, and Jan Schakowsky introduced the "Justice Against Malicious Algorithms Act" (H.R. 5596) in October 2021. The bill would remove Section 230 protections for service providers whose personalized recommendation algorithms knowingly or recklessly deliver content that contributes to physical or severe emotional injury.[156] "The last few years have proven that the more outrageous and extremist content social media platforms promote, the more engagement and advertising dollars they rake in," said Representative Frank Pallone Jr., the chairman of the Energy and Commerce Committee. "By now it's painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it's a question of how best to do it," he added.[157]

State laws

Florida

The state of Florida (predominantly Republican after the 2020 election) passed its "deplatforming" Senate Bill 7072 in May 2021; the bill had been proposed in February 2021 after Trump was banned from several social media sites. SB 7072 prevents social media companies from knowingly blocking or banning politicians, and grants the Florida Elections Commission the ability to fine these companies for knowing violations, with fines as high as $250,000 per day for state-level politicians.

The bill exempts companies that own theme parks or other large venues within the state, a carve-out covering companies such as Disney, whose parks provide significant tax revenue to the state.[158] The Computer & Communications Industry Association (CCIA) and NetChoice filed suit against the state to block enforcement of the law in NetChoice v. Moody, asserting that the law violated the First Amendment rights of private companies.[159] Judge Robert Lewis Hinkle of the United States District Court for the Northern District of Florida issued a preliminary injunction against the law on June 30, 2021, stating that "The legislation now at issue was an effort to rein in social-media providers deemed too large and too liberal. Balancing the exchange of ideas among private speakers is not a legitimate governmental interest", and further that the law "discriminates on its face among otherwise identical speakers".[160]

Texas

Texas H.B. 20, enacted in September 2021, was intended to prevent large social media providers from banning or demonetizing users based on the user's viewpoint, including views expressed outside of the platform, and to increase transparency in how these providers moderate content.[161] The CCIA and NetChoice filed suit to prevent enforcement of the law in NetChoice v. Paxton. A federal district judge enjoined the law in December 2021, stating that the law's "prohibitions on 'censorship' and constraints on how social media platforms disseminate content violate the First Amendment".[162] However, the Fifth Circuit reversed the injunction in a 2–1 order, without ruling on the merits of the case, in May 2022, effectively allowing the Texas law to come into effect.[163] The CCIA and NetChoice appealed the Fifth Circuit decision directly to the U.S. Supreme Court, seeking an emergency injunction to block the law. They argued that regulations on how social media platforms moderate users' content may prevent them from moderating at all in certain situations and thus force them to publish material they find objectionable, an outcome that would violate the platforms' First Amendment rights.[164]

On May 31, 2022, the Supreme Court restored the injunction by court order (with four Justices, Samuel Alito, Clarence Thomas, Elena Kagan, and Neil Gorsuch, dissenting) while lower court litigation continued.[165] The Fifth Circuit reversed the district court ruling in September 2022, with Judge Andy Oldham stating in the majority opinion, "Today we reject the idea that corporations have a freewheeling First Amendment right to censor what people say."[166] The decision created a circuit split with the potential to be heard by the Supreme Court.[166] The Fifth Circuit agreed to stay enforcement of the law in October 2022 while several tech companies petitioned the Supreme Court to hear the case.[167]

Both the Florida and Texas law cases were heard by the Supreme Court, which ruled in July 2024 to vacate and remand both circuit court decisions because the lower courts had failed to evaluate the laws across all aspects of the social media sites rather than only the specific functions targeted by the laws.[168]

California

Case law

Numerous cases involving Section 230 have been heard in the judiciary since its introduction, many of which are rote applications of Section 230.

The following is a partial list of legal cases that have influenced the interpretation of Section 230 in subsequent cases or have led to new legislation around Section 230.

Defamatory information

Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997).[169]
Immunity was upheld against claims that AOL unreasonably delayed in removing defamatory messages posted by third party, failed to post retractions, and failed to screen for similar postings.
Blumenthal v. Drudge, 992 F. Supp. 44, 49–53 (D.D.C. 1998).[170]
The court upheld AOL's immunity from liability for defamation. AOL's agreement with the contractor allowing AOL to modify or remove such content did not make AOL the "information content provider" because the content was created by an independent contractor. The Court noted that Congress made a policy choice by "providing immunity even where the interactive service provider has an active, even aggressive role in making available content prepared by others."
Carafano v. Metrosplash.com, 339 F.3d 1119 (9th Cir. 2003).[171]
The court upheld immunity for an Internet dating service provider from liability stemming from third party's submission of a false profile. The plaintiff, Carafano, claimed the false profile defamed her, but because the content was created by a third party, the website was immune, even though it had provided multiple choice selections to aid profile creation.
Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).[172]
Immunity was upheld for a website operator for distributing an email to a listserv where the plaintiff claimed the email was defamatory. Though there was a question as to whether the information provider intended to send the email to the listserv, the Court decided that for determining the liability of the service provider, "the focus should be not on the information provider's intentions or knowledge when transmitting content but, instead, on the service provider's or user's reasonable perception of those intentions or knowledge." The Court found immunity proper "under circumstances in which a reasonable person in the position of the service provider or user would conclude that the information was provided for publication on the Internet or other 'interactive computer service'."
Green v. AOL, 318 F.3d 465 (3rd Cir. 2003).[173]
The court upheld immunity for AOL against allegations of negligence. Green claimed AOL failed to adequately police its services and allowed third parties to defame him and inflict intentional emotional distress. The court rejected these arguments because holding AOL negligent in promulgating harmful content would be equivalent to holding AOL "liable for decisions relating to the monitoring, screening, and deletion of content from its network -- actions quintessentially related to a publisher's role."
Barrett v. Rosenthal, 40 Cal. 4th 33 (2006).[174]
Immunity was upheld for an individual internet user from liability for republication of defamatory statements on a listserv. The court found the defendant to be a "user of interactive computer services" and thus immune from liability for posting information passed to her by the author.
MCW, Inc. v. badbusinessbureau.com (RipOff Report/Ed Magedson/XCENTRIC Ventures LLC), 2004 WL 833595, No. Civ.A.3:02-CV-2727-G (N.D. Tex. April 19, 2004).[175]
The court rejected the defendant's motion to dismiss on the grounds of Section 230 immunity, ruling that the plaintiff's allegations that the defendants wrote disparaging report titles and headings, and themselves wrote disparaging editorial messages about the plaintiff, rendered them information content providers. The Web site, www.badbusinessbureau.com, allows users to upload "reports" containing complaints about businesses they have dealt with.
Hy Cite Corp. v. badbusinessbureau.com (RipOff Report/Ed Magedson/XCENTRIC Ventures LLC), 418 F. Supp. 2d 1142 (D. Ariz. 2005).[176]
The court rejected immunity and found the defendant was an "information content provider" under Section 230 using much of the same reasoning as the MCW case.
Barnes v. Yahoo!, Inc. 570 F.3d 1096 (9th Cir. 2009)
The court rejected immunity for a promissory estoppel claim related to third-party content for which the defendant was otherwise immune; in this case, Yahoo! had promised to remove nude photos of the plaintiff placed maliciously on the site by an ex-partner but had failed to do so. The Ninth Circuit held that while Section 230 barred claims treating Yahoo! as the "publisher or speaker" of the photos, the promissory estoppel claim arose from Yahoo!'s promise rather than from its role as a publisher, and so could proceed.[32]

False information

Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 830 (2002).[177]
eBay's immunity was upheld for claims based on forged autograph sports items purchased on the auction site.
Ben Ezra, Weinstein & Co. v. America Online, 206 F.3d 980, 984–985 (10th Cir. 2000), cert. denied, 531 U.S. 824 (2000).[178]
Immunity for AOL was upheld against liability for a user's posting of incorrect stock information.
Goddard v. Google, Inc., C 08-2738 JF (PVT), 2008 WL 5245490, 2008 U.S. Dist. LEXIS 101890 (N.D. Cal. December 17, 2008).[179]
Immunity was upheld against claims of fraud and money laundering. Google was not responsible for misleading advertising created by third parties who bought space on Google's pages. The court found the creative pleading of money laundering did not cause the case to fall into the crime exception to Section 230 immunity.
Milgram v. Orbitz, ESX-C-142-09 (N.J. Super. Ct. August 26, 2010).[180]
Immunity for Orbitz and CheapTickets was upheld for claims based on fraudulent ticket listings entered by third parties on ticket resale marketplaces.
Herrick v. Grindr, 765 F. App'x 586 (2nd Cir. 2019).
The Second Circuit upheld Section 230 immunity for Grindr, a dating app for LGBT persons, regarding the misuse of false profiles created in the name of a real person. After the plaintiff broke up with his boyfriend, the ex-boyfriend created multiple false Grindr profiles presenting the plaintiff's real-life identity and address, advertising him as available for sexual encounters and as having illegal drugs for sale. The plaintiff reported that over a thousand men came to his house for sex and drugs based on communications with the fake profiles, and he began to fear for his safety. He sued Grindr for not blocking the false profiles after multiple requests. Grindr asserted that Section 230 shielded it from liability for the ex-boyfriend's actions, and both the district court and the Second Circuit agreed.[181][182]

Sexually explicit content and minors

Doe v. America Online, 783 So. 2d 1010, 1013–1017 (Fla. 2001),[183] cert. denied, 122 S.Ct. 208 (2001).
The court upheld immunity against state claims of negligence based on "chat room marketing" of obscene photographs of a minor by a third party.
Kathleen R. v. City of Livermore, 87 Cal. App. 4th 684, 692 (2001).[184]
The California Court of Appeal upheld the immunity of a city from claims of waste of public funds, nuisance, premises liability, and denial of substantive due process. The plaintiff's child downloaded pornography from a public library's computers, which did not restrict access to minors. The court found the library was not responsible for the content of the internet and explicitly found that section 230(c)(1) immunity covers governmental entities and taxpayer causes of action.
Doe v. MySpace, 528 F.3d 413 (5th Cir. 2008).[185]
The court upheld immunity for a social networking site from negligence and gross negligence liability for failing to institute safety measures to protect minors and failure to institute policies relating to age verification. The Does' daughter had lied about her age and communicated over MySpace with a man who later sexually assaulted her. In the court's view, the Does' allegations were "merely another way of claiming that MySpace was liable for publishing the communications."
Dart v. Craigslist, Inc., 665 F. Supp. 2d 961 (N.D. Ill. October 20, 2009).[186]
The court upheld immunity for Craigslist against a county sheriff's claims that its "erotic services" section constituted a public nuisance because it caused or induced prostitution.
Backpage.com v. McKenna, et al., CASE NO. C12-954-RSM[187]
In July 2012, the United States District Court for the Western District of Washington at Seattle granted Backpage's motion for a preliminary injunction precluding enforcement of Washington State SB 6251, which made it a felony to publish or cause to be published "any advertisement for a commercial sex act...that includes the depiction of a minor."[188] In granting the motion, the court found the plaintiffs were likely to succeed on their claim that SB 6251 is preempted by federal law, because it is likely expressly preempted by Section 230 and likely conflicts with federal law.[188][189]
Backpage.com LLC v Cooper, Case # 12-cv-00654[190]
Backpage.com LLC v Hoffman et al., Civil Action No. 13-cv-03952 (DMC) (JAD)[191]
The court upheld immunity for Backpage in contesting a Washington state law (SB 6251)[192] that would have made providers of third-party content online liable for any crimes related to a minor in Washington state.[193] Tennessee and New Jersey later passed similar legislation. Backpage argued that the laws violated Section 230, the Commerce Clause of the United States Constitution, and the First and Fifth Amendments.[192] In all three cases the courts granted Backpage permanent injunctive relief and awarded it attorney's fees.[190][194][195][196][197]
Backpage.com v. Dart., CASE NO. 15-3047[198]
The court ruled in favor of Backpage after Sheriff Tom Dart of Cook County, Illinois, a frequent critic of Backpage and its adult postings section, sent a letter on his official stationery to Visa and MasterCard demanding that the firms "immediately cease and desist" allowing their credit cards to be used to purchase ads on Backpage. Within two days both companies withdrew their services from Backpage.[199] Backpage filed a lawsuit seeking a temporary restraining order and preliminary injunction against Dart to restore the status quo that existed before Dart sent the letter, alleging that Dart's actions were unconstitutional under the First and Fourteenth Amendments to the US Constitution as well as Section 230 of the CDA, and asking that Dart retract his "cease and desist" letters.[200] After a lower court initially denied injunctive relief,[201][202] the Seventh Circuit U.S. Court of Appeals reversed and directed that a permanent injunction be issued enjoining Dart and his office from taking any actions "to coerce or threaten credit card companies...with sanctions intended to ban credit card or other financial services from being provided to Backpage.com."[203] The court cited Section 230 as part of its decision, and the Supreme Court declined to hear the petition in the case. This decision, however, contributed in part to the passage of the FOSTA-SESTA Acts, and subsequently to the dismissal of Backpage's case after federal enforcement agencies seized Backpage's assets for violating FOSTA-SESTA.[204]

Discriminatory housing ads

Chicago Lawyers' Committee For Civil Rights Under Law v. Craigslist, 519 F.3d 666 (7th Cir. 2008).[205]
The court upheld immunity for Craigslist against Fair Housing Act claims based on discriminatory statements in postings on the classifieds website by third party users.
Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc).[206]
The Ninth Circuit Court of Appeals rejected immunity for the Roommates.com roommate-matching service against discrimination claims brought under the federal Fair Housing Act[207] and California housing discrimination laws.[208] The court concluded that because of the manner in which the service elicited required information from users concerning their roommate preferences (through dropdowns specifying gender, presence of children, and sexual orientation), and the manner in which it used that information to generate roommate matches (by eliminating profiles that did not match user specifications), the service was an "information content provider" and thus liable for the discrimination claims. The court upheld immunity for the descriptions posted by users in the "Additional Comments" section because these were entirely created by users.[32]

Threats

Delfino v. Agilent Technologies, 145 Cal. App. 4th 790 (2006), cert denied, 128 S. Ct. 98 (2007).
A California Appellate Court unanimously upheld immunity from state tort claims arising from an employee's use of the employer's e-mail system to send threatening messages. The court concluded that an employer that provides Internet access to its employees qualifies as a "provider ... of an interactive service."

Failure to warn

Jane Doe No. 14 v. Internet Brands, Inc., No. 12-56638 (9th Cir. September 17, 2014).
The Ninth Circuit Court of Appeals rejected immunity for claims of negligence under California law.
Doe's complaint alleged that Internet Brands had failed to warn her of a known rape scheme despite her being a member of its ModelMayhem.com site, and that the company had the knowledge needed to prevent future victimization of ModelMayhem.com users by warning them of online sexual predators. The Ninth Circuit Court of Appeals concluded that the Communications Decency Act did not bar the claim and remanded the case to the district court for further proceedings.
In February 2015, the Ninth Circuit panel set aside its 2014 opinion and set the case for reargument. In May 2016, the panel again held that Doe's case could proceed.[209][210]
Anderson v. TikTok, Inc., No. 22-3061 (3rd Cir. August 27, 2024).
In May 2022, Anderson, the mother of a 10-year-old girl from Pennsylvania who died while attempting the Blackout Challenge she had seen on TikTok, filed a lawsuit against TikTok in the United States District Court for the Eastern District of Pennsylvania.[213] The district court upheld immunity for TikTok and dismissed the complaint in October 2022.[214]
In August 2024, the Third Circuit Court of Appeals reversed and remanded, ruling that Section 230 immunizes only information provided by third parties, not recommendations made by TikTok's algorithm.[211][212] The court held that TikTok's algorithm, which recommended the Blackout Challenge to the girl, was TikTok's own "expressive activity" and thus its first-party speech, and that Section 230 does not bar Anderson's claims based on that first-party speech.[212]

Terrorism

Force v. Facebook Inc., 934 F.3d 53 (2nd Cir. 2019).
The Second Circuit upheld immunity from civil claims for service providers hosting terrorism-related content created by users. Families, friends, and associates of several people killed in Hamas attacks sued Facebook under the United States' Anti-Terrorism Act, asserting that because Hamas members used Facebook to coordinate activities, Facebook was liable for that content. While previous rulings at the federal district and circuit level had generally gone against such claims, this Second Circuit decision was the first to hold that Section 230 applies even to terrorism-related acts posted by users of service providers, and the suit against Facebook was dismissed. The Second Circuit ruled that the algorithms Facebook used for its recommender system remained part of its role as a distributor of the content rather than a publisher, since these automated tools were essentially neutral.[85] The Supreme Court declined to hear the case.[215]
Chief Judge Robert Katzmann wrote a 35-page dissenting opinion in the Force case, which Justice Clarence Thomas cited in a 2020 statement suggesting that his colleagues reconsider the scope of the immunity granted under Section 230.[157]
Gonzalez v. Google LLC (2023) and Twitter, Inc. v. Taamneh (2023)
Both cases were decided by the Supreme Court in May 2023. In Gonzalez, Google, as YouTube's owner, was accused of aiding terrorism through YouTube's recommendation algorithm, for which Google claimed immunity via Section 230.[216] In Twitter, Twitter and other social media companies were alleged to have assisted terrorist activity under the Antiterrorism and Effective Death Penalty Act of 1996, as amended by the Justice Against Sponsors of Terrorism Act; Twitter argued for Section 230 immunity, but the lower courts did not consider that question.[88] In the Twitter decision, a unanimous Court ruled that there were no actionable charges against Twitter under the Antiterrorism Act because the plaintiff family had failed to state a claim for relief. The Court then remanded Gonzalez by per curiam order for the lower courts to review the case in light of the Twitter decision. These decisions effectively avoided addressing any Section 230 questions.[217]

Similar legislation in other countries

European Union

Directive 2000/31/EC,[218] the e-Commerce Directive, establishes a safe harbor regime for hosting providers:

  • Article 14 establishes that hosting providers are not responsible for the content they host as long as (1) the acts in question are neutral intermediary acts of a mere technical, automatic and passive capacity; (2) they are not informed of its illegal character, and (3) they act promptly to remove or disable access to the material when informed of it.
  • Article 15 precludes member states from imposing general obligations to monitor hosted content for potential illegal activities.

The updated Directive on Copyright in the Digital Single Market (Directive 2019/790) Article 17 makes providers liable if they fail to take "effective and proportionate measures" to prevent users from uploading certain copyright violations and do not respond immediately to takedown requests.[219]

Australia

In Dow Jones & Company Inc v Gutnick,[220] the High Court of Australia treated defamatory material on a server outside Australia as having been published in Australia when it is downloaded or read by someone in Australia.

Gorton v Australian Broadcasting Commission & Anor (1973) 1 ACTR 6

Under the Defamation Act 2005 (NSW),[221] s 32, a defence to defamation is that the defendant neither knew, nor ought reasonably to have known of the defamation, and the lack of knowledge was not due to the defendant's negligence.

Italy

The Electronic Commerce Directive 2000[218] (e-Commerce Directive) was implemented in Italy by Legislative Decree No. 70 of 2003. The Italian provisions are substantially in line with those at the EU level. Initially, however, Italian case-law drew a line between so-called "active" hosting providers and "passive" Internet Service Providers, holding that "active" providers could not benefit from the liability exception provided by Legislative Decree No. 70. Under that case-law, an ISP is deemed active whenever it carries out operations on the content provided by the user, such as modifying or enriching it. In some cases, courts held ISPs liable for users' content merely because the ISP had somehow organised or enriched that content (e.g., by arranging it into libraries or categories) or monetised it by showing ads.

New Zealand

Failing to investigate the material or to make inquiries of the user concerned may amount to negligence in this context: Jensen v Clark [1982] 2 NZLR 268.

France

Directive 2000/31/EC was transposed into French law as the LCEN. Article 6 of the law establishes a safe harbor for hosting providers as long as they follow certain rules.

In LICRA vs. Yahoo!, the High Court ordered Yahoo! to take affirmative steps to filter out Nazi memorabilia from its auction site. Yahoo!, Inc. and its then president Timothy Koogle were also criminally charged, but acquitted.

Germany

In 1997, Felix Somm, the former managing director of CompuServe Germany, was charged with violating German child pornography laws because of the material CompuServe's network was carrying into Germany. He was convicted and sentenced to two years' probation on May 28, 1998.[222][223] He was cleared on appeal on November 17, 1999.[224][225]

The Oberlandesgericht (OLG) Cologne, an appellate court, found that an online auctioneer does not have an active duty to check for counterfeit goods (Az 6 U 12/01).[226]

In one example, the first-instance district court of Hamburg issued a temporary restraining order requiring message board operator Universal Boards to review all comments before they can be posted to prevent the publication of messages inciting others to download harmful files. The court reasoned that "the publishing house must be held liable for spreading such material in the forum, regardless of whether it was aware of the content."[227]

United Kingdom

The laws of libel and defamation will treat a disseminator of information as having "published" material posted by a user, and the onus will then be on a defendant to prove that it did not know the publication was defamatory and was not negligent in failing to know: Goldsmith v Sperrings Ltd (1977) 2 All ER 566; Vizetelly v Mudie's Select Library Ltd (1900) 2 QB 170; Emmens v Pottle & Ors (1885) 16 QBD 354.

In an action against a website operator over a statement posted on the website, it is a defence to show that it was not the operator who posted the statement (Defamation Act 2013, s 5). The defence is defeated if it was not possible for the claimant to identify the person who posted the statement, the claimant gave the operator a notice of complaint, and the operator failed to respond in accordance with regulations.

Notes

  1. ^ Section 230 is commonly mislabeled as "Section 230 of the Communications Decency Act." It was the ninth section of the Communications Decency Act, and the 509th section of the Telecommunications Act of 1996; formally, Section 230 is an amendment to the Communications Act of 1934 codified as Section 230 of Title 47 of the U.S. Code.[1]
  2. ^ The details of the Stratton Oakmont case would later serve as the basis for the book The Wolf of Wall Street and its film adaptation.

References

  1. ^ Brannon, Valerie C. (June 6, 2019). "Liability for Content Hosts: An Overview of the Communication Decency Act's Section 230" (PDF). Congressional Research Service. Archived (PDF) from the original on January 11, 2021. Retrieved September 5, 2020.
  2. ^ a b Grossman, Wendy M. "The Twenty-Six Words that Created the Internet, book review: The biography of a law". ZDNet. Archived from the original on January 12, 2021. Retrieved September 4, 2020.
  3. ^ "Trump's Executive Order: What to Know About Section 230". Council on Foreign Relations. Archived from the original on November 20, 2020. Retrieved September 4, 2020.
  4. ^ "What You Should Know About Section 230, the Rule That Shaped Today's Internet". PBS NewsHour. https://www.pbs.org/newshour/politics/what-you-should-know-about-section-230-the-rule-that-shaped-todays-internet
  5. ^ Ruane, Kathleen Ann (February 21, 2018). "How Broad A Shield? A Brief Overview of Section 230 of the Communications Decency Act" (PDF). Congressional Research Service. Archived (PDF) from the original on March 6, 2021. Retrieved August 12, 2019.
  6. ^ See Gucci America, Inc. v. Hall & Associates, 135 F. Supp. 2d 409 (S.D.N.Y. 2001) (no immunity for contributory liability for trademark infringement).
  7. ^ Backpage.com, LLC v. McKenna, 881 F. Supp.2d 1262 (W.D. Wash. 2012).
  8. ^ Voicenet Commc'ns, Inc. v. Corbett, 2006 WL 2506318, 4 (E.D.Pa. August 30, 2006).
  9. ^ Barnes v. Yahoo!, Inc. , 570 F.3d 1096 at 1102 (9th Cir. 2009).
  10. ^ "Anthony v. Yahoo! Inc., 421 F.Supp.2d 1257 (N.D.Cal. 2006)". November 5, 2012. Archived from the original on October 16, 2020. Retrieved January 20, 2021.
  11. ^ Anthony v. Yahoo! Inc., 421 F.Supp.2d 1257 (N.D.Cal. 2006)
  12. ^ Barnes v. Yahoo! Inc., 570 F.3d 1096 (9th Cir. 2009).
  13. ^ "Barnes v. Yahoo!, 570 F.3d 1096 (9th Cir. 2009)". November 5, 2012. Archived from the original on November 10, 2020. Retrieved January 20, 2021.
  14. ^ Perfect 10, Inc. v. CCBill, LLC, 481 F.3d 751 (9th Cir. March 29, 2007, amended May 31, 2007).
  15. ^ Cf. Carafano v. Metrosplash.com, Inc., 339 F.3d 1119 (9th Cir. August 13, 2003). (dismissing, inter alia, right of publicity claim under Section 230 without discussion). But see Doe v. Friendfinder Network, Inc., 540 F.Supp.2d 288 (D.N.H. 2008). (230 does not immunize against state IP claims, including right of publicity claims).
  16. ^ Doe v. Friendfinder Network, Inc., 540 F.Supp.2d 288 (D.N.H. 2008).
  17. ^ "Explainer: How Letting Platforms Decide What Content To Facilitate Is What Makes Section 230 Work". Above the Law. June 21, 2019. Archived from the original on March 6, 2021. Retrieved July 2, 2019.
  18. ^ Smith v. California, 361 U.S. 147 (1959).
  19. ^ "Section 230 as First Amendment Rule". Harvard Law Review. 131: 2027. May 10, 2018. Archived from the original on January 9, 2021. Retrieved June 21, 2019.
  20. ^ a b c Robertson, Adi (June 21, 2019). "Why The Internet's Most Important Law Exists And How People Are Still Getting It Wrong". The Verge. Archived from the original on February 26, 2021. Retrieved June 21, 2019.
  21. ^ Stratton Oakmont, Inc. v. Prodigy Services Co., 31063/94, 1995 WL 323710, 1995 N.Y. Misc. LEXIS 712 Archived April 17, 2009, at the Wayback Machine (N.Y. Sup. Ct. 1995).
  22. ^ Fung, Brian; Sneed, Tierney (February 21, 2023). "Takeaways from the Supreme Court's hearing in blockbuster internet speech case". CNN. Retrieved February 21, 2023.
  23. ^ Reynolds, Matt (March 24, 2019). "The strange story of Section 230, the obscure law that created our flawed, broken internet". Wired UK. Archived from the original on January 5, 2021. Retrieved August 12, 2019.
  24. ^ a b c d e Gillette, Felix (August 7, 2019). "Section 230 Was Supposed to Make the Internet a Better Place. It Failed". Bloomberg L.P. Archived from the original on November 7, 2020. Retrieved August 12, 2019.
  25. ^ Pub. L. 104–104 (text) (PDF)
  26. ^ Reno v. ACLU, 521 U.S. 844, 885 (United States Supreme Court 1997).
  27. ^ Dippon, Christian (2017). Economic Value of Internet Intermediaries and the Role of Liability Protections (PDF) (Report). NERA Economic Consulting. Archived (PDF) from the original on January 15, 2021. Retrieved May 30, 2020 – via Internet Association.
  28. ^ a b c Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997)
  29. ^ 129 F.3d at 330.
  30. ^ Stroud, Matt (August 19, 2014). "These six lawsuits shaped the internet". The Verge. Archived from the original on February 8, 2021. Retrieved July 2, 2019.
  31. ^ Tushnet, Rebecca (August 28, 2008). "Power Without Responsibility: Intermediaries and the First Amendment". The Georgetown Law Journal. 76: 101–131. Archived from the original on February 25, 2021. Retrieved July 1, 2019.
  32. ^ a b c d e f g h Kosseff, Jeff (2017). "The Gradual Erosion of the Law That Shaped the Internet: Section 230's Evolution Over Two Decades". Columbia Science and Technology Law Review. 18 (1). SSRN 3225774.
  33. ^ http://www.ca9.uscourts.gov/datastore/opinions/2008/04/02/0456916.pdf Archived February 21, 2012, at the Wayback Machine, Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008).
  34. ^ Defterderian, Varty (2009). ""Fair Housing Council v. Roommates.com": A New Path for Section 230 Immunity". Berkeley Technology Law Journal. 24 (1): 563–592. JSTOR 24121369.
  35. ^ Goldman, Eric (April 3, 2008). "Roommates.com Denied 230 Immunity by Ninth Circuit En Banc (With My Comments)". ericgoldman.org. Archived from the original on March 6, 2021. Retrieved May 29, 2020.
  36. ^ a b Biederman, Christine (June 18, 2019). "Inside Backpage.com's Vicious Battle With the Feds". Wired. Archived from the original on June 18, 2019. Retrieved July 1, 2019.
  37. ^ Ars Staff (December 23, 2017). "How do you change the most important law in Internet history? Carefully". Ars Technica. Archived from the original on February 21, 2021. Retrieved December 26, 2017.
  38. ^ Chung, Andrew (January 9, 2017). "U.S. Supreme Court will not examine tech industry legal shield". Reuters. Archived from the original on October 22, 2020. Retrieved July 1, 2019.
  39. ^ a b Romero, Aja (July 2, 2018). "A new law intended to curb sex trafficking threatens the future of the internet as we know it". Vox. Archived from the original on February 28, 2021. Retrieved July 2, 2019.
  40. ^ Jackman, Tom (August 1, 2017). "Senate launches bill to remove immunity for websites hosting illegal content, spurred by Backpage.com". Washington Post. ISSN 0190-8286. Archived from the original on December 21, 2020. Retrieved December 26, 2017.
  41. ^ Ann, Wagner (March 21, 2018). "H.R.1865 – 115th Congress (2017–2018): Allow States and Victims to Fight Online Sex Trafficking Act of 2017". www.congress.gov. Archived from the original on April 8, 2018. Retrieved April 9, 2018.
  42. ^ Dias, Elizabeth (April 11, 2018). "Trump Signs Bill Amid Momentum to Crack Down on Trafficking". The New York Times. Archived from the original on April 12, 2018. Retrieved April 11, 2018.
  43. ^ Magid, Larry (April 6, 2018). "DOJ Seizes Backpage.com Weeks After Congress Passes Sex Trafficking Law". Forbes. Archived from the original on April 8, 2018. Retrieved April 8, 2018.
  44. ^ "ACLU letter opposing SESTA". American Civil Liberties Union. Archived from the original on March 24, 2018. Retrieved March 25, 2018.
  45. ^ "SWOP-USA stands in opposition of disguised internet censorship bill SESTA, S. 1963". Sex Workers Outreach Project. Archived from the original on October 24, 2017. Retrieved October 23, 2017.
  46. ^ "Wikipedia warns that SESTA will strip away protections vital to its existence". The Verge. Archived from the original on March 9, 2018. Retrieved March 8, 2018.
  47. ^ "Sex trafficking bill is turning into a proxy war over Google". The Verge. Archived from the original on September 21, 2017. Retrieved September 20, 2017.
  48. ^ Quinn, Melissa. "Tech community fighting online sex trafficking bill over fears it will stifle innovation". Washington Examiner. Archived from the original on September 19, 2017. Retrieved September 20, 2017.
  49. ^ "How a New Senate Bill Will Screw Over Sex Workers". Rolling Stone. Archived from the original on March 24, 2018. Retrieved March 25, 2018.
  50. ^ Zimmerman, Amy (April 4, 2018). "Sex Workers Fear for Their Future: How SESTA Is Putting Many Prostitutes in Peril". The Daily Beast. Archived from the original on April 7, 2018. Retrieved April 7, 2018.
  51. ^ Zhou, Li; Scola, Nancy; Gold, Ashley (November 1, 2017). "Senators to Facebook, Google, Twitter: Wake up to Russian threat". Politico. Archived from the original on January 14, 2021. Retrieved March 12, 2019.
  52. ^ Thomas, Clarence. "Statement of JUSTICE THOMAS respecting the denial of certiorari" (PDF). SupremeCourt.gov.
  53. ^ "Clarence Thomas Suggests Section 230 Immunities Applied Too Broadly to Tech Companies". news.yahoo.com. October 13, 2020. Retrieved November 11, 2022.
  54. ^ York, T. J. (November 22, 2021). "Experts Warn Against Total Repeal of Section 230". Archived from the original on November 28, 2021. Retrieved November 28, 2021.
  55. ^ Ghosemajumder, Shuman (January 6, 2021). "Fixing Section 230–not ending it—would be better for everyone". Fast Company. Archived from the original on November 28, 2021. Retrieved November 28, 2021.
  56. ^ Smith, Michael D.; Alstyne, Marshall Van (August 12, 2021). "It's Time to Update Section 230". Harvard Business Review. ISSN 0017-8012. Archived from the original on November 28, 2021. Retrieved November 28, 2021.
  57. ^ "Ex-Washington Post editor: Big Tech does 'a lot of harm' but has 'advantages'". finance.yahoo.com. August 6, 2021. Archived from the original on November 28, 2021. Retrieved November 28, 2021.
  58. ^ Harmon, Elliot (April 12, 2018). "No, Section 230 Does Not Require Platforms to Be "Neutral"". Electronic Frontier Foundation. Archived from the original on February 17, 2021. Retrieved July 17, 2019.
  59. ^ Robertson, Adi (June 21, 2019). "Why the internet's most important law exists and how people are still getting it wrong". The Verge. Archived from the original on February 26, 2021. Retrieved July 17, 2019.
  60. ^ Lecher, Colin (June 20, 2019). "Both parties are mad about a proposal for federal anti-bias certification". The Verge. Archived from the original on February 24, 2021. Retrieved July 17, 2019.
  61. ^ a b Masnick, Mike (April 13, 2018). "Ted Cruz Demands A Return Of The Fairness Doctrine, Which He Has Mocked In The Past, Due To Misunderstanding CDA 230". Techdirt. Archived from the original on December 7, 2020. Retrieved July 17, 2019.
  62. ^ Vas, Nicole (March 19, 2019). "GOP steps up attack over tech bias claims". The Hill. Archived from the original on February 25, 2021. Retrieved July 17, 2019.
  63. ^ a b Eggerton, John. "Sen. Hawley: Big Tech's Sec. 230 Sweetheart Deal Must End". Multichannel. Archived from the original on August 21, 2020. Retrieved July 17, 2019.
  64. ^ MacMillan, John D. McKinnon and Douglas (December 11, 2018). "Google CEO Sundar Pichai Faces Lawmakers Skeptical Over Privacy, Alleged Anti-Conservative Bias". The Wall Street Journal. Archived from the original on November 8, 2020. Retrieved July 17, 2019.
  65. ^ Evans, Glenn. "Gohmert bill targets filtering of conservative messages by social media platforms". Longview News-Journal. Archived from the original on December 14, 2020. Retrieved July 17, 2019.
  66. ^ Eggerton, John. "Hawley Bill Takes Big Bite Out of Big Tech's Sec. 230 Shield". Multichannel. Archived from the original on December 7, 2020. Retrieved July 17, 2019.
  67. ^ Wellons, Mary Catherine (June 18, 2019). "GOP senator introduces a bill that would blow up business models for Facebook, YouTube and other tech giants". CNBC. Archived from the original on January 27, 2021. Retrieved June 21, 2019.
  68. ^ Echelon Insights. "A Plurality Supports Regulation of Tech Companies for Bias" (PDF). Archived (PDF) from the original on November 24, 2020. Retrieved July 31, 2019.
  69. ^ Lecher, Colin (June 21, 2019). "Both parties are mad about a proposal for federal anti-bias certification". The Verge. Archived from the original on February 24, 2021. Retrieved June 21, 2019.
  70. ^ Carbone, Christopher (April 15, 2019). "Pelosi heralds 'new era' of Big Tech regulation, says 230 protections could be removed". Fox News. Archived from the original on November 8, 2020. Retrieved August 2, 2019.
  71. ^ Lima, Cristiano (July 9, 2019). "How a widening political rift over online liability is splitting Washington". Politico. Archived from the original on February 25, 2021. Retrieved August 7, 2019.
  72. ^ Stewart, Emily (May 16, 2019). "Ron Wyden wrote the law that built the internet. He still stands by it – and everything it's brought with it". Recode. Archived from the original on February 9, 2021. Retrieved August 14, 2019.
  73. ^ Ingram, Matthew (August 8, 2019). "The myth of social media anti-conservative bias refuses to die". Columbia Journalism Review. Archived from the original on March 4, 2021. Retrieved August 14, 2019.
  74. ^ "Expect More Conservative Purges on Social Media If Republicans Target Section 230". Reason.com. November 28, 2018. Archived from the original on February 2, 2021. Retrieved July 17, 2019.
  75. ^ Larson, Erik (May 27, 2020). "Twitter, Facebook Win Appeal in Anticonservative-Bias Suit". Bloomberg News. Archived from the original on October 3, 2020. Retrieved May 27, 2020.
  76. ^ a b c Wakabayashi, Daisuke (August 6, 2019). "Legal Shield for Websites Rattles Under Onslaught of Hate Speech". The New York Times. Archived from the original on February 8, 2021. Retrieved August 12, 2019.
  77. ^ Prager, Dennis (August 6, 2019). "Don't Let Google Get Away With Censorship". The Wall Street Journal. Archived from the original on January 15, 2021. Retrieved August 12, 2019.
  78. ^ Masnick, Mike (August 6, 2019). "NY Times Joins Lots Of Other Media Sites In Totally And Completely Misrepresenting Section 230". Techdirt. Archived from the original on February 20, 2021. Retrieved August 12, 2019.
  79. ^ Brown, Elizabeth Nolan (August 7, 2019). "Free Speech on the Internet Continues to Confuse Everyone". Reason. Archived from the original on November 7, 2020. Retrieved August 12, 2019.
  80. ^ Post, David (August 9, 2019). "The Sec. 230 Temperature is Rising". Reason. Archived from the original on August 7, 2020. Retrieved August 9, 2019.
  81. ^ Kelly, Makena (August 16, 2019). "Beto O'Rourke seeks new limits on Section 230 as part of gun violence proposal". The Verge. Archived from the original on February 7, 2021. Retrieved August 16, 2019.
  82. ^ Kelly, Makena (January 17, 2020). "Joe Biden wants to revoke Section 230". The Verge. Archived from the original on March 1, 2021. Retrieved January 17, 2020.
  83. ^ "Joe Biden Says Age Is Just a Number". The New York Times. January 17, 2020. ISSN 0362-4331. Archived from the original on March 2, 2021. Retrieved January 17, 2020.
  84. ^ Citron, Danielle; Wittes, Benjamin (August 2, 2018). "The Problem Isn't Just Backpage: Revising Section 230 Immunity". Georgetown Law Technology Review. 453. SSRN 3218521.
  85. ^ a b Neuburger, Jeffrey (August 9, 2019). "Facebook Shielded by CDA Immunity against Federal Claims for Allowing Use of Its Platform by Terrorists". National Law Review. Archived from the original on January 7, 2021. Retrieved August 14, 2019.
  86. ^ Williams, Pete (March 7, 2022). "Supreme Court's Thomas calls for new look at giving Facebook broad immunity". NBC News.
  87. ^ "Supreme Court to hear challenge to law that shields internet companies from lawsuits". USA Today.
  88. ^ a b Robertson, Adi (October 3, 2022). "The Supreme Court will determine whether you can sue platforms for hosting terrorists". The Verge. Retrieved October 4, 2022.
  89. ^ a b Barnes, Robert; Zakrzewski, Cat (May 18, 2023). "Supreme Court rules for Google, Twitter on terror-related content". Washington Post. Retrieved September 1, 2024.
  90. ^ Weale, Sally (February 5, 2024). "Social media algorithms 'amplifying misogynistic content'". The Guardian. Retrieved September 1, 2024.
  91. ^ Root, Jay; Ashford, Grace (March 29, 2024). "Inside a High-Stakes Fight to Limit Social Media's Hold on Children". The New York Times. Retrieved September 1, 2024.
  92. ^ Lukpat, Alyssa (August 28, 2024). "Appeals Court Raises Questions Over Section 230 Law Giving Social-Media Companies Legal Immunity". Wall Street Journal. Retrieved September 1, 2024.
  93. ^ Ingram, Mathew (May 16, 2024). "A professor is suing Facebook over its recommendation algorithms". Columbia Journalism Review. Retrieved September 1, 2024.
  94. ^ a b Feiner, Lauren (February 19, 2020). "AG Barr takes aim at a key legal protection for Big Tech companies". CNBC. Archived from the original on March 1, 2021. Retrieved February 20, 2020.
  95. ^ Robertson, Adi (February 19, 2020). "Five Lessons From The Justice Department's Big Debate Over Section 230". The Verge. Archived from the original on February 10, 2021. Retrieved February 20, 2020.
  96. ^ "Department of Justice's Review of Section 230 of The Communications Decency Act of 1996". United States Department of Justice. June 17, 2020. Archived from the original on January 16, 2021. Retrieved June 17, 2020.
  97. ^ Kendall, Brent; McKinnon, John D. (June 17, 2020). "Justice Department Proposes Limiting Internet Companies' Protections". The Wall Street Journal. Archived from the original on January 28, 2021. Retrieved June 17, 2020.
  98. ^ a b c Robertson, Adi (March 5, 2020). "Congress proposes anti-child abuse rules to punish web platforms — and raises fears about encryption". The Verge. Archived from the original on January 11, 2021. Retrieved March 5, 2020.
  99. ^ Clark, John F. (March 5, 2020). "EARN IT Act 2020". National Center for Missing & Exploited Children. Archived from the original on January 23, 2021. Retrieved October 6, 2020.
  100. ^ "Statement – National Center on Sexual Exploitation Supports EARN IT Act". National Center on Sexual Exploitation. March 5, 2020. Archived from the original on March 6, 2021. Retrieved October 6, 2020.
  101. ^ Gross, Grant (March 13, 2020). "Child exploitation bill earns strong opposition from encryption advocates". Washington Examiner. Archived from the original on October 8, 2020. Retrieved October 6, 2020.
  102. ^ "Coalition letter opposing EARN IT 3-6-20" (PDF). March 6, 2020. Archived (PDF) from the original on November 24, 2020. Retrieved October 6, 2020.
  103. ^ "Internet freedom activists: Congress must reject hotly contested EARN IT Act". The Daily Dot. March 6, 2020. Archived from the original on January 15, 2021. Retrieved October 6, 2020.
  104. ^ Harmon, Elliot (January 31, 2020). "Congress Must Stop the Graham-Blumenthal Anti-Security Bill". Electronic Frontier Foundation. Archived from the original on February 25, 2021. Retrieved October 6, 2020.
  105. ^ Fisher, Christine. "EARN IT Act amendments transfer the fight over Section 230 to the states". Engadget. Archived from the original on January 26, 2021. Retrieved October 6, 2020.
  106. ^ Newman, Ronald; Ruane, Kate; Guliani, Neema Singh; Thompson, Ian (July 1, 2020). "ACLU Letter of Opposition to EARN IT Act Manager's Amendment". American Civil Liberties Union. Archived from the original on October 31, 2020. Retrieved October 6, 2020.
  107. ^ Kurnick, Chelsea (September 15, 2020). "Censorship Disguised". East Bay Express. Archived from the original on January 26, 2021. Retrieved October 6, 2020.
  108. ^ "US: Senate Should Reject EARN IT Act". Human Rights Watch. June 1, 2020. Archived from the original on November 16, 2020. Retrieved October 6, 2020.
  109. ^ Ng, Alfred (March 10, 2020). "Why your privacy could be threatened by a bill to protect children". CNet. Archived from the original on February 27, 2021. Retrieved March 10, 2020.
  110. ^ Feiner, Lauren (March 11, 2020). "Senators dispute industry claims that a bill targeting tech's legal shield would prohibit encryption". CNBC. Archived from the original on January 3, 2021. Retrieved April 2, 2020.
  111. ^ Romm, Tony (March 3, 2020). "Congress, Justice Department take aim at tech, hoping to halt spread of child sexual exploitation online". The Washington Post. Archived from the original on January 14, 2021. Retrieved March 3, 2020.
  112. ^ "Graham, Blumenthal, Hawley, Feinstein Introduce EARN IT Act to Encourage Tech Industry to Take Online Child Sexual Exploitation Seriously" (Press release). United States Senate Committee on the Judiciary. March 5, 2020. Archived from the original on January 17, 2021. Retrieved March 10, 2020.
  113. ^ Keller, Michael (May 5, 2020). "A $5 Billion Proposal to Fight Online Child Sexual Abuse". The New York Times. Archived from the original on February 5, 2021. Retrieved May 28, 2020.
  114. ^ Fisher, Christine (July 2, 2020). "EARN IT Act amendments transfer the fight over Section 230 to the states". Engadget. Archived from the original on January 26, 2021. Retrieved September 16, 2020.
  115. ^ Mullin, Joe (October 2, 2020). "Urgent: EARN IT Act Introduced in House of Representatives". EFF. Archived from the original on February 23, 2021. Retrieved October 2, 2020.
  116. ^ Hoonhout, Tobias (June 9, 2020). "GOP Senators Ask FCC to 'Clearly Define' Section 230 Protections for Big Tech". National Review. Archived from the original on August 7, 2020. Retrieved June 14, 2020.
  117. ^ Brandom, Russell (June 17, 2020). "Senate Republicans want to make it easier to sue tech companies for bias". The Verge. Archived from the original on February 20, 2021. Retrieved June 17, 2020.
  118. ^ Kelly, Makena (June 24, 2020). "The PACT Act would force platforms to disclose shadowbans and demonetizations". The Verge. Archived from the original on February 20, 2021. Retrieved June 24, 2020.
  119. ^ Robertson, Adi (July 28, 2020). "Sen. Josh Hawley wants to strip legal protections from sites with targeted ads". The Verge. Archived from the original on January 25, 2021. Retrieved July 28, 2020.
  120. ^ Kelly, Makena (September 8, 2020). "Republicans pressure platforms with new 230 bill". The Verge. Archived from the original on February 28, 2021. Retrieved September 8, 2020.
  121. ^ Birnbaum, Emily; Lapowsky, Issie (February 5, 2021). "This is the Democrats' plan to limit Section 230". Protocol. Archived from the original on February 9, 2021. Retrieved February 5, 2021.
  122. ^ Romm, Tony (July 11, 2019). "Trump accuses social media companies of 'terrible bias' at White House summit decried by critics". The Washington Post. Archived from the original on December 21, 2020. Retrieved May 28, 2020.
  123. ^ Roth, Yoel; Pickles, Nick (May 11, 2020). "Updating our Approach to Misleading Information". Twitter. Archived from the original on February 28, 2021. Retrieved May 28, 2020.
  124. ^ Lybrand, Holmes; Subramaniam, Tara (May 27, 2020). "Fact-checking Trump's recent claims that mail-in voting is rife with fraud". CNN. Archived from the original on February 28, 2021. Retrieved May 28, 2020.
  125. ^ Klar, Rebecca (May 28, 2020). "Dorsey defends decision to fact-check Trump tweet: 'More transparency from us is critical'". The Hill. Archived from the original on February 28, 2021. Retrieved May 28, 2020.
  126. ^ Fung, Brian (May 27, 2020). "Trump threatens to crack down on social media platforms after Twitter labels his tweets". CNN. Archived from the original on January 21, 2021. Retrieved May 28, 2020.
  127. ^ "Executive Order on Preventing Online Censorship". whitehouse.gov. May 28, 2020. Archived from the original on January 29, 2021. Retrieved May 28, 2020 – via National Archives.
  128. ^ a b Fung, Brian; Nobles, Ryan; Liptak, Kevin (May 28, 2020). "Trump is set to announce an executive order against social media companies". CNN. Archived from the original on February 13, 2021. Retrieved May 28, 2020.
  129. ^ a b Dean, Sam (May 28, 2020). "The facts about Section 230, the internet speech law Trump wants to change". Los Angeles Times. Archived from the original on March 4, 2021. Retrieved May 29, 2020.
  130. ^ a b Allyn, Bobby (May 28, 2020). "Stung By Twitter, Trump Signs Executive Order To Weaken Social Media Companies". NPR. Archived from the original on June 28, 2020. Retrieved May 28, 2020.
  131. ^ Baker, Peter; Wakabayashi, Daisuke (May 28, 2020). "Trump's Order on Social Media Could Harm One Person in Particular: Donald Trump". The New York Times. Archived from the original on May 28, 2020. Retrieved May 28, 2020.
  132. ^ Sullivan, Mark (May 28, 2020). "Trump's new executive order is a 'mugging of the First Amendment,' says Sen. Wyden". Fast Company. Archived from the original on November 28, 2020. Retrieved May 28, 2020.
  133. ^ Torres, Ella; Mansell, William (May 29, 2020). "Minnesota protest updates: Trump warns military could 'assume control' of protest response". ABC News. Archived from the original on May 29, 2020. Retrieved May 29, 2020.
  134. ^ Madani, Doha (May 29, 2020). "Trump warns 'when looting starts, shooting starts' as fires burn in Minneapolis". NBC News. Archived from the original on May 29, 2020. Retrieved May 29, 2020.
  135. ^ Spangler, Todd (May 29, 2020). "Twitter Adds Warning Label to Donald Trump's Tweet About 'Shooting' Protesters in Minneapolis, Saying It Glorifies Violence". Variety. Archived from the original on March 2, 2021. Retrieved May 29, 2020.
  136. ^ Chalfant, Morgan (May 29, 2020). "Trump accuses Twitter of unfair targeting after company labels tweet 'glorifying violence'". The Hill. Archived from the original on May 29, 2020. Retrieved May 29, 2020.
  137. ^ Spangler, Todd (June 2, 2020). "Lawsuit Alleges Donald Trump's Executive Order Targeting Twitter, Facebook Violates First Amendment". Variety. Archived from the original on March 2, 2021. Retrieved June 2, 2020.
  138. ^ Moon, Mariella (July 27, 2020). "Trump administration petitions FCC to reinterpret Section 230 rules". Engadget. Archived from the original on January 26, 2021. Retrieved July 27, 2020.
  139. ^ Petition For Rulemaking Of The National Telecommunications and Information Administration (PDF) (Report). National Telecommunications And Information Administration. July 27, 2020. Archived (PDF) from the original on February 9, 2021. Retrieved July 27, 2020.
  140. ^ Feiner, Lauren (October 16, 2020). "FCC Chairman says he will move to 'clarify' Section 230, threatening tech's legal shield". CNBC. Archived from the original on February 24, 2021. Retrieved October 16, 2020.
  141. ^ Coldewey, Devin (October 15, 2020). "With 'absurd' timing, FCC announces intention to revisit Section 230". TechCrunch. Archived from the original on February 2, 2021. Retrieved October 16, 2020.
  142. ^ Guynn, Jessica (October 15, 2020). "Trump vs. Big Tech: Everything you need to know about Section 230 and why everyone hates it". USA Today. Archived from the original on November 17, 2020. Retrieved October 16, 2020.
  143. ^ Kelly, Makena (August 27, 2020). "President Trump's social media order will endanger voting rights, new lawsuit claims". The Verge. Archived from the original on December 2, 2020. Retrieved August 27, 2020.
  144. ^ Lyons, Kim (May 15, 2021). "Biden revokes Trump executive order that targeted Section 230". The Verge. Archived from the original on May 15, 2021. Retrieved May 15, 2021.
  145. ^ Spangler, Todd (December 2, 2020). "Trump Claims He'll Veto Defense Spending Bill Unless Congress Repeals Legal Shield for Social Media Companies". Variety. Archived from the original on March 2, 2021. Retrieved December 2, 2020.
  146. ^ Demirjian, Karoun (December 23, 2020). "Trump vetoes defense bill, teeing up holiday override votes in Congress". The Washington Post. Archived from the original on February 12, 2021. Retrieved December 23, 2020.
  147. ^ Foran, Clare; Barrett, Ted; Zaslav, Ali (January 1, 2021). "Senate votes to override Trump's veto on defense bill". CNN. Archived from the original on March 3, 2021. Retrieved January 1, 2021.
  148. ^ Kelly, Makena (December 29, 2020). "Section 230 has become a bargaining chip in ongoing stimulus talks". The Verge. Archived from the original on February 2, 2021. Retrieved December 29, 2020.
  149. ^ Birnbaum, Emily (January 7, 2021). "Ajit Pai is distancing himself from President Trump". Protocol. Archived from the original on March 1, 2021. Retrieved January 8, 2021.
  150. ^ Romm, Tony (January 8, 2021). "Facebook, Twitter could face punishing regulation for their role in U.S. Capitol riot, Democrats say". The Washington Post. Archived from the original on February 3, 2021. Retrieved January 10, 2021.
  151. ^ Lima, Cristiano; Hendel, John (January 8, 2021). "'This is going to come back and bite 'em': Capitol breach inflames Democrats' ire at Silicon Valley". Politico. Archived from the original on February 18, 2021. Retrieved January 10, 2021.
  152. ^ Robertson, Adi (July 7, 2021). "Donald Trump files sweeping, nonsensical lawsuits against Facebook, Twitter, and Google". The Verge. Archived from the original on July 7, 2021. Retrieved July 7, 2021.
  153. ^ Morgan, Dan (May 6, 2022). "Judge dismisses Trump lawsuit seeking to lift Twitter ban". CNBC. Retrieved May 14, 2022.
  154. ^ Brandom, Russell (March 24, 2021). "Mark Zuckerberg proposes limited 230 reforms ahead of congressional hearing". The Verge. Archived from the original on March 24, 2021. Retrieved March 24, 2021.
  155. ^ Kelly, Makena (July 22, 2021). "Senators target Section 230 to fight COVID-19 vaccine misinformation". The Verge. Archived from the original on July 22, 2021. Retrieved July 22, 2021.
  156. ^ Robertson, Adi (October 14, 2021). "Lawmakers want to strip legal protections from the Facebook News Feed". The Verge. Archived from the original on October 14, 2021. Retrieved October 14, 2021.
  157. ^ a b McCabe, David (March 24, 2021). "How a Stabbing in Israel Echoes Through the Fight Over Online Speech". The New York Times. ISSN 0362-4331. Retrieved October 4, 2022.
  158. ^ Kelly, Makena (May 24, 2021). "Florida governor signs law to block 'deplatforming' of Florida politicians". The Verge. Archived from the original on May 24, 2021. Retrieved May 24, 2021.
  159. ^ Robertson, Adi (May 27, 2021). "Industry groups sue to stop Florida's new social media law". The Verge. Archived from the original on May 27, 2021. Retrieved May 28, 2021.
  160. ^ Robertson, Adi (June 30, 2021). "Judge blocks Florida's social media law". The Verge. Archived from the original on July 1, 2021. Retrieved June 30, 2021.
  161. ^ Robertson, Adi (September 9, 2021). "Texas passes law that bans kicking people off social media based on 'viewpoint'". The Verge. Archived from the original on September 9, 2021. Retrieved September 9, 2021.
  162. ^ Robertson, Adi; Brandom, Russell (December 1, 2021). "Federal court blocks Texas law banning 'viewpoint discrimination' on social media". The Verge. Archived from the original on December 2, 2021. Retrieved December 1, 2021.
  163. ^ Robertson, Adi (May 11, 2022). "Court lets Texas restrictions on social platform content moderation take effect". The Verge. Retrieved May 11, 2022.
  164. ^ Jeong, Sarah (May 13, 2022). "Tech industry appeals the bad Texas social media law to the Supreme Court". The Verge. Retrieved May 14, 2022.
  165. ^ Menegus, Bryan (May 31, 2022). "Texas's bizarre social media law suspended by Supreme Court". Engadget. Retrieved May 31, 2022.
  166. ^ a b Zakrzewski, Cat (September 16, 2022). "Appeals court upholds Texas law regulating social media moderation". Washington Post. Retrieved September 16, 2022.
  167. ^ "Federal appeals court pauses Texas social media law's enforcement amid looming Supreme Court petition".
  168. ^ Sherman, Mark (September 29, 2023). "The Supreme Court will decide if state laws limiting social media platforms violate the Constitution". Associated Press. Retrieved September 29, 2023.
  169. ^ Zeran v. AOL Archived October 31, 2008, at the Wayback Machine, 129 F.3d 327 (4th Cir. 1997).
  170. ^ Blumenthal v. Drudge Archived May 11, 2008, at the Wayback Machine, 992 F. Supp. 44, 49–53 (D.D.C. 1998).
  171. ^ Carafano v. Metrosplash.com Archived February 18, 2006, at the Wayback Machine, 339 F.3d 1119 (9th Cir. 2003).
  172. ^ Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003).
  173. ^ Green v. AOL Archived May 12, 2008, at the Wayback Machine, 318 F.3d 465 (3rd Cir. 2003).
  174. ^ Barrett v. Rosenthal Archived May 15, 2011, at the Wayback Machine, 40 Cal. 4th 33 (2006).
  175. ^ MCW, Inc. v. badbusinessbureau.com, L.L.C. 2004 WL 833595, No. Civ.A.3:02-CV-2727-G, (N.D. Tex. April 19, 2004).
  176. ^ Hy Cite Corp. v. badbusinessbureau.com, 418 F. Supp. 2d 1142 (D. Ariz. 2005).
  177. ^ Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 830 (2002).
  178. ^ Ben Ezra, Weinstein & Co. v. America Online Archived July 24, 2008, at the Wayback Machine, 206 F.3d 980 (10th Cir. 2000).
  179. ^ Goddard v. Google, Inc. Archived February 25, 2021, at the Wayback Machine, C 08-2738 JF (PVT), 2008 WL 5245490, 2008 U.S. Dist. LEXIS 101890 (N.D. Cal. December 17, 2008).
  180. ^ Milgram v. Orbitz Worldwide, LLC Archived August 7, 2020, at the Wayback Machine, ESX-C-142-09 (N.J. Super. Ct. August 26, 2010).
  181. ^ Kingkade, Tyler (January 10, 2019). "A Man Sent 1,000 Men Expecting Sex And Drugs To His Ex-Boyfriend Using Grindr, A Lawsuit Says". Buzzfeed News. Archived from the original on February 26, 2021. Retrieved August 14, 2019.
  182. ^ Williams, Jamie (April 8, 2019). "Victory! Second Circuit Affirms Dismissal of Latest Threat to Section 230". Electronic Frontier Foundation. EFF. Archived from the original on November 8, 2020. Retrieved August 14, 2019.
  183. ^ Doe v. America Online Archived May 23, 2009, at the Wayback Machine, 783 So. 2d 1010 (Fla. 2001).
  184. ^ Kathleen R. v. City of Livermore Archived March 4, 2021, at the Wayback Machine, 87 Cal. App. 4th 684 (2001).
  185. ^ Doe v. MySpace Archived February 24, 2021, at the Wayback Machine, 528 F.3d 413 (5th Cir. 2008).
  186. ^ Dart v. Craigslist Archived February 26, 2021, at the Wayback Machine, 665 F. Supp. 2d 961 (N.D. Ill. October 20, 2009).
  187. ^ Backpage.com, LLC and The Internet Archive v. Rob McKenna, Attorney General of Washington, et al. (W.D. Wash. July 30, 2012).
  188. ^ a b "Backpage.com et al v. McKenna et al, No. 2:2012cv00954 - Document 69 (W.D. Wash. 2012)". Justia Law. Retrieved September 1, 2024.
  189. ^ McCann, Nick (December 10, 2012). "Washington Drops Online Sex Traffic Law". Courthouse News Service. Retrieved September 1, 2024.
  190. ^ a b Backpage.com, LLC v. Robert E. Cooper, Jr., et al., Case 3:12-cv-00654, Document 88 (M.D. Tenn. May 22, 2014).
  191. ^ Backpage.com, LLC v. John Jay Hoffman, Acting Attorney General of the State of New Jersey, et al.; The Internet Archive v. Hoffman (D.N.J. June 28, 2013).
  192. ^ a b "Backpage.com v. McKenna, et al". Digital Media Law Project. August 2, 2012. Archived from the original on February 27, 2021. Retrieved May 18, 2014.
  193. ^ 62nd Legislature 2012 Regular Session. "Certification of Enrollment: Engrossed Substitute Senate Bill 6251" (PDF). Washington State Legislature. Archived (PDF) from the original on January 12, 2013. Retrieved May 18, 2014.
  194. ^ "Judgment in a Civil Case: Backpage.com, LLC and The Internet Archive v. Rob McKenna, Attorney General of the State of Washington, et al" (PDF). United States District Court for the Western District of Washington. December 10, 2012. Case Number C12-954RSM, Document 87. Archived (PDF) from the original on March 6, 2021. Retrieved May 18, 2014.
  195. ^ Nissenbaum, Gary (May 29, 2014). "Are Internet Publishers Responsible for Advertisements for Potential Sexual Liaisons with Minors?". Nissenbaum Law Group, LLC. Archived from the original on March 12, 2016. Retrieved January 21, 2016.
  196. ^ Backpage.com, LLC v. John Jay Hoffman, Acting Attorney General of the State of New Jersey, et al.; The Internet Archive, Plaintiff-Intervenor, v. Hoffman, Civil Action No. 2:13-03952 (CCC-JBC) (D.N.J. May 14, 2014).
  197. ^ Backpage.com, LLC v. John Jay Hoffman, Acting Attorney General of the State of New Jersey, et al.; The Internet Archive, Plaintiff-Intervenor, v. Hoffman (D.N.J. May 13, 2014).
  198. ^ Backpage.com, LLC v. Thomas J. Dart, Sheriff of Cook County, Illinois (7th Cir. November 30, 2015).
  199. ^ Stempel, Jonathan (November 30, 2015). "Backpage.com wins injunction vs Chicago sheriff over adult ads". Reuters. Archived from the original on February 5, 2021. Retrieved January 21, 2016.
  200. ^ Sneed, Tierney (July 21, 2015). "Backpage Sues Chicago Sheriff Over Pressure Campaign to Stop Sex Ads". Talking Points Memo. Archived from the original on November 8, 2020. Retrieved January 21, 2016.
  201. ^ Backpage.com, LLC v. Sheriff Thomas J. Dart, No. 15 C 06340 (N.D. Ill. July 24, 2015).
  202. ^ "Backpage.com, LLC v. Dart". Leagle. August 24, 2015. Archived from the original on December 7, 2020. Retrieved February 19, 2016.
  203. ^ Backpage.com, LLC v. Thomas J. Dart, Sheriff of Cook County, Illinois (7th Cir. November 30, 2015).
  204. ^ Weiberg, Fossa (June 1, 2018). "Backpage.com lawsuit against Cook County sheriff dismissed". Chicago Tribune. Archived from the original on September 23, 2020. Retrieved March 15, 2020.
  205. ^ Chicago Lawyers' Committee For Civil Rights Under Law, Inc. v. Craigslist, Inc. Archived May 22, 2008, at the Wayback Machine 519 F.3d 666 (7th Cir. 2008).
  206. ^ Fair Housing Council of San Fernando Valley v. Roommate.com, LLC Archived February 21, 2012, at the Wayback Machine, 521 F.3d 1157 (9th Cir. 2008) (en banc).
  207. ^ 42 U.S.C. § 3604(c) Archived February 9, 2012, at the Wayback Machine.
  208. ^ Cal. Gov. Code § 12955 Archived August 2, 2010, at the Wayback Machine.
  209. ^ Proctor, Katherine (May 31, 2016). "Raped Model's Suit Against Website Revived". Courthouse News Service. Archived from the original on October 7, 2016. Retrieved June 1, 2016.
  210. ^ Jane Doe No. 14 v. Internet Brands, Inc. Archived December 7, 2020, at the Wayback Machine, no. 12-56638 (9th Cir. May 31, 2016).
  211. ^ Raymond, Nate (August 28, 2024). "TikTok must face lawsuit over 10-year-old girl's death, US court rules". Reuters. Retrieved September 1, 2024.
  212. ^ a b "Anderson v. TikTok Inc, No. 22-3061 (3d Cir. 2024)". Justia Law. Retrieved September 1, 2024.
  213. ^ "ANDERSON v. TIKTOK, INC., 2:22-cv-01849 - CourtListener.com". CourtListener. Retrieved September 1, 2024.
  214. ^ Pierson, Brendan (October 27, 2022). "TikTok immune from lawsuit over girl's death from 'blackout challenge' -judge". Reuters. Retrieved September 1, 2024.
  215. ^ Robertson, Adi (May 18, 2020). "Supreme Court rejects lawsuit against Facebook for hosting terrorists". The Verge. Archived from the original on January 30, 2021. Retrieved May 18, 2020.
  216. ^ Liptak, Adam; McCabe, Dave (October 3, 2022). "Supreme Court Takes Up Challenge to Social Media Platforms' Shield". The New York Times. Retrieved October 3, 2022.
  217. ^ Fung, Brian (May 18, 2023). "Supreme Court shields Twitter from liability for terror-related content and leaves Section 230 untouched". CNN Politics. Retrieved May 18, 2023.
  218. ^ a b "EUR-Lex - 32000L0031 - EN". europa.eu. Archived from the original on February 20, 2021. Retrieved April 16, 2009.
  219. ^ "Proposal for a directive on copyright in the Digital Single Market" (PDF). May 25, 2018. p. 26. Archived (PDF) from the original on June 9, 2018. Retrieved March 12, 2019.
  220. ^ "Dow Jones & Company Inc. v Gutnick [2002] HCA 56 (10 December 2002)". kentlaw.edu. Archived from the original on December 4, 2020. Retrieved December 15, 2021.
  221. ^ "DEFAMATION ACT 2005". austlii.edu.au. Archived from the original on December 12, 2020. Retrieved March 12, 2009.
  222. ^ "The CompuServe Germany Case". Archived from the original on February 25, 2004. Retrieved November 23, 2003.
  223. ^ Kuner, Christopher. "Judgment of the Munich Court in the "CompuServe Case" (Somm Case)". Archived from the original on March 3, 2016. Retrieved March 12, 2009.
  224. ^ Sieber, Ulrich. "Commentary on the Conclusion of Proceedings in the "CompuServe Case"". Archived from the original on December 5, 2015. Retrieved March 12, 2009.
  225. ^ "World: Europe Ex-CompuServe boss acquitted". BBC News. November 17, 1999. Archived from the original on December 3, 2008. Retrieved March 12, 2009.
  226. ^ Kaufmann, Noogie C. (March 12, 2004). "BGH: Online-Auktionshäuser müssen Angebote von Plagiaten sperren" [BGH: Online auction houses must block listings of counterfeit goods]. heise online. Archived from the original on April 14, 2008. Retrieved March 12, 2009.
  227. ^ "heise online - IT-News, Nachrichten und Hintergründe". heise online. Archived from the original on October 22, 2008.

Further reading

  • Roberts, Jeff John (December 2019). "Tech's Legal Shield Tussle". Fortune (Paper). New York City. pp. 33–34. ISSN 0015-8259.