The Common Carrier Theory of Facebook
=== The Statute ===
Revision as of 09:32, 4 December 2021
* "Are Facebook and Google State Actors?", Jed Rubenfeld, Lawfare, November 4, 2019.
=== NetChoice v. Paxton ===
NetChoice LLC et al. v. Paxton, Judge Robert Pitman, Western District of Texas (Austin), Civil Docket for Case #: 1:21-cv-00840-RP.
<pre>
Sec. 143A.002. CENSORSHIP PROHIBITED. (a) A social media platform or interactive computer service may not censor a user, a user's expression, or a user's ability to receive the expression of another person based on: (1) the viewpoint of the user or another person; (2) the viewpoint represented in the user's expression or another person's expression; or (3) a user's geographic location in this state or any part of this state. (b) This section applies regardless of whether the viewpoint is expressed on a social media platform or interactive computer service or through any other medium.
</pre>
=== Facts ===
* "YouTube Cuts Off the Best Real-Time Legal Coverage of Rittenhouse Trial and Immediately Regrets It," PJMedia (Nov. 15, 2021):
: Right after Assistant District Attorney Thomas Binger began his closing statement in the Kyle Rittenhouse Trial, YouTube cut off channels that were beating legacy media channels. Coincidence? The Rekieta Law channel, which features multiple lawyers doing real-time analysis of the trial, often beat the number of people watching the PBS stream. The PBS stream is one of the more reliable ones available to YouTube users and was being used by several outlets. After getting cut off, Nick Rekieta reminded YouTube that ten lawyers considered it a breach of contract. ... Ticking off channels featuring dozens of lawyers seemed like a bad business plan. Within a few minutes, the stream was put back up after Rekieta reminded the tech giant that the courtroom coverage was public property and therefore not under copyright.
=== Statutes ===
* "2020 09 17 Cox Wyden FCC Reply Comments" (2020).
=== Commentary ===
* "Section 230’s Application to States’ Regulation of Social Media," Eric Goldman, Santa Clara Univ. Legal Studies Research Paper (Nov. 14, 2021).
* "The Scale of Content Moderation Is Unfathomable," Mike Masnick, TechDirt.com (Nov. 2, 2021).
=== Litigation Strategy ===
The idea is for someone to sue Facebook or Twitter for defamation. The plaintiff would be somebody, preferably not a public figure in any way, who was clearly defamed by someone posting on the defendant's website (let's use Twitter for concreteness). Twitter would move to dismiss on the grounds that it is immune by statute from liability for defamation by somebody posting on its website, and perhaps immune for some common-law reason, such as that it cannot be expected to police content any more than somebody polices the graffiti on the wall of their warehouse.

The response would be that Twitter has forfeited its immunity because it actually *does* police content, and not just for obscenity but for correctness of political content, so it has shown itself willing and able to do so. Thus, Twitter is actually a content provider with editorial control, like a magazine, not a neutral software provider. This is a question of fact, which must be decided by a jury.

The purpose would not be to win the particular lawsuit, but to establish in a court of law that Twitter is liable for defamatory content. If the case could get beyond a motion to dismiss, it would allow discovery of Twitter's internal documents and practices, and publicity about them at trial, even if it lost in the end. It would prepare the way for Twitter to be treated by courts and regulated by the government as a common carrier (though that of course raises the additional issue of whether it is a natural monopoly). A big thing is to find a test case which would not be dismissed for other reasons, so it would be good to find defamation per se (a false accusation of a crime, say) of a private figure, with a false statement of fact rather than opinion, widely circulated, and a plaintiff in a favorable state and federal circuit.
=== Caselaw ===
* Eric Goldman's blog
* Section 230 Twitter
* Page v. Oath, 2021 WL 528472 (Del. Superior Ct. Feb. 11, 2021).
* IO Group v. Veoh Networks, 5:2006cv03926 (N.D. Cal. Aug. 27, 2008).
* Blumenthal v. Drudge (1998).
* Downs v. Oath, 2019 WL 2209206 (S.D.N.Y. May 22, 2019).
* UMG Recordings, Inc. v. Veoh Networks, Inc., 2008 WL 5423841 (C.D. Cal. Dec. 29, 2008).
=== Justice Thomas (2021) ===
Thomas, J. (2021):
If part of the problem is private, concentrated control over online content and platforms available to the public, then part of the solution may be found in doctrines that limit the right of a private company to exclude. Historically, at least two legal doctrines limited a company’s right to exclude. First, our legal system and its British predecessor have long subjected certain businesses, known as common carriers, to special regulations, including a general requirement to serve all comers. Candeub, Bargaining for Free Speech: Common Carriage, Network Neutrality, and Section 230, 22 Yale J. L. & Tech. 391, 398–403 (2020) (Candeub); see also Burdick, The Origin of the Peculiar Duties of Public Service Companies, Pt. 1, 11 Colum. L. Rev. 514 (1911). Justifications for these regulations have varied. Some scholars have argued that common-carrier regulations are justified only when a carrier possesses substantial market power. Candeub 404. Others have said that no substantial market power is needed so long as the company holds itself out as open to the public. Ibid.; see also Ingate v. Christie, 3 Car. & K. 61, 63, 175 Eng. Rep. 463, 464 (N. P. 1850) (“[A] person [who] holds himself out to carry goods for everyone as a business . . . is a common carrier”). And this Court long ago suggested that regulations like those placed on common carriers may be justified, even for industries not historically recognized as common carriers, when “a business, by circumstances and its nature, . . . rise[s] from private to be of public concern.” See German Alliance Ins. Co. v. Lewis, 233 U. S. 389, 411 (1914) (affirming state regulation of fire insurance rates). At that point, a company’s “property is but its instrument, the means of rendering the service which has become of public interest.” Id., at 408. 
This latter definition of course is hardly helpful, for most things can be described as “of public interest.” But whatever may be said of other industries, there is clear historical precedent for regulating transportation and communications networks in a similar manner as traditional common carriers. Candeub 398–405. Telegraphs, for example, because they “resemble[d] railroad companies and other common carriers,” were “bound to serve all customers alike, without discrimination.” Primrose v. Western Union Telegraph Co., 154 U. S. 1, 14 (1894) (footnote 2). Governments have also tied such restrictions to benefits, such as “immunity from certain types of suits” (footnote 3) or to regulations that make it more difficult for other companies to compete with the carrier (such as franchise licenses). Ibid. By giving these companies special privileges, governments place them into a category distinct from other companies and closer to some functions, like the postal service, that the State has traditionally undertaken.

Second, governments have limited a company’s right to exclude when that company is a public accommodation. This concept—related to common-carrier law—applies to companies that hold themselves out to the public but do not “carry” freight, passengers, or communications. See, e.g., Civil Rights Cases, 109 U. S. 3, 41–43 (1883) (Harlan, J., dissenting) (discussing places of public amusement). It also applies regardless of the company’s market power. See, e.g., 78 Stat. 243, 42 U. S. C. §2000a(a).

Footnote 2: This Court has been inconsistent about whether telegraphs were common carriers. Compare Primrose, 154 U. S., at 14, with Moore v. New York Cotton Exchange, 270 U. S. 593, 605 (1926). But the Court has consistently recognized that ...

Footnote 3: Telegraphs, for example, historically received some protection from defamation suits. Unlike other entities that might retransmit defamatory content, they were liable only if they knew or had reason to know that a message they distributed was defamatory.
Restatement (Second) of Torts §581 (1976); see also O’Brien v. Western Union Tel. Co., 113 F. 2d 539, 542 (CA1 1940).

B. Internet platforms of course have their own First Amendment interests, but regulations that might affect speech are valid if they would have been permissible at the time of the founding. See United States v. Stevens, 559 U. S. 460, 468 (2010). The long history in this country and in England of restricting the exclusion right of common carriers and places of public accommodation may save similar regulations today from triggering heightened scrutiny—especially where a restriction would not prohibit the company from speaking or force the company to endorse the speech. See Turner Broadcasting System, Inc. v. FCC, 512 U. S. 622, 684 (1994) (O’Connor, J., concurring in part and dissenting in part); PruneYard Shopping Center v. Robins, 447 U. S. 74, 88 (1980). There is a fair argument that some digital platforms are sufficiently akin to common carriers or places of accommodation to be regulated in this manner.

1. In many ways, digital platforms that hold themselves out to the public resemble traditional common carriers. Though digital instead of physical, they are at bottom communications networks, and they “carry” information from one user to another. A traditional telephone company laid physical wires to create a network connecting people. Digital platforms lay information infrastructure that can be controlled in much the same way. And unlike newspapers, digital platforms hold themselves out as organizations that focus on distributing the speech of the broader public. Federal law dictates that companies cannot “be treated as the publisher or speaker” of information that they merely distribute. 110 Stat. 137, 47 U. S. C. §230(c). The analogy to common carriers is even clearer for digital platforms that have dominant market share. Similar to utilities, today’s dominant digital platforms derive much of their value from network size.
The Internet, of course, is a network. But these digital platforms are networks within that network. The Facebook suite of apps is valuable largely because 3 billion people use it. Google search—at 90% of the market share—is valuable relative to other search engines because more people use it, creating data that Google’s algorithm uses to refine and improve search results. These network effects entrench these companies. Ordinarily, the astronomical profit margins of these platforms—last year, Google brought in $182.5 billion total, $40.3 billion in net income—would induce new entrants into the market. That these companies have no comparable competitors highlights that the industries may have substantial barriers to entry. To be sure, much activity on the Internet derives value from network effects. But dominant digital platforms are different. Unlike decentralized digital spheres, such as the e-mail protocol, control of these networks is highly concentrated. Although both companies are public, one person controls Facebook (Mark Zuckerberg), and just two control Google (Larry Page and Sergey Brin). No small group of people controls e-mail. Much like with a communications utility, this concentration gives some digital platforms enormous control over speech. When a user does not already know exactly where to find something on the Internet—and users rarely do— Google is the gatekeeper between that user and the speech of others 90% of the time. It can suppress content by deindexing or downlisting a search result or by steering users away from certain content by manually altering autocomplete results. Grind, Schechner, McMillan, & West, How Google Interferes With Its Search Algorithms and Changes Your Results, Wall Street Journal, Nov. 15, 2019. Facebook and Twitter can greatly narrow a person’s information flow through similar means. 
And, as the distributor of the clear majority of e-books and about half of all physical books (footnote 4), Amazon can impose cataclysmic consequences on authors by, among other things, blocking a listing.

Footnote 4: As of 2018, Amazon had 42% of the physical book market and 89% of the e-book market. Day & Gu, The Enormous Numbers Behind Amazon’s Market Reach, Bloomberg, Mar. 27, 2019.

It changes nothing that these platforms are not the sole means for distributing speech or information. A person always could choose to avoid the toll bridge or train and instead swim the Charles River or hike the Oregon Trail. But in assessing whether a company exercises substantial market power, what matters is whether the alternatives are comparable. For many of today’s digital platforms, nothing is. If the analogy between common carriers and digital platforms is correct, then an answer may arise for dissatisfied platform users who would appreciate not being blocked: laws that restrict the platform’s right to exclude. When a platform’s unilateral control is reduced, a government official’s account begins to better resemble a “government-controlled spac[e].” Mansky, 585 U. S., at ___ (slip op., at 7); see also Southeastern Promotions, 420 U. S., at 547, 555 (recognizing that a private space can become a public forum when leased to the government). Common-carrier regulations, although they directly restrain private companies, thus may have an indirect effect of subjecting government officials to suits that would not otherwise be cognizable under our public-forum jurisprudence. This analysis may help explain the Second Circuit’s intuition that part of Mr. Trump’s Twitter account was a public forum. But that intuition has problems. First, if market power is a predicate for common carriers (as some scholars suggest), nothing in the record evaluates Twitter’s market power.
Second, and more problematic, neither the Second Circuit nor respondents have identified any regulation that restricts Twitter from removing an account that would otherwise be a “government-controlled space.”

2. Even if digital platforms are not close enough to common carriers, legislatures might still be able to treat digital platforms like places of public accommodation. Although definitions between jurisdictions vary, a company ordinarily is a place of public accommodation if it provides “lodging, food, entertainment, or other services to the public . . . in general.” Black’s Law Dictionary 20 (11th ed. 2019) (defining “public accommodation”); accord, 42 U. S. C. §2000a(b)(3) (covering places of “entertainment”). Twitter and other digital platforms bear resemblance to that definition. This, too, may explain the Second Circuit’s intuition. Courts are split, however, about whether federal accommodations laws apply to anything other than “physical” locations. Compare, e.g., Doe v. Mutual of Omaha Ins. Co., 179 F. 3d 557, 559 (CA7 1999) (Title III of the Americans with Disabilities Act (ADA) covers websites), with Parker v. Metropolitan Life Ins. Co., 121 F. 3d 1006, 1010–1011 (CA6 1997) (en banc) (Title III of the ADA covers only physical places); see also 42 U. S. C. §§2000a(b)–(c) (discussing “physica[l] locat[ions]”). Once again, a doctrine, such as public accommodation, that reduces the power of a platform to unilaterally remove a government account might strengthen the argument that an account is truly government controlled and creates a public forum. See Southeastern Promotions, 420 U. S., at 547, 555. But no party has identified any public accommodation restriction that applies here.

II. The similarities between some digital platforms and common carriers or places of public accommodation may give legislators strong arguments for similarly regulating digital platforms.
“[I]t stands to reason that if Congress may demand that telephone companies operate as common carriers, it can ask the same of ” digital platforms. Turner, 512 U. S., at 684 (opinion of O’Connor, J.). That is especially true because the space constraints on digital platforms are practically nonexistent (unlike on cable companies), so a regulation restricting a digital platform’s right to exclude might not appreciably impede the platform from speaking. See id., at 675, 684 (noting restrictions on one-third of a cable company’s channels but recognizing that regulation may still be justified); PruneYard, 447 U. S., at 88. Yet Congress does not appear to have passed these kinds of regulations. To the contrary, it has given digital platforms “immunity from certain types of suits,” Candeub 403, with respect to content they distribute, 47 U. S. C. §230, but it has not imposed corresponding responsibilities, like nondiscrimination, that would matter here. None of this analysis means, however, that the First Amendment is irrelevant until a legislature imposes common carrier or public accommodation restrictions—only that the principal means for regulating digital platforms is through those methods. Some speech doctrines might still apply in limited circumstances, as this Court has recognized in the past. For example, although a “private entity is not ordinarily constrained by the First Amendment,” Halleck, 587 U. S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” Bantam Books, Inc. v. Sullivan, 372 U. S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. 
See ibid.; Blum v. Yaretsky, 457 U. S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats. But no threat is alleged here. What threats would cause a private choice by a digital platform to “be deemed . . . that of the State” remains unclear. Id., at 1004 (footnote 5). And no party has sued Twitter. The question facing the courts below involved only whether a government actor violated the First Amendment by blocking another Twitter user. That issue turns, at least to some degree, on ownership and the right to exclude.

Footnote 5: Threats directed at digital platforms can be especially problematic in the light of 47 U. S. C. §230, which some courts have misconstrued to give digital platforms immunity for bad-faith removal of third-party content. Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U. S. ___, ___–___ (2020) (THOMAS, J., statement respecting denial of certiorari) (slip op., at 7–8). This immunity eliminates the biggest deterrent—a private lawsuit—against caving to an unconstitutional government threat. For similar reasons, some commentators have suggested that immunity provisions like §230 could potentially violate the First Amendment to the extent those provisions pre-empt state laws that protect speech from private censorship. See Volokh, Might Federal Preemption of Speech-Protective State Laws Violate the First Amendment?, The Volokh Conspiracy, Reason, Jan. 23, 2021. According to that argument, when a State creates a private right and a federal statute pre-empts that state law, “the federal statute is the source of the power and authority by which any private rights are lost or sacrificed.” Railway Employees v.
=== Miscellaneous ===
* "Public forum" is a term of constitutional significance: it refers to the public space that the government provides, not a private website at which people congregate. Courts have repeatedly held that social media platforms are not subject to the "public forum doctrine."
* But what if the government bans private companies from doing something, and then sets up its own public space?
* What about airwaves, TV, and the FCC?