For whatever it's worth, here is a piece on how private social media companies should handle "free speech," if that is what they want to do:
https://www.persuasion.community/p/the-twitter-files-show-its-time-to?utm_source=post-email-title&publication_id=61579&post_id=90100626&isFreemail=true&utm_medium=email

"First, it means giving up the quest for a free speech utopia and embracing viewpoint neutrality. There is no way to create any meaningful free speech environment that allows for actual debate while protecting participants from hurtful ideas or painful speech. Executives at Twitter or Meta are no better than college administrators at crafting the perfect speech code. The brightest minds have already made that effort, and even the brightest minds have failed.
"Second, it means moderating on the basis of traditional speech limits. Even institutions that embrace viewpoint neutrality will place limits on speech. They’ll have to. If there is one thing we know from decades of experience with the internet, it is that completely unmoderated spaces can and do become open sewers that are often unsafe for children and deeply unpleasant for adults. Unmoderated spaces can become so grotesque that they’re simply not commercially viable.
"“Viewpoint neutral” is thus not a synonym for “unmoderated.” Consistent with viewpoint neutrality, a platform can impose restrictions that echo offline speech limitations. Defamation isn’t protected speech. Neither is obscenity. Harassment is unlawful. Invasions of privacy (doxxing, for example) should face sanctions. Threats and incitement violate criminal law. A platform can say, “Children are present. No nudity.”
"It is easy to imagine different rules that make it easier to talk about issues and harder to target individuals. Examples of viewpoint-neutral time, place, and manner regulations that could prevent some of the worst conduct on Twitter include limiting or eliminating the quote-tweet function, limiting the visibility of replies to other users’ tweets, or limiting the ability of users to reply to or interact with tweets of people they don’t follow.
"Third, it means embracing clarity and transparency. Make rules clear. Create an appeals process when users are penalized. No human institution is ever going to apply its rules perfectly, and accountability is necessary. Secrecy in decision-making can impair trust every bit as thoroughly as flaws in the substance of the decisions made."
This was certainly interesting:
"Indeed, one of the interesting lessons of the last few years is that social media censorship is both divisive and ineffective. It often backfires. In a free society, attempts to censor speech often create a demand for that speech. Twitter censoring the Hunter Biden story, for example, didn’t squelch its reach. Internet searches for Hunter Biden skyrocketed after Twitter took action.
"The idea that censoring speech can have the opposite effect is so well-known that it has a term—the Streisand Effect. In 2003, Barbra Streisand sued to have a picture of her home removed from an internet site. At the time she filed the suit, the image had only been downloaded a grand total of six times (twice by her lawyers). After her suit hit the news, the image was downloaded 420,000 times in a single month.
"The reality of the Streisand Effect can create perverse incentives. Bad actors will intentionally court suspensions or flirt with outright bans to generate attention and sympathy."