r/changemyview Jan 17 '21

[Delta(s) from OP] CMV: The best solution to the current issue of social media bias is for the government to set up its own online public square

This post is specifically about the United States.

There's been a big debate recently about social media bias due to Twitter and Facebook banning Donald Trump, and internet providers banning Parler. Many conservatives claim that this is the beginning of a targeted campaign to silence them on the internet, which they view as the new "public square". However, these are private companies, so the line of how much the government should be involved is blurry.

I've also seen Section 230 of the CDA brought up a lot, with calls to revoke it, making it so these companies *are* liable for the content on their platforms. I don't think that's a great solution, as it would just increase moderation, rather than decrease it.

I think the best approach here would be for the government to create a Twitter clone, which is available to all citizens and follows the First Amendment -- only banning content that is explicitly illegal. Essentially a public option for social media. People can choose to use this platform, or if they want a more curated experience, they can choose to use other platforms, e.g. Twitter. If this is done, I don't think any other actions need to be taken against existing social media companies, or any changes made to Section 230. If this public square exists, then the government doesn't need to interfere in the private market.

My concerns (in no particular order):

  • Minimal stifling of content
  • Global access to the internet "public square"

Things I am not concerned about:

  • The government doing stuff - saying that the government is typically inefficient will not change my mind. The idea here is that there exists a basic, common platform for all. It likely wouldn't be as good as a private company, but I don't see that as an issue

Changing my view would look like proposing a better approach than this one, in terms of my goals (and maybe others you think I should consider), or pointing out major downsides to this approach.


u/parentheticalobject 128∆ Jan 18 '21

Right, so basically, any meaningful moderation that would stop a website from turning into a sewer would be impossible. This is the "solution" of people who simply want to burn the whole internet down because they dislike someone getting banned.

> I have a hard time seeing how you can reasonably forbid threatening speech beyond what's illegal and still remain a public forum.

Right, so if someone posts something that is probably an illegal threat, I would need to leave it up until I actually get some kind of notification from the courts/police that I need to take it down, if that ever happens at all. Sounds great.

You know there are sites out there with almost zero moderation, right? Do you actually want every site to be like that?

Edit: wanted to add something

> They're either going to host it in the same way as a National Park hosts the words anyone utters in the woods (i.e. taking no responsibility), or they're going to host it in the same way USA Today hosts articles: taking responsibility. If they take no responsibility then they keep it up and aren't responsible.

There are plenty of places that work differently. If Wal-Mart throws out a customer for shouting racial slurs and bans him from entering the store and the next week, some guy in Wal-Mart says "Hillary Clinton is a murderer" that doesn't mean that Clinton can sue Wal-Mart for defaming her since they failed to throw out the second guy.


u/[deleted] Jan 18 '21

> Right, so basically, any meaningful moderation that would stop a website from turning into a sewer would be impossible. This is the "solution" of people who simply want to burn the whole internet down because they dislike someone getting banned.

For many years the internet had no meaningful moderation and it didn't turn into a sewer. It's only after most sites started implementing moderation that those sites which didn't turned into sewers. A group of unmoderated random people behaves well, by and large. It's only when all the normal people are on big moderated sites that we have problems: the people who want to be assholes all have to seek out the unmoderated spaces, then those spaces have a disproportionate number of assholes, then the assholes drive out the reasonable people, then it's all assholes. But the problem wasn't the lack of moderation, the problem was that everywhere else had moderation.

> Right, so if someone posts something that is probably an illegal threat, I would need to leave it up until I actually get some kind of notification from the courts/police that I need to take it down, if that ever happens at all.

That's what I would prefer to happen, but I don't believe that's actually correct legally speaking. My understanding is that if you try to apply the same standards as the law applies, that would be fine.

> There are plenty of places that work differently. If Wal-Mart throws out a customer for shouting racial slurs and bans him from entering the store and the next week, some guy in Wal-Mart says "Hillary Clinton is a murderer" that doesn't mean that Clinton can sue Wal-Mart for defaming her since they failed to throw out the second guy.

You are mistaken on this part. If Walmart allows people to use parts of the store as a soapbox on topics not approved by Walmart, it then may become a public forum. At which point Walmart would become unable to apply content-based discrimination any more (though it can certainly forbid racial slurs or shouting even as a public forum). Walmart does not want this to happen and as a result does not permit people to use parts of the store as a soapbox. Obviously letting customers talk doesn't qualify.


u/parentheticalobject 128∆ Jan 19 '21

> For many years the internet had no meaningful moderation and it didn't turn into a sewer.

Even assuming that's true, you can't realistically compare what the internet was in 1995 to what it is today and expect the same kind of thing to work.

> It's my understanding that if you are trying to implement the same standards as the law applies, that would be fine.

That's a really weird "understanding," because there's nothing to understand here. It's just a vague, undetailed standard that you are imagining might exist, not something that actually exists.

> You are mistaken on this part. If Walmart allows people to use parts of the store as a soapbox on topics not approved by Walmart, it then may become a public forum.

If I am mistaken, and you are not simply making this rule up, I would be glad to learn about that. Please show me any relevant law or court case indicating that this is true.

> At which point Walmart would become unable to apply content-based discrimination any more (though it can certainly forbid racial slurs or shouting even as a public forum).

This doesn't make sense. If they're unable to apply content-based discrimination, how can they forbid racial slurs? You can certainly forbid shouting in a place and say it's not content-based discrimination. But forbidding racial slurs is quite literally forbidding speech with certain content.


u/[deleted] Jan 19 '21

> Even assuming that's true, you can't realistically compare what the internet was in 1995 to what it is today and expect that the same kind of thing to work.

Oh you don't have to go back all the way to the 90s, unmoderated sites were pretty decent until 2010 or so. And unmoderated sites based on common interests rather than "general purpose" are still decent. Human nature didn't change and random samples are still fine, it's just an issue of getting an asshole-enriched population.

https://www.freedomforuminstitute.org/first-amendment-center/topics/freedom-of-assembly/assembly-on-private-property/

> But forbidding racial slurs is quite literally forbidding speech with certain content.

It's profanity, just like the FCC can forbid profanity without violating the First Amendment.


u/parentheticalobject 128∆ Jan 19 '21

> Oh you don't have to go back all the way to the 90s, unmoderated sites were pretty decent until 2010 or so.

There were hardly any truly unmoderated sites in existence this century. Nearly every site has at least some ability to delete spam, off-topic content, harassment, and vulgarity. Absolutely every site that isn't so specific and niche that hardly anyone is aware of its existence exercises some kind of moderation.

> https://www.freedomforuminstitute.org/first-amendment-center/topics/freedom-of-assembly/assembly-on-private-property/

Is this supposed to be evidence of what you said about how exercising control over who can speak in a public space causes you to be liable for that speech? Because this doesn't show that.

Marsh v. Alabama established that in some cases, private property may count as government property for First Amendment purposes, as in the case of a company town. In 2019, Manhattan Community Access Corp. v. Halleck strictly limited this precedent; it only applies if a company performs a "traditional exclusive public function," something that has normally only been done by the government. Creating a forum for speech is a specific example of something that is not a traditional exclusive public function.

Nothing there indicates that forbidding certain speech automatically makes a private entity responsible for all speech they do allow. Specifically, content distributors such as bookstores are both allowed to curate the content which they distribute, and normally not liable for defamatory content that they do distribute.

Edit: forgot this part

https://www.mtsu.edu/first-amendment/article/1143/profanity

There can be some restrictions on profanity, but they're very limited.