• For our 10th anniversary on May 9th, 2024, we will be giving out 15 GB of free, off-shore, DMCA-resistant file storage per user, and very possibly, public video hosting! For more details, take a look at our roadmap here.

    Welcome to the edge of the civilized internet! All our official content can be found here. If you have any questions, try our FAQ here or see our video on why this site exists at all!

"Freedom of speech is too much in the Information Age."

Arnox

Master
Staff member
Founder
Messages
5,285

Had this interesting exchange with someone on Reddit. Maybe you guys will find it interesting as well.

SpellingSocialist said:
I think that was the old theory, and we are starting to realize that it simply isn't true when the creation of "speech" is virtually cost-less (in terms of effort, money, reputation, time, etc. needed). In the past, you needed at least one of those things to disseminate a message. Now, as a nearly-anonymous person I can record myself doing something / saying something completely wrong or nonsensical on my phone, upload the video to Reddit, and as long as it resonates with an audience, my message will be amplified. It costs me virtually nothing, whereas higher quality speech, presumably refuting my misinformation, would take more time, more effort, and an actual reputation (at a minimum).
Arnox said:
Remember the urban myth that lemmings were so dumb, they would jump off a cliff? That myth was (basically) started with the Disney film, White Wilderness, in 1958. And look how long that myth held on without the modern internet. Not satisfied? I got more. How about the myth that the tongue has certain areas responsible for certain tastes? Or how about the myth where we only use 20% of our brain? (Or was it 25%? I don't know. Don't care. It's a stupid myth.) Or how about the myth that Columbus' crew, when sailing the Atlantic, panicked because they thought the world was flat?

In summary though, it goes both ways. Information disseminated slowly is debunked slowly. Information disseminated fast is debunked fast.

There are other factors here that you're overlooking as well, such as the fact that Twitter has a RAMPANT bot problem, and Twitter apparently doesn't give a crap about it. How much less nonsense would we have to put up with just on that site alone, I wonder, if the site staff were actually acting responsibly? But I guess they don't care because it looks good on a graph for investors. "Wowie! Look at all our site engagement!"

And finally, I'll bring one last example up. Remember how so many people were making and eating Nyquil Chicken after seeing a TikTok video? Oh wait, that didn't actually happen. Some fucko news station had a slow news day and decided to report on some rando humorous TikTok video that was made YEARS ago. Other news stations just decided to run with it because, "Wowie! Look at all that engagement!"

The "ignorant and unwashed masses" are not the issue. Freedom of speech is not the issue. The people at the top pulling the strings though ARE. People who are trying to scare you into giving up your rights, your power, your speech.
 

Arnox

Master
Staff member
Founder
Messages
5,285
SpellingSocialist said:
I think what you're missing in your summary about (mis)information disseminated quickly or slowly is that 1) a given percentage of people will never hear the correction, and 2) the percentage of people that accept and believe the correction is a function of how believable the correction is. And frankly, it takes much more time, effort, money, reputation, etc. to make a believable correction. Many, many people still believe in lemmings and NyQuil chicken.
There is a reason that misinformation spreads like it does, and that is because we *want* to believe it for some reason. It's funny to believe in lemmings throwing themselves off the cliff - we feel superior thinking that NyQuil chicken eaters are so much stupider than us. The only way to make the corrections believable is to make them equally "sticky" - so the lemmings correction focuses not so much on the lemmings themselves (because we don't actually give much of a shit about lemmings) but on the disreputable filmmaker who tried to pull one over on us! The NyQuil chicken correction becomes far less focused on the food or the craze (because that's honestly boring) and more on how the news media and corporate interests are trying to manipulate us to keep our attention! Anger! Outrage!
But obviously, it takes time, effort, money, reputation, etc. to make these corrections interesting. And the corrections are almost always going to be less interesting than the original misinformation, they're going to spread less quickly, and they're going to spread less broadly.
Finally, I didn't say that the "ignorant and unwashed masses" are the issue, or that freedom of speech is the issue. I simply said that in a world where speech has virtually no cost attached to it, the old theories don't hold up well.
Arnox said:
At first, I was gonna break things down, but before I do that, I want to clarify this. What do you mean exactly when you say, "the old theories"?
SpellingSocialist said:
You've said that "the best way to fix bad speech is more speech." I think the ultimate distillation of that is "more speech = better" and "more information = better".
In my opinion, that is the old theory.
Arnox said:
Alright then. So, to continue, I think you're greatly overestimating how much it takes to debunk a myth. And especially one that was thought up without any basis for it whatsoever. I think your opinion of the general (adult) populace is too low as well. And that's another thing people keep forgetting. How many easily impressionable kids there are online. And even if they are impressed, those kids generally grow up. They start asking questions.
And even if all that wasn't actually a factor, consider the alternative then. If people are as vulnerable to misinformation as you say, then do you really think it's a good idea to put those vulnerable people in charge of deciding what is misinformation and what isn't? Or maybe we should just tell people to start taking responsibility for what they believe instead of trying to do everything for them. The U.S. was not just built on personal freedom, it was also built on independence and responsibility, and I think way too many people forget that.
SpellingSocialist said:
I don't believe I'm overestimating how much it takes to debunk a myth, and I believe my estimates are backed up by most research on the topic. I will share a few sources, and selected quotes from them:

https://twin-cities.umn.edu/news-events/science-debunking-misinformation

“If misinformation has already spread, individuals can try to debunk it. This strategy follows three steps by first explaining why the mistaken information was thought to be correct, then sharing why the information is wrong, and lastly explaining why the alternative is correct. While this can be challenging and time consuming, it can be an effective strategy in stopping misinformation in its tracks.” - Panayiota Kendeou, Ph.D.

(my point being that debunking misinformation takes more effort than creating it in the first place, emphasis mine)

“While fact-checking can help address misinformation, and even reduce a person’s belief in false or inaccurate information, misinformation can still influence people’s thinking even after it has been proven that the information is incorrect." - Emily Vraga, Ph.D.

(this is directly linked to confirmation bias. Once we believe something is true - in this case the misinformation - we tend to interpret new information in light of what we already believe. This makes it less likely that any new information will change our mind regarding our initial belief, which was created through misinformation. Emphasis mine)

https://journals.sagepub.com/doi/pdf/10.1177/0956797617714579

From their recommendations on how to combat misinformation:

"Recommendation 3: correct misinformation with new detailed information but keep expectations low. The moderator analyses indicated that recipients of misinformation are less likely to accept the debunking messages when the countermessages simply label the misinformation as wrong rather than when they debunk the misinformation with new details (e.g., Thorson, 2013). A caveat is that the ultimate persistence of the misinformation depends on how it is initially perceived, and detailed debunking may not always function as expected." (emphasis mine)

I don't know if it's a good idea to put "vulnerable people" in charge of determining what is misinformation and what isn't. I think that path is fraught with danger. However, simply putting forth more and more information and expecting 1) good information to push out bad and 2) people to take responsibility for what they believe and use critical thinking skills doesn't appear to be working - and that was the old theory about what worked.
Arnox said:
Honestly, it sounds like you actually DO believe in freedom of speech but just disagree as to how exactly myths should be debunked. And the thing is... I don't actually disagree with any of those studies you linked. In fact, I would argue that in order to best employ these linked strategies, you need a free-speech-friendly environment. Not entirely related, but this is why I bang on so much about us needing to return to old-school OG forums. Sure, they weren't perfect, but they've proven to be by far the best social medium we have for logical long-form discussions and many many things in between. The format is extremely flexible and powerful.
Consider Reddit itself here. And yes, it's technically a forum, but look at our post scores. Both of us have gotten downvoted even though both of us are calmly sharing ideas and opinions in a logical, rational discussion, very likely because someone simply didn't like what they were reading. Of course, upvoted posts are promoted, and downvoted posts are pseudo-censored. This is what I'm talking about. These modern social media systems that infect sites like Reddit and Twitter are part of the issue: they get used and abused, and they actively promote disinformation. For Reddit and Twitter, it can get especially egregious too when a staff member decides they don't like what you have to say and just deletes your post and/or bans you. And does the staff member face consequences for this? They do not.
SpellingSocialist said:
Of course I believe in freedom of speech. I've never said I don't. However, just as in a free-market society we must recognize that market failures exist and so markets must be regulated or replaced in certain situations, in a free-speech society we must recognize that there are certain situations where free speech is not always optimal.
By the way, I don't believe there is a perfect answer. However, I want to really emphasize that the current way of thinking among technocrats, idealists, and visionaries - that good data displaces bad, good information pushes out bad, and more is always better - is simply and obviously not working.
Last thing: I get where you're going about the problems inherent in moderation and censorship, but look at a place like AskScience, where moderation is strict and quality of information is high. Look at forums of the past, like SomethingAwful, where buy-in was $10, moderation was fanatical, and humor quality and density was extremely high. And finally, I've never had a comment or post deleted here on Reddit - but I've seen factually incorrect comments bubble their way to the top of a thread many, many times. The correction is occasionally equally upvoted - occasionally, but not always.
One last thought on forums: the forum structure is good, but it still gives unfair weight to earlier comments over later (and especially over middle) comments, as nearly everyone who opens a thread will read the early comments, some will read the last few, and just a few will read the middle comments. Additionally, activity bumps a thread to the top of a forum, ensuring that popular or controversial topics receive the most attention. I think it would be interesting to see a more Rawlsian format, where thread ranking within a forum was entirely randomized, combined with reply threads (like Reddit) but where each reply chain's ranking is also randomized and not based on upvotes/downvotes. However, obviously readers/visitors would not be well-served by this kind of approach (from a satisfaction or entertainment perspective. From a moral perspective, it would be quite good).
Arnox said:
Well, the problem for me isn't actually moderation itself per se. 4chan is an (in)famous example of a place where it's mostly a zoo. So I definitely agree with SOME rules. What I see way too often, though, are unnecessary and/or overbearing rules that harm more than they help. But yes, I think we've reached a general agreement and I'll just leave it here for now. Thanks for the good discussion! I don't get much of that on this damn website...