YouTube, Social Networks

YouTube Waits Until a Month After the Election To Act on False Claims of Election Fraud (cnn.com) 419

YouTube is taking belated action on election misinformation: The company said it would now remove misleading videos that claim widespread fraud or other errors changed the outcome of the US presidential election. From a report: Google-owned YouTube announced that it would begin enforcing against this content on Wednesday, citing Tuesday's safe harbor deadline for the US election, which is the date after which state election results cannot effectively be challenged. YouTube said that enough states have certified their election results to determine a President-elect. National news outlets have universally projected that Joe Biden will be the next President.

As an example of content that would be banned, YouTube said it would take down videos claiming that a presidential candidate won the election due to widespread software glitches or errors in counting votes. It will begin enforcing the policy starting Wednesday, and said it would "ramp up" efforts in the weeks to come. It will still allow videos, including news coverage and commentary, to remain on the platform if they have enough context. Any videos in violation of the policy that were posted prior to Wednesday will remain up even though they now break YouTube's rules. They will feature an information panel that says election results have been certified.

This discussion has been archived. No new comments can be posted.

  • by SirAbbadon ( 6783790 ) on Wednesday December 09, 2020 @01:16PM (#60812396)
    I find it terribly troubling that the content hosts have coronated themselves the arbiters of truth. Well, what they think is true. Today. And have decided to silence voices that may see things differently. The dystopian authors assumed they would ban books, or reprint them. Now they just click a few buttons and can silence dissent. And I for one, welcome our new insect overlords...
    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Wednesday December 09, 2020 @01:37PM (#60812518)
      Comment removed based on user account deletion
      • by MBGMorden ( 803437 ) on Wednesday December 09, 2020 @01:49PM (#60812574)

        If you're creating a system for forwarding other people's content, sooner or later you're going to have to admit that you're at least partially responsible for the more dangerous bullshit out there.

Part of the American mindset has always been that no idea is so "dangerous" that discussion of it must be forbidden. Deciding to silence voices because they cite things you do not believe to be true basically means that whoever is in power can determine what is "true" and shoot down those who disagree.

The Catholic Church, for example, was in power in the 17th century and KNEW the "truth" - Galileo Galilei spent the last nine years of his life under house arrest for daring to speak otherwise.

I say this as someone who actually doesn't believe that widespread fraud occurred in the election, but the handling of the situation by the social media companies has been absolutely reprehensible, if not criminal.

        • by Anonymous Coward on Wednesday December 09, 2020 @02:27PM (#60812784)

YouTube is not silencing anyone; it's just choosing not to be their megaphone.

          • by Baleet ( 4705757 )
            Choosing not to be their megaphone, though, is choosing not to allow someone what amounts to a soapbox in the public square. The marketplace of ideas used to play out in the letters to the editor of the local newspaper. Since social media has replaced that, the posts in Facebook, YouTube, and other social media sites are where the public debate takes place. So, sure, YouTube is not a governmental entity. But it is, in fact, operating as a public space. If a restaurant is open to the public, it cannot refuse
        • by jbengt ( 874751 )

Part of the American mindset has always been that no idea is so "dangerous" that discussion of it must be forbidden.

          Part of the American ideal maybe, but not part of the American mindset.

        • by LostMyAccount ( 5587552 ) on Wednesday December 09, 2020 @03:02PM (#60812958)

It's the amplification and reach of so-called dangerous ideas that have changed.

No idea was too dangerous in the pre-social media era because the people who owned the printing presses and broadcast outlets had advertisers and others to whom they were responsible, so they self-regulated. Plus they were run by professionals with a status stake in their professional output.

          You could still be a fringe character promoting your message, but you had a reach limited by who you could distribute your message to and your ability to publish it in sufficient quantity with your own money.

The problem seems definitely linked to social media, which has more or less been designed to influence people. Without social media, I'd be kind of surprised if YouTube would need to censor anything, because it would be difficult to even get enough people interested in wacko videos about dead Venezuelans stealing the election for them to matter.

I'd love to be stuck on a desert island with Sidney Powell to find out how much of her schtick she *actually* believes when there's no other audience to hear her admit it really is all a schtick and she doesn't actually buy any of it. It's kind of hard to believe how someone can be so apparently intelligent yet so far off the deep end.

        • by thegarbz ( 1787294 ) on Wednesday December 09, 2020 @04:41PM (#60813500)

          Deciding to silence voices

          Not amplifying is not the same as silencing.

      • Re: (Score:2, Interesting)

        by sosume ( 680416 )

        Who decides on what is dangerous bullshit? How about in 10 years from now?

    • I've read the first amendment but can't find where it says private companies must publish all content uploaded to their platform. Perhaps you can point me towards it? Are you also aware of many other video hosting sites besides YouTube? Sounds like the free market to me.

  • by rsilvergun ( 571051 ) on Wednesday December 09, 2020 @01:19PM (#60812410)
    since the end of the election, so mission accomplished. At this point it doesn't matter, the grift was wildly successful.

    I do wonder if this'll backfire on his party though. Most of those donations are likely to be from small donors (there's one rather famous example that turned into a lawsuit) simply because anyone who's rich enough to hand out large donations is probably clever enough to understand that Trump lost in a manner that no amount of lawsuits will change (he could probably get one state legislature to hand him an unearned victory Bush v Gore style but he'd need to do that 3 or 4 times with how the EC played out).

    This may end up starving smaller down ballot races of funding as those small donors are gonna be kind of broke, and a lot of them are feeling burned right now (my political forums are choked with posts from people who donated and realized the effort to overturn not only isn't working but was never going to work...).
    • by cats-paw ( 34890 )

      This may end up starving smaller down ballot races of funding as those small donors are gonna be kind of broke, and a lot of them are feeling burned right now

      This, of course, is the purpose of gerrymandering. No need to campaign if you have a lock on your district.

  • Dilbert (Score:5, Interesting)

    by Orgasmatron ( 8103 ) on Wednesday December 09, 2020 @01:19PM (#60812412)

    It will be interesting to see how Youtube handles Scott Adams. Since election day, he's posted maybe 30 videos where he:

1) makes the case that widespread election fraud must necessarily have happened because the incentives to commit fraud were massive, the risk of detection very low, and the risk of meaningful punishment even lower.

    2) argues that each of the specific examples of fraud that you've heard about is very likely to be false, as are the allegations in the thousands of sworn statements that have been collected.

    I presume that he is going to keep saying these things, so it will be interesting to see if Youtube cancels him or not. He has, I think intentionally, planted himself firmly in the grey area.

    • Re:Dilbert (Score:5, Insightful)

      by magzteel ( 5013587 ) on Wednesday December 09, 2020 @03:05PM (#60812982)

      It will be interesting to see how Youtube handles Scott Adams. Since election day, he's posted maybe 30 videos where he:

1) makes the case that widespread election fraud must necessarily have happened because the incentives to commit fraud were massive, the risk of detection very low, and the risk of meaningful punishment even lower.

      2) argues that each of the specific examples of fraud that you've heard about is very likely to be false, as are the allegations in the thousands of sworn statements that have been collected.

      I presume that he is going to keep saying these things, so it will be interesting to see if Youtube cancels him or not. He has, I think intentionally, planted himself firmly in the grey area.

      Scott is a perfect example of why our tech overlords shouldn't be in charge of what we are allowed to hear. You may disagree with him but he presents coherent, non-inflammatory, persuasive arguments. There is no reason to suppress what he has to say but it could easily happen. I don't know what the right answer is but something has to change. Big tech has become much too powerful.

  • CAN WE PLEASE... (Score:5, Insightful)

    by xession ( 4241115 ) on Wednesday December 09, 2020 @01:20PM (#60812414)
    Stop asking for these corporations to act as supposedly impartial arbiters of truth? The end game for this is not to the benefit of the actual truth, but to the "truth" as viewed from the perception of whatever is most beneficial to the corporate interests at hand.

Imagine for a moment that YouTube were in a position of monopolistic control over streaming video content and journalists wanted to make videos discussing this. What if, from YouTube's perspective, it isn't truthful that they are a monopoly because other sites are still allowed to exist, regardless of their impact on the market?

Or the often trotted out example of the Iraq war... What if in 2003 we had streaming services aligning with corporate news, fully claiming that it was untruthful to suggest Iraq had weapons of mass destruction and intent to use them against the US? What's the recourse when the censors are wrong? Hint: There is none. By the time recourse could happen, the damage is done.

Censorship NEVER works. It only winds up being used as a tool to shut down inconvenient conversations. If someone has actually published some bullshit, call it out as such. Get public consensus that it's bullshit and be done with it. This has happened countless times through history to great effect. Instead, people seem to be itching for a digital version of Fahrenheit 451 to really bring us fully into our dystopian future. To that I say fuck you.
"whatever is most beneficial to the corporate interests at hand." This right here. Anyone who thinks mega-corporations act "ethically" is either naive or unethical themselves. The ONLY things that matter are profits and appeal to their largest customer base.
    • by Gravis Zero ( 934156 ) on Wednesday December 09, 2020 @01:35PM (#60812502)

      Stop asking for these corporations to act as supposedly impartial arbiters of truth?

Since it's their platform, it's literally both their prerogative and their responsibility. The only way around this is to build your own platform, decline to arbitrate anything, and hope you don't lose your protections under Section 230 because you didn't make a good faith effort to moderate your platform. Disinformation does actually harm people (many have been killed as a result of it), so not taking it down would be a failure to moderate the platform.

It's the law, so until Section 230 is amended, it will remain this way.

    • by hey! ( 33014 ) on Wednesday December 09, 2020 @02:48PM (#60812870) Homepage Journal

      The problem is that platforms like YouTube or Facebook aren't designed to present users with an informative cross section of news. They employ algorithms which maximize user engagement. If they allow conspiracy theories to flourish on their site, their sites will turn into brainwashing machines.

      The real answer is not to use social media to shape your world views. But social media companies won't like that, because their mission is to keep you on their site every waking moment; if you doubt me, try installing Facebook's mobile app.

So social media companies are stuck. They don't *want* to be arbiters of the truth, but their mission to fill every waking moment of your life becomes unthinkably destructive if they let it fill your consciousness with falsehood, paranoia and superstition. If you don't believe that, just imagine what the antifa side of the house is seeing in its feed. So these companies do the predictable corporate thing, which is pay some lip service and hope you won't notice they're poisoning your mind.

      What they *could* do is tweak your feed so you see a cross-section of views. But that's like asking General Mills to try to make your breakfast more nutritious. You end up with the "part of this nutritious breakfast" incantation, with a picture of what would be a healthy breakfast if you deleted the Lucky Charms. General Mills doesn't really want you to fill your belly up with nutritious food they don't sell. They don't want to be blamed for the consequences of you eating the nutritional equivalent of a half a cup of sugar with a vitamin pill on the side.

I am sure Putin would love the same YouTube policies in Russia. Citizens should not be going around complaining about elections on the Internet.

    • You fucking dipshit.

      If you bitch loudly enough about an election in Russia, you fall off of a fucking balcony.
      Some private corporation doesn't kick you off of their property.

      Christ- dumbfucks like you are dooming us.
  • It has been clear for a while what your understanding of "Don't be Evil" is all about. Sergey and Larry, you should be ashamed of yourselves, all the more so because you would still probably have more money than you can reasonably spend without having allowed your company to become almost as despicable as Microsoft.
  • by davide marney ( 231845 ) on Wednesday December 09, 2020 @01:34PM (#60812496) Journal

    I've never quite understood the business side of censorship. If someone says something Utterly Stupid and Clearly Wrong but it attracts 100,000 views, hey, that's 100,000 opportunities to make an advertising buck. If what they said was illegal, then obviously that might open you up to legal risk. In that case, those 100,000 opportunities are also 100,000 possible legal actions. From a cost-benefit perspective, that sounds like a losing idea.

But barring promoting clearly illegal speech, there's no downside that I can see. Sure, I can imagine a need to target what advertising appears on what content, especially to make sure that your high-value advertisers don't wind up looking like they are promoting the Utterly Stupid and Clearly Wrong people. But flat-out censorship?

    Nah. There's lots and lots of people who will buy stuff advertised on Utterly Stupid and Clearly Wrong posts.

    • by sinij ( 911942 )
The business side of censorship is exerting power in order to collect special treatment at a later stage. In this particular case, Google is using YouTube censorship to the benefit of the Democratic party, and the payout will be Democrats turning a blind eye to anti-competitive and monopolistic behavior by Google.
    • by SirSlud ( 67381 )

If you believe that misinformation leads to an environment that threatens the institutions that permit you to make money, there's your self-interest. Weakening public trust in the institution of federal elections could reasonably be considered one such example. If enough people spread enough misinformation about how skipping work makes you eligible for a giant government grant, and enough people believe it that it hurts your employee productivity, there's another.

      There are lots of examples where that m

    • The reason is that Youtube is trying to head off a boycott - an advertiser boycott would hit first.

    • I'm surprised that you first state that you don't understand the business side of censorship and then you explain it perfectly.

      High-value advertisers don't want to be associated with Utterly Stupid and Clearly Wrong. And if your platform exceeds some "USCW" threshold, the high-value advertisers will leave even if you promise to put their ads only with the "good" content. And some of the high-value content creators will leave for a platform with a better reputation. Your own brand value suffers

      Sure yo

    • That's what the tech companies did, make money off misinformation, until the backlash grew too large for them to ignore.
  • An election so clearly aboveboard that videos that disagree must be removed... Did Maduro, Putin, and Xi help craft these guidelines?

It has come to the point where France-based Dailymotion and UK-based Bitchute respect free speech more than a US-based "platform", and this change is coming to thunderous applause from morons who don't think this will backfire stupendously on them. Historically, the left has *always* been the ones censored, because they challenge the status quo / upset the apple cart. And now

  • by DeplorableCodeMonkey ( 4828467 ) on Wednesday December 09, 2020 @02:01PM (#60812646)

    The company said it would now remove misleading videos that claim widespread fraud or other errors changed the outcome of the US presidential election

    So this means we can count on YouTube to take a moderating flamethrower to all user accounts that put forward the idea that "Trump stole the election" if the SCOTUS rules in favor of TX and Republican legislatures in the swing states go for Trump, right?

    Of course we know the answer to that is a resounding no, and they'll allow every two-bit fake legal expert to explain how the SCOTUS got it wrong and it was a coup.

    And this is why people are warming up to the idea of pulling back on S230 and forcing platforms to host speech they don't want to host or face legal consequences. After all, wouldn't it be perfectly fair to force YouTube to pay up for slandering someone as a spreader of disinformation if they turned out to be right?

Google is a company managed by anti-American immigrants. Support youtubers who are moving to other platforms like Rumble, Full30, Gab, Parler, and Locals.

  • by minogully ( 1855264 ) on Wednesday December 09, 2020 @02:29PM (#60812796) Journal
They should just preface the video with a non-skippable "advertisement" clip explaining a random logical fallacy in an entertaining way.

    Education is the way out of this mess.
  • Should they also take down videos questioning the official stories for 9/11 and the JFK assassination too? Or videos questioning the Big Bang, or Creationism? Where does it end?
