
YouTube Removes 17,000 Channels For Hate Speech (hollywoodreporter.com) 409

An anonymous reader quotes a report from The Hollywood Reporter: YouTube says it has removed more than 17,000 channels for hate speech, representing a spike in takedowns since its new hate speech policy went into effect in June. The Google-owned company calls the June update -- in which YouTube said it would specifically prohibit videos that glorify Nazi ideology or deny documented violent events like the Holocaust -- a "fundamental shift in our policies" that resulted in the takedown of more than 100,000 individual videos during the second quarter of the year. The number of comments removed during the same period doubled to over 500 million, in part due to the new hate speech policy. YouTube said that the 30,000 videos it had removed in the last month represented 3 percent of the views that knitting videos generated during the same period. YouTube says the videos removed represented a five-times increase compared with the previous three months. Still, in early August the ADL's Center on Extremism reported finding "a significant number of channels" that continue to spread anti-Semitic and white supremacist content.
  • Comment removed based on user account deletion
  • by chuckugly ( 2030942 ) on Tuesday September 03, 2019 @04:34PM (#59153976)
    It's their business to run as they wish, but why should the public extend the protections afforded to a platform like the phone system to an entity that is behaving like a publisher? If they want to be MSNBC/CNN/Fox, let them abide by the same rules.
    • Re: (Score:3, Insightful)

      If only "the public" could voice a clear consensus on what FB "should" do, Zuckerberg might see it in his financial interests to comply.

      As a personal matter, I utterly detest FB. Yet I do see Zuckerberg here as damned if he does and damned if he doesn't.

    • by e3m4n ( 947977 )

      It's not about wanting. The MINUTE they made their first dollar in paid ad revenue, they became a MEDIA company. Hold them to the SAME standards. Both twit parties in Congress are guilty of this clusterfuck.

      • Modern social media companies are supported by advertising (and data collection for the purposes of advertising). I imagine that Facebook could take direct monthly payments, but they might be one of the few companies able to convince consumers to pay them for what was a free service. The consequence of this could be regulatory lock-in. They would be taking advantage of controlling a large share of the market before the new regulations, and enjoying a world where upstarts could never compete as long as the r

        • by e3m4n ( 947977 )

          It doesn't matter: either you are selling web hosting and are not responsible for the content someone pays you to host, OR you are selling ad space on that content. If you are going around selling ad space based on ISIS beheadings; if your revenue stream is based on the _type_ of content you host, such as the social media version of Nielsen ratings, the number of clicks, the number of visits, the number of views, you become responsible for the content. If you profit in ANY way more on one type of content

    • They don't want to be like MSNBC/CNN/Fox; they want to be like a website that takes user-generated content and removes the worthless shitposts. So they're behaving basically like the phone company, but with convenient shit&spam filters. If your phone provider offered a shit filter, wouldn't you love that?

      Maybe there's some downside to this, which I'm not seeing. What phone-like protections are being offered to Youtube, that you think are inappropriate for the kind of service they run? Even with a shit

      • by ChrisMaple ( 607946 ) on Tuesday September 03, 2019 @08:09PM (#59154778)

        What Facebook is doing is claiming that certain political views they oppose are hate speech. They then remove the posts they disagree with, and often also ban the poster.

        There are several problems here.
        One is that in most cases Facebook refuses to exactly identify the offending sentences, making it impossible to defend against the charge that the material is hateful.
        Another is that their political bias is claimed to be just common decency, and that is fraud.
        Third, claiming that the poster is making hateful statements when he's not is libel.

      • The key difference is that phone spam is something that gets pushed upon me. YouTube trash videos are something I actively have to go get. I can very easily avoid Nazi crap on YouTube; they're most definitely not shy about declaring clearly what kind of rubbish they spew.

    • by rsilvergun ( 571051 ) on Tuesday September 03, 2019 @08:24PM (#59154854)
      That's the point of Section 230 of the Communications Decency Act [eff.org], which was specifically written to give safe harbor to computer-based platforms that moderated their users.

      I used to think there were only two options too.

      And make no mistake, the Internet lives and dies on Section 230. Take it away and only the big, establishment media outlets can survive the never ending onslaught of lawsuits. If you want to turn the Internet into Cable TV and hand the single biggest event in human history since the printing press over to the establishment elites then by all means, go after Section 230.

      But if free, independent media is your goal (as it was with many of Section 230's authors) then you know what to do.
  • Problematic (Score:5, Insightful)

    by Chromal ( 56550 ) on Tuesday September 03, 2019 @04:35PM (#59153978)
    Normally, my reaction would be "Good, what the hell took you so long, hate speech may be protected in the public square, but has no place in civilized society online." But it's problematic when we see stories like this, four days ago: YouTube Bans Anti-Nazi Documentary From 1938 For Violating Hate Speech Policy (https://paleofuture.gizmodo.com/youtube-bans-anti-nazi-documentary-from-1938-for-violat-1837436638)
    • Re:Problematic (Score:5, Insightful)

      by penandpaper ( 2463226 ) on Tuesday September 03, 2019 @04:47PM (#59154032) Journal

      What's that you say? Censorship has unintended consequences and impacts innocent people? I would have never guessed.

      When has censorship ever worked?

      • by Chromal ( 56550 )
        I'd say the problem here is specifically automation attempting to make judgements about what is or isn't hate speech.
        • >"I'd say the problem here is specifically automation attempting to make judgements about what is or isn't hate speech."

          That might be this specific example. But the overall problem remains. Exactly what is "hate speech"? That will vary from person to person and over time. It is almost completely subjective. So such filtering WILL have things banned that many of us do not consider "hate speech" and others actually want to see. And these publishers (because they are certainly not platforms anymore) w

      • Your statement is demeaning and offensive! We should censor your complaint about censorship! Elimination of dissent is the only way to ensure censorship is not a problem... /sarc
      • When has censorship ever worked?

        You, obviously, have no children or you're a horrible parent.

      • Re:Problematic (Score:5, Insightful)

        by Aighearach ( 97333 ) on Tuesday September 03, 2019 @10:07PM (#59155220)

        What's that you say? Censorship has unintended consequences and impacts innocent people? I would have never guessed.

        When has censorship ever worked?

        That's absolutely ridiculous. When private businesses engage in "censorship" for business reasons, as in this story, it works out great.

        I'll give you an example. If you go into a restaurant and use swear words that are offensive to other patrons, there is a longstanding practice of censoring you by telling you to shut up, or kicking you out. This is also true at most other businesses.

        Some taverns just tell people to "take it outside" if they don't get along, but other taverns have bouncers that kick out people saying things to try to upset other customers.

        Local food markets often have a bulletin board outside where people can post community announcements, items for sale, etc. If you see something offensive, and you report it to the store, they will usually take it down. Though more often, whoever is offended just takes it down themselves, and if anybody sees them do it they'll say "Thank You!"

        If you ever decide to become interested in Free Speech, you'll find that this is well-trodden ground. You'll also find that choosing what you publish or republish is the Free Speech. When you write a letter to the editor of the local newspaper, and they publish it, that is not your right of free speech being exercised. It is the owner of the press' right of free speech being exercised. When you write a letter to the editor and they don't publish, you were not censored. You merely failed to get your letter published.

        This is the same as a letter to the editor. It is Youtube's speech at issue, and nobody is being censored. Unwelcome participants are merely being shown the door. If you don't follow the rules, Youtube is not going to censor your video to remove the offensive part. They're just going to boot your ass out the door.

        • It can work out great in a restaurant, but I don't think a restaurant is analogous to a website designed for people to share and comment on videos. Exchanging offensive ideas at a restaurant interferes with other patrons' ability to engage in the primary function of a restaurant (serving food); exchanging ideas (of which offensive ideas are a subset) is the primary function of YouTube (or at least a primary function). Now, you might believe that it's still ethically acceptable to limit what information or i
    • Normally, my reaction would be "Good, what the hell took you so long, hate speech may be protected in the public square, but has no place in civilized society online." But it's problematic when we see stories like this, four days ago:

      YouTube Bans Anti-Nazi Documentary From 1938 For Violating Hate Speech Policy (https://paleofuture.gizmodo.com/youtube-bans-anti-nazi-documentary-from-1938-for-violat-1837436638)

      The linked article has an update saying that the video was reinstated just a few hours after the article was posted. This is how the system was designed to work, with appeals allowed with quick resolution.

  • by UnknownSoldier ( 67820 ) on Tuesday September 03, 2019 @04:48PM (#59154040)

    If you follow all the links you eventually end up:

    * Hate speech policy [google.com]

    Hate speech is not allowed on YouTube.

    LOL. Because we can "trust" the bots to do this job.

    We remove content promoting violence or hatred against individuals or groups based on any of the following attributes:

    Age
    Caste
    Disability
    Ethnicity
    Gender Identity
    Nationality
    Race
    Immigration Status
    Religion
    Sex/Gender
    Sexual Orientation
    Victims of a major violent event and their kin
    Veteran Status

    Notice how there are ZERO exceptions for Humor, Jokes, or Parody. In other words, you can express your opinion as long as it agrees with OUR definitions and opinions of what we think you can say. There are ZERO exceptions. WTF?

    I love race, nationality, and religious jokes about MY race, nationality, and religion. If a joke goes too far then guess what -- I stop fucking listening. I don't get butt-hurt and attempt to censor everybody else, because humor is a PERSONAL thing.

    Censorship is NEVER the solution. It is precisely the problem. You have a brain and ears. Fucking use them.

    Attempting to "delete the problem" doesn't make it go away. People have various boundaries. Trying to apply a "one size fits all" approach leads to an echo chamber of groupthink. We have 20 years of social media like Usenet, /. and Reddit to show that censoring "taboo" subjects is akin to pissing in the wind. People will ALWAYS have contrary and "offensive" opinions.

    Sometimes it can be good to laugh at difficult subjects like death. It helps us cope with traumatic experiences. Attempting to put up artificial "categories" of "this is bad" always leads to the question: WHO decides what is tasteful?

    As I heard this cliche on /. a long time ago:

    One man's fetish is another man's disgust.

    Where does it end until we have every possible category added to this bullshit "hate speech"?

    When did America turn into a bunch of wussies?

    I guess it takes too much to press "next" or "back" for something I don't like. :-/

    • by LynnwoodRooster ( 966895 ) on Tuesday September 03, 2019 @04:54PM (#59154064) Journal
      Sadly, Blazing Saddles would be considered hate speech by a wide swath of society, today...
      • Great example -- and fantastic movie! Funny. As. Hell.

        If we can't laugh at "serious" things then maybe we shouldn't be so serious about them.

        • There's nothing funny about neo-Nazis or White Supremacists, other than the fact that they're ridiculous -- but they're only funny in a BAD way.
          • by Train0987 ( 1059246 ) on Tuesday September 03, 2019 @06:14PM (#59154380)

            What gives you the right to determine what someone else should find funny? Nazis mock themselves. Their videos are fun to laugh at. By attempting to ban them you only give them power.

            Don't look now but you've become the totalitarian in your quest for virtue-points.

          • Comment removed (Score:4, Insightful)

            by account_deleted ( 4530225 ) on Tuesday September 03, 2019 @07:44PM (#59154684)
            Comment removed based on user account deletion
      • Ever watch any of the old show The Honeymooners [wikipedia.org]? How about All In The Family [wikipedia.org]? You should. Go watch several hours of each. Then come back and tell me if the 'humor' in those shows is still appropriate.
        Also go see if you can find something even older: Little Rascals [wikipedia.org]. You tell me if the depiction of blacks in that show would still be considered appropriate today.
        • Re: (Score:3, Insightful)

          Heck, what about Married with Children? MASH? Barney Miller? WKRP in Cincinnati? Three's Company? Lawrence Welk? Mutual of Omaha's Wild Kingdom? Sesame Street from the 80s? H.R. Pufnstuf? Star Trek? Shakespeare? Dante's Inferno? You can find offense in ANYTHING, if you look hard enough.

          The problem isn't the offensive nature of historical texts, movies, songs. It's that people today CHOOSE to be offended and want to demand others change their behavior accordingly. Your right to not be offended

        • Again, who the F are you to tell anyone else what humor they should find appropriate? People like you are scarier than the Nazis. You're just like them.

        • The problem isn't so much the humor.

          The problem is that there is no humor at all.

          The problem is that some of these channels, some of these videos, prey on the weak-minded, or those who lack a sense of belonging.

          It's sociology 101. People like to feel part of something. If there are a bunch of people intentionally misleading you for their own agenda you will be inclined to adopt their views in order to achieve a sense of belonging.

          Then you have the argument: I like yellow; well, I like purple, fuck you.

    • by Chromal ( 56550 )
      I don't think you can play the Trump card of claiming you were "only joking" and "didn't really mean it" if you said something calculated to denigrate an individual or group for their intrinsic qualities or differences.
      WHO decides what is tasteful?

      On YouTube, Google gets to decide.

    • and all of the people none of the time. That old saying has a ton of merit. The problem is that on a world stage, there will always be someone that considers something as bad and wants it removed. Now, if you listen to what every individual wants removed, you wind up with no individual having anything left.

      A world society doesn't work too well, this is but one example.
      • and all of the people none of the time. That old saying has a ton of merit. The problem is that on a world stage, there will always be someone that considers something as bad and wants it removed. Now, if you listen to what every individual wants removed, you wind up with no individual having anything left. A world society doesn't work too well, this is but one example.

        Meh. This is an Appeal to Extremes [logicallyfallacious.com] argument. You don't need to remove everything any individual wants removed, just things that sufficiently-large groups of people can reasonably argue are actually harmful. Note the two layers of filtering there: (1) sufficiently large groups (hundreds of millions to billions, I'd say) and (2) reasonably argue are actually harmful. Yes, there is inherently a lot of subjectivity there. Welcome to the reality of dealing with large social issues; there is always more gray

    • It's not censorship if it's a private company. Why do you want pro-neo-Nazi, pro-racism, and pro-white-supremacy content on YouTube? Will you miss it?
      Telling people to merely ignore content they don't like in this case is like telling people to ignore their cancer diagnosis. It'll just get worse and worse if you do nothing about it. A private company has decided to do something about it on their own websites. Why is this a problem for you?
    • I love race, nationality, and religious jokes about MY race, nationality, and religion. If a joke goes too far then guess what -- I stop fucking listening. I don't get butt-hurt and attempt to censor everybody else, because humor is a PERSONAL thing.

      So, you're claiming that the 17,000 removed channels were full of jokes about Naziism and the Holocaust? I wouldn't be shocked if some small number of channels were taken down erroneously, but I'll bet the vast, vast majority of them were deadly serious.

      Censorship is NEVER the solution. It is precisely the problem.

      Pre-Internet, I'd have agreed with you. The problem is that some characteristics of the net collide in rather destructive ways with some characteristics of human cognition.

      Specifically, people judge truth of facts/ideas they can't personally check in lar

  • by Arzaboa ( 2804779 ) on Tuesday September 03, 2019 @05:09PM (#59154170)

    The problem with all of this is that people are conflating hate speech and hateful ideologies.

    There is a big difference between saying "I don't like antifa" in response to an article about Antifa throwing veggie shakes at police, and immersing oneself in hateful ideologies and spending your entire life trying to be a professional hater because it's US vs. THEM.

    Without context, both can sound the same. These companies are not the government, they are private companies that rely on money from investors and donors while serving the majority of people that simply have no time for this rhetoric.

    I help moderate all sorts of discussions across the internet. People have bad days all of the time. That is OK. People say things they will regret tomorrow. That is OK. When I see people try to recruit others to their hateful ideology, that is a different category. It all starts the same... "I hate ." The difference is one comes back with "I'm sorry" the next day while the other comes back with "We're going to DOX you now."

    The difference is that when Sally gets pissed at George and says something flippant, it ends there. It crosses the line when people try to recruit others into spending their time bullying others.

    Moderating speech is a fine and wavering line. Going down the rabbit hole of hate is never a solution. Moderation is full of messy solutions. In my experience, crying fascist, and socialist, and conservative, and communist doesn't help, but that seems to be the go-to line for large swaths of people on all sides of these discussions.

    What are these companies supposed to do when they see this stuff spiraling out of control?

    --
    Your silence will not protect you -- Audre Lorde

  • I hope they don't ban Scotty Kilmer. He has a "Live Free Or Die" sign above his garage door. Rev up your YouTube take downs!
  • One day soon, Nazi and white supremacist will mean anyone who is against welfare, against making decisions based on race or gender, or in favor of national borders. Then what word will we use to identify the baddies?
  • If Nazis can't ever talk to sane people, they'll just keep getting more and more insane, until you're guaranteed to get more shootings.
  • by khchung ( 462899 ) on Tuesday September 03, 2019 @07:12PM (#59154570) Journal

    I think putting it that way would make things clearer for many people.

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Tuesday September 03, 2019 @07:13PM (#59154580)
    Comment removed based on user account deletion
