
Disney, Nestle, and Others Are Pulling YouTube Ads Following Child Exploitation Controversy (bloomberg.com) 191

An anonymous reader quotes a report from Bloomberg: Disney is said to have pulled its advertising spending from YouTube, joining other companies including Nestle, after a blogger detailed how comments on Google's video site were being used to facilitate a "soft-core pedophilia ring." Some of the videos involved ran next to ads placed by Disney and Nestle. All Nestle companies in the U.S. have paused advertising on YouTube, a spokeswoman for the company said Wednesday in an email. Video game maker Epic Games and German packaged food giant Dr. August Oetker KG also said they had postponed YouTube spending after their ads were shown to play before the videos. Disney has also withheld its spending.

On Sunday, Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify certain videos in which young girls were engaged in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics. Watson's video demonstrated how, if users clicked on one of the videos, YouTube's algorithms recommended similar ones. By Wednesday, Watson's video had been viewed more than 1.7 million times. Total ad spending on the videos mentioned was less than $8,000 within the last 60 days, and YouTube plans refunds, the spokeswoman said.
Two years ago, Verizon, AT&T, Johnson & Johnson and other major companies pulled their ads from YouTube after learning that some of their ads surfaced next to extremist and violent content. Yesterday, YouTube released an updated policy about how it will handle content that "crosses the line" of appropriateness.

"Any content -- including comments -- that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments," a spokeswoman for YouTube said in an email.


Comments Filter:
  • by WCMI92 ( 592436 ) on Wednesday February 20, 2019 @05:12PM (#58154464) Homepage

    I hate ads on Youtube.

  • As sick as it is.. (Score:5, Insightful)

    by scsirob ( 246572 ) on Wednesday February 20, 2019 @05:14PM (#58154474)

    .. There will always be sick people imagining things in pictures or videos that are not there. There will always be people that are offended by anything you do or say. When you put out stuff on the internet for the whole world to see, there's no way to *NOT* offend or trigger some idiot somewhere on the planet.

    Please return to common sense. If a video shows a girl having fun, it's about the girl having fun. Not about the sicko three doors down who has sick fantasies. Turning to censorship will not change that.

    • Channels such as Naomi 'SexyCyborg' Wu [youtube.com], where I'm not quite sure if people are watching her for what she does, or what she looks like.

      • by Anonymous Coward on Wednesday February 20, 2019 @05:26PM (#58154548)

        She's an adult who can make the content she wants and it really doesn't matter why others might like her work. She seems like a fun lady who does fun things and is fun to look at, and that's just fine.

        • by AmiMoJo ( 196126 )

          She is also an accomplished engineer. She has a great deal of skill when it comes to 3D modelling and printing/CNC, particularly an ability to conceive of a product and turn it into something workable quickly.

          She has done a lot for China too, especially promoting open source. She is responsible for the first three open source hardware products out of China. She went to the manufacturer of some 3D printers, convinced them to open source the design, helped them do it and meet all the requirements, and got it c

      • by amicusNYCL ( 1538833 ) on Wednesday February 20, 2019 @07:47PM (#58155324)

        Why does it matter? Serious question.

        In addition to dictating what people are allowed to watch, are you suggesting also trying to dictate why they should be watching it?

        • by Anonymous Coward

          Exactly. Sounds a lot like thought crime.

        • by gweihir ( 88907 )

          Nothing less than total control over others will satisfy the current generation (well, all generations really) of authoritarian scum. The sad thing is that these people are again being listened to. As if the numerous fascist, Stalinist and other authoritarian catastrophes the human race has caused in the past are not enough and we need some more.

        • It does not matter to me, I'm just curious if she would have as many followers if she didn't have her breast implants.
          It's a good move on her part for exploiting the male psyche, though.

      • by hoofie ( 201045 )

        Look, the first time you watch some of her videos, let's be honest, it's quite easy on the eye.....

        After a few of them you really don't notice that aspect anymore as the content she presents is too interesting - you are a true geek if you can do that.

        She has been shamefully treated though and comes across as quite a lovely person.

    • by Anonymous Coward

      The issue is that the girls in the videos are lured and manipulated into these compromising situations. It's not as victimless as you think.

    • by zlives ( 2009072 ) on Wednesday February 20, 2019 @05:24PM (#58154528)

      can we have a conversation about why people are uploading their kids on the internet in the first place?

      • can we have a conversation about why people are uploading their kids on the internet in the first place?

        Why? Narcissism.

      • by grep -v '.*' * ( 780312 ) on Wednesday February 20, 2019 @06:23PM (#58154872)

        why people are uploading their kids on the internet in the first place?

        You wouldn't download a car, you wouldn't download a movie. But would you download a kid?

        IMAGES. They're uploading IMAGES -- to share. They're also the parental guardians over them, so they have that right. (I think that's a right.) They're presumably doing it to show off to their friends or their kids' friends; they don't realize (or don't care) that literally their entire world can watch them as well.

        And what's wrong with that? Maybe another dance class instructor will learn a new move or have a new idea (OMG, CULTURAL APPROPRIATION!) Maybe another kid will watch these and decide "I can do that, too!"

        Now let's be real -- SOME people are going to watch that and imagine sexual acts. Or bestiality. Or ritual sacrifice. Or beheadings. Or maybe new costume designs or even haircuts. That's true of ANYTHING. (Sheep jokes about NZ. Blond jokes. Rule 42. Someone took an actress's face and glued it on their sex doll.) If you're concerned someone might see something bad in a picture, you (a) shouldn't post it and (b) need to get your head out of the sand. (My mom used to say, "Get your head out of the gutter." Seemingly nowadays, everyone's head IS the gutter. Not quite sure what happened -- "Sex Sells" or something.)

        If we're leading up to talking about banning pictures because they might offend (or attract) someone, then notice that God, Allah, Re, Zeus, Odin, Thor, and FSM appear in EACH and EVERY picture on the internet. They're invisible, so you can't see them, but they're always there. So, let's start banning all pictures everywhere. (Here's a naughty one. [theonion.com] But include Muhammad there and heads will literally roll.)

        And BTW: if I start fantasizing about AOC, Nancy Pelosi, and Trump, can we ban every picture about ALL of them? I'd like for SOMEthing good to come out of this.

      • by MobyDisk ( 75490 )

        Yes.

        Growing up, a group of children in my neighborhood won an award for some civic duty. The newspaper printed a picture of the group and their first names, all except for one child. That child's parents were outraged that the newspaper wanted to publish that information. They believed that having people know their faces and names presented a danger to them. Or maybe it was a news broadcast, now that I think of it. But over time, that kind of thinking mostly died out. That kind of information is so triv

        • Are you suggesting that a video of a child on YouTube, with no last name and no location (which is less information than the newspaper offered) is a cause for concern?

          Perhaps.. It depends on both the context and unknown future events. There is just no way to know for sure.

          I think most folks vastly underestimate the amount of data they actually are giving up when they post stuff online. It's hard to understand just how the information will be stored, used, and how it will impact people in the future. It may be nothing, or it may be a huge deal.

          I think the young adults who grew up in a culture of oversharing everything, getting your 15 min of fame by going viral, it's tempting to

          • Every year or so, there's another news article about some mommy blogger whose children just found out every moment of their life was memorialized online - and they are very angry. It's usually followed by the mommy blogger expressing no remorse, and talking about how they are still going to post everything their kid does (including this whole argument), usually with a fig leaf or two about some privacy-"protecting" gesture to show they "compromise".

      • They are proud of their children and want to send videos to family and friends, but don't know how the privacy features of YouTube work (or that they even exist).
      • Maybe I'm reading you wrong, but believe it or not, there's more than one reason for children to be present on the Internet. It would be wrong to allow some pervs from eastern Europe, et al. to spoil it for the overwhelming majority of the legitimate, innocent content out there for and by children. There seems to be a lot of hand waving surrounding this topic arguing for the equivalent of shutting down playgrounds, water parks, etc. because some perverted letch might be leveraging them for his jollies. A

      • by AmiMoJo ( 196126 )

        It's hardly a new thing. There have always been public events for children, or maybe more like their parents, to show off their ballet skills or whatever. A few years ago there was a panic over parents taking photos of their kids at swimming events or even just playing football.

    • by Anonymous Coward

      .. There will always be sick people imagining things in pictures or videos that are not there. There will always be people that are offended by anything you do or say. When you put out stuff on the internet for the whole world to see, there's no way to *NOT* offend or trigger some idiot somewhere on the planet.

      Please return to common sense. If a video shows a girl having fun, it's about the girl having fun. Not about to sicko three doors down who has sick fantasies. Turning to censorship will not change that.

      Completely agree. This is just another internet do-gooder on some fucking crusade. I've seen these videos and it's just a bunch of girls having fun. There are some slips, but you can see that on TV and in movies if you actually look for it. Until this idiot brought it up I had never even noticed some of the shit he is bitching about.

    • by Anonymous Coward on Wednesday February 20, 2019 @06:20PM (#58154854)

      A fully clothed girl doing gymnastics is not porn.

      But, pedophiles will watch such a video and experience lust.

      The fact that this will happen freaks people out, and a boring run-of-the-mill "look how adorable my kid is" video has suddenly become dirty.

      This situation is fueled by emotion, so it is not logical. Emotion has tremendous motivating power, so you can't talk it down with logic.

      People will suffer all kinds of injustice under the banner of protecting children. It's a strong instinct. That's why politicians like to play that card at every opportunity.

      It is not fair. Too bad. That's the world we live in.

      • by AmiMoJo ( 196126 )

        The problem is that they treat YouTube like social media, but it's not really. On Facebook they can post the video and only their friends or the gymnastics group can see and comment on it. Anyone posting inappropriate stuff gets booted out.

        YouTube doesn't have that kind of access control. Visibility is either everyone or no-one, comments are either on or off.

    • Have you seen the video in question? This isn't an overreaction or censorship issue.

      There are hundreds of comments on these videos of people calling the kids sexy and posting timestamps where the kids are in suggestive positions, often with some short exclamation or suggestive emojis attached, and some even linking to real child porn.

      He started a brand new Youtube account, searched something slightly risque but completely safe and normal, and within two clicks found one of these videos along with countless

    • by Kjella ( 173770 )

      .. There will always be sick people imagining things in pictures or videos that are not there.

      That covers misinterpreting innocent actions or creative editing but people sexualize things entirely on their own, which is why we have rule 34, all the erotic fan-fic, people ogling swimsuit catalogs when they didn't have pr0n and so on. The reasoning here is like saying that because the Harry Potter movies made a lot of people have the hots for Emma Watson the movies must be soft porn. Basically if you're going to remove all the innocent material that could be fodder for someone's spank bank like kids d

      • by sjames ( 1099 )

        you'd better dress everyone up in burkas right now.

        There's almost certainly a few who would find that to be "hot" and about 100 times as many trolls who will claim they do too.

      • by gweihir ( 88907 )

        .... you'd better dress everyone up in burkas right now.

        That is probably pretty much the original reasoning behind the burka anyways.

    • Seriously, this is absurd. Literally the only difference in "proper" or "improper" use is the intent of the person watching it, and now they want to try to regulate intent? The content itself is not the problem, it's the intent of the viewer that's the problem, so why do people think they can A) determine and B) regulate that intent?

      • by gweihir ( 88907 )

        Seriously, this is absurd. Literally the only difference in "proper" or "improper" use is the intent of the person watching it, and now they want to try to regulate intent?

        They do and they have done so in the past. What do you think the real reason for an all-seeing "God" is? It is to censor and enforce restrictions on intent, nothing less. As more and more people see through that old scum these days, they are now trying other ways to control what people dare to think. They are starting with something where they think many people will agree, but of course, this is planned to be extended as soon as they won this battle.

    • by gweihir ( 88907 )

      Indeed. It is pretty hard to see how anybody is getting exploited here. The test for exploitation (the subject is harmed) seems to fail completely. This panic is not new though. In fact, a pretty large part of the human population will be offended by anybody female being shown at all, and this is going in that direction eventually. Do we really want to go that way?

  • by hymie ( 115402 ) <hyrosen@mail.com> on Wednesday February 20, 2019 @05:24PM (#58154532)

    ...people complaining about how YouTube pulled the video of their children being adorable.

    • by zlives ( 2009072 )

      FTFY ...people complaining about how YouTube pulled the video of their children before they could be exploited by the parents' lack of self-esteem and vicarious living through their children as the only justifiable answer to committing an hero

      • by Luckyo ( 1726890 )

        Out of sight, out of mind, eh?

        Hint: just because the spotlight isn't shining at the problem any more doesn't mean that problem went away.

    • they'll just de-monetize it and won't promote it (since they can't monetize it).

      It's actually really annoying. Lots of good YouTube content has gone poof thanks to this. Stuff like Glove and Boots, Freaky Frank, Talking Classics and the like can't make a living on YouTube since the "adpocalypse". A few channels made the jump to Patreon but that only works if you've already got a following. New up-and-comers needed that YouTube ad revenue to get going.

      Somewhat annoyingly the hordes of anti-SJW channel
      • by Anonymous Coward

        See, when your content does not appeal to only a minority of the population it is really easy to get viewers. It also helps that people are sick to death of ideologues telling them how terrible they are.

      • by AmiMoJo ( 196126 )

        Can't really force advertisers to advertise on videos they don't want to be associated with, so we need another solution.

        How about some kind of charity to support worthwhile YouTube channels? Patreon is too specific and people are unwilling to sign up to a dozen $5/month subscriptions, but a charity that supports a large group of channels might work.

        I suppose the problem would be people objecting to some of the channels, but at the very least it would be an interesting experiment in seeing if people really

    • by Mashiki ( 184564 )

      That's not really the problem. There are a few actual problems, one being that Youtube already knows that there's a pedo problem: they have an automatic comment restriction system that kicks in on a lot of kids' videos. Another problem is that it's trivial to find the content, and as pointed out in the video the comment section is cancerous and full of pedo crap. The real problem is that the pedos will start telling the kids what to do in the comment section, then the kid starts doing it. "Kids being cute"

      • by djinn6 ( 1868030 )

        Why wouldn't censorship be a solution?

        Kids watching YouTube makes them money. Kids uploading videos generates bad press. Block all videos with kids in them and the problem goes away.

  • It won't work (Score:4, Insightful)

    by TWX ( 665546 ) on Wednesday February 20, 2019 @05:33PM (#58154584)

    From Youtube: "Any content -- including comments -- that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments," a spokeswoman for YouTube said in an email.

    It won't work. The fundamental problem is that it's expensive to editorialize/police content and advertising. Major television networks employ standards boards, local television stations have station managers and other staff, and even cable networks have to maintain staff to both sell and to police the content of television shows and of ads. These entities have to spend a sizable amount on salary for these censors, and even being limited to airtime that's limited to 1440 minutes in a 24-hour period they still get it wrong.

    There are claims that 5 billion videos are watched daily on Youtube, and more than 400,000 hours of content is added to Youtube every day. There's simply no way to keep up as censors with that kind of content. Hell, Google can't even keep its ad delivery networks free from malicious ads, how do they expect to keep inappropriate content off when those uploading content don't have a strong financial tie with a particular salesman or censor?
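    To put those numbers in perspective, here is a quick back-of-envelope calculation. The upload figure is the one claimed in the comment; the shift length and review speed are illustrative assumptions, not sourced numbers:

    ```python
    # Rough staffing estimate for fully human review of YouTube uploads.
    # 400,000 hours/day is the claim from the comment above; the shift
    # length and review speed are assumed purely for illustration.
    uploaded_hours_per_day = 400_000  # claimed hours of video uploaded daily
    reviewer_shift_hours = 8          # assumed working day per reviewer
    review_speed = 1.0                # assumed hours of video reviewed per hour worked

    reviewers_needed = uploaded_hours_per_day / (reviewer_shift_hours * review_speed)
    print(f"{reviewers_needed:,.0f} reviewers on duty every single day")  # → 50,000
    ```

    Even with generous assumptions (reviewing at full playback speed, no breaks, no second opinions), that is a workforce of tens of thousands just to watch everything once.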

    • Assign every content uploader to be the personal censor for a random other uploader. And then pick a third random uploader to be the metamoderator for the first censor. Shuffle every week. You can't upload if you haven't caught up on your backlog of reviewing and rejecting or approving whoever you're assigned to. Make the random assignments balanced -- people who upload lots of stuff are assigned other people who upload lots of stuff.
      • by mark-t ( 151149 )

        You can't upload if you haven't caught up on your backlog of reviewing and rejecting or approving whoever you're assigned to.

        What happens if you approve stuff that the metamoderator disagrees with? Do you also get blocked from uploading?

        Bear in mind that your suggested system is simply self-reinforcing, and does not generally allow for the insertion of new or potentially even controversial information, even if there is an otherwise objectively legitimate reason for it.
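    The assignment scheme proposed above (each uploader randomly reviews another, a third party metamoderates the reviewer, reshuffle every week) could be sketched roughly as follows. The function and names are hypothetical, not an actual YouTube mechanism:

    ```python
    import random

    # Sketch of the proposed peer-moderation scheme: every uploader reviews one
    # randomly chosen other uploader, and a third distinct uploader metamoderates
    # that reviewer. Passing a new seed each week reshuffles all assignments.
    def assign_moderation(uploaders, seed=0):
        """Return {uploader: (uploader_they_review, their_metamoderator)}."""
        rng = random.Random(seed)
        order = uploaders[:]
        rng.shuffle(order)
        # Rotate the shuffled order by 1 and 2 positions so that each person,
        # their reviewee, and their metamoderator are always three distinct
        # people (requires at least 3 uploaders).
        reviews = dict(zip(order, order[1:] + order[:1]))
        metamod = dict(zip(order, order[2:] + order[:2]))
        return {u: (reviews[u], metamod[u]) for u in uploaders}

    assignments = assign_moderation(["alice", "bob", "carol", "dave"])
    for person, (target, meta) in assignments.items():
        # Nobody moderates or metamoderates themselves, and the roles differ.
        assert len({person, target, meta}) == 3
    ```

    Balancing assignments by upload volume, as the comment suggests, would need an extra grouping step before the shuffle; this sketch only shows the basic derangement.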

    • Of course there is. ML algorithms could easily screen stuff. Those 400,000 daily hours uploaded are backed by the computational horsepower to support them. You're not asking your personal Dell PC to check this stuff. There are any number of low-cost risk mitigation strategies that could be employed. For example, while helping my kids navigate Youtube I've found that much of the pervy stuff comes in the form of reposts. That is, someone ripping off content from a legitimate channel and posting the trimmed
      • by gweihir ( 88907 )

        No, it cannot. ML is maybe 90% accurate if really, really good. You need a lot more than that here.

    • by AmiMoJo ( 196126 )

      In this case though all they need to do is disable comments. They are not objecting to the content of the video, just the comments from people jacking off over them.

      Of course the problem with that is that some of those channels want comments, because they create engagement which translates to $$$. Even just people reading the comments while the video plays adds to its watch time.

  • by fahrbot-bot ( 874524 ) on Wednesday February 20, 2019 @05:36PM (#58154606)

    Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify certain videos in which young girls were engaged in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics.

    Just about *any* activity can be sexually suggestive to someone, somewhere. Not judging, just sayin' ...

  • This just illustrates that there is nothing that cannot be ruined by comments that are not heavily policed - if you cannot police said comments, do not allow them.

    It also shows an unexpected misuse of a simple tool - likes and recommendation engines. But here I really don't see any way to solve this, because what if someone really WANTS to see gymnastic related videos? That by itself is harmless or even useful. I don't think we should break all useful tools just because someone can misuse them - i

    • by djinn6 ( 1868030 )

      But here I really don't see any way to solve this, because what if someone really WANTS to see gymnastic related videos? That by itself is harmless or even useful.

      Simple. You pay for it.

      YouTube and almost all YouTube creators rely on ad revenue. If enough people make a fuss over something, no matter how dumb, advertisers will demand change and YouTube must comply.

      The only way to change that is to stop relying on ad revenue, but YouTube Red / Premium didn't really take off.

  • So if I did have a daughter who was into gymnastics, and I posted her winning the super duper first prize, is this sexually suggestive? I believe someone out there will find it that way. So where do we draw the line? Would I get in trouble for posting such a video, if "someone" says this is sexually suggestive?

    • by Anonymous Coward

      Ah but it doesn't need to be sexually suggestive. If it "could be construed as sexually suggestive" (by whoever writes the rules at that particular moment and for that particular video) then it will be. Next thing you'll be doxxed and you'll have child protection services knocking at your door, with their mind set because they align with the crazy social marxists. You think I jest?

    • Getting in trouble is not about right and wrong. It's about whether someone with the capability and the will perceives you to be in the way of what they want. If so, they'll identify something about you that they can exploit to create a narrative in which you are evil.
    • by sconeu ( 64226 )

      We should censor the Olympics for putting gymnastics on the air. /s

      [sarcasm tag included per the ADA for the sarcasm-impaired]

    • Would I get in trouble for posting such a video, if "someone" says this is sexually suggestive?

      If you don't monetize the video, you probably won't have a problem. It's the threat to their income streams that galvanized Youtube. It's not like they actually give a shit about your kids, or anyone else's.

    • by Mashiki ( 184564 )

      So if I did have a daughter who was into gymnastics, and I posted her winning the super duper first prize, is this sexually suggestive? I believe someone out there will find it that way. So where to we draw the line. Would I get in trouble for posting such a video, if "someone" says this is sexually suggestive?

      Somewhere, someone probably would. But here's the question: is your daughter turning around and then reading through the comments and performing specific actions that are sexually suggestive or sexual because people were asking for it? See, in most cases people are talking about the latter and not the former. Youtube has an automatic filtering system to close comments on videos that she's posting, and in turn youtube already knows it has a problem with people asking kids to engage in sexually suggestive or provocativ

    • by gweihir ( 88907 )

      Eventually, if this panic goes on, this will have your daughter being taken away and you being put in prison.

  • Sad how things ratchet to authoritarianism. People should stop banging on Youtube/Google/etc to take down everything they don't like and then act surprised when they start censoring and deplatforming based on political viewpoints. Google/etc is not and should not play cop. If the law has a problem it should handle it with Google on as narrow as possible case by case basis. But Google/MS/etc are not completely blameless. They seem to enjoy playing cop themselves a little too much and have set themselves up
  • 'Soft core' (Score:4, Interesting)

    by philmarcracken ( 1412453 ) on Wednesday February 20, 2019 @05:57PM (#58154698)

    Do 'soft core' acts actually cause any serious mental damage to the children? Are these acts volitional?

    • by Jarwulf ( 530523 )
    • by Jarwulf ( 530523 )
      Scrolling through the video, it looks like this 'softcore child porn' is mostly videos, often recorded by the girls themselves, of dancing and/or wearing swimsuits. So basically every parent who has recorded beach day, and anyone who has walked down a beach and looked at people, is now to be considered a child molester. This is where we're at as a society today, and everybody is too terrified to speak out against it.
      • by gweihir ( 88907 )

        Indeed. This stuff is not porn in any normal way, it is only porn in the minds of some deviants and the maker of this video seems to be one of them. (Same principle as the most extreme anti-homosexuals are usually secretly one of them...) It seems the potential harm is rather limited and does not justify this outcry at all. Probably some censorship agenda being pushed as well, as happens so often these days.

  • I just have to laugh...
    Gives them a dose of the hell they give people who host user generated content (chat forums, image hosts, etc..).

    I hope our A.I. overlords come for their heads (and Zucker_Borg's) first.
  • The problem here is that the internet is, by and large, anonymous. You can literally post any steaming pile you like, be as rude, abusive and socially unacceptable as you choose with zero real implications.

    The problem here is that there is no personal accountability, at least not really. Sure, you can get TOSed or your account deleted, but it's not like a nym shift is hard to do. Grab another e-mail, create a new account and post your garbage again and again.

    The only real solution I see is to require

    • by sjames ( 1099 )

      Of course, the problem then is that sometimes anonymity is legitimately needed. For example, when sincerely posting a legitimate political opinion that might piss off the powers that be. Those powers might be government, a grumpy family member, or an employer with extreme opposite political views.

    • by djinn6 ( 1868030 )

      Facebook is not anonymous, yet you see the same stuff. In some cases, you can get people to stop by raising a big stink with their employer and getting them fired from their job. But I'm not sure that'd be a good route to go down.

      If Slashdot or someone on Slashdot could send everything I posted here to my manager, I wouldn't be posting here at all. I'd rather deal with the shit hole that is 4chan than to risk my job for an internet discussion.

  • Hi Dan, Thank you for your message. This was brought to our attention yesterday and we have pulled all ads from Youtube. Kinnek does not condone this abhorrent behavior and until the issue is resolved, we have excluded all ads from Youtube. Best, Kimmy Shiller
  • by Orgasmatron ( 8103 ) on Wednesday February 20, 2019 @07:16PM (#58155158)

    Or rather, watch the damn video before commenting. This is probably among the 10 worst summaries I've read on this site, and I've been here just about every day since nearly the beginning.

    This isn't a problem about parents uploading videos of their kids, or of kids uploading their own videos. It isn't about the videos at all.

    The problem is that there is a side of youtube that most of us would never find on our own. But if you know it is there, you can get to it with one search and two clicks, as shown in the video.

    Most of the videos there have been downloaded from other users and re-uploaded under a different account so that the parents and kids have no idea this is happening. The comments on the re-uploaded videos are full of creepy comments and timestamps to suggestive moments, or to other videos.

    Once down the rabbit hole, all of the recommended video links are to other videos of the same type with the same disgusting comments and links.

  • by quenda ( 644621 ) on Wednesday February 20, 2019 @07:46PM (#58155322)

    So Nestle is making a fuss over videos of kids eating lollipops?

    Nestle, the company who knowingly killed how many thousands of babies, pushing baby formula in third world countries?
    And who have made billions stunting the development of millions of babies by promoting the same products to mothers who were capable of breastfeeding?

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • So Nestle is making a fuss over videos of kids eating lollipops?

      No, they're making a fuss over something they think harms their brand image.

      Nestle, the company who knowingly killed how many thousands of babies, pushing baby formulae in third world counties?

      And now you see WHY they care so much about something that might harm their brand image.

    • Let's not forget the young African children who were slave labor at cocoa plantations. Fully explained in the documentary 'The Dark Side of Chocolate.'

  • Alex Jones (Score:4, Informative)

    by bill_mcgonigle ( 4333 ) * on Wednesday February 20, 2019 @11:08PM (#58156008) Homepage Journal

    I'm just sitting here laughing because, as crazy as that bastard is, YouTube proudly deplatformed Alex Jones who is constantly railing against child sex-trafficking rings.

    And all the while it turned out that YouTube was the one promoting such things with its technology and/or lack of care.

    Not that I'm expecting one single moment of introspection from YouTube.

    • by AmiMoJo ( 196126 )

      Alex Jones railing against child sex-trafficking is the very definition of virtue signalling. He doesn't really care, he just does it to have something that makes his detractors look bad. "YouTube bans child sex-trafficking activist" sounds bad, until you realize that Alex Jones has been harassing the victims of Sandy Hook for years, and that's not even the worst of it.

  • Disney toys and clothes: Made for children, by children.
