Disney, Nestle, and Others Are Pulling YouTube Ads Following Child Exploitation Controversy (bloomberg.com) 238

An anonymous reader quotes a report from Bloomberg: Disney is said to have pulled its advertising spending from YouTube, joining other companies including Nestle, after a blogger detailed how comments on Google's video site were being used to facilitate a "soft-core pedophilia ring." Some of the videos involved ran next to ads placed by Disney and Nestle. All Nestle companies in the U.S. have paused advertising on YouTube, a spokeswoman for the company said Wednesday in an email. Video game maker Epic Games and German packaged food giant Dr. August Oetker KG also said they had postponed YouTube spending after their ads were shown to play before the videos. Disney has also withheld its spending.

On Sunday, Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify certain videos in which young girls were engaged in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics. Watson's video demonstrated how, if users clicked on one of the videos, YouTube's algorithms recommended similar ones. By Wednesday, Watson's video had been viewed more than 1.7 million times. Total ad spending on the videos mentioned was less than $8,000 within the last 60 days, and YouTube plans refunds, the spokeswoman said.
Two years ago, Verizon, AT&T, Johnson & Johnson and other major companies pulled their ads from YouTube after learning that some of their ads surfaced next to extremist and violent content. Yesterday, YouTube released an updated policy about how it will handle content that "crosses the line" of appropriateness.

"Any content -- including comments -- that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments," a spokeswoman for YouTube said in an email.

  • by WCMI92 ( 592436 ) on Wednesday February 20, 2019 @06:12PM (#58154464) Homepage

    I hate ads on Youtube.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Wednesday February 20, 2019 @06:14PM (#58154474)
    Comment removed based on user account deletion
    • Channels such as Naomi 'SexyCyborg' Wu [youtube.com], where I'm not quite sure if people are watching her for what she does, or what she looks like.

      • by Anonymous Coward on Wednesday February 20, 2019 @06:26PM (#58154548)

        She's an adult who can make the content she wants and it really doesn't matter why others might like her work. She seems like a fun lady who does fun things and is fun to look at, and that's just fine.

        • by AmiMoJo ( 196126 ) on Thursday February 21, 2019 @05:17AM (#58156706) Homepage Journal

          She is also an accomplished engineer. She has a great deal of skill when it comes to 3D modelling and printing/CNC, particularly an ability to conceive of a product and turn it into something workable quickly.

          She has done a lot for China too, especially promoting open source. She is responsible for the first three open source hardware products out of China. She went to the manufacturer of some 3D printers, convinced them to open source the design, helped them do it and meet all the requirements, and got it certified. Gave them a nice sales boost too, as westerners love open source hardware. She also helped take some of the stigma off Chinese products, demonstrating that they can be good quality and that the manufacturer can engage with the western world.

          There is also the Sinobit, a single board computer for learning. She does a lot to help kids learn about engineering. The design is a little bit like the British Microbit board, but with a larger LED display because the British one is too small to display Chinese characters. Again, open source.

          I'm amazed that she kept going after western journalists from Vice nearly destroyed her. They put her in real danger - I won't get into it because that would just be compounding the problem, but suffice to say many people would have gone into hiding after that.

      • by amicusNYCL ( 1538833 ) on Wednesday February 20, 2019 @08:47PM (#58155324)

        Why does it matter? Serious question.

        In addition to dictating what people are allowed to watch, are you suggesting also trying to dictate why they should be watching it?

        • by gweihir ( 88907 )

          Nothing less than total control over others will satisfy the current generation (well, all generations really) of authoritarian scum. The sad thing is that these people are again being listened to. As if the numerous fascist, stalinist and other authoritarian catastrophes the human race has caused in the past are not enough and we need some more.

        • It does not matter to me, I'm just curious if she would have as many followers if she didn't have her breast implants.
          It's a good move on her part for exploiting the male psyche, though.

        Well, for good or ill, that is currently the law. It is simply illegal for people to create content intended to be masturbated to where the star(s) are underage girls. If you don't think that any of these YT videos are this, then you have your head in the sand.

          Similarly, I suspect having/watching video content you yourself consider to be CP makes it CP... Porn is a nebulous concept with regard to the law; it is all about intent. I definitely would not want to be in a courtroom trying to argue that while I di

      • by hoofie ( 201045 ) <mickey@MOSCOWmouse.com minus city> on Wednesday February 20, 2019 @10:10PM (#58155660)

        Look, the first time you watch some of her videos, let's be honest, it's quite easy on the eye.....

        After a few of them you really don't notice that aspect anymore as the content she presents is too interesting - you are a true geek if you can do that.

        She has been shamefully treated though and comes across as quite a lovely person.

    • by zlives ( 2009072 ) on Wednesday February 20, 2019 @06:24PM (#58154528)

      can we have a conversation about why people are uploading their kids on the internet in the first place?

      • Comment removed based on user account deletion
      • by grep -v '.*' * ( 780312 ) on Wednesday February 20, 2019 @07:23PM (#58154872)

        why people are uploading their kids on the internet in the first place?

        You wouldn't download a car, you wouldn't download a movie. But would you download a kid?

        IMAGES. They're uploading IMAGES -- to share. They're also the parental guardians over them, so they have that right. (I think that's a right.) They're presumably doing it to show off to their friends or their kids' friends; they don't realize (or don't care) that literally their entire world can watch them as well.

        And what's wrong with that? Maybe another dance class instructor will learn a new move or have a new idea (OMG, CULTURAL APPROPRIATION!) Maybe another kid will watch these and decide "I can do that, too!"

        Now let's be real -- SOME people are going to watch that and imagine sexual acts. Or bestiality. Or ritual sacrifice. Or beheadings. Or maybe new costume designs or even haircuts. That's true of ANYTHING. (Sheep jokes about NZ. Blond jokes. Rule 42. Someone took an actress's face and glued it on their sex doll.) If you're concerned someone might see something bad in a picture, you (a) shouldn't post it and (b) need to get your head out of the sand. (My mom used to say, "Get your head out of the gutter." Seemingly now-a-days, everyone's head IS the gutter. Not quite sure what happened -- "Sex Sells" or something.)

        If we're leading up to talking about banning pictures because they might offend (or attract) someone, then notice that God, Allah, Re, Zeus, Odin, Thor, and FSM appear in EACH and EVERY picture on the internet. They're invisible, so you can't see them, but they're always there. So, let's start banning all pictures everywhere. (Here's a naughty one. [theonion.com] But include Muhammad there and heads will literally roll.)

        And BTW: if I start fantasizing about AOC, Nancy Pelosi, and Trump, can we ban every picture about ALL of them? I'd like for SOMEthing good to come out of this.

      • by MobyDisk ( 75490 )

        Yes.

        Growing up, a group of children in my neighborhood won an award for some civic duty. The newspaper printed a picture of the group and their first names, all except for one child. That child's parents were outraged that the newspaper wanted to publish that information. They believed that having people know their faces and names presented a danger to them. Or maybe it was a news broadcast now that I think of it. But over time, that kind of thinking mostly died out. That kind of information is so triv

        • Are you suggesting that a video of a child on YouTube, with no last name and no location (which is less information than the newspaper offered) is a cause for concern?

          Perhaps.. It depends on both the context and unknown future events. There is just no way to know for sure.

          I think most folks vastly underestimate the amount of data they actually are giving up when they post stuff online. It's hard to understand just how the information will be stored and used, and how it will impact people in the future. It may be nothing, or, it may be a huge deal.

          I think the young adults who grew up in a culture of oversharing everything, getting your 15 min of fame by going viral, it's tempting to

          • Every year or so, there's another news article about some mommy blogger whose children just found out every moment of their life was memorialized online - and they are very angry. It's usually followed by the mommy blogger expressing no remorse, and talking about how they are still going to post everything their kid does (including this whole argument), usually with a fig leaf or two about some privacy "protecting" nothing that shows they "compromise".

      • They are proud of their children and want to send videos to family and friends, but don't know how the privacy features of YouTube work (or that they even exist).
      • Maybe I'm reading you wrong, but believe it or not, there's more than one reason for children to be present on the Internet. It would be wrong to allow some pervs from Eastern Europe, et al. to spoil it for the overwhelming majority of the legitimate, innocent content out there for and by children. There seems to be a lot of hand waving surrounding this topic arguing for the equivalent of shutting down playgrounds, water parks, etc. because some perverted letch might be leveraging them for his jollies. A

      • by AmiMoJo ( 196126 )

        It's hardly a new thing. There have always been public events for children, or maybe more like their parents, to show off their ballet skills or whatever. A few years ago there was a panic over parents taking photos of their kids at swimming events or even just playing football.

    • by Anonymous Coward on Wednesday February 20, 2019 @07:20PM (#58154854)

      A fully clothed girl doing gymnastics is not porn.

      But, pedophiles will watch such a video and experience lust.

      The fact that this will happen freaks people out, and a boring run-of-the-mill "look how adorable my kid is" video has suddenly become dirty.

      This situation is fueled by emotion, so it is not logical. Emotion has tremendous motivating power, so you can't talk it down with logic.

      People will suffer all kinds of injustice under the banner of protecting children. It's a strong instinct. That's why politicians like to play that card at every opportunity.

      It is not fair. Too bad. That's the world we live in.

      • by AmiMoJo ( 196126 )

        The problem is that they treat YouTube like social media, but it's not really. On Facebook they can post the video and only their friends and the gymnastics group can see and comment on it. Anyone posting inappropriate stuff gets booted out.

        YouTube doesn't have that kind of access control. Visibility is either everyone or no-one, comments are either on or off.

      • Yes, and off of YT they are often not even clothed. And sometimes the parent who thinks it's adorable, is instead a parent, pimp, or other who is simply trying to maximize revenue any way possible.

        The world and particularly the internet is full of people and organizations who live off of grooming children and selling sexual content they create. I guarantee that at least some of the children in these videos have content on other websites behind a paywall hosted in non-extradition countries.

        It makes sense th

    • Have you seen the video in question? This isn't an overreaction or censorship issue.

      There are hundreds of comments on these videos of people calling the kids sexy and posting timestamps where the kids are in suggestive positions, often with some short exclamation or suggestive emojis attached, and some even linking to real child porn.

      He started a brand new Youtube account, searched something slightly risque but completely safe and normal, and within two clicks found one of these videos along with countless

      • by gweihir ( 88907 )

        So the problem is the comments, not the videos?

        • So the problem is the comments, not the videos?

          Half the problem is the comments. The other half (also demonstrated in the video) is people copying & reposting these videos under throwaway accounts.

          It seems like Youtube needs both a way for parents to prevent unsupervised postings from their children, as well as a way to better track these obvious abuse cases.

    • by Kjella ( 173770 )

      .. There will always be sick people imagining things in pictures or videos that are not there.

      That covers misinterpreting innocent actions or creative editing but people sexualize things entirely on their own, which is why we have rule 34, all the erotic fan-fic, people ogling swimsuit catalogs when they didn't have pr0n and so on. The reasoning here is like saying that because the Harry Potter movies made a lot of people have the hots for Emma Watson the movies must be soft porn. Basically if you're going to remove all the innocent material that could be fodder for someone's spank bank like kids d

      • by sjames ( 1099 )

        you'd better dress everyone up in burkas right now.

        There's almost certainly a few who would find that to be "hot" and about 100 times as many trolls who will claim they do too.

      • by gweihir ( 88907 )

        .... you'd better dress everyone up in burkas right now.

        That is probably pretty much the original reasoning behind the burka anyways.

    • Seriously, this is absurd. Literally the only difference in "proper" or "improper" use is the intent of the person watching it, and now they want to try to regulate intent? The content itself is not the problem, it's the intent of the viewer that's the problem, so why do people think they can A) determine and B) regulate that intent?

      • by gweihir ( 88907 )

        Seriously, this is absurd. Literally the only difference in "proper" or "improper" use is the intent of the person watching it, and now they want to try to regulate intent?

        They do and they have done so in the past. What do you think the real reason for an all-seeing "God" is? It is to censor and enforce restrictions on intent, nothing less. As more and more people see through that old scum these days, they are now trying other ways to control what people dare to think. They are starting with something where they think many people will agree, but of course, this is planned to be extended as soon as they have won this battle.

    • by gweihir ( 88907 )

      Indeed. It is pretty hard to see how anybody is getting exploited here. The test for exploitation (the subject is harmed) seems to fail completely. This panic is not new though. In fact, a pretty large part of the human population will be offended by anybody female being shown at all, and this is going in that direction eventually. Do we really want to go that way?

    • Yes and no.
      First off, there are loads of little girls who made videos specifically to be sexually suggestive, and I guarantee you there are also adults filming these girls specifically trying to create sexually suggestive content. AKA, some of this content simply is CP from intent to execution.

      Other content is of course a gray area. They just want attention, their parents want them to be the next Miley Cyrus and they naturally gravitate to sexually suggestive content because 1) that is what their idols do

  • by hymie ( 115402 ) <hyrosen@mail.com> on Wednesday February 20, 2019 @06:24PM (#58154532)

    ...people complaining about how YouTube pulled the video of their children being adorable.

    • by zlives ( 2009072 )

      FTFY ...people complaining about how YouTube pulled the video of their children before they could be exploited by the parents' lack of self-esteem and vicarious living through children as the only justifiable answer to committing an hero

      • by Luckyo ( 1726890 )

        Out of sight, out of mind, eh?

        Hint: just because the spotlight isn't shining at the problem any more doesn't mean that problem went away.

    • they'll just de-monetize it and won't promote it (since they can't monetize it).

      It's actually really annoying. Lots of good YouTube content has gone poof thanks to this. Stuff like Glove and Boots, Freaky Frank, Talking Classics and the like can't make a living on YouTube since the "adpocalypse". A few channels made the jump to Patreon but that only works if you've already got a following. New up-and-comers needed that YouTube ad revenue to get going.

      Somewhat annoyingly the hordes of anti-SJW channel
      • by AmiMoJo ( 196126 )

        Can't really force advertisers to advertise on videos they don't want to be associated with, so we need another solution.

        How about some kind of charity to support worthwhile YouTube channels? Patreon is too specific and people are unwilling to sign up to a dozen $5/month subscriptions, but a charity that supports a large group of channels might work.

        I suppose the problem would be people objecting to some of the channels, but at the very least it would be an interesting experiment in seeing if people really

    • by Mashiki ( 184564 )

      That's not really the problem. There are a few actual problems, one being that Youtube knows that there's a pedo problem already; they have an automatic comment restriction system that kicks in on a lot of kids' videos. Another problem is that it's trivial to find the content, and as pointed out in the video the comment section is cancerous and full of pedo crap. The real problem is that the pedos will start telling the kids what to do in the comment section, then the kid starts doing it. "Kids being cute"

      • by djinn6 ( 1868030 )

        Why wouldn't censorship be a solution?

        Kids watching YouTube makes them money. Kids uploading videos generates bad press. Block all videos with kids in them and the problem goes away.

        • by Mashiki ( 184564 )

          Censorship isn't the solution because it begets the slippery slope. See the UK for example, which first started restricting "allowable types of porn" and whatnot, and now are pushing for you to buy a porn pass.

          Blocking all videos with kids in them, would mean that videos that the RCMP does, or OPP would no longer be available either. It's similar to the "everything looks like a nail, when you're holding a hammer" approach.

          • by djinn6 ( 1868030 )

            If you only have a hammer, then you should use it until you get yourself a better tool. It's much easier to build an ML model to identify children in general than to identify children in potentially compromising positions, especially when you don't know what some freak might be turned on by.

            I also don't see how this can be a slippery slope. Having separate platforms for kids and adults to post content on makes a lot of sense. There's a good reason why schools are segregated by age, not the least of which is

  • It won't work (Score:4, Insightful)

    by TWX ( 665546 ) on Wednesday February 20, 2019 @06:33PM (#58154584)

    From Youtube: "Any content -- including comments -- that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments," a spokeswoman for YouTube said in an email.

    It won't work. The fundamental problem is that it's expensive to editorialize/police content and advertising. Major television networks employ standards boards, local television stations have station managers and other staff, and even cable networks have to maintain staff to both sell and to police the content of television shows and of ads. These entities have to spend a sizable amount on salary for these censors, and even with airtime capped at 1440 minutes in a 24-hour period they still get it wrong.

    There are claims that 5 billion videos are watched daily on Youtube, and more than 400,000 hours of content is added to Youtube every day. There's simply no way to keep up as censors with that kind of content. Hell, Google can't even keep its ad delivery networks free from malicious ads, how do they expect to keep inappropriate content off when those uploading content don't have a strong financial tie with a particular salesman or censor?
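    To put rough numbers on that claim (a back-of-the-envelope sketch, taking the 400,000 hours/day figure at face value and assuming one reviewer watches content for a full 8-hour shift):

```python
# Hypothetical staffing estimate for human review of daily YouTube uploads.
hours_uploaded_per_day = 400_000   # figure claimed in the comment above
shift_hours = 8                    # one reviewer's working day

reviewers_needed = hours_uploaded_per_day / shift_hours
print(reviewers_needed)            # 50000.0 full-time staff, just to watch everything once
```

    Even this ignores breaks, double-checking, and appeals, so the real number would be substantially higher, which is the commenter's point about why editorial staffing cannot scale.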

    • Assign every content uploader to be the personal censor for a random other uploader. And then pick a third random uploader to be the metamoderator for the first censor. Shuffle every week. You can't upload if you haven't caught up on your backlog of reviewing and rejecting or approving whoever you're assigned to. Make the random assignments balanced -- people who upload lots of stuff are assigned other people who upload lots of stuff.
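      The scheme proposed above could be sketched roughly like this (a hypothetical toy model; `assign_roles` and `can_upload` are made-up names, and a real system would also need to handle uploaders joining or leaving mid-week and balance assignments by upload volume):

```python
import random

def assign_roles(uploaders, week_seed):
    """Weekly shuffle: each uploader is reviewed by one other uploader,
    and a third uploader metamoderates that reviewer.
    Assumes at least 3 uploaders so the three roles are always distinct."""
    rng = random.Random(week_seed)  # change the seed each week to reshuffle
    order = uploaders[:]
    rng.shuffle(order)
    n = len(order)
    # Rotate by one and two so nobody ever reviews or metamoderates themselves.
    censor_of = {order[i]: order[(i + 1) % n] for i in range(n)}
    metamod_of = {order[i]: order[(i + 2) % n] for i in range(n)}
    return censor_of, metamod_of

def can_upload(user, review_backlog):
    # Gate uploading on a cleared review backlog, per the proposal.
    return review_backlog.get(user, 0) == 0
```

      The rotation trick guarantees no self-review, but a determined group could still collude if they can predict or influence the shuffle, which is one reason the metamoderation layer matters.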
      • by mark-t ( 151149 )

        You can't upload if you haven't caught up on your backlog of reviewing and rejecting or approving whoever you're assigned to.

        What happens if you approve stuff that the metamoderator disagrees with? Do you also get blocked from uploading?

        Bear in mind that your suggested system is simply self-reinforcing, and does not generally allow for the insertion of new or potentially even controversial information, even if there is an otherwise objectively legitimate reason for it.

    • Of course there is. ML algorithms could easily screen stuff. Those 400,000 daily hours uploaded is backed by the computational horsepower to support that. You're not asking your personal Dell PC to check this stuff. There's any number of low cost risk mitigation strategies that could be employed. For example, while helping my kids navigate Youtube I've found that much of the pervy stuff comes in the form of reposts. That is, someone ripping off content from a legitimate channel and posting the trimmed
      • by gweihir ( 88907 )

        No, it cannot. ML is maybe 90% accurate if really, really good. You need a lot more than that here.

        • You don't need to be 100%. You don't need to have the ML remove 100% and you don't need to match 100% to improve the situation in a significant and meaningful way. An example ML-involved strategy that I believe would be very effective would be a multi-stage filtration system. Stage one, the ML, grabs the low hanging fruit--everything hitting a sufficiently high confidence threshold. Stage two, Google staff get handed stuff the ML flagged as likely, but not having sufficient confidence to automati
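          The staged approach described above can be sketched as a simple triage function (hypothetical; `score_fn` stands in for whatever classifier is used, and the thresholds are placeholders that would have to be tuned against false-positive costs):

```python
def triage(items, score_fn, auto_threshold=0.95, review_threshold=0.60):
    """Split uploads into three buckets by classifier confidence:
    auto-actioned, queued for human review, or passed through."""
    auto, review, passed = [], [], []
    for item in items:
        score = score_fn(item)      # model's confidence the item is violating
        if score >= auto_threshold:
            auto.append(item)       # stage one: high confidence, act automatically
        elif score >= review_threshold:
            review.append(item)     # stage two: medium confidence, human review
        else:
            passed.append(item)     # low confidence: leave it alone
    return auto, review, passed
```

          The point of the middle bucket is exactly the commenter's argument: a 90%-accurate model that only auto-actions its most confident calls, and routes the rest to humans, still shrinks the human workload enormously without needing to be perfect.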
    • by AmiMoJo ( 196126 )

      In this case though all they need to do is disable comments. They are not objecting to the content of the video, just the comments from people jacking off over them.

      Of course the problem with that is that some of those channels want comments, because they create engagement which translates to $$$. Even just people reading the comments while the video plays adds to its watch time.

  • by fahrbot-bot ( 874524 ) on Wednesday February 20, 2019 @06:36PM (#58154606)

    Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify certain videos in which young girls were in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics.

    Just about *any* activity can be sexually suggestive to someone, somewhere. Not judging, just sayin' ...

    • by DontBeAMoran ( 4843879 ) on Wednesday February 20, 2019 @06:41PM (#58154620)

      Tell me about it! Who came up with the perverse idea of a four-slot toaster?!

    • by Luckyo ( 1726890 )

      In this case though, the behaviour is often overtly suggestive and is literally the point of the video, down to it being mentioned in the clickbait video title. One comedy youtuber I occasionally watch openly mocked one channel that did this to an extreme degree quite a while ago. Literally a girl in her early teens, in outfits highlighted in the clickbait title as "making her boyfriend jealous", doing a rather awkward attempt at acting like a porn star right before the sex scene starts. Then making out with her

  • This just illustrates that there is nothing that cannot be ruined by comments that are not heavily policed: if you cannot police said comments, do not allow them.

    It also shows an unexpected misuse of a simple tool - likes and recommendation engines. But here I really don't see any way to solve this, because what if someone really WANTS to see gymnastic related videos? That by itself is harmless or even useful. I don't think we should break all useful tools just because someone can misuse them - i

    • by djinn6 ( 1868030 )

      But here I really don't see any way to solve this, because what if someone really WANTS to see gymnastic related videos? That by itself is harmless or even useful.

      Simple. You pay for it.

      YouTube and almost all YouTube creators rely on ad revenue. If enough people make a fuss over something, no matter how dumb, advertisers will demand change and YouTube must comply.

      The only way to change that is to stop relying on ad revenue, but YouTube Red / Premium didn't really take off.

  • So if I did have a daughter who was into gymnastics, and I posted her winning the super duper first prize, is this sexually suggestive? I believe someone out there will find it that way. So where do we draw the line? Would I get in trouble for posting such a video, if "someone" says this is sexually suggestive?

    • Comment removed based on user account deletion
    • Would I get in trouble for posting such a video, if "someone" says this is sexually suggestive?

      If you don't monetize the video, you probably won't have a problem. It's the threat to their income streams that galvanized Youtube. It's not like they actually give a shit about your kids, or anyone else's.

    • by Mashiki ( 184564 )

      So if I did have a daughter who was into gymnastics, and I posted her winning the super duper first prize, is this sexually suggestive? I believe someone out there will find it that way. So where to we draw the line. Would I get in trouble for posting such a video, if "someone" says this is sexually suggestive?

      Somewhere, someone probably would. But here's the question: is your daughter turning around and then reading through the comments and performing specific actions that are sexually suggestive or sexual because people were asking for it? See, in most cases, people are talking about the latter and not the former. Youtube has an automatic filtering system to close comments on videos that she's posting, and in turn youtube already knows it has a problem with people asking kids to engage in sexually suggestive or provocativ

    • by gweihir ( 88907 )

      Eventually, if this panic goes on, this will have your daughter being taken away and you being put in prison.

  • 'Soft core' (Score:4, Interesting)

    by philmarcracken ( 1412453 ) on Wednesday February 20, 2019 @06:57PM (#58154698)

    Does 'soft core' acts actually pose any serious mental damage to the children? Are these acts volitional?

    • by Jarwulf ( 530523 )
      Scrolling through the video it looks like this 'softcore child porn' is mostly videos often recorded by the girls themselves dancing and/or wearing swimsuits. So basically every parent who has recorded a beach day and anyone who has walked down a beach and looked at people is now to be considered a child molester. This is where we're at as a society today and everybody is too terrified to speak out against this.
      • by gweihir ( 88907 )

        Indeed. This stuff is not porn in any normal way, it is only porn in the minds of some deviants and the maker of this video seems to be one of them. (Same principle as the most extreme anti-homosexuals are usually secretly one of them...) It seems the potential harm is rather limited and does not justify this outcry at all. Probably some censorship agenda being pushed as well, as happens so often these days.

      • by Shotgun ( 30919 )

        And don't dare let your children go to the mall. The larger-than-life posters of underwear-clad young women made up to look underage outside of Victoria's Secret will make heads explode.

  • I just have to laugh...
    Gives them a dose of the hell they give people who host user generated content (chat forums, image hosts, etc..).

    I hope our A.I. overlords come for their heads (and Zucker_Borg) first.
  • The problem here is that the internet is, by and large, anonymous. You can literally post any steaming pile you like, be as rude, abusive and socially unacceptable as you choose with zero real implications.

    The problem here is that there is no personal accountability, at least not really. Sure, you can get TOSed or your account deleted, but it's not like a nym shift is hard to do. Grab another e-mail, create a new account and post your garbage again and again.

    The only real solution I see is to require

    • by sjames ( 1099 )

      Of course, the problem then is that sometimes anonymity is legitimately needed. For example, when sincerely posting a legitimate political opinion that might piss off the powers that be. Those powers might be government, a grumpy family member, or an employer with extreme opposite political views.

    • by djinn6 ( 1868030 )

      Facebook is not anonymous, yet you see the same stuff. In some cases, you can get people to stop by raising a big stink with their employer and getting them fired from their job. But I'm not sure that'd be a good route to go down.

      If Slashdot or someone on Slashdot could send everything I posted here to my manager, I wouldn't be posting here at all. I'd rather deal with the shit hole that is 4chan than to risk my job for an internet discussion.

  • by Orgasmatron ( 8103 ) on Wednesday February 20, 2019 @08:16PM (#58155158)

    Or rather, watch the damn video before commenting. This is probably among the 10 worst summaries I've read on this site, and I've been here just about every day since nearly the beginning.

    This isn't a problem about parents uploading videos of their kids, or of kids uploading their own videos. It isn't about the videos at all.

    The problem is that there is a side of youtube that most of us would never find on our own. But if you know it is there, you can get to it with one search and two clicks, as shown in the video.

    Most of the videos there have been downloaded from other users and re-uploaded under a different account so that the parents and kids have no idea this is happening. The comments on the re-uploaded videos are full of creepy comments and timestamps to suggestive moments, or to other videos.

    Once down the rabbit hole, all of the recommended video links are to other videos of the same type with the same disgusting comments and links.

  • by quenda ( 644621 ) on Wednesday February 20, 2019 @08:46PM (#58155322)

    So Nestle is making a fuss over videos of kids eating lollipops?

    Nestle, the company who knowingly killed how many thousands of babies, pushing baby formula in third-world countries?
    And have made billions stunting the development of millions of babies by promoting the same products to mothers who were capable of breast feeding?

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • So Nestle is making a fuss over videos of kids eating lollipops?

      No, they're making a fuss over something they think harms their brand image.

      Nestle, the company who knowingly killed how many thousands of babies, pushing baby formula in third-world countries?

      And now you see WHY they care so much about something that might harm their brand image.

    • Let's not forget the young African children who were used as slave labor on cocoa plantations. Fully explained in the documentary 'The Dark Side of Chocolate.'

  • Alex Jones (Score:4, Informative)

    by bill_mcgonigle ( 4333 ) * on Thursday February 21, 2019 @12:08AM (#58156008) Homepage Journal

    I'm just sitting here laughing because, as crazy as that bastard is, YouTube proudly deplatformed Alex Jones who is constantly railing against child sex-trafficking rings.

    And all the while it turned out that YouTube was the one promoting such things with its technology and/or lack of care.

    Not that I'm expecting one single moment of introspection from YouTube.

    • by AmiMoJo ( 196126 )

      Alex Jones railing against child sex-trafficking is the very definition of virtue signalling. He doesn't really care, he just does it to have something that makes his detractors look bad. "YouTube bans child sex-trafficking activist" sounds bad, until you realize that Alex Jones has been harassing the victims of Sandy Hook for years, and that's not even the worst of it.

    • that de-monetized Call of Duty streamers for violence? YouTube will come down on this like a ton of bricks. Their main concern is keeping advertisers happy. YouTube, like regular TV, is about ads.
    • YouTube proudly deplatformed Alex Jones who is constantly railing against child sex-trafficking rings.

      It may surprise you to learn that Youtube did not deplatform him for his stance on this.

      And all the while it turned out that YouTube was the one promoting such things with its technology and/or lack of care.

      I just masturbated to this post. You need to be banned. Why is Slashdot facilitating your disgusting post.

      #outrage.

      Not that I'm expecting one single moment of introspection from YouTube.

      Oh wow. You actually finished your post with something smart. I'm sure you didn't intend to, but hey accidents happen.
