
YouTube Is Heading For Its Cambridge Analytica Moment (cnbc.com)

Earlier this week, Disney, Nestle and others pulled their advertising spending from YouTube after a blogger detailed how comments on Google's video site were being used to facilitate a "soft-core pedophilia ring." Some of the videos involved ran next to ads placed by Disney and Nestle. With the company facing similar problems over the years, often being "caught in a game of whack-a-mole to fix them," Matt Rosoff from CNBC writes that it's only a matter of time until YouTube faces a scandal that actually alienates users, as happened with Facebook in the Cambridge Analytica scandal. From the report: To be fair, YouTube has taken concrete steps to fix some problems. A couple of years ago, major news events were targets for scammers to post misleading videos about them, like videos claiming shootings such as the one in Parkland, Florida, were staged by crisis actors. In January, the company said it would stop recommending such videos, effectively burying them. It also favors "authoritative" sources in search results around major news events, like mainstream media organizations. And YouTube is not alone in struggling to fight inappropriate content that users upload to its platform. The problem isn't really about YouTube, Facebook or any single company. The problem is the entire business model around user-generated content, and the whack-a-mole game of trying to stay one step ahead of people who abuse it.

[T]ech platforms that rely on user-generated content are protected by the 1996 Communications Decency Act, which says platform providers cannot be held liable for material users post on them. It made sense at the time -- the internet was young, and forcing start-ups to monitor their comments sections (remember comments sections?) would have exploded their expenses and stopped growth before it started. Even now, when some of these companies are worth hundreds of billions of dollars, holding them liable for user-generated content would blow up these companies' business models. They'd disappear, reduce services or have to charge fees for them. Voters might not be happy if Facebook went out of business or they suddenly had to start paying $20 a month to use YouTube. Similarly, advertiser boycotts tend to be short-lived -- advertisers go where they get the best return on their investment, and as long as billions of people keep watching YouTube videos, they'll keep advertising on the platform. So the only way things will change is if users get turned off so badly that they tune out.
Following Facebook's Cambridge Analytica scandal, people deleted their accounts, Facebook's growth largely stalled in the U.S., and more young users have abandoned the platform. "YouTube has so far skated free of any similar scandals. But people are paying closer attention than ever before, and it's only a matter of time before the big scandal that actually starts driving users away," writes Rosoff.
This discussion has been archived. No new comments can be posted.

  • IMDB (Score:5, Interesting)

    by Kyr Arvin ( 5570596 ) on Friday February 22, 2019 @06:18PM (#58166804)

    IMDB removed their comments sections entirely rather than police them.

    Youtube's comments are more integral to the service, but if Youtube is going to have to do more about them than respond to user complaints, they might find it easier to just shut that crap down preemptively.

    • That's the goal, enable YouTube to eliminate content their management doesn't want, and use demagoguery against anyone who criticizes their choices. And the best demagoguery is "it's for the children".
      • by Anonymous Coward

        "enable YouTube to eliminate content their management doesn't want," - Huh? They have that power now. What are you not getting about this? You have no "right" to youtube under any circumstances. They own your account.

        You upload content, it's theirs more than it is yours until lawyered otherwise. You think it's "demagoguery" to have control of their platform and run it like they want to? Maybe, but they can. Tighten up snowflake.

        Go host your own videos if you want to be Braveheart the pedo.

      • by Anonymous Coward

        muh conservative values!!1! I'm getting verklempt!!

    • but if Youtube is going to have to do more about them than respond to user complaints

      They can just threaten to remove the channels if they do not police their own comments.

  • This is just silly (Score:5, Insightful)

    by rsilvergun ( 571051 ) on Friday February 22, 2019 @06:20PM (#58166818)
    YouTube isn't a social network. The controversy, such as it is, doesn't have anything to do with privacy. Also YouTube hasn't done anything dodgy or illegal, they've just responded poorly to a very minor bit of bad publicity.

    This'll blow over, some full time YouTubers will sadly lose out (and we'll lose out on some good content) and YouTube will go on.

    The CA thing was a mess because not only were there privacy concerns but there was the stink of corrupt American politics all over it.
    • by rsborg ( 111459 )

      The CA thing was a mess because not only were there privacy concerns but there was the stink of corrupt American politics all over it.

      Yeah, the CA thing was entirely different. The issue at stake was that because FB was really lax on security they effectively let CA pull the whole graph down on everyone even though those people may not have fallen for CA's dirty tricks directly. That information was then weaponized to create deepfake videos and radicalize the population.

      This is more along the lines of the usual "Google doesn't understand humans" policy intersected with pedos being pedos.

      If comments were turned off on YouTube would any

    • by Anonymous Coward

      Yea, the CA thing was far worse because it might have been used to help the GOP win elections.
      When the same thing was used in 2012 to help Obama win reelection, it was perfectly acceptable.

      So when you help the party that supports KKK members (VA governor), rapists (VA lt governor), and advocates of infanticide (killing live babies as part of abortion law) you should be able to break whatever rules exist.
      If you want to cut taxes on middle class or protect US citizens from Mexican drugs and violent criminals,

    • As I understand it this was pushed by a YouTuber who had been demonetized because his content was creepy. His response was to use the following he had to complain to the advertisers about pedophiles commenting in certain videos in an effort to hurt YouTube and successful creators who had not been demonetized.

      Not that there wasn't a problem, but YouTube was aware of it and trying to address it without forcing creators to have to moderate content. The stream of complaints caused advertisers to pull their ads

    • Exactly. People are getting more aware of privacy violations, but I expect that the vast majority of YT viewers give exactly zero fucks about the fact that a kiddie porn ring is abusing parts of the comments section. They might care if it's their comment section being abused, whether they are hosting the video or just commenting on it, but it's such a small section of YT that people will just ignore it. No one is going to cancel their account over this.

      Advertisers on the other hand are very sensitive a
  • Sure, I was alienated by the Cambridge Analytica scandal, as was another nerd I know, but that's it in my circle of acquaintances, and neither of us gave up our accounts. When I brought up the scandal at a get-together, nobody else had even heard of it, and the conversation shifted to how much everybody likes Facebook.

    Honestly, I don't think Youtube can be blamed for the, admittedly clever, use of its comment system for nefarious reasons. If there is something they can do to stifle those uses, great...but

  • I hope the producers of this "soft-core pedophilia" content are punished severely.

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Friday February 22, 2019 @06:42PM (#58166952)
      Comment removed based on user account deletion
      • by Anonymous Coward

        Nah, that's basically it. Really his point was two things: first, that the comments were being used for that, and second, that Google's recommendation algorithm worked really well and it only took looking at like two videos of little girls before YouTube decided that was all his new user account was interested in watching.

        I'm not sure exactly what he thinks Google could do about it. Comments are clearly coming from users, and other than disabling them entirely, what is Google supposed to do? The recommendat

      • by Anonymous Coward

        The problem is google's algorithms and how they recommend videos to people. They end up inadvertently creating what are almost communities, and sometimes centered around really creepy shit. I ran into something similar to this before, but with videos of monkeys. Apparently there's a whole community of people on youtube who really really hate monkeys, and like to upload and watch videos of baby monkeys suffering. It was just like the how the softcore pedo ring was described, where you'd go to one video a

    • by ffkom ( 3519199 ) on Friday February 22, 2019 @06:47PM (#58166992)
      Do you realize this is about completely mundane videos of children in everyday activities, with the only thing related to "pedophilia" being the claim that it was pedophiles who left idiotic comments that suggest parts of these videos were somehow arousing sexual feelings?

      While theoretically, extremely stupid pedophiles might actually have been the authors of those comments, it seems just as likely that trolls seeking attention for either fun or publicity or money wrote those comments themselves to then base a "scandal" on them.
      • by Mashiki ( 184564 )

        One of the problems is, you can go back over say the last 5 years of court cases where that type of action shows a "pattern of behavior" of an individual who's been criminally charged with a sex crime against a minor, or trafficking in child porn. One of the problems is that kids see the attention they're getting then act out the things suggested, that *is* a problem. Said problem has multiple failures of multiple people at multiple levels. If you have kids, or nieces or nephews and so on. Especially ki

  • by Chas ( 5144 )

    Basically, the new "comments on your video can get you demonetized" policy is going to slowly strangle the content creator community,
    because it favors established creators with multiple revenue streams already in place.

    And it looks likely to present an insurmountable barrier to entry for NEW creators.

  • I already pay and have an ad free experience. Worth every penny considering the amount of great content on it.

    • by ffkom ( 3519199 )
      So you are fine with Youtube getting all that content for free while you pay them for it. I for one would rather pay the authors of the videos, not Youtube.
    • I already pay and have an ad free experience.

      I use an ad blocker and have an ad free experience.

      What's your point?

  • Re: (Score:1, Insightful)

    Comment removed based on user account deletion
    • by AHuxley ( 892839 )
      Use some p2p method to make the watcher offer some of their bandwidth back.
      A way for viewers to support people directly who make video clips without the third party politics and CoC of a CC brand, a membership platform.
      No longer can an online payments network, CC brand, membership platform stop funds going to artists and content creators from their viewers.
      Ensure the video clips flow from the creator to the people searching for them.
      Back to being a utility connecting content to users rather than becomi
  • Once again, what was once a college project used by a few thousand has morphed into a multi-billion dollar corporate enterprise; sadly, those in charge still run it like it's a little college server for a few laughs.

    Some people are scum, and they will ruin things that most of us are happy to use for fun. As anyone who works in retail will tell you, it's always the 5% that are the most difficult to deal with, but sadly everything has to be geared up to deal with the minority and the trouble they cause, it's never

  • Did anything happen to FB? Did they change any practices? Did they stop gathering data from non-members? Did it affect their long-term revenue or stock?
    The answer to all is a "no"... there's no consequences for corporate oligarchs.
  • Amazing how difficult it is to police freedoms, is it not? Their excuse of, "we're too big to police ourselves" can't work forever.
