YouTube's Recommender AI Still a Horror Show, Finds Major Crowdsourced Study (techcrunch.com) 81

An anonymous reader shares a report: For years YouTube's video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or conspiracy junk/disinformation for the profiteering motive of trying to keep billions of eyeballs stuck to its ad inventory. And while YouTube's tech giant parent Google has, sporadically, responded to negative publicity flaring up around the algorithm's antisocial recommendations -- announcing a few policy tweaks or limiting/purging the odd hateful account -- it's not clear how far the platform's penchant for promoting horribly unhealthy clickbait has actually been curbed. The suspicion remains that it's nowhere near far enough.

New research published today by Mozilla backs that notion up, suggesting YouTube's AI continues to puff up piles of "bottom-feeding"/low-grade/divisive/disinforming content -- stuff that tries to grab eyeballs by triggering people's sense of outrage, sowing division/polarization or spreading baseless/harmful disinformation -- which in turn implies that YouTube's problem with recommending terrible stuff is indeed systemic; a side effect of the platform's rapacious appetite to harvest views to serve ads. That YouTube's AI is still -- per Mozilla's study -- behaving so badly also suggests Google has been pretty successful at fuzzing criticism with superficial claims of reform. The mainstay of its deflective success here is likely the primary protection mechanism of keeping the recommender engine's algorithmic workings (and associated data) hidden from public view and external oversight -- via the convenient shield of "commercial secrecy." But regulation that could help crack open proprietary AI black boxes is now on the cards -- at least in Europe.



Comments Filter:
  • Quelle coincidence (Score:4, Insightful)

    by Impy the Impiuos Imp ( 442658 ) on Wednesday July 07, 2021 @04:54PM (#61560559) Journal

    Are people raging about this? Or are politicians raging about this, facetiously feigning concern of disinformation, so they can hope to slide their opposing politicians' posts into the disinformation, ergo banned, bag?

    Keep an eye on those dirtballs.

    • More importantly, is it really "AI" that we are talking about or it is simply a pattern matching algorithm?

      • by laird ( 2705 ) <lairdp@gmail.TWAINcom minus author> on Wednesday July 07, 2021 @05:20PM (#61560659) Journal

        It's an algorithm, not particularly AI - page ranking is pretty well documented in its base mechanics, though of course by now who knows how much clever math is piled up on top of it. The challenge is that the page ranking mechanism promotes things that are widely referenced and clicked on, and the right-wing, in particular, has aggressively gamed the system to promote their content. So they create huge pools of sites that cross-link to each other, for example, creating artificial page rank boosts, and whenever Google adjusts the algorithms to block out one form of manipulation they adapt their tactics, in an ongoing cat-and-mouse game. And since they're not going to stop creating propaganda and promoting it however they can, it's not going to go away completely - every time they come up with a new tactic, Google has to figure out how to detect and block that tactic without also blocking legitimate content, which is a pretty challenging task. And since the trolls are well funded and then supported by well-meaning idiots, it's a never-ending battle. A lot like email spammers, but worse.
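The link-farm gaming described above can be sketched with the classic PageRank iteration. This is only a toy illustration: the site names, graph, and damping value are invented, and YouTube/Google's actual ranking systems are proprietary and far more elaborate.

```python
# Toy PageRank sketch of cross-link gaming. All names/numbers are hypothetical.

def pagerank(links, damping=0.85, iters=50):
    """Iteratively compute PageRank for a {page: [outlinks]} graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue  # dangling page: its rank share simply leaks away
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

# An "honest" page with one inbound link vs. a pool of cross-linked farm pages.
graph = {
    "honest": [],
    "referrer": ["honest"],
    # Farm: every farm page links to every other farm page and to the target.
    "farm_target": ["farm_a", "farm_b", "farm_c"],
    "farm_a": ["farm_target", "farm_b", "farm_c"],
    "farm_b": ["farm_target", "farm_a", "farm_c"],
    "farm_c": ["farm_target", "farm_a", "farm_b"],
}

ranks = pagerank(graph)
# The cross-linked farm target outranks the honestly referenced page.
print(ranks["farm_target"] > ranks["honest"])
```

Because the farm pages keep recirculating rank among themselves, the farm's target ends up scoring higher than a page with a single honest inbound link — which is the kind of artificial boost the cat-and-mouse adjustments mentioned above try to detect.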

      • No, that's not the important question. It doesn't matter at all what word you use to describe it.

      • What is the difference between the two?

    • Re: (Score:3, Interesting)

      What really bugs me is when I get off-the-wall recommendations based on things I have said aloud. Which sometimes hit a bit too close to home. When I have *never* used nor authorized Google Voice. All that does is make me paranoid. Enough to tape over the selfie cam and super glue the mics on my Android tablet.

      • I hear this complaint a lot but, despite having 3 Google Home devices and 2 Android devices in my home, I have never experienced this phenomenon. The closest thing is a search on Amazon will then continue to roll ads for that thing on Amazon itself and maybe a few other sites that display Amazon ads.

        I don't go out of my way to block ads but I do block tracking sites, 3rd party cookies and javascript. I also don't install many apps and don't allow apps permission to devices that they don't need.

    • Are people raging about this? Or are politicians raging about this

      So, I take it from your question that you didn't read the article [slashdot.org].

      Mozilla took a crowdsourced approach, via a browser extension (called RegretsReporter) that lets users self-report YouTube videos they “regret” watching. The tool can generate a report that includes details of the videos the user had been recommended, as well as earlier video views, to help build up a picture of how YouTube’s recommender system was functioning.

      ...A substantial majority (71%) of the regret reports came from

    • by pbasch ( 1974106 )
      Well, I know n=1 here, but I'm raging about it. It's horrible. I think The Onion put it best (as usual): https://www.theonion.com/we-mu... [theonion.com]

      Actually, I remembered that bit as being about YouTube, not Facebook, but the principle applies.
  • The problem isn't the algorithm. The problem's the data - YouTube users' data. If you object to hate speech & its consequences, boycott YouTube & ask your political leaders to look into blocking &/or prosecuting the owners for hate speech. It's a crime in most countries.
    • Hate speech should not be a crime. We should not even have the concept of a "hate crime". If I say I hate murderers then is that hate speech or a hate crime? No. There's a general agreement that murder is bad so hating people that commit that act isn't a bad thing. Okay, what if I were to express hate toward people from some nationality, religion, or ethnic group? That's bad, right? Something worthy of a crime? Why? Speech should not be a crime. Not unless it falls into very specific categories. Calling

      • Hate speech should not be a crime.

        No, but it shouldn't be pushed on people either.

        The story is not saying hate speech should be a "crime". The story is about an algorithm that, when you view one video that you ask for, follows that up with another video that is conspiracy theories and detestable garbage... because vile garbage provokes outrage, and outrage glues eyeballs to the video screen.

        But I don't think you even read the article [slashdot.org] you're commenting on (e.g., https://www.theguardian.com/te... [theguardian.com])

        • The story is not saying hate speech should be a "crime".

          That's right, but the comment I replied to suggested using hate speech laws to get people posting hate speech prosecuted and punished. I believe such laws are a bad idea because hate crimes are too subjective and nearly impossible to enforce consistently.

          But I don't think you even read the article you're commenting on

          I doubt you read the post I replied to. Are you filtering at "+3"?

      • A hate crime is a crime which has as a goal to suppress a group's basic rights. It has to be a crime in itself first, before it can be a hate crime.

        Yes, murder is already illegal, and obviously bad in itself. But murder for the purpose of intimidating other people from exercising their rights, for instance voting, is extra bad.

        The state has an additional, important legitimate interest in preventing this, besides the interest it has in preventing murder in general.

        Don't conflate this with "hate speech". Hate

        • Yep. Same way terrorism is ethically and legally worse than the basic crimes or acts of war employed.
          People try to use the term for basically anything they don't like these days, in the hope of capitalising from the stigma associated with it, but the core of the concept is violence aimed at striking fear into the civilian population, as opposed to reducing military capability, which can be more readily defended as "defense".

        • According to your definition, hate crimes are actually just terrorism, no? Using force against civilians to terrify them into political compliance. A better formulation of the concept I think, and one whose terminology should prevail - we should get rid of the idea of "hate crimes" and reserve the enhanced sentences for acts of terrorism. Otherwise I see no reason to treat the motive of one murderer differently from another's. Does it really matter if a victim was murdered for looking at someone the wro
      • by jvkjvk ( 102057 )

        >Hate speech should not be a crime.

        Hate speech has the purpose of terrorizing, or suppressing people in and of itself. Or of causing additional harmful action to come to someone or group.

        "Hate Crime" is crime which has the additional purposes noted above. So, since the extent of the crime is greater, so should the punishment be. Pretty simple concept.

        • Hate speech has the purpose of terrorizing, or suppressing people in and of itself. Or of causing additional harmful action to come to someone or group.

          You just defined assault, and we already have laws against that.
          https://www.merriam-webster.co... [merriam-webster.com]

          Hate speech can be a verbal assault but merely expressing hate is not always assault. The problem has become that hate speech is not enforced consistently, and hate crimes have the same problem of inconsistent enforcement.

          "Hate Crime" is crime which has the additional purposes noted above. So, since the extent of the crime is greater, so should the punishment be. Pretty simple concept.

          Again, we already have laws against that. To use the example of assault there is simple assault, aggravated assault, and felony assault. Perhaps different jurisdictions define them slightly

    • Or just don't fall for clickbait. The problem isn't the data, it's behavior. People click on certain kinds of videos more often than others, so YouTube shows them more of what they've been watching. Why are there so many garbage "Reality" TV shows? Because people watch them. Why so many horribly unrealistic police procedural shows? People watch them. Why so much garbage on YouTube? Because people... ?
  • by rsilvergun ( 571051 ) on Wednesday July 07, 2021 @04:59PM (#61560587)
    Left wing commentators who debunk these conspiracies. It's to the point where several of them have stopped doing the debunking videos. Meanwhile the conspiracy theorists are happy to just create new channels, likely because they're being paid by someone to do this. And while the left wing commentators can appeal YouTube's decisions and bans, oftentimes, and especially for the small channels, YouTube just tells them they reviewed their appeal and the ban stands.
    • I could have sworn they were paid a cut of the advertising by YouTube. What makes you think "they're being paid" by anyone else? What makes you think the other side isn't?

      I'll tell you what I think the answer is. You don't want to accept that people really believe the things YouTube is trying to suppress, and really don't want to consider the possibility that they're right. So, easier to dismiss them as paid liars saying things nobody really believes. And YouTube is more than willing to help you bel

  • by BrendaEM ( 871664 ) on Wednesday July 07, 2021 @05:01PM (#61560595) Homepage
    Perhaps the AI is already trying to destroy us.
  • then hold the Google execs personally responsible when shit hits the fan.

  • I've seen plenty of ads feeding paranoia on youtube.
  • New research published today by Mozilla backs that notion up, suggesting YouTube's AI continues to puff up piles of "bottom-feeding", low-grade, divisive, disinforming content -- stuff that tries to grab eyeballs by triggering people's sense of outrage,

    I'm pretty sure it has no idea about outrage or eyeballs but I'm sure it's prioritizing videos that people spend time on and comment on. I don't think the up-votes or down-votes are a significant factor, only how engaged people were with the video. This is Facebook's strategy to a 't' and it looks like YouTube uses it as well.
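That engagement-over-votes logic can be sketched as a toy scoring function. The weights here are invented purely for illustration; YouTube's real ranking model is not public.

```python
# Hypothetical engagement-weighted scoring: watch time and comments dominate,
# vote direction barely matters. All weights are made up for illustration.

def engagement_score(watch_seconds, comments, upvotes, downvotes):
    # Time spent and discussion are weighted heavily; votes almost ignored.
    return 1.0 * watch_seconds + 30.0 * comments + 0.1 * (upvotes - downvotes)

calm_video = engagement_score(watch_seconds=120, comments=2,
                              upvotes=500, downvotes=5)
outrage_video = engagement_score(watch_seconds=600, comments=80,
                                 upvotes=50, downvotes=900)

# The heavily downvoted but heavily discussed video still scores higher.
print(outrage_video > calm_video)
```

Under any weighting of this shape, a video that provokes long watch sessions and arguments in the comments outranks a well-liked but briefly watched one — which is the dynamic the parent comment describes.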

  • Used to be a fan, but they have become irrelevant and woke. We should be focusing on the censorship of the tech giants, not being recommended things we disagree with. People should be educated to use their own brains and think for themselves and not be spoon-fed "this is true, we swear" government sanctioned and tech monopoly parroted junk.

    • Re: (Score:2, Interesting)

      by markdavis ( 642305 )

      If I had points I would mod you up.
      First, Mozilla needs to focus on browsers and the like, not website content on Youtube. And second, the three SPECIFIC examples they cited in the article of inappropriate video recommendations were not "hate speech" (even if something like that could be defined, which it can't), were not misinformation, and were at least somewhat relevant (although far from great recommendations). What did the three have in common? They were more conservative rather than leftist. OMG H

  • So called authoritative sources they want entertainment

  • I'm just trying to relocate youtube videos from 7 years ago. They still exist. I have relied on Google's search engine for ages. Search criteria are quite specific. Yet it cannot relocate those videos. Move over to MS Bing - same search criteria. Bingo! No problem.

    Who'd a thunk it? I might make Bing my default search engine when it comes to searching for youtube videos.

  • people like my mom and her boyfriend.

    Some elderly people don't have the wits left to tell nonsense from truth, and the emotional fear mongering of the nuts can make it impossible to get life saving medicine to YouTube's victims.

    All the libertarians and god knows what else in this comment section need to get a sense of perspective. Youtube is literally killing my family!

  • 2009: A squirrel video is posted.
    2010: Nothing
    etc.
    2020: Nothing
    2021: Youtube AI recommends squirrel video to all users resulting in 3.7 million views.

  • It does four things.
    It recommends random old videos.
    It recommends videos trending popular with everyone.
    It recommends several things in line with one of the last 10 videos you viewed.
    It recommends youtube products to you.

    If you watch a covid video, suddenly you have a dozen covid videos.
    If you watch about 10 videos on other subjects, it forgets you ever watched a covid video.

    I found the Netflix algorithm was similarly shallow. I have a *wide* range of tastes. It couldn't keep track of what I liked. So I'
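The four-source behavior the parent lists can be sketched as a naive slate mixer. The source names and the ~10-video memory window come from the comment above; the data, proportions, and topic-tag matching are invented for illustration and are not YouTube's actual mechanism.

```python
import random

def recommend(history, catalog, trending, products, k=12):
    """Fill a recommendation slate from four shallow sources."""
    recent = history[-10:]  # only the last ~10 views are remembered
    slate = []
    slate += random.sample(catalog, k // 4)   # 1. random old videos
    slate += trending[:k // 4]                # 2. trending with everyone
    # 3. several videos in line with ONE recently viewed video's topic
    topic = random.choice(recent)["topic"] if recent else None
    slate += [v for v in catalog if v["topic"] == topic][:k // 4]
    slate += products[:k // 4]                # 4. first-party promos
    return slate[:k]

random.seed(0)
catalog = [{"id": i, "topic": "covid" if i % 3 == 0 else "other"}
           for i in range(30)]
trending = catalog[:3]
products = [{"id": f"yt-{i}", "topic": "youtube-product"} for i in range(3)]

# Watch one covid video and a chunk of the slate follows suit...
slate = recommend([{"id": 99, "topic": "covid"}], catalog, trending, products)

# ...then watch ten videos on another subject and the covid cluster vanishes.
other_history = [{"id": i, "topic": "other"} for i in range(10)]
slate2 = recommend(other_history, catalog, trending, products)
```

The point of the sketch is how little state it needs: swap out the last ten views and the "related" slice flips wholesale, matching the dozen-covid-videos-then-amnesia behavior described above.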

    • You're right, that's what it does.
      But most of all - it focuses on content that can relate to the most profitable ads.

      Netflix, however, was total chaos (at least for me) in the beginning; after 6 years with Netflix, it's actually starting to "get me".
      I tend to watch a lot of 80s movies and shows, and a certain genre of stand up comedy.
      There are also certain actors I like and some I just don't like at all.

      This is where YouTube and Netflix go their separate ways; Netflix actually succeeded after a few years le

      • for me, when I could give 1-5 ratings, Netflix got me. Once it went to "thumbs up/thumbs down", it became painfully unable to get me any more.

        the funny thing about youtube is that I literally don't see or remember them even without an adblocker.
        And *forced* videos that won't let me skip, just make me leave the video.

        I'm probably thinking of getting an adblocker tho because the wasted time (even if I don't recall the ads) is getting to be too much.

  • by bn-7bc ( 909819 )
    This might be a good idea, but I would advise any applicant to think twice. Videos are extremely easy to screw up: bad lighting, bad sound, bad focus, messy background, etc. I don't envy the tenth applicant with a bad video in a day, with the recruiter hating his/her life.
  • I am more concerned about the search mechanism. It is considerably shorter than it used to be and it is no longer possible to find anything in the long tail. I know the approximate names of videos from ten years ago and I can no longer find them using search. You also cannot find any low subscriber content that is being promoted by high subscriber channels. The whole thing has become an amplifier for a very limited set of content creators. You can no longer sort your subscription list so it is even tiresome

  • Maybe it's because people are clicking on them, even if they hate them. I rarely see anything like that in my feed, it's mostly blacksmithing, cooking, geek stuff, wood working, sailing, and Star Trek. And lately I've been seeing videos about road design. Because I find it interesting to learn stuff instead of living vicariously through online personalities that I don't know, don't really care about, and want to share things that should be kept private in my opinion. I really am not interested in some ce
  • Or supermarket tabloids or any other business built on getting people to look at something?

    And let's not forget that the system bases its recommendations largely on prior viewing history, so mine isn't anything like what's described. Yesterday, it was all British comedy, music I listen to and scientists. Unless Monty Python, Neil deGrasse Tyson and Mr. Bungle count as ""bottom-feeding" /low-grade /divisive /disinforming content" [quote edited because the ascii art filters are complete shit], I've no idea

    • by cusco ( 717999 )

      You're lucky, I have no idea why it spits out the recommendations that I get. I have **never** clicked on an 'ancient aliens' or Illuminati video, but there they are! Was watching videos on rooting mulberry plants from cuttings and the next recommendations were rap videos and anti-GMO foods.

      • Because YouTube knows Mulberries are a myth perpetrated by a global conspiracy of gardeners bent on world domination?
  • Actually, YouTube's recommender is perfect. Pretty much every single item it recommends is one that I have watched. Previously.

  • I don't watch politically charged YouTube videos. Yet all of the sudden there were just tons and tons of garbage recommendations for political hot topics. Usually completely opposite of my own personal political beliefs. I spent a few weeks clicking on Don't Recommend This Channel on each one that popped up. Now I don't get them anymore, but admittedly it took awhile to finally clean it all up.
  • YouTube reco engine is crap crap crap. Worst - you can't turn it off...
