Youtube United Kingdom

'A Mistake by YouTube Shows Its Power Over Media' (nytimes.com) 147

"Every hour, YouTube deletes nearly 2,000 channels," reports the New York Times. "The deletions are meant to keep out spam, misinformation, financial scams, nudity, hate speech and other material that it says violates its policies.

"But the rules are opaque and sometimes arbitrarily enforced," they write — and sometimes, YouTube does end up making mistakes. (Alternate URL here...) The gatekeeper role leads to criticism from multiple directions. Many on the right of the political spectrum in the United States and Europe claim that YouTube unfairly blocks them. Some civil society groups say YouTube should do more to stop the spread of illicit content and misinformation... Roughly 500 hours of video are uploaded to YouTube every minute globally in different languages. "It's impossible to get our minds around what it means to try and govern that kind of volume of content," said Evelyn Douek, senior research fellow at the Knight First Amendment Institute at Columbia University. "YouTube is a juggernaut, by some metrics as big or bigger than Facebook."

In its email on Tuesday morning, YouTube said Novara Media [a left-leaning London news group] was guilty of "repeated violations" of YouTube's community guidelines, without elaborating. Novara's staff was left guessing what had caused the problem. YouTube typically has a three-strikes policy before deleting a channel. It had penalized Novara only once before... Novara's last show released before the deletion was about sewage policy, which hardly seemed worthy of YouTube's attention. One of the organization's few previous interactions with YouTube was when the video service sent Novara a silver plaque for reaching 100,000 subscribers...

Staff members worried it had been a coordinated campaign by critics of their coverage to file complaints with YouTube, triggering its software to block their channel, a tactic sometimes used by right-wing groups to go after opponents.... An editor, Gary McQuiggin, filled out YouTube's online appeal form. He then tried using YouTube's online chat bot, speaking with a woman named "Rose," who said, "I know this is important," before the conversation crashed. Angry and frustrated, Novara posted a statement on Twitter and other social media services about the deletion. "We call on YouTube to immediately reinstate our account," it said. The post drew attention in the British press and from members of Parliament.

Within a few hours, Novara's channel had been restored. Later, YouTube said Novara had been mistakenly flagged as spam, without providing further detail.

"We work quickly to review all flagged content," YouTube said in a statement, "but with millions of hours of video uploaded on YouTube every day, on occasion we make the wrong call "

But Ed Procter, chief executive of the Independent Monitor for the Press, told the Times that it was at least the fifth time that a news outlet had material deleted by YouTube, Facebook or Twitter without warning.
This discussion has been archived. No new comments can be posted.


  • decentralization (Score:4, Insightful)

    by phantomfive ( 622387 ) on Saturday October 30, 2021 @11:50AM (#61942207) Journal

    We need options that are not Youtube.

    • by c-A-d ( 77980 )

      Well, you are in luck.

      Bitchute
      Odysee
      GabTV

      There's three right there. Oh, you may not like who started them, but there are alternatives out there.

      • Re:decentralization (Score:5, Informative)

        by fazig ( 2909523 ) on Saturday October 30, 2021 @01:29PM (#61942471)
        Relative to the disinformation you'll find on those, the disinformation you'll find on youtube is positively scientific.
        • Re: (Score:1, Flamebait)

          by Luckyo ( 1726890 )

Freedom comes with responsibility. And if you're a brainlet who lets such simple disinformation destroy your reasoning, you're probably best staying off the internet entirely.

Because as you yourself note, youtube misinformation is far more sophisticated, and that means more dangerous to your ability to reason.

          • by fazig ( 2909523 )
            I was actually saying that the stuff I keep seeing on those platforms is even dumber than what's on youtube.
But sure, by contrast that makes the already very dumb shit on youtube technically more sophisticated. Like comparing a dog turd with fly excrement: one is a lot larger and produced by a more complex organism, and the animal that produces the latter may well have eaten some of the dog turd for the nutrients it still contained. But in the end both are excrement.


            Though of course that's not
            • by Luckyo ( 1726890 )

              >I was actually saying that the stuff I keep seeing on those platforms is even dumber than what's on youtube.

Yes, and that is exactly what I addressed above. Interesting that you failed to parse and comprehend this. Is the damage from the misinformation you are in fact consuming preventing you from doing a simple cognitive parse of the two-liner post above?

              • by fazig ( 2909523 )
Nah, I just smell the Kremlin logic in your phrasing. You probably came to embrace it as making sense to you. To me it doesn't.
                • by Luckyo ( 1726890 )

                  Totally. Hail Putin, the great overlord of above and below. Glory to Arstotzka!

                  • Re: (Score:2, Insightful)

                    by fazig ( 2909523 )
Well, you used the "better by virtue of being trashier" rationale that I have often come to hear from people who regularly watch Russian state media and think of it as highly informative.

                    They believe that what they see on there has less influence on them, because they know it's trashy propaganda.
Unfortunately for them, what they don't realize is that they only believe they can discern the truth from the lies better than with other news sources that work in more subtle ways. And through personal
                    • Re:decentralization (Score:4, Informative)

                      by Luckyo ( 1726890 ) on Saturday October 30, 2021 @05:27PM (#61943125)

I called it correctly above. You are a victim of high-level disinformation. One of the languages I'm fluent in is Russian, and I don't need RT or similar English-language media tailored specifically to those foreign to Russia to see the Russian perspective.

And what you describe as what you think RT does is hilarious, because that is the common CNN-style misinformation version of it. "It's those idiots we oppose that buy RT's misinformation, but you, our viewer, are a smart one. You can see through it. With our help of course. Here's an ad".

What RT actually does is basically the same thing that CNN Russia and BBC Russia do in Russian, which, if I remember correctly, is what the original version was modeled on when it was started. RT offers a Russian take, with Western style and sensibility, on Western issues, with the goal of demonstrating to the target foreigners that their system has critical flaws. They typically employ tactics like emphasizing specific, easily observable flaws and then conflating those flaws with a larger whole, to paint the entirety of the whole with the same ideological brush as the provable smaller flaw.

You may observe the exact same strategy used in partisan bickering within the US, such as CNN vs Fox News entertainment pseudo-newscasts. It's just that those are about the bickering of internal factions within a nation state, rather than the pursuit of interstate interests.

And your view on what RT does? Yeah, that's the CNN pseudo-newscast version. Which is in fact misinformation aimed at reducing the reach of a similar pseudo-newscast competitor doing pretty much the exact thing they're doing. It's just that this isn't the hilarious BS that you can see on bitchute et al. It's high-level, high-production-value misinformation that you need to be actually smart, curious AND able and willing to source your information widely to perceive. It also helps to have a low tolerance for your own cognitive dissonance and the will to take the pain to recalibrate your views when experiencing even a minor one. Missing any single one of the aforementioned criteria will render you vulnerable to it, which is why it works so well on its target groups. Just as their competitors from different ideologies tailor their similar misinformation to their own groups.

I wish I could recommend a book on this, but I can't remember which one it was that helped me understand the basic fundamentals of this mechanism. It's actually a quite well understood style and methodology in the relevant circles and has been for quite a while. The only thing that changed recently is that the internet allowed for Big Data-style collection and collation of "what works and what doesn't" in near real time, which allowed all of the major actors to polish their specific kind of message much closer to perfection compared to what it was before the internet and Big Data. Because without accurate polling of exactly how successful you are for each individual topic, approach used, and the exact efficiency of each of those against specific target groups, you can't really tailor this misinformation well enough.

And that is something that small-scale bitchuters, or even large-scale youtubers, have little access to. This requires data analysts on the back end and narrative-crafting experts on the front end all working in near perfect unison. It requires a group of diverse professionals. It requires something like CNN, Fox News or RT.

And those aren't on the bitchutes of the internet. They're on Youtube, and the biggest ones have priority access to get into people's feeds as pre-selected authoritative sources. Which is why youtube is far worse at misinformation: it deploys it at a scale and efficiency that gets people like you.

                    • by fazig ( 2909523 )
You see, that CNN reply is also quite common from that angle. The "whadabout CNN?", "whadabout BBC?", or, not to omit the equivalent in the country I live in, "whadabout DW?"

The thing is, I don't support or watch CNN or BBC or DW either (or Fox News). I'm also not going to defend them here and become an apologist for them, like you do with RT.
Now I'm not exactly sure what's going on here. Either you're pulling my leg by mimicking pretty much the typical Kremlin-brainwashed Westerner for satire,
                    • by Luckyo ( 1726890 )

                      Then the next question is, how did you get CNN's version of what RT is if you're not watching them? How did they get you?

                    • by fazig ( 2909523 )
                      People I have on Signal and Telegram link me that stuff on occasion. And I tell them to stop linking me that shit.
                      That's when I also get those CNN replies and then ask them to show me where I ever linked to CNN or the likes.
                      Yeah, maybe just a big coincidence that hit a nerve on my side.

                      I stopped watching TV about 20 years ago. I became disillusioned with most of Western media when I graduated from the University about 12 years ago. At the time I was partially interested in Science Journalism and since I
                    • by Luckyo ( 1726890 )

Interesting, I thought that I felt some similarities there before and it seems that's correct. Your story sounds a lot like mine. We even gave up on owning a TV around the same time, we seem to be in a similar age bracket, and we come from a similar background.

                    • by fazig ( 2909523 )
                      The thing is that these methods are seemingly used everywhere, with very little exception.
                      So when someone tells me that they've lost trust in our Western media, I might say, "ok, cool, you started to think for yourself". But when they instead come up with those 'alternative sources' which are believed to be more reliable I will think "I see, you likely only had that realization because someone else told you so and now you're a sucker for their narrative".


                      Regardless of where you look it's almost always a
                    • by Luckyo ( 1726890 )

I agree with most of this. It really makes me wish I remembered which book pointed me toward how propaganda in the news works and how it is crafted, but I read it a long time ago, and I no longer remember what it was called.

But the thing I've found to work quite well is to think of news the way you think of radiolocation systems. There is a way to find the truth that you cannot see: you need to know in what direction each source you can see likely deviates from the truth, and approximately ho

                    • by Luckyo ( 1726890 )

And here's a great example of how generating good clickbait newsbites works. They intentionally strip context, then use juicy out-of-context quote mining to generate short, readable soundbites, and then deliver those with short, hard-hitting commentary that mixes deep sectarianism into it. This is how you turn the truth of "this was what was said" into "this was what was said, how horrible those people are!"

                      Thank you for delivering it mr. troll. It really helps to get a good example of exactly what you're talkin

                    • by Luckyo ( 1726890 )

                      It's ok darling. You already delivered what I asked for. Your services are no longer needed. You may go back to bitching in threads about China not engaging in genocide in Xinjiang, and that US policy killed more people than PRC... somehow.

• Bitchute is far worse than Youtube when it comes to misinformation. Anti-female redpillers, hard-core antivax, child caskets, Qanon disinfo (child sex rings, GMOs cause genetic breakdowns), "holocaust lies exposed"... and this is all just from their FRONT PAGE. I actually don't see anything that isn't mostly insane on there. In my world (I work as an ISSEC compliance analyst under DoD contract and have a clearance) I automatically distrust anything posted to Bitchute, from my own personal "risk assessment
      • by lsllll ( 830002 )
        A search of each of those 3 for "solti beethoven" turned up exactly zero hits. I stopped counting YouTube's. Enough said.
    • Re:decentralization (Score:4, Informative)

      by 50000BTU_barbecue ( 588132 ) on Saturday October 30, 2021 @12:09PM (#61942265) Journal

      Yeah, it's called Vimeo.

• Bet they have power over media as well and aren't shy about demonstrating it. Why don't people simply be honest with themselves and say they don't want others telling them what to do? Doesn't matter whose platform the "other" is or "what" is being prevented.

      • by AmiMoJo ( 196126 )

        And Facebook and TikTok and banned.video. PornHub if that fits your content.

        YouTube is big but far from the only big player in streaming video.

    • by AmiMoJo ( 196126 )

      There are a lot of other video sharing sites.

      • by lsllll ( 830002 )
        And NONE come close to what's on YouTube. I've said in other posts that YouTube would probably be the ONLY service I'd pay for, out of all of Google's services. For everything of theirs that I'm using, there's an alternative, but there is absolutely ZERO for YouTube. Whether I want to search for "Jordan Peterson" or "Solti Beethoven", I'm sure to get the most hits on YouTube.
        • by AmiMoJo ( 196126 )

          Okay, but what is the point you are trying to make? Why is being on a less popular site an issue?

    • by hey! ( 33014 ) on Saturday October 30, 2021 @01:06PM (#61942419) Homepage Journal

You can't have an alternative to YouTube without that alternative becoming YouTube.

      Content creators post to YouTube because that's where the audience is. The audience is on YouTube because that's where the content is. Scale and scope are fundamental to its utility. They are also fundamental to the negative aspects of the platform -- the unmoderatable toxic comment sections, the impossibility of curating TOS-violating content without sweeping in innocent bystanders.

      The only thing left for other platforms is scraps, like education (Coursera, Skillshare), or exploiting ADHD hyperfocus (TikTok). Vimeo exploits the niche of being ad-free, which means it also charges content providers for services, which is probably the way you have to work if you want to avoid most of the problems with YouTube. I suspect this also means Vimeo will always be smaller than YouTube, which doesn't necessarily matter as long as they can make money. After all Harvard has a smaller enrollment (7000) than the University of Central Florida (62,000); smaller scale hasn't hurt Harvard's brand any.

      • by AmiMoJo ( 196126 )

        Freedom of speech does not require anyone to furnish you with a platform to speak from, so what is the argument here?

        Are you going for the "town square" angle that YouTube should be some kind of public forum? That seems problematic because the analogy is a real stretch and to make sense the recommendation system would have to be included. Then there is monetisation, which is even more problematic.

        There is a decent argument for Google integrating YouTube so tightly that it gives them an advantage. That's mon

        • by hey! ( 33014 )

I never said it required anyone to furnish anyone else with anything. Sometimes things which are good in principle have bad effects; just because it's *good* that YouTube can do whatever is good for its profits doesn't mean that what YouTube does is good.

    • We need options that are not Youtube.

      There are options that are not Youtube. The fact that content creators don't use them doesn't mean they don't exist.

  • by gurps_npc ( 621217 ) on Saturday October 30, 2021 @11:57AM (#61942229) Homepage

    That is NOT sufficient.

    It is not good enough to say "Sorry, my bad". People deserve an explanation of what the mistake was.

No one likes to explain the mistake because it is embarrassing. But it is ESSENTIAL to understand what kind of mistake it was. There are 3 basic kinds of mistakes: External, Internal Process, and Internal Decision. External would be a scumbag from outside thinking you are evil despite you not breaking the rules, Internal Process would be something like a misspelled ID, while Internal Decision is someone internal thinking you are evil despite you not breaking the rules.

There is a huge difference between a mistake because your ID was similar to actual infringing content (name, number, or whatever) and a mistake caused by accidentally believing a scumbag trying to hurt you. Because people have the right to sue a scumbag whose lies led to damages against them.

Usually companies do not like to do this because sometimes it is an Internal Decision. They are afraid of being sued. They have a policy of never revealing any mistake so they never have to say, "oops, we hurt you, you can sue us."

    But this prevents you from suing anyone, even people not related to the company.

    • by Gravis Zero ( 934156 ) on Saturday October 30, 2021 @01:06PM (#61942421)

      It is not good enough to say "Sorry, my bad". People deserve an explanation of what the mistake was.

      As much as I despise what Google has morphed into, I have to disagree.

You may not like it but YouTube is shooting at moving targets that are constantly changing. This means that what triggers a filter may change daily, may be part of a neural network (and thus beyond human understanding), and in any case users already agree when signing up that their account may be terminated at YouTube's discretion, justified or otherwise. If these were paid accounts then they might have more rights but as it is, they have none.

I know it's awful and unfair but this is the agreement made as part of "Section 230". They are making a good faith effort and if that is insufficient then you are free to use another platform or serve your own content.

      • I just wish there was some rhyme or reason to it.

On the one hand I had an 8 year old video deleted. The video had 55 views, shows an Andalusian dancing back and forth in the middle of a Spanish bullfighting arena. No actual bulls in the video, but it had bullfighting in the target. Apparently 8 years after being posted, and after being seen by a whole 55 people (probably 10 of which were me showing it to someone), it needed to be nixed as it didn't meet YouTube's strict community guidelines.

        I for one am glad th

        • slip of the tongue "bullfighting in the title" though saying it was in the "target" sort of makes sense too.

        • I just wish there was some rhyme or reason to it.

There is a reason why your video was taken down: it clearly tripped multiple filters.

          On the one hand I had an 8 year old video deleted. The video had 55 views, shows an Andalusian dancing back and forth in the middle of a Spanish bullfighting arena. No actual bulls in the video, but it had bullfighting in the title.

The fact that it got yanked after eight years just means that they are applying content filters to old content as well as new content, with no regard to the number of views.

It could be that it was the full video of the bullfight and was taken down for copyright reasons. It could be that a new neural network identified the dancing horse as being in distress and thus removed it for animal cruelty. It could be some sym

• There is a reason why your video was taken down: it clearly tripped multiple filters.

            And that's kind of my point. The moderation system is dumb and not based on any actual community guidelines, risk or even a normal review.

It could be that it was the full video of the bullfight and was taken down for copyright reasons.

No, DMCA violations show up differently. Mind you, copyright is a thing for American megacorps, not for some dude's mobile phone footage of a horse dancing without music in a bullfighting arena, in a town with the population of a large university, where entry cost 3 EUR

Google has invested a lot into deep learning and I'm pretty sure you've simply discovered that it's not a silver bullet.

            You give Google too much credit. I think it's clear they've invested very little. I mean we are talki

            • I don't believe they should blindly do this without human review

I ran the numbers and, assuming they filter out duplicate uploads automatically, it seems feasible to use human reviewers for all takedowns. Presuming they can filter by language and send each item to an appropriate reviewer, they would need to employ between hundreds and thousands of people per language. It's big, costly and expensive but ultimately it's doable. There is an incentive for human reviewers to prematurely discard a video without cause, so to measure the accuracy of reviewers, you need to have a single takedow
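For what it's worth, a rough back-of-envelope version of that estimate can be sketched in a few lines of Python. Only the ~2,000 channel deletions per hour comes from the article; the per-channel item count, review time, reviewer workday and language count below are made-up placeholders, so treat the output as an order-of-magnitude illustration rather than a real staffing plan.

```python
# Back-of-envelope: how many human reviewers would "review every takedown" need?
# Only CHANNEL_DELETIONS_PER_HOUR comes from the article; everything else is a guess.

CHANNEL_DELETIONS_PER_HOUR = 2_000   # figure quoted in the NYT piece
ITEMS_PER_DELETED_CHANNEL = 5        # assumption: items worth reviewing per deleted channel
REVIEW_MINUTES_PER_ITEM = 10         # assumption: average review time per item
REVIEWER_MINUTES_PER_DAY = 6 * 60    # assumption: productive review minutes per reviewer per day
LANGUAGE_QUEUES = 50                 # assumption: number of language-specific queues

items_per_day = CHANNEL_DELETIONS_PER_HOUR * 24 * ITEMS_PER_DELETED_CHANNEL
review_minutes_per_day = items_per_day * REVIEW_MINUTES_PER_ITEM
reviewers_needed = review_minutes_per_day / REVIEWER_MINUTES_PER_DAY

print(f"items to review per day:      {items_per_day:,}")
print(f"reviewers needed overall:     {reviewers_needed:,.0f}")
print(f"per language (spread evenly): {reviewers_needed / LANGUAGE_QUEUES:,.0f}")
```

With these guesses the total lands in the thousands of reviewers, i.e. in the "big, costly but doable" range described above, and the result scales linearly with whichever assumption you disagree with.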

      • by tlhIngan ( 30335 )

You may not like it but YouTube is shooting at moving targets that are constantly changing. This means that what triggers a filter may change daily, may be part of a neural network (and thus beyond human understanding), and in any case users already agree when signing up that their account may be terminated at YouTube's discretion, justified or otherwise. If these were paid accounts then they might have more rights but as it is, they have none.

        You and I know why YouTube doesn't provide reasons. Because if they did, it w

    • They're a private company, they can do what they want, isn't that what you guys always say when the shoe is on the other foot?

      What, you thought you could trust a giant corporation?

• When they lose US Section 230 protection, I'll agree with you. But as it stands, they are pruning content and acting as a publisher that curates the content, not simply a moderator. If the rules are opaque, then it's not moderation, it's publishing. You want to claim it's moderation of content? Then the rules need to be clear and the infractions transparently communicated. And as it stands, they SHOULD be held liable for silencing people. The internet promised freedom, not more of the same structures that
• I don't think that they should "lose 230 protections", but the whole idea of 230 needs to be re-worked to take into account the differences between "user content and curation" vs "machine intelligence curation" (or what people call AI, but everyone here knows this isn't even close to a real AI yet LOL). 230 assumes that there is too much content for "just humans" to catch everything; but now that they (YouTube, Facebook, etc) are using massive amounts of code-based curation then that specifically shoul
        • You are drawing a distinction that does not exist in the law. I recommend reading section 230. It is short and clearly written.

    • by rsilvergun ( 571051 ) on Saturday October 30, 2021 @01:21PM (#61942449)
      they don't tell people because then folks will learn how to get around the automated systems, and YouTube can't police the large amount of content they have without that automation.

      I suspect also they don't want to give competitors insight into their algorithm, though that's probably coming more from management than anyone else, but their algorithm isn't anything to brag about (not counting their speech recognition, which somehow managed to Close Caption The Smengis [youtube.com]).
• I honestly don't know if the average YouTube employee has any idea HOW to get around their own automated system. At this point that system has such a giant dataset to pull from that it's a statistical guessing game to get around it. I doubt there is more than a handful of people who understand that algorithm in its entirety.
    • by Mitreya ( 579078 )

      It is not good enough to say "Sorry, my bad". People deserve an explanation of what the mistake was.

Heh. I agree, but I would settle for making sure that the error is rectified.
How many people get a "Sorry, my bad" response with a fix, and how many people just can't contact anyone and are simply out of luck?

    • You live in ideal times, which were really not that long ago.

These days, we are right at the point where a YT rep can walk up to you, kick you in the nuts, and get away with it scot-free.

Unsorry, you have to take what they dish out, and be eternally grateful they even let you view the site. This goes for all major companies.

Since people became complacent and allowed the resources of the world to end up under the control of the few, this is what they can expect, and should've expected even before things got the

    • That is NOT sufficient.

Don't use the service.

      PROBLEM SOLVED.

    • The "mistake" is just what the article stated. If youtube receives enough complaints it drops a channel, comment etc. It is their way of handling the incredible volume of channels and information. In their view, and maybe in order to comply with laws, it's better to drop channels undeservedly than to leave a channel that is in breach of decency laws up. Left wing extremists use this tactic as well to drop content they don't agree with, it isn't a tactic primarily used by the right anymore. The left does not

  • Can't these media companies host their own content or otherwise pay for a CDN? Surely they can get people to come visit their site directly, etc. I know BBC has video (The Reel) so it's definitely possible.

Sounds like these media companies are just becoming victims of DMCA/copyright-style stuff and Youtube is clearly trying to stay compliant with that. The irony that media companies are being affected by such is pretty funny actually. Their accounts are no more special than any other "news source" on youtub

    • Comment removed based on user account deletion
    • by q_e_t ( 5104099 )

      Can't these media companies host their own content or otherwise pay for a CDN?

      They can, and do, but the issue is discoverability. If everything is on YouTube then you risk being invisible if you aren't also on YouTube. YouTube isn't going to say "If you liked this, then go and watch this other thing on a competing platform".

  • it's funny (Score:4, Insightful)

    by Anonymous Coward on Saturday October 30, 2021 @12:11PM (#61942273)
This makes the news only because it was a left-leaning channel that got deleted.
Once again proving the bias in the tech industry towards the political left. Very deliberate.
Just like this convenient Facebook whistleblower. Contrast it with stuff like Witness K in Australia and countless other whistleblowers that have been cancelled, deleted, punished, prosecuted and even drugged by anti-terror police units here in my own home country of Australia (an executive of Origin Energy, Fiona Wilson). So very, very convenient.
• Not the kind of left that the censors like. Among other things, Novara Media had recently covered the Belmarsh Tribunal, a big display of condemnation of the war crimes Julian Assange is in Belmarsh for exposing, with a ton of political leaders and famous people. (I'm not surprised if you haven't heard of it - it got virtually no mainstream news coverage).

      And Assange just had another extradition hearing. It's probably in expectation of coverage of that that the channel got taken down/got a reporting campaign

  • by Dog-Cow ( 21281 ) on Saturday October 30, 2021 @12:16PM (#61942291)

    It is unreasonable to the point of being irrational to expect YT to do more policing of dangerous content while also not expecting them to make mistakes. Whether the filtering is done by computers or humans, mistakes will be made. The only thing YT might improve is the resolution process when someone has a valid complaint against deletion.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      It's reasonable to expect them to document the reasons for their decisions. Not by checking a box. By writing paragraphs saying things like "at 27:49 in video A, you say X, which violates policy Y in ways Z and W. Furthermore, when taken in the context established by video B at 13:24, it establishes a pattern of violations because of the repeated references to V".

      It's reasonable to expect them to provide all of that to the user. In every single solitary case.

      And when the appeal comes in, it's reasonable to

• Yes, I understand it is easy for a start-up to use Youtube for publishing their news.
    But when you run a legit news outlet it looks like a bad idea to use an outside site like Youtube to publish important news.
They should run their own server for these videos.

Some of the worst abuses are government politicians using, for example, Twitter and Facebook to announce policies.
As if their government doesn't have its own website and server(s).
    • But when you run a legit news outlet it looks like a bad idea to use an outside site like Youtube to publish important news.

      Your comment is too late, every legit news outlet uses Youtube. It's where the audience is.

  • by boundary ( 1226600 ) on Saturday October 30, 2021 @12:25PM (#61942317)
    Gary McQuiggin, Novara’s head of video, tweeted in 2020: 'It’s not censorship when a private company decides to remove you from it’s [sic] platform.’ And on Tuesday he tweeted: ‘Whether or not you agree with what we publish, it shouldn’t be the whim of giant tech companies to delete us overnight with no explanation’.
    • by Gravis Zero ( 934156 ) on Saturday October 30, 2021 @01:12PM (#61942427)

      I agree... however, I think this could also be resolved by being able to request a human review with a simple $100 deposit. If you pass the human-based review then you get the money back, if not then they keep it.

      • Why the $100 deposit? Is Alphabet so strapped for cash that we need to give them monetary incentives to moderate in a fair way?

        • Why the $100 deposit?

Without a monetary deposit, spammers could simply overload their human reviewers with more spam than they can process (even with tens of millions of human reviewers). Hell, that's why they use algorithms in the first place.

      • A good idea on paper, but would that not just incentivize the appeals process to lean heavily towards the side of false positives? What's the downside for them to just refuse the vast majority of claims, which would lead to the same situation we're in right now?
• I doubt it because only a fool would put $100 up for an appeal they know they will lose. It only makes sense that everyone filing an appeal would be doing so in good faith. They make a LOT of money from advertisers, which means willfully ruining their own reputation from people appealing in good faith would cost them far more money than the scraps they would collect from the appeals process. It's one thing to accidentally harm someone, but when done willfully for profit, that's how you make a mortal enemy

          • Thing is, if YouTube were to actually implement such a system, you can bet that it will have zero transparency so that it'll be impossible for anyone to actually see whether or not they're doing their due diligence in the appeals process. It'll essentially be a game of 'he said, she said'. The only way any misbehavior on YouTube's part could possibly come to light is if an extremely massive, influential content creator gets hit with it and makes a big stink on social media and/or they get taken to court.
    • by Anonymous Coward on Saturday October 30, 2021 @01:57PM (#61942523)
      There's no contradiction. The first tweet supported a company being allowed to set its own policy for content. The second essentially complained about a company's customer service.
    • by quall ( 1441799 )

Hilarious example of the left eating itself. They love it when their opponents' "rights" are trampled on, but when they get sucked into the same hole it becomes a different story... It's practically a cliche at this point. Too dense to understand that they too must follow the same rules, well, unless you're on Twitter.

• Very surprised they would let this happen to higher-profile accounts so often now, simply because it represents such a direct threat to their base revenue stream. The safe bet would seem to be that it was due to an algorithmic change problem vs. the right thing to do blah blah (algo meaning either some actual flagging code or a review policy change or both). So I suspect the folks that let this happen will get a spanking internally and roll back to the safe zone.

    Btw, here's another high(er) profile account tha

    • These ad networks have between 10%-30% ad click fraud, which means someone built a channel, and has robots watching movies so they can make money.

      That means there must be some very popular channels that literally no one is watching.

  • If YouTube is unable to handle the volume of video it has to deal with, then the company may not be properly managed. Or the company may have incorrect goals. Either way, the problem youtube faces should not be blamed upon the volume of videos it processes.
  • "We work quickly to review all flagged content," YouTube said in a statement, "but with millions of hours of video uploaded on YouTube every day, on occasion we make the wrong call "

    If you're so "big" that you can't handle the content in an appropriate manner, then perhaps it's an indication that you're "too big" and "will fail".

  • accountability (Score:4, Interesting)

    by skogs ( 628589 ) on Saturday October 30, 2021 @12:50PM (#61942381) Journal

    "coordinated campaign by critics ... file complaints with YouTube,... a tactic sometimes used by right-wing groups to go after opponents"

    Really? They really think that 'coordinated' spammers of complaint buttons are restricted to a single ideology?

Pretty sure this is everybody that doesn't understand that it doesn't matter whether what they see online is something they like or not. People speak their moronic opinions on the interwebs (this comment included) and that is their right. They pay for any consequences of that posting. These ass clowns that complain face no repercussions for lying about their actions and flagging something as copyrighted or infringing somehow. They just click wildly with no accountability.

    I got an idea - you want to be a stupid troll on the internet, in order to complain about a thing somebody else posts - your system takes a quick picture of you. I think the anti-bad-post squad would get to know a few faces and then be able to sensibly ignore their stupidity....just like a normal workplace where you ignore the imbecile.

  • of left wing channels, it seems to happen a lot to them. Like, a lot. [youtube.com]
    • I'm a left-winger as well, although I watch quite a number of centrist and slight-right channels in addition to the usual left-wing content. I've seen it happen on channels of all types. Much of the content I watch has to refer to topics using code because the number of words that YouTube finds offensive has grown so much and the list is completely opaque. The only way I can describe the feeling of watching these videos is that it's reminiscent of when I sat at the "kiddie" table growing up and we wanted
  • It isn't the only video site on the web

• It is the only one with free university-level lectures, seminars and talks aimed at the general public. Not everything is cat videos or politics or tap-dancing influencers or sport or music. YouTube is actually fabulous, and ad-free for free, if you want to learn.

  • by argStyopa ( 232550 ) on Saturday October 30, 2021 @02:28PM (#61942587) Journal

    ...first, haven't we just heard for about the last 10 months how it's PERFECTLY LEGITIMATE and not censorship at all for a private company (Twitter) to de-platform anyone (Trump) for any reason, after all, as we've been repeatedly, insistently told: The first amendment is only about GOVERNMENT censorship.

    Right? Or did I misunderstand something?

    "... a tactic sometimes used by right-wing groups to go after opponents..." just right wing groups, huh? Vox didn't deplatform Steven Crowder then, or is Vox "right wing" now?

    • by haruchai ( 17472 )

      "haven't we just heard for about the last 10 months how it's PERFECTLY LEGITIMATE and not censorship at all for a private company (Twitter) to de-platform anyone (Trump) for any reason,"

      How long was Trump on Twitter & how many times did he or staff using his account violate Twitter's TOS before any restrictions were enforced?

      • So is it fine for a private company to deplatform someone or not?
        Don't be a cowardly hypocrite, answer.

• If the algorithm they are using to determine what should be blocked gives a simple yes/no response, it is simply inadequate for the task. If they use supervised learning, the training data could be annotated with which part of the content policy is violated. Unless their training data is based on somebody's gut feeling, they should in principle be able to determine what is being violated by a channel. When in doubt, a channel can be flagged to be reviewed by a human (if the channel is important enough). Otherwise,
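In practice that idea boils down to something like the following sketch (Python, with every policy label, score and threshold invented purely for illustration): per-policy scores instead of a single yes/no, confident violations blocked with the offending policy named, and borderline cases routed to a human.

```python
# Sketch: per-policy scores instead of a bare yes/no block decision.
# Policy names, thresholds and example scores are all hypothetical placeholders.

from dataclasses import dataclass

BLOCK_THRESHOLD = 0.90    # confident violation: block and report which policy
REVIEW_THRESHOLD = 0.60   # uncertain: escalate to a human reviewer

@dataclass
class Decision:
    action: str                 # "allow", "block" or "human_review"
    violated_policies: list     # policies that crossed the block threshold

def decide(policy_scores: dict) -> Decision:
    """Turn per-policy model scores (0..1) into an explainable decision."""
    violated = [p for p, s in policy_scores.items() if s >= BLOCK_THRESHOLD]
    if violated:
        return Decision("block", violated)
    if any(s >= REVIEW_THRESHOLD for s in policy_scores.values()):
        return Decision("human_review", [])
    return Decision("allow", [])

# Hypothetical scores for one channel's recent uploads.
scores = {"spam": 0.72, "hate_speech": 0.05, "nudity": 0.01,
          "misinformation": 0.10, "scam": 0.08}
print(decide(scores))  # -> Decision(action='human_review', violated_policies=[])
```

A block produced this way automatically carries the names of the policies that tripped it, which is the kind of explanation people upthread are asking YouTube to provide.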
• Whatever one's opinion of the heavy-handedness of youtube's censorship processes, it's correct and important that the same procedure is used on "approved" media companies. If it's justifiable in general, the amount of harm done will be acceptable, but if it can't be justified in other cases, giving channels with large reach special treatment will obscure the problems.

    Unfortunately this complaint is really a specific case, and we've already seen situations where recognized media companies get special permiss

• Well, it's actually unfair to those individuals who are actually legitimate. Youtube should do better when it comes to something like this, because it would be very frustrating for someone who trusted the platform and then got his/her channel deleted. I understand there are rules to follow, but they should take action on this. Maybe they should include or amend something in their algorithm: flag the channel first, and then have a detailed (human-touched) review. - https://carmela [wixsite.com]
