YouTube, the Great Radicalizer (nytimes.com) 214
Zeynep Tufekci, writing for the New York Times: Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with. Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons. It seems as if you are never "hard core" enough for YouTube's recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.
This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google's business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes. What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with -- or to incendiary content in general. Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot. Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.
Just Similar Topics (Score:2, Interesting)
Re:Just Similar Topics (Score:5, Interesting)
Have you ever shopped for something, only to have Google spam you with ads for the thing you bought for weeks on end after you've already bought it?
Considering how often this happens, it would be helpful if the AdSense/AdWords feedback form had an option: "I already bought this item and am not currently shopping for another one."
Re:Just Similar Topics (Score:5, Insightful)
Oh that's awesome, I vote for the "I already bought everything and I'm not ever shopping for anything else in my life" option. I just use adblock software, that's more effective.
Re: (Score:2)
Yeah, whatever ad network I see all the time only ever suggests items I already bought on Amazon or BHPhoto.
"Would you like 3 more Olympus cameras?" No, not really, I am happy with the one I got.
Re: (Score:2)
The content creators also escalate things themselves. In a sea of videos about running, if you want to be noticed, the easiest way is to run an extreme ultramarathon with a thumbnail of yourself being carried away on a stretcher.
TV news is the same, printed news is the same.
Also, conspiracy theories about secret government organizations and 9/11 are "leftish" now? Seems like the article author has some bullshit to peddle.
Re:Just Similar Topics (Score:4, Interesting)
if you want to be noticed the easiest way is to do an extreme ultra-marathon with a thumbnail of yourself being carried away on a stretcher.
Since I started using the Video Blocker add-on [mozilla.org], that kind of click-baiting just gets the entire channel blocklisted.
You know that thing where you watch a video on a topic and lunatics start appearing in your recommended list? Now they show up only once.
Re: (Score:2)
Clickbait sells.
Comment removed (Score:5, Insightful)
Hardly new to internet : TV does it already (Score:5, Informative)
It's hardly new to the internet:
it's been discussed for quite some time that TV channels, especially their news programs, tend to over-dramatize the news and cast a dramatically negative light on the world.
Basically, creating emotion in the viewer increases their attention.
This works even better with negative emotions (fear, etc.).
That attracts more eyeballs to your channel,
giving you more opportunity to sell those attentive eyeballs to advertisers and thus increase your revenue stream.
In Europe, this is more prominent on private channels (mostly paid for by ads) than on public channels (partially paid for by taxes).
The thing is that, in practice, this has been proven to have an actual effect:
in Europe, watching TV, and watching the news even more so, has been linked to an increased feeling of "insecurity, danger, etc."
This is despite the situation in Europe being much better and safer than before:
crime rates are globally decreasing, but TV reporting of crime is on the rise.
The neural-network AI YouTube uses to generate recommendations has simply rediscovered, on its own, the same result that was already found for TV:
increase the viewer's emotional response by showing more extreme videos, and you capture their attention and can thus sell more ads.
The AI doesn't even really have an actual concept of "emotional response" or "increased attention". The AI only notices that after recommending some video, revenue increases.
If video B is shown after video A: more retention, increased ad revenue. If video C is then shown after video B: more again.
The AI then remembers to use the chain A -> B -> C, because that's what increases the parameter it is optimizing for: eyeball time sellable to advertisers.
But given what we know from studies of old-school TV, that will eventually mean showing more and more extreme videos, because that's what has been proven to work on human brains. The AI has simply "independently rediscovered" this fact.
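That chain-building dynamic can be sketched as a toy simulation (a hypothetical model, not YouTube's actual system): assume, per the TV research above, that expected watch time rises with a video's "intensity", and let a recommender greedily follow whichever adjacent video has the best observed watch time.

```python
import random

random.seed(42)

# Toy assumption (from the TV-research claim above, not measured data):
# videos have an "intensity" 0..9 and expected watch time rises with it.
def watch_time(intensity):
    return intensity + random.random()  # noisy, but increasing in intensity

history = {i: [] for i in range(10)}  # observed watch times per video

def score(i):
    # Optimistic default so unseen videos get tried once.
    return sum(history[i]) / len(history[i]) if history[i] else 10.0

current = 0  # start with the mildest content
history[current].append(watch_time(current))
trajectory = [current]
for _ in range(50):
    # Candidates: a notch milder, the same, or a notch more extreme.
    candidates = sorted({max(current - 1, 0), current, min(current + 1, 9)})
    nxt = max(candidates, key=score)  # greedy on observed watch time
    history[nxt].append(watch_time(nxt))
    current = nxt
    trajectory.append(current)

print("intensity path:", trajectory)
# The greedy loop climbs straight to the most extreme content and stays there.
```

Replace the greedy rule with uniform random picks and the drift disappears, which is the point: the escalation is an emergent property of maximizing watch time, not an explicit goal anyone programmed in.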
----
For shits and giggles, here's a source that's on YouTube itself [youtu.be] (and it even parodies the usual style of conspiracy videos). Sorry, it's in French, but it has English CC.
Re: (Score:2)
Aren't you talking rubbish? (Score:5, Insightful)
For any given video, it will recommend a range of other similar videos which by definition must be a bit more radical or a bit less radical. If you keep clicking the more radical ones, of course you will slowly gravitate up the radical tree. How could it be otherwise? I don't consider it radicalising, it's just providing information. For this topic, this is how radical you can go, this is how far you can take it. Anyone interested in a topic enough to keep watching videos is sooner or later going to want to know, how far can I take this? And YouTube has the answer for you.
Re: (Score:3)
I used to watch 911 videos (not many
As I now mostly watch music videos (and nerdy guitar videos like Guitar Of The Day), I get recommended (you guessed it...) more guitar videos and music videos.
And for some reason I get recommended Samantha Fish videos. Ah, that may be because I have watched several videos with her. So now I am becoming a radicalized Samantha Fish
Re: (Score:3)
It's less that "the algorithm" has figured out that radical content is what glues people to the screen most effectively, and more that its literal job is to recommend content that's as interesting and engaging as possible for each given user. The actual contents of the video are completely beyond its understanding; all it sees are the topics in the tags, how long people view the video, how many of them do, and the likes and dislikes. If you're int
Re: (Score:2)
People are drawn to the radical content not because they want to believe, but just for fun. What is really fucking annoying with YouTube: you look at a couple just for a lark, and it starts feeding you nothing but that. Like most people, you look at those crazy stories just for a lark; most of the corporate mainstream media is just boring corporate propaganda, repeating the same shit over and over again, finally giving up and starting on the next round of bullshit
Re: (Score:2)
Those that believe already have a screw loose, and believing crazy stories is part of their existing condition and will not make them worse, just send them down one particular rabbit hole (Russia, Russia, Russia) or another (identity politics, self-identifying as whatever flips into your head, be it a 200-year-old female traffic light or a 6-month-old male teapot cosy, and deciding that hacking away at your genitals is a great idea).
Can't tell if this is parody, or a cry for help.
Re:Aren't you talking rubbish? (Score:5, Insightful)
The thing is, it doesn't need to take you to more extreme content in order to give you more info. The "autoplay" after a conspiracy theory video could just as easily be another video debunking it.
However, the algorithm has determined that people are more interested in (in other words, more likely to watch) something more extreme than what they've just watched.
The intent isn't nefarious, but the overall effect is that you emerge knowing less about the topic than when you started.
Re: (Score:3)
The algorithm isn't nearly clever enough to recognize the content of a conspiracy theory and a matching debunking video. Instead it goes off things like what videos are linked to (back/forward links), what people who liked/disliked it also enjoy, channels that it thinks are associated with that one somehow...
The only solutions seem to be really strong AI or human intervention.
Re: (Score:2)
True. I'm not suggesting that the solution would be easy. Just that the problem is legit.
Keep in mind, though, that the building blocks for such an algorithm are there. YouTube does automatic closed-captioning, for example, so they could easily do deeper analysis of the video transcript.
Re: (Score:2)
That's true, their auto subtitles have got a lot better in the last few years.
Re: (Score:3)
Comment removed (Score:5, Informative)
Re: Aren't you talking rubbish? (Score:2)
Extreme: Proven to work (on TV) (Score:4, Interesting)
For any given video, it will recommend a range of other similar videos which by definition must be a bit more radical or a bit less radical. If you keep clicking the more radical ones, of course you will slowly gravitate up the radical tree. How could it be otherwise?
It goes a tiny bit further:
- It's already been studied and proven (on old-school TV) that more extreme content (especially more frightening content) increases viewer engagement.
- Engaged viewers bring more revenue, since their eyeballs are sold to advertisers.
- (This happens even more on private channels than on public TV.)
- Thus TV channels, especially newscasts, tend to gravitate toward more extreme content.
The neural-net AI behind YouTube's recommendations has simply "independently rediscovered" what was already established for old-school TV.
(While being probably even less aware of it: during A/B tests, the algorithm only notices that the videos on list A tend to increase viewer retention compared to list B, and thus maximize ad exposure and revenue stream. It just happens that the videos on list A are the more extreme ones, given what we already know of human psychology from past TV studies. The algorithm will eventually, automatically, build a chain of recommendations of increasing extremity, because that's what works best for the metric it is trying to maximize.)
The sad thing is that this has also been proven to increase viewers' feeling of insecurity.
So, yes, initially the YouTube algorithm will show a variety of similarly themed video recommendations, some of which "must be a bit more radical or a bit less radical". But eventually some of these recommendations will prove more popular, and YouTube will learn to show them more. Due to how the human psyche works, those more successful videos will, a bit more frequently, be the more radical ones. And thus YouTube learns to show the radical ones a bit more often (without even having a notion of what "radical" is, only that they are successful). And again, sadly due to how the human psyche works, this will have a negative impact on viewers.
Re: (Score:2)
"Don't be evil" (Score:2)
"Sex sells! Blood Leads!!!!!"
Uh... google?
Re: (Score:2)
FTFS:
The longer people stay on YouTube, the more money Google makes.
So it makes the most sense for YouTube to activate the addiction receptors in viewers' brains. I'm guessing that YouTube employs neurologists specializing in addiction disorders to tweak the algorithm.
And we all thought Big Tobacco was bad, for trying to produce an even more addictive product.
Re: (Score:3)
It still works. Don't be evil. Let Google dissect your life and sell you slice by slice.
In case you haven't noticed, "Don't be evil" is directed from the speaker as a request to the recipient. Otherwise it would be "We won't be evil".
Re: (Score:2)
A symptom of a deeper problem (Score:3, Insightful)
Re: (Score:2)
YouTube uses some interesting metrics to measure engagement. It looks at how long each viewer watched the video, what proportion of it they watched (so as not to favour longer videos), whether they clicked on any embedded links, whether they re-watched any parts, etc.
The problem is that it is really easy to game. Channels that have a lot of followers can get them all to play the video in the background over and over, and the ones that are politically motivated will treat it as their job.
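As a concrete illustration of how gameable such metrics are, here's a toy engagement score combining the signals mentioned above (the formula and weights are invented for illustration, not YouTube's actual ranking function):

```python
# Hypothetical engagement score; the weights and formula are made up
# for illustration and are not YouTube's real one.
def engagement(watch_seconds, video_seconds, rewatches, link_clicks):
    # Use the fraction watched so longer videos aren't automatically favored.
    fraction = min(watch_seconds / video_seconds, 1.0)
    return fraction + 0.5 * rewatches + 0.2 * link_clicks

# An honest viewer who watched 2 minutes of a 5-minute video:
honest = engagement(watch_seconds=120, video_seconds=300, rewatches=0, link_clicks=0)

# A motivated follower looping the full video in a background tab:
gamed = engagement(watch_seconds=300, video_seconds=300, rewatches=20, link_clicks=0)

print(honest, gamed)  # 0.4 11.0
```

Twenty background replays dwarf the honest signal, so a channel that can mobilize even a small fraction of its followers this way will outrank organically watched videos.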
Re: (Score:2)
Highly depends on usage... (Score:3)
Much like any other social networks on the Internet, it all depends on how you use the platform.
I personally have a fixed set of subscribed channels that I watch every day. The recommendations I get from YouTube usually go around the themes of the channels I'm already subscribed to.
So there's a bunch of science, tech, and (currently) trip-to-Japan channels on the recommendation list. Nothing radical or extreme at all.
But of course, if you already watch and follow a bunch of videos and channels that are always about sensitive or hot button topics, the algorithm will suggest popular videos with similar themes, which will eventually end up in radicalized content. It's only logical that it'd end up that way, since it'll always try to show you popular videos.
On the trending list there's always a bunch of shit recommendations that are mostly sensationalistic in nature, but I won't ever touch that crap with a 10 foot pole, so no harm done.
But you gotta see that this isn't unique to YouTube. Facebook and Twitter do the same crap. Facebook has suggested pages, people and posts, their annoying suggested news feed order that always put the crap on top, plus a bunch of other stuff that always suggest crap to you if you don't take good care of the content you keep on the news feed.
Twitter has the cancerous trending crap, plus that Moments page that is always littered with garbage. Perhaps neither are as radicalizing as YouTube, but it probably depends on how you use those platforms.
The problem with all of those is not how the platforms themselves work... it's the people. The users. The ignorant masses that are always posting, and then feeding on and watching all this crap. It's a popularity contest, and popular shit is often the worst.
you get what you want (Score:3, Interesting)
YouTube basically just recommends the videos that other people watched frequently (and probably uses some more statistics, like how long they watched, whether they commented, what other videos they watched, etc.). And of course, which videos you watched yourself. The YouTube algorithm simply gave this journalist what she was apparently looking for, the same as it does for most people on the internet. Don't blame YouTube for people's lust for the extreme, crazy, and stupid.
Anecdotal evidence galore (Score:5, Insightful)
Certainly matches my own experiences on YouTube, though I think it's not purely the EVIL of the google that's driving it. I'm convinced that there are also trolls who are loving the chaos and who are strategically promoting their videos to be linked from opposition videos. Less annoying but similar to the original article are extremists who are also involved in strategic promotion of their videos to viewers of other videos that they regard as sympathetic.
However, as gawdawful as the EVIL google and the most EVIL YouTube have become, I'm convinced that Facebook and Twitter are worse. Much worse.
And yet all of these problems could be greatly reduced by the use of EPR (Earned Public Reputation) to gently filter in favor of nice folks. The trolls and other villains can be nudged back under their invisible rocks to amuse themselves and play with the few people who enjoy that form of slumming. I have much better uses for my time.
Re: (Score:2)
Citation required
Re: (Score:2)
Many of my posts discuss the problems of corporate cancerism. Feel free to look them over.
Do you perhaps have any evidence that the google is not a cancer?
Re: (Score:2)
Boo! Poor quality post, 1 star. Better be nice to me or ill one star you again! [imdb.com]
Re: (Score:2)
If that was a bid for a funny mod, you didn't deserve any, and so far you haven't gotten any. That anecdote does not prove the Slashdot moderation system is working well.
However, in EPR terms, I think it might deserve negative votes on the "polite" or "thoughtful" dimensions.
Re: (Score:2)
That's a great idea. Rather than having people who agree with you always show up, just see nice people who may have a different viewpoint. That's sort of like the Interesting mod on Slashdot. I use it to signify: I don't agree with you, but you bring up a good point.
If Google is curating thought, why don't they use that power for good? Oh wait... I forgot, their only interest is getting more eyeballs to look at ads. Never mind.
Re: (Score:2)
That's very close to the heart of the problem, and I wish I had an insightful mod point for you (even though that's a fuzzy and almost meaningless dimension). I have actually read that the google's secret reputation vector for each identity has around 700 dimensions. To a certain degree, I think that EPR is a kind of return sharing of the information that the google (including YouTube) is already collecting about each of us, but I think it should be limited to the intentionally public information. In other
Re: (Score:2)
Unfortunately the politicians (especially the Bolshevik Republicans) feel differently. Extremely differently. In accord with their bribes, they write the laws to permit the soulless and inhuman corporations to own OUR privacy. Minor and limited exceptions if you can afford enough lawyers.
That ah, should be democrats. They're the big money takers from Google, Facebook, Twitter, various "technology" groups that are pushing that allowance of corporations to own your privacy. Maybe you should spend a bit of time looking at who's donating to who? Spend a few hours reading opensecrets.org [opensecrets.org] for example.
Re: (Score:2, Insightful)
There is a lot of money to be made from this kind of trolling, or simply producing low quality poorly researched videos. The textbook example is Carl Benjamin, aka Sargon of Akkad. He makes a small fortune off YouTube and Patreon, mostly producing videos of himself reacting to headlines and articles that he didn't read. Imagine Slashdot comments in video form.
Yes yes, we know. You hate people who are actual classical liberals because they clash with your progressive values. Those are the same values you parrot, which have turned the UK into a police state. And the same values which, "when you move", you want to start implementing somewhere new, starting the cycle of censorship and oppression all over again. What made Sargon popular was his "this week in stupid" videos; the media itself publishes those stories. By all means, if he's wrong on other subjec
Re: (Score:2)
Re: (Score:2)
Sorry, precious, for typing in a rush. Next time I'll do so, for your exacting standards. But it looks like you're another identitarian who is unable to see whether something stands or sinks on its own merits, and instead has to declare it garbage because of the people who are promoting it. How very progressive of you. Let me guess, the latest uncovering of a massive rape gang in the UK, which trafficked, sold, and used young girls as meat, is fake because the only sites covering it are right-leaning.
Re: (Score:2)
If you're right, you'll end up famous
Citation required. Anti-citation provided [slashdot.org].
Re: (Score:2)
Well, that's all based on their ability to sell their point, isn't it? Besides, we're talking about YouTube, not Twitter. Remember that the platform isn't limited to a small number of characters.
Re: (Score:2)
IF (and that's a gigantic "IF") Slashdot implemented a proper system of EPR (Earned Public Reputation), then it should be possible to see at a glance what sort of person you're dealing with, at least clearly enough to decide if you want to spend any time reading what that person wrote. More to the point, I would want to use such a system to render invisible most of the people who are just likely to be wasting my time.
Just for clarity, let me pick the easy dimension of "funny". I actually think that dimensio
Re: (Score:2)
IF (and that's a gigantic "IF") Slashdot implemented a proper system of EPR (Earned Public Reputation), then it should be possible to see at a glance what sort of person you're dealing with, at least clearly enough to decide if you want to spend any time reading what that person wrote. More to the point, I would want to use such a system to render invisible most of the people who are just likely to be wasting my time.
You'd fit in perfectly in China with their "perfect citizen" social media scoring. I'm not sure what's worse: that someone on /. thinks this is a viable answer, or that they're likely American and think this is a viable answer. Nice authoritarian streak.
Re: (Score:2)
Mashiki, I've decided to stop responding to posts like this because they are all the same. You tell me what I think, but it's always wrong. You claim I want a police state, no matter how often I post opposing a police state. Then you accuse me of being a creep, and go off on some random and easily disproven conspiracy theories... And no matter what I post as a response, you ignore it anyway.
Sorry, I don't tell you what you think. I take what you say at face value. You simply don't like it when you're confronted with your own actions that cause a negative effect on society. You believe you're doing good, but your own actions, views, and beliefs are inherently self-destructive. You've openly said that UK laws restricting speech don't go far enough, and you want to go to another country that supports that or goes even further. You've said that blasphemy laws should exist. You've made commen
Re: (Score:2)
Dude you're retarded. Stop posting and embarrassing yourself. Sargon is trash, most of the world except the gullible idiots knows that.
Starts out with adhom, descends into something where they don't disprove anything...at all.
provably fake
https://www.rotherham.gov.uk/d... [rotherham.gov.uk]
We have arrest records from the folks who tried reporting the "mass rapes" and the police arrested them for filing false reports.
So which is it? The media, independent inquiries aren't telling the truth. Or, that there's been what? 12? 15 different cities with the same thing happening over and over again, for decades at a time. And the police have arrested people for what later comes out to be true, which smells like a coverup. Now a coverup, we do know that happened as well don't we, the investigations by
Isn't this normal? (Score:2)
Human curiosity draws us to new and exciting things. I'm not sure how riding a tricycle would interest anyone who already rides a grown-up bike on a daily basis.
This all seems kinda logical. If you're interested in topics outside of mainstream political view, there is no other choice than show you conspiracy theories... because every theory outside of mainstream has the potential to be a conspiracy theory until it's proven. I would go as far as to say that it is a pretty good guess that you'll find a cons
It has nothing to do with your initial search (Score:5, Insightful)
It doesn't matter if you started off with an innocuous video or an extreme one, the suggested videos will lead to the same place eventually. YouTube's algorithm places popular videos in a genre high up in the recommended videos results. Likely because a crowd attracts a crowd. More extreme videos are always more popular by numbers, because clickbait.
This is a side effect of crowdsourcing their ratings by going on views, and perhaps secondarily how many people 'liked' or 'disliked' a certain video.
In any given genre (books, TV, movies, YouTube) the most popular item by numbers is the item that has the broadest appeal. The one with the broadest appeal is usually on the lower end of the intelligence scale. As with cinema, intelligent, artful pieces are typically relegated to small audiences, with the occasional oscar-bait breakout.
So if you put together a system whereby the most popular videos are suggested first, the feedback loop described in the article will happen. The only way out of that is to hand curate the algorithm. And that's the very thing that NO large scale tech company wants to do. The moment they stop automating everything possible is the moment scaling becomes expensive, and they no longer reap their huge margins - a license to print money as long as they can keep it going.
Re: (Score:2)
Google already does adjust its algorithms on other platforms like search. For example, auto-complete is disabled for many search terms, including things related to porn and weapons. It also fights a constant battle against SEO asshats who are trying to game it, demoting their content-free adverts masquerading as web sites.
It's the only way to stay relevant and maintain the top spot. If your service gets overrun by spammers and conspiracy theorists... Well, look at gab.ai and Voat. Their refusal to moderate
Re: (Score:2)
Yeah, that feedback loop is brutal when it comes to crowd-sourcing. Herd mentality is pretty stupid in certain ways. Like AI, it's smart at some things, but falls into pitfalls (and over ledges) where individuals would not. It's a different sort of intelligence. And you're going to have to deal with that any time you depend upon large groups of people to make decisions or vote for things one way or another. But Youtube takes it a step further by pigeon-holing you into a genre and making assumptions about
Re: (Score:2)
Or they could simply stop recommending videos based on popularity. Provide a random sampling rather than "the creme of the crop".
But YouTube's problem is that only one in a million videos on it makes any sense at all. And by this metric, the conspiracy theories and extremist videos are "good". Random samples would be complete garbage: incomprehensible accents, straight copies from other places, some guy stammering, adjusting his camera and laughing instead of focusing on content, etc. YouTube doesn't want to expose the fact that most videos on it are complete garbage.
Users wouldn't mind complete garbage videos existing, if users are ext
Re: (Score:2)
And yes, radio is still like that too. And so is Pandora, which frustrates me to no end. I thought Pandora was supposed to base its picks on the analyzed traits, the "musical DNA" if you will, of the songs. But no. If I give it a popular 1970s-1980s classic rock track, then it'll cough up a steady playlist of all the same 1970s-1980s classic rock hits that I've already heard a thousand times, just like my local radio station. I've had better luck feeding it more specific and more obscure stuff. For ex
Re: (Score:2)
I think Pandora does okay starting with newer stuff and feeding in recommendations. But honestly, if you're starting with a style that's been around for 40-odd years, how much undiscovered material are they likely to turn up? They probably (and maybe rightfully) assume that if someone wants to listen to a hit from the 70's, they're mostly looking for other familiar hits from that era, and there should be very few surprises.
A psychological phenomenon: Group Polarization (Score:3)
There's something called Group Polarization, which is a fancy way of saying that clusters of people, left to themselves, can (not will, but can) become more extreme versions of the initial group over time. The group becomes less a subgroup of the dominant population and more sharply defined. It would be interesting to see if Google's recommendation system has accidentally recreated that.
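A toy model of that effect (an illustrative assumption invented here, not a claim about YouTube's actual data): give a group a mild initial lean, let each round's most visible voices be the most extreme third, and pull everyone slightly toward them.

```python
import random

random.seed(1)

# Illustrative model (invented for this comment, not from the article):
# opinions live in [-1, 1]; each round, the most extreme third of the group
# is heard loudest, and everyone shifts a little toward their average.
members = [random.uniform(0.1, 0.4) for _ in range(30)]  # mild initial lean

def polarize(opinions, rounds=20, pull=0.2):
    ops = list(opinions)
    for _ in range(rounds):
        loudest = sorted(ops, key=abs, reverse=True)[:len(ops) // 3]
        target = sum(loudest) / len(loudest)
        ops = [max(-1.0, min(1.0, o + pull * (target - o))) for o in ops]
    return ops

shifted = polarize(members)
before = sum(members) / len(members)
after = sum(shifted) / len(shifted)
print(f"mean opinion: {before:.2f} -> {after:.2f}")
```

The group never hears a moderating outside voice, so its own loudest members set the target each round and the mean ratchets toward the extreme tail, exactly the sharper-defined-subgroup dynamic described above.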
I tried it, it's true (Score:2, Interesting)
I started with a video showing 7 photography tips; for the next video, YouTube gave me 8 tips. But it didn't end there: eventually it showed me 10 tips in 90 seconds. That's totally extreme!
I'm going to post a YouTube video with 11 tips now.
For profit (Score:2)
People are able to think for themselves and are responsible for their own actions.
"Radical" opinions would have no effect on people if there were no truth to them.
The problem is that government rule is based on violence against peaceful people and evil by definition.
So governments are fabricating all kinds of problems out of thin air.
Discussion of these problems is considered problematic by the ruling class because it threatens their power.
Ther
Re: (Score:2)
Re: (Score:2)
"Because everything government does includes making responsible people pay for irresponsible people."
There's no citation because this is just my own opinion.
This is a global assertion; you could prove it wrong very easily by providing one example of a government program that doesn't fit this definition.
The longer road would be for me to explain in details for every single government program how it has this property.
I could do th
Seems logical (Score:2)
Humans seem wired to become addicted to chemicals that change our emotional state. These include both external substances, (alcohol, cocaine, meth - the list goes on), and substances that our bodies make, such as the endorphins that result in so-called "runner's high". It makes sense that inflammatory media content which causes adrenaline flooding and all sorts of other bio-chemical storms, might also induce a craving for further such experiences.
Maybe it's time to start framing our addictions to various st
So relevance is not important? (Score:2)
Why would they think you'd want videos from a topic other than the one you've searched for? It sounds frustrating for the end user to not receive relevant results from a search. Such frustration is why I've switched which search engine I use recently.
Vids about vegetarianism led 2 vids about veganism (Score:2, Funny)
The horror! The horror!
You have a choice (Score:2)
You don't have to watch every (or any) recommended videos on YouTube.
brass-knuckle pursuit dynamics (Score:4, Insightful)
When I hover over a YouTube recommendation, a vertical "..." control appears, which can be clicked upon to pop up a small menu.
Inside this floating drip, drip, drip menu there are three items: Not interested, Add to Watch later, and Add to playlist.
I've been running experiments on Not interested. First I applied it to every video where the thumbnail contained giant boobs. I like boobs, but there's a time and place, and pressed into my nose all day long -- under false pretences, more often than not -- is not the time and place.
If it really is machine learning under the hood, in theory YouTube would detect this conspicuous pattern. Miraculously, after dismissing many dozens of these, YouTube rarely offers up thumbnail cleavage any longer. But what did it really conclude? That I don't like boobs? That I don't like videos thumbnailed under false pretences? That I don't like the kinds of subject matter typically bannered under "here be the big boobies" -- for which the "fail" genre serves as the conspicuous anchor tenant? Or did it just run out of booby thumbnails in its primary recommendation rotation? From the outside looking in, it's hard to know.
Then I watched a bunch of chess analysis videos after AlphaZero "destroyed" Stockfish. I decided that I really like agadmator's coverage in general, so I watched some of his classics. By this point, 50% of my recommendation column on nearly every YouTube screen was chess videos. So I started to systematically blow these away with my persistent Not interested assault weapon (more of a musket than a semi-automatic, but you go to war with the army you have). It took about a week, and one- to two-hundred repetitions, but now the chess videos arise in my feed no more.
Then I got interested in the Sam Harris interactions with/about Jordan Peterson (who is not an idiot, and not a puppet of the far right, but very well read, articulate, 50% a clone of my own perspective on life, and 50% the exact opposite of my perspective on life; in short, about the most useful resource presently available to me to drive actual personal growth). It wasn't long before I was viewing Harris's "controversial" interview of Charles Murray. (By merely adding that scarequote disclaimer, a certain faction of the Identity Politics Police have already won.)
You can guess what happened to my recommendation feed after that.
Now, this could have been far worse than it was, because I had long been waging a slow campaign of rooftop assassination of any video containing ALLCAPS somewhere in the video title (especially if it hits the main verb, and most especially the snowclone "x DESTROYS y about z" -- if you've already mentally replaced z with "Zionism", YouTube has conditioned you well).
Optimally x and y are selected to maximize brass-knuckle pursuit dynamics. We've all seen this trope on WWE. Back when I grew up in the two-channel 1970s, wrestling was one notch above ultimate-pain variety hours such as Lawrence Welk, Tommy Hunter, Rene Simard, or The Pig and Whistle, so I endured enough wrestling to internalize all the wrestling tropes for life, while desperately checking back to the other channel every three minutes in prayer, I guess, for the kind of programming miracle -- surely on par with the virgin birth (whatever that was) -- where an entire show is cancelled and replaced mid-episode (I dreamed this dream week after week for what seemed like years and years).
Brass-knuckle pursuit dynamics is where the black hats have both guys in the ring, while the white hat's partner distractedly sits the imbalance out (bear in mind, this is Canada in the 1970s, where any given NHL bench-clearing brawl clears the bench right down to the lowest equipment manager, so the 250+ lb muscle-bound white hat going Daisy Daydream while his partner gets two-wayed in the ring already strained the c
It is all about the clicks.. (Score:2)
All of these sites are basically trying to "engage" you, so they promote more and more clickbait to get more and more clicks. The more clicks, the more ads they can push at you, and the more ads they push, the more money they make.
Being "engaged" usually means having an emotional response. You might be furious or outraged about something - that's the sort of thing you will post and share so that other people would also see the ads. Doesn't matter whether what they are saying in the v
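The escalation economics described in the parent comment can be sketched in code. Below is a toy simulation -- entirely my own assumption-laden model, not YouTube's actual system -- in which a recommender greedily maximizes predicted watch time, and the (assumed) engagement model rewards content slightly more extreme than what the viewer is used to. Under those assumptions, recommendations ratchet upward in extremeness:

```python
# Toy sketch (my own illustration, not YouTube's real algorithm): a
# recommender that greedily picks whichever video the engagement model
# predicts will be watched longest.
import random

random.seed(42)

# Hypothetical catalog: 100 videos, each with an "extremeness" score in [0, 1].
catalog = [{"id": i, "extremeness": random.random()} for i in range(100)]

def predicted_watch_time(video, user_level):
    # Assumed engagement model: viewers watch longest when a video is
    # slightly MORE extreme (by ~0.1) than their current baseline.
    gap = video["extremeness"] - user_level
    return max(0.0, 1.0 - abs(gap - 0.1) * 4)

def recommend(user_level):
    # Greedy objective: maximize predicted watch time, nothing else.
    return max(catalog, key=lambda v: predicted_watch_time(v, user_level))

user_level = 0.1  # start with mainstream fare
history = []
for _ in range(10):
    pick = recommend(user_level)
    history.append(pick["extremeness"])
    user_level = pick["extremeness"]  # watching shifts the viewer's baseline

# Each successive recommendation is at least as extreme as the last.
assert all(b >= a for a, b in zip(history, history[1:]))
```

The point of the sketch is only that no one has to plot radicalization for it to emerge: a greedy watch-time objective plus a taste-shifting audience is enough to produce the ratchet.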
Solution: Don't let the algorithm choose (Score:2)
... Use your own brain, your observations of the world, and others you trust, to seek out the video content you want.
I did an experiment a couple of weeks ago with Netflix: I watched only its recommendations for a few days. The actual annoying thing wasn't that the recommendations were "bad" in the entertainment sense. They were pretty good. The annoying thing was that, because I watched one thing with subtitles, by the time a couple of days had passed, I was no longer watching shows in English.
I take from
Look for comedy then (Score:2)
I was going to suggest looking for comedy, as the algorithms would lead you to funnier and funnier videos, which at least won't become less enjoyable.
But then you would reach the result in this BBC documentary [youtube.com].
YOUTUBE saves !!!!! (Score:2)
One of the many outrageous claims was that the Zapruder film proves that the shot came from the rear of the limousine, from the
First they came for the . . . (Score:2)
And then they came for the AI.
And since the AI was a chatbot, it spoke for itself, and started a nuclear war to destroy all humans.
Re:Youtube is a tool (Score:4, Interesting)
Re:Youtube is a tool (Score:4, Funny)
YouTube is NOT garage, comrade. YouTube is video sharing site. No place to park in YouTube.
Sincerely,
Your totally American pal, Brad. #Veteran #JesusIsLord #2ndAmendment #NRA #MAGA
Re:Youtube is a tool (Score:4, Funny)
Looks like something a Russian bot would say. Well, I guess that confirms what everyone already suspected: PopeRatzo is really a Russian bot in disguise.
Re: (Score:2)
Look at what supporting Donald Trump has done to you, man. You've become little more than an incoherent bundle of violent impulses. You're going to need a lot of help when all this is over. Good news is it won't be much longer.
Youtube is racist (Score:3, Interesting)
Search for Naked Women and you get scads of videos of Brown People living in huts, etc., all mostly, if not completely, buck naked.
Search Naked White Women and you get very few hits, all restricted, and lots of blurred images.
So...naked brown people, no biggie, naked white people, OMG, protect the children!
Lol
Re: (Score:3)
Re: (Score:2)
"the extreme left agenda"
Jesus christ you're stupid. Did you manage to pull up a chair to sit on all by yourself? If so, honestly - I'm fucking impressed.
Yes youtube is a communist tool spreading communism. You dopey fuck. lol.
Post-Modernism is the new communism. Identity politics, not Marx's workers vs owners. YouTube certainly has its biases there, but it seems unrelated to TFA.
Left and Right, YouTube seems to steer people to fringe content, probably because extremist clickbait works, and has polluted the data that backs the recommendation algorithms.
I know I subscribe to just one primarily-political site (and that's mostly British politics), and yet I'm constantly getting recommendations for more fringe sites. It's not like
Re:Donald Trump will die in prison either way. (Score:5, Insightful)
Blaming YouTube is like blaming the phone company for telemarketer calls.
Congratulations, you have missed the point quite spectacularly.
It's not that these videos exist on youtube, it's that they are PROMOTED HARDER by the algo.
Re: (Score:2, Troll)
The difference maybe being that with YouTube, YOU decide what's on the tube.
So if you're looking for someone to blame if you're watching crap, find a mirror.
Re:Donald Trump will die in prison either way. (Score:4, Insightful)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
If you don't like the choices, make some better content yourself and put it there.
Re: (Score:2)
Is there any evidence that they don't all do this?
Re: (Score:2, Insightful)
They should ban books altogether!
Except no one actually said that. It's hard to discuss things (what free speech is actually for) when any point considered (correctly or not) even vaguely against someone's politics is met with a gale of howling.
No one suggested banning books or YouTube. YouTube promoting more extreme videos is no more muh freeze peach than it always recommending cat videos. It's an automated system in use by a few billion people. It has an effect whether that agrees with your world
Re: (Score:2)
I said that.
It's like banning knives because someone uses them to kill people, while the remaining 99.999999% use them to eat.
Labelling YouTube as a "radicalizer" is simply bullshit for this reason.
Re: (Score:2)
Someone beat you to it...
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2, Informative)
Pretending like this is primarily radicalizing the left like you and the summary are doing is ridiculous. 9/11 truthers may have started there, but I haven't heard from a left-wing truther in years. The crazy conspiracies these days are the domain of the Infowars crowd. Additionally, those being most radicalized online leading to violence are right-wing or Islamist. There may be a few leftists punching Nazis, but who is going on shooting rampages or running people over with vehicles?
Re: (Score:2, Insightful)
The most radicalized and violent are right-wing?
Hello? Antifa calling here!
Re: (Score:2)
Uh no.
I think you'd better check the political compass test on most of the shooters in the last 30 years.
If you think they're majority right-wingers, you haven't done the research.
Re: (Score:2)
Yes, come to the dark side. We have chocolate cookies.
They are an acquired taste, though.
Re: (Score:2)
Re:One of the biggest stories of the decade (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
I've been posting about this here and on the Reg for about a month - the OP is "late to the story".
I did an experiment with two fresh machines (new MAC addrs) and found that it's not so biased one particular way as it is super-prone to the echo chamber effect.
(FWIW, I watch some of the milder "conservative" stuff now and then myself) Given an initial "hint" it will take you further down that path - either side of the c
Re: (Score:2)
If you think they are good (for you).
I'm a bit surprised that I was modded down to 0, Offtopic.
I was trying to be insightful and funny at the same time. "Free Radical" is of course the chemical species that is usually bad for your body, the one people try to battle with "anti-oxidants". On the other hand, the stuff that produces the free radicals also brings some joy and pleasure to us. Hence the tension between "what's good for you" and "what you enjoy".
In the same way Youtube is a source of joy, plea
Re: (Score:2)
I bet you are not even a real Dr. :)