Comment by infecto
5 days ago
Not entirely related to MrBeast but related to YouTube. I genuinely miss the older algorithm where after watching a video you would go progressively further down a hole of videos somehow related to the one you just watched. It was quite entertaining and really uncovered fascinating videos.
I have no clue how to use YouTube. It seems like as soon as it latches onto 3-4 interests of mine, the entire home page is exclusively filled with videos related to them. I can mark videos as "Not interested" but it doesn't do much. I will see exactly the same videos on the home screen, ones that I'm not interested in and don't plan on watching, for weeks or months sometimes.
I'm sure there's plenty of interesting content about topics I haven't searched for, but YouTube seems intent on not letting me out of whatever bubble it has thought up for me.
I think what you describe is what infecto was saying. You can't use YouTube to find interesting content anymore. You can only use it to find more of what you've already seen. In the past, it was better at unearthing new things.
Personally, I added uBlock filters so the home page is empty and recommended videos aren't shown. I only ever go to subscriptions now.
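For reference, the filters are roughly of this shape. This is only a sketch, and the element names are my reading of YouTube's current markup, so treat them as assumptions that may need updating when the layout changes:

    ! hide the home page feed so the page comes up empty
    www.youtube.com##ytd-browse[page-subtype="home"] ytd-rich-grid-renderer
    ! hide the recommended-videos sidebar on watch pages
    www.youtube.com##ytd-watch-next-secondary-results-renderer

Any selector that reaches those containers works; the point is that the home feed and the watch-page sidebar never render, so the subscriptions page is the only entry point left.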
It doesn't even show me more of what I've already seen; half the sidebar is videos I've already watched (or watched halfway before dropping, with a helpful indicator of my lack of progress). Like, "we see you like video X, why don't you watch video X today?" Thanks, I already have bookmarks.
> I added uBlock filters so the home page is empty and recommended videos aren't shown. I only ever go to subscriptions now.
There's a setting to turn that off; no need for uBlock filters for it.
Similarly, I use Unhook because it gives me fine control over what YT displays. I now find YT to be completely unusable without it.
While I do agree it has a very strong focus on suggesting more of what you've recently watched, I feel it's also managing to suggest new and interesting unrelated stuff from time to time.
Some habits I have: subscribe to channels I truly enjoy, select "Don't recommend channel" instead of "Not interested", and be cautious of click-bait titles. If I do get lured in, I remove the video from my history.
So for me it's mostly great, though I get your frustration as well. For example, I recently watched a couple of informative videos on the LA fires, as I have some relatives living in the area, and suddenly my feed was tons of that and little else.
I've found that sometime in the last year or so YouTube started suggesting random videos from very small channels much more often, which I like a lot. Most of the videos are garbage, but every once in a while I'll find a gem that entertains me and my friends.
Recently I found a video of a young kid doing a taste test of a sour soda, and then demanding his dad come over from the other room to try it too. At one point the kid does a really loud burp that I found funny. Obviously not something that will do numbers, but it satisfies my people-watching itch.
I too liked to prime my own algorithm, but Yanis Varoufakis's book Technofeudalism kind of ruined it for me. On an individual level it's nice to get good recommendations, but on a societal level I think it's starting to get a bit scary, to the point of me wanting to opt out and instead curate my own feeds from first-hand sources.
> It seems like as soon as it latches onto 3-4 interests of mine,
It's worse than that. I thought that YouTube worked as you described, trying to find videos suited to your interests, but it actually works the other way around.
YouTube has a series of rabbit holes that it knows maximise engagement, so it's trying to filter you, the human, down one of those rabbit holes. Do you fit the MrBeast/SSSniperWolf hole, or the Jordan Peterson/Joe Rogan rabbit hole? How about the 3-hour video essay rabbit hole, is that one your shape?
It's designing paths for engagement and filtering humans down them, not filtering videos for humans. It's perverse and awful, and it explains why the algorithm simply does not work for humans: you are not the target audience, you are the data being sorted.
The homescreen recommendations seem to aggressively prioritize whatever you last searched for.
I missed the end of the recent Australia vs India cricket series so I searched for highlights from the final day of play. Since then for the past 2 weeks my homescreen has been an endless stream of cricket related videos. For some reason it has a particular focus on podcasts related to cricket.
Podcasts are popular enough, high-engagement content, and here engagement is mostly watch time. As such, the algorithm pushes videos that have recently been watched a lot. Being recently popular is another mechanic that seems very common.
I made the mistake of clicking on a Jordan Peterson video several years back. I'd never heard of him before and the title seemed interesting, so I clicked. 15 seconds in, my charlatan detector went off, so I exited the video. For the next couple of weeks I was playing whack-a-mole with a never-ending supply of manosphere and right-wing nonsense. Easy to see how so many people get sucked into that sphere of influence.
One of the best things I've learned is that you can go into your watch history and remove something, and that seems to work pretty well for fixing the algorithm after clicking on something and realizing it's garbage.
Don't use the homepage, use the subscriptions page
When you turn off search history, it makes the homepage useless and the subscriptions page becomes unavoidably the next step.
Discovery of content happens in the sidebar from videos I enjoy now, and only when I'm in the mood to discover something.
I've become accustomed to deleting cookies on browser close. For the first ~10 or so YouTube page requests, the sidebar of recommended videos is pretty good. After that, as you said, it gets way too muddled. I think a good plugin for YouTube would be one that always deletes cookies before opening a video, so that you're getting as close to a pure vanilla recommended feed as possible.
Usually the better content is down the page a bit on the YouTube home page. I use this CSS snippet to hide the first 12 videos from my home page.
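For the curious, such a snippet looks roughly like this; the selector is a guess at YouTube's current home-grid markup (ytd-rich-item-renderer as the per-video tile), so it may need adjusting when the layout changes:

    /* Hide the first 12 tiles in the home feed grid; everything
       further down still renders as normal. */
    ytd-browse[page-subtype="home"] ytd-rich-item-renderer:nth-of-type(-n+12) {
        display: none !important;
    }

Loaded through a user-styles extension like Stylus, it just removes the top of the home feed, so what used to be further down is what you see first.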
You have to block entire channels. I've blocked all the major news networks, all the major content farms, and all the major "garbage" snack-size content channels. There are hundreds of blocked channels on my account.
It's _almost_ like the old YouTube.
I obviously don't know your personal experience, but your description is still how YouTube works for me. For example, over the holidays I would occasionally put on a video from a channel that plays holiday music with various videos of this guy's model train setup in the background. I immediately started receiving model train videos, which, of course, I had to click on and now I know a little bit about trains and building realistic environment models.
That being said, occasionally I do have to go into my Google data and clear/clean the watch history to reorient my recommendations.
I don't think that's the case. Don't confuse homepage recommendations with end-of-video queue recommendations. It used to be that end-of-video recommendations were heavily weighted toward the current video or the chain of videos you had just watched. Essentially you could keep going to the next video and go down a weird hole of obscure videos. Now the algorithm will quickly circle you back to your profile's homepage of videos rather than staying close to the video you just watched.
Oh, I see. I suppose I do recognize more of the 'general recommendations' in the post-video grid, rather than picks based solely on the video itself. That being said, I don't use that mechanism much and tend to rely on the homepage refresh and the sidebar to discover additional videos.
Same, and now I've spent the last two weeks trying to convince YouTube that I don't need several different videos of Christmas music playing over a fireplace.
Feels a whole lot like the dumb emails I get from places like Home Depot, where because I bought a table saw they feel I should know about all these other table saws they have.
Agreed. This is still how YouTube works for me. It's great.
> That being said, occasionally I do have to go into my Google data and clear/clean the watch history to reorient my recommendations.
Can you elaborate on this? What effect does it produce for you specifically?
If I find I am receiving recommendations that I'm not interested in and that are clearly based on a handful of videos I watched previously, I can clear those from the history and the algorithm doesn't use them for future recommendations. The simplest example would be watching videos for a one-time use case, such as repairing a specific home issue. I definitely don't need more recommendations for fixing that or related issues, but YouTube is likely to spend a little time sending them my way. I can fix that quickly by removing the original videos from my history.
This is exactly how YouTube still works for me... I'm still finding new, interesting content and creators every day.
Do you subscribe to creators you enjoy, and like their videos? You still need to feed the algorithm.
I think you missed how the old end-of-video algorithm worked. The old way put a much higher weighting on the current video being watched when picking the next one. The current next-video algorithm places a much higher weighting on your overall preferences instead of on the current video.
I don't.
My experience with their algorithm between 2014 and ~2020 was that autoplay would quickly turn into a form of video diarrhea composed largely of Jordan Peterson and Lex Fridman. It was pretty bizarre, because I only watched a few Peterson videos in the beginning, mostly his "Maps Of Meaning" videos, which I think are mostly poppycock, and ever since then YouTube would quickly bring me back to his content even though I was never navigating to it organically. I had to resort to clicking "Not interested" and "Don't recommend channel" on several videos, which sort of worked, but it wasn't foolproof.
These days it happens way less often, though usually that loop contains a lot of "gurus" in general and less of Peterson.
I hope I never have to hear the voices of Jordan Peterson or Lex Fridman again. I'm not a fan of either one, but YouTube insisted I was for many years.
I think they can be paid to do that, but I'm not sure quite how it was arranged.
That, or the Peterson pipeline is a good representation of a local maximum: a fairly obvious way a set of videos can direct people to related videos and increase the appetite for them. That'd produce algorithmic reinforcement without anybody getting paid, apart from YouTube, content-agnostically hungering to be paid in views on its platform.
It could have sent a very strong signal of 'this content maximally sends a statistically significant number of viewers down MASSIVE YouTube rabbit holes, never to emerge, therefore take the gamble and try to show everyone the content, ???, profit!'
This was at the center of controversy many years ago, described as a sort of alt-right pipeline. I believe there are studies about that exact algorithmic behavior on YouTube. My understanding is that it was changed to loop back around to trusted content sooner.
Dunno about parent but to me "old algorithm" is more like 2010, not 2014.
Edit: 2010 and earlier. To me old youtube is before the rise of Minecraft. There's probably a better threshold but that's the one that comes to mind.
Funny, I have the opposite experience. I used to get videos relevant to what I was watching. If I was watching a Phish video, it would recommend other Phish videos. These days, if I'm watching a Phish video I will literally, as in literally literally, get a Candace Owens video recommendation. I have literally never clicked on one of her videos, ever. I don't watch political content on YouTube at all, and if I did, I'm very left-leaning. I can't fathom what has made the algorithm so terrible that if you're watching 90s Phish videos it recommends right-wing talking heads.
I truly miss the way the next video used to be picked. I don't get political videos, but I will be watching live concerts by The Band and after the second video I will get something like a singing video from The Wiggles (I have a child). It makes absolutely no sense to me.
It's so weird and obvious that shenanigans are going on in the recommendation algorithm. I'll watch a Video Game Streamer, and in the sidebar, the top ten recommended related videos are:
- Same streamer, different video
- Different streamer
- Far right pundit blasts immigration
- Video game streamer
- Video game streamer
- Video game review
- Same streamer, similar content
- Ben Shapiro OWNS Liberals with FACTS
- Video game streamer
- Video game streamer
I've never watched one of these blowhards in my life, but man, YouTube thinks I'd love it. Because I watch video games? Is this the gamer-to-alt-right pipeline I keep reading about?
I miss the YouTube where you could just browse topics; right around when Google bought it is when I liked it the most, I think. I much preferred a categories/topics-based UI over this spoon-fed algorithm. I think there's also a sweet spot in production value that I prefer. I like the Technology Connections, Adrian's Basement, Cathode Ray Tube Guy (name?), etc. level of production much more than LTT's high production value.
I don't miss watching a video about a dog herding sheep and then getting nothing but dog-herding-sheep videos the next week, heh. But I also don't like the new algorithm; it is as if YouTube has assigned me to a demographic and really wants me to watch what other techy males around 30 y.o. watch, constantly trying to give me some rage-bait political content to test the waters.
What I am describing is not the home page recommendations but the hole you would go down via post-video recommendations. The queue that YouTube created for you had, for a long time, a heavier weighting on the current video you were watching. A simple example: you watch the dog herding sheep, then the next video is about sheep, then you get to some video from a different country with sheep in it, and finally you end up with some person who pretends to be a sheep. I'm purely making that example up, but it often landed in weird, obscure videos quite quickly.
I tend to agree with you there. Admittedly there was a skill in not being accidentally radicalised, but you watch one 'lo-fi' video, accidentally fall asleep to four more, and that's all YouTube shows me now!
I hope it doesn't happen anymore, but it used to be a game to clear your history, go to a fresh YouTube page, and keep clicking on the top video in the suggestion list until you hit UFOs, Flat Earth, Climate Change denial, Lizard People, Chemtrails, or some other crackpot video. It was shocking just how fast the algorithm would gravitate towards that content, sometimes after just a couple of videos. I have a theory that the YouTube algorithm is partially to blame for the explosion in conspiratorial thinking on the modern Internet.