Comment by Tabular-Iceberg
10 hours ago
I remember when QuickTime came out in 1991 and it was obvious to everyone that video should be copied, pasted and saved like any arbitrary data.
It's absolutely insane to me how bad the user experience is with video nowadays, even video that's not encumbered by DRM or complex JavaScript clients.
> It's absolutely insane to me how bad the user experience is with video nowadays, even video that's not encumbered by DRM or complex JavaScript clients.
The video experience for typical video files is great these days compared to the past. I think you may be viewing the past through rose colored glasses. For years it was a pain to deal with video because you had to navigate third party players (remember Real Player?), Flash plugins, and sketchy codec pack installs from adware infested download sites. If you were tech support for friends and family during that era, it was common to have to remove adware, spyware, and other unwanted programs after someone went down the rabbit hole of trying to install software to watch some video they found.
The modern situation where your OS comes with software to play common files or you can install VLC and play anything is infinitely better than the past experience with local video.
Local video could be a nightmare in the 90s. I remember those days. I remember when it was revolutionary that Microsoft Media Player came out, and you could use one player for several formats, rather than each video format requiring its own (often buggy) player. Getting the right codecs was still a chore, though.
MS Media Player eventually fell behind the curve, but eventually we got VLC and things got great.
> MS Media Player eventually fell behind the curve, but eventually we got VLC and things got great.
And in-between those we had Media Player Classic together with the Combined Community Codec Pack, and once you had MPC + CCCP installed, you could finally view those glorious aXXo-branded 700MB files found on a random DC++ hub.
That's if you weren't using a Mac
If you weren't using a Mac and wanted to play Quicktime videos? Then you have to install Apple's Quicktime player for Windows which was a piece of garbage.
Ah man, Macs of yore could play .rm/.rv files natively?
I'm absolutely not viewing the past through rose colored glasses. RealPlayer was a dumpster fire, but that came later.
I could hold shift and drag on the timeline to select, copy, then paste it into a document or another video. I can't do that with VLC today. Apple removed the feature in later releases too.
What you’re describing with QuickTime was a proprietary nightmare that didn’t even work correctly across Apple products, let alone Microsoft or Linux.
Today with modern tools like VLC or MPV and ffmpeg nearly anything can be viewed, streamed, or locally saved by your average user with basic Google search skills.
And the number of free and paid video editing tools is far beyond what we ever had in the past.
Then there’s the vast improvement in codecs. It’s quite insane that we can have a feature-length 4K video with 8-channel audio in a 3GiB file.
The only problem about the modern world is streaming companies who purposely degrade the experience for money. And the solution is simply to fly the pirate flag high.
You're not viewing the past with rose colored glasses. You're just viewing the past. We had simpler codecs with simpler computational complexities. Holding Shift and selecting a chunk of a video to copy was simple because videos were mostly a succession of independently compressed frames. Nowadays, we have forward- and backward-dependent frames, scene detection, and lots of other very advanced compression techniques.
There are whole projects striving to provide a reliable way to just cut videos without having to re-encode [1], and after years the results are mixed and only work for very specific codecs; no wonder Apple decided that doing the same, to their quality standards of the time, was not worth the effort, or was a secondary feature out of scope.
[1]: https://github.com/mifi/lossless-cut
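To see why lossless cutting is hard, consider a hypothetical frame-type listing (not any specific codec): only "I" frames (keyframes) are self-contained, while "P" and "B" frames reference other frames and decode to garbage without them, so a cut without re-encoding has to snap back to the previous keyframe. A minimal sketch:

```python
# Hypothetical frame-type listing for a short clip. "I" frames are
# self-contained keyframes; "P"/"B" frames depend on other frames,
# so a cut without re-encoding must snap back to the previous "I".
frames = ["I", "P", "B", "B", "P", "I", "P", "B", "P", "I", "P"]

def snap_cut_to_keyframe(frames, desired_start):
    """Index of the nearest keyframe at or before the desired cut point."""
    for i in range(desired_start, -1, -1):
        if frames[i] == "I":
            return i
    return 0
```

Tools like the lossless-cut project linked above do essentially this against the real keyframe index, which is why lossless cuts often land slightly before the point you asked for.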
> RealPlayer was a dumpster fire
And actually malware IMO. IIRC many of its installs were through tricks: silent installations with other software, drive-by downloads, etc. And once in, by fair means or foul, it took over every video playing avenue whether you wanted it to or not, and it itself included other malware like Comet Cursor.
> For years it was a pain to deal with video because you had to navigate third party players (remember Real Player?), Flash plugins, and sketchy codec pack installs from adware infested download sites.
How is this any worse than what YouTube does now? Real Player and flash never made you watch ads.
It seems you may be misremembering. From Wikipedia [1]:
> Past versions of RealPlayer have been criticized for containing adware and spyware such as Comet Cursor. ... PC World magazine named RealPlayer (1999 Version) as number 2 in its 2006 list "The 25 Worst Tech Products of All Time", writing that RealPlayer "had a disturbing way of making itself a little too much at home on your PC--installing itself as the default media player, taking liberties with your Windows Registry, popping up annoying 'messages' that were really just advertisements, and so on."
[1] https://en.wikipedia.org/wiki/RealPlayer
Real Player was one of the first real video players. It wasn't a pain; it was a genuine add-on.
Flash also came built in to almost every browser.
By the time both had gone away, built-in HTML video was here. Of course, there were players like jwPlayer that played video fine.
Today, most browsers have most codecs.
Real Player was an early innovator. Mostly in dark patterns.
1991 was the vibrant, exciting, crazy "adolescence" of the PC age and well into the period where it was cool to have a desktop PC and really learn about it.
Phones are dominant now and have passed the PC generation by - in number, not capability. The concept of copy/paste/save for arbitrary data lives on for the non-tech masses only in the form of screenshots and screen recording features.
The thing that stands out to me looking back over a few decades is how much of consumer/public computing is exploring the latest novel thing and companies trying to cash in on it. Multimedia was the buzzword aeons ago, but was a gradual thing with increasing color depth and resolution, video, 3D rendering, storage capabilities for local playback, sound going from basic built in speaker beeps to surround and spatial processing. Similar with the internet from modems to broadband to being almost ubiquitously available on mobile. Or stereoscopic 3D, or VR, or touchscreens, or various input devices.
Adolescence is a very good word to encompass it: lots of awkward experiments trying to make the latest thing stick, with some of them getting discarded along the way when we grow out of them, they turn out not to be (broadly) useful, or fashion moves on. What I wonder about is whether the personal computer has hit maturity now and we're past that experimental phase; for most people it's an appliance. Obviously you can still get PCs and treat them as a workstation to dive into whatever you're enthusiastic about, but you need to specifically go out and pursue that. Where the ecosystem might be lacking is a bridge between the device most have as their personal computer (phone/tablet) and something that'll introduce them to other areas.
> The concept of copy/paste/save for arbitrary data lives on for the non-tech masses only in the form of screenshots and screen recording features.
When it's not impeded by DRM, that is.
Depending on where personal/portable AI devices go, phones might be significantly different or not exist in 10 years as they do today.
There might be a resurgence of some kind of device like a PC.
Seeing iPadOS gain desktop features, and macOS starting to adopt more and more iPadOS-type features, clearly shows the desktop, laptop, and tablet experiences will be merged at some point, by Apple at least.
I think it'd be biased more in the direction of the iPad. If anything, there's one feature Apple's trying to avoid, and that's macOS's waning ability to run third-party binaries.
"Fitting into my pocket so I can use it in line at the post office" is a capability that desktop PCs have yet to manage to achieve.
> use it in line at the post office
If it were a powerful, useful device that I could load my own software onto and make programmable without jumping through a bunch of hoops, instead of the ad-laden crapware that resulted from primarily two megacorps duking it out over how to best extort billions from app developers and users for their own benefit, then sure, I'd agree.
But phones aren't awesome little PCs, they're zombifying the majority of the public. They also, incidentally, are insidious little snitches busy at work trying to monetize every single thing about our daily lives.
But are DRM and poor user experiences hard requirements for something to fit in your pocket?
Otherwise, I don’t think I get your point - maybe you could clarify?
My GPD pocket 4 fits into really large cargo pants if that counts lol, and there is the micropc2 too that’s even smaller :p
Well… https://www.zotac.com/page/zotac-vr-go-4
There are also various handheld PCs.
"Fitting into my carry-bag so I can use it in line at the post office" is already possible for a PC and many people do it all the time.
Handhelds like the Steam Deck are PCs and can fit in some pockets :P
I’m also waiting for the gallon sized water bottle I can fit in my <1l pocket.
Long press -> save image/video is perfectly supported on a phone; it's just content distribution platforms that arbitrarily restrict it.
No, it's also iOS that's arbitrarily restricting it. I opened a bare .webm directly in Safari and got nothing on long press and nothing in any of the control widgets to save it.
You can't even make a screenshot if the app doesn't allow it. Phones are broken. (well, the OS on them is).
A specific issue with video data is that it’s much denser: the same concept in video takes up more bytes than in text or images. Therefore hosting is more expensive, so fewer people host, and the ones that do (e.g. YouTube) expect revenue. Furthermore, because videos are dense, people want to stream them as they download, which means hosts must have not just storage but reliable bandwidth.
Even then, there are a few competitors to YouTube like Nebula, PeerTube, and Odysee. But Nebula requires a subscription and PeerTube and Odysee have worse quality, because good video hosting and streaming is expensive.
The real problem is that YouTube built a model where the platform, not the creators, controls the money flow. They could have charged creators directly for hosting and left monetisation up to them, but by inserting themselves as the middleman, they gained leverage and authority over content itself. The "cost of hosting" is just the technical excuse for such centralisation.
> They could have charged creators directly for hosting and left monetization up to them
A platform could do that today. I doubt such a platform would have beaten YouTube even in the early 2000s. Creators can get almost the same experience by hosting their own site on a VPS.
Back then, the focus was on optimising for the user. Now, however, companies prioritise their own interests over the user.
I think companies always prioritized their own interests.
A company can increase its profits (1) by improving their products and services, so that they'll get more customers or customers willing to pay more, or (2) by increasing how much of their revenue is profit by (e.g.) cutting corners on quality or raising prices or selling customers' personal information to third parties.
Either of those can work. Yes, a noble idealistic company might choose #1 over #2 out of virtue, but I think that if most companies picked #1 in the past it's because they thought they'd get richer that way.
I think what's happened is that for some reason #2 has become easier or more profitable, relative to #1, over time. Or maybe it used not to be so clearly understood that #2 was a live option, and #1 seemed safer, but now everyone knows that you can get away with #2 so they do that.
We even have a name for this now…
https://en.wikipedia.org/wiki/Enshittification?wprov=sfti1
Indeed, the good old days when "optimizing for the user" got us... Windows 3.1 (released April 6, 1992; ref https://en.wikipedia.org/wiki/List_of_Microsoft_Windows_vers...) or the first version of Linux, which I did not have the honor to use, but I can imagine how user-friendly it was considering what I ended up using a couple of years later (https://en.wikipedia.org/wiki/History_of_Linux)
/s
We can have stable user-friendly software. We had a nice sweet spot in the early 2000s with Windows XP and Mac OS X: stable operating systems built on workstation-quality kernels (NT and Mach/BSD, respectively), and a userland that respected the user by providing distraction-free experiences and not trying to upsell the user. Users of workstations already experienced this in the 1990s (NeXT, Sun, SGI, HP, and PCs running IBM OS/2 or Windows NT), but it wasn’t until the 2000s that workstation-grade operating systems became readily available to home users, with both Windows XP and Mac OS X 10.0 being released in 2001.
There are myriad ways to optimise for the user, user friendliness is only one of them.
As the old joke went "Unix is user friendly, it's particular about who its friends are".
I was just reading how ATSC 3 (over the air TV) is kind of stalling because they added DRM fairly late in the roll out. Several people bought receivers that are now incompatible.
Also, I'm not sure what the actual numbers are, but my impression is that a significant portion of OTA enthusiasts are feeding their OTA signals into a network connected tuner (HDHomeRun, Tablo, AirTV, etc.) and DRM kills all of these.
DRM being forced into freeview TV seems like a contradiction in terms, and yet here we are.
Experience with video is excellent for most people. All the complexity is hidden from the end user, unless you are trying to hack something. In the 1990s, streaming effectively didn't exist because people didn't have enough bandwidth (it was mostly dial-up), and there were very few legal offerings, and the little that existed was terrible. Home video was limited too, as few people knew how to make video files suitable for online distribution.
Piracy did pretty well, but that's because the legal experience was so terrible. But even then, you had to download obscure players and codec packs, and sourcing wasn't as easy as it is now. For reference, VLC and BitTorrent were both released in 2001.
I'd say the user experience steadily improved and peaked in the mid-2010s. I think it is worse now, but back then it was terrible, for different reasons.
> It's absolutely insane to me how bad the user experience is with video nowadays
Has nothing to do with video per se. Normal embeddings, using the standard `<video>` element and no unnecessary JS nonsense, still work the same way they did in the 90s: Right click the video and download it, it's a media element like any other.
The reason why user experience is going to shite, is because turbocapitalism went to work on what was once The Internet, and is trying to turn it into a paywalled profit-machine.
I've always found it insane how much software development web sites are willing to undertake, just to avoid using the standard video, audio, and img HTML elements. It's almost hilarious how over engineered everything is, just so they can 'protect' things they are ultimately publishing on the open web.
Plain <video> elements are easy to download, but not great for streaming, which is what most people are doing nowadays. Much of the JS complexity that gets layered on top is to facilitate adaptive bitrate selection and efficient seeking, and the former is especially important for users on crappier internet connections.
I'm not a fan of how much JS is required to make all that work though, especially given the vast majority of sites are just using one of two standards, HLS or DASH. Ideally the browsers would have those standards built-in so plain <video> elements can handle them (I think Safari is the only one which does that, and they only do HLS).
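For the curious, an HLS playlist is surprisingly low-tech: a plain-text .m3u8 file listing media segments (and, in a master playlist, variant streams at different bitrates the player can switch between). A minimal sketch against a made-up playlist:

```python
# A made-up HLS media playlist: #EXT tags describe timing, and each
# non-comment line is a short media segment the player fetches in order.
SAMPLE_PLAYLIST = """\
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg_000.ts
#EXTINF:6.0,
seg_001.ts
#EXT-X-ENDLIST
"""

def segment_uris(playlist_text):
    """Extract segment URIs: every non-blank line that isn't a # tag."""
    return [line for line in playlist_text.splitlines()
            if line and not line.startswith("#")]
```

Adaptive bitrate is essentially the player switching to the playlist of a higher or lower variant and continuing with its segments; libraries like hls.js implement that logic on top of Media Source Extensions.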
I totally agree. And much of the JS complexity on smaller niche video sites isn’t even implemented properly. On some sites I just open the developer console, find the m3u8 file URL and cookies in the request, and download it to view locally.
Browsers generally do allow native seeking if the video is properly encoded and the site supports such niceties as Accept-Ranges: bytes.
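For the mechanics: the browser sends a `Range: bytes=start-end` header and a cooperating server replies `206 Partial Content` with just that slice, which is what makes seeking into an unbuffered part of the file possible. A hypothetical parser for the common forms (not a full RFC 9110 implementation):

```python
def parse_range(header, file_size):
    """Parse "bytes=start-end" or "bytes=start-" into inclusive offsets.

    Suffix ranges ("bytes=-N") and multi-range requests are omitted
    for brevity; a real server must handle or reject them.
    """
    units, _, spec = header.partition("=")
    if units != "bytes":
        raise ValueError("unsupported range unit")
    start_s, _, end_s = spec.partition("-")
    start = int(start_s)
    # An open-ended or oversized range is clamped to the end of the file.
    end = int(end_s) if end_s else file_size - 1
    return start, min(end, file_size - 1)
```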
Chrome has finally landed native HLS playback support, enabled by default, within the past month. See http://crrev.com/c/7047405
I'm not sure what the rollout status actually is at the moment.
The standard video element is really nice:
https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
I have used it on a couple of client sites, and it works really well.
You can even add a thumbnail that shows before the video starts downloading/playing (the poster attribute). :-)
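A minimal sketch of what that looks like (paths and filenames hypothetical):

```html
<!-- poster shows until playback starts; controls gives the native UI -->
<video controls preload="metadata" poster="/thumbs/talk.jpg" width="640">
  <source src="/media/talk.mp4" type="video/mp4">
  <source src="/media/talk.webm" type="video/webm">
  Your browser does not support the video element.
</video>
```

The browser picks the first source it can decode, so listing an MP4 first and a WebM second covers most clients with no JS at all.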
> still work the same way they did in the 90s: Right click the video and download it, it's a media element like any other.
I’m so confused reading these comments. Did everyone forget RealPlayer? Flash videos? All of the other nonsense we had to deal with to watch video on the internet?
RealPlayer was 1995, so a few years later, and arguably was the start of the trend of enshittification. Flash video was around the time things really got bad.
That does mean we go, essentially:
Step 1: We barely have video at all.
Step 2: Everything is terrible.
Technically, you can profit off of ad revenue and subscriptions without exploiting the labour of your workers, so in this particular case it has nothing to do with the economic regime. Enshittification is its own thing.
The problem with a standard video element is that while it's mostly nice for the user, it tends to be pretty bad for the server operator. There are a ton of problems with browser video, beginning pretty much entirely with "what's the codec you're using". It sounds easy, but the unfortunate reality is that there are a billion different video codecs (and heavy use of Hyrum's law/spec abuse on the codecs) and a browser only supports a tiny subset of them. Hosting video thus requires, at a baseline, transcoding the video to a different storage format; unlike a normal video file you can't just feed it to VLC and get playback, you're dealing with the terrible browser ecosystem.
Then once you've found a codec, the other problem immediately rears its head: video compression is pretty bad if you want to use a widely supported codec, even if for no other reason than the fact that people use non-mainstream browsers that can be years out of date. So you are now dealing with massive amounts of storage space and bandwidth that are effectively being eaten up by duplicated files, and that isn't cheap either. To give an estimate, under most VPS providers that aren't hyperscalers, a plain text document can be served to a couple million users without having to think about your bandwidth fees. Images are bigger, but not by enough to worry about it. 20 minutes of 1080p video is about 500 MB under a well made codec that doesn't mangle the video beyond belief. That video is going to reach at most 40,000 people before you burn through 20 terabytes of bandwidth (the Hetzner default amount), and in reality probably fewer, because some people might rewatch the thing. Hosting video is the point where your bandwidth bill will overtake your storage bill.
And that's before we get into other expected niceties like scrolling through a video while it's playing. Modern video players (the "JS nonsense" ones) can both buffer a video and jump to any point in the video, even if it's outside the buffer. That's not a guarantee with the HTML video element; your browser is probably just going to keep quietly downloading the file while you're watching it (eating into server operator cost) and scrolling ahead in the video will just freeze the output until it's done downloading up until that point.
It's easy to claim hosting video is simple, when in practice it's probably the single worst thing on the internet (well that and running your own mailserver, but that's not only because of technical difficulties). Part of YouTube being bad is just hyper capitalism, sure, but the more complicated techniques like HLS/DASH pretty much entirely exist because hosting video is so expensive and "preventing your bandwidth bill from exploding" is really important. That's also why there's no real competition to YouTube; the metrics of hosting video only make sense if you have a Google amount of money and datacenters to throw at the problem, or don't care about your finances in the first place.
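The back-of-envelope numbers above check out (decimal units, as bandwidth is usually billed):

```python
# ~20 minutes of decently encoded 1080p video, served against a
# 20 TB monthly bandwidth allowance (figures from the comment above).
file_size_mb = 500
allowance_tb = 20
max_viewers = (allowance_tb * 1_000_000) // file_size_mb  # 1 TB = 1e6 MB
```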
Chrome desktop has just landed native HLS support for the video element, enabled by default, within the last month. (There may be a few issues still to be worked out, and I don't know what the rollout status is, but certainly by year end it will just work.) Presumably most downstream Chromium derivatives will pick this support up soon.
My understanding is that Chrome for Android has supported it for some time by way of delegating to android's native media support which included HLS.
Desktop and mobile Safari have had it enabled for a long time, and thus so has Chrome for iOS.
So this should eventually help things.
Any serious video distribution system would not use metered bandwidth. You're not using a VPS provider. You are colocating some servers in a datacenter and buying an unmetered 10 gigabit or 100 gigabit IP transit service.
Yes, I see YouTube going deep into enshittification. On my MacBook this morning with Firefox Developer Edition it just stopped working. I don't know if it's related to the fact that I tried to install an extension to "force H264" on my Ubuntu box. On the latter, the fans started to go crazy as soon as I opened a single YouTube tab lately, and a quick search led me there.
Actually, at this point the only things that make the good old aMule a bit inconvenient for my expectations are:
- it's missing snippet previews
- it doesn't have as many resources on every topic out there.
Well, the corporate policy in GOOG now is to only test everything on Chrome. Engineers are not even allowed to install Firefox. This is the result.
It’s not just you. My Firefox, with no extensions, has struggled on YouTube these past weeks.
Sometimes I can’t even click on the front page, sometimes when I open a video it refuses to play.
I don’t know what’s up, but it works in chrome.
I also had it stop working completely. I thought they finally wised up to my adblocker, but I decided to finally install that update I had been sitting on for a while and it just started working again
Probably just the typical nefarious activities of YouTube. Either "accidentally" driving users to switch browsers, or experimenting with circumventing ad blockers, or negligence in testing, or who knows what.
If they want the "Google has no browser monopoly!" claim, then they should be obligated to make their services work perfectly with the alternative, instead of subtly scheming and manipulating people.
One thing you can do is to use an invidious instance. Those don't support live streams and shorts, but at least you don't have to deal with the atrocious normal YouTube frontend.
That may also just be Firefox's way of telling you it has updated and needs to be restarted.
I've got a fresh install of endeavouros/arch and yt is horribly slow now. The upside is I've reduced my usage of the site.
Oh and it's not working at all on my desktop with the same setup, it's telling me to disable ad block. I'd rather give up yt.
Around 2012?, I had some extension that forced YouTube videos to play with Quicktime in-browser, which was leaner. Original file, no conversion.
YouTube should have been a distributed p2p system with local storage of your favorite videos. A man can dream...
Didn't work because of asymmetric upload/download speeds (which are now a thing of the past; however, it gave YouTube an early advantage).
Now largely more feasible. We should try again.
Guess why it was asymmetrical in the first place... Telcos wanted to sell the upload bandwidth to streaming companies. Another double-dipping telco monopoly squeeze and customer boxing-in / enshittification from very early on.
I remember when QuickTime came out in 1991 and it was obvious to everyone that video should be copied, pasted and saved like any arbitrary data.
I remember when VCR's came out and everyone would take TV shows and share them with their friends.
By now we should be able to share video on SD Cards that just pop into a slot on the top of the TV, but the electronics companies are now also the content companies, so they don't want to.
You can plug a USB drive with videos on into a lot of TVs I've encountered over the years. Due to limited container/codec support I rarely made use of it though.
Remember RealPlayer? Grainy 128 x 128 streamed videos in 1998!
Was RealPlayer really that horrible, or was it just trying to do streaming media on an extremely low-bandwidth connection without hardware acceleration and sophisticated codecs? I only really used it with a 28.8K modem, Netscape, and Windows 95. The experience was poor, but the experience viewing moderately sized images wasn't great either. I remember at the time encountering MPEG decoder add-in cards (that nobody used), although I suspect video cards started to add these features during the 1990s at some point.
I've gotten to experience using RealPlayer again this year[0] and... a lot of it was just it being really early, but a lot of it was the software being really bloated with adware and terrible design decisions. It asks for your home address when you install it, there are a bunch of ad panes you have to manually disable, etc.
[0] https://kalleboo.com/linked/realplayer2025.png
I never bothered trying to stream anything, but I do remember downloading 20mb episodes of Naruto in surprisingly good quality due to the .rmvb format.
The BBC here used to put a ton of news content on it, it was pretty forward thinking really!
Remember RealPlayer? Grainy 128 x 128 streamed videos in 1998!
I remember when someone slapped a big "Buffering" sign over the Real Networks logo on the company's building in Seattle.
A media business is predicated on exclusive rights over their media. The entire notion of media being freely copied and saved is contrary to their business models. I think there's a healthy debate to be had over whether those models are entitled to exist and how much harm to consumers is tolerable, but it's not really obvious how to create a business that deals in media without some kind of protection over the copying and distribution of that media.
I think what breaks computer people's brains a bit is the idea that the bytes flying around networks aren't just bytes; they represent information that society has granted individuals or businesses the right to control, and the fact that technology doesn't treat any bytes as special is a problem when society wants to regulate the rights over that information.
I have worked on computer systems for media organizations and they have a very different view of intellectual property than the average programmer or technologist. The people I find the most militant about protecting their rights are the small guys, because they can't afford to sue a pediatrician for an Elsa mural or something.