Comment by garciasn
1 year ago
Netflix uses 3-7GB an hour. The average person is spending 4-5hrs a day watching TV. I’d say most are above 100GB/month.
But that’s me.
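Running the numbers as a quick sketch (using the 3-7 GB/hr and 4-5 hrs/day figures from the comment above, which are rough estimates, not measurements):

```python
# Back-of-envelope monthly streaming usage, using the thread's own figures
gb_per_hour = (3, 7)      # Netflix per-hour range claimed above (roughly HD vs 4K)
hours_per_day = (4, 5)    # claimed average daily viewing time
days = 30

low = gb_per_hour[0] * hours_per_day[0] * days    # lowest-end estimate
high = gb_per_hour[1] * hours_per_day[1] * days   # highest-end estimate
print(f"{low}-{high} GB/month")
```

Even the low end of that range is well above 100 GB/month.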
Who has 4-5 hrs a day to watch television? ..or am I completely out of touch?
According to historical Nielsen data[1] from 1991 to 2009: most Americans.
Even back to 1950, for per household data, it was above 4 hours.
[1] https://www.nielsen.com/insights/2009/average-tv-viewing-for...
2022 data from the BLS: https://www.bls.gov/news.release/atus.t11A.htm
Men spent 3 hours a day watching TV, and women 2.5 hours. But TV time is lower (around 2 hrs/day) from ages 20-44, then increases again after 45 and peaks at 75 years old at nearly 5 hours a day.
Households without kids watch more TV, which surprised me.
1 reply →
That's per household, not per person, which is a different measure. And households have also tended to get smaller.
1 reply →
_actively_ watch? Probably not many. Having it on as background noise however? 5 hours is pretty easy
Is that still a thing with young people? I associate leaving the TV on in the background as an older generation thing.
15 replies →
watch, or leave running as background noise …
Any recommendations for shows that make good background noise? I wish they had more concerts.
3 replies →
Shouldn't audio (radio) suffice for that?
1 reply →
Children, sadly.
Families sharing an internet connection. Kids watch 1 or 2 hours each, mom and dad another hour each.
Yep, but that data originates on the provider's network and never leaves it, so they probably don't count it toward your usage the same way.
I don't think that breaks net neutrality either, which the FCC seems to be reimplementing
Edit: see https://openconnect.netflix.com/en/
All my data usage is over LTE and NR. On one line it mostly gets used for streaming video (YouTube, Plex, Twitch) and averages around 500GB/mo. I rent a line to a friend and he's doing over 10TB/mo on mostly machine learning stuff and astronomy data.
T-Mobile absolutely counts all data used over the network, my voice lines go QCI 9 (they are normally QCI 6) when over 50GB of any kind of data usage each month, the home internet lines are always QCI 9. I don't have congestion in my area so it does not affect my speeds. This is QoS prioritization that happens at physical sector level on the tower(s).
They absolutely count it the same way. Comcast just gives me a number for bytes used, with a limit of (IIRC) 1.2TB, above which they start metering. Our family of four dances around hitting that basically every month. The biggest consumer actually isn't video; it's my teenage gamer's propensity for huge game downloads (also giant mod packs that then break his setup and force a reinstall of the original content).
I think a few hundred GB for a typical cord-cut household is about right.
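A rough sketch of how a household like that lands near a 1.2TB cap (the per-person hours, per-hour rate, and download sizes here are assumptions for illustration, not measured values):

```python
# Hypothetical cord-cut household of four vs. a 1.2 TB (1200 GB) monthly cap
people = 4
hours_per_day = 2        # assumed per-person streaming time
gb_per_hour = 3          # assumed HD streaming rate
days = 30

streaming_gb = people * hours_per_day * gb_per_hour * days
game_downloads_gb = 3 * 100   # assume a few ~100 GB game/mod downloads a month

total = streaming_gb + game_downloads_gb
print(f"{total} GB used vs 1200 GB cap")
```

With those assumptions the household sits just under the cap, which matches the "dances around hitting that every month" experience above.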
This obviously has no relevance for Starlink, which does not have local datacenters for CDN purposes. All that bandwidth goes through the satellites right before it reaches the user.
I wouldn't be surprised if Starlink at least experiments with making the satellites a big bunch of CDN nodes.
Imagine they put 10TB of flash memory on the satellites and run virtual machines for the big CDN companies (cloudflare, Google, Netflix etc).
I reckon that 10TB is still big enough to service a good little chunk of internet traffic.
2 replies →
Do you have a source on the 4-5 hrs?
https://www.statista.com/statistics/420791/daily-video-conte...
300+ minutes a day for TV + VOD (streaming services). Since no one actually watches TV through traditional broadcast anymore, I summed the two categories.