Comment by mrb
1 year ago
So that is "432 Mbit/s per laser, and 9000 lasers total". I don't know about you guys, but I find that statement much more relatable than "42 PB/day". Interestingly, they also say each laser "can sustain a 100Gbps connection per link" (although another part of the article even claims 200 Gbit/s). That means each laser is grossly underused on average, at 0.432% of its maximum capacity. Which makes sense, since 100 Gbit/s is probably achievable only in ideal situations (e.g. two satellites very close to each other), so these laser links are used in bursts and the link stays established only for a few tens of seconds or minutes, until the satellites move away and are no longer within line of sight of each other.
And with 2.3M customers, that's an average of 1.7 Mbit/s per customer, or 550 GB per customer per month, which is kinda high. The average American internet user probably consumes less than 100 GB/month. (HN readers are probably outliers; I consume about 1 TB/month.)
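For anyone who wants to check the arithmetic, here it is in Python (the 42 PB/day, 9000 lasers, 100 Gbit/s per link, and 2.3M customers are the article's figures; the rest is just unit conversion):

    # Back-of-envelope from the article's numbers
    PB = 1e15                                  # petabyte in bytes
    total_bps = 42 * PB * 8 / 86400            # 42 PB/day as average bits/s
    per_laser_bps = total_bps / 9000
    print(f"per laser: {per_laser_bps / 1e6:.0f} Mbit/s")       # ~432
    print(f"utilization: {per_laser_bps / 100e9:.3%}")          # ~0.432%
    per_cust_bps = total_bps / 2.3e6
    print(f"per customer: {per_cust_bps / 1e6:.1f} Mbit/s")     # ~1.7
    monthly_GB = per_cust_bps * 86400 * 30 / 8 / 1e9
    print(f"per customer per month: {monthly_GB:.0f} GB")       # ~550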
> these laser links are used in bursts and the link stays established only for a few tens of seconds or minutes, until the satellites move away
The way Starlink satellites are arranged in orbit, the same satellites will remain "ahead of" and "behind" a given satellite in its orbital plane. Those laser links (specifically!) will remain relatively persistent. FYI, this arrangement is similar to Iridium's.
FTA: "in some cases, the links can also be maintained for weeks at a time"
FTA: "in some cases, the links can also be maintained for weeks at a time"
I think there is a lot of variance. The article also states 266,141 "laser acquisitions" per day, which, if every laser link stayed up for the exact same amount of time, with 9000 lasers, means the average link remains established for a little less than an hour: 9000 (lasers) / 266141 (daily acquisitions) * 24 * 60 ≈ 49 minutes.
So some links may stay established for weeks, but some only for a few minutes?
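The same calculation in Python, for anyone who wants to tweak the numbers (the uniform-duration assumption is the weak point; the real distribution is presumably long-tailed, as the replies below suggest):

    # Average link duration if every link lasted equally long
    lasers = 9000
    acquisitions_per_day = 266141   # figure from the article
    avg_minutes = lasers / acquisitions_per_day * 24 * 60
    print(f"average link duration: {avg_minutes:.0f} minutes")  # ~49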
I would guess that the links between satellites in the same orbit stay up for weeks, but the ones that cross between orbits have to be constantly re-established.
Partially! There are also ascending and descending satellites meeting. Ascending and descending here doesn't refer to altitude but to direction in a "2D view" sense (roughly, north-going vs. south-going passes). See https://www.heavens-above.com/StarLink.aspx
Thanks, this is an important point. I missed the fact that Starlink's orbital planes actually cover the full 360° of RAAN[0], not just 180° like Iridium did (presumably to minimize the number of satellites).
So actually this Iridium-type "seam" disappears, meaning that every satellite should always have co-orbiting "neighbors" on both sides. Cool!
[0] https://en.wikipedia.org/wiki/Right_ascension_of_the_ascendi...
Most customers aren't served by lasers; their data goes up to the satellite and down to the nearest gateway. Lasers serve customers out of range of a downlink gateway, and that traffic probably travels the minimum number of hops needed to reach one.
But with lasers, it makes sense to route your packets via space. For example, traffic to a different continent would be faster (and cheaper) through space. Furthermore, I assume lasers have more capacity than gateways, so one satellite's capacity could be increased by bundling it, via its neighbors, with more gateways.
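A rough sketch of why space can win over long distances: light in fiber travels at roughly 2/3 of c, while laser links run at full vacuum speed. A crude one-way estimate for a transatlantic path (all the distances and speeds below are my assumptions, and real fiber routes are longer than great-circle, which favors space even more):

    # One-way latency, New York to London (assumed: ~5570 km great-circle,
    # ~204,000 km/s in fiber, ~300,000 km/s in vacuum, 550 km orbit altitude)
    ground_km = 5570
    fiber_ms = ground_km / 204_000 * 1000          # idealized straight fiber
    space_km = 2 * 550 + ground_km                 # up, across, down (crude)
    laser_ms = space_km / 300_000 * 1000
    print(f"fiber: {fiber_ms:.1f} ms, via space: {laser_ms:.1f} ms")  # ~27 vs ~22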
Unfortunately, the routing to make this feasible doesn't exist yet. Users need a single IP address from a range that's homed at a single PoP. Starlink doesn't support user-to-user connections through the mesh: you need to go all the way out to your PoP, then over to the other user's PoP, then back through Starlink to that user.
I thought that Starlink always "landed" at a base station back in the same jurisdiction? I think relaying through space could open a regulatory can of worms.
Netflix uses 3-7 GB an hour. The average person spends 4-5 hours a day watching TV. I'd say most are above 100 GB/month.
But that’s me.
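Taking those figures at face value (they're rough, and not everyone streams at Netflix bitrates), the monthly range works out well above 100 GB:

    # Range check: 3-7 GB/hour and 4-5 hours/day, from the comment above
    for gb_per_hour, hours in [(3, 4), (7, 5)]:
        print(f"{gb_per_hour} GB/h x {hours} h/day = "
              f"{gb_per_hour * hours * 30} GB/month")
    # low end:  360 GB/month
    # high end: 1050 GB/month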
Who has 4-5 hrs a day to watch television? ...or am I completely out of touch?
According to historical Nielsen data[1] from 1991 to 2009: most Americans.
Even back in 1950, the per-household figure was above 4 hours.
[1] https://www.nielsen.com/insights/2009/average-tv-viewing-for...
_actively_ watch? Probably not many. Having it on as background noise, however? 5 hours is pretty easy.
watch, or leave running as background noise …
Children, sadly.
Families sharing an internet connection. Kids watch 1 or 2 hours each, mom and dad another hour each.
Yep, but that data originates from the provider's network and never leaves the provider's network, so they probably don't count it towards your usage the same way.
I don't think that breaks net neutrality either, which the FCC seems to be reimplementing
Edit: see https://openconnect.netflix.com/en/
All my data usage is over LTE and NR. On one line it mostly gets used for streaming video (YouTube, Plex, Twitch) and averages around 500 GB/mo. I rent a line to a friend and he's doing over 10 TB/mo on mostly machine-learning stuff and astronomy data.
T-Mobile absolutely counts all data used over the network. My voice lines go QCI 9 (they are normally QCI 6) when over 50 GB of any kind of data usage in a month; the home internet lines are always QCI 9. I don't have congestion in my area, so it does not affect my speeds. This is QoS prioritization that happens at the physical-sector level on the tower(s).
They absolutely count it the same way. Comcast just gives me a number for bytes used, with a limit of (IIRC) 1.2 TB above which they start metering. Our family of four dances around hitting that basically every month. The biggest consumer actually isn't video; it's my teenage gamer's propensity for huge game downloads (also giant mod packs that then break his setup and force a reinstall of the original content).
I think a few hundred GB for a typical cord-cut household is about right.
This obviously has no relevance for Starlink, which does not have local datacenters for CDN purposes. All that bandwidth goes through the satellites right before it reaches the user.
Do you have a source on the 4-5 hrs?
https://www.statista.com/statistics/420791/daily-video-conte...
300+ minutes a day for TV + VOD (streaming services). Since hardly anyone actually watches traditional TV anymore, I summed them.
Yeah, 1 TB seems average for anyone in IT who is really into data.
I'm kinda pissed there is no local ISP competition in my area... and I've tried reaching out to companies with little success, or they say "we're expanding to your area soon" but will not say when.
10 Gbit/s symmetric fiber isn't hard. Hell, I'd use more bandwidth if I could, but I'm stuck with no fiber atm.
Data might get counted multiple times as it takes many laser hops to reach its destination.
Good point.
I'd have guessed they count "delivered bytes", not "transmitted bytes", and then you need to take into account each leg of the transfer. For Starlink that's at least two legs (for the simple bent-pipe situation) and potentially up to something like ?20? (for a "halfway around the globe, entirely over Starlink" connection). The latter is probably statistically negligible, but even the factor of two would only give ~0.9% utilization. And taking into account that at least 2/3 of the orbit time is spent out of reach of anywhere useful, that works out to something like 1 in 40 possible bytes being transmitted. Which is much better than I'd have guessed if asked blindly.
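Spelled out in Python (assumptions: the 42 PB/day counts delivered bytes, each delivered byte crosses ~2 laser legs, and a satellite is in reach of useful traffic for ~1/3 of its orbit):

    # Redoing the utilization estimate with double-counted legs
    base_utilization = 432e6 / 100e9          # 0.432%, from upthread
    transmitted = base_utilization * 2        # two laser legs per delivered byte
    print(f"laser utilization incl. hops: {transmitted:.2%}")   # ~0.86%
    in_reach = transmitted / (1 / 3)          # only count the useful third
    print(f"utilization while in reach: {in_reach:.2%}")        # ~2.6%, ~1 in 40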
I think the average Instagram or TikTok user must be using more than 100GB/month. And if you count YouTube and Netflix, it's probably more than that.
Is resolution going to peak? Like speeding on a highway, are there diminishing returns? On the other hand, bandwidth availability seems to also drive demand...
Two things:
Resolution is always determined by angular resolution at viewing distance, even for analog TVs (they were smaller and viewed from further away). And also,
Video on the Internet is always heavily compressed: the "resolution" is just the output size passed to the decoder (and the inverse of the smallest pattern size recorded within), technically unrelated to data size. Raw video is h * v * bpp * fps and has always been on the order of a few to a dozen Gbps.
Just my bet: the bandwidth may peak or plateau, but resolution could continue to grow as needed, e.g. for digital signage video walls that wrap around buildings.
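To put numbers on "heavily compressed" (my assumptions: 4K at 60 fps, 8-bit 4:2:0 chroma subsampling so 12 bits/pixel, versus a typical ~20 Mbit/s streaming bitrate):

    # Raw vs. delivered bitrate for a 4K stream
    w, h, fps, bpp = 3840, 2160, 60, 12
    raw_bps = w * h * bpp * fps
    print(f"raw: {raw_bps / 1e9:.1f} Gbit/s")                        # ~6.0
    print(f"vs a 20 Mbit/s stream: {raw_bps / 20e6:.0f}x compression")  # ~299x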
> Is resolution going to peak?
Not for a while. Apple Vision / Oculus will stream (4K/8K) 3D movies.
https://developer.apple.com/streaming/examples/
Sure, but "4k" is still being used as a differentiator for streaming companies in how much they charge. Even then they serve up some pretty compressed streams where there's room to do less of that for a noticeable notch in quality.
There's of course a limit. The "native" bitrate equivalent of your retina isn't infinite.
Next step, though, is going to be light-field displays (each "pixel" is actually a tiny display with a lens that produces "real" 3D images). I assume that will be a thing; we shall see if it does better than the last generation of 3D TVs/movies/etc. That's a big bump in bitrate.
There's also bitrate for things like game/general-computing screen streaming, where you need lots of overhead to make the latency work; you can't buffer several seconds of that.
The next gen sci-fi of more integrated sensory experiences is certainly going to be a thing eventually too. Who knows how much information that will need.
When more bandwidth becomes available, new things become possible, sometimes things that are hard to imagine before somebody gets bored and tries to figure them out.
When I'm futzing around with ML models, I'm loading tens of gigabytes from disk into memory. Eventually something like that, and things orders of magnitude larger, will probably be streamed over the network like nothing. PCIe 4.0 x16 is, what, 32 GB/s? Why not that over a network link for every device in the house in 10 years?
> Is resolution going to peak?
It should. At some point you are beyond any difference a human eye can detect on a TV or monitor you're sitting less than 10 ft away from.
It probably won't, though, because capitalism means there has to be a reason to sell you a new widget, and 3D was an utter failure.
This is being downvoted but it's probably about right.
My smart TV used 483 GB in the last 30 days
There is one key issue: keeping the lasers aligned for long durations between satellites, and even between a satellite and a ground station. There are vibrations in satellites, and even a tiny bit of vibration translates to beam misalignment. I'm not an expert though. That could explain the bursts.
So it's hard to sustain the theoretical 100 Gbps connection for hours, let alone days, across two endpoints that are in constant motion.
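To give a feel for how unforgiving the pointing problem is: at inter-satellite distances, tiny angular errors become large displacements (the 1000 km link distance and jitter values below are purely illustrative assumptions):

    # Beam displacement from pointing jitter at a given link distance
    import math
    range_km = 1000                       # assumed inter-satellite distance
    for jitter_urad in (1, 10, 100):
        offset_m = math.tan(jitter_urad * 1e-6) * range_km * 1000
        print(f"{jitter_urad:>3} urad of jitter -> beam off by {offset_m:.1f} m")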
> That means each laser is grossly underused on average, at 0.432% of its maximum capacity. Which makes sense since 100 Gbit/s is probably achievable in ideal situations (eg. 2 satellites very close to each other), so these laser links are used in bursts and the link stays established only for a few tens of seconds or minutes, until the satellites move away and no longer are within line of sight of each other.
I think I agree that each laser is grossly underused on average, but if you read the article, there are quotes about the uptime of these links. They're definitely not just "used in bursts [of] a few tens of seconds or minutes".
> That means each laser is grossly underused on average, at 0.432% of its maximum capacity.
Don't forget that every communication protocol has fixed and variable overhead.
The first is a function of the packet structure. It can be calculated by simply dividing the payload capacity of a packet by the total number of bits transmitted for that same packet.
Variable overhead is more complex. It has to do with transactions, negotiations, retries, etc.
For example, while the theoretical overhead of TCP/IP is on the order of 5%, actual overhead could be as high as 20% under certain circumstances. In other words, 20% of the bits transmitted are not data payload but rather the cost of doing business.
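For the fixed part, a minimal sketch for a full-size TCP segment over Ethernet (ignoring TCP options, ACK traffic, retries, and framing below layer 2; the laser links presumably have their own framing on top of this):

    # Fixed per-packet overhead for a full-size TCP segment over Ethernet
    mtu = 1500                   # Ethernet payload size
    ip_hdr, tcp_hdr = 20, 20     # headers without options
    eth = 14 + 4                 # Ethernet header + FCS
    payload = mtu - ip_hdr - tcp_hdr
    efficiency = payload / (mtu + eth)
    print(f"payload efficiency: {efficiency:.1%}")  # ~96.2%, i.e. ~4% overhead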
The first slide says "9000+", suggesting that the number of space lasers is slightly over 9000. I feel like that's an important distinction.
Most likely it's a reference to the "it's over 9000!" meme.
Starlink's big investor and launch customer was the US Air Force. The DoD had long complained about the lack of fast sat comms; it's also why they effectively own Iridium.
So in addition to households, add foreign bases and possibly drone command networks to the possible sources of traffic that warrant sat-to-sat connections.
Launch customer, yes. Investor, no; that's Google and Fidelity. Paying above the regular rate? Definitely.
USAF fronted a significant share (40%? Those kinds of numbers were floating around) of initial operational capability costs.
My parents moved in and, being old, stream TV all day (instead of cable) and end up using about 40 GB per day at 1080p. We keep hitting our cap of 1.2 TB set by our cable company (because there are others in the home!).
I should probably see if my router can bandwidth-limit their MAC addresses...
> And with 2.3M customers, that's an average 1.7 Mbit/s per customer, or 550 GB per customer per month, which is kinda high. The average American internet user probably consumes less than 100 GB/month.
Dead internet theory (alive and well!)
The average household probably watches significantly more TV than HN users. That is almost all streamed: something like 6 hours per day times multiple TVs.
1 TB feels reasonable to push that much video.
There's probably double-counting in the links. In other words, A sends a MB to B, which sends it to C; that's 1 MB of information delivered to the customer but 2 MB of laser transmission.
I'm seeing about 6 Mbps per customer during peak hour on my own network, so 1.7 Mbps averaged over a longer period sounds like it's in the right ballpark.
"Customer" may refer to households, not individuals, in which case it could be numerous internet users soaking up data per customer.
Even if they seem grossly underused, there are probably plenty of other non-ideal constraints, power usage for instance.
Thermal management is also a tremendous problem in space. All the power a satellite dissipates must be radiated away, and satellites effectively sit inside a vacuum insulator.
I'd be interested in what the sustained power/thermal budget of the satellites is.
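A Stefan-Boltzmann back-of-envelope for the thermal side (all numbers here are my assumptions: a few kW dissipated, emissivity 0.9, radiator at ~300 K; real designs also have to reject absorbed sunlight, which makes the problem harder):

    # Radiator area needed to reject a given power purely by radiation
    sigma = 5.67e-8                           # W / (m^2 K^4)
    emissivity, T = 0.9, 300.0
    flux = emissivity * sigma * T**4          # ~413 W/m^2 at 300 K
    for power_w in (1000, 3000, 5000):
        print(f"{power_w} W needs ~{power_w / flux:.1f} m^2 of radiator")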
Where did you get that 100 GB/mo number from? 4K streaming eats up data transfer quickly. Comcrap & friends knew what they were doing when they set arbitrary data caps that sounded like a big number at the time. Wireline data caps should be illegal.
If you stream, you use a lot more than 100 GB/month. I use around 1 TB with a family of 3.
I think even more relatable is how many customers they could handle at, say, 200 Mbps.
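Laser throughput alone gives one way to frame it (a crude bound only: per the thread above, most customer traffic never touches a laser, and the 200 Mbit/s per customer is this comment's hypothetical, not a Starlink spec):

    # Customers at a hypothetical 200 Mbit/s each, from upthread figures
    avg_total_bps = 3.89e12              # ~42 PB/day, from the article
    peak_total_bps = 9000 * 100e9        # every laser at full rate (never happens)
    per_cust = 200e6
    print(f"at average laser throughput: {avg_total_bps / per_cust:,.0f} customers")
    print(f"at theoretical laser capacity: {peak_total_bps / per_cust:,.0f} customers")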