Comment by kuon

2 months ago

Wait what? 40Mbps for a remote desktop? Even 10Mbps is insane. I remember deploying Sun Rays over dialup and the image wasn't that bad. Yes, it was low resolution, and I think it was UDP, but the desktop was usable with surprisingly low latency.

To monitor an AI you can lower the bit depth considerably and not lose much detail about what is happening. If you control the web renderer, disable text anti-aliasing, and there might be other optimizations that can help. Tile & diff the image... But video encoders already do that, so it might just work out of the box.
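The tile & diff idea can be sketched in a few lines. This is a toy illustration, not a real codec: the tile size, frame representation (lists of pixel rows), and function names are all made up for the example.

```python
# Toy sketch of tile-based diffing: split each frame into tiles and
# only resend the tiles that changed since the previous frame.
# Frames are lists of rows; tile size is arbitrary for illustration.

def tiles(frame, tile_w, tile_h):
    """Yield (x, y, tile) for every tile in a frame."""
    h, w = len(frame), len(frame[0])
    for y in range(0, h, tile_h):
        for x in range(0, w, tile_w):
            tile = tuple(tuple(row[x:x + tile_w]) for row in frame[y:y + tile_h])
            yield x, y, tile

def diff_tiles(prev, curr, tile_w=8, tile_h=8):
    """Return only the tiles that differ between two frames."""
    prev_tiles = {(x, y): t for x, y, t in tiles(prev, tile_w, tile_h)}
    return [(x, y, t) for x, y, t in tiles(curr, tile_w, tile_h)
            if prev_tiles.get((x, y)) != t]

# A 16x16 frame with one pixel changed: only 1 of the 4 tiles is resent.
a = [[0] * 16 for _ in range(16)]
b = [row[:] for row in a]
b[3][12] = 255
print(len(diff_tiles(a, b)))  # → 1
```

Real encoders (H.264's macroblocks, VNC's rectangle updates) do a far smarter version of this, with motion compensation on top, which is why they tend to win out of the box.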

Also, if a single H.264 frame is larger than a JPEG then you are doing something wrong; JPEG is a very poor encoding compared to what we have today.

Look at how other remote desktop protocols do it: VNC, RDP...

Managing streams over corporate networks is well documented; many web frameworks include a "longpoll" fallback (or SSE) so streaming plays nice even without WebSockets. "Discovering" you cannot deploy whatever you want to an enterprise network is quite alarming.

I really don't want to be the graybeard saying "young engineers are bad", as I am more on the side of believing in the new generations, but please, don't act like computers spawned into existence in 2020 and nothing has been done before.