Comment by harry8

3 months ago

> Was literally just transcoding some video, playing a podcast, and browsing the web.

Yeah that's the perfect use case for current system design. Nobody sane wants to turn that case into an embedded system running a single process with hard deadline guarantees. Your laptop may not be ideal for controlling a couple of tonnes of steel at high speed, for example. Start thinking about how you would design for that and you'll see the point (whether you want to agree or not).

Apologies, almost missed that you had commented here.

I confess I assumed controllers for a couple of tonnes of steel at high speed would not use the same system design as a higher-level computer does? In particular, I would not expect most embedded applications to use virtual memory? Is that no longer the case?

  • "Hard Real Time" is the magic phrase to go as deep as you want to.

    • This isn't really answering my question. Have they started using virtual memory in hard real time applications? A general search for the term suggests the two are still seen as incompatible.