
Comment by bluGill

3 days ago

You are only able to do that because you are doing simple processing on each transaction. If you had to do more complex processing on each transaction, it wouldn't be possible to handle that many. Though it is hard for me to imagine what more complex processing would look like (I'm not in your domain).

The order matching engine is mostly about updating an in-memory order book representation.
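To make the "simple processing" point concrete, here is a toy sketch of price-time matching against an in-memory book. All names and numbers are illustrative, not any real engine; a production matcher would also track order IDs, sides, and time priority per price level.

```python
import heapq

class OrderBook:
    """Minimal one-sided book: resting sell orders (asks) in a min-heap,
    so the best (lowest) ask is always at the top."""

    def __init__(self):
        self.asks = []  # heap of (price, qty)

    def add_ask(self, price, qty):
        heapq.heappush(self.asks, (price, qty))

    def match_buy(self, limit_price, qty):
        """Fill an incoming buy against resting asks priced at or
        below limit_price; return total quantity filled."""
        filled = 0
        while qty > 0 and self.asks and self.asks[0][0] <= limit_price:
            price, avail = heapq.heappop(self.asks)
            take = min(qty, avail)
            filled += take
            qty -= take
            if avail > take:  # partial fill: remainder stays on the book
                heapq.heappush(self.asks, (price, avail - take))
        return filled

book = OrderBook()
book.add_ask(101.0, 5)
book.add_ask(100.0, 3)
filled = book.match_buy(limit_price=101.0, qty=6)
print(filled)  # 6: takes 3 @ 100.0, then 3 of the 5 @ 101.0
```

Each match is a couple of heap operations on data already in cache, which is why these engines can sustain such high transaction rates.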

It is rarely the case that high-volume transaction-processing facilities also need to deal with deeply complex transactions.

I can't think of many business domains in which each transaction is so compute-intensive that waiting for I/O doesn't typically dominate.

  • HFT would love to do more complex calculations for some of their trades. They often make the compromise of using a faster algorithm that is known to be right only 60% of the time vs the better but slower algorithm that is right 90% of the time.

    That is a different problem from yours though and so it has different considerations. In some areas I/O dominates, in some it does not.

• In a perfect world, maximizing (EV/op) × (ops/sec) would be the goal even for user-facing software. How many person-years of productivity are lost each year to people waiting for Windows or Office to start up, finish updating, etc.?
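The trade-off a few comments up can be put in numbers. A sketch with made-up figures (the 60%/90% accuracies come from the comment above; throughputs and payoffs are my assumptions): a less accurate algorithm can still win on (EV/op) × (ops/sec) if it runs enough faster.

```python
def ev_rate(p_correct, gain, loss, ops_per_sec):
    """Expected value per operation times throughput:
    (p*gain - (1-p)*loss) * ops/sec."""
    return (p_correct * gain - (1 - p_correct) * loss) * ops_per_sec

# Fast-but-sloppy: right 60% of the time, 1,000,000 ops/sec.
fast = ev_rate(0.60, gain=1.0, loss=1.0, ops_per_sec=1_000_000)
# Slow-but-accurate: right 90% of the time, 100,000 ops/sec.
slow = ev_rate(0.90, gain=1.0, loss=1.0, ops_per_sec=100_000)

print(round(fast), round(slow))  # 200000 80000 -> the fast one wins
```

With these (hypothetical) payoffs, the 60%-accurate algorithm produces 2.5× the expected value per second, which is the compromise the HFT comment describes.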

  • I work in card payments transaction processing and I/O dominates. You need big models and lots of data to authorize a transaction. And you need that data as fresh as possible and as close to your compute as possible... but you're always dominated by I/O. Computing the authorization is super cheap.

    It tends to scale vertically rather than horizontally. Give me massive caches and wide registers and I can keep them full. For now, though, a lot of this stuff runs on commodity cloud hardware so... eh.