
Comment by GTP

7 months ago

But not putting limits in place leaves the door open to a different class of outages in the form of buffer overflows, which can additionally pose a security risk, since they could be exploitable by an attacker. Maybe this issue would be better solved at the protocol level, but in the meantime a size limit it is.

Nah, just OOM. Yes, there does need to be a limit somewhere - it just doesn't need to be arbitrary; it can be based on some processing limit, and ideally it should adapt as, say, the memory footprint grows.
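
A minimal sketch of what such an adaptive limit could look like, assuming a hypothetical Go message handler (the budget, divisor, floor, and ceiling values are purely illustrative, not anything from this thread):

```go
// Sketch: derive the per-message size limit from the memory headroom the
// process currently has, instead of hard-coding an arbitrary number.
package main

import (
	"fmt"
	"runtime"
)

// maxMessageSize returns an adaptive per-message limit: a fraction of the heap
// budget still available to the process, clamped between a floor and a ceiling.
// memBudget is the total heap the process is allowed to use (an assumption here).
func maxMessageSize(memBudget uint64) uint64 {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)

	var available uint64
	if m.HeapAlloc < memBudget {
		available = memBudget - m.HeapAlloc
	}

	limit := available / 64  // let one message use at most 1/64 of the headroom
	const floor = 64 << 10   // never reject messages under 64 KiB
	const ceiling = 16 << 20 // never accept messages over 16 MiB
	if limit < floor {
		return floor
	}
	if limit > ceiling {
		return ceiling
	}
	return limit
}

func main() {
	const memBudget = 1 << 30 // assume the process gets roughly 1 GiB of heap
	fmt.Printf("current message size limit: %d bytes\n", maxMessageSize(memBudget))
}
```

The point of the clamp is that the limit still exists (so no unbounded buffering), but as the memory footprint grows the acceptable message size shrinks, rather than failing only once an arbitrary fixed threshold or the OOM killer is hit.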