Comment by Dylan16807
3 days ago
Small sizes have to be used with extra care, so I wouldn't want to make a generic function for all sizes. For bigger sizes we already have nice functions that take care of everything.
The article lays out exactly why you'd want small sizes, even with the risks. The "good" qualifier just means it'd have to be no riskier than any other algorithm at the same length.
I agree? That doesn't affect what I said. You shouldn't make a one-size-fits-all function that scales down that small. It should have to be a deliberate choice to switch from normal mode to small mode, and anyone who hasn't looked into it more deeply shouldn't even know about the small mode.
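For example (purely illustrative sketch, not any real library's API; "small mode" here is just a truncated SHA-256):

    import hashlib

    def digest(data: bytes) -> bytes:
        """Normal mode: full 256-bit digest. The only function most users should see."""
        return hashlib.sha256(data).digest()

    def truncated_digest(data: bytes, nbytes: int) -> bytes:
        """Small mode: an explicit opt-in, for people who have done the collision math."""
        if not 1 <= nbytes < 32:
            raise ValueError("use digest() for full-size output")
        return hashlib.sha256(data).digest()[:nbytes]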
I suppose I don't understand your point. On one hand, you can have different algorithms for each output size (32, 64, etc.), with potentially different pitfalls and usage requirements. On the other, you can have one algorithm that implements all of them. I wasn't trying to comment on how that should be exposed in the library (because crypto lib design is a whole 'nother topic), but I'm not opposed to it being explicit.
Same as CRCs, really. You can easily write a function that computes CRCs of any width and expose the different parameterizations as CRC-8/16/32/64, etc.
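Roughly like this (untested sketch; the parameter model is the usual Rocksoft-style one, and the CRC-8/CRC-32 parameter sets below are the standard published ones rather than anything from the article):

    def crc(data, *, width, poly, init, refin, refout, xorout):
        """Bit-at-a-time CRC of any width, Rocksoft-style parameter model."""
        mask = (1 << width) - 1
        reg = init & mask
        for byte in data:
            # Reflected-input CRCs consume each byte LSB-first.
            bit_order = range(8) if refin else range(7, -1, -1)
            for i in bit_order:
                feedback = ((reg >> (width - 1)) ^ (byte >> i)) & 1
                reg = (reg << 1) & mask
                if feedback:
                    reg ^= poly
        if refout:
            reg = int(format(reg, f"0{width}b")[::-1], 2)  # reflect the register
        return (reg ^ xorout) & mask

    # Familiar names are just fixed parameterizations:
    def crc8(data):   # CRC-8 (SMBus); check value for b"123456789" is 0xF4
        return crc(data, width=8, poly=0x07, init=0x00,
                   refin=False, refout=False, xorout=0x00)

    def crc32(data):  # CRC-32 (ISO-HDLC, same as zlib); check value is 0xCBF43926
        return crc(data, width=32, poly=0x04C11DB7, init=0xFFFFFFFF,
                   refin=True, refout=True, xorout=0xFFFFFFFF)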