Comment by hendzen
8 years ago
I think this bug is kind of an indictment of Ragel. It has some great ideas, but the generated code is so low level, and allows arbitrary blocks of user code to run in the guts of the parser, that bugs like this one can turn into horrible memory-safety issues, particularly since the generated code is often used to parse untrusted input.
I don't blame Ragel.
Ragel shares part of the blame. Why did it use a strict equality check when it could have trivially done a >=?
Even a >= check would have been suboptimal. Rather than comparing the incremented pointer against pe with == (or even >=) and quietly jumping to _test_eof, they should have had the generated code check whether p has passed pe and abort if so, since having servers core dumping would have drawn attention to the bug in a way that counting one byte too many and then hitting _test_eof would not.
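To make that concrete, here is a minimal C sketch of the kind of check being argued for; it is not actual Ragel output (advance, p and pe are just illustrative names), only the shape of a bound check that fails loudly on an overshoot instead of silently treating it as end-of-input:

    #include <stdlib.h>

    /* Hypothetical sketch, not generated Ragel code: the shape of the check
     * being argued for. pe points one past the end of the input. An exact
     * == test (or even a >=) quietly treats an overshoot as end-of-input;
     * this version aborts instead, so a pointer that has run past the
     * buffer takes the process down rather than letting the parser keep
     * reading out of bounds. */
    static const char *advance(const char *p, const char *pe)
    {
        ++p;
        if (p > pe)
            abort();      /* overshoot: fail loudly and core dump */
        return p;         /* caller treats p == pe as normal end-of-input */
    }

Though, as the reply below notes, once p has gone past pe the comparison itself is already undefined behavior in C, so even a check like this isn't guaranteed to survive optimization.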
It's C. If you have an array, pointer comparisons are only defined up to one element past the end. Everything else is undefined behavior, so a compiler may just "optimize" your >= into an ==.
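A minimal sketch of that argument in C, with hypothetical names (scan, buf, pe) rather than anything Ragel actually emits:

    #include <stddef.h>

    /* Hypothetical sketch, not Ragel output. pe points one element past the
     * end of buf, which is the furthest a pointer derived from buf may
     * legally reach; going beyond that is undefined behavior. The compiler
     * may therefore assume p <= pe always holds, and under that assumption
     * the >= below is indistinguishable from ==. */
    static size_t scan(const char *buf, size_t len)
    {
        const char *p = buf;
        const char *pe = buf + len;
        size_t consumed = 0;

        while (p != pe) {
            ++p;              /* stepping beyond pe here would already be UB */
            ++consumed;
            if (p >= pe)      /* legally equivalent to p == pe */
                break;
        }
        return consumed;
    }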
That's a great defensive technique. But even when you do that, the underlying bug should still be fixed. I don't think the equality operator is the underlying bug.
Consecutive pointer increments without a bounds check in between sound like a bug to me. But I don't really know Ragel, and perhaps the compiler doesn't have enough information to determine that this is what's happening.
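Purely for illustration (hypothetical code, not anything Ragel is known to emit), the suspected pattern would look something like this:

    /* Hypothetical illustration, not actual Ragel output: two pointer
     * increments with no bounds check in between. If p enters at pe - 1 it
     * ends up at pe + 1 (forming that pointer is itself the undefined
     * behavior discussed above), so the exact-equality test below never
     * fires and the caller keeps reading past the end of the buffer. */
    static const char *step_twice(const char *p, const char *pe)
    {
        ++p;                  /* first increment, unchecked */
        ++p;                  /* second increment, still unchecked */
        if (p == pe)          /* an overshot p sails straight past this */
            return pe;        /* signal normal end-of-input */
        return p;
    }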
Speculation: people may want to use Ragel-generated code in C++, where exact equality checks against an end iterator are idiomatic. p may be an iterator rather than a raw pointer, and for many iterator types >= either isn't defined at all or is slower than ==.