
Comment by karel-3d

1 month ago

Well, these were "normal" crawlers that needed to work perfectly and as deterministically as possible, not probabilistically (AI); speed was no issue. And I wanted to be able to debug when something went wrong. So yeah, for me it was crucial to be able to record/screenshot.

So yeah, everything is a trade-off, and we needed a different trade-off; we actually decided not to use headless Chromium, because there are slight differences, so we ended up using full Chrome (not even Chromium, again: slight differences) with xvfb. It was very, very memory-hungry; but again, that was not an issue.
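For anyone curious what that setup looks like in practice, here is a minimal sketch of running full Chrome under a virtual display with `xvfb-run`. The flags, paths, and URL here are my own illustrative assumptions, not something from the original setup:

```shell
# Sketch only: launch full Chrome (not headless) inside a virtual X display.
# xvfb-run spins up an Xvfb server, sets DISPLAY, and runs the command in it.
# --auto-servernum picks a free display number so parallel runs don't collide.
xvfb-run --auto-servernum --server-args='-screen 0 1920x1080x24' \
  google-chrome \
    --no-first-run \
    --user-data-dir=/tmp/crawl-profile \
    --remote-debugging-port=9222 \
    'https://example.com'
```

Because Chrome is rendering to a real (virtual) display rather than running headless, you can also point a screen recorder or `import`/`ffmpeg` at the same `DISPLAY` for the record/screenshot debugging mentioned above.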

(I used "agent" as in "browser agent", not "AI agent"; I should be more precise, I guess.)