Comment by ADcorpo
3 years ago
This post is a funny coincidence: just today I tried to speed up a CI pipeline running ~10k tests with pytest by switching to pypy.
I am still working on it, but the main blocker so far is psycopg support: I had to install psycopg2cffi in my test environment, and since psycopg2cffi does not track the same features and versions as psycopg2, it will probably prevent me from using pypy for our test suite. That leaves two options. Either we switch prod to pypy, which won't happen because I am very new on this team and the others would see it as a big, risky change. Or we accept that the tests do not run on the exact same runtime as the production servers, which could let bugs go unnoticed and reach production, or cause failing tests that would otherwise pass in a live environment.
I think if I ever started a python project right now, I'd probably try to use pypy from the start, since (at least for web development) there don't seem to be any downsides to using it.
Anyways, thank you very much for your hard work!
If you use a recent version of PostgreSQL (10+, I believe) you can use psycopg3 [1], which has a pure-Python implementation that should be compatible with PyPy.
[1]: https://www.psycopg.org/psycopg3/docs/basic/install.html
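For anyone trying this, installation is the part that differs between runtimes. A rough sketch, assuming pip inside a PyPy virtualenv (note that the pure-Python build still needs the libpq client library available at runtime):

```shell
# Pure-Python psycopg 3: no C extension, so it should work on PyPy.
pip install psycopg

# On CPython you would more typically install the precompiled C implementation:
pip install "psycopg[binary]"
```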
Second this - no psycopg2 support (and, to a lesser extent, no lxml) is a nonstarter and makes it pretty difficult to experiment with on production code bases. Otherwise I could see a lot of adoption from Django deployments.
Yeah we don't use pypy for those exact reasons on our small django projects.
I work on pg8000 (https://pypi.org/project/pg8000/), a pure-Python PostgreSQL driver that works well with pypy. Not sure if it would meet all your requirements, but I thought I'd mention it.
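To give a sense of what switching looks like, here is a minimal sketch using pg8000's DB-API interface, which mirrors psycopg2's cursor style. The connection parameters are placeholders for illustration, and a running PostgreSQL server is assumed:

```python
# Sketch only: connection parameters are made up, and this needs a live server.
import pg8000.dbapi

conn = pg8000.dbapi.connect(
    user="app",
    password="secret",
    host="localhost",
    database="app_test",
)
cur = conn.cursor()
cur.execute("SELECT version()")
print(cur.fetchone()[0])
conn.close()
```

pg8000's DB-API layer uses the same `%s` paramstyle as psycopg2, so many existing call sites can stay unchanged.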
One compromise could be to run the suite on pypy for draft PRs and on CPython for approved PRs and master?
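A sketch of that idea as a GitHub Actions workflow, assuming the tests run with pytest; the interpreter versions and file names here are illustrative, not prescriptive:

```yaml
# Hypothetical workflow: fast pypy runs on draft PRs, full CPython runs
# once the PR is ready for review or the branch is merged to master.
name: tests
on:
  pull_request:
  push:
    branches: [master]

jobs:
  pytest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          # Draft PRs get the faster pypy run; everything else uses CPython.
          python-version: ${{ github.event.pull_request.draft == true && 'pypy3.10' || '3.11' }}
      - run: pip install -r requirements.txt
      - run: pytest
```

The trade-off noted upthread still applies: only the CPython runs exercise the same runtime as production, so the pypy runs serve as a fast early signal rather than the final gate.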