Comment by andrewstuart
5 hours ago
I found it unusable due to out-of-memory errors with a billion-row, 8-column dataset.
It needs manual tuning to avoid those errors and I couldn't find the right incantation, nor should I need to - memory management is the job of the db, not me. Far too flaky for any production usage.
That sounds like a rather serious application. Did you file an issue?
I filed many issues. They were auto-closed after 3 months of inactivity.
No, I tried ClickHouse instead, which worked without crashing or manual memory tuning.
Search the issues on the DuckDB GitHub: there are at least 110 open and closed OOM (out-of-memory) issues, and maybe 400 to 500 that reference "memory".
> Search the issues on the DuckDB GitHub: there are at least 110 open and closed OOM (out-of-memory) issues, and maybe 400 to 500 that reference "memory".
Ah, missed this the first time around. Will check this out. And yes, I noticed that DuckDB rather aggressively tries to use the resources of your computer.
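For anyone hitting the same wall: DuckDB does expose settings to cap its memory appetite and let it spill to disk instead of crashing. A sketch of the usual knobs (setting names are from DuckDB's configuration options; the values here are illustrative, not recommendations):

```sql
-- Cap DuckDB's memory usage (by default it will claim most of system RAM)
SET memory_limit = '8GB';

-- Directory where intermediate results spill to disk when the limit is hit
SET temp_directory = '/tmp/duckdb_spill';

-- Relaxing insertion-order preservation can reduce memory pressure on large loads
SET preserve_insertion_order = false;

-- Fewer worker threads can also lower peak memory at the cost of speed
SET threads = 4;
```

Whether these are enough for a billion-row workload is exactly the open question in this thread; they are the "incantation" the docs point to, not a guarantee.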
Understood: SQLite is to Postgres as DuckDB is to ClickHouse.