Comment by binsquare

2 months ago

Great idea - took a bit to figure out how to implement this.

I came up with a plausibility check based on the model's memory requirements: https://github.com/BinSquare/inferbench/blob/main/src/lib/pl...

So now the submission page shows a warning plus an automated flag for volunteers to double-check:

```
This configuration seems unlikely

Model requires ~906GB VRAM but only 32GB available (28.3x over). This likely requires significant CPU offload which would severely impact performance.

You can still submit, but your result will be flagged for review.
```
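For reference, the core of such a plausibility check can be sketched roughly like this (illustrative only; names, the overhead factor, and the flag threshold are my assumptions, not the actual inferbench code):

```typescript
// Rough VRAM plausibility check: estimate the model's memory footprint
// and compare it against the hardware's available VRAM.
// All names and constants below are illustrative assumptions.

interface PlausibilityResult {
  requiredGb: number; // estimated VRAM needed, in GB
  ratio: number;      // required / available
  flagged: boolean;   // true when the config looks implausible
}

function checkPlausibility(
  paramsBillion: number,    // model size in billions of parameters
  bytesPerParam: number,    // e.g. 2 for fp16, ~0.55 for 4-bit quant
  availableVramGb: number,  // reported GPU memory
  overheadFactor = 1.2      // headroom for KV cache / activations (assumption)
): PlausibilityResult {
  // 1e9 params * bytes / 1e9 bytes-per-GB cancels out, so this is just:
  const requiredGb = paramsBillion * bytesPerParam * overheadFactor;
  const ratio = requiredGb / availableVramGb;
  // Flag when the model clearly cannot fit without heavy CPU offload.
  return { requiredGb, ratio, flagged: ratio > 1.5 };
}
```

A 405B fp16 model against a 32GB GPU comes out far over the threshold and gets flagged, while a 7B fp16 model on a 24GB card passes cleanly.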