beastman82 5 months ago

Vast? Specs?

BoorishBears 5 months ago (reply)
Runpod, 2xA40.
Not sure why you think buying an entire inference server is a necessity to run these models.
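For reference, a minimal sketch of what running one of these models on a rented 2xA40 instance could look like, assuming vLLM as the inference engine (the thread doesn't say which stack they use, and the model name below is a placeholder):

    # Minimal sketch: serve a model across 2 GPUs with vLLM tensor parallelism.
    # Assumes a 2xA40 instance (e.g. rented on Runpod) and `pip install vllm`.
    from vllm import LLM, SamplingParams

    # tensor_parallel_size=2 shards the model weights across both A40s,
    # so models too large for one 48 GB card can still be served.
    llm = LLM(model="MODEL_NAME_HERE", tensor_parallel_size=2)

    params = SamplingParams(temperature=0.7, max_tokens=256)
    outputs = llm.generate(["Hello, how are you?"], params)
    print(outputs[0].outputs[0].text)

Renting by the hour like this is the point being made: you only pay for the GPUs while you're actually running inference, instead of buying a whole server up front.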