beastman82 · 1 year ago
Vast? Specs?

BoorishBears · 1 year ago
Runpod, 2xA40.
Not sure why you think buying an entire inference server is a necessity to run these models.
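For context, a rented 2x A40 pod gives roughly 96 GB of VRAM, which is enough to shard a large model across both cards. A minimal sketch of what that looks like, assuming a vLLM-based setup (the commenter doesn't say which serving stack they use, and the model name below is a placeholder):

```python
# Sketch only: serving a large model across two rented GPUs (e.g. 2x A40,
# 48 GB each) using vLLM tensor parallelism. Model name is a placeholder.
from vllm import LLM, SamplingParams

llm = LLM(
    model="some-org/some-70b-model",  # placeholder; any model that fits in ~96 GB
    tensor_parallel_size=2,           # shard weights across both A40s
    dtype="float16",
)

params = SamplingParams(max_tokens=128, temperature=0.7)
outputs = llm.generate(["Hello, world"], params)
print(outputs[0].outputs[0].text)
```

The point stands either way: a short-term cloud pod with two mid-range datacenter GPUs is enough for inference, without buying dedicated hardware.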