decide1000 · 3 days ago: I use it on a 24 GB GPU, a Tesla P40. Very happy with the result.
hkt · 3 days ago: Out of interest, roughly how many tokens per second do you get on that?
edude03 · 3 days ago: Like 4. Definitely single digit. The P40s are slow af.
coolspot · 3 days ago: The P40 has a memory bandwidth of 346 GB/s, which means it should be able to do around 14+ t/s running a 24 GB model + context.
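A quick sketch of the arithmetic behind coolspot's estimate, assuming decode is purely memory-bandwidth-bound (every generated token streams roughly the full resident weights + KV cache from VRAM once); the figures are the ones quoted in the thread, not measurements:

```python
# Back-of-envelope decode throughput for a bandwidth-bound GPU:
#   tokens/s ≈ memory_bandwidth / bytes_read_per_token
def peak_tokens_per_second(bandwidth_gb_s: float, resident_gb: float) -> float:
    """Theoretical ceiling if inference is limited only by memory bandwidth."""
    return bandwidth_gb_s / resident_gb

# Tesla P40: ~346 GB/s bandwidth, ~24 GB of weights + context resident in VRAM.
print(peak_tokens_per_second(346, 24))  # ~14.4 t/s theoretical upper bound
```

Real-world numbers land well below that ceiling (the thread reports ~4 t/s on a P40), since compute limits, Pascal's weak FP16 throughput, and framework overhead all eat into the bandwidth-only estimate.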