mv4 3 months ago
I have also been running the 32b version on my 24GB RTX 3090.

mv4 3 months ago
If someone wants to run the real thing (R1) locally, someone posted their hardware specs on X. Total cost: $6,000.

[0] direct link (login required): https://x.com/carrigmat/status/1884244369907278106

[1] alt link (no login): https://threadreaderapp.com/thread/1884244369907278106.html

rocho 3 months ago
That's not DeepSeek; it's a Qwen or Llama model distilled from DeepSeek. Not the same thing at all.

testrun 3 months ago
I am doing the same.
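None of the commenters say which runtime they use, but a common way to reproduce the 24GB-GPU setup is to serve the 32b distill through a local runner such as Ollama and query its HTTP API. The sketch below is a minimal example under that assumption; the deepseek-r1:32b tag, the default localhost:11434 endpoint, and the prompt are illustrative choices, not details taken from the thread.

```python
import requests

# Minimal sketch: query a locally served 32b distill via Ollama's REST API.
# Assumes Ollama is running on its default port and the model was pulled
# beforehand (e.g. `ollama pull deepseek-r1:32b`). This tag is the Qwen-based
# distill, not the full DeepSeek-R1 model, which is rocho's point above.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:32b",
        "prompt": "Summarize the difference between DeepSeek-R1 and its distilled variants.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["response"])
```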