Comment by gs17 7 months ago
> 1T total / 32B active MoE model
Is this the largest open-weight model?

adt 7 months ago
No. At 1T MoE on 15.5T tokens, K2 is one of the largest open-source models to date. But BAAI's Tele-FLM is 1T dense on 15.7T tokens: https://huggingface.co/CofeAI/Tele-FLM-1T
You can always check here: https://lifearchitect.ai/models-table/

bigeagle 7 months ago
I believe so. Grok-1 is 314B, DeepSeek-V3 is 671B, and recent open-weight models are around 70B to 300B.
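For a rough sense of scale, here is a back-of-the-envelope sketch of what these parameter counts mean for raw weight storage and per-token active parameters. The bytes-per-parameter figures (bf16 = 2, fp8 = 1) and DeepSeek-V3's 37B active count are assumptions added for illustration, not taken from the thread.

```python
# Back-of-the-envelope comparison of the models named in this thread.
# Parameter counts come from the comments above; precision choices and
# DeepSeek-V3's active count are illustrative assumptions.

models = {
    # name: (total params, active params per token, or None if not given here)
    "Kimi K2 (MoE)":       (1.00e12, 32e9),
    "Tele-FLM-1T (dense)": (1.00e12, 1.00e12),  # dense: every parameter is active
    "DeepSeek-V3 (MoE)":   (671e9,   37e9),     # 37B active: assumed, not from the thread
    "Grok-1 (MoE)":        (314e9,   None),
}

for name, (total, active) in models.items():
    bf16_tb = total * 2 / 1e12   # bf16: 2 bytes per parameter
    fp8_tb  = total * 1 / 1e12   # fp8:  1 byte per parameter
    active_str = f"{active / 1e9:.0f}B" if active is not None else "n/a"
    print(f"{name:22s} weights: ~{bf16_tb:.2f} TB (bf16) / ~{fp8_tb:.2f} TB (fp8), "
          f"active per token: {active_str}")
```

The point of the "1T total / 32B active" framing is that total parameters set the storage/VRAM bill, while active parameters set the per-token compute, so a 1T MoE like K2 is far cheaper to run than a 1T dense model like Tele-FLM-1T despite matching it in size.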