Comment by gs17 (18 hours ago):

> 1T total / 32B active MoE model

Is this the largest open-weight model?
adt (11 hours ago):

No.
At 1T MoE on 15.5T tokens, K2 is one of the largest open-source models to date. But BAAI's Tele-FLM is 1T dense on 15.7T tokens: https://huggingface.co/CofeAI/Tele-FLM-1T
You can always check here: https://lifearchitect.ai/models-table/
bigeagle (18 hours ago):

I believe so.
Grok-1 is 314B, DeepSeek-V3 is 671B, and recent new open-weight models are around 70B–300B.