Comment by varenc
7 hours ago
Are you worried about Google's response to this? Google reportedly reacts to distillation attempts "with real-time proactive defenses that can degrade student model performance". So if they detected you, they could have intentionally fed you a dumber but plausible variant of Gemini: https://cloud.google.com/blog/topics/threat-intelligence/dis...
But also, this model is small and focused just on tool use. In terms of token usage, you're probably nowhere near the people who are trying to distill the entire model.
Well, when it comes to training data, it's like robbing the robbers.
Except one of the robbers is a massive corporation with an even bigger legal team...
It is more like imitating the imitators. There is not much of a legal case here, but poisoning the data is fair game, both for those producing the original data and for those producing its regurgitations.
You could run Gemma models locally to distill them, or any other model with tool use. A minimal sketch of what that looks like, assuming the Hugging Face transformers API and the google/gemma-2-2b-it checkpoint (any locally runnable instruction-tuned model would work the same way), with hypothetical example prompts:
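```python
import json
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical choice of teacher model; swap in any local checkpoint.
MODEL_ID = "google/gemma-2-2b-it"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# Hypothetical tool-use prompts; a real distillation set would be far larger.
prompts = [
    "What is the weather in Paris? Respond with a tool call.",
    "Search the web for the latest Gemma release notes.",
]

records = []
for prompt in prompts:
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    completion = tokenizer.decode(
        outputs[0][inputs.shape[-1]:], skip_special_tokens=True
    )
    # Store (prompt, teacher completion) pairs as student training data.
    records.append({"prompt": prompt, "completion": completion})

with open("distill_data.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")
```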
Yeah, but we wanted Gemini