Comment by swyx 7 months ago

do LoRAs conflict with your distillation?

sangwulee 7 months ago

The architecture is the same, so we found that some LoRAs work out of the box, but some LoRAs don't. In those cases, I would expect people to re-run their LoRA finetuning with the trainer they've used.
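For the LoRAs that do carry over, attaching an existing adapter to the distilled checkpoint is just a matter of loading it onto the new base weights, since the module names match. A minimal sketch using Hugging Face PEFT; the model and adapter IDs are placeholders, not the actual checkpoints discussed here:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the distilled base model (same architecture as the original base).
# "org/distilled-model" is a hypothetical placeholder ID.
base = AutoModelForCausalLM.from_pretrained("org/distilled-model")

# Attach a LoRA adapter that was trained against the original model.
# "org/existing-lora-adapter" is likewise a placeholder.
# Because the architectures and module names match, the adapter weights load
# directly; whether the result behaves well still has to be checked empirically.
model = PeftModel.from_pretrained(base, "org/existing-lora-adapter")
```

If the adapter doesn't behave as expected, the fallback is what sangwulee describes: re-run the LoRA finetuning against the distilled weights with the same trainer used originally.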