QuadmasterXLII 3 days ago
that's just mixture of experts

    mnky9800n 3 days ago
    i thought mixture of experts didn't update itself with new sets of weights and was just a collection of already trained networks/weights? I could be wrong.

        QuadmasterXLII 3 days ago
        Well, that depends on whether you keep training it

            mnky9800n 3 days ago
            perhaps they should always be training and never static. haha. i allegedly grow wiser in my age, why not neural networks?
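The point being debated above — whether a mixture of experts is a frozen collection of pre-trained networks or something that can keep learning — comes down to the fact that the gate and the experts are all ordinary trainable parameters. A minimal sketch (toy NumPy implementation; the dimensions, soft gating, and variable names are illustrative assumptions, not any particular library's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts: a gating network produces a soft weighting over
# expert outputs. Both the gate and the experts are plain weight matrices,
# so nothing in the architecture forbids continuing to train them after
# deployment -- "frozen experts" is a training choice, not a requirement.
n_experts, d_in, d_out = 4, 3, 2
gate_W = rng.normal(size=(d_in, n_experts))          # gating parameters
expert_W = rng.normal(size=(n_experts, d_in, d_out)) # one linear expert each

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x):
    """x: (batch, d_in) -> (batch, d_out), mixing expert outputs by gate weights."""
    gates = softmax(x @ gate_W)                         # (batch, n_experts)
    expert_out = np.einsum('bi,eio->beo', x, expert_W)  # (batch, n_experts, d_out)
    return np.einsum('be,beo->bo', gates, expert_out)   # weighted mixture

x = rng.normal(size=(5, d_in))
y = moe_forward(x)
print(y.shape)  # (5, 2)
```

Whether this ever updates with "new sets of weights" is entirely up to the training loop: keep computing gradients through `gate_W` and `expert_W` and it learns continually; stop, and it is exactly the static collection of trained networks described above.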