QuadmasterXLII 10 months ago
that's just mixture of experts
mnky9800n 10 months ago
I thought mixture of experts didn't update itself with new sets of weights and was just a collection of already-trained networks/weights? I could be wrong.

QuadmasterXLII 10 months ago
Well, that depends on whether you keep training it.

mnky9800n 10 months ago
Perhaps they should always be training and never static. Haha. I allegedly grow wiser with age, why not neural networks?
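The distinction the thread is circling can be sketched in a few lines. Below is a minimal, illustrative mixture-of-experts layer (all names and sizes are made up for the example): a gating network produces a softmax distribution over experts, and the output is the gate-weighted blend of the expert outputs. Whether the experts are "already trained and static" or "always training" is not a property of the architecture at all; it is just a choice of whether you keep taking gradient steps on the expert weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer (illustrative sizes, not from any real model).
n_experts, d_in, d_out = 3, 4, 2
experts = [rng.normal(size=(d_in, d_out)) for _ in range(n_experts)]  # expert weights
gate = rng.normal(size=(d_in, n_experts))                             # gating weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def moe_forward(x):
    # Gating network: a probability distribution over experts for this input.
    weights = softmax(x @ gate)
    # Each expert processes the same input independently.
    outputs = np.stack([x @ W for W in experts])   # shape (n_experts, d_out)
    # Output is the gate-weighted mixture of expert outputs.
    return weights @ outputs                       # shape (d_out,)

x = rng.normal(size=d_in)
y = moe_forward(x)

# "Keep training it" just means continuing to update `experts` (and `gate`)
# with gradient steps; freezing them gives the static collection of
# pre-trained networks described in the first reply.
```

In frameworks like PyTorch this same choice shows up as whether the expert parameters have gradients enabled during later training, rather than anything structural about the mixture itself.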