Depending on how the model was pre-trained, you can fine-tune older local models (even locally, if you have a decent GPU) to steer their behavior toward what you want.
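For illustration, here's a minimal sketch of a LoRA-style adapter, the technique most commonly used for fine-tuning local models on consumer GPUs: freeze the pretrained weights and train only a small low-rank update. The class name and hyperparameters here are just placeholders, not from any particular library.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a frozen nn.Linear with a trainable low-rank update (LoRA)."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # freeze the pretrained weight and bias
        # Low-rank factors: the effective weight update is B @ A.
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen base output plus the scaled low-rank correction.
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale

layer = LoRALinear(nn.Linear(512, 512))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable params: {trainable} of {total}")  # only the LoRA factors train
```

Because only the two small factor matrices get gradients, the optimizer state and memory footprint are a tiny fraction of full fine-tuning, which is what makes this feasible on a single consumer GPU.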
Exactly as useful as they are today. Sure, it might not hold a candle to a model trained in 10 years, but it'll still be exactly as useful then as it is today, and run a lot faster too.
They can still be very useful because new models are reaching an asymptote in performance. Meanwhile, as hardware gets cheaper (current RAM prices notwithstanding), these models will only become faster to run locally.
https://www.youtube.com/watch?v=yGkJj_4bjpE