Date: November 8th, 2025 5:58 PM Author: chopped chud unc
In my experience it isn't good enough yet. Yeah, I have a pretty good setup, nothing crazy, and it's still just too slow.
I do think local will be the way to go in the long run, but I don't think the models are distilled enough yet while still maintaining their capabilities.
Date: November 8th, 2025 6:03 PM Author: metaepistemology is trans
I have H100 quota right now but only like 4K to spend on GPU hours. OSS seems like the only thing worth training for local, but it's limited and still has safety constraints built into it.