Date: December 5th, 2025 7:25 PM Author: Jared Baumeister is a cow who welches on bets
I doubted it for a long time because it's a Samsung product. However, if you ONLY want to run local LLMs it's actually 180. It runs Gemma 27b lightning fast. Sure, it will drink 350W while it's doing it, but it drops right down to 5W at idle. It's really quite efficient in that sense, because you're hardly ever hitting it. If all you want is something to host Ollama, it's perfect. I know people are using AMD but it's a nightmare, believe me. Having Nvidia means everything just works. These are like $800 on eBay last I checked. 24GB of VRAM. There's no point in getting the Ti if you're just using it for AI.
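For anyone who hasn't done the Ollama setup before, it really is this short. Sketch only: the install one-liner is Ollama's official script, but the model tag `gemma3:27b` is my assumption for "Gemma 27b", so check `ollama list` or the Ollama model library for the exact tag on your install.

```shell
# Install Ollama (official install script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the 27B Gemma model -- tag "gemma3:27b" is an assumption,
# verify the exact name in the Ollama model library first.
# The quantized weights fit comfortably in 24GB of VRAM.
ollama pull gemma3:27b

# Chat with it; the GPU spikes toward its power limit during
# generation, then drops back to idle between prompts.
ollama run gemma3:27b "Say hello in one sentence."
```

Once it's running, Ollama also exposes a local HTTP API on port 11434 that most client tools can point at, which is why "something to host Ollama" is the whole job description for this card.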
Date: December 5th, 2025 7:28 PM Author: Jared Baumeister is a cow who welches on bets
Oh, another 180 aspect of this card is that the non-Ti variant uses the good old-fashioned power connectors. If you get the 3090 Ti you're forever stuck with the Nvidia proprietary connector, and it's first-gen. AVOID