should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: Dull private investor
no lol, there is still a huge disparity in the speed/quality of frontier models vs. what you can actually self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool
self-hosted llms are great for niche purposes (e.g. they can basically fix how awful OS/file search is), but the models for those jobs are so small you won't need much hardware to run them (rough sketch below).
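To make the "local search" point concrete, here is a minimal sketch of semantic file search backed by a small self-hosted embedding model. It assumes the sentence-transformers library; the model name, file extensions, and directory are illustrative placeholders, not a specific recommendation, and a tiny CPU-friendly model is more than enough for this kind of job.

```python
# Minimal sketch: local semantic file search with a small embedding model.
# Assumes `pip install sentence-transformers`; all paths/names below are
# placeholders chosen for illustration.
from pathlib import Path

from sentence_transformers import SentenceTransformer, util

# A small model like all-MiniLM-L6-v2 (~80 MB) runs fine on CPU;
# no $50k GPU rig needed for this niche use case.
model = SentenceTransformer("all-MiniLM-L6-v2")


def index_files(root: str, exts=(".txt", ".md")):
    """Embed the (truncated) contents of text files under `root`."""
    paths, texts = [], []
    for p in Path(root).expanduser().rglob("*"):
        if p.is_file() and p.suffix in exts:
            paths.append(p)
            texts.append(p.read_text(errors="ignore")[:2000])
    return paths, model.encode(texts, convert_to_tensor=True)


def search(query: str, paths, embeddings, top_k: int = 5):
    """Return the files whose contents best match the query."""
    q = model.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(q, embeddings, top_k=top_k)[0]
    return [(paths[h["corpus_id"]], h["score"]) for h in hits]


if __name__ == "__main__":
    paths, emb = index_files("~/notes")  # hypothetical directory
    for path, score in search("tax documents from 2023", paths, emb):
        print(f"{score:.2f}  {path}")
```

Point being: the whole thing runs on a laptop CPU, which is the poster's argument that the useful local workloads don't justify a $50k build.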
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2#49516844)