should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: Flushed parlour
no lol. there's still a huge disparity in speed/quality between frontier models and anything you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool
self-hosted llms are great for niche purposes (like basically solving the problem of Searching your OS being Awful), but the models that handle that are small enough that you won't need much hardware to run them.
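For illustration only (none of this is from the post): a minimal sketch of the "fix your OS search" use case, using a small local embedding model on CPU. The library (sentence-transformers), the ~80 MB all-MiniLM-L6-v2 model, and the *.txt glob are all assumptions picked to show that the niche use needs nothing like a $50k rig.

```python
# Hypothetical sketch: semantic file search with a tiny local model, CPU only.
from pathlib import Path

from sentence_transformers import SentenceTransformer, util

# Small (~80 MB) embedding model; runs fine on an ordinary laptop.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Index file paths as embeddings (a real tool would embed file contents too).
paths = [str(p) for p in Path.home().rglob("*.txt")]
corpus = model.encode(paths, convert_to_tensor=True)

def search(query: str, top_k: int = 5):
    """Return the top_k indexed paths most semantically similar to the query."""
    q = model.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(q, corpus, top_k=top_k)[0]
    return [(paths[h["corpus_id"]], h["score"]) for h in hits]

print(search("tax documents from last year"))
```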
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2most#49516844)