should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: Bull Headed Dysfunction Ticket Booth
no lol, there's still a huge disparity in speed/quality between frontier models and what you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool
self-hosted llms are great for niche purposes (like they can basically fix the problem of OS search being awful), but models like that are going to be so small you won't need much hardware to run them.
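the "fix local search" use case above can be sketched without any fancy hardware at all. a minimal toy sketch, where a bag-of-words ranker stands in for a small local embedding model (the filenames, snippets, and the `embed`/`search` helpers are all made up for illustration — a real setup would swap `embed` for a quantized sentence-embedding model running locally):

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a small local embedding model:
    # a sparse bag-of-words vector of lowercase tokens.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical local "index": filename -> contents snippet.
index = {
    "taxes_2024.txt": "federal tax return w2 deductions refund",
    "trip_notes.md": "flight hotel itinerary packing list",
    "llm_build.md": "gpu vram quantized model inference server",
}

def search(query, k=2):
    # Rank indexed files by similarity to the query text.
    q = embed(query)
    ranked = sorted(index, key=lambda f: cosine(q, embed(index[f])),
                    reverse=True)
    return ranked[:k]
```

the point being: the model doing this kind of semantic-ish matching can be tiny, which is why it doesn't justify a 50k rig.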
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2/#49516844)