should i dump like 50k into a local LLM setup
Date: December 17th, 2025 1:33 PM Author: mischievous church
no lol, there's still a huge disparity in speed/quality between frontier models and what you can self-host. it's just the new thing for nerds to blow massive amounts of $$$ on to feel cool

self-hosted llms are great for niche purposes (like basically solving the problem of search on your OS being awful), but the models for those are going to be so small you won't need much hardware to run them.
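a sketch of that "search your own files" use case, just to make it concrete. this is a toy: the bag-of-words embed() below is a hypothetical stand-in for a small local embedding model (e.g. something served via llama.cpp), and the file names/contents are made up.

```python
# Toy semantic-ish file search: embed query and docs, rank by cosine
# similarity. A real self-hosted setup would replace embed() with a
# small local embedding model; everything here is stdlib and illustrative.
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Stand-in for a local embedding model: bag-of-words token counts.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def search(query: str, docs: dict[str, str]) -> list[tuple[str, float]]:
    # Score every doc against the query, best match first.
    q = embed(query)
    scored = [(name, cosine(q, embed(body))) for name, body in docs.items()]
    return sorted(scored, key=lambda x: x[1], reverse=True)


# Hypothetical local files to search over.
files = {
    "taxes_2024.txt": "federal tax return w2 deductions refund",
    "recipes.txt": "pasta garlic olive oil parmesan",
    "notes.txt": "meeting notes project deadline budget",
}
top = search("tax refund", files)[0][0]  # taxes_2024.txt
```

the point is just that the ranking step is cheap; the only part that needs any real compute is the embedding model, and a small one is plenty for this.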
(http://www.autoadmit.com/thread.php?thread_id=5811364&forum_id=2/en-en#49516844)