asasidh
One of the most frequent questions when running LLMs locally is: "I have xx RAM and a yy GPU. Can I run the zz LLM model?" I have vibe coded a simple application to help you answer just that.
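The post doesn't show the app's exact formula, but a common back-of-the-envelope check is: weight memory is parameter count times bytes per weight (set by the quantization level), plus some headroom for the KV cache and activations. A minimal sketch of that estimate, with an assumed ~20% overhead factor and hypothetical function names:

```python
def estimated_model_memory_gb(num_params_billion: float,
                              bits_per_weight: int = 4,
                              overhead: float = 1.2) -> float:
    """Rough footprint: weights at the given quantization, plus ~20%
    headroom for KV cache and activations (an assumed rule of thumb)."""
    weight_bytes = num_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1024**3

def can_run(num_params_billion: float, available_gb: float,
            bits_per_weight: int = 4) -> bool:
    """Does the estimated footprint fit in available RAM + VRAM?"""
    return estimated_model_memory_gb(num_params_billion,
                                     bits_per_weight) <= available_gb

# Example: a 7B model quantized to 4 bits needs roughly 4 GB,
# so it fits comfortably on a 16 GB machine.
print(can_run(7, 16))        # 7B @ 4-bit on 16 GB
print(can_run(70, 16, 16))   # 70B @ fp16 on 16 GB
```

Real-world needs vary with context length and runtime, so treat this as a first-pass filter rather than a guarantee.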
Update: Lots of great feedback on how to improve the app. Thank you all.
Comments URL: Show HN: Can I run this LLM? (locally) | Hacker News
Points: 37
# Comments: 40