Hacker News
CoolGuySteve · 46 days ago | on: Right-sizes LLM models to your system's RAM, CPU, ...
That’s like 4 or 5 fields to fill in on a form. Way less intrusive than installing this thing.
amelius · 46 days ago
It can become complicated when you run it inside a container.
bilekas · 46 days ago
Why would it need to be a container?
riddley · 46 days ago
My ollama and GPU are in k8s.
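For context, a setup like the one described above is usually a Kubernetes Deployment that requests a GPU through the NVIDIA device plugin. A minimal sketch, assuming the stock `ollama/ollama` image and its default API port; names and resource values here are illustrative, not taken from the commenter's cluster:

```yaml
# Hypothetical minimal Deployment for running ollama with a GPU in Kubernetes.
# Assumes the NVIDIA device plugin is installed so nvidia.com/gpu is schedulable.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ollama
  template:
    metadata:
      labels:
        app: ollama
    spec:
      containers:
        - name: ollama
          image: ollama/ollama
          ports:
            - containerPort: 11434  # ollama's default API port
          resources:
            limits:
              nvidia.com/gpu: 1  # one GPU via the NVIDIA device plugin
```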
amelius · 46 days ago
Are you asking why people run things in a container?
bilekas · 46 days ago
No, I'm asking why a website where someone fills in a few fields and gets the optimized LLM recommendation would need to run in a container. It's a webform.