Hacker Newsnew | past | comments | ask | show | jobs | submit | baron-bourbon's commentslogin

Does this provide an Ollama-compatible API endpoint? I've got at least one other project running that only supports Ollama's API or OpenAI's hosted solution (i.e., the API endpoint isn't configurable to use llama.cpp and friends).


We need to stop chasing compatible API endpoints and work towards an AI standard. I wrote about it here: https://news.ycombinator.com/item?id=42887610


I agree with what you wrote. The whole situation reminds me of the old "Standards" XKCD, to an extent. In the short term, something like LiteLLM, which I just discovered while doing more research on the topic, can at least hide some of the underlying complexity.
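To illustrate the kind of complexity a shim like LiteLLM hides: the request bodies for Ollama's /api/chat and OpenAI's /v1/chat/completions are similar, but the response envelopes differ, so a translation layer has to re-wrap one into the other. Here's a minimal stdlib-only sketch of that re-wrapping; the field names are my reading of the two public API shapes, not LiteLLM's actual internals:

```python
def ollama_to_openai(ollama_resp: dict) -> dict:
    """Re-wrap an Ollama /api/chat response into an OpenAI-style
    chat.completion envelope (sketch; field names are assumptions
    based on the public docs, not LiteLLM's real implementation)."""
    return {
        "object": "chat.completion",
        "model": ollama_resp.get("model"),
        "choices": [
            {
                "index": 0,
                # Ollama puts the assistant message at the top level;
                # OpenAI nests it inside a "choices" array.
                "message": ollama_resp.get("message", {}),
                "finish_reason": "stop" if ollama_resp.get("done") else None,
            }
        ],
    }

resp = ollama_to_openai(
    {"model": "llama3", "message": {"role": "assistant", "content": "hi"}, "done": True}
)
print(resp["choices"][0]["message"]["content"])  # -> hi
```

Multiply that by streaming formats, error shapes, and model naming, and it's easy to see why every client hard-codes one vendor's API instead of supporting them all.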

That being said, considering what you've done with Open Home and Home Assistant (which has run my home for years, thank you!), perhaps there is some hope of an open standard in the near future.

