Does this provide an Ollama-compatible API endpoint? I've got at least one other project running that only supports Ollama's API or OpenAI's hosted solution (i.e., the API endpoint isn't configurable to point at llama.cpp and friends).
I agree with what you wrote. The whole situation reminds me of the old "Standards" XKCD, to an extent. In the short term, something like LiteLLM, which I just discovered while doing more research on the topic, can at least hide some of the underlying complexity.
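For what it's worth, here is a minimal sketch of how LiteLLM papers over the backend differences: the same completion() call can target a local Ollama server or OpenAI's hosted API, so the calling project never has to care which one it is. The model names and local URL below are just illustrative assumptions.

```python
from litellm import completion

messages = [{"role": "user", "content": "Turn off the living room lights."}]

# Local Ollama server (default port 11434); "ollama/llama3" is an example model name.
local_reply = completion(
    model="ollama/llama3",
    messages=messages,
    api_base="http://localhost:11434",
)

# Hosted OpenAI model, same call shape -- only the model string changes.
# (Requires OPENAI_API_KEY in the environment.)
hosted_reply = completion(
    model="gpt-4o-mini",
    messages=messages,
)

# LiteLLM normalizes both responses to the OpenAI-style shape.
print(local_reply.choices[0].message.content)
```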
That being said, considering what you've done with Open Home and Home Assistant (which has run my home for years, thank you!), perhaps there is some hope of an open standard in the near future.