Any chance of R1 allowing a connection to our own personally/privately hosted LLM? I know we have ways of running them (Jan and others); it would be cool if we could connect to our own, especially if they're trained on our info. That blows the idea of "the phone knowing about you" out of the water if we have our own personal/private trained LLM it can reach out to.
In that case, would you essentially just want R1 to handle STT + TTS for you? Or is the thought that it's another option (sort of like how you can ask wolfram/perplexity now)?
I'd think of it as another option, like "Ask about…", and it would send over the request and pull back whatever it gets. I've briefly run them on my end but haven't done any outside calls. There's a thread about self-hosted apps where someone mentioned they have Jan-AI running on theirs, and I thought it would be cool to be able to call that. Even cooler if, on the Jan-AI side, we had personally trained models we could ask and get personal responses from.
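The call itself should be pretty simple, since Jan exposes an OpenAI-compatible local server. Here's a minimal sketch of what that "Ask my LLM" request could look like; the URL, port, and model name are assumptions, so check your own Jan settings:

```python
# Minimal sketch of an "Ask my LLM" call against a self-hosted Jan instance.
# The endpoint and model name are placeholders -- adjust to your Jan setup.
import requests

JAN_URL = "http://localhost:1337/v1/chat/completions"  # assumed Jan local server

def ask_my_llm(question: str) -> str:
    # OpenAI-compatible chat request, which is what Jan's local server exposes
    payload = {
        "model": "my-personal-model",  # placeholder: whatever model is loaded in Jan
        "messages": [{"role": "user", "content": question}],
    }
    resp = requests.post(JAN_URL, json=payload, timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_my_llm("What's on my calendar tomorrow?"))
```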
Personally, the idea of the R1 handling TTS/STT and (optionally) offloading almost everything else to my own LLM would be amazing.
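Just to illustrate the loop I mean, here's a rough sketch running on a regular computer rather than the R1 (we don't know what hooks rabbit would expose). The STT/TTS libraries and the Jan endpoint below are all assumptions for the sketch, not anything official:

```python
# Rough illustration of "device does STT/TTS, my own LLM does the thinking".
# speech_recognition, pyttsx3, and the Jan endpoint/model are assumptions.
import requests
import speech_recognition as sr
import pyttsx3

JAN_URL = "http://localhost:1337/v1/chat/completions"  # assumed Jan local server

def listen() -> str:
    # Speech-to-text: capture one utterance from the default microphone
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)  # any STT backend would do here

def ask_llm(text: str) -> str:
    # Same OpenAI-compatible Jan call as the earlier sketch
    payload = {
        "model": "my-personal-model",  # placeholder
        "messages": [{"role": "user", "content": text}],
    }
    resp = requests.post(JAN_URL, json=payload, timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def speak(text: str) -> None:
    # Text-to-speech: read the model's reply out loud
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    speak(ask_llm(listen()))
```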