I really like using my rabbit R1, and I understand that the rabbit team wants to keep use of rabbit focused on the R1, but especially when I am working, typing from the computer, using copy-paste, and attaching screenshots comes in very handy. Can you please make the LLM part accessible within the Rabbit Hole by typing, just like in Perplexity? I am currently using other tools for that, but I want to use only Rabbit and keep every thread I make in a single place.
Thanks!
::personal thoughts::
It’s funny you mention that… I’m helping my wife with a project of hers right now, and she made a similar comment about it: “it should work just like ChatGPT”.
The problem is, OpenAI and Perplexity invested a lot of effort into delivering an AWESOME web-first experience. It’s really hard, and that’s why you won’t see very many clones of them succeed at scale.
That said, it’s something you can work towards over time by prioritizing the specific features that people want. I think the same thing applies to rabbit.
::work thoughts::
This is definitely a cool idea, and something I’ve heard discussed, but haven’t really poked my head into. You already have all this stuff in the journal… adding that context to a traditional LLM chat would be super powerful. If I hear more about it, I’ll make sure they know the community is interested.
Thanks for the reply and the good work your team is doing. It’s great to know I am not the only one. I especially have problems when I need to ask Rabbit something specific, like a term or someone’s name. As I am not a native English speaker, it doesn’t get my pronunciation, and then I need to reach for Rabbit and type “no, that was not what I meant.” Every time that happens, I say I wish I could use this from the web.
I know that Rabbit’s goal was never to create a web-first experience, and my suggestion doesn’t need to be one either; it could be an additional experience alongside the device-first experience.
Interesting… Hi, I’m Diego from Argentina, new here.
I feel you. Having the ability to run your own local LLM would be nice, but IDK if the architecture of the OS is intended for this. I understand this is a cloud-based service where all the hard work is done outside the device.
IDK if it’s possible, and it would probably require tons of work, but I do feel it would be a selling point. Not in a way that replaces rabbit, but in a way where Rabbit can access different models from the device.
I think this could definitely be a super plus for the community and the new owners; might be worth looking at in the future…
Looking forward to seeing what others think.
Regards!
Hello Diego, welcome!
I feel like that won’t be much of an issue, as the LLM can work with typing on the device and LAM works with both voice and written prompts, but of course you may be right.
I think there is already a good starting point here…
In the case of a self-hosted LLM, I would just add a simple chat interface on the self-hosted side and expose it to the internet behind a login page.
Then add the login to the cookie jar.
Then create a teach mode lesson that uses that LLM, very similar to the DeepSeek R1 video I posted.
I mean, if this is possible it would add tons of value to the device. It would be a super selling point. IDK how that magic could be done, but it could be a paradigm shift: right now we run local LLMs on laptops, but this would be as portable as having a Rabbit R1 with the local LLM of your choice.
I’m thinking of reusing a login system from communiverse.xyz to access a site where the R1 can communicate with the ChatGPT APIs, or Claude, or DeepSeek. But you know that has a cost. The advantage of a local LLM is the same kind of value I found in my Rabbit R1, which is now my personal translator for free.
But later, with the addition of a self-hosted / local model, the value would be the same, right? You wouldn’t need to pay for ChatGPT, because you’d already have, not ChatGPT, but the LAM model.
What I do think is that, being a daily ChatGPT user, since it doesn’t have context about who I am, it gives me only general answers.
I think LLMs and local LLMs are great, but what is even greater is the ability for the model to have context data about me. Wouldn’t it be great if we started thinking in terms of a personalized AI device? There’s a trade-off here, of course: if it’s too personalized, people can develop feelings for it, but if not, the device lacks customization.
I feel rabbit can do so much more than what it is intended to do, but that needs to go hand in hand with these new ideas, the ones that would really add value to future AI.
Hopefully we’ll see a feature like the one you described in the next few years.
Thanks for sharing some insight into how this could become a reality.
Have a great week!