🧠 - LEARN MODE, engage!

So I’ve been trying to figure out how to interact with the r1 and to “teach” it things to test the LEARN MODE capability. Like most of you, I’m still finding its various uses, but I did have a “teaching moment” the other day…

Here is what happened:
I asked the r1 to “tell me a riddle”, which it did. Then I asked it to “tell me another riddle”, and it told me another riddle. Here is where things got a bit weird…
When I asked it to “tell me another riddle” a 3rd time, the r1 repeated the 1st riddle it had told me minutes ago. I proceeded to tell it, “You already told me that riddle. Please tell me another riddle.” It then told me a new riddle. After that, I again asked it to “tell me another riddle”, and it told me another new riddle.
Once again, I asked it to “tell me another riddle”. THIS time, it repeated the same 2nd riddle it had told me a few minutes prior. Again, I told it, “You already told me that riddle. Please tell me a new riddle that you have not told me before.” It then told me a new riddle. Then it swapped back to the 1st riddle again…
This pattern continued off & on for several more attempts to get it to tell me ALL NEW RIDDLES each time I asked, but to no avail. It kept wanting to cycle back to the 1st few riddles. I then asked the r1, “how many riddles do you know?”, to which it answered, “I have access to millions of riddles. If you’d like to hear one, let me know.”
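For the curious, the behavior I was hoping for boils down to the assistant keeping a per-user “seen” set of riddles and only picking from the ones not in it. A minimal Python sketch (the riddle list and function name are made up for illustration, not anything Rabbit actually runs):

```python
import random

# Hypothetical stand-ins for the "millions of riddles" it claims to know
RIDDLES = [
    "What has keys but can't open locks? A piano.",
    "What has hands but can't clap? A clock.",
    "What gets wetter as it dries? A towel.",
]

def tell_riddle(seen):
    """Return a riddle not yet in `seen`, and record it there."""
    fresh = [r for r in RIDDLES if r not in seen]
    if not fresh:
        return "I'm out of new riddles!"
    riddle = random.choice(fresh)
    seen.add(riddle)
    return riddle

seen = set()
# Three asks, three distinct riddles -- no cycling back to the 1st one
replies = [tell_riddle(seen) for _ in range(3)]
assert len(set(replies)) == 3
```

The “at least another 6 months” part would just mean stamping each entry with a date and expiring it; without persistent per-user state like this, the model has nothing to check against between sessions.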

Now, before I continued asking it for another riddle and just went down that rabbit hole again (*boom), I wanted to try something different, so I gave the r1 the following command:

“I would like you to remember any time you tell me a riddle, and to not tell me that same riddle for at least another 6 months. Alright?”
…to which it responded…
“Understood. I’ll keep track of the riddles I share with you to ensure a variety in the future. If you have any other preferences or requests, feel free to let me know.”

🤯😮😱 Needless to say, I was surprised/shocked/in awe. Yes, this was a very basic thing, but still… I’ve never been able to “teach” a command to my cell phone just by talking to it, because if it’s not in the programming already, you can’t change the setting anywhere in the phone.

I am hopeful for this product and look forward to whatever other “Learning/Teaching” capabilities will present themselves. What experiences have y’all had with the LEARN MODE so far?


Great work!

Yes! I saw this brought up in the Discord dealing with weather and Celsius vs Fahrenheit. I think that, if at first the rabbit doesn’t succeed, getting creative with the prompts is key here. It’s so neat to think that it can learn new behaviors which persist. Great post


I’ve been wondering this. When we prompt the R1, does it act as a single thread like in ChatGPT, or is each new prompt a different thread? Is there a way to pre-prompt? Or have it access the Rabbithole for memory?

Thanks!! 😊

I haven’t messed with ChatGPT, but I know with this one, you can hold long conversations/arguments and reference your previous comments from that same conversation, and it will respond as if it’s all one long conversation. Now, I’m not sure if it will hold over a conversation from a previous day, or from hours before, but I’m sure that will get figured out pretty quick. Ha!


Kinda like what I just mentioned in this post, I’m going to start trying to recognize “behaviors/responses” that I don’t like, or that I want to change (the issue).

Then when I go to make a “corrective/instructional” prompt to the r1 to address “the issue”, I’m going to make sure I give it a definite “timeframe” for the correction to last, and also a positive/favorable “solution” to the issue for it to enact instead in the future.

We’ll see how it goes! Ha!


Amazing!! Prompt Engineering/Manipulation is Key.


I had kinda the same experience and can tell (from mine) that it won’t do what it said. I got it a couple of times to confirm things I told it about how to behave/answer specific things. Then 10 minutes later it was like nothing ever happened.


@Not_Yours I’ve used your same statement to avoid the r1 repeating riddles, but it took the command as a prohibition against telling riddles 😕
So I used a different command: “Please remember the riddles you tell me and do not repeat them”. It understood the command, explicitly responding that “I will keep track of the riddles so as not to repeat them again”. But… it repeated two riddles again in one session, at a distance of 2–3 rounds.

It seems there’s still a lot of work to be done for the model behind the Rabbit r1 to handle requests about its behaviour.

Anyway, you’ve pointed out an interesting potential aspect of our little rabbit: it can be instructed to behave as we wish.


Sorry for being picky, but it cannot (for now). It tricks you into thinking it can by agreeing, confirming it can, and telling you it will. But then it forgets and goes back to bad habits. Right now you cannot prime your R1 like you would prime a GPT.

Rabbit should do it the exact same way OpenAI does: in settings,

CUSTOMIZE R1
Custom Instructions
Do you have any information to provide to R1 to help it better respond to you?

This part is free text where you could say, “My name is XXX, I was born in XXXX, my sons are XXX and born in XXXX, I only use the metric system / Celsius degrees / euros, my dog is an Australian Shepherd, etc.” Anything leading to your best use, so it has a user-centric way to think or search when you talk with it.
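Mechanically, a custom-instructions field like that just gets prepended to every request as a system message, so the preference travels with each call even though the model itself is stateless. A rough sketch of how a client could assemble the messages (illustrative only; this is not Rabbit’s or OpenAI’s actual code):

```python
# Free-text preferences, saved once in settings (illustrative content)
CUSTOM_INSTRUCTIONS = "I only use the metric system, Celsius degrees, and euros."

def build_messages(user_prompt, history=None):
    """Prepend the saved custom instructions as a system message,
    then replay any conversation history, then the new prompt."""
    messages = [{"role": "system", "content": CUSTOM_INSTRUCTIONS}]
    messages.extend(history or [])
    messages.append({"role": "user", "content": user_prompt})
    return messages

msgs = build_messages("What's the temperature outside?")
assert msgs[0]["role"] == "system"  # preferences ride along every time
```

Because the instructions are re-sent with every request, nothing has to be “remembered” by the model at all, which is exactly why this approach works where spoken one-off commands don’t.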


Yes, I was imprecise in my statement; I should have used the conditional: it could be instructed. Mine was more a hope than a current possibility.


And did you test it after that? The same thing happens when asking for jokes; it repeats the same ones even when you ask it not to repeat what it said 2 minutes ago.
In one case with Celsius, it also told me that it would keep it in mind, but after a simple 5x tap, or if your rabbit restarts or shuts down, it can’t recall anything anymore.

I don’t think there are learning or memorizing capabilities if it can’t even recall to use °C instead of °F. I’m sure they’re gonna fix it, but I’m curious if it will be by memory or just by settings…

I tested this prompt you say worked for you, but it doesn’t on this end; have a look yourself. It also didn’t take “riddle” correctly (it replaced it with “reader”).




Same here!


It cannot, because the r1 is not an AI (for now).

I did this: “my name is XXX, i was born in XXXX, my sons are XXX and born in XXX, i only use metric system / celsius degrees / euros, … etc”

The only thing it could recall was my name 🙂


Haha, I deleted the entries in my journal, but I remember precisely talking about my sons, like: “I want you to remember my 2 sons are Louis and Arthur”, with the r1 replying “Got it, I will remember your sons are named Louis and Arthur”. Literally one second later I tell it “Tell me my sons’ names” and the rabbit says “Your sons are Alex and Max”.

[Come On Wtf GIF]


Same here. I tried asking it to take a note of one composite piece of information (my housemates’ names), then asked for both of them, and it would accurately recall them 9 times out of 10. I’m unsure as to what causes an error sometimes; is it skipping over the complete journal entry? I will try to reduce the length of the note and only include one name in it, or a single location to eventually recall. It might help the R1 find the right information better and more consistently.


Same here. For example, I have mine save several notes that I update: removing items, adding items, clearing all entries. It always confirms the modifications accordingly, and if I check right after, it is usually correct, but when I come back to it some time later, I realise that isn’t the case.


I had a similar experience where I asked the R1 to remember my preference. It confirmed it did. When I asked for the exact same task again, it had already forgotten that it had promised this, but it did deliver it along with my preferences.

So I guess there is still a way to go for it to really learn preferences, remember what it said in the past, and not come back with the same repetitive content.


UPDATE…

Soooooo, yeah… it totally forgot the original “riddles” command that I taught it. 🤷🏻‍♂️
