I disagree insofar as my ChatGPT has no problem remembering that preference of mine (metric system), and I don't have to prompt it for it each time. You can customise GPTs with custom instructions, and it should basically be the same for the R1 - even more so if it is advertised as an assistant of sorts.
Can you elaborate on your playground issues? Do you run the playground on the computer or on the R1? I usually use the computer, but it still seems to connect to the R1. It works OK for me, though - I mean, yeah, it glitches and gets stuck sometimes, but generally it does something.
I’m using it mostly with the R1, otherwise it makes no sense…
Finally. Today at 0900 I set a reminder for 1000 local time, and at 1000 the alarm sounded, the reminder appeared on the screen, and it also showed me the reminder when I asked to see it again. I need one for the 23rd and am going to set it. Let’s see the result on the 23rd. Thanks.
I set 2 reminders for the 23rd of October. When I ask it to show me all reminders, it only shows the last one. I will wait until the 23rd to see if both sound, at 0900 and 1000.
Is it just me, or does the “shake to return home” function no longer work in the latest SW update? Whenever I navigate to Settings, I have to scroll all the way back to the top to return home. Previously, a quick shake would conveniently get me straight home, and I found that quite a handy feature, especially if future SW versions introduce more menu items and more places to explore.
Maybe we need to report a bug here, because I’ve noticed the same thing - fixing one thing sometimes seems to break another.
We’ve identified a few bugs with reminders, and we’re testing the fixes for them now. They’ll almost certainly be hitting your devices in next week’s OTA.
Are you planning to add ScreenReader Support in the next OTA update?
I hope we get feedback sounds to improve general accessibility.
What do you want to have read out loud? Menu items on the R1?
I expect a complete screen reader that:
- Reads all the graphics, menus, and interfaces…
- Allows me to type with the keyboard in Terminal Mode.
- Reads every alert from the R1: alarms, timers, reminders, battery…
I expect a screen reader that works like Google TalkBack on Android or VoiceOver on iOS, but adapted and simplified for the R1.
That’s a lot. I’m all for equal access and understand where you’re coming from, but I can’t imagine Rabbit giving all of this priority with their small team. (Maybe I’m wrong.)
If you had to pick the one thing that would be most impactful and effective, and maybe even easy to implement, what would it be?
At the very least, reading out the menu items as I scroll through them with the scroll wheel. Since it doesn’t involve touch gestures, it should be easy to integrate.
Same here. I even thought the shake feature had been intentionally removed…
Thanks for your feedback. I agree this would be good to have as an accessibility setting.
For your case, we probably need haptic feedback when scrolling through the menus with the wheel, and for accessibility’s sake, it should read the menu item out loud whenever we touch the screen (or stop at the place where the marker is).
Note that I am completely blind, so in my case, if I have to control the menus with the scroll wheel, I need it to read each element when it becomes highlighted in orange.
Where is it? My R1 hasn’t updated, and no release notes have been published.