Generative UI and ideas around it.

I remember generative UI being mentioned in one of the Rabbit keynotes, and I am very interested in the concept.

Are there any ideas or plans for implementing this in the foreseeable future?

For example, responses with multiple answers could be shown in a list that expands per item. (Rabbit could then know which item is currently open when follow-up queries are made.)

This would of course also mean that what is on screen is not read out verbatim.
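
Just to make that idea a bit more concrete, here is a rough sketch in TypeScript of what such a payload and the follow-up context could look like. This is purely an illustration; none of these names come from Rabbit's actual API.

```ts
// Hypothetical shape of a generative UI payload for a multi-answer response.
// All names are invented for illustration, not taken from Rabbit's real API.
interface ExpandableListResponse {
  kind: "expandable-list";
  spokenSummary: string;   // short text for TTS, not the full on-screen content
  items: {
    id: string;
    title: string;         // always visible
    details: string;       // revealed when the item is expanded
  }[];
}

// Follow-up queries could carry the currently expanded item as context,
// so "tell me more about that one" resolves to the open entry.
interface FollowUpContext {
  openItemId: string | null;
}
```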


The forum isn’t really active…

Another example of a generative UI would be showing a map with addresses when Rabbit gives results after "looking for places".

The scroll wheel could move between the spots on the map so you can ask follow-up questions about the highlighted spot.

Highlighting things in general could be a great way to elaborate on results.
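
As a rough sketch of what I mean by scroll-wheel highlighting (again, everything here is made up for illustration and not Rabbit's real API), the map view could just keep a highlighted index that the wheel moves through:

```ts
// Hypothetical map view state driven by the scroll wheel.
// All names are invented for illustration only.
interface MapSpot {
  id: string;
  name: string;
  address: string;
  lat: number;
  lon: number;
}

interface MapViewState {
  spots: MapSpot[];
  highlightedIndex: number; // the spot that follow-up questions would refer to
}

// Scrolling the wheel moves the highlight; wrapping keeps it in range
// even for negative deltas (scrolling backwards).
function scroll(state: MapViewState, delta: number): MapViewState {
  const n = state.spots.length;
  if (n === 0) return state;
  const next = ((state.highlightedIndex + delta) % n + n) % n;
  return { ...state, highlightedIndex: next };
}
```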


I agree. I appreciate the way Perplexity explains what it's doing with that information. Rabbit should passively show that information in the generative UI.


Wow, congratulations on the update today! I haven't seen the generative UI update yet, but I'm updating now. Very exciting to see what Rabbit did! And I remembered this post…


Perplexity does not seem to work with the generative UI, which is a pity because those answers are usually really long.


Yes, I have noticed that it seems to truncate or summarise the answers to my Perplexity-based questions. I was wondering if this was an effort to fit the UI experience it generated. I'm sure that can be fixed though, right? I wonder if adding it to the UI prompt would help.
