Comment by stavros

2 days ago

I'm really excited about the Index. I don't love that it's disposable, but I really like the UX. I couldn't wait, so I made my own (obviously not a ring, but airtag-sized), and it's amazing. I have it in my pocket, I take it out, speak a little note, and it goes off to my AI assistant for whatever needs doing.

That and the AI assistant have really changed how I operate day to day. I'm super excited about the Index, and I hope it has the same capability my app has (mostly, sending a webhook with the transcription, with exponential backoff so I can be sure all my notes eventually get delivered).
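
For the curious, the retry part is nothing fancy, just the usual exponential-backoff loop. A minimal sketch in Python (the endpoint URL and payload shape here are placeholders, not what the device actually sends):

    import time
    import requests

    WEBHOOK_URL = "https://assistant.example.com/notes"  # placeholder endpoint

    def send_note(transcription: str, max_attempts: int = 8) -> bool:
        """POST the transcription to the assistant, retrying with exponential backoff."""
        delay = 1.0  # seconds to wait before the first retry
        for _ in range(max_attempts):
            try:
                resp = requests.post(WEBHOOK_URL, json={"text": transcription}, timeout=10)
                if resp.ok:
                    return True
            except requests.RequestException:
                pass  # network hiccup; fall through to the retry
            time.sleep(delay)
            delay *= 2  # 1s, 2s, 4s, ... between attempts
        return False  # caller can queue the note and try again later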

I too am very excited. I had a voice recorder lying around and have worked it into my workflow over the past few months, although my AI assistant is just a cobbled-together set of Python scripts.

What are you using for your AI assistant?

  • I made my own, as I thought OpenClaw was a bit too insecure:

    https://github.com/skorokithakis/stavrobot

    I love it, it's amazing. I want to add a small section to the README about how to use it well (how to manage memory and the database, basically), but even without that it's just fantastic. It has had basically zero bugs, as well.

    • Interesting, would you mind sharing your architectural setup? How does your Index communicate with your agent server, and what is the main agent framework/engine you used?

      Speaking into your watch/wearable and having it automatically save notes or perform tasks on the fly sounds like a cool concept.

      What is the general execution time from:

      Prompt received -> final task executed?

Can you give some more detail about the airtag-sized device you made? This is exactly what I've been thinking about doing to test the "idea" of the Index, but I haven't figured out how to go about it.

(Tried looking on your blog, but instead ended up reading your article about the little ESP8266 clock, which convinced me to buy one to play with myself, thanks!)