Comment by arend321
5 hours ago
I've been trying out various mobile, AI-assisted coding workflows.
Packing a Linux mini-PC in my rucksack, connected to display glasses, with voice-to-text via Handy. The voice-to-text gets injected into a remote (Docker) codex session running a hot-reload web stack. I prompt it to implement various features in an existing codebase, where codex understands the structure and requirements. When a feature is done, I take a moment to inspect the results on the display glasses, then move on to the next feature or keep iterating. It's not perfect, but I was able to implement a couple of not-too-complex features while walking through my local national park.

The display glasses have a built-in 4-microphone array and solid speakers, so there's no need for a bulky headset or earbuds. The glasses also come with monochromatic dimming, and you can easily switch between dimmed and see-through modes.
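For anyone curious what the remote session part might look like: a minimal sketch as a compose file, assuming a Node-based dev server with hot reload (the service name, image, and port are my own illustrative choices, not from the setup described above):

```yaml
# Hypothetical sketch: one container runs the hot-reload web stack;
# the coding-agent session is attached to it interactively.
services:
  webapp:
    image: node:22
    working_dir: /work
    volumes:
      - ./:/work            # mount the existing codebase
    command: sh -c "npm install && npm run dev"   # hot-reload dev server
    ports:
      - "5173:5173"         # example dev-server port
```

With something like this running on the mini-PC, you could attach an interactive shell via `docker compose exec webapp sh` and let the voice-to-text prompts land in the CLI session there.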
If this comes with Linux integration, I will certainly give it a try.