Show HN: Controlling 3D models with voice and hand gestures

4 days ago (github.com)

I'm sharing my project to control 3D models with voice commands and hand gestures:

- use voice commands to change interaction mode (drag, rotate, scale, animate)

- use hand gestures to control the 3D model

- drag/drop to import other models (only GLTF format supported for now)

Built with three.js, MediaPipe, the Web Speech API, Rosebud AI, and Quaternius 3D models
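For anyone curious how the voice-driven mode switching might look, here is a minimal sketch using the Web Speech API. The command names and wiring are assumptions for illustration, not the repo's actual code; `parseModeCommand` is a hypothetical helper.

```javascript
// Hypothetical sketch of voice-driven mode switching; the repo's actual
// command names and handling may differ.
const MODES = ["drag", "rotate", "scale", "animate"];

// Pure helper: find a mode keyword anywhere in a transcript.
function parseModeCommand(transcript) {
  const words = transcript.toLowerCase();
  return MODES.find((m) => words.includes(m)) ?? null;
}

// Browser wiring with the Web Speech API (Chrome exposes it as
// webkitSpeechRecognition); guarded so the sketch also loads outside a browser.
const SR = globalThis.SpeechRecognition ?? globalThis.webkitSpeechRecognition;
if (SR) {
  const recognition = new SR();
  recognition.continuous = true;
  recognition.onresult = (event) => {
    const transcript = event.results[event.results.length - 1][0].transcript;
    const mode = parseModeCommand(transcript);
    if (mode) console.log("switching to mode:", mode); // update app state here
  };
  recognition.start();
}
```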

GitHub repo: https://github.com/collidingScopes/3d-model-playground

Demo: https://xcancel.com/measure_plan/status/1929900748235550912

I'd love to get your feedback! Thank you

I understand you need your face in the videos for the demos. But I want to mention that you should make sure your system works with the user's hands in their lap. As shown, the user is going to experience "gorilla arm" fatigue very quickly.

  • Good points, maybe a second camera (phone?) pointed downwards at the tabletop would be good for that. Then the user can rest their hands in a "normal" position.

    Thank you for the feedback!

Amazing! Maybe use specific finger positions/gestures to trigger the rotation and scale functions (index finger up within a bounding box of the model, perhaps, for rotation; similarly, pinch two fingers to scale).

  • I'll try it, thank you! I separated them into completely different interaction modes to avoid misfires, but there's definitely room for efficiencies.
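The pinch gesture suggested above can be detected directly from MediaPipe Hands output, which returns 21 normalized landmarks per hand (thumb tip is index 4, index fingertip is index 8). A minimal sketch; the threshold is a made-up tuning value, not from the repo:

```javascript
// Sketch of pinch detection from MediaPipe Hands landmarks.
// PINCH_THRESHOLD is an illustrative value in normalized image coordinates.
const PINCH_THRESHOLD = 0.05;

function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// landmarks: array of 21 {x, y} points as produced by MediaPipe Hands.
function isPinching(landmarks) {
  return distance(landmarks[4], landmarks[8]) < PINCH_THRESHOLD;
}
```

Running this per frame and entering scale mode only while `isPinching` is true is one way to avoid the misfires mentioned above.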

Slightly on topic: does anyone remember Leap Motion, and is anyone aware of any current support for it? I found an original one in a drawer when I was having a clearout the other day.

Awesome, nice work! This type of tech opens up a world of physical games.

Sounds very cool, but I could not make sense of the on-screen instructions. Some images or animations would go a long way to explain the controls.

  • Sorry about that, the instructions need to be improved.

    Does this video demo help?

    https://x.com/measure_plan/status/1929900748235550912

    If it makes things clearer, I'll upload it to the GitHub repo directly.

    • That video did help. I think I was thrown off by two things: 1) I was expecting 3D controls with more direct mapping (e.g. rotating my hand rotates the model). This is more like gesture mouse controls. 2) Some of the controls were too subtle. The scaling between my gesture size and effect on screen was smaller than I expected.

      Great area to develop though. There's so much untapped potential in applying Mediapipe.


Great job! Looks very useful for interactive content creation and product showcasing. I'll definitely be testing it more. Thanks for sharing.

  • Yes, I'd love to go further with this concept so that 3D/CAD designers could easily present their models during video calls.

    Thank you!