Comment by maaaats
11 years ago
The video latency is low enough that you don't notice it. We tried moving our hands in front of it, and it feels weird to see yourself in real time!
The rotation of the cameras is a bit too slow, though that's mainly because of our setup, so it could be a lot better. Our custom protocol couldn't really handle dropped packets, so if we filled the sender's or receiver's buffers, the servos would start to spasm. To counter that, we simply didn't send the head position as often as we could have.
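A rough sketch of that throttling idea (not our actual code; `transmit` is a made-up placeholder for whatever carries the servo commands). The key point is that stale head positions are overwritten rather than queued, so nothing backs up in a buffer:

```python
import time

class ThrottledSender:
    """Send only the most recent head position, at a capped rate.

    Hypothetical sketch: `transmit` stands in for the actual radio
    or serial link; the real protocol isn't shown here.
    """

    def __init__(self, transmit, min_interval_s=0.05):
        self.transmit = transmit      # callable taking (pan, tilt)
        self.min_interval_s = min_interval_s
        self.latest = None
        self.last_sent = 0.0

    def update(self, pan, tilt):
        # Always overwrite: an old head position is worthless, so
        # there is no queue to fill the sender's buffer.
        self.latest = (pan, tilt)

    def tick(self):
        # Call this from the main loop; it transmits at most once
        # per min_interval_s, and only the newest value.
        now = time.monotonic()
        if self.latest and now - self.last_sent >= self.min_interval_s:
            self.transmit(*self.latest)
            self.last_sent = now
            self.latest = None
```

With something like this, dropped or delayed packets only cost you one update interval instead of a growing backlog.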
Of course, even after fixing that, you still get the round-trip time of turning your head -> moving the cameras -> getting the updated image back. But we think it's feasible.
I do a bit of FPV myself, so I know what you mean about the real-time feel! The round-trip is what I meant - especially given all the work that's gone into getting the latency down on the Rift.
I've always thought fixed, super-wide-angle lenses plus software would be the ideal way to go - the ground station does all the work.
Yeah, it's been suggested to us a few times after people have seen the project. Someone should definitely explore that approach as well.
Any plans to use a brushless gimbal? It would be way faster and smoother, and I bet the latency would be imperceptible.
Any reason you didn't just transmit three normal RC PWM channels? You could do all the custom circuitry on the ground: plug the Oculus into a microcontroller, then output to the trainer port of an RC transmitter, or use one of the standard RC transmitter modules like JR or Futaba.
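For context, mapping a head angle onto a standard RC channel is trivial - it's just a linear map onto the usual 1000-2000 µs pulse width (1500 µs centred). A rough sketch, with illustrative numbers (real transmitters and servos vary):

```python
def angle_to_pwm_us(angle_deg, angle_range_deg=90.0,
                    min_us=1000, max_us=2000):
    """Map a head angle in [-angle_range_deg, +angle_range_deg]
    to a standard RC servo pulse width in microseconds.

    Endpoints and range are illustrative assumptions, not specs
    from the project.
    """
    # Clamp so extreme head turns don't command the servo past
    # its mechanical endpoints.
    a = max(-angle_range_deg, min(angle_range_deg, angle_deg))
    span = max_us - min_us
    return int(round(min_us + span * (a + angle_range_deg)
                     / (2 * angle_range_deg)))
```

So three of these (pan, tilt, roll) fed into a trainer port would cover the whole head-tracking link with stock RC gear.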
As I said in my top-level comment, it was a university project. So the reason is basically that we were building a prototype the best way we could, in a way we knew we could finish in time. :)
I read your PDF. I haven't done anything along these lines; I just thought all the USB transmission and decoding sounded way more difficult than using standard RC transmission.
How does a weekend hobby project (and one that's been done before, no less) become a university research project?