Wow, cool that someone is sharing our project! It's for a course at my university called "Experts in Teams", where they combine master's students from many programs into teams and give them tasks. A video with more shots is available[1].
We got UAVs for the Norwegian oil industry as a task, and explored how they could be flown better in the future. Other teams with the same task made cool stuff as well, for instance a custom drone for geological mapping.[2]
[1]: https://www.youtube.com/watch?v=ANSjwWomIJ8 [2]: https://www.youtube.com/watch?v=5di01L1mot8
As one commenter on youtube said: "Shut up and take my money".
But seriously, have you guys considered commercializing this? You're offering people the ability to experience flight. Not just a video of flight, but controlled flying in their own neighbourhood.
Here's one from a year ago - FPV system with QuadCopter, commercialised because people were asking for 'affordable' versions: http://www.youtube.com/watch?v=aWPrf4pw6V8
It's not Oculus Rift based, but it is FPV flight.
Slightly off topic, but NB: you need to be a bit careful with flying regulations, as radio-controlled aircraft are still regulated. In the UK, for example, you can't fly an FPV radio-controlled aircraft on your own: you must have a competent observer keeping it in line of sight at all times and watching for collision risks, so no flying it behind buildings or hills. You must also not fly within 50 meters of any building or vehicle you don't own, or above any congested or crowded area, or take off/land within 50 meters of other people. So where you say "experience flight in their neighbourhood" - they might not be allowed to:
http://www.fpvuk.org/fpv-law/
http://www.caa.co.uk/docs/33/ORS4%20number%20956.pdf
Does it look weird to have different little bits of video static in each eye? Also, a cool idea would be an OSD that supports 3D overlays.
Yeah, it is weird. You kinda want to rub your eyes or something. It was even weirder when we adjusted the cameras. I looked through the Oculus Rift, while another person adjusted and I would tell him when they were properly aligned. Having a person "turn" your eye, drop it etc. was crazy!
Could you tell us more about the "experts in teams" program? That sounds like a very interesting and unique approach.
The link[1] has a description. Basically, different professors create different "villages" one may apply to. Everyone taking a master's has to take this course, i.e. all 4th-year students. Our university has everything from engineering to human sciences, so the diversity of the groups is great. This brings new influences to each village's field, and good practice in teamwork for us students.
[1]: http://www.ntnu.edu/eit/formal-documents
This needs an automatic third-person follow mode.
Cool project! And it's cool to see an NTNU student on HN. I study data visualization at the Institute of informatics at UiB, and I really wish we had research groups with the same practical focus you guys get.
How much lag do you get before the head movements are picked up by the camera?
Have you considered doing some sort of aerial robot wars type competition?
How bad was the latency? Did it make anyone sick?
This idea has been buzzing around the FPV community since the Oculus Rift was first announced. There have been a few barebones setups, but this is the first fully-fledged implementation I've seen.
It's a fun usage for VR goggles and a great build project, but it's honestly not super useful, even for hobby flying. Any quadcopter you're flying with a head tracker is going to be so far away from other objects that the 3D effect will be minimal. The 3D might be great for zipping quickly through trees/obstacles like a pod racer [0], but for that you don't dare use head tracking.
[0] http://youtu.be/xlKrabm5Exg
They're not controlling the quad with head tracking, just the camera orientation. So for "pod racing" this should work, making it more realistic both because of the 3D effect and because you can turn your head as you would if you were sitting inside it.
I know. It would still be disastrous to be able to turn your head when doing fast moves like this. To make it work there would need to be something in the frame of the camera, or overlaid on the video, to indicate the center of the frame, because orientation is extremely important.
I think it's not about the 3D; it's about precise orientation of the camera using head tracking.
Youtube here: https://www.youtube.com/watch?v=ANSjwWomIJ8
Very cool idea. I could think of some indoor applications for it.
For outdoors, since the distance between the cameras and the object they're looking at is so large, and the separation between the two cameras (matching the two eyes) is so small, the difference between the two images is almost meaningless. I'd put the two cameras further apart and see what happens.
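A rough way to see why the stereo effect vanishes at range: the angular disparity between the two views of a point shrinks roughly in proportion to distance. The sketch below (the function name and the 0.064 m human-IPD baseline are my own choices, the latter matching the figure commonly cited for average IPD) just computes that angle:

```python
import math

def disparity_deg(baseline_m, distance_m):
    """Angular disparity between two viewpoints separated by
    baseline_m, looking at a point distance_m away."""
    return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

# Human-IPD baseline (~0.064 m): strong disparity up close,
# negligible at the distances a quadcopter keeps from obstacles.
for d in (1, 5, 20, 100):
    print(f"{d:>4} m: {disparity_deg(0.064, d):.3f} deg")
```

At one meter the disparity is a few degrees; at a hundred meters it is a few hundredths of a degree, well below what the displays can resolve, which matches the comment above.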
I remember looking through this thing: http://eyestilts.com/intro.html at Burningman many years ago - it makes things look very un-real - much like tilt-shift photos. I suspect it might become pretty disorienting to fly using such odd optics...
The lenses in this device are pretty close to the subject they're looking at. If you put cameras with a wider separation on a drone, they'll be farther away from the subjects and the effect will be less extreme; they'd also be very difficult to mount with 3 meters of separation.
That device is awesome! I would love to try it. It would be fairly simple to simulate the effect in any game with VR goggles.
A really weird effect I've tried is to reverse the images each eye sees. I did this (first unintentionally, then deliberately) with my 3D television. It inverts the depth perception, so a small object in the foreground looks like it's sunken deep into the background. But mostly it's just confusing and stressful on the eyes, since there's presumably no evolutionary preparation to make sense of such visual input. Someone should make a physical optical device to offer this effect in the real world (it probably already exists).
Reminds me of this xkcd: http://xkcd.com/941/
Our drone couldn't really fit anything wider, so we went with the average human IPD according to the Rift SDK. You're right, at distance it's of no use. Close up, as we tried to illustrate with the pole, it helps. Aligning the cheap CMOS cameras was hard, though.
Awesome project. Have you thought about using a spherical camera, so that the pan/tilt can be done entirely in software? Make it a bandwidth problem, instead of a mechanical problem.
I think one major issue there is that all of the available video transmitters for FPV transmit in standard definition. You'd have a hard time spreading that over a sphere and being able to cobble anything useful together.
Most of the FPV videos you see on YouTube are not showing what the pilot saw in real time. They're showing you the HD footage that was recorded on board the plane and retrieved after landing.
You don't really have to send the entire video feed from the camera, only the part you're looking at. That way, most of the "software" is running remotely, and you also don't have a mechanical component to worry about.
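The "send only what you're looking at" idea can be sketched as cropping a head-orientation-dependent viewport out of a wide (here equirectangular) frame before transmission. Everything below is a toy illustration of the geometry, not the project's code; the function name, the 360×180-degree frame layout, and the field-of-view value are all assumptions:

```python
import numpy as np

def crop_viewport(frame, yaw_deg, pitch_deg, fov_deg=90):
    """Crop the region of an equirectangular frame the viewer faces.
    frame: H x W(x C) image covering 360 deg horizontally, 180 vertically."""
    h, w = frame.shape[:2]
    vw = int(w * fov_deg / 360)           # viewport width in pixels
    vh = int(h * fov_deg / 180)           # viewport height in pixels
    cx = int((yaw_deg % 360) / 360 * w)   # centre column from yaw
    cy = int((90 - pitch_deg) / 180 * h)  # centre row from pitch
    cols = [(cx - vw // 2 + i) % w for i in range(vw)]  # wrap horizontally
    top = max(0, min(h - vh, cy - vh // 2))             # clamp vertically
    return frame[top:top + vh][:, cols]

frame = np.zeros((180, 360, 3), dtype=np.uint8)  # stand-in for one frame
view = crop_viewport(frame, yaw_deg=45, pitch_deg=0)
print(view.shape)  # (90, 90, 3)
```

Only `view` would go over the radio link, so the bandwidth cost is the viewport, not the whole sphere, and there are no servos in the loop.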
Or, for a more challenging (though more dangerous and probably less useful) project, make the head tracking control the pitch, roll, and yaw of the quadcopter directly. With altitude hold functionality (a lot of flight controllers implement this with a barometer, though not the DJI Phantom) you could have completely hands free flight.
I had thought of this, then I wondered, why not have head movement do both? That is, have the camera pan/tilt/roll in sync with your head and also shift the plane on the same axis. That way you could know which way was centered (looking out over the nose of the craft), and immediately see where you'd be headed when the turn completed.
Think about the geometry of that problem harder - particularly when it comes to the stereo vision aspect.
As it appears to be wireless, I would imagine the combined latency from head movement to camera movement, and then from video capture to video output, would likely be enough to make users feel ill.
The camera movement latency is probably quite noticeable. With a brushless gimbal this could be greatly reduced. The 5.8 GHz video transmission setup is widely used for FPV flying, and its latency is negligible.
Video showing the reaction speed of brushless gimbals: http://youtu.be/Dr5CnIA52zY
Wireless 2.4 GHz radio latency is measured in milliseconds, which is quick enough not to notice. Latency also depends on how quick the gimbal servos are, but that too may be quick enough.
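The latency discussion above is really about a budget: the radio hops are cheap, and the mechanical stage dominates. A back-of-the-envelope tally, with every figure an assumption for illustration rather than a measurement from this project:

```python
# Rough motion-to-photon budget for head-tracked FPV (assumed figures):
budget_ms = {
    "head tracker sample": 2,
    "2.4 GHz control uplink": 20,       # RC radio latency, milliseconds
    "servo pan/tilt movement": 50,      # dominates; a brushless gimbal is far faster
    "camera exposure + readout": 17,    # ~one frame at 60 fps
    "5.8 GHz analog video downlink": 1, # analog FPV links are near-instant
    "display refresh": 17,
}
total = sum(budget_ms.values())
print(f"total ~{total} ms; mechanical share: {budget_ms['servo pan/tilt movement']} ms")
```

Under these assumptions the servos account for roughly half the round trip, which is why swapping them for a brushless gimbal (or cropping in software) is the suggestion that keeps coming up in this thread.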
Since the copter's movement and changes of speed aren't very fast, you could apply time warp to this; or just have the camera film at 120 fps.
Perhaps, but it won't be long before omnidirectional arrays of cameras have all perspectives available even before you turn your head.
This is going to be a job someday.
People are going to train for this.
Very likely police, field agents, military, but there are absolutely going to be people whose sole jobs will be piloting drones like this.
(Until those people are replaced by AI algorithms, but still.)
Apparently telepresence like this was the original application of HMDs (like the Philco Headsight); the concept was only generalised into VR (and AR) by Ivan Sutherland a few years later.
Somewhat related: https://www.youtube.com/watch?v=SQ2tCMXOd_w https://share.oculusvr.com/app/hiyoshi-jump
I want to turn this into a sport.
Imagine this with some augmented reality thing where multiple contestants could shoot each others' copters with an IR laser or something.
Heck yes.
This game exists for the Parrot AR drones. http://singularityhub.com/2010/06/17/parrots-ar-drone-gets-c...
I'd love to know how well this works. I always assumed the pan and tilt setup would be too slow for the Rift.
The video latency is low enough that you don't notice. We tried moving a hand in front of it; it feels weird to see yourself in real time!
The camera rotations are a bit too slow, though, mainly because of our setup, so this could be a lot better. Our custom protocol couldn't really handle dropped packets, so if we filled the sender's or receiver's buffers, the servos would start to spasm. To counter that, we simply didn't send the head position as often as we could have.
Of course, by fixing that you still get the round-trip time of turning your head -> moving the cameras -> getting updated image back. But we think it's feasible.
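The buffer-filling problem described above has a standard shape: head poses are state, not events, so it is safe to overwrite unsent ones and transmit only the newest at a bounded rate. A minimal sketch of that pattern (the class and `send_fn` are hypothetical, not the project's protocol):

```python
import time

class PoseSender:
    """Send only the newest head pose, at most rate_hz times a second,
    so a slow or lossy link never accumulates a queue of stale updates.
    send_fn is whatever writes one packet to the radio (assumed)."""

    def __init__(self, send_fn, rate_hz=20):
        self.send_fn = send_fn
        self.min_interval = 1.0 / rate_hz
        self.latest = None       # most recent pose; older ones are dropped
        self.last_sent = 0.0

    def update(self, yaw, pitch, roll):
        self.latest = (yaw, pitch, roll)   # overwrite, never queue

    def tick(self):
        """Call from the main loop; sends at most once per min_interval."""
        now = time.monotonic()
        if self.latest is not None and now - self.last_sent >= self.min_interval:
            self.send_fn(self.latest)
            self.last_sent = now
            self.latest = None
```

With this, filling the link's buffers is impossible by construction: the Rift can be sampled at full rate while the radio only ever sees the freshest pose.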
I do a bit of FPV myself so I know what you mean about the real-time! The round-trip is what I meant - especially with all the work to get the latency down on the Rift.
I've always thought fixed, super-wide-angle lenses and software would be the ideal way to go - the groundstation does all the work.
Any plans to use a brushless gimbal? It would be way faster and smoother, and I bet the latency would be imperceptible.
Any reason you didn't just transmit 3 normal RC PWM channels? You could do all the custom circuitry on the ground. Plug the Oculus into a microprocessor, then output to the trainer port of an RC transmitter or perhaps use one of the standard RC transmitter modules like JR or Futaba.
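The trainer-port idea above amounts to encoding the three head angles as standard RC pulse widths. A sketch of that mapping, assuming the usual 1000-2000 µs servo convention with 1500 µs centred (function name and the ±90° range are my own illustrative choices):

```python
def angle_to_pwm_us(angle_deg, angle_range_deg=90.0):
    """Map an angle in [-range, +range] degrees onto a standard
    RC pulse width: 1000-2000 microseconds, 1500 centred."""
    clamped = max(-angle_range_deg, min(angle_range_deg, angle_deg))
    return int(1500 + 500 * clamped / angle_range_deg)

# Three channels for the head pose, e.g. fed to a transmitter trainer port:
yaw, pitch, roll = 30.0, -15.0, 0.0
channels = [angle_to_pwm_us(a) for a in (yaw, pitch, roll)]
print(channels)  # [1666, 1416, 1500]
```

A microcontroller on the ground reading the Rift and emitting these pulses would keep all the custom electronics off the airframe, exactly as the comment suggests.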
I assumed it must use a 360 camera, but I was wrong. I don't know much about quadcopters, but it seems like a decent idea to me.
Looks like the cameras could use some good OIS.
Good old fashioned vibration dampening would go a long way. That's a big area of concern for all FPV pilots, and it's fairly easy to get good results. The ultimate solution would be to also add a brushless gimbal for stabilization as well as low reaction latency:
http://youtu.be/Dr5CnIA52zY
Great to see more projects using the Oculus Rift for FPV. We are using it in our Sky Drone FPV system.
http://spiritplumber.deviantart.com/art/dancers-in-the-dark-... Here, I wrote you a little story.
Geez, why the hate? It's a story about FPV quadcopters dancing.