What I'm Working On in VR
We have progress! After some research into the Oculus SDK and getting to work, I have some basic features built into the cockpit. The first feature I implemented was grabbables. OVR does a pretty good job with its prebuilt systems for grabbing objects, so I decided to stick with that. After getting it all set up and throwing some cups around, I decided to ditch the gravity, because we're in space! This was the result.
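For the floating props, all it really takes is turning gravity off on each grabbable's rigidbody. Here's a minimal sketch of that idea, assuming the Oculus Integration's prebuilt grab components are already on the object (the component name and drag values below are mine, not pulled from the project):

```csharp
using UnityEngine;

// Sketch: attach alongside the Oculus Integration grab components so loose
// cockpit props (cups, tools) float when released instead of falling.
// Assumes each prop already has a Rigidbody and Collider set up for grabbing.
[RequireComponent(typeof(Rigidbody))]
public class ZeroGravityProp : MonoBehaviour
{
    void Start()
    {
        // We're in space: no gravity, plus a little drag so thrown
        // objects don't drift away forever inside the cockpit.
        var rb = GetComponent<Rigidbody>();
        rb.useGravity = false;
        rb.drag = 0.1f;
        rb.angularDrag = 0.1f;
    }
}
```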
The next step was to get some rotation on the ship. I used a target transform to lerp the ship's rotation toward. That target rotates depending on the flight stick's rotation, and the flight stick rotates based on hand movement from the grab point. After playing around with rotation speeds and almost puking a few times, I found something that felt like a good speed. The lerp helps smooth out the movement so it doesn't feel as jarring, which helps with motion sickness.
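A rough sketch of that setup, assuming a flight stick transform whose local rotation is already driven by the grabbing hand (the field names and normalization constants here are illustrative, not the project's actual values):

```csharp
using UnityEngine;

// Sketch of the smoothed ship rotation: the stick's deflection accumulates
// into a target rotation, and the ship slerps toward it every frame so the
// motion stays gentle enough to avoid motion sickness.
public class ShipRotationController : MonoBehaviour
{
    public Transform flightStick;       // rotated by hand movement from the grab point
    public float rotationSpeed = 45f;   // degrees per second at full stick deflection
    public float smoothing = 2f;        // higher = snappier, lower = smoother

    Quaternion targetRotation;

    void Start()
    {
        targetRotation = transform.rotation;
    }

    void Update()
    {
        // Read stick deflection as pitch/yaw/roll rates (roughly -1..1 per axis).
        Vector3 tilt = flightStick.localEulerAngles;
        Vector3 rates = new Vector3(
            Mathf.DeltaAngle(0f, tilt.x),
            Mathf.DeltaAngle(0f, tilt.y),
            Mathf.DeltaAngle(0f, tilt.z)) / 90f;

        // Accumulate the deflection into the target rotation.
        targetRotation *= Quaternion.Euler(rates * rotationSpeed * Time.deltaTime);

        // Lerp toward the target so the movement never snaps.
        transform.rotation = Quaternion.Slerp(
            transform.rotation, targetRotation, smoothing * Time.deltaTime);
    }
}
```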
The next logical step was to create the thruster system. This was done by once again tracking hand position from a grabbed origin and then clamping the values. The rotation was limited to between -50 and 35 degrees. To calculate speed from it, I add 50 to the rotation and multiply by 100/85 (about 1.17) to get a nice 0 to 100 scale. After testing and seeing that 100 was way too fast, I divided it by 10 to get a 0–10 speed range based on the rotation. So the equation ended up being Speed = (xRotation + 50) * (100/85) / 10. I then lerped the current speed toward this new target speed to get an acceleration effect. After testing a few different types of movement, I decided on directly setting the rigidbody's velocity so that the ship constantly moves in its forward direction. While adding force would be more realistic, the velocity route just gives you way more control, which is important when it's your only means of transportation through the world.
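Put together, the throttle math and the velocity-based movement look something like this sketch (assuming a throttle lever transform whose local X rotation is clamped to -50..35 by the grab logic; field names and the acceleration constant are illustrative):

```csharp
using UnityEngine;

// Sketch of the thruster: map the lever's -50..35 degree range onto a 0-10
// speed, lerp the current speed toward it for an acceleration feel, then
// drive the rigidbody's velocity directly along the ship's forward vector.
public class ThrusterController : MonoBehaviour
{
    public Transform throttleLever;
    public float acceleration = 1.5f;   // how quickly current speed chases the target

    Rigidbody rb;
    float currentSpeed;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;
    }

    void FixedUpdate()
    {
        // DeltaAngle maps euler angles like 310..360 back to -50..0.
        float xRotation = Mathf.DeltaAngle(0f, throttleLever.localEulerAngles.x);
        xRotation = Mathf.Clamp(xRotation, -50f, 35f);

        // Speed = (xRotation + 50) * (100 / 85) / 10  ->  0 to 10
        float targetSpeed = (xRotation + 50f) * (100f / 85f) / 10f;

        // Lerp toward the target speed, then set velocity directly so the
        // ship always moves in its forward direction at the current speed.
        currentSpeed = Mathf.Lerp(currentSpeed, targetSpeed, acceleration * Time.fixedDeltaTime);
        rb.velocity = transform.forward * currentSpeed;
    }
}
```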
The last feature I implemented was a pressable interface. Any object I put this script on will now react, based on its own logic, to a press gesture. I had to create a state system for the controller to determine when the correct gesture is being made, but it was worth it once I got it working. This system will greatly speed up development for me over the next couple of days, and it seems to be pretty efficient in terms of performance.
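The basic shape of that pattern is roughly the following sketch. The interface, component, and state names are my own guesses at the structure, not the project's actual code, and the gesture state itself would be set by the controller state system described above:

```csharp
using UnityEngine;

// Sketch of the pressable pattern: any object implementing IPressable
// reacts with its own logic when the controller reports a press gesture.
public interface IPressable
{
    void OnPress();
}

// Example pressable: a cockpit button that runs its own logic when pressed.
public class CockpitButton : MonoBehaviour, IPressable
{
    public void OnPress()
    {
        Debug.Log($"{name} pressed");
        // Button-specific logic goes here (toggle lights, open a panel, etc.).
    }
}

// Hypothetical controller-side dispatch: when the hand's gesture state
// machine is in the Press state and the fingertip trigger touches a
// pressable object, call into it.
public class PressGestureDispatcher : MonoBehaviour
{
    public enum GestureState { Idle, Point, Press, Grab }
    public GestureState state; // updated elsewhere by the controller state system

    void OnTriggerEnter(Collider other)
    {
        if (state != GestureState.Press) return;

        var pressable = other.GetComponent<IPressable>();
        if (pressable != null) pressable.OnPress();
    }
}
```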
All in all, it feels good to be making some measurable progress. As these features continue being built, production will speed up even more. It's awesome to see this project slowly coming together, and I'm getting more and more comfortable with VR development. Shout out to all the team leads who have helped me get the Oculus up and running smoothly and who let me pick their brains for some best practices!