Wednesday 25 March 2015

Astronautics Simulation - Revisited (Post Mortem)

Even though the module was over and I had received my grade, I spent a weekend fixing up this project to get it ready to demo on the 18th of March. I did the demonstration on the Wednesday and things went well, which I am happy with. The members of the astronautics department seemed to like it a great deal and asked many VR/Oculus questions. Below I'll summarise what I did to get the project finished and ready, and after that I'll cover future work and improvements that could be worked on.

In my last Astronautics Simulation post, I briefly covered what worked and what did not work with the system. I used those points to fix the original issues. I rebuilt the simulation, reusing the assets and some of the code from the previous version. However, I removed the first minigame and designed 3 new ones. I also fixed the skybox bug in minigame 2 and completed it. I then added audio, a GUI and narration to the simulation, which I created myself using Photoshop and Audacity along with a voice changer. The system still follows a similar modular structure. I'll cover each part below:

Main Menu:



The main menu had issues due to constricted movement. Even though this didn't stop the user from using the system, most of the playtest feedback received spoke of allowing users to move around the main menu with more freedom. Due to the nature of the Oculus Rift, many users wanted to explore every level, even the menu, by walking around and looking at things.

There was also a bug where the user could end up turned around when booting into the menu, before putting on the Rift. The tracking data would incorrectly set where the Rift was pointing before the user had it on their head. I found the same issue occurred when loading from one level into another (a Unity Scene in this case): the tracking data would update before the level finished loading, causing the user's center view to be set off-center in-game. I fixed this by offering users a way to reload each level by pressing a button on the pad or a key on the keyboard. This allows the level to reset and the new tracking position to be set correctly. It is more of a workaround than a bug fix, but it works well enough.
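The workaround can be as small as something like this (a sketch only; the class name and the "Reset" input binding are my own illustrative choices, not taken from the project, and the API is the Unity 4.x-era one in use at the time):

```csharp
using UnityEngine;

// Hypothetical helper: reload the current level on demand so the Rift's
// tracking origin is captured again once the user actually has the
// headset on. Attach to any object that survives until the reset.
public class LevelResetter : MonoBehaviour
{
    void Update()
    {
        // "Reset" would be an input axis mapped to a pad button; R on the keyboard.
        if (Input.GetKeyDown(KeyCode.R) || Input.GetButtonDown("Reset"))
        {
            // Reloading the level forces the tracking pose to be sampled
            // again, re-centering the user's view.
            Application.LoadLevel(Application.loadedLevel);
        }
    }
}
```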

The new main menu was reconstructed to be a level in itself, a change from the row-based menu-panning mechanic used previously. The new menu allows the user to select a minigame from 4 different panels which hover above the floor. The user is also able to navigate around the room and look at the various satellites and other eye-candy dotted around the level. This allows for less constricted movement, which works better on the Rift. The user selects a level the same way as before, by looking at the desired panel and pressing Enter or A on the gamepad.

Minigame 1 (New): - Weather Analyser



The original minigame 1 was completely removed. The underlying design did not work well with the Rift, so I thought it best to start over. The new minigame has a similar look but deals with the planet Neptune rather than its moon Triton. The user can now look around unconstricted as they pan around Neptune along the X and Y axes, getting closer to and further from the planet as they enter and exit its dark side.

The objective of this minigame is to navigate around Neptune and spot various weather anomalies. The user has a simple GUI consisting of a crosshair and a context panel. Once the user spots a weather anomaly (by looking at it through the crosshair), a message pops up at the bottom in the form of a context GUI panel. A voice-over narration also starts to play, explaining the weather anomaly in more detail.
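Since the crosshair sits at the center of the view, spotting boils down to a raycast along the camera's forward vector. A minimal sketch of how this could look (the field names, the "Anomaly" tag and the GUI/audio hooks are assumptions for illustration, not the project's actual code):

```csharp
using UnityEngine;

// Sketch: detect a weather anomaly by casting a gaze ray from the camera.
// Attach to the (Rift) camera; anomalies carry colliders tagged "Anomaly".
public class AnomalySpotter : MonoBehaviour
{
    public GUIText contextPanel;   // hypothetical context-panel hook
    public AudioSource narration;  // hypothetical narration hook

    void Update()
    {
        RaycastHit hit;
        // The crosshair is centered, so the gaze ray is just the forward vector.
        if (Physics.Raycast(transform.position, transform.forward, out hit)
            && hit.collider.CompareTag("Anomaly"))
        {
            contextPanel.text = hit.collider.name; // show the context message
            if (!narration.isPlaying)
                narration.Play();                  // start the voice-over
        }
    }
}
```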

At this moment in time there are a total of 5 weather anomalies to find. There is no time limit, as the game is designed so the user always has a point of reference, which keeps motion sickness from being an issue. The camera pan speed was also slowed down to a comfortable level to help with this. At any time the user can quit out of the game via the B button on the pad or the N key on the keyboard, so they can play for as long as they feel comfortable before quitting out.

Minigame 2 (Reworked): - Infrared Spectrography



This minigame worked well but was not feature complete and had a nasty bug in it. In this version the game was finished off and the terrible skybox seams were fixed.

Apart from that, not much changed from the original concept. The user uses the RB/LB/RT/LT buttons on the pad or the U/I/O/P keys on the keyboard to select an infrared mode. Based on the mode selected, they will be able to identify different objects in the galaxy they see in front of them. The user can view the level in 360 degrees and is not constricted in any way in terms of camera panning along any axis.
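One way mode switching like this can be wired up is to swap the skybox material per spectrum. This is a sketch under that assumption (the field names and key bindings mirror the keyboard controls above; I'm not claiming this is how the project implements it):

```csharp
using UnityEngine;

// Sketch: switch the rendered spectrum by swapping the active skybox
// material when one of the mode keys (U/I/O/P) is pressed.
public class SpectrumSwitcher : MonoBehaviour
{
    // One skybox material per vision mode, assigned in the inspector.
    public Material visible, nearInfrared, midInfrared, farInfrared;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.U)) RenderSettings.skybox = visible;
        if (Input.GetKeyDown(KeyCode.I)) RenderSettings.skybox = nearInfrared;
        if (Input.GetKeyDown(KeyCode.O)) RenderSettings.skybox = midInfrared;
        if (Input.GetKeyDown(KeyCode.P)) RenderSettings.skybox = farInfrared;
    }
}
```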

Minigame 3 (New): - Probe/Orbiter/Satellite museum


This minigame is a simple room the user can walk around inside. In this room the user will find close-up models of both the Orbiter and the Probe. These two models were recreated based on the CAD files provided by our supervisor, with as much detail as possible put into the recreation. When the user walks up to one of these models and gets close enough, the narrator starts reading out background information about the Probe/Orbiter.

Along with these models the user can find 6 of Neptune's satellites, modeled by Begoña in great detail. When the user walks up to these, the narrator provides information about each satellite, including its physical traits.
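Proximity-triggered narration of this kind is usually done with a trigger volume around each exhibit. A minimal sketch, assuming a sphere trigger per exhibit and a "Player" tag on the character controller (both assumptions of mine, not details from the project):

```csharp
using UnityEngine;

// Sketch: start an exhibit's voice-over when the player walks into range.
// The SphereCollider on this object must have isTrigger enabled and be
// sized to the "close enough" radius around the exhibit.
[RequireComponent(typeof(SphereCollider))]
public class ExhibitNarration : MonoBehaviour
{
    public AudioSource narration; // this exhibit's narration clip

    void OnTriggerEnter(Collider other)
    {
        // Only react to the player, and don't restart a clip mid-sentence.
        if (other.CompareTag("Player") && !narration.isPlaying)
            narration.Play();
    }
}
```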

Minigame 4 (New): - Triton Orbit


This moon was created by one of our modelers, Begoña, and includes a normal-mapped texture taken from actual images of Triton. This minigame is pretty similar to minigame 1, however the objective here is to observe where the photo quality of the texture drops due to a lack of imaging data. This shows how our understanding of Triton could be increased by additional visits and investigation.

Future Work

There is a lot of room for improvement. I'll cover this below in a general way and then per level.

Generally, it would be a good addition to the simulation to add some particle effects or shaders that draw stars or interstellar gas as you orbit around things in the game. Even as a far-off effect, it would add to the immersion of the system.

It would be good to have more than one audio track in the game. This could be added easily; the difficulty would be in tracking down appropriate audio tracks to use, with permission.

The narration could be improved upon. As it stands I use a voice generator and modify the output with a voice changer, raising the voice by 2 semitones. This makes the voice sound warmer and more human, but it is still not ideal. The best solution would be to have somebody read out the narration rather than use synthetic voice generation tools.

The objectives in most of the minigames are similar in nature, so it would be good to add additional gameplay elements. If this is done, however, it's important to keep in mind which mechanics would work well in VR and which would not. Due to time constraints I followed a similar path for all of the minigames, as I knew it would work well in VR. Additional research into designing for VR would be of great use here.

Main Menu improvements

The lighting in this level is not perfect yet. This is largely due to the lack of shaders around the planet Neptune. Because I did not have time to create these, I had to use a hacky way to make the planet look nicer: I did it with lighting alone, which is not ideal. It would be good to spend more time on the lighting and on improving the Neptune model itself.

The floor is a very basic texture and is pretty bright. It also has some issues with level of detail when rendering on the rift. It would be better to get a nicer texture and maybe redesign the level to match it.

Mission 1 improvements

Having a planet Neptune with a much more detailed texture would be great here. Preferably one that shows different weather patterns on it (such as the dark spot). 

Additionally, it would be nice to have a UI graphic that guides the user towards the different weather anomalies, as at the moment they are difficult to track down. It would also be good to have a GUI graphic appear on the planet where a weather anomaly was spotted, to show the user which ones they have found and which they have not.

Mission 2 improvements

Add additional things to find. At the moment there are only 5-6 objects to find in this game. It would also benefit from GUI guidance and tagging of objects found.

Touching up the near and mid infrared skyboxes to be more realistic would also be a great improvement.

The only major bug left in the system occurs in this minigame. For some unknown reason, movement controls will occasionally break game-wide. The bug starts occurring when switching vision mode to the far infrared spectrum. There is no logical reason this should happen, but it does. Furthermore, reloading the level does not fix the issue; the only way to fix it is to reboot the game. I suspect something is going wrong with the OVR character controller. Further investigation will be required to see what is up here.

Mission 3 improvements

Adding a way to stop narration (by pressing a button for example) would be good. This would allow the user to stop the narrator if they accidentally triggered narration. Another solution could be having the user press a button to start the appropriate narration when looking at an object.

Mission 4 improvements

This is a pretty basic game. Expanding on this in more detail would be good. Adding additional narration or objectives perhaps.

In closing, I'm pretty happy that I reworked the old simulation and got it to a point where it is feature complete and almost bug free. If required and time permits, I'll look at building onto this, with the hope that the system gets used for the task it was created for.


Monday 23 March 2015

Oculus Rift Side Project - Dolly zoom test

I did a test a few weeks back (quite a while ago now) to see how the dolly effect worked with cameras and their frustum. The idea was proposed by my supervisor as something that might be interesting to explore in my free time. We discussed it during one of my MSP module meetings.

I did not know about this 'Vertigo' effect until it was mentioned by my supervisor. Only after speaking with him did I realize that I had seen this effect in many movies. I thought it would be interesting to look into, and I was specifically interested in how it would apply to the Oculus Rift and what sort of effect it would have on users.

I started by looking into the Dolly Effect on Wikipedia (Wikipedia, 2015), as it gave a good overview with examples as well as the maths that went into creating the effect. 

After this I looked at how to implement it in a prototype and hook it up to the Rift. The easiest way, I thought, would be to use Unity, since I had other projects already set up in that engine to work with the Rift. On the off chance, I looked at the Unity documentation and found that they had all the code written for a dolly zoom effect! (Unity Documentation, 2015) This meant the only modifications I had to make were to convert the JavaScript code into C# and then get the effect to apply to the Oculus Rift.

Converting the code into C# wasn't strictly required, but I thought it would help me better understand the mathematics behind the concept. And it did; the conversion itself did not take long at all.

The next step was attempting to get the effect to work on the DK2. I couldn't simply attach my dolly effect script to each Oculus camera, as this broke the cameras during runtime. I instead had to modify the OVRCamera script that came with the Unity integration package, by 'merging' my script with it. Most of the merging was pretty straightforward: adding variable declarations to the top and so on. The only really interesting part was that I had to use the LateUpdate() method rather than a normal Update(), as the Rift updates differently from the standard Unity Update(). Once this was in place, the code worked like a charm and the effect looked awesome! The code changes can be found below, summarized:
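The gist of the change can be sketched as a standalone script like the one below, built around the two formulas from the Unity documentation: the frustum height at a distance d is h = 2 d tan(fov/2), and inverting it gives fov = 2 atan(h / 2d). This is an illustrative reconstruction, not the actual merged OVRCamera code; the field names are mine, and it uses the Unity 4.x-era `camera` property:

```csharp
using UnityEngine;

// Sketch of the dolly zoom: keep the frustum height at the focal object
// constant and recompute the field of view each frame. The recompute
// lives in LateUpdate(), after the Rift's own per-frame camera update.
public class DollyZoom : MonoBehaviour
{
    public Transform focalObject; // assigned in the inspector (e.g. the door)
    private float initHeight;     // frustum height at the focal object at start

    void Start()
    {
        float distance = Vector3.Distance(transform.position, focalObject.position);
        // Frustum height at distance d: h = 2 * d * tan(fov / 2)
        initHeight = 2.0f * distance *
            Mathf.Tan(camera.fieldOfView * 0.5f * Mathf.Deg2Rad);
    }

    void LateUpdate()
    {
        float distance = Vector3.Distance(transform.position, focalObject.position);
        // Invert the formula: fov = 2 * atan(h / (2 * d))
        camera.fieldOfView = 2.0f *
            Mathf.Atan(initHeight * 0.5f / distance) * Mathf.Rad2Deg;
    }
}
```

Moving the camera towards the focal object then widens the FOV to compensate, which is exactly what produces the stretching/skewing look.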




What was even neater was that I could attach a focal object in the scene via a public variable, and doing so would give a different effect: stretching towards the door (see the below video for an example), skewing the door when heading towards a teddy bear, or bursting through the door completely. It feels pretty trippy when this effect is used on the Rift :D



Pretty fun stuff!

Wikipedia. (2015) Dolly zoom. [Online] Available from: http://en.wikipedia.org/wiki/Dolly_zoom [Accessed: 3 March 2015].
Unity Documentation. (2015) Dolly Zoom (AKA the “Trombone” Effect). [Online] Available from: http://docs.unity3d.com/Manual/DollyZoom.html [Accessed: 3 March 2015].
YouTube. (2015) Dolly Effect / Vertigo Effect on DK2 in Unity. [Online] Available from: https://www.youtube.com/watch?v=UfhAJR4cjzU [Accessed: 3 March 2015].