Thursday 20 November 2014

Decided Upon Design

At the moment we are awaiting approval from our mentor, and we hope he is happy with our preferred design. I thought it would be a good idea to cover our agreed-upon design before going into detail on the mechanics - please see below:

Design Proposal overview:
We brainstormed for a while, and one key point which kept coming up was that the Oculus Rift is pretty rigid in terms of user engagement: only one person can experience the simulation at a time. At an expo this is not ideal, as passing foot traffic would not be engaged by, or learn anything from, a headset someone else is wearing.

It would not do to simply put the output taken directly from the Oculus Rift onto a giant screen, as the Rift renders stereoscopically: the image is split into two separate side-by-side views, one per eye lens, so on a flat display it would look doubled and blurry.

This is why we came up with the idea of splitting up the simulation. At first we looked at combining different types of cameras in the Unity engine: a dual-screen setup in which one display (the Oculus Rift) would use the stereoscopic camera rig while another (a standard display) would use a normal Unity camera. Our attempts to make this work did not go well, and it crashed the engine a few times.

Even though there may be a way to get this idea working, it would require a lot of work. We therefore decided to build two separate, modular simulations instead: one would use the Oculus Rift, and the other would be more informative and run on a standard display of some description.

The Oculus Part:
The Oculus Rift part of the simulation would carry the most user interaction and do the heavy lifting on engagement. This section deals with informing the user about Neptune and all the data we would gather from it.

The player would use the Rift to look at Neptune and investigate the planet as if they were the probe orbiting it. Currently the game designers are looking into ways to make the gathering and measuring of data interesting; this part of the simulation will need to be the most gamified.

(I have also been working on a LookAtPoint feature which would allow feedback to be triggered when the user looks at specific points in space (Vectors). More on this in my next post.)
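The eventual LookAtPoint implementation will be a Unity C# script, but the underlying check is plain vector math: compare the headset's forward direction with the direction from the viewer to the target point, and treat the point as "looked at" when the angle between them is small. Here is a minimal language-agnostic sketch of that idea in Python; the function name, parameters, and the 5-degree threshold are illustrative choices of mine, not anything final from our design:

```python
import math

def is_looking_at(head_pos, forward, point, threshold_deg=5.0):
    """Return True if the unit gaze vector `forward` points within
    `threshold_deg` degrees of `point`, as seen from `head_pos`."""
    # Direction from the viewer's head to the target point.
    to_point = [p - h for p, h in zip(point, head_pos)]
    length = math.sqrt(sum(c * c for c in to_point))
    if length == 0:
        return True  # the point coincides with the viewer's position
    to_point = [c / length for c in to_point]
    # The dot product of two unit vectors is the cosine of the angle
    # between them, so a large dot product means a small angle.
    cos_angle = sum(f * t for f, t in zip(forward, to_point))
    return cos_angle >= math.cos(math.radians(threshold_deg))

# Looking straight down the z-axis at a point directly ahead:
print(is_looking_at((0, 0, 0), (0, 0, 1), (0, 0, 10)))   # True
# A point 45 degrees off to the side is not "looked at":
print(is_looking_at((0, 0, 0), (0, 0, 1), (10, 0, 10)))  # False
```

In Unity the same comparison can be done with the engine's built-in vector helpers, with per-frame feedback (highlighting, narration, data readouts) triggered whenever the check passes.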

The Presentation Part:
The next part of the simulation deals with engaging users who cannot, or do not wish to, use the Oculus Rift. It will inform them of the journey a probe would have to make from Earth to Neptune. It will have some limited interactivity, but for the most part it will be a presentation that can be watched on a standard display.

The UX designers are currently working on this part of the simulation's display.

The idea is that users who are waiting to use the Oculus Rift simulation can watch the journey the probe would have to make. They would then gain a greater appreciation for the Oculus part of the simulation, in which the probe has already reached Neptune.

We are currently awaiting approval, but we are also starting some generic work with this design in mind, as we have a presentation deadline on the 25th.
