Saturday, 22 November 2014

Note to Self - My subtasks

Just a note to self of things I have to do today (22/11/2014):

  • Vote on a logo (done)
  • Import Luke's skybox, as I'm sure it'll look better in my simulation than mine
  • Compile my presentation slides based on my simulation thus far and send them to Matias
  • Check out Begoña's new version of the animation/video she made
  • Get the survey off Dropbox and mail it to Dr. Akinwande at Kingston College, as he is waiting on it
  • Upload all my project's source code to my GitHub profile and add Akshayan to it

Things to do on Sunday (23/11/2014):

  • Potential team meet up to discuss the presentation
  • If I'm around campus, try to track down Tom about the hardware (Oculus Rift), since I must have missed him on Tuesday

Thursday, 20 November 2014

Decided Upon Design

At the moment we are awaiting approval from our mentor, and we hope he is happy with our preferred design. I thought it would be a good idea to cover our agreed-upon design before going into detail on the mechanics - please see below:

Design Proposal overview:
We brainstormed for a while, and one key point which kept coming up was how rigid the Oculus Rift is in terms of user engagement. Using the Oculus Rift meant that only one person could experience the simulation at a time. At an expo this is not ideal, as foot traffic would keep passing by without engaging or learning anything.

It would not do to simply show a giant screen mirroring the Oculus Rift's output: the Rift renders a stereoscopic image, split into two side-by-side views (one per lens), which looks distorted and blurry on a normal display.

It is because of this that we came up with the idea of splitting up the simulation. At first we looked at ways to combine different camera types in the Unity engine: a dual-display setup in which one display (the Oculus Rift itself) showed the stereoscopic view while another (a standard display) showed a normal Unity camera. Our attempts to make this work did not go well and crashed the engine a few times.

Even though there may be a way to get this idea working, it would require a lot of work. We therefore decided to build two separate, modular simulations: one using the Oculus Rift, and another, more informative one running on a standard display of some description.

The Oculus Part:
The Oculus Rift part of the simulation would carry the most user interaction. This section deals with informing the user about Neptune and all the data we could gather from it.

The player would use the Rift to look at Neptune and investigate the planet as if they were the probe orbiting it. Currently the game designers are looking into ways to make the gathering and measuring of data interesting. This part of the simulation will need to be the most gamified.

(I have also been working on a LookAtPoint functionality which would allow feedback based on points in space (Vectors) being looked at. More on this in my next post.)
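As a teaser, the core of that LookAtPoint check is just vector maths: take the direction from the camera to the point and compare it with the camera's forward vector via a dot product. Here is a minimal sketch of the idea in Python (the real version is Unity C#; the function and parameter names here are mine, not from the project):

```python
import math

def looking_at(cam_pos, cam_forward, target, max_angle_deg=5.0):
    """Return True if the target point lies within max_angle_deg
    of the camera's forward direction."""
    # Direction from the camera to the target point
    to_target = [t - p for t, p in zip(target, cam_pos)]
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0:
        return True  # camera is exactly at the point
    to_target = [c / dist for c in to_target]
    # Normalise the forward vector too, just in case
    flen = math.sqrt(sum(c * c for c in cam_forward))
    fwd = [c / flen for c in cam_forward]
    # Angle between the two unit vectors via the dot product
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(fwd, to_target))))
    return math.degrees(math.acos(dot)) <= max_angle_deg

# Looking straight down +Z at a point on the Z axis
print(looking_at((0, 0, 0), (0, 0, 1), (0, 0, 10)))  # True
```

In Unity the forward vector would come from the Rift-tracked camera transform, and the feedback fires whenever this returns True for one of the registered points.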

The Presentation Part:
The next part of the simulation deals with engaging users who cannot or do not wish to use the Oculus Rift. It will also inform users of the journey a probe would have to make from Earth to Neptune. It will have some limited user interactivity, but for the most part it will be a presentation that can be watched on a standard display.

The UX designers are currently working on this part of the simulation's display.

The idea is that users waiting for the Oculus Rift simulation can watch the journey the probe would have to make. Having seen that journey, they would gain a greater appreciation for the Oculus part, in which the probe has already reached Neptune.

We are currently awaiting approval but we are also starting some generic work with this design in mind as we have a presentation deadline on the 25th.

Wednesday, 19 November 2014

Sprint II - Rough Outline

My team did very well on Sprint 1 - even AK (Akshayan), who had to play catch-up, delivered a design proposal! The brainstorming session went very well, and we all managed to agree on a design during the workshop on Tuesday.

We are currently working away at Sprint 2. This sprint is focused on meeting the presentation deadline on the 25th.

Sprint Breakdown

  • Sprint 1 - Brain Storming / Final Design Proposals
  • Sprint 2 - Start Design Work / Start Dev / Prepare Demo

I'll be updating the tracking spreadsheet as soon as I get home from work, but for now, so the whole team has some idea of where we stand, please see the sprint breakdown above and the outline below.

We have until Sunday to prepare - at which point we need to get all the information to Matias so he can start putting the presentation together. Some minor amendments are expected after this time, but most of the work for the presentation should be done before Sunday.

Sprint 2 Outline:

Preparing for the presentation and starting on design/development.

The main focus should be to get this presentation out of the way, but it'd be best if what we make for the presentation is something we can build on during design/development.

Overview from the coursework assignment brief:

Individual Feedback - 25th Deadline
"This time has been organized for you to receive constructive feedback on your project progress so far. You are strongly advised to take advantage of this feedback. They are one to one consultations, but held in the classroom. The presentation should take the form of e.g. PPT presentation, sketches, diagrams, mock-ups, wireframes and, possibly an early artefact, and you will receive feedback on the content for you to improve your submission. The feedback will be delivered on a per role basis as well as on the project as a whole."

Roles and tasks:

UX -

  • Start work on designing the presentation part of the simulation
  • Get data/deliverables to Matias by Sunday (23rd)

Games Design -

  • Start design of the Oculus Rift part of the simulation
  • Find ways to make measuring data fun
  • Design a logo for the team
  • Get data/deliverables to Matias by Sunday (23rd)

Artists -

  • Start creating graphics for the presentation (if possible ones we can reuse or build upon in the actual project)
  • Start looking into asset creation options - not for the 25th but long term; even if they are rough, it's best to make a start
  • Get data/deliverables to Matias by Sunday (23rd)

Programmers -

  • Prepare a rough demo of the presentation part of the simulation for the 25th
  • Reuse the planet-orbiting demo to show Neptune exploration concepts - perhaps updated with the designers' current designs/ideas
  • Start working on building the simulation - co-ordinate with each other
  • Get data/deliverables to Matias by Sunday (23rd)

Loose Ends:

  • Rob - Get in contact with Kingston College about the survey for the UX guys (Done - awaiting survey)
  • Rob - Pick up the Oculus Rift after rescheduling the pickup

  • Matias - Email Dr. Claus with the project proposal (Done)
  • Matias - Prepare the presentation slides once all information is received from the team. We can all help with this if required

  • Deon and/or Luke - Create a logo for our new team name. It doesn't have to be world-changing, so don't spend too much time on it :) (Done - a bunch of logo designs are now on Dropbox)


  • Potential team meeting on Sunday to get the presentation organized and ensure we are good to go.

I'll be posting more on the Design we decided on when I get home from work.

Tuesday, 18 November 2014

Agreed Upon Design Proposal

After our meeting yesterday the team decided on a proposal combining a few different ideas - many of us were thinking along the same lines. A brief summary of the agreed-upon design can be found below:


A simulation to attract A-level students, based on the mission-to-Neptune report by the Astronautics Department.

Project outline summary:

  • Week 1 - Brief
  • Week 2 - Organising / Meeting the team
  • Week 3 - Meeting with mentor / user requirements
  • Week 4 - Start research / Project report received
  • Week 5 (Current) - Brainstorming / Final Project Design outlined / Decide on team name / Start work on Logo
  • Week 6 - Deadline 1 - Project progress presentation / Start development / Continue design work
  • Week 7 - Continue Design / Continue Development (including testing) of design mechanics
  • Week 8 - Continue Design / Continue Development (including testing) of design mechanics
  • Week 9 - Continue Design / Continue Development (including testing) of design mechanics
  • Week 10 - Continue Design / Continue Development (including testing) of design mechanics / Structure report


Our aim is to build two modular simulations to best cover two key sections of the overall Neptune project.

Project Deliverable Structure:

A simulation broken into two parts.

Part 1: 'The Presentation'

This will cover the journey from Earth to Neptune and will be an informative presentation with some simple user interaction. The purpose of this is to inform the user of what it would take for the probe to journey from Earth to Neptune.

This section of the simulation will be modular (meaning independent of Part 2) and can run as a stand-alone application if desired. This will allow users who are not engaged with the Oculus Rift to still be drawn in and informed - perhaps while waiting for their turn on the Oculus, or to grab their attention in the first place.

Part 2: 'Neptune Interaction'

This will make use of the Oculus Rift and allow users to gather data from the planet Neptune in a fun and interesting way, making use of ideas gathered from our research as well as the "Payload" section of the Neptune project report. The design team is currently working on a design for implementing the data collection in an entertaining fashion. The user will see 'through the probe's eyes' and orbit Neptune, gathering various types of data - for example by taking pictures of different features on the planet or by taking instrument readings in a fun way.

This section of the simulation will be to inform the user about the planet Neptune and how the probe could potentially interact with it.

Monday, 17 November 2014

My Design Proposal

To help with brainstorming this week, we are all trying to come up with ideas for the design of our project. Mine are listed below:

Proposal I - Neptune Scanning Mini games:

The basic idea borrows from the XCOM and Mass Effect games (see screenshots below). The idea is to measure different details of the planet with the probe by making use of various tools. Each tool can be a minigame of sorts: the player needs to succeed in each minigame to reach a good score, while failing could lead to less accurate readings, fuzzy data, mission failure, etc.

Concepts from other games

XCOM Look and feel concept:

Mass Effect example – probing a planet for information:
The documentation makes note of using different tools in the Payload section (more on that below). Page 5 lists equipment that could be used on a theoretical probe to probe the atmosphere of Neptune, in the section titled Probe exploration of Neptune's atmosphere.

A probe package would contain a main probe with:
  • GCMS
  • Sensors for temperature, pressure and acceleration
  • Solar radiometers
  • IR radiometers
  • Nephelometer

And at least three more mini probes with:
  • Species-specific sniffers to sample different atmospheric regions
  • Temperature, pressure and acceleration sensors

We could use some artistic licence here to make the actual measuring fun as a collection of minigames. Each way of measuring an element of the planet would be a different minigame, and the more data gathered, the better the player's overview of the planet. As the player progresses with measuring, we could explain the different aspects of Neptune.

We have this information in section 1.5 Neptune Background (page 6). For example, the player uses a temperature, pressure or accelerometer sensor, and we can then unlock information about the temperature on Neptune or its atmosphere. Information about the different payloads can be gathered from the Payload section (pages 44-70). There seem to be quite a few of them, so there's a lot to work with here in terms of data collection minigames!
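To make the instrument-to-information idea concrete, here is a tiny sketch of how the unlock bookkeeping could work. The instrument names are loosely based on the report's equipment list; the topic strings and function names are entirely my own placeholders:

```python
# Hypothetical mapping from instrument minigames to the Neptune
# topics they unlock. Topic strings are placeholders, not report text.
UNLOCKS = {
    "temperature_sensor": "Neptune's atmospheric temperature profile",
    "pressure_sensor": "Pressure at different atmospheric depths",
    "nephelometer": "Cloud particle sizes and layers",
    "ir_radiometer": "Neptune's internal heat output",
}

completed = set()

def finish_minigame(instrument):
    """Mark an instrument's minigame as done and return the unlocked topic."""
    completed.add(instrument)
    return UNLOCKS.get(instrument, "No data for this instrument")

def overview_progress():
    """Fraction of the planet overview the player has revealed so far."""
    return len(completed & set(UNLOCKS)) / len(UNLOCKS)

print(finish_minigame("nephelometer"))
print(overview_progress())  # 0.25
```

The real version would live in the Unity project, but the shape is the same: finish a minigame, unlock a fact, and track overall completion for the player's overview of the planet.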

Some areas we could measure with these instruments are listed on page 59, for the HiRISE camera:

• Locate and characterize landing sites
• Cratering
• Volcanism
• Tectonism
• Hydrogeology
• Sedimentary processes
• Stratigraphy
• Aeolian processes
• Mass wasting
• Landscape evolution
• Seasonal processes
• Climate change
• Spectrophotometry [same role as a photopolarimeter]
• Glacier and periglacier processes
• Polar geology
• Regolith properties

Obviously there’s a lot more data than we need to get into, but these are examples of what we could work with. Since we aren’t all space experts, we can ‘gamify’ things like how to measure landscape evolution, for example.

I was playing around with this idea as I thought it would look cool on the Oculus Rift - see, for example, my basic rotation demo in my earlier post.
Proposal II - Propulsion or Orbiting:
This idea is a bit more vague, as I’ve not had much time to think on it, but I have seen how fun Kerbal Space Program is to play. I think we could do something along those lines using propulsion, pathing or orbiting. I saw some interesting concepts in a game jam once, where the player would fling a planet past bigger planets and see the effects of gravity.
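That gravity-fling concept boils down to a simple integration loop: each frame, sum the gravitational pull of the nearby bodies and step the probe's velocity and position. A toy Python sketch of the loop (units, names and values are mine, not from any project document):

```python
G = 1.0  # toy gravitational constant; the real value depends on chosen units

def step(pos, vel, bodies, dt):
    """Advance the probe one time step under the pull of 'bodies',
    each given as ((x, y), mass). Semi-implicit Euler integration."""
    ax = ay = 0.0
    for (bx, by), mass in bodies:
        dx, dy = bx - pos[0], by - pos[1]
        r2 = dx * dx + dy * dy
        r = r2 ** 0.5
        # Newtonian gravity: acceleration = G * m / r^2, toward the body
        a = G * mass / r2
        ax += a * dx / r
        ay += a * dy / r
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

# At radius 1 around a unit mass, circular orbital speed is sqrt(G*m/r) = 1,
# so the probe should stay roughly on a circle of radius 1.
pos, vel = (1.0, 0.0), (0.0, 1.0)
for _ in range(1000):
    pos, vel = step(pos, vel, [((0.0, 0.0), 1.0)], dt=0.001)
print((pos[0] ** 2 + pos[1] ** 2) ** 0.5)  # stays close to 1.0
```

In the game, the player's inputs would just change the initial velocity before the loop runs, and a misjudged input sends the probe off on a wildly different trajectory - which is exactly the fun part.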

With this in mind, I think it would be cool to give the player a third-person view of the probe, again using the Oculus, and have them adjust different inputs. Based on these inputs the probe would continue on its journey to Neptune. If the player misjudged an input, the probe could end up anywhere in space (well, anywhere in the game world).

There is a lot of information on path planning in the documentation, I think. It’s pretty maths-intensive, but I think we could find a fun balance. If this is too complex, perhaps we could reduce the scope - say, just the orbital path around Neptune, or the launch of the probe from Earth into space?

Example from Kerbal Space Program - a third-person view might work pretty well on the Oculus Rift, as the camera angles won’t appear as aggressive to the player: the object they are focused on is further away from them. This is just a theory, however.
