What I’ve learnt demoing VR to first-timers.

Anyone who has the privilege of owning a Virtual Reality (VR) headset has, I’m sure, shown VR to someone who hasn’t experienced it before. I like to tag these people as VR first-timers (jokes); I don’t actually like labelling things, but for the sake of this post I’ll use the term. Anyway, back to the subject at hand.

I own an HTC Vive and it’s currently set up in my spare bedroom at home. I feel very privileged to own this hardware, and to actually have a PC that runs it. I take advantage of this by letting anyone have a go on my VR rig.

HTC Vive

Most of my family and friends have never heard of VR; they aren’t exactly geeky tech heads like myself. One of my friends thought it was some weird spy camera rig. Even after I described it as the next big thing called Virtual Reality, he still didn’t get it and just brushed it off. I had to plead with him just to give it a go; I was begging him to try it. I had a similar experience with my girlfriend as well.


I have demoed VR to a lot of people at various technology conferences. Typically, at a Unity booth we’d have some form of VR demo set up, whether it’s an Oculus Rift, HTC Vive or GearVR. The advantage at a technology conference is that I don’t have to beg people to try it.

What’s the difference between the three? Quite drastic, in my opinion, and the question is relevant to this post because you need to think about what type of VR experience you’d like to show a VR first-timer. I don’t want to delve into this too much, but in simple terms: the Oculus Rift is a seated or standing VR experience connected to a PC; the GearVR is a mobile experience, a device with a slot for your Samsung phone; and the HTC Vive (saving the best ’till last) is a room-scale VR headset connected to a PC (it can also be a standing or seated experience).


The GearVR has probably been the most accessible VR experience on the market at present; with hundreds of millions of VR-ready Samsung devices in the hands of consumers, Oculus and Samsung are in a very powerful position. I personally believe in using the highest-quality experience when showing VR to newbies, which leaves the Oculus Rift and the HTC Vive. In my experience, demoing mobile-quality VR to first-timers hasn’t delivered the same experience as the other headsets. I would love to hear if someone has seen the opposite, so please comment on this post below.

Let’s start with the Oculus. From my experience demoing to consumers, the Oculus delivers a higher level of immersion; reactions have been along the lines of higher-quality graphics and better immersion. This is partly due to the technology, which in turn allows developers to push the boundaries of immersive experiences, considering the Oculus runs off a high-end PC and not a mobile device (the same goes for the HTC Vive). I’m not saying GearVR apps cannot create highly immersive experiences; immersion is not defined by graphics quality or technology, but they certainly help.

Moving on to the HTC Vive. In my opinion, from my experience demoing to VR first-timers and talking to professional developers, the HTC Vive delivers the best VR experience on the market, partly due to the room-scale features it offers. Room-scale is a complete game changer: being able to physically move around the virtual room or space you are experiencing is extremely powerful, and it’s a superb feature for us developers to experiment with in new game designs.

What’s really strange is that the ability to move around a virtual room seems to confuse consumers at first. Almost every VR first-timer who tried the HTC Vive didn’t think to walk around the room; it seems to come across as an unnatural thing to do, which makes sense if you think about it. Almost all core gaming experiences are static: since the beginning of video games you have never had the opportunity to physically move around a virtual space to affect the gameplay. The Wii was really the first global success in using physical movement to affect gameplay (is there a term for that?), but you were limited as you had to face towards the TV.

Content on the HTC Vive has introduced various input designs that let you traverse virtual environments in different ways. There have been a few talks on this, and it’s a constant topic at present, with many developers experimenting with new ideas and also defining input standards within VR. Teleportation is one common technique, though it doesn’t really work for all experiences; the ability to teleport already defines the genre of your game. I mean, surely your game has some sci-fi-based advancements, right? As teleportation is not a natural experience for a human, its power can break the immersion of the content. On the positive side, VR empowers you to experience what teleportation might feel like; in reality it’s a win-win situation for a consumer.

From a VR first-timer’s perspective, the teleportation technique completely freaks people out at first. I tend to show first-timers Valve’s demo “Aperture Robot Repair”, which has a cool teleportation implementation, but it takes a few goes to get used to. Especially amusing is the first time they use it and mess up by teleporting up against a wall; the shock is something to behold. Other implementations involve various techniques which I won’t go into in this post.

My brother tried a VR experience which placed the player up on a high beam, and it totally freaked him out. Having a fear of heights didn’t help, but to my amusement, watching him try to balance himself on a flat surface (i.e. the bedroom floor) was rather good.

So, moving on from this, I would highly recommend showing VR first-timers the HTC Vive. It really delivers the best experience VR can currently offer, and the room-scale capabilities enhance the experience tenfold.

Here’s the order of content I demo to VR first-timers:

theBlu:


A cinematic experience which introduces VR first-timers to room-scale VR with no interaction required, limiting the senses and allowing them to adapt to the virtual environment and the power of room-scale VR. Everyone I have shown this to has been blown away, especially when a big whale swims by. Generally, people are super convinced at this point.

Aperture Robot Repair:

This adds an extra layer to the room-scale VR experience with the ability to interact with objects in a virtual room. The human senses are going a bit more crazy with this experience; I usually get a lot of nervous laughter and “whoa!!” outbursts. It introduces the teleportation input technique, which freaks people out at first. Generally, feedback has been very positive with this experience.

TiltBrush:


An artistic VR experience, empowering the user to be creative with painting and drawing in VR; the user can move around and interact by painting strokes or objects into a virtual space. Usually at this point the majority of users have adapted to the room-scale experience and everything is starting to feel more natural. As far as I know there’s no teleportation technique implemented.

After they are done with these three experiences, I offer a choice of content. This means they can pick something that might interest them, or something that might heighten an emotion such as fear, whether horror or a fear of heights. So here’s my list, in no particular order:

  • Job Simulator
  • The Rose and I
  • Fantastic Contraption
  • Space Pirate Trainer
  • AudioShield
  • The Brookhaven Experiment
  • CloudLands: VR Minigolf
  • Skeet: VR Target Shooting
  • Selfie Tennis
  • The Lab

I acknowledge that content and people’s interests play a big part in personal VR experiences, and notice I mentioned “personal”: from what I’ve seen, everyone’s VR experience is unique and personal. People react to different experiences differently, and it can affect them emotionally: excited, nervous, amused, scared, peaceful, etc. It would be great to hear what content you like to show VR first-timers, and the reactions you’ve seen from these experiences, so please comment below.

So what have I learnt exactly? Let’s summarise:

  • Demo the best technology on the market, i.e. the HTC Vive with room-scale VR
  • Ask the individual if they have any fears before letting them play
  • Always ensure the player doesn’t step on the headset cable
  • Start with the most basic / limited experience, for example theBlu, which is awesome
  • Begin introducing more complex VR experiences which involve interaction and movement
  • Then go wild if they feel comfortable continuing

Notice I haven’t covered nausea in this post. Content on the HTC Vive and the technology itself shouldn’t cause it, though specific people may still experience it; I haven’t had one person experience nausea with the HTC Vive, but I’m sure someone out there has.

Thanks all!

Experimenting with HDR Skyboxes in Unity 5.

Unity 5 has implemented a HUGE update on the rendering / graphics side of the engine, introducing new lighting workflows with realtime GI and a physically-based Standard Shader (Metallic & Specular workflows supported), among many other things.

I wanted to do an experiment today, testing out HDR skyboxes in Unity 5 to see how drastically the lighting and mood of a scene can change.

HDR:

High Dynamic Range is a photography technique used to produce a higher dynamic range of brightness. This is achieved by capturing different exposures of your chosen subject matter and combining them into one image.

In Unity we can use these HDR images to help blend 3D models into the environment, which can drastically add to the belief that the 3D model is actually in the environment.

Quick mention: I’m using HDR images from NoMotion HDRs (150 free HDR images); check the EULA before using them for commercial purposes.

I’ll be showing how these 3 HDR images each help create a completely different feel and look for the scene:

I have daytime, evening and night-time HDRs; they should all create drastically different lighting conditions in Unity.

Time’s up, chumps... let’s do this:

Unity 5’s new Standard Shader has built-in support for image-based lighting (IBL), which really helps with the belief that the 3D model is in the environment I’m setting with the HDR skybox.

Here’s the model with just Unity 5’s current default procedural skybox:

Default Skybox

With a bit of trickery (not really, just pushing a few buttons in Unity), let’s now see what happens when I add the daytime HDR Skybox:

Daytime Skybox

I haven’t messed with any lights at all; all I have is the default Directional Light in the scene with default settings. It looks rather impressive after just adding a new HDR skybox to the scene.
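If you’d rather swap skyboxes from a script than through the editor, here’s a minimal sketch of how that could look (the material fields and key bindings are my own assumptions, not part of the original setup):

```csharp
using UnityEngine;

// Minimal sketch: swap HDR skybox materials at runtime and refresh the
// environment lighting so the Standard Shader's IBL picks up the change.
// Assign your own skybox materials in the Inspector.
public class SkyboxSwitcher : MonoBehaviour
{
    public Material daytimeSkybox;
    public Material eveningSkybox;
    public Material nightSkybox;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Alpha1)) Apply(daytimeSkybox);
        if (Input.GetKeyDown(KeyCode.Alpha2)) Apply(eveningSkybox);
        if (Input.GetKeyDown(KeyCode.Alpha3)) Apply(nightSkybox);
    }

    void Apply(Material skybox)
    {
        RenderSettings.skybox = skybox;
        // Rebuild the ambient probe / environment lighting from the new skybox.
        DynamicGI.UpdateEnvironment();
    }
}
```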

Let’s try the evening HDR Skybox now:

Evening Skybox

Awesome. I’m really enjoying playing around with this; I love how quickly I can completely change the lighting conditions, which in turn drastically changes the mood and atmosphere of the scene. I can’t wait to see how games are going to utilize these new features.

Okay, last one: the night-time HDR skybox. I’m a little skeptical about this one; I’ve no idea how it will look, and I imagine a night-time HDR image isn’t really desirable to use. Anyway, let’s see what it looks like:

Nighttime Skybox

It actually turned out rather well. The image doesn’t show the white spots on the model as much as the editor does; these white spots are produced by the specular smoothness on the Standard Shader, i.e. the smoother I make it, the more white-spot artifacts are produced. I’m not sure why this is more noticeable in the night-time scene; I’m sure there’s a setting I’ve missed or something.

Overall I’m really impressed with this, especially how quickly I can change the mood and lighting of the scene, and with the visual output from Unity 5’s new rendering / graphics update.

Look forward to seeing what you all produce with Unity 5. 🙂

Improve your efficiency with MonoDevelop

Everyone has their own workflow and choice of scripting IDE, but if you’re one of those using MonoDevelop with Unity, this blog post should help you work more efficiently with it.

I’m using a Mac, so these workflow improvements are based on using MonoDevelop on a Mac.

Syntax highlighting is important for writing and reading code, and MonoDevelop makes it easy to change. Click the MonoDevelop-Unity menu item, then click Preferences; the Preferences window opens with a menu on the left-hand side. Under Text Editor -> Syntax Highlighting there are eight colour schemes to choose from. My preference is called Oblivion, but test a few and pick whichever suits you best. You can also enable semantic highlighting, which helps with reading code. Use cmd with + / - to zoom in and out, so you can see the code you are writing.

Keyboard shortcuts:

Who doesn’t like a good keyboard shortcut? Yes, there are many to remember across the different software packages we use, but I think it’s important to have a few for each package that matter most to your workflow. So here are a few I feel are worth highlighting for an improved workflow:

Saving scripts:

  • cmd + S = Save Script – Unity picks up this save and recompiles to reflect the changes
  • cmd + shift + S = Save all scripts open, useful if you’re manipulating multiple scripts when coding public / public static functions etc

Scripting layout:

  • cmd + [ and cmd + ] = Indent code, allowing you to quickly indent lines of code without having to use the tab key
  • cmd + / = Quickly comment out some code, or comment your comments
  • shift + arrow keys = Select lines of code or symbols real fast
  • cmd + shift + Y = Find references to that code snippet and display the info in a window below
  • alt / option + arrow keys = Move lines of code, or the cursor, through lines of code

Debugging:

  • cmd + return / enter = This starts debugging mode from the Run menu item, no need to go through menu items
  • cmd + \ = Toggles breakpoint, no need to go through the Run menu item
  • cmd + shift + I = Step Into, no need to go through the Run menu item
  • cmd + shift + O = Step Over
  • cmd + shift + U = Step Out

Other useful actions are “Attach to Process” and “Detach” when debugging Unity. These don’t have default keyboard shortcuts, but if you go to the MonoDevelop-Unity menu item and reopen the Preferences window, you can add your own under “Key Bindings”. That section also shows everything I just highlighted, and more.

Hope this all helps and please share your workflow improvements!

Unity has 2D tools..

Yes, the much talked about, highly requested 2D feature set is finally here and available to you all for free. In preparation for the Unity 4.3 release, I developed a small 2D physics puzzle game, which enabled me to get up to speed with the toolset and gave me something to use for demo / presentation purposes on my travels.

This post will go through some of the new 2D tools and workflows, and shed a little light on how to use the different features available.

2D Defaults

The first new feature is the 2D defaults option in the project wizard. This sets up the Unity engine to use 2D defaults for things such as the texture importer: textures will now be imported as sprites automatically, instead of needing to change the texture type in the asset importer.

Sprite importer
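For the curious, the effect of that default can be approximated with a small editor script; a minimal sketch (this is just the equivalent behaviour, not what Unity does internally):

```csharp
using UnityEditor;

// Minimal sketch: force newly imported textures to the Sprite type,
// approximating what the 2D defaults give you out of the box.
// Place this script in a folder named "Editor".
public class SpriteImportDefaults : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        TextureImporter importer = (TextureImporter)assetImporter;
        importer.textureType = TextureImporterType.Sprite;
    }
}
```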

There are other subtle touches when using 2D defaults, such as the main camera being set up as orthographic rather than perspective.

A new addition to the scene window is a little toggle button which switches the scene view between 2D and 3D. Using 2D defaults means the scene view has 2D toggled by default, without needing to click the button on launch, navigating the scene in the x and y axes only. Also note that creating a new project with 2D defaults enabled doesn’t mean you’re now bound to 2D; all 3D and 2D features are still available.

2D scene view

In the 2D viewport in the scene view, with the move tool and a sprite selected, we now have a new gizmo to play with. This makes actions such as moving, uniform / linear scaling and rotation a lot easier, and it highlights the pivot placement. We do not need to switch between different tools; it’s all there in one place.

Sprite gizmo

Box2D Physics Integration

Unity 4.3 integrates the Box2D physics engine, a free, open-source 2-dimensional physics engine considered a leading industry standard. With it we now have a list of new 2D physics components:

2D physics components

Three of the four colliders are self-explanatory, but the more interesting one to look at closely is the Polygon Collider.

Add the component to your sprite asset in the same way you would add any component and you will see a green highlighted collider outline; the green lines indicate it’s a collider, and it should roughly match the shape of your sprite.

If you add the Polygon Collider to an empty GameObject, a pentagon shape will be generated. If you expand the “Collider Info” section in the component you will see a value of 5 vertices; a pentagon is made of 5 points, therefore 5 vertices.

To add more vertices, hold shift + left mouse button anywhere on the green collider line and then position your vertex. To delete vertices, use ctrl + left mouse click in the same fashion as adding them; you will see the green collider line turn red to indicate that delete is available:

Polygon Collider Info
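The same component can of course be added from script; a minimal sketch below (the GameObject name is just an illustration), which also reads back the vertex count mentioned above:

```csharp
using UnityEngine;

// Minimal sketch: add a PolygonCollider2D from script. With no sprite
// outline to trace, Unity generates the default pentagon shape, so the
// points array holds 5 vertices, matching the Collider Info value above.
public class PolygonColliderExample : MonoBehaviour
{
    void Start()
    {
        GameObject go = new GameObject("EmptyWithPolygon");
        PolygonCollider2D polygon = go.AddComponent<PolygonCollider2D>();

        // Logs "Vertices: 5" for the default pentagon.
        Debug.Log("Vertices: " + polygon.points.Length);
    }
}
```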

The Sprite Editor

There’s a new window for editing sprites, called the Sprite Editor, and it can be opened in two ways: via Window -> Sprite Editor, or by clicking on the sprite asset in the project window, changing the sprite mode to “Multiple” in the inspector and clicking the Sprite Editor button.

Sprite Editor

Note: You need to change the sprite mode to multiple to be able to slice it up in the Sprite Editor.

Let’s take a look at the Slice menu; this gives you options for slicing up the different elements of your sprite image automatically or manually.

A typical workflow is to slice up the image manually. You can do that by clicking and dragging on the image; you should see a blue rectangular gizmo appear with handles in the corners, allowing you to resize your slice area. Having positioned and resized your slice area, you can add another by repeating the same action, and you can add as many as you like. You will also notice a new Sprite window appear, giving you extra tools to play with: you can rename the area for that sprite slice and edit the size of the rectangle by coordinates. The Trim button will tightly pack the slice rectangle based on transparency, so it fits nicely along the edges of the sprite image.

In many cases you can probably use automatic slicing; Unity will do the work for you and save some time. Using the Automatic slicing option, the Sprite Editor will guess the boundaries of each sprite, again by transparency.

Grid is another slicing option, very useful for rectangular sprites; the Pixel Size fields let you define the height and width of the tiles in pixels.

After automatic slicing you can still edit the slices manually, also use the Trim button to tighten them up.
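Once sliced, each rectangle becomes a sprite sub-asset of the texture, which you can also grab from script. A hedged sketch, assuming the texture lives in a Resources folder (the "characters" path is a placeholder):

```csharp
using UnityEngine;

// Minimal sketch: load every slice of a Multiple-mode sprite sheet.
// "characters" is a placeholder path to a texture in a Resources folder.
public class SliceLoader : MonoBehaviour
{
    void Start()
    {
        Sprite[] slices = Resources.LoadAll<Sprite>("characters");
        foreach (Sprite slice in slices)
            Debug.Log(slice.name + " " + slice.rect);
    }
}
```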

Lots of other cool stuff is included, such as using the Animation window to easily animate sprites and using Mecanim for 2D blend trees, but I won’t cover that right now.

To finish up, here’s some useful info:

Box2D Performance tips:

  • Try to avoid a lot of OnCollisionStay2D callbacks; they are expensive (see the sketch after this list).
  • The Profiler is your friend; scroll down in the Profiler to Profile Physics2D, and profile early in your development cycle.
  • Using the Polygon Collider can be expensive if lots of vertices are needed; the Polygon Collider decomposes the sprite into lots of shapes, which has a huge overhead.
  • Each shape can produce up to two contact points, so a sprite with 50-80 shapes could produce double that in contacts.
  • Keep an eye on the number of contacts from dynamic bodies; if they don’t collide you can have hundreds, maybe thousands, moving around, but when they come into proximity performance starts to crumble.
  • Circle Collider is your friend.
  • Consider all of the above, especially when targeting low-end mobile hardware.
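To illustrate the first tip, here’s a hedged sketch of tracking contacts with the enter / exit pair instead of a Stay callback, which fires every physics step for every touching body:

```csharp
using UnityEngine;

// Minimal sketch: track contacts via OnCollisionEnter2D / OnCollisionExit2D
// instead of OnCollisionStay2D, which runs every physics step while
// bodies remain touching and adds up fast.
public class ContactTracker : MonoBehaviour
{
    int activeContacts;

    void OnCollisionEnter2D(Collision2D collision)
    {
        activeContacts++;
    }

    void OnCollisionExit2D(Collision2D collision)
    {
        activeContacts--;
    }

    // A cheap check that doesn't need a per-step Stay callback.
    public bool IsTouchingAnything()
    {
        return activeContacts > 0;
    }
}
```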

That’s all folks!

Nordic Game 2013

Unity Nordic

I had the pleasure of attending the Nordic Game Conference 2013 for the first time, and it was a great event. This is my slightly overdue post, but I wanted to share my experience with you all.

I arrived on the final day of Unite Nordic to enjoy the Indie Party, sampling some nice local beers and catching up with people I hadn’t seen for a while. At the party there were a few indie games on display for us all to admire and even play, such as Badland, a really slick local multiplayer game: each player uses a corner of the display screen to control a character racing through a level, avoiding obstacles and trying not to be captured by the following screen. A fun survival game.

Badland

The other title that jumped out, due to its visuals and sound design, was The Plan. It was just an experimental game, but it definitely had impressive attributes, and it has potential if they were to take it further.

The Plan

Day 1 of the conference was really busy at the Unity booth. I met some great developers from all backgrounds: some already using Unity and looking for technical assistance, others looking to move to Unity or just to check it out. Day 1 seemed to consist mostly of game developers, but on Day 2 I met developers from other backgrounds, one wanting to use Unity for architectural visualisation on the Oculus Rift (exciting stuff), and others building various mobile apps.

I really liked that although the Nordic Game Conference is busy, it seems to have a chilled-out, very laid-back atmosphere, which is a great environment to work in. I look forward to attending future Nordic Game Conferences.

GDC 2013 – San Francisco

GDC - San Francisco

Attending my first GDC was a career highlight to speak of; it was a great show packed full of great companies and people.

Over in Unity land, we had a huge booth which hit you when entering the expo arena, sitting between the giants Sony and Nintendo. We had lines of demo machines showing off some great Unity projects and games, benches and screens for our presentation area, and a reception desk to ensure we could direct you to the right people. Oh, and we had really nice comfy carpet to help protect our feet.

A particular highlight of the week for me was seeing a group of indies begin game jamming on one of our demo machines. The project was some crazy 3D sheep physics-based game where, as the player, you controlled the sheep by jumping and tumbling around to pick up objects; the movement of the sheep made the game quite amusing.

Indie Game Jamming - GDC

The game shortly became a talking point between some of us, intrigued to see how it would progress over the 2 / 3 days they had, adding more graphics and control tweaks as they went. I never found out what happened to the project; hopefully they copied it off the machine and will continue to do something even more crazy with it.

One of the other main highlights for me was meeting and talking to users of Unity: people asking me for help with their projects as well as technical questions, some experiencing bugs with their projects, and the majority of others asking about our new features or how to use certain aspects of them. I learnt a lot from speaking to you guys; I got to see how you use our product and features, as well as hearing the sorts of questions asked.

Another cool thing was one of our employees showing the WiiU support and how easy it is to set up the additional camera for both displays.

Unity + WiiU

Lots of other cool things were happening, but until next time, catch you later folks.

Modelling a Sports Stadium

Rugby Skills Challenge 2013

I’ve had a great month and a half or so developing a 3D model and beta testing for a recently released iOS title by game developer Russ Morris.

We got talking about his rugby game at the Unity UK Christmas party, and Russ mentioned needing a stadium model for it. Naturally I said “hey, I’ll give it a go”, after a lengthy conversation which I only actually remembered the day after. I nagged him for a bit to send over the details; thankfully Russ obliged, and the creative development began.

So I wanted to blog about the development of the stadium model and some of the techniques I used, trying to keep it as low-poly as possible for mobile development.

Starting point:

My starting point was the centrepiece, the field of play, so I needed to research the typical dimensions of a rugby pitch. I modelled the basic rectangular shape, then expanded out from there; having the focal point in place gave me a good start in terms of thinking about and planning the architecture of the model. I used reference images taken from Cardiff’s Millennium Stadium and the Aviva Stadium in Ireland; the important thing was to ensure the shape and architecture of the model was directed at the main focal point (the field).

Pitch

The design has curved corners in the tiers, so the stadium is a full oval, curved shape like most modern-day stadiums. From here I started building up the second tier and adding some basic roof structure, just to get an idea and feel for the model.

Adding details:

I quickly moved on to adding details to the tiers: steps and entrance points for the aisles. The challenging area was creating the corner tiers; I used the technique of extruding and rotating each time by hand, but I could have used the bridge tool, added the correct number of segments, then repositioned each set of polys. Either technique would have worked fine.

Adding Detail

I finished the lower tier, fully completed with steps and aisle details added. The good thing is that for the top tier I could duplicate the lower tier, adjust the positions of some polys and scale up where needed; no need to fully model the top tier from scratch. Here are both tiers fully completed:

Fully completed stands.

I needed to close up the gaps and model some outer geometry, so I modelled a bunch of cubes and joined the verts together at each adjacent point. It was just about getting the right positions and joining the correct verts together.

Creating the roof:

The roof structure design is based on the Millennium Stadium in Cardiff: the corners needed to have gaps, and the support structures needed to be added as well.

The Millennium Stadium
Adding roof structure
Support structure

Combining all this together gives me something nearly complete; the rest of the tweaks, such as texturing and advertising boards, were added by Russ in the game project. Here’s the final model in Modo:

Rugby Stadium

Download and install the game now: search for Rugby Skills Challenge 2013 on the App Store, or click the image below to be redirected there:

Thanks!