Getting Started with ProBuilder – Tools and Tips

A personal wish of mine has always been to have mesh modification tools in Unity, so seeing ProBuilder become part of the engine is a delight – one I cannot stop smiling about.

The goal of this blog post is to help you overcome any barriers and start you on the right path to having fun with this toolset, using it efficiently and effectively so you don’t end up frustrated. I wanted to share some tools and tips around using ProBuilder for the first time because it is an excellent tool! Whether you are a newbie to 3D or an experienced modeller, this post will provide you with some great insights into getting started with ProBuilder. I should mention this is not a step-by-step tutorial; rather, I will be sharing helpful tips for getting started with the tools available to you.

ProBuilder:

Here are some great keyboard shortcuts that’ll get you building and modifying meshes in no time:

Ctrl + K – Spawns a Cube.
Ctrl + Shift + K – New Shape Tool.

Hold Shift while moving – Extrusion:

Extrusion.gif

Hold Shift while scaling – Inset:

Inset.gif

Alt + U – Insert Edge Loop:

Inset Edge Loop.gif
Alt + E – connects verts/edges:

Connect edges.gif
Alt + C – collapse verts (this will merge all selected verts into one vertex, centred at the average of all selected points):

Collapse verts.gif

Ctrl + Z – Undo – same as anything in Unity.

Escape key – reverts to object selection mode.

Use the H, J and K keys to choose between Verts, Edges and Faces selection modes.

Multi-select faces/edges/verts – Hold Shift and LMB.

Use backspace to delete verts/faces/edges – Delete will delete the object.

In general, tooltips can take 500ms or so to show; in ProBuilder they can appear instantly if you hold the Shift key while hovering the mouse cursor over the feature buttons in the toolbar.

Now, I’ll share some tips and gotchas which I found useful to know. But first, if you are new, remember to start simple: a common mistake is to push too much geometry too soon, which can become very complicated fast. Start by boxing out familiar shapes – it’s a lot easier to add geometry than to clean up complex geometry. Here we go:

When collapsing vertices, you might want to collapse them to the first vertex that you selected. To do this, click the + button next to the Collapse Vertices option on the toolbar and select Collapse To First in the Options pop-up (be aware that all future Collapse Vertices actions will now Collapse To First while this checkbox remains enabled):

Collapse to first

To select vertices that are hidden out of view on the opposite side of the geometry, you will need to ensure Select Hidden: On is enabled. This is On by default, so be aware of it when you go to select multiple vertices to modify them.

There are many circumstances where you will need to flip normals to invert the faces of your geometry. ProBuilder has a feature for this called Flip Face Normals, located in the toolbar (you will need to select some faces on your mesh first):

Flip Face Normals

Use Detach Faces to detach parts of a mesh into a new game object – useful for cleaning up meshes, especially if you have created some ngons. Things to be aware of when using this feature:

  • The pivot will usually be in an incorrect position
    • Use Centre Pivot selection from the toolbar
    • For custom pivot positioning – select a vert – use Set Pivot

When using the New Shape tool to build Stairs, there is an option to uncheck Build Sides. If you know those sides of the object won’t be visible, this is especially useful for:

  • Not having to go back and clean this up manually yourself
  • Optimal for baked lighting, ensuring you are not wasting texture space
    • Also helps reduce bake times

Uncheck Build Sides.gif

Preferences for ProBuilder – customise shortcut settings

If, like me, you like to see how many faces, triangles and verts a mesh has (especially when building on the fly), you will need to enable this in Edit -> Preferences -> ProBuilder -> Show Scene Info (not a great name).

Show Scene Info

ProGrids:

Used to make snapping easier, and excellent for controlling the unit size of your entire geometry – level design at its finest. Here are some quick tips:

  • Use +/- keys to scale the grid size – you can change this on the fly depending on your requirements for mesh modifying
  • Select all the verts – select the push to grid button to align current mesh with new grid size
  • Hold V to snap objects around

PolyBrush:

A great addition that lets you sculpt, smooth, texture blend, paint on textures, scatter objects, etc.

To sculpt you may need to add additional geometry to your mesh; you can do this using the Subdivide tool. The way it works is it grabs all edges and connects them to the centre – a straightforward implementation, nothing extravagant happening here.
To sculpt you will need to select the Push tool, which will push (sculpt) vertices on your mesh depending on your brush settings. Some tips for quickly editing brush settings while sculpting:

  • Hold ctrl to scroll brush up to a larger size
  • Hold shift to change the interior diameter of the brush
  • Hold shift + ctrl to adjust the strength (how much push will be applied to your mesh)

That’s it, I hope to write more of these posts over the coming weeks/months, please share your ProBuilder creations, I cannot wait to see what you create!

Unity’s New Video Playback Component:

Disclaimer: I am writing this post about the video player in its current state within the current beta version, Unity 5.6.0b9, which was released on February 17, 2017.

The long-awaited replacement for Unity’s now-legacy Movie Texture is here, and it comes in the form of a newly written Video Playback solution. This post will tell you all about it: how to use it, what features it contains and some general tips and tricks. If you have any questions which aren’t answered in this post, feel free to write in the comments section and I’ll get back to you asap.

The Video Player is a new component that you add to game objects for movie playback in your scene. The Video Player will use the native video hardware capabilities of both the editor and target platforms.

VideoPlayer Component.PNG

Playback Source:

The first thing you might notice from the screenshot above is that the Source is currently set to Video Clip; you can define the clip by dragging and dropping it onto the property. The Video Player can play movies that were imported with the new Video Clip importer, or it can read movies from Streaming Assets, local files or http sources using progressive streaming:

source-switch
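
If you want to set the source from a script instead, here’s a minimal sketch – the URL is a placeholder, and it assumes a VideoPlayer component is already on the game object:

using UnityEngine;
using UnityEngine.Video;

public class SourceSwitch : MonoBehaviour {

    public VideoClip videoClip; // assign an imported Video Clip in the inspector

    void Start ()
    {
        var vPlayer = GetComponent<VideoPlayer>();

        // Option A: play an imported Video Clip asset.
        vPlayer.source = VideoSource.VideoClip;
        vPlayer.clip = videoClip;

        // Option B: progressively stream from a local file or http source.
        // vPlayer.source = VideoSource.Url;
        // vPlayer.url = "http://example.com/movie.mp4"; // placeholder URL

        vPlayer.Play();
    }
}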

Render Mode:

The component also contains a Render Mode property, where there are currently five options. Perhaps the most exciting of these are the ability to render a video through a Material Override, as well as to a Render Texture.

rendermode

Material Override:

This option allows a texture parameter in the current object renderer’s material to be used for receiving the video. If we take a 360 video example as a use case, you would use Material Override to ensure the video is wrapped around a sphere’s mesh; I use a shader to invert the normals:

Shader "Custom/Mobile/Invert Normals"
{
	Properties
	{
		[NoScaleOffset]_MainTex ("Texture (RGB)", 2D) = "white" {}
	}
	SubShader
	{
		Tags { "RenderType"="Opaque" }
		Cull Front // cull the outward-facing sides so we render the inside of the sphere

		Pass
		{
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#include "UnityCG.cginc"

			struct appdata
			{
				float4 vertex : POSITION;
				float2 uv : TEXCOORD0;
				float3 normal : NORMAL;
			};

			struct v2f
			{
				float4 vertex : SV_POSITION;
				float2 uv : TEXCOORD0;
			};
			sampler2D _MainTex;

			v2f vert (appdata v)
			{
				v2f o;
				o.vertex = UnityObjectToClipPos(v.vertex);
				o.uv = float2(v.uv.x, v.uv.y);
				v.normal = v.normal * -1; // flip the normal; the visible inversion comes from Cull Front above
				return o;
			}

			fixed4 frag (v2f i) : SV_Target
			{
				return tex2D(_MainTex, i.uv);
			}
			ENDCG
		}
	}
}

360-video

If we were to use Unity’s Standard shader as a Material Override option, we then gain access to the available texture slots for that shader:

material-override-texture-slots

Just think about those possibilities for a second: you could animate a bump map with a movie, or write a custom shader that deforms a mesh using one of the texture inputs. My example below uses a custom Holographic shader to render the movie clip in an interesting way:

hologram-video-shader_1

Shaders are incredibly powerful and empower us to render things in interesting ways. It’s awesome that we can apply custom shaders to the Video Player – it opens up some interesting possibilities.

Video Codecs:

The Video Player component aims to use H.264 and VP8 with hardware decoding, which is supported on a wide variety of platforms. On the audio side we have AAC and Vorbis. If you need the alpha channel to be respected within a movie clip, you will need to utilise ProRes 4444, a lossy video compression format developed by Apple Inc.

Audio:

A slight complication comes with audio here, which I felt was worth mentioning. On OS X, which arguably has the most desirable workflow at present, you can choose the Direct Audio Output Mode, which plays back the audio already embedded within the movie clip. Direct mode is only supported on Apple platforms, so OS X and iOS are fine, but on Windows and other non-Apple platforms you need to revert to the Audio Source option, which allows you to define the audio you want to play back with the video. Since both the movie clip and audio clip are decoded from the same stream, we shouldn’t see any issues with synchronisation.

audio-output-mode
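
To wire this up from a script, here’s a minimal sketch, assuming the VideoPlayer and an AudioSource sit on the same game object:

using UnityEngine;
using UnityEngine.Video;

public class VideoAudioSetup : MonoBehaviour {

    void Start ()
    {
        var vPlayer = GetComponent<VideoPlayer>();
        var aSource = GetComponent<AudioSource>();

        // Route the video's audio through an AudioSource - the option to use
        // on non-Apple platforms where Direct output mode is unavailable.
        vPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        vPlayer.EnableAudioTrack(0, true);        // enable the first embedded audio track
        vPlayer.SetTargetAudioSource(0, aSource); // decode that track into the AudioSource
    }
}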

Scripting:

The Video Player class can be found within its own namespace, UnityEngine.Video. The documentation for this is extensive, but I did want to highlight a specific API which I think is a very useful feature: VideoPlayer.isPrepared. This property checks whether the player has successfully prepared the content to be played back; once the content is prepared, the player can start playing it back instantly. This is especially important when you are using Streaming Assets, local files or http sources. The short scripting example below demonstrates the prepareCompleted event to ensure the content is ready to play back from an HTTP source:

using UnityEngine;
using UnityEngine.Video;

public class HTTPVideoScript : MonoBehaviour {

    // Use this for initialization
    void Start ()
    {
        var vPlayer = gameObject.AddComponent<VideoPlayer>();
        vPlayer.url = "http://www.quirksmode.org/html5/videos/big_buck_bunny.mp4";
        vPlayer.renderMode = VideoRenderMode.CameraFarPlane;
        vPlayer.targetCamera = Camera.main; // far-plane render mode needs a camera to draw into
        vPlayer.targetCameraAlpha = 0.5f;
        vPlayer.prepareCompleted += Prepared; // fires once the content is ready to play
        vPlayer.Prepare();
    }

    void Prepared(VideoPlayer vPlayer) {
        Debug.Log("Video prepared, starting playback!");
        vPlayer.Play();
    }
}

That’s it, folks! I hope this proves useful in your adoption of Unity 5.6 and, importantly, the new Video Player component. I’ll leave this with you as thanks for scrolling all the way to the bottom 🙂

weirdness

What I’ve learnt demoing VR to first timers.

Those who have the privilege of owning a Virtual Reality (VR) headset have, I’m sure, shown VR to someone who hasn’t experienced it before. I like to tag these people as VR first timers (jokes) – I don’t actually like labelling things, but for the sake of this post I will use the term. Anyway, back to the subject at hand.

I own an HTC Vive and it’s currently set up in my spare bedroom at home. I feel very privileged to own this hardware and to actually have a PC that runs it, and I take advantage of this by allowing anyone to have a go on my VR rig.

HTCVive

Most of my family and friends have never heard of VR; they aren’t exactly geeky tech heads like myself. One of my friends thought it was some weird spy camera rig set up. Having described that it’s the next big thing called Virtual Reality, my friend still didn’t get it and just brushed it off; I even had to plead with him just to give it a go – I was begging him to try it to see… I had a similar experience with my girlfriend as well.

IMG_2778.JPG

I have demoed VR to a lot of people at various technology conferences; typically at a Unity booth we would have some form of VR demo set up, whether it’s an Oculus, HTC Vive or GearVR. The advantage at a technology conference is that I don’t have to beg people to try it.

What’s the difference between the three? Quite drastic, in my opinion, and this question is relevant to the topic of this post because you need to think about what type of VR experience you would like to show a VR first timer. I don’t want to delve into this question too much, but in simple terms: Oculus is a sitting or standing VR experience connected to a PC; GearVR is a mobile experience, a device with a slot to place your Samsung device into; and the HTC Vive (saving the best ’till last) is a room-scale VR headset connected to a PC (it can also be a standing or sitting experience).

IMG_1959.JPG

GearVR has probably been the most accessible VR experience on the market at present; given there are some hundreds of millions of VR-ready Samsung devices in the hands of consumers, Oculus and Samsung are in a very powerful position. I personally believe in using the highest quality experience for showing VR to newbies, so that leaves Oculus and HTC Vive. In my experience, demoing a mobile-quality VR demo to VR first timers hasn’t delivered the same experience as the other VR headsets – I would love to hear if someone has seen the opposite, so please comment on this post below.

Let’s start with Oculus. Oculus delivers a higher level of immersion; from my experience demoing to consumers, reactions to the Oculus have been along the lines of higher quality graphics and better immersion. This is partly due to the technology, which in turn allows developers to push the barriers of immersive experiences, considering Oculus runs off a high-end PC and not a mobile device (same for the HTC Vive). I’m not saying GearVR apps cannot create highly immersive experiences… immersion is not defined by graphics quality and/or technology, but it can help.

Moving on to the HTC Vive. In my opinion, and from my experience demoing to VR first timers and talking to professional developers, the HTC Vive delivers the best VR experience on the market, partly due to the room-scale features it offers. This is a complete game changer in a VR experience: being able to physically move around the virtual room or space you are experiencing is extremely powerful, and is a superb feature for us developers to experiment with in new game designs.

What’s really strange is that the ability to move around a virtual room seems to confuse consumers at first; almost every VR first timer who tried the HTC Vive didn’t think about walking around the room. It seems to come across as an unnatural thing to do, which is true – think about it: almost all core gaming experiences are static. Since the beginning of video games you have never had the opportunity to physically move around a virtual space to affect the gameplay; the Wii was really the first global success in using physical movement to affect gameplay (is there a term for that?), but you were limited, as you had to face towards the TV.

Content on the HTC Vive has introduced various input designs that allow you to traverse virtual environments in different ways; there have been a few talks on this, and it is a constant topic at present, with many developers experimenting with new ideas and also defining input standards within VR. Teleportation is one common technique, though it doesn’t really work for all experiences – the ability to teleport already defines the genre of your game; I mean, surely your game has some sci-fi based advancements, right? As teleportation is not a natural experience for a human, the power of teleportation can break the immersion you are having within the content. On the positive side, VR empowers you to experience what teleportation might feel like – in reality it’s a win-win situation for a consumer.

From a VR first timer’s perspective, the teleportation technique completely freaks people out at first. I tend to show VR first timers Valve’s demo called “Aperture Robot Repair”, which has a cool teleportation implementation, but it takes a few tries to get used to it. Especially amusing is when they use it for the first time and mess it up by teleporting up against a wall – the shock it strikes! Other implementations involve various techniques which I won’t go into in this post.

My brother tried a VR experience which placed the player up on a high beam; this totally freaked him out. Having a fear of heights didn’t help, but, to my amusement, seeing him try and balance himself on a flat surface (i.e. the bedroom floor) was rather good.

So, moving on from this, I would highly recommend showing VR first timers the HTC Vive; it really delivers the best experience VR can currently offer, and the room-scale capabilities enhance the experience tenfold.

Here’s the order of content I demo to VR first timers:

theBlu:

theBlu.jpg

A cinematic experience which introduces VR first timers to room-scale VR with no interaction required, limiting the senses and allowing them to adapt to the virtual environment and the power of room-scale VR. Everyone I have shown this to has been blown away by the experience, especially when the big whale swims by. Generally, people are super convinced at this point.

Aperture Robot Repair:

Adding an extra layer to the room-scale VR experience, with the ability to interact with objects in a virtual room, the human senses go a bit more crazy here; I usually receive a lot of nervous laughter and “whoa!!” outbursts. It introduces the teleportation input technique, which freaks people out at first. Generally, feedback has been very positive with this experience.

TiltBrush:

TiltBrush.jpg

An artistic VR experience, empowering the user to be creative with painting and drawing in VR; the user can move around and interact by painting strokes or objects into the virtual space. Usually at this point the majority of users have adapted to the room-scale VR experience and everything is starting to feel more natural. As far as I know, there’s no teleportation technique implemented.

After they are done with these three experiences, I offer a choice of content next; this means they can pick something that might interest them, or something that might stir a human emotion such as fear – as in horror or fear of heights. So here is my list, in no particular order:

  • Job Simulator
  • The Rose and I
  • Fantastic Contraption
  • Space Pirate Trainer
  • AudioShield
  • The Brookhaven Experiment
  • CloudLands: VR Minigolf
  • Skeet: VR Target Shooting
  • Selfie Tennis
  • The Lab

I acknowledge that content and people’s interests play a big part in personal VR experiences – and notice I said “personal”. From what I’ve seen, everyone’s VR experience is unique and personal; people react to different experiences differently, and it can affect them emotionally – excited, nervous, amused, scared, peaceful, etc. It would be great to hear what content you like to show VR first timers, and the reactions you’ve seen, so please comment below.

So what have I learnt exactly? Let’s summarise:

  • Demo the best technology on the market i.e. HTC Vive with room-scale VR
  • Ask the individual if they have any fears before letting them play
  • Always ensure the player doesn’t step on the headset cable
  • Start with the most basic / limited experience, for example theBlu, which is awesome
  • Begin introducing more complex VR experiences which involve interaction and movement
  • Then go wild if they feel comfortable continuing

Notice I haven’t covered nausea in this post; content on the HTC Vive and the technology itself shouldn’t cause it, though specific people may still experience it. I haven’t had one person experience nausea with the HTC Vive, but I’m sure someone out there has.

Thanks all!

WebGL – The Future Ain’t What It Used To Be!

webgl

This post was valid as of 1st October 2015

Web deployment with Unity has taken a hit in recent months due to the deprecation of NPAPI plugins in Chrome, which meant the WebPlayer plugin was disabled. All major browser vendors are moving away from plugins.

The current solution is to look at the WebGL build target, which arrived in a Preview state with Unity’s 5.0 release. We know there are some challenges with supporting WebGL right now, but let me detail what the future should look like.

Why Preview?

WebGL is currently in Preview; it’s Unity’s way of saying that some functionality and development needs to happen before they consider it a fully released product. Missing functionality from Unity’s side includes:

  • Substance realtime generation of procedural textures
  • Precomputed Realtime GI – I believe that actually requires a port of Enlighten
  • MovieTextures – but actually you can get a nicer video playback using: Simple MovieTextures for Unity WebGL
  • WebCam – coming in 5.3
  • Microphone

Development needed from the browser vendors include:

  • WebAssembly
  • Data compression
  • Shared array buffers
  • WebGL 2.0

How will these help for the Future?!

WebAssembly allows for taking asm.js and turning it into bytecode; in turn this makes things faster, as bytecode is quicker to parse than JavaScript as well as faster to execute. This will solve slow load times and slow downloads, also fixing up some memory usage issues.

Data Compression: slightly self-explanatory – it will allow for data to be kept in a compressed format in memory. Currently you need to handle this yourself, but not for much longer. This will greatly help with build sizes.

Shared Array Buffers is a feature that will allow memory to be shared across web workers. Using this, Unity can map their current multi-threaded code to JavaScript, so WebGL can benefit from multi-threaded features such as PhysX. No more colliding on the main thread!

WebGL 2.0 – WebGL is to become a graphics powerhouse; no really, it can. Currently WebGL uses OpenGL ES 2.0 – remember those good old smartphones?! With WebGL 2.0, Unity WebGL gets the much-needed bump up to OpenGL ES 3.0, allowing Unity to lift its restrictions on shaders, so we can have image effects (all the bloom you need), deferred rendering and skinning on the GPU. Unity’s 5.2 release included support for WebGL 2.0 as an experimental option; the reason is that no major browser vendor has shipped support for it as of the date of this post, but you can try it with a Firefox nightly build.

Some WebGL tips ‘n’ tricks:

Need to ship projects now with WebGL? Be sure to take note of which Unity features aren’t supported right now (see above) and also make use of these:

  • Crunched Texture Compression

You can use this feature as a texture format for JPEG-like compression ratio and quality; however, unlike JPEG, it decompresses directly into DXT, so you have compressed textures on the GPU with no extra GPU memory cost – happy days! This feature helps with memory reduction and keeping the data size down.

  • AssetBundles

A must-use for any Unity project really, but for WebGL it will greatly help with a reduction in memory and build size. Let’s reduce everything!
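
As a quick sketch of the pattern (the bundle URL and asset name below are placeholders), streaming a bundle on demand keeps that content out of the initial download and heap:

using System.Collections;
using UnityEngine;

public class BundleLoader : MonoBehaviour {

    IEnumerator Start ()
    {
        // The version number (1) lets the browser cache the bundle between sessions.
        using (WWW www = WWW.LoadFromCacheOrDownload("http://example.com/bundles/props.unity3d", 1))
        {
            yield return www;
            if (!string.IsNullOrEmpty(www.error)) { Debug.LogError(www.error); yield break; }

            AssetBundle bundle = www.assetBundle;
            GameObject prefab = bundle.LoadAsset<GameObject>("Barrel"); // placeholder asset name
            Instantiate(prefab);
            bundle.Unload(false); // free the bundle container, keep the loaded assets
        }
    }
}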

  • Unity Profiler

If you’re not making use of Unity’s Profiler then you’re doing it wrong. As with any Unity project, profile away – it’s fun and a great way of finding your bottlenecks. You should be optimising for WebGL as you would for any other platform in Unity; there are a lot of resources online for optimising.

  • WebGL Memory Size

Within Unity -> Player Settings -> Publishing Settings you can specify how much memory (in MB) the content should allocate for its heap. If it’s too low, you will get out-of-memory errors; if it’s too high, your content might fail to load in some browsers or on some machines, because the browser might not have enough available memory to allocate the requested heap size. Test and find the correct value for your app. Optimising your app as much as possible will help with memory reduction, and therefore the need for a high heap size decreases.

  • Multiplayer Gaming

Use UNET in 5.1+, as this will work out of the box since it’s using the WebSocket API – you can also write JavaScript code to use WebSockets directly yourself.

Helpful resources:

Benchmarking can help you test different areas of the Unity engine to see how it performs on the WebGL platform. Unity developed an app so benchmarking can be easily executed: http://beta.unity3d.com/jonas/WebGLBenchmark/

Watch a Unite Europe 2015 talk from Unity’s lead WebGL developer Jonas Echterhoff: https://www.youtube.com/watch?v=RufJDxm6Lq8

Unity WebGL Forums for reporting issues and learning from other developers: http://forum.unity3d.com/forums/webgl.84/

Update:

I recently did a talk about WebGL & il2cpp – you can download the slides here: WebGL & il2cpp

Experimenting with HDR Skyboxes in Unity 5.

Unity 5 has implemented a HUGE update on the rendering / graphics side of the engine, introducing new lighting workflows with realtime GI and a Physically Based Shader (Metallic & Specular workflows supported), among many other things.

I wanted to do an experiment today where I test out HDR skyboxes in Unity 5 to see how drastically the lighting and mood of a scene can change.

HDR:

High Dynamic Range (HDR) imaging is a photography technique used to produce a higher dynamic range of brightness. This is achieved by capturing different exposures of your chosen subject matter and combining them into one image.

In Unity we can use these HDR images to help blend 3D models into the environment; this can drastically add to the belief that the 3D model is actually in the environment.

Quick mention – I’m using HDR images from NoMotion HDRs (150 free HDR images); check the EULA before using them for commercial purposes.

I’ll be showing how these 3 HDR images help create a completely different feel and look to the scene:

I have daytime, evening and night-time HDRs; they should each create drastically different lighting conditions in Unity.

Time’s up chumps..let’s do this:

Unity 5’s new standard shader has built-in support for image based lighting (IBL), this really helps with the belief that the 3D model is in the environment I am setting with the HDR skybox.

Here’s the model with just Unity 5’s current default procedural skybox:

Default Skybox

With a bit of trickery (not really, just pushing a few buttons in Unity), let’s now see what happens when I add the daytime HDR Skybox:

Daytime Skybox

I haven’t messed with any lights at all; all I have is the default Directional Light in the scene with default settings. It looks rather impressive after just adding a new HDR skybox to the scene.

Let’s try the evening HDR Skybox now:

Evening Skybox

Awesome – I am really enjoying playing around with this. I love how quickly I can completely change the lighting conditions, which in turn drastically changes the mood and atmosphere of the scene. I can’t wait to see how games are going to utilize these new features.

Okay, last one, the night-time HDR Skybox, I’m a little skeptical about this one, I’ve no idea how this will look and I imagine it’s not really desirable to use a night-time HDR image. Anyway, let’s see what it looks like:

Nighttime Skybox

It actually turned out rather well. The image doesn’t show the white spots on the model as much as the editor does; these white spots are produced by the specular smoothness on the Standard Shader, i.e. the smoother I make it, the more white-spot artifacts are produced. I’m not sure why this is more present in the night-time scene; I’m sure there’s a setting I’ve missed or something.

Overall, I’m really impressed with this, especially how quickly I can change the mood and lighting of the scene and the visual output from Unity 5’s new rendering / graphics update.
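
As a bonus, if you’d like to flip between skyboxes at runtime rather than in the editor, here’s a minimal sketch – the material array is my own assumption, assigned in the inspector with the three HDR skybox materials:

using UnityEngine;

public class SkyboxCycler : MonoBehaviour {

    public Material[] skyboxes; // e.g. daytime, evening and night-time materials
    int current;

    void Update ()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            current = (current + 1) % skyboxes.Length;
            RenderSettings.skybox = skyboxes[current];
            DynamicGI.UpdateEnvironment(); // re-sample the ambient / IBL from the new skybox
        }
    }
}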

Look forward to seeing what you all produce with Unity 5. 🙂

R&D with new Unity 5 Graphics..

My job is based on supporting Unity’s customer base in the EMEA region; doing a good job means I need to learn all of the Unity things – features and new services – at least at a high level.

I tend to experiment a lot with Unity’s new features, and today I wanted to share some of my R&D with Unity 5’s new graphics overhaul, which includes the new Physically Based Shader, Realtime Global Illumination and the new Scene Render Settings.

My experiments are usually aimed at producing a small-scale demo that squeezes in as many features as possible; this enables me to demonstrate these features easily to customers while in the field.

Field

PBS (Physically Based Shader):

Unity’s new Physically Based Shader, a.k.a. one shader to rule them all, a.k.a. the Standard shader (actual name), allows us to create materials for a wide range of natural surfaces, such as metals and dielectrics (non-metals) – monolithic materials like rock, water, plastic, ceramic, wood and glass – as well as cloth and organic surfaces.

The new PBS also plays nicely with IBL (Image Based Lighting); we can set up a skybox cubemap in the new Scene Render Settings to help really blend our objects into the surrounding environments:

Scene Render Settings

One demo (not developed by me) shows a nice range of different surfaces used by the new standard shader in Unity 5:

We can see there are at least six different surfaces represented here with the usage of just one shader – Ceramic, Cloth, Metal, Glass, Rock and Wood. The Scene Render Settings really help blend the Doll model into the surrounding area, helping us believe that the Doll is in the Forest environment.

The new shader includes many different texture slots, allowing you to add really nice detail to models. Under the hood it is a multi-variant collection of smaller shaders, with versions for mobile and high-end.

Standard Shader PBS

Our built-in shader code is always available for download, and with Unity 5 this will include the new standard shader as well – that could change, but I doubt it.

Realtime Global Illumination:

Realtime GI is built on semi-precomputed realtime lighting, allowing you to dynamically light your scene – dynamically changing light sources, environment lighting and material properties such as diffuse reflectivity and surface emission.
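
As a small illustration of that last point, here’s a minimal sketch (field names are my own) that drives a material’s emission at runtime and feeds the new value to the realtime GI system; it assumes the renderer uses the Standard shader with emission enabled and is part of the realtime GI setup:

using UnityEngine;

public class EmissivePulse : MonoBehaviour {

    public Renderer target;                   // renderer whose material has emission enabled
    public Color emissionColor = Color.white;

    void Update ()
    {
        float intensity = Mathf.PingPong(Time.time, 1f);
        Color c = emissionColor * intensity;
        target.material.SetColor("_EmissionColor", c); // Standard shader emission property
        DynamicGI.SetEmissive(target, c);              // push the change into realtime GI
    }
}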

Geometry needs to be marked lightmap static, but you can relight non-static geometry using Light Probes that are updated in realtime with the GI generated from the static geometry. In my little demo I’ve combined the usage of Realtime GI, PBS, Reflection Probes and Light Probes, with the majority of the objects marked as static apart from a few props, which demonstrate the usage of Light Probes for non-static objects:

A couple of shout-outs: I’ve used the Medieval Assets Pack by Naresh and the Medieval Boat from Mr Necturus, both available on the Asset Store, for most of the props. The wooden cottage model centrepiece is a free model from some site I can’t remember.

Here’s a Vine I recently sent out; it’s a little outdated compared to the above screenshots, but it demonstrates the realtime GI in the editor with play mode on:

There’s more of this to learn as Unity 5 develops through the beta stage. Note: screenshots are from a beta version of Unity 5 and may look different when released.

Also worth sharing is what Alex Lovett is doing with the Unity 5 beta and Realtime GI: http://forum.unity3d.com/threads/unity-realtime-reflections-and-gi-and-realism-exploration.266258/ – Now if only my Realtime GI R&D looked like that 😀

Improve your efficiency with MonoDevelop

Everyone has their own workflow and choice of scripting IDE, but if you’re one of those using MonoDevelop with Unity, this blog post should help increase your efficiency when working with it.

I’m using a Mac, so the workflow improvements below are based on using MonoDevelop on a Mac.

Syntax highlighting is important for writing and reading code, and MonoDevelop makes it easy to change. Click on the MonoDevelop-Unity menu item and then click Preferences; the Preferences window will open with a menu on the left-hand side. Under Text Editor -> Syntax Highlighting, eight options appear for you to choose from. My preference is called Oblivion, but pick and test one for yourself which suits you best. You can also enable semantic highlighting, which helps with reading code. Use cmd and + / - to zoom in and out, so you can see the code you are writing.

Keyboard shortcuts:

Who doesn’t like a good keyboard shortcut? Yes, there are many to remember across the different software packages we use, but I think it’s important to have a few for each package that are most important to your workflow. So here are a few I feel are worth highlighting for an improved workflow:

Saving scripts:

  • cmd + S = Save Script – Unity picks up this save and recompiles to reflect the changes
  • cmd + shift + S = Save all scripts open, useful if you’re manipulating multiple scripts when coding public / public static functions etc

Scripting layout:

  • cmd + [ and cmd + ] = Indent code, allowing you to quickly indent lines of code without having to use the tab key
  • cmd + / = Quickly comment out some code or comment your comments
  • shift + arrow keys = Select lines of code or symbols real fast
  • cmd + shift + Y = Finds references to that code snippet and displays the info in a window below
  • alt / option + arrow keys = Move lines of code or the cursor through lines of code

Debugging:

  • cmd + return / enter = This starts debugging mode from the Run menu item, no need to go through menu items
  • cmd + \ = Toggles breakpoint, no need to go through the Run menu item
  • cmd + shift + I = Step Into, no need to go through the Run menu item
  • cmd + shift + O = Step Over
  • cmd + shift + U = Step Out

Other useful actions are “Attach to Process” and “Detach” when debugging Unity; these don’t have any default keyboard shortcuts, but if you go to the MonoDevelop-Unity menu item and reopen the Preferences window, you can add your own under “Key Bindings”. This also shows everything I just highlighted and more.

Hope this all helps and please share your workflow improvements!

Localization Support with Unity!

LanguageManager

New users to Unity tend to ask about integrated support for localization within the editor (built-in tools). Currently Unity does not have this, so users ask for solutions and what’s available to get it set up. I found a free package on the Asset Store titled Language Manager – a key-based system that makes it easy to integrate multiple-language support into your games and apps. Let’s take a deeper look into the package and highlight the ease of getting it set up:

Getting Started:

  1. Create an Asset Store account if you haven’t got a UDN account set up already
  2. Do a search for Language Manager or follow this link for the browser version: https://www.assetstore.unity3d.com/#/content/1018
  3. Download and import the asset package into a new project – best to test it out before importing it directly into your current professional project

Within your new Unity project, you will see a folder named “LanguageManager” in the project window. The folder contains the scripts and resources needed to add localisation support to your game or app, with support already included for 6 different languages. We shall take a quick look at the sample scene included.

Double-click the sample scene to open it, click the main camera in the hierarchy window, and notice the script component named TestScript.

The script contains:

  • A GUI Selection Grid, allowing the user to press a GUI button to switch languages
  • A public string which gives the option to select the default language in the inspector
  • A Switch statement containing support for all 6 languages (English, Spanish, French, Italian, Chinese and Russian)

Hit play and observe that the two GUI sentences at the top left of the game view are rendered in your chosen default language. The GUI buttons below allow the user to switch to a different language; clicking Russian will update the text to be displayed in Russian characters, as well as the GUI buttons. This example scene can be adapted into a language menu screen at the beginning of your game / app or in the main menu screen.

It’s also worth mentioning you can get the system / OS language from the API call Application.systemLanguage – this returns the user’s OS default language. It works on mobile devices as well as PC, Mac and other major platforms; some are not supported, so ensure you test that. Example code for this, in C# of course:

using UnityEngine;

public class OSLanguage : MonoBehaviour
{
    GUIText label; // requires a GUIText component on the same game object

    void Start()
    {
        label = GetComponent<GUIText>();
    }

    void Update()
    {
        label.text = Application.systemLanguage.ToString();
    }
}

Back to the LanguageManager package. There is a custom window included in Unity’s Window drop-down menu -> Language Editor; this allows the user to create new keys and create files for different languages (as many as you like, I imagine). The package contains support for an initialiser – LanguageManager.LoadLanguageFile(defaultLanguage); – and you use it by calling LanguageManager.GetText(“string key”);. That’s where you relate the keys you added in the Language Editor window; for example, the string key for the English language is “english”. 🙂 With just a few lines of code you can get basic support for multiple languages within your games / apps and apply them to GUI elements in your scene/s.
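
Putting those two calls together, here’s a minimal sketch – it assumes the package’s API as quoted above, and “play_button” is a hypothetical key you would create in the Language Editor window:

using UnityEngine;

public class LocalizedLabel : MonoBehaviour {

    public string defaultLanguage = "english";

    void Start ()
    {
        LanguageManager.LoadLanguageFile(defaultLanguage); // load the language file once
        GetComponent<GUIText>().text = LanguageManager.GetText("play_button"); // hypothetical key
    }
}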

Take a look at the package – it’s a free download after all – and also check out the many other packages on the Asset Store here.

Unity 5 Announced!

Unity 5 – http://unity3d.com/5

I’m getting straight to the point here – what is included in Unity 5:

Physically-based Shading:

There’s a new shader for setting up great-looking materials in a range of lighting environments – it’s one shader to rule them all, an uber-shader one might call it. You can use it for a range of different surfaces such as wood, metal, plastics, ceramics, cloth and many others.

Unity5_Teleporter

Realtime Global Illumination:

Built upon Geomerics’ Enlighten technology, Unity has integrated realtime physically-based Global Illumination cross-platform – it runs super nicely on mobile / tablet devices. You can animate lights, set up beautiful environment lighting and make use of emissive materials to create stunning effects and visuals. What’s really nice, as an added bonus, is that you no longer need to rebake any lightmaps – especially painful when bake times are long for larger scenes – as Global Illumination updates immediately upon making any changes, helping to dramatically improve iteration times.

WebGL:

The plugin-less browser technology is approaching fast, and Unity will offer the option to deploy to WebGL without the need for a plugin download to play back content. With a one-click deploy system, the build times are super fast and similar to what our WebPlayer plugin build system is like.

Audio Mixer:

New audio mixing technology enters Unity, with simple workflows for setting up different sounds within your 2D / 3D games. Set up realtime mixing graphs, edit and tweak in play mode, create and blend between snapshots, insert effects into the mixers, implement ducking of sounds and much more.

Unity Cloud:

This service offers the ability to integrate cross-promotion campaigns for acquiring players and helps with retaining them.

64-bit Editor:

The 64-bit editor brings massive improvements to Unity for handling demanding tasks that the 32-bit version might just crash on with out-of-memory errors. The runtime was ported a while back, but getting the editor ported with all its dependencies took time.

PhysX 3.3:

The much-requested update to PhysX has arrived. NVIDIA completely rewrote the system, bringing excellent performance boosts, which is great for mobile / tablet devices. A new wheel collider is available amongst other things, and more PhysX 3.3 features will be exposed later in the 5.x cycle.

There are many more features in Unity 5 than would fit on my blog, but here is a smaller yet equally juicy feature set:

  • AI: NavMesh supports LoadLevelAdditive.
  • NavMeshObstacle supports two basic shapes – cylinder and box for both carving and avoidance.
  • Editor: The editor is now a 64-bit application.
  • Graphics: Improved ambient lighting.
  • Cubemaps support texture compression
  • Improved LODGroup. A “fade mode” can be set on each level, and a value for how the current LOD should be blended/faded into the next LOD will be passed to the shader program in unity_Scale.z.
  • Non-uniformly scaled meshes no longer incur any memory cost or performance hit.
  • PluginInspector: new plugin system.
  • Scripting: Introduced option to auto-update obsolete Unity API usage in scripts / assemblies.
  • Version Control: Scene and Prefab Merging.
  • Asset Store: The asset store window is now many times faster, more responsive, and looks better.
  • Model importing: Updated FBX SDK to 2015.1
  • Windows Store Apps: You can now use joysticks in addition to Xbox 360 controllers

For a more visual look at Unity 5’s new feature set, take a look at the official Unity Feature Preview video:

With such an exciting announcement I can’t wait for all you guys to get your hands on this awesome toolset!

Thanks!

Unity has 2D tools..

Yes… the much-talked-about, highly requested 2D feature is finally here and available to you all for free. In preparation for the Unity 4.3 release, I developed a small 2D physics puzzle game, which enabled me to get up to speed with the toolset and also to use it for demo / presentation purposes on my travels.

This post will go through some of the new 2D tools and workflows, and shed a little light on how to use the different features available.

2D Defaults

The first new feature is the 2D defaults option in the project wizard. This sets up the Unity engine to use 2D defaults for things such as the texture importer: textures will now be imported as sprites automatically, instead of you needing to change the texture type in the asset importer.

Sprite_importer

There are other subtle touches with using 2D defaults, such as the main camera being set up as orthographic rather than perspective.

A new addition to the scene window is a little toggle button which, if clicked, toggles the scene between 2D and 3D. Using 2D defaults means the scene view will have 2D toggled by default without needing to click the button on launch, navigating the scene in the x and y axes only. Also note that creating a new project with 2D defaults enabled doesn’t mean you’re now bound to 2D – all 3D and 2D features are still available.

2D_scene

In the 2D viewport in the scene view, with the move tool and a sprite selected, we now have a new gizmo to play with. This makes it a lot easier to do actions such as moving, uniform / linear scaling and rotation, and it highlights the pivot placement. We do not need to switch between different tools – it’s all there in one place.

Sprite_gizmo

Box2D Physics Integration

Unity 4.3 integrates the Box2D physics engine, a free, open-source 2-dimensional physics engine that is considered a leading industry standard. We now have a list of new 2D physics components:

2D_physics_components

Three of the four colliders are self-explanatory, but the more interesting one to look at more closely is the Polygon Collider.

Add the component to your sprite asset in the same way you would add any component, and you will see a green highlighted collider outline; the green lines indicate it’s a collider, and it should roughly match the shape of your sprite.

If you add the Polygon Collider to an empty GameObject, a pentagon shape will be generated. If you expand the “collider info” in the component, you will see the value of 5 vertices – a pentagon is made of 5 points, therefore 5 vertices.

To add more vertices, hold down shift + left mouse button anywhere on the green collider line and then position your vertices. To delete vertices, use ctrl + left mouse click in the same fashion as adding them; you will see the green collider line turn red to indicate that delete is available:

Polygon_collider_info

The Sprite Editor

There’s a new window to edit sprites in. It’s called the Sprite Editor and can be opened in two ways: Window -> Sprite Editor, or click on the sprite asset in the project window, change the sprite mode to “Multiple” in the inspector and click the Sprite Editor button.

Sprite_editor

Note: You need to change the sprite mode to multiple to be able to slice it up in the Sprite Editor.

Let’s take a look at the Slice menu; this will give you options to slice up different elements of your sprite image automatically or manually.

A typical workflow is to slice up the image manually. You can do that by clicking and dragging on the image; you should see a blue rectangular gizmo appear with handles in the corners, allowing you to resize your slice area. Having positioned and resized your slice area, you can add another by repeating the same action, and you can add as many as you like. You will also notice a new Sprite window appear, giving you extra tools to play with: you can rename the area for that sprite slice and edit the size of the rectangle by coordinates. The Trim button will tightly pack the slice rectangle based on transparency, so it fits nicely along the edges of the sprite image.

In many cases you can probably use automatic slicing, where Unity will do the work for you and therefore save some time. Using the Automatic slicing option, the Sprite Editor will guess the boundaries of the sprite, again by transparency.

The Grid is another slicing option available – very useful for rectangular sprites – where the Pixel Size lets you define the height and width of the tiles in pixels.

After automatic slicing you can still edit the slices manually, also use the Trim button to tighten them up.

Lots of other cool stuff is included too, such as using the Animation window to easily animate sprites and using Mecanim for 2D blend trees, but I won’t cover that right now.

To finish up here’s some useful info:

Box2D Performance tips:

  • Try to avoid a lot of OnCollisionStay callbacks – these are expensive (see the sketch after this list).
  • Profiler is your friend, scroll down in the Profiler to Profile Physics2D and profile early in your development cycle.
  • Using the Polygon Collider can be expensive if lots of vertices are needed; the Polygon Collider decomposes the sprite into lots of shapes, which has a huge overhead.
  • Each shape can produce up to two contact points, so a sprite with 50-80 shapes could produce double that in contacts.
  • Keep an eye on the amount of contacts from dynamic bodies; while they don’t collide you can have hundreds, maybe thousands, moving around, but when they come into proximity performance starts to crumble
  • Circle Collider is your friend.
  • Consider all of the above especially when targeting low-end mobile hardware
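
On that first point, here’s a minimal sketch of a pattern that avoids per-frame OnCollisionStay work by tracking contact state with the cheaper Enter/Exit callbacks:

using UnityEngine;

public class ContactTracker2D : MonoBehaviour {

    int activeContacts; // how many colliders we are currently touching

    void OnCollisionEnter2D (Collision2D collision)
    {
        activeContacts++;
    }

    void OnCollisionExit2D (Collision2D collision)
    {
        activeContacts--;
    }

    void Update ()
    {
        if (activeContacts > 0)
        {
            // per-frame response to being in contact goes here
        }
    }
}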

That’s all folks!