On the future of Web publishing in Unity

A few weeks ago at GDC, we announced support for WebGL publishing for Unity 5. Now I’d like to share some more information on what this is all about, and what you can expect from it.

Some background

WebGL is a 3D graphics API built into the browser which allows JavaScript programs to do 3D rendering without requiring any plug-ins. To us, this always seemed like a perfect fit for running Unity content on the web, as it would give end users the most barrier-free experience: the browser supplies everything needed out of the box, and everything just works without the user having to install any plug-ins.


However, we initially had some doubts about whether this would be technically achievable, as WebGL is a JavaScript API – which means that all our code (both the Unity runtime and your game code) needs to run as JavaScript somehow. But at the same time, we thought this technology was too cool not to try anyway, so we started experimenting with it at a HackWeek in Copenhagen two years ago. We had also been talking to Mozilla around that time, who were very eager to help us and to prove that this could indeed be done – so they had some engineers come over to Copenhagen to join the fun.

It took a few more HackWeeks of tinkering, and some developments on the browser side as well, before we reached the point where we realized we could make a real, viable product out of this – which is when we went into actual production.

To give you an idea of what is possible right now, here is a Unity player exported to WebGL with a current alpha version of Unity 5.

The currently supported browsers for this content are Firefox and Chrome 35 (Chrome 35 is currently in beta and is needed because the current Chrome 34 release has a JavaScript bug which causes this game to hang).

Click the icon below to play Dead Trigger 2 by Madfinger games in your browser, demonstrating an immersive fullscreen FPS experience in WebGL. Controls are WASD to walk, mouse to look, Q to switch weapons, Tab to switch to Melee combat, and 1, 2, and 3 for special powers (try them!).


And here is a build of our classic AngryBots demo (which runs fine on Firefox and the release version of Chrome):


Technical details

As mentioned above, to run in WebGL, all our code needs to be JavaScript. We use the emscripten compiler toolchain to cross-compile the Unity runtime code (written in C and C++) into asm.js JavaScript. asm.js is a highly optimizable subset of JavaScript which allows JavaScript engines to AOT-compile asm.js code into very performant native code (see here for a better explanation).


To convert the .NET game code (your C# and UnityScript scripts) into JavaScript, we developed a new technology in-house which we call IL2CPP. IL2CPP takes .NET bytecode and converts it to corresponding C++ source files, which we can then compile using any C++ compiler – such as emscripten, to get your scripts converted to JavaScript. Expect more information on IL2CPP soon.


WebGL in Unity 5.0

We plan to make WebGL support available in Unity 5.0 as an early-access add-on (before you ask: the terms and prices of this add-on have not been decided yet). Early access means that it will be capable of publishing content to WebGL (like the examples above), but it will have some limitations in features and browser compatibility. In particular, the following features will not be supported:

  • Runtime generation of Substance textures
  • MovieTextures
  • Networking other than the WWW class (a WebSockets plug-in is available)
  • Support for WebCam and Microphone access
  • Hardware cursor support
  • Most of the non-basic audio features
  • Script debugging
  • Threads
  • Any .NET features requiring dynamic code generation

In terms of browser support, this initial version will only support the desktop versions of Firefox and Chrome (other browsers might work for some content, but only these two will be officially supported).

We expect to resolve most of those limitations (except for things which are restrictions imposed by the platform) during the 5.x release cycle, and to support a wider range of browsers as the platform matures – at which point we will drop the early-access label and make WebGL a fully supported build platform in Unity.

The Unity Web Player in Unity 5

While WebGL is a very exciting new technology, the Unity Web Player is currently still the most feature-complete and most performant solution for targeting the web with Unity, and it will remain a supported platform in Unity 5.x. Dual-publishing your content to both WebGL and the Web Player can be a very useful strategy for getting the widest possible reach for your audience.

Longer term, however, we expect the performance and feature gap between the Web Player and WebGL to narrow considerably, and we expect browser vendors to make the Web Player obsolete by dropping support for plug-ins – at which point WebGL will become the prime solution for targeting the web with Unity.


Tutorial: Behave 2 for Unity


The Behave project, along with Path (now open source under the MIT license), was among the first projects I did in Unity after picking the engine up at the beginning of 2008. In an all too familiar story, I created the tools to replace those I had used at my previous job, but ended up focusing more on the tools than on my game project.

I first shared the project at Unite 08, after Tom Higgins and David Helgason cornered me in a bar and persuaded me to give an open mic talk the next day on what was, at the time, the only full-on middleware solution integrated in the Unity editor.

This was Behave 0.3b. 1.0 was released a couple of months later, and 1.2 went live in 2010 as one of the launch packages of the Asset Store.

While I was at Unity, my schedule was pretty packed, so 2.0 was under way for quite a while. Large refactors, support for multiple engines and platforms, plus feature creep did not help much either. But here we are now on 2.3.2 – fortunately, updates since the release of 2.0 in August 2013 did not take the time that 1.4→2.0 did.


So with the history lesson out of the way, what is Behave, really? In short, it is an AI behaviour logic system. It allows you to visually design behaviour trees, link them directly to your own code through a highly efficient compiler, and finally debug the trees running in the editor or on your target device.

One of the guiding principles behind Behave is to avoid systemic requirements as much as possible – that is, designs which might chain your runtime integration to a certain way of operating. The result is a very lean and efficient runtime, with the integration possibilities more or less limited only by your imagination and practical needs.

Behaviour trees

Behaviour trees, you say? Yes I do. A widely standardised method of describing behaviour logic, first used at scale in Halo, behaviour trees set themselves apart from methods like state machines in that they scale much better, are easy to read, and can be made responsive.

I am going to assume familiarity with state machines (as you might know them from Mecanim or plugins like Playmaker) and use them as a reference in describing behaviour trees – though I clearly cannot describe all there is to behaviour trees within the length of this article.

While state machines are in the business of selecting states within which actions are performed, behaviour trees build state implicitly from their structure and focus squarely on selecting actions to perform.

This means that while state machines allow you to set up states with any number of transitions (at scale often ending up in a hard to maintain spider-web of transitions), behaviour trees have a strict set of rules for connectivity and evaluation.

Them rules

A behaviour tree is basically an upside-down tree structure – evaluation starts from the root at the top, filters through a number of interconnected control nodes, and ends in leaf nodes: actions. Actions are where you interface game logic with behaviour logic, hooking up sensors and motors.

The responsiveness of behaviour trees stems from the fact that they are most often evaluated continuously, at some frame rate. Each evaluation starts at the top and, given the rules for the different control nodes, the flow is directed downward until an action node is hit.

Each node will then, rather than block on execution, return a signal back to its parent node. The parent then interprets, reacts and returns its own signal back up the chain until the root is reached again.

This signal can be one of three: Success, Failure or Running. Success and Failure obviously mean that the node succeeded or failed in its task, while Running means that the node has not yet concluded its task and requests to be re-pinged on the next tree evaluation.

Example actions could be HasTarget, which would return Success if the agent executing the tree has a target and otherwise Failure, or GoToTarget, which would return Running while on its way to the target, then Success when it is reached or Failure when it is determined to be unreachable.
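To make that concrete, here is a minimal C# sketch of what a pair of such action handlers could look like on an agent. The BehaveResult enum and the Tick-prefixed method names here are illustrative stand-ins; the exact names and signatures Behave expects depend on the version you are using.

    using UnityEngine;

    // Illustrative only: Behave's real handler signatures may differ.
    public enum BehaveResult { Success, Failure, Running }

    public class SoldierAgent : MonoBehaviour
    {
        public Transform target;
        public float stoppingDistance = 1.0f;
        public float speed = 3.0f;

        // Backs the HasTarget action in the tree.
        public BehaveResult TickHasTarget()
        {
            return target != null ? BehaveResult.Success : BehaveResult.Failure;
        }

        // Backs the GoToTarget action: Running while under way,
        // Success on arrival, Failure if there is nowhere to go.
        public BehaveResult TickGoToTarget()
        {
            if (target == null)
            {
                return BehaveResult.Failure;
            }
            Vector3 delta = target.position - transform.position;
            if (delta.magnitude <= stoppingDistance)
            {
                return BehaveResult.Success;
            }
            transform.position += delta.normalized * speed * Time.deltaTime;
            return BehaveResult.Running;
        }
    }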

Behave integration

So while the graphical editor lets you easily connect together these control nodes and actions, you of course need to hook this up to your AI agents at some point.

This is achieved via one-click compilation of your Behave library (the asset containing your trees), which for the Unity target generates a .NET assembly. As it is output in your Assets folder, Unity will automatically compile it in with the rest of your code.

What this means is that once you hit compile, you will be able to access generated classes from your code, representing your behaviour trees at runtime.

The central method of the generated library class “BL[AssetName]” is the InstantiateTree method. Its first parameter is the tree type you wish to instantiate (via an enum generated from the tree names in the editor) and its second is the agent you wish to integrate the tree with – the class which will need to implement the action handlers described earlier.


Out of the box, Behave offers two ways of implementing action handlers. The default is to implement the IAgent interface. In this case Behave will reflect your class for action handlers on instantiation, much like the Unity messaging system.

The second way of implementing action handlers is to define an agent blueprint in your library. At runtime, this results in an abstract agent class being defined, with predefined virtual handlers for all actions used by the trees supported by that blueprint. This method is less flexible, but removes the overhead of reflection on tree instantiation and gives you auto-complete on action handler methods in your favourite code editor.
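As a rough sketch of the difference, the blueprint route boils down to overriding predefined virtual handlers rather than having them discovered via reflection. The base class below stands in for what Behave would generate from a blueprint; the names are hypothetical, and BehaveResult is the same illustrative enum as in the earlier sketch.

    // Stand-in for a generated blueprint base class (hypothetical names).
    public abstract class SoldierBlueprintAgent
    {
        public virtual BehaveResult TickHasTarget() { return BehaveResult.Failure; }
        public virtual BehaveResult TickGoToTarget() { return BehaveResult.Failure; }
    }

    public class PatrollingSoldier : SoldierBlueprintAgent
    {
        // Your code editor can auto-complete this override, and no
        // reflection is needed when the tree is instantiated.
        public override BehaveResult TickGoToTarget()
        {
            // ...move toward the target here...
            return BehaveResult.Running;
        }
    }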

With handlers defined, you then simply call the Tick method on the tree instance at a frame rate, or in response to some game event, and the tree will in turn call one or more of your action handlers, depending on its design.

For core character behaviour logic, I usually create a coroutine named AIUpdate (or turn Start into a coroutine) containing a loop which ticks the tree and then yields WaitForSeconds for one divided by the frequency property of the tree. This property serves no purpose at runtime other than to communicate an intent from the designer to the programmer.
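Here is what that pattern can look like as a sketch. The Behave-specific calls are left as comments, because the generated BL[AssetName] class (BLAILib and its Soldier tree below are hypothetical names) only exists once you compile your own library.

    using System.Collections;
    using UnityEngine;

    public class SoldierBrain : MonoBehaviour
    {
        // object tree;  // the generated tree instance

        public float frequency = 10.0f;  // mirrors the tree's frequency property

        IEnumerator Start()
        {
            // tree = BLAILib.InstantiateTree(BLAILib.TreeType.Soldier, this);
            while (true)
            {
                // tree.Tick();
                yield return new WaitForSeconds(1.0f / frequency);
            }
        }
    }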

So as you can already see at this point, Behave does indeed follow the design goal of low complexity, leaving design decisions in the integration layer completely up to you.

The Behave runtime offers much more API and integration flexibility, but that is unfortunately a bit much to cover in this overview.


I hope you found this introduction useful and that you will consider using Behave for your next AI project. I would recommend checking out the following sources for more information on behaviour trees:

And of course, more information on Behave can be found at:

Have fun!

Emil “AngryAnt” Johansen

Testing by gaming, ACCU and Ukraine

For several weeks I’ve been preparing a blank playtesting solution that contains integration tests based on Unity Test Tools and stubs for game objects.

Earlier this month, I was lucky enough to have the opportunity to present it at a workshop held at one of the best programming conferences in the world – ACCU conference. Each year, ACCU attracts top speakers from the computing community including Andrei Alexandrescu, James Coplien, Tom Gilb, Robert Martin, Kevlin Henney, Andrew Koenig, Eric S. Raymond, Guido van Rossum, Greg Stein, Bjarne Stroustrup and Herb Sutter.

Workshop attendees got access to the project source files, which they could then work on in Unity. Scenes that contain tests are called “Level1”, “Level2” and so on. When you open a scene, the tests fail. The challenge is to start implementing functionality to make the tests pass, and as you do so, the game starts growing.

When all the tests pass, you can proceed to the next level, and the process itself is like a game. After completing each level you can open the scene called “Game” and try it out.
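To give a flavour of the exercise, here is a minimal test-first sketch in the same spirit, written against NUnit (which Unity Test Tools builds on for tests like this). The SpaceshipWeapon class and its API are made-up stand-ins, not code from the actual project.

    using NUnit.Framework;

    // The class under test starts out unimplemented, so the test below fails
    // until you fill in Fire() - the same loop the workshop levels put you in.
    public class SpaceshipWeapon
    {
        public int Ammo { get; private set; }

        public SpaceshipWeapon(int ammo) { Ammo = ammo; }

        public void Fire()
        {
            // Not implemented yet: firing should consume one unit of ammo.
        }
    }

    [TestFixture]
    public class Level1Tests
    {
        [Test]
        public void FiringConsumesAmmo()
        {
            var weapon = new SpaceshipWeapon(3);
            weapon.Fire();
            Assert.AreEqual(2, weapon.Ammo);
        }
    }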

If you’d like to play around with it, the Growing Games Guided by Tests project is available on GitHub. The game involves building an ultimate weapon of intergalactic destruction to fight back an invasion by green aliens. Have fun!

Solution packages are available for each level. If you get stuck, just navigate to the Solutions folder and open the package with the corresponding level name. Using these solutions you can navigate back and forth within the exercise. “Level 0” reverts the solution to its initial state.


My workshop gimmick is to trade chocolate coins for audience attention. If someone asks me a question or points to a mistake, I give them a chocolate coin in exchange. As it was a live coding session, I made both intentional and unintentional mistakes but the audience always noticed them.

They also asked lots of questions, even asking me to show how the tests were made and how to make one from scratch. That input will let me make my next workshop much better. By the end I was right out of chocolate coins. Thanks guys!

On the conference’s second day I volunteered to hold a lightning talk: “Public Speaking for Geeks.” I’ve been holding talks since 2011, and when I delivered my first conference address it didn’t go smoothly. Actually, it was a failure. But I’ve learned a lot since then and I wanted to inspire people to try public speaking, learn from their experience and try again.

As you might already know, Unity Technologies has an office in Odessa, Ukraine; a beautiful city on the Black Sea coast. The Odessa office is home to 11 engineers from 3 teams: SDET, STE and Toolsmiths, and it’s where I’m based.

Ten minutes before my lightning talk, I got a message from my friend Tom Gilb: “Forget public speaking. Tell them about Ukraine!” It came as a shock. I suddenly realized how much I wanted to tell the truth about Ukraine, to tell people what has happened and how it affects us.

In a strange way this helped keep me calm and meant that my Public Speaking for Geeks address went well. Already, I had another idea for a talk I really wanted to hold.

The feedback I received after my Geek talk was very positive, and a number of people approached me the following day and told me that, after hearing my talk, they had also submitted lightning talk proposals. And that gave me extra motivation to talk about Ukraine.

In the end, the act of explaining the situation in my homeland to my audience made my talk a very emotional occasion, not least because of the feedback and support I received from so many people. ACCU, I already miss you.


The Mouse and the Asset Store: The Story of Ghost of a Tale

During the Ghost of a Tale Indiegogo campaign, many were surprised to learn that this game wasn’t the work of a whole studio of developers. According to Lionel Gallat, the veteran animator behind this little gem, being a one person studio isn’t what it used to be. And that’s a good thing.

“I get a lot of indispensable help from a lot of people. There are so many things I can’t do. I needed a proper programmer helping me. Same with music.” The crowdfunded money allows him to pay freelancers and use the Unity Asset Store as much as he needs to.

“It’s indispensable, I find a lot of things there that are the building blocks of my game,” says Gallat. As an artist, he didn’t expect it to be so useful. “But soon enough, I found tools that I needed to build the game’s world. I don’t waste time trying to reinvent the wheel.”

When he worked as an animation director for such films as Despicable Me and The Lorax, big technical teams were available to adjust details at short notice. Often, he could just walk over to somebody’s desk and ask. As an indie developer, he doesn’t really have the same options.

Fortunately, the Unity community has been very helpful. “I send requests to developers of different assets, when I need some extra features. Most of the time, they’re really helpful and adjust the assets quickly, which is really great.”

However, in a few instances, communication with developers was an issue. “For example, whenever there is an important bug to be fixed and the publisher is not quite responsive about it.” Gallat would like publishers to consider how hard this is on small teams. “I’m mostly doing the thing on my own, so if they give up on support, there’s nothing I can do. Thankfully it doesn’t happen too often, and the vast majority of publishers I worked with do offer excellent support.”

The assets that Lionel Gallat, aka Seith, praises the most are those that help him to achieve high-end graphics.

Decal textures are textures that overlay others, giving a more detailed and realistic feel to the rocky terrain of the Ghost of a Tale world. The Decal System allowed Gallat to use them easily and without burdening the performance of the game. And it’s one of those assets with helpful and swift support. “It’s amazing. The support is top-notch, and this system allows me to slap beautiful decals anywhere I want at almost no cost. Oh, and it’s free – although I donated money because it’s so good,” he said.

Recently, all the game’s shaders have been converted to physically-based shaders using the Shader Forge tool. Why was this effort worth it? “It just looks so much better. Physically based shaders are more intuitive. Non-physical shaders have always been a bit of a cheat, and they tend to break in different lighting conditions,” explains Gallat.


But the main reason for the switch was that programming shaders took too much time. He needed an artist-friendly tool that would better suit his workflow.

“Shader Forge is great; the manipulation, the UI, everything feels very integrated into Unity,” he says. “The developer, Joachim Holmer, also assisted me with some customization. It was very fast – the features that I needed were already in the next beta.”

Ghost of a Tale is progressing at a brisk pace and Lionel Gallat hopes that his work will be even easier once he gets his hands on Unity 5, which he has already pre-ordered. “I’m really looking forward to all the graphics features, especially to real-time Global Illumination. Also, the Mecanim updates will come in handy.”

Whatever features are in the new version of Unity, however, the Ghost of a Tale developer plans to keep taking advantage of the Asset Store. “It’s always great to supplement skills that I don’t have. People specialize, develop very specific things that can be extremely useful”.

So when are we going to get our hands on the game? “We’re probably going to show the game around at conventions in Europe during the summer. I really can’t wait to show that the screenshots I post are in-game footage and not just concept art!”

On Hunting the Uncommon Elephant

First bug ever found. Taped into log for evidence.

Hunting bugs is fun.  And every now and then you get away alive with a story to bore your grandkids with (“In my days, we still hunted bugs with sticks and stones” and all).

GDC 2014 had another such trophy-worthy hunting safari in store for us. We were five days away from presenting Unity 5 to the world when we “spotted” (well, it was kind of hard to miss) an ugly little elephant of a bug: our shiny new 64-bit editor was randomly crashing on OSX to the point of being completely unusable. There’s just nothing like being up on stage showcasing how awesome your bug reporter is every couple of minutes.

So, Levi, Jonathan and I dropped all the awesome stuff we’re working on (more stories we want to bore our grandkids with) and went stalking.  All we knew at that point was that it crashed somewhere in the native code that Mono generates at run-time.

As every programmer knows, when you’re faced with a bug that isn’t obvious, you simply start by gathering evidence.  Once you’ve learned enough about the bug’s behavioral patterns, you’ll eventually get a shot at it.  And with the clock ticking, we were ready to shoot at pretty much anything.

But we were stumped.  For an elephant, the bug turned out to be surprisingly agile and sneaky.

It seemed to happen only on OSX 10.9, although Kim saw something that looked markedly similar on Windows with his heavy-duty memory debugger branch. And if you enabled Guard Malloc on earlier versions of OSX, you got something fairly similar as well. However, as it was crashing in random script code at arbitrary depths in the call hierarchy, it was difficult to say with certainty what was the same crash and what wasn’t. And the crash could be consistent for ten consecutive runs, only to be totally different for the next five.

So while Kim and I waded knee-high through memory and thigh-high through assembly code, Levi ran an extensive trace on all of Mono’s secret and not-so-secret activities, generating a gigabyte of logs and an editor that ran at the speed of my grandma. This yielded the first interesting insight: apparently we were always compiling the method we crashed in right before things got ugly.

But what made it crash?  The immediate cause was that we were trying to execute code from an invalid address.  How did we get there?  A bug in Mono’s signal handling where we don’t resume properly?  A bug in Mono’s JIT compiler that won’t jump back properly to the compiled code?  A different thread corrupting stack memory on the main thread?  Fairies and grumkins? (for a bit, the latter seemed the most likely).

After two days of hunting, the elephant was still alive and well, out and about.

So, on Saturday night I equipped myself with a notebook, four different colored pens and an ample supply of beer from our trademark Unity fridge (carefully making sure I didn’t touch the awful canned Christmas beer still stuck in its crevices). Then I spun up Unity instances until I had four different crashes frozen in the debugger, labeled them “Red Crash”, “Blue Crash”, “Green Crash” and “Black Crash”, and went to work with my respectively colored pens, taking notes and drawing some not-so-pretty diagrams of everything I found.

Here are my notes for Blue Crash:


And that’s when I made my first discovery: in every case, the stack was 16 bytes larger than it should be!

That then led to the next discovery: in all crashes, looking at those extra 16 bytes turned up a return address pointing back into the function we crashed in. From a trace it was clear that in all cases we had already executed some calls from the same method, and at first I thought the address was from the last call we had traced. However, closer inspection revealed that it was actually the return address for a call whose method had not been compiled yet!

This puzzled me for a moment as in some cases there were several calls in-between the last traced method and this call that hadn’t been compiled yet either.  Looking closer, however, revealed that we always had jumped around them.

So, then I looked at that function we apparently were supposed to return from…


And there we have it (highlighted in blue):  We were jumping in the wrong direction!

What Mono does here is create little “trampoline” functions that contain only a call to the JIT compiler and some data encoded into the instruction stream after the call (used by the JIT compiler to know which method to compile).  Once the JIT compiler has done its work, it will delete those trampolines and erase every trace of having hooked into the method call.

However, the call instruction you see there is what is called a “near call” which incidentally uses a signed 32-bit offset to jump relative to the next instruction.

And since a signed 32-bit number can reach only 2GB up and down, and we’re running 64-bit here, we suddenly knew why heap memory layout played such a crucial role in reproducing the bug: once Mono’s trampolines were more than 2GB away from the JIT compiler, offsets wouldn’t fit into 32 bits anymore and would get truncated when emitting the call instruction.
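A quick back-of-the-envelope sketch shows the failure mode (the addresses are made up and this is not Mono code, just the same arithmetic):

    using System;

    class NearCallTruncation
    {
        static void Main()
        {
            long nextInstruction = 0x7F0000001000;  // address right after the call
            long trampoline      = 0x7F00A0002000;  // now more than 2GB away

            long offset = trampoline - nextInstruction;
            int rel32 = unchecked((int)offset);  // what gets emitted into the call

            Console.WriteLine(offset);  //  2684358656: does not fit in a signed 32-bit int
            Console.WriteLine(rel32);   // -1610608640: truncated, the call jumps backwards
        }
    }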

At that point, Jonathan quickly pinpointed the right fix and by the time his Sunday was over, we had a stable working build ready in time for GDC.

You all know the history from there.  We successfully demoed Unity 5 at GDC 2014 to rave reviews and after launch, it quickly became the most beloved piece of software ever.  Oh wait, that bit is yet to come…

Before that launch, there are a whole lot more black and blue crashes to fix.

Extending the Unity Editor

The Unity editor is almost infinitely extendable. While we’re always working on improving the editor’s out-of-the-box functionality, anybody can make custom extensions.

There are thousands of editor extensions out there; some are only used by developers on specific projects, but a lot of them are accessible through the Asset Store. They bring Unity to new heights, enable less experienced programmers to reach higher, and save everybody some time.

It’s easy to get lost among the hundreds of assets in the Editor Extensions category, so we picked a few hidden treasures for your enjoyment. Check them out to make sure you’re not missing out on a good deal that can make your life easier!

JustLogic Visual Programming by Aqla Solutions

Do you have the whole logic of your game in your head? With this tool, you can bring it to life straight in the inspector, or even at runtime! Seriously, if you’re looking for an easy-to-use, inexpensive visual scripting tool, give it a try. The publisher has also made a series of tutorials to help you get going.

uDev GUI Framework by Graveck

Graveck, one of the first studios to bet on Unity, is sharing their internal GUI toolset with the whole community. The asset builds on their long-term relationship with the engine and therefore fits the editor like a snug glove. Smooooth!


Cruncher by RealWorldUnity.com

If you have Unity Pro, you’re probably serious about having properly designed and optimized models that don’t weigh down your game’s performance. Cruncher is a big help along the way, especially for mobile devs: it reduces the polygon count of your models right inside the editor.


Voxeland by Denis Pahunov

Step into the wonderful world of Voxeland! It’s basically a landscape creation tool that allows you to modify your landscape both in-editor and in-game, in real time. It comes with everything you need to start prototyping right away: land, cliff and grass textures, tree objects, shaders, scripts and even a simple sky. We spent wayyyy too much time just playing around with the demo!

Or make your own extensions!
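To show just how little it takes to get started, here is a minimal, self-contained sketch of a custom editor window. The window itself is just an example, but EditorWindow and MenuItem are the standard Unity extension points: drop a script like this into an Editor folder and a new menu item appears.

    using UnityEngine;
    using UnityEditor;

    public class NotesWindow : EditorWindow
    {
        string notes = "";

        [MenuItem("Window/Team Notes")]
        static void Open()
        {
            GetWindow<NotesWindow>("Team Notes");
        }

        void OnGUI()
        {
            GUILayout.Label("Notes for this scene:", EditorStyles.boldLabel);
            notes = EditorGUILayout.TextArea(notes, GUILayout.ExpandHeight(true));
        }
    }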

Welcome Playnomics to the Unity Family!

More exciting news for you today!

We’re incredibly proud of our engine and development tools. We’re so pleased with where they are and where they’re going that it would be easy to sit back and be happy with the impact our tools and business decisions have made on game development overall.

But game developers live and die not just by being able to create awesome games, but also by being able to connect with and keep a great audience.

We’re continuing to move toward providing all developers using Unity with the best tools to make their businesses a success after the process of creating an awesome game is complete. We announced the acquisition of Everyplay for this purpose right before GDC, and today we’re happy to tell you that we’ve reached an agreement to acquire Playnomics and bring their talented and experienced team on board. As part of Unity, they’ll further develop their awesome technology to be part of the suite of services for Unity Cloud.

Much like Everyplay (even if very differently), Playnomics works on solving the challenges that developers face in unique ways. Playnomics has developed tools that help identify the ways that players interact with games and help you as a developer to make real-time decisions on how to interact with the community to keep them having fun and coming back for more.

The integration of the Playnomics services into Unity Cloud is a work in progress, but once it’s ready, you’ll have another tool in our increasingly comprehensive toolbelt of services to help you be successful across the board.

In the meantime, make sure to check out the new SDK. It’s currently available on the Playnomics website and will also be heading to the Asset Store soon. Give it a look!

David Helgason


Turn your character into a player!

This blog post will run you through the steps to import, animate and control your character as a player in Unity. It is designed to be 3D-package agnostic, so whether you use Max, Maya, Blender or any of the many supported modelling programs, you should find what you need to turn your character model into a player in a Unity scene. The sample assets provide all the animations needed for controlling the character, but of course you can add your own. In this example I’ve created a Lola3000 character, inspired by Barbarella, Sorayama and Metropolis amongst other influences, and brought her to life, running through a tricky landscape of floating islands high above a sci-fi cityscape.

Follow the 12 Steps

We’ll begin with steps you can use to prepare and rig your character, before adding BlendShapes, verifying and then importing your rigged file into Unity. We’ll then set up some of the materials and shaders so that you can get your character looking its best. Next we will create an avatar to match your character rig and set it up for animation using a third person controller from the Unity sample assets. We’ll load in a custom animation and set up a BlendShape layer to further customise the character. Finally, we’ll add lights and FX to the environment and camera to finish the look.

1. Preparing your model

Unity is a real-time platform, so prepare your model to look good without breaking the bank polygon-wise. Name your materials and textures sensibly and use normal maps for extra detail. There are no polygon limits, but the more you use, the less you have to spend on the environment, FX and other characters; 5,000–25,000 can be a good range to aim for, depending on platform, so reduce polygons with the tools in your 3D package where necessary. Place your textures in a folder called Textures within your Unity project’s Assets folder and re-path them before you export.

2. Rigging your character

This stage will depend on your 3D package, skills and time available. Once your model is prepared in a T-pose, you can either create a bone hierarchy from scratch (assigning skin weights yourself), use your 3D package’s built-in tools to generate and skin to a skeleton, or use a fully automated solution like the Mixamo Autorigger. In Maya, for example, use HumanIK; 3ds Max has Biped/CAT along with the Skin modifier; and Blender provides Rigify – a few of the ways to create your skeleton hierarchy and assist with skinning. See “Mecanim: preparing your own character” in the documentation for more details.

3. Setting up BlendShapes

Unity supports BlendShapes (morph targets), so decide which part of the character requires morphing and set it up in your package appropriately, using BlendShapes in Maya, the Morpher in 3ds Max or Shape Keys in Blender, for example. This is often used for phoneme shapes when animating a talking face. It works by assigning morphed shapes with the same number of vertices (often a duplicate of the original) to a target, so that you can blend between versions to obtain different shapes without animating a complex bone hierarchy.

4. Verifying and exporting

This stage is important to minimize errors and troubleshooting when you set up your model later. Remove unused meshes and extraneous assets like lights or cameras from your scene, or simply use “export selected” if your 3D package allows it. Use the FBX file format if you can, for portability and simplicity – and if you have your own animation clips, be sure to tick the animation checkbox in the export dialogue. Re-importing your exported model into the 3D package is often a good way to verify your model before bringing it into Unity.

5. Importing your model

You can drag your FBX into the Project pane, or, if you exported straight into your project, your model will be picked up automatically. Select your model in the project browser and set up the options in the inspector panel. You should probably leave most of these at their defaults, but check the Scale Factor, as scale can vary hugely depending on the units used in your 3D package and your export settings. Click Apply and drag the model into the scene. You can create a (1m) cube to make sure the scale is correct and readjust as needed.

6. Setting up your materials in Unity

Select your character in the scene and observe the associated material(s) in the inspector; these should have been created in a Materials folder where your model was exported. Each material has a dropdown for its shader. Choose one appropriately, e.g. Bumped Specular, so that you can define a base colour (tint), a specular colour, and the texture maps: the diffuse (Base RGB) with gloss in the alpha channel, and a normal map to add surface detail. Reflective materials can also have a cubemap assigned for reflections, which you can render once in the Unity editor, or in real time (Pro) for dynamic reflections.

7. Creating an avatar

Once imported, your character model needs an avatar applied; this maps your skeleton to an avatar for use with any humanoid animation. Select the character model’s FBX file in your Project pane, select the Rig tab and choose Humanoid for Animation Type, then click Configure. If your rig is good to go, everything will show in green; otherwise, assign bones to the correct slots, or revisit your bone hierarchy and re-export to match the avatar more closely. You can test your skinning in the Muscles tab by dragging the sliders. Click Done when finished.

8. Adding a controller

The Unity sample assets provide all you need to control your player. From the Project window, drag the Third Person Character prefab from the Sample Assets > Characters and Vehicles > Third Person Controller > Prefabs folder into your scene. In your Hierarchy, delete the Ethan node underneath Third Person Character, then drag your character node on top of the Third Person Character node, which has all the scripts, parameters and the Player tag already assigned. From the Cameras > Prefabs folder, drag the Free Look Camera rig into the scene, add and position a Plane game object, and press Play!

9. Adding your animation

If you have imported or acquired animation from the store, you can replace the animations in the character’s animator. Select your character’s root node and open the Animator from the Window menu. This opens a pane that manages which state your character is in, and therefore which animation plays. Double-click the Grounded state to open the blend tree used while your character is on the ground. Select the blend tree, and in the inspector click the little circle next to an animation to choose another. Press Play to preview, then stop and make adjustments as necessary.

10. Adding BlendShapes and tweaking your character

Create an animation in your source package which blends between two or more meshes, as outlined in step 3. Re-export your mesh and include animated morphs in the FBX dialogue. In Unity, create a new layer in your Animator window, set its blending to Additive with a weight of 1, then drag in your clip from the Project window, create an empty state, and right-click > Make Transition to transition to and from the clip. Set a condition for each transition in the inspector, e.g. Forward greater than 0.5 for the transition to the clip, and Forward less than 0.5 for the transition back.

11. Adding environment, lights and settings

To immerse yourself in the game, you can use plane primitives to create a greybox test environment to play about in, use levels from the sample assets or the Asset Store, or of course import your own environment artwork. Any imported artwork needs Generate Colliders checked and applied in the inspector for the imported file in the Project view, so that you can walk on its surfaces. Create a Directional Light from the Create button at the top of the Hierarchy and adjust its parameters in the inspector.

12. Adding post FX and polish

Unity Pro includes a number of full-screen image effects that can help improve the look of your scene. Separate the Game view by dragging its tab out so you can preview. Select the Main Camera node under the Free Look Camera rig. In the inspector, click Add Component > Image Effects > Camera > Depth of Field, for example, to keep focus on your character while blurring the background, akin to a wide aperture. You can add as many others as your eyes and frame rate can handle – try vignette, bloom and ambient occlusion – so go ahead and play!
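As a small companion to steps 9 and 10, here is a sketch of driving the controller’s locomotion blending and a BlendShape from script. Animator.SetFloat and SkinnedMeshRenderer.SetBlendShapeWeight are standard Unity APIs, but the “Forward” parameter name and blend shape index 0 are assumptions that depend on your particular controller and mesh.

    using UnityEngine;

    public class CharacterTweaks : MonoBehaviour
    {
        Animator animator;
        SkinnedMeshRenderer face;

        void Start()
        {
            animator = GetComponent<Animator>();
            face = GetComponentInChildren<SkinnedMeshRenderer>();
        }

        void Update()
        {
            // Drive the locomotion blend tree (assumed parameter name).
            animator.SetFloat("Forward", Input.GetAxis("Vertical"));

            // Drive a BlendShape directly; weights run from 0 to 100.
            float weight = Mathf.PingPong(Time.time * 25.0f, 100.0f);
            face.SetBlendShapeWeight(0, weight);
        }
    }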

Here are some useful resources to continue with your project:

To follow the video tutorial, click Play.


Assets for Animation Awesomeness

We are so proud of Mecanim, Unity’s super flexible animation system. But we’re equally in awe of all the amazing work that the Asset Store publishers put in, so that you as a developer can just drag and drop to get your characters moving.

They capture all imaginable movements, from breakdancing, throwing enchantments or swinging a katana to sitting and drinking coffee. We picked a few assets that are really well-rated and popular among their users, but could use a little extra attention. Browse these hidden treasures to find new ways to get your game moving faster and cheaper!

Proportional Studios – Props Animations

This is a must-have if your game has any moving characters – which, let’s face it, it probably does. It has over 470 animations to date, and if what you need isn’t in there, the publisher will add it to the asset at no extra cost. Proportional Studios really are a bunch of proper pros – the reviews point out that their support goes out of its way to help you get the most out of this package.


Kubold – Movement Animset Pro

A set of high-quality motion capture animations, optimized for seamless third-person-perspective character movement. You need a good idea about scripting and Mecanim in general, but if you have the basics covered, this is the asset that will help you level up. Check out the demo.

Mister Necturus – Soldier Animation Collection

Get a full model of a soldier with 54 animation clips to get him moving and shooting exactly the way you want. Includes root motion data used by Mecanim.

Gear Worx Productions – FPS Character Animation Pack

This is a really comprehensive package of movements that covers all possible shooting/running/climbing combos. It has a sample character, which shows initial setup for weapon placement and joint hierarchy. Plus it’s super neatly organized. Altogether, it will save you a ridiculous amount of time that you can instead spend making your shooter into something special.

McAnimation – Survival Shooter Animation Pack

Surviving is hard. Making a survival shooter game is even harder. So why not make your game dev work, and your life in general, easier by getting a great package of animations along with a handy “Getting Started” doc and an overview of frame ranges?