Unity Awards 2014 Open Submissions Begin

It’s that time of year again where we open up submissions for the Unity Awards! Submissions will be open from now until June 30, 2014.

If you’ve created something awesome with Unity in the past year, whether it’s a game or some other interactive experience, we’d love to hear about it. All you have to do is head to the submission portal and click the link at the bottom that will start the process.

Submit your project for the Unity Awards now!

For those unfamiliar, the Unity Awards are held each year during the Unite conference to recognize the most impressive creations made using Unity. This year, the conference is taking place on August 20-22 in Seattle and the Awards ceremony itself will take place on August 21 at McCaw Hall. Read more about the conference and grab tickets at our Unite site.

This year, we’re changing the voting process slightly. While the nomination committee here at Unity still looks through the hundreds of projects submitted and narrows them down to six finalists in each category, we’re going to open up voting to the community for all categories. Community votes will account for 50% of the total vote, with Unity employees accounting for the other 50%. This will be the same for all categories except Community Choice, for which the community accounts for 100% of the vote. General voting will begin in July 2014.

The categories this year include:

Best 3D Visual Experience – Submissions for this category will be judged based on artistic merit including thematic and stylistic cohesion, creativity, and/or technical skill.

Best 2D Visual Experience – Submissions for this category will be judged based on artistic merit including thematic and stylistic cohesion, creativity, and/or technical skill.

Best Gameplay – Intuitive control, innovation, creativity, complexity, and fun are what make games enjoyable and entertaining–we’re looking for games that excel in one or all of these areas.

Best VizSim Project – Unity projects come in all shapes and sizes; this year we’re looking for projects that have real-world applications for visualization, simulation, and training.

Best Non-game Project – Unity-authored products that fall outside of games or VizSim, including art, advertising, interactive books and comics, digital toys, interactive physical installations, and informational programs, will want to submit for this award.

Best Student Project – This award is for projects (games or otherwise) currently being completed by students as part of the curriculum of an educational institution. Projects will be judged based on creativity, technical merit, and overall artistic cohesion among graphics, sound, and presentation.

Technical Achievement – Any project that provides an outstanding example of technical excellence in Unity, including but not limited to graphics, scripting, UI, and/or sound.

Community Choice – This category will be voted on by the community of game developers and represents the favorites of the community across the board.

Golden Cube (best overall) – This award is for the best overall project made with Unity in the last year. Everything from technical achievement and visual styling to sound production and level of fun will be taken into account to choose an overall winner.


Of course, there are some rules for submission that you’ll need to know, so here they are:

  • Only Unity-authored projects are eligible for nomination.
  • Projects must have been released from July 1, 2013 to June 30, 2014 to be eligible with the exception of student project submissions which must have been part of the coursework in the 2013-2014 school year.
  • Any projects nominated for previous years of the Unity Awards are ineligible for the 2014 Unity Awards with the exception of projects that were previously student work and have since turned into finished commercial projects.
  • Games currently in early access programs that are not considered “final” products by June 30, 2014 will not be accepted to the 2014 Unity Awards.
  • Individuals or teams are welcome to enter multiple projects so long as they adhere to all other rules.

So submit those projects, tell your friends who released games this past year to submit their projects, and keep your eyes out in July for another announcement that community voting has begun. We’re really looking forward to seeing all of your submissions!

Announcing UNET – New Unity Multiplayer Technology

A few weeks ago, at our Unite Asia conferences, we announced that we are developing new multiplayer tools, technologies and services for Unity developers. The internal project name for this is UNET, which simply stands for Unity Networking. But our vision goes well beyond simple networking. As you all know, the Unity vision is to Democratize Game Development. The Unity Networking team specifically wants to Democratize Multiplayer Game Development. We want every developer to be able to build multiplayer functionality into any type of game, with any number of players.

Before joining Unity, members of the networking team worked mainly on MMOs such as Ultima Online, Lord of the Rings Online, Dungeons and Dragons Online, Marvel Heroes, Need for Speed Online and World of Warcraft. We have a lot of passion for and a ton of experience with making multiplayer games, technology and infrastructure. The Unity vision was known to each of us and was always very appealing. When the chance came up to do something truly great, like bringing that vision to multiplayer, it was impossible to decline. So we all left our former jobs and joined Unity to make this vision happen. Right now, we’re working hard to deliver these tools, technologies and services so anyone can make their dream of a multiplayer game a reality.

This is of course a pretty big undertaking, but, like I said, we have all done this before, and we are all very driven to do it again (because it’s really, really cool!). The way we have tackled this is to divide our overall goal into phases which should be familiar to Unity developers. We take the approach of releasing a Phase 1, getting feedback from our users, adding that feedback to our work to make the next phase even better and repeating that cycle.

For UNET, Phase 1 is what we call the Multiplayer Foundation – more on that in a bit. Phase 2 is where we build on Phase 1 to introduce server-authoritative gaming with what we call the Simulation Server; we’ll blog about this later. Finally, Phase 3 is where we want to introduce the ability to coordinate multiple Simulation Servers through a Master Simulation Server. As usual, exact dates are not possible and of course things can change, especially after gathering feedback from our users. But we can say that Phase 1 will be part of the 5.x release cycle and Phase 2 is in R&D right now.

So what do we mean by the Multiplayer Foundation for Phase 1? The main features are as follows:

  • High performance transport layer based on UDP to support all game types

  • Low Level API (LLAPI) provides complete control through a socket like interface

  • High Level API (HLAPI) provides simple and secure client/server network model

  • Matchmaker Service provides basic functionality for creating rooms and helping players find others to play with

  • Relay Server solves connectivity problems for players trying to connect to each other behind firewalls

We had some inherent limitations in our legacy system that we needed to address, and with our greater goal in mind it became clear that we needed to start from scratch. Since our goal is to support all game types and any number of connections, we started with a new high performance transport layer based on UDP. While it’s true that a lot of games do quite well with TCP, fast action games will need UDP: when packets arrive out of order, TCP withholds the more recent packets from the application until the missing ones are retransmitted (head-of-line blocking), which adds unpredictable latency.

From this new transport layer we built two new APIs. We have a new High Level API (HLAPI) which introduces a simple and secure client/server networking model. If you’re not a network engineer and you want to easily make a multiplayer game, the HLAPI will interest you.

We also wanted to address feedback we’d received on our old system: some users needed to have a lower level access for greater control. So we also have the Low Level API (LLAPI) which provides a more socket-like interface to the transport layer. If you are a network engineer and want to define a custom network model or just fine tune your network performance, then the LLAPI will interest you.
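
The LLAPI itself isn’t public yet, but to make “socket-like” concrete, here is what a minimal UDP send/receive flow looks like using the standard .NET UdpClient. This is plain .NET, not UNET; the RoundTrip helper and all names here are our own illustration:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// A minimal sketch of the kind of socket-like UDP flow a low-level
// transport API implies. Uses the standard .NET UdpClient, not the
// (unreleased) UNET LLAPI.
public static class UdpSketch
{
    // Send one datagram to a local listener and read it back.
    public static string RoundTrip (string text, int port)
    {
        using (var server = new UdpClient (port))  // bind the "server" socket
        using (var client = new UdpClient ())
        {
            byte[] msg = Encoding.UTF8.GetBytes (text);
            client.Send (msg, msg.Length, "127.0.0.1", port);

            // Each UDP datagram is independent: a lost or late packet
            // never delays delivery of the ones that did arrive, which
            // is why fast action games prefer UDP over TCP.
            var remote = new IPEndPoint (IPAddress.Any, 0);
            byte[] received = server.Receive (ref remote);
            return Encoding.UTF8.GetString (received);
        }
    }

    public static void Main ()
    {
        Console.WriteLine (RoundTrip ("hello", 11000)); // prints "hello"
    }
}
```

A higher-level API would layer connection handling, channels and serialization on top of this kind of raw datagram exchange.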

The Matchmaker service is used to configure rooms for your multiplayer game and get your players to find each other. And finally the Relay Server makes sure your players can always connect to each other.

We know from our prior experiences that making multiplayer games involves a lot of pain.  So the Multiplayer Foundation is a new set of easy to use professional networking technology, tools and infrastructure for making multiplayer games without this pain. To even get started, I think it is fair to say that making a multiplayer game requires a fair bit of knowledge of networking and protocols. You either overcome the painfully steep learning curve yourself or find a network engineer to join you.  Once you’ve gotten past that, you then have to solve the problem of getting your players to find each other.  And once you’ve solved that problem, you now have to deal with getting players to be able to actually connect with each other, which can be troublesome when they are behind firewalls with NAT.  But then if you’ve solved all of that you’ve created a bunch of associated infrastructure which wasn’t game development and probably wasn’t fun. And now you have to worry about dynamically scaling your infrastructure which usually takes a bit of prior experience to get right.

Our Phase 1 addresses each of these pain points. The HLAPI eliminates the need for a deep knowledge of networking. But the LLAPI is there if you are a network engineer and you want to do things your own way. The Matchmaker solves your problem of getting your players to find each other. The Relay Server solves your problem of getting players to be able to connect to each other. And we also solved your problem of the associated infrastructure and dynamically scaling it. The Matchmaker and Relay Server live in Unity’s Multiplayer Cloud. So not only do the physical servers scale up and down based on demand, but the processes scale up and down as well.

We are very excited about UNET and are eager to share more details. Over the next few weeks we’ll follow up with more blogs from the rest of the team.  We would love to hear what you think, and we can’t wait to see what you all make with this in the future.

Community posts on ‘Learn’ – Teach Us!



The Novelist and the Asset Store: The Visual Scripting Story

Kent Hudson made a game that is part The Shining, part Gone Home and part something new entirely. In The Novelist, you are a ghost helping a writer who’s struggling with work-life balance. The developer told me that the uScript plugin was his own friendly ghost in the machine.

“I know this sounds like a shameless plug, but it’s true: Unity and the Asset Store are the reason I’m able to make games independently,” says Kent, who previously worked on games like Deus Ex: Invisible War and BioShock 2 before going indie. He has more than a decade of game development experience, but says that without the uScript visual scripting tool, creating The Novelist would have been out of reach for him.

“I come from a systems design background, so I think very technically, but I’ve never stuck with programming courses long enough to actually become a proficient engineer. I’m used to architecting reusable systems and game objects, though, so uScript was the perfect tool for me,” explains Kent Hudson.

He used it for player movement, the memory system, controlling the UI, the human AI behaviors, the narrative structure of the game, and every other bit of on-screen functionality in the game. “Not a single line of code was written for my game; the entire thing was built in uScript”.

Here’s the uScript editor window, opened up to the logic that computes character relationships when the player makes decisions. Click on the thumbnail to see the full screenshot:

[Screenshot: the uScript editor window]

Another big advantage of using uScript is its powerful reflection system, which means that it can interface with other Unity plugins. There’s no extra support required to get it working with code from other programmers on the project or other Asset Store plugins and extensions.

Kent Hudson also used NGUI for the UI and its partner plug-in, HUD Text,  to create the thoughts that float above the characters’ heads. “Instead of crafting a UI system from the ground up, I was able to focus on writing the text that would be displayed by the UI.”

The Highlighting System by Deep Dream and Glow Per-Object plug-ins are responsible for object highlighting in the game. All of these are connected with uScript.

So what is Kent Hudson up to next? “Now that I’m so familiar with Unity, I feel like there aren’t many limits on what I can do for my next game. I can start up a new Unity project, import my key plug-ins, and start building things right away. The number of possibilities the Asset Store has opened up has been amazing, and I feel like it’s only going to get better from here.”

Here’s a shot of all of the possible outcomes that can result from the player’s decisions in The Novelist. Click on the thumbnail to see the full screenshot:

[Screenshot: the possible outcomes of the player’s decisions in The Novelist]

All assets used: uScript, NGUI, HUD Text, Highlighting System, and Glow Per-Object.

Dependency injection and abstractions

Testability is an important feature of any software product – game development is no exception. To enable testability, all components should be independent and testable in isolation.

When we want to test something in isolation, it means we want to decouple it. Loose coupling is what we need. It is so easy to embed hidden dependencies in your game, and so hard to break them. This article will help you understand loose coupling and dependency injection within a Unity project, using the example project on GitHub.

Let’s take handling input as an example.

public class SpaceshipMotor : MonoBehaviour
{
  void MoveHorizontally ()
  {
    var horizontal = Input.GetAxis ("Horizontal");
    // ...
  }
}

The MoveHorizontally method uses the static Unity API (the Input class) without telling you. It considers this call to be its private business, and you can’t control or influence the situation. This makes the SpaceshipMotor class tightly coupled to the static Unity API, and you can’t verify the behaviour of the SpaceshipMotor class unless you physically press a key on the keyboard. It’s annoying.

Now let’s take this situation under control. You are in charge here.

The SpaceshipMotor class uses only the horizontal axis, so we can define a short description of the kind of functionality it expects from user input.

public interface IUserInputProxy
{
  float GetAxis(string axisName);
}

Then you can substitute the call to real Input with the call to our abstraction.

public class SpaceshipMotor : MonoBehaviour
{
  public IUserInputProxy UserInputProxy { get; set; }

  void MoveHorizontally ()
  {
    var horizontal = UserInputProxy.GetAxis ("Horizontal");
    // ...
  }
}

Now you are in charge of the situation! The class can’t operate unless you provide it with an IUserInputProxy implementation.
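
In the actual game you still need a concrete implementation that forwards to Unity’s real Input. A minimal sketch (the class name UnityInputProxy is ours; the article doesn’t show this part) might be:

```csharp
// Production implementation of the abstraction: simply forwards to
// Unity's static Input API. Used in play mode, while tests inject a
// fake implementation instead. (The name UnityInputProxy is our own.)
public class UnityInputProxy : IUserInputProxy
{
  public float GetAxis (string axisName)
  {
    return UnityEngine.Input.GetAxis (axisName);
  }
}
```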

This is called Dependency Injection (DI): the dependency (Input in our case) is passed to the dependent object (the SpaceshipMotor class) and becomes part of its state (a field in our case).

There are several options for passing a dependency: constructor injection, property injection, and method injection.

Constructor injection is considered the most popular and most robust approach: when the dependency is passed during construction, the chance of having the object in an uninitialized state is minimal.

public class SpaceshipMotor : MonoBehaviour
{
  private readonly IUserInputProxy userInputProxy;

  public SpaceshipMotor (IUserInputProxy userInputProxy)
  {
    this.userInputProxy = userInputProxy;
  }
}

But the Unity engine calls the constructors for MonoBehaviours, and we can’t control this process.

Still, property and method injection are both usable in this case.

The easiest approach to manual Dependency Injection (DI) is to use a script that injects the dependencies.

In “Growing Games Guided by Tests” we use an interface to expose a property dependency.

public interface IRequireUserInput
{
  IUserInputProxy InputProxy { get; set; }
}

And a script that allows us to set the parameters of fake input in the scene and inject it when the tests start.

public class ArrangeFakeUserInput : MonoBehaviour
{
  public GameObject Spaceship;
  public FakeUserInput FakeInput;

  void Start ()
  {
    // Requires "using System.Linq;" for OfType
    var dependents = Spaceship.GetComponents<MonoBehaviour> ().OfType<IRequireUserInput> ();
    foreach (var dependent in dependents)
      dependent.InputProxy = FakeInput;
  }
}
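
The FakeUserInput type referenced above isn’t shown in the article; one plausible shape for it (our assumption, with the interface repeated so the sketch is self-contained) is a canned-value implementation that tests can configure:

```csharp
using System;
using System.Collections.Generic;

// The abstraction from the article, repeated here so the sketch is
// self-contained.
public interface IUserInputProxy
{
    float GetAxis (string axisName);
}

// A plausible shape for FakeUserInput (the article never shows it –
// this is our assumption): a serializable object returning canned axis
// values that a test sets up. In the scene it can sit behind a
// [Serializable] field, as in ArrangeFakeUserInput above.
[Serializable]
public class FakeUserInput : IUserInputProxy
{
    // Axis values the test wants the game code to "see".
    private readonly Dictionary<string, float> axes = new Dictionary<string, float> ();

    public void SetAxis (string axisName, float value)
    {
        axes[axisName] = value;
    }

    public float GetAxis (string axisName)
    {
        float value;
        return axes.TryGetValue (axisName, out value) ? value : 0f;
    }
}
```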

How does this contribute to testability?


We have lots of examples in “Growing Games Guided by Tests” where fake user input is injected with a helper script, which lets us test the behaviour.

On the other hand we can write unit tests for classes that depend on abstractions.

[Test]
public void ChangesStateToIsFiringOnFire1ButtonPressed()
{
  // Arrange
  // Set up a test double for user input (using NSubstitute)
  // (assumes IUserInputProxy also declares bool GetButton(string))
  IUserInputProxy userInput = Substitute.For<IUserInputProxy> ();
  // Tell the GetButton method of the test double to return true
  // when the state of "Fire1" is requested
  userInput.GetButton (Arg.Is ("Fire1")).Returns (true);
  // Pass the dependency to the Gun object on creation
  Gun gun = new Gun (userInput);
  // Act
  gun.ProcessInput ();
  // Assert
  Assert.That (gun.IsFiring, Is.True);
}

Now you see that there is no magic to dependency injection. It is the process of substituting concrete dependencies with abstractions and making them external to the dependent object.

To use DI on a large scale you need a tool to automate it. This will be the topic of our next blog post.


Shader Compilation in Unity 4.5

A story in two parts: 1) how shader compilation is done in the upcoming Unity 4.5, and 2) how it was developed. The first part is probably interesting to Unity users; the second is for those curious about how we work and develop stuff.

Short summary: Unity 4.5 will have a “wow, many shaders, much fast” shader importing and better error reporting.

Current state (Unity ≤ 4.3)

When you create a new shader file (.shader) in Unity or edit an existing one, we launch a “shader importer”, just like for any other changed asset. That shader importer does some parsing, and then compiles the whole shader for all the platform backends we support.

Typically, when you create a simple surface shader, it internally expands into 50 or so internal shader variants (the classic “preprocessor driven uber-shader” approach). And typically there are 7 or so platform backends to compile for (d3d9, d3d11, opengl, gles, gles3, d3d11_9x, flash – more if you have console licenses). This means that each time you change anything in the shader, a couple hundred shaders are being compiled. And that’s assuming you have a fairly simple shader – if you throw in some multi_compile directives, you’ll be looking at thousands or tens of thousands of shaders being compiled. Each. And. Every. Time.
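
To make the combinatorics concrete, here’s a toy sketch (our own illustration, using the rough numbers from the paragraph above) of how variant counts multiply:

```csharp
using System;

// Toy illustration of shader variant explosion: the compile count is
// the product of the internal variants, the platform backends, and
// each multi_compile line's option count. Numbers are the rough
// figures from the text, not exact Unity internals.
public class VariantCount
{
    public static long Count (int internalVariants, int backends, int[] multiCompileOptions)
    {
        long total = (long) internalVariants * backends;
        foreach (int options in multiCompileOptions)
            total *= options;   // each multi_compile line multiplies the total
        return total;
    }

    static void Main ()
    {
        // A simple surface shader: ~50 variants x 7 backends = 350 compiles.
        Console.WriteLine (Count (50, 7, new int[0]));          // 350
        // Add three multi_compile lines with 2 options each: x8.
        Console.WriteLine (Count (50, 7, new[] { 2, 2, 2 }));   // 2800
    }
}
```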

Does it make sense to do that? Not really.

Like most of “why are we doing this?” situations, this one also evolved organically, and can be explained with “it sounded like a good idea at the time” and “it does not fix itself unless someone works on it”.

A long time ago, Unity only had one or two shader platform backends (opengl and d3d9). And the amount of shader variants people were doing was much lower. With time, we got both more backends, and more variants; and it became very apparent that someone needs to solve this problem.

In addition to the above, there were other problems with shader compilation, for example:

  • Errors in shaders were reported, well, “in a funny way”. Sometimes the line numbers did not make any sense – which is quite confusing.
  • Debugging generated surface shader code involved quite some voodoo tricks (#pragma debug etc.).
  • Shader importer tried to multi-thread compilation of these hundreds of shaders, but some backend compilers (Cg) have internal global mutexes and do not parallelize well.
  • Shader importer process was running out of memory for really large multi_compile variant counts.

So we’re changing how shader importing works in Unity 4.5. The rest of this post will be mostly dumps of our internal wiki pages.

Shader importing in Unity 4.5

  • No runtime/platforms changes compared to 4.3/4.5 – all changes are editor only.
  • No shader functionality changes compared to 4.3/4.5.
  • Shader importing is much faster; especially complex surface shaders (Marmoset Skyshop etc.).
    • Reimporting all shaders in graphics tests project: 3 minutes with 4.3, 15 seconds with this.
  • Errors in shaders are reported on the correct lines; errors in shader include (.cginc) files are reported with the correct filename and line number.
    • Was mostly “completely broken” before, especially when include files came into play.
    • On d3d11 backend we were reporting error column as the line, hah. At some point during d3dcompiler DLL upgrade it changed error printing syntax and we were parsing it wrong. Now added unit tests so hopefully it will never break again.
  • Surface shader debugging workflow is much better.
    • No more “add #pragma debug, open compiled shader, remove tons of assembly” nonsense. Just one button in inspector, “Show generated code”.
    • Generated surface shader code has some comments and better indentation. It is actually readable code now!
  • Shader inspector improvements:
    • Errors list has scrollview when it’s long; can double click on errors to open correct file/line; can copy error text via context click menu; each error clearly indicates which platform it happened for.
    • Investigating compiled shader is saner. One button to show compiled results for currently active platform; another button to show for all platforms.
  • Misc bugfixes
    • Fixed multi_compile preprocessor directives in surface shaders sometimes producing very unexpected results.
    • UTF8 BOM markers in .shader or .cginc files don’t produce errors.
    • Shader include files can be at non-ASCII folders and filenames.

Overview of how it works

  • Instead of compiling all shader variants for all possible platforms at import time:
    • Only do minimal processing of the shader (surface shader generation etc.).
    • Actually compile the shader variants only when needed.
    • Instead of typical work of compiling 100-1000 internal shaders at import time, this usually ends up compiling just a handful.
  • At player build time, compile all the shader variants for that target platform
    • Cache identical shaders under Library/ShaderCache.
    • So at player build time, only not-yet-ever-compiled shaders are compiled; and always only for the platforms that need them. If you never use Flash, for example, then none of the shaders will be compiled for Flash (as opposed to 4.3, where all shaders are compiled for all platforms, even if you never need them).
  • Shader compiler (CgBatch) changes from being invoked for each shader import, into being run as a “service process”
    • Inter-process communication between the compiler process and Unity uses the same infrastructure as the Version Control plugin integration.
    • At player build time, go wide and use all CPU cores to do shader compilation. Old compiler tried to internally multithread, but couldn’t due to some platforms not being thread-safe. Now, we just launch one compiler process per core and they can go fully parallel.
    • Helps with out-of-memory crashes as well, since the shader compiler process never needs to hold a bazillion shader variants in memory all at once – it sees one variant at a time.
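
The compile-on-demand caching described above boils down to a hash-keyed lookup: key a compiled variant by a hash of (source, platform, keywords), and compile only on a miss. Here is a rough sketch of the idea (all names and the hash choice are ours, not Unity’s actual CgBatch code):

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

// Sketch of the compile-on-demand shader cache described above: a
// variant is keyed by a hash of (source, platform, keywords) and
// compiled only on a cache miss. Our illustration, not Unity's code.
public class ShaderVariantCache
{
    private readonly Dictionary<string, byte[]> cache = new Dictionary<string, byte[]> ();
    private readonly Func<string, byte[]> compile;   // the expensive step

    public int Compilations { get; private set; }

    public ShaderVariantCache (Func<string, byte[]> compile)
    {
        this.compile = compile;
    }

    public byte[] GetVariant (string source, string platform, string keywords)
    {
        string key = Hash (source + "\n" + platform + "\n" + keywords);
        byte[] bytecode;
        if (!cache.TryGetValue (key, out bytecode))
        {
            // Cache miss: compile once, then reuse for identical requests.
            bytecode = compile (source);
            Compilations++;
            cache[key] = bytecode;
        }
        return bytecode;
    }

    private static string Hash (string text)
    {
        using (var sha1 = SHA1.Create ())
            return BitConverter.ToString (sha1.ComputeHash (Encoding.UTF8.GetBytes (text)));
    }
}
```

The same idea extends naturally to an on-disk cache (Library/ShaderCache): identical requests, from the editor or a player build, hit the cached bytecode instead of the compiler.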

How it was developed

This was mostly a one-or-two person effort, developed in several “sprints”. For this one we used our internal wiki for detailed task planning (Confluence “task lists”), but we could just as well have used Trello or something similar. Overall this was probably around two months of actual work – but spread out over a much longer time. The initial sprint started in March 2013, and the work landed in the 4.5 codebase in a “we think we can ship this tomorrow” state just in time for the 1st alpha build (October 2013). Minor tweaks and fixes were done during the 4.5 alpha and beta period. Should ship any day now, fingers crossed!

Surprisingly (or perhaps not), the largest piece of work was in the “how do you report errors in shaders?” area. Since shader variants are now compiled only on demand, some errors can be discovered only some time after the initial import. This is a by-design change, however – the previous approach of “let’s compile all possible variants for all possible platforms” clearly does not scale in terms of iteration time. Still, “the shader seemed like it did not have any errors, but whoops, now it has” is clearly a potential downside. Oh well; as with almost everything, there are upsides and downsides.

Most of development was done on a Unity 4.3-based branch, and after something was working we were sending off custom “4.3 + new shader importer” builds to the beta testing group. We were doing this before any 4.5 alpha even started to get early feedback. Perhaps the nicest feedback I ever got:

I’ve now used the build for about a week and I’m completely blown away with how it has changed how I work with shaders.

I can try out things way quicker.
I am no longer scared of making a typo in an include file.
These two combine into making me play around a LOT more when working.
Because of this I found out how to do fake HDR with filmic tonemapping [on my mobile target].

The thought of going back to regular beta without this [shader compiler] really scares me.

Anyhoo, here’s a dump of tasks from our wiki (all of them had little checkboxes that we’d tick off when done). As usual, “it basically works and is awesome!” was achieved after the first week of work (1st sprint). What was left after that was “fix all the TODOs, do all the boring remaining work” etc.

2013 March Sprint:

  • Make CgBatch a DLL
    • Run unit tests
    • Import shaders from DLL
    • Don’t use temp files all over the place
  • Shader importer changes
    • Change surface shader part to only generate source code and not do any compilation
    • Make a “Open surface compiler output” button
    • At import time, do surface shader generation and cache the result (serialize in Shader, editor only)
    • Also process all CGINCLUDE blocks and actually do #includes at import time, and cache the result (after this, left with CGPROGRAM blocks, with no #include statements)
    • ShaderLab::Pass needs to know it will have yet-uncompiled programs inside, and able to find appropriate CGPROGRAM block:
      • Add syntax to shaderlab, something like Pass { GpuProgramID int }
      • Make CgBatch not do any compilation, just extract CGPROGRAM blocks, assign IDs to them, and replace them with “GpuProgramID xxx”
      • “cache the result” as editor-only data in shader: map of snippet ID – CGPROGRAM block text
    • CgBatch, add function to compile one shader variant (cg program block source + platform + keywords in, bytecode + errors out)
    • Remove all #include handling from actual shader compilers in CgBatch
    • Change output of single shader compilation to not be in shaderlab program/subprogram/bindings syntax, but to produce data directly. Shader code as a string, some virtual interface that would report all uniforms/textures/… for the reflection data.
  • Compile shaders on demand
    • Data file format for GPU programs and their params
    • ShaderLab Pass has map: m_GpuProgramLookup (keywords → GPUProgram).
    • GetMatchingSubProgram:
      • return one from m_GpuProgramLookup if found. Get from cache if found
      • Compile program snippet if not found
      • Write into cache

2013 July Sprint:

  • Pull and merge last 3 months of trunk
  • Player build pipeline
    • When building player/bundle, compile all shader snippets and include them
    • exclude_renderers/include_renderers, trickle down to shader snippet data
    • Do that properly when building for a “no target” (everything in) platforms
      • Snippets are saved in built-in resource files (needed? not?)
    • Make building built-in resource files work
      • DX11 9.x shaders aren’t included
      • Make building editor resource file work
    • Multithread the “missing combinations” compilation while building the player.
      • Ensure thread safety in snippet cache
  • Report errors sensibly
  • Misc
    • Each shader snippet needs to know keyword permutation possibly needed: CgBatch extracts that, serialized in snippet (like vector vector )
    • Fix GLSLPROGRAM snippets
    • Separate “compiler version” from “cgbatch version”; embed compiler version into snippet data hash
    • Fix UsePass

2013 August Sprint:

  • Move to a 4.3-based branch
  • Gfx test failures
    • Metro, failing shadow related tests
    • Flash, failing custom lightmap function test
  • Error reporting: Figure out how to deal with late-discovered errors. If there’s bad syntax, typo etc.; effectively shader is “broken”. If a backend shader compiler reports an error:
    • Return pink “error shader” for all programs ­ i.e. if any of vertex/pixel/… had an error, we need to use the pink shaders for all of them.
    • Log the error to console.
    • Add the error to the shader, so it’s displayed in the editor. Can’t serialize the shader at that time, so add the errors to a database under Library (guid → errors).
      • SQLite database with shader GUID → set of errors.
    • Add shader to list of “shaders with errors”; after rendering loop is done go over them and make them use pink error shader. (Effectively this does not change current (4.2) behavior: if you have a syntax error, shader is pink).
  • Misc
    • Fix shader Fallback when it pulls in shader snippets
    • The “Mesh components required by shader” part at build time – need to figure them out! Problem: we need to compile the variants to even know them.
    • Better #include processing, now includes same files multiple times
  • Make CgBatch again into an executable (for future 64 bit mac…)
    • Adapt ExternalProcess for all communication
    • Make unit tests work again
    • Remove all JobScheduler/Mutex stuff from CgBatch; spawn multiple processes instead
    • Feels like it is leaking memory, have to check
  • Shader Inspector
    • Only show “open surface shader” button for surface shaders
    • “open compiled shader” is useless now, doesn’t display shader asm. Need to redo it somehow.

2013 September Sprint:

  • Make ready for 4.5 trunk
    • Merge with current trunk
    • Make TeamCity green
    • Land to trunk!
  • Make 4.3-based TeamCity green
    • Build Builtin Resources fails with shader compiler RPC errors
    • GL-only gfx test failures (CgProps test)
    • GLSLPROGRAM preprocessing broken, add tests
    • Mobile gfx test failures in ToonyColors
  • Error reporting and #include handling
    • Fixing line number reporting once and for all, with tests.
    • Report errors on correct .cginc files and correct lines on them
    • Solve the problem of multiple includes / the preprocessor affecting includes this way: at snippet extraction time, do not do include processing! Just hash the include contents and feed that into the snippet hash.
    • UTF8 BOM in included files confusing some compilers
    • Unicode paths to files confusing some compilers
    • After shader import, immediately compile at least one variant, so that any stupid errors are caught and displayed immediately.
  • Misc
    • Make flags like “does this shader support shadows?” work with new gpu programs coming in
    • Check up case 550197
    • multi_compile vs. surface shaders, fix that
  • Shader Inspector
    • Better display of errors (lines locations)
    • Button to “exhaustively check shader” – compiles all variants / platforms.
    • Shader snippet / total size stats

What’s next?

Some more work in shader compilation land will go into Unity 5.0 and 5.x. Here’s an outline of another of our wiki pages, describing 5.x-related work:

  • 4.5 fixes “compiling shaders is slow” problem.
  • Need to fix “New standard shader produces very large shader files” (due to lots of variants – 5000 variants, 100MB) problem.
  • Need to fix “how to do shader LOD with new standard shader” problem.

Showing off the Shy Shaders

You can never have too many shaders. Whether you’re pursuing the elusive goal of hyper realistic 3D graphics or making a cute cartoon game for kids, shaders are definitely on your radar. And the Asset Store is here to help. 

While Unity 5 will make shader programming a breeze, there are still a lot of specialist assets that will come in handy in specific situations. We’d like to show you a few shaders that are currently on the shelves of the Asset Store, have great ratings and stellar support, but are a bit hidden behind the row of top sellers.

Candela SSRR: Advanced Screen Space Glossy Reflections by Livenda

This asset makes beautifully realistic reflections. In other words, it’s a highly optimized, advanced screen-space ray-traced glossy reflection post-effect solution. And it’s very easy to work with, giving you fine control over the shiny surfaces in your desktop game. Pixel accurate. Pretty awesome.

Depth of Field Mobile Shader by Barking Mouse Studio

If you’re making 3D mobile games with Unity Pro, you should definitely check this out. Just like a camera, it has an adjustable aperture, so you can intuitively control the depth of field while minimizing memory usage.


Planets by NexGen Assets

This great asset has a diffuse and a specular texture; it can also control the opacity of the clouds and night-lights on the planet, as well as its rotation and halo. It includes shaders for gas giants, stars and galaxies. Indispensable for space adventures!


Mobile HDR by Science Laboratory

Adapting the brightness of your scene swiftly can be draining on both the performance of your game and your development time. This HDR Bloom and Adaptive Brightness Correction tool has a custom inspector and includes full C# and Cg source code access. Save yourself the pain and get it!


Lens Dirtiness by David Miranda

Making a fast paced game and want to give players the feeling that the camera is right there in the dirt? Check out this camera post-processing effect for Unity Pro! Lens Dirtiness also includes lens flares and works on desktop and mobile. It costs less than a pizza!