Here’s a quick teaser image from the last project we were working on (taken by a friend, so do forgive the quality)…
It’s from a huge multi-user interactive installation we built along with Decane for a major U.S. wireless company. You’re looking at an 8-meter-by-2-meter screen, made out of 140 MicroTiles, running a Unity application pushing out 12 million pixels per frame, controlled by multiple Kinects.
All those thousands of items flying around are individual autonomous UnitySteer agents. They track passing users by forming small swarms, and eventually come together to form avatars when a user is detected at certain spots.
More details coming as soon as we are cleared to share them!
Back in March 2013, we announced a partnership with Sony based on support for every single one of SCE’s PlayStation platforms, in addition to Unity for PlayStation®3. Today, we’re thrilled to announce that with Unity 4.3 we’re releasing Unity for PlayStation®Vita to the public.
Developers with a licensed developer agreement from SCE for PS Vita will now be able to deploy their game to PS Vita via the Unity engine and make use of platform-specific functionality including:
- Motion sensors
- Front and rear cameras
- Dual analog sticks
- Rear Touch pad
Furthermore, Unity for PS Vita will enable you to integrate the full suite of PSN features into your game, including Trophies, Friends and Matching functionality.
As with all the other platforms we support, Unity for PS Vita allows you to develop your game once, without rewriting the code from scratch; simply build and run it on your PS Vita devkit. Not only that, you can now create both 2D and 3D games with Unity 4.3, animate almost anything with the native animation system Mecanim, and implement very cool graphics. And all of it can be run directly on your PS Vita devkit for quick iteration! Why don’t you head over to the release notes and take a look?
Zoink! has just released their new game “Stick It To The Man”, developed in Unity for both PS3 and PS Vita. The game has already been very well received. Check it out at http://www.stickitgame.com/ to see what it’s all about!
We are proud to have reached this platform milestone and we are very eager to see what exciting games you will create for PS Vita!
To become a licensed developer for PS Vita please visit the SCE company registration site (https://www.companyregistration.playstation.com).
This is the second part of my post about how we monitor Unity performance. In Part I, I explained how we monitor for and fix performance regressions. In Part II I’ll go into some details about how we do the monitoring.
Building the rig
We have built a performance test rig that can monitor selected development and release branches and report selected data points to a database.
The performance test rig continuously looks for new builds to test from our build automation system. As soon as new builds are ready, it runs tests on all platforms we have set up for performance testing. The results are reported to a database along with all the related information, such as the Unity version used, the platform, and details about the hardware and operating system. We also have a reporting solution that continuously monitors the data in the database and shows us if we have a significant performance regression. It can tell us in which release branch the regression occurs, and which platforms and tests are affected by it.
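As a rough illustration, the core of such a rig boils down to a polling loop like this (a simplified Python sketch; the `run_tests` and `report` callables stand in for the actual test runner and database client, which are not public):

```python
def report_new_builds(builds, seen, platforms, run_tests, report):
    """Run the performance suite on every build we haven't tested yet.

    `builds` is whatever the build automation system currently offers;
    `seen` remembers builds already measured across polling rounds.
    """
    for build in builds:
        if build["id"] in seen:
            continue  # already measured this build
        seen.add(build["id"])
        for platform in platforms:
            measurements = run_tests(build, platform)
            # Store results with full context so later reports can
            # slice by Unity version, platform, OS and hardware.
            report({
                "unity_version": build["version"],
                "platform": platform,
                "measurements": measurements,
            })
```

In the real rig this runs continuously, with `seen` persisted and the reports landing in the database via the .Net web service.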
The figure below shows the components of the performance test framework. For running the tests we have dedicated machines running tests on different platforms. We use a .Net web service to report test results to the database. And we have a reporting solution that presents results in nicely formatted reports.
When analyzing the test results we use a fixed set of hardware and software configurations and look for changes across different versions of Unity. For example, when testing the Windows Standalone platform we use the same hardware and Windows version for all the runs; only the Unity version varies.
As all of the data is saved in the database, we can build reports from it for different uses. For each Unity version we are about to release, we make a chart that shows its data points against previously released versions. We can also see a near-realtime status of performance tests for the release branches we are developing in parallel. When analyzing a regression, we can even inspect the individual measurements, not just the mean value.
Currently we have three dedicated performance test machines for handling the test running. We have one Mac Mini (OSX 10.8.5) that is used to run tests on the Mac platforms (Editor, Standalone). We have two Windows machines (Windows 7, Windows 8.0) that are used to run tests on the Windows platforms (Editor, Standalone). And we have a Nexus 10 device to run tests on Android. We intend to extend this to more platforms, hardware and software configurations, but more on that later. And before you ask, yes, OSX 10.9 Mavericks is coming. We just didn’t get to it yet.
Extending existing framework
We wrote about one of our test frameworks, the Runtime Test Framework, in an earlier blog post. For performance testing we have extended our existing test frameworks instead of creating a new one. This means we can tap into the existing test suites and use them for performance testing. Further, everyone that can write tests using existing frameworks can write performance tests. This also means it is easy to get new platforms running performance tests as soon as we have the platform running other tests.
The test suites
We have split the tests into two types called Editor Tests and Runtime Tests based on how they get run and what can be measured. Editor Tests are limited to the platforms the editor runs on. Editor Tests can make measurements outside of Unity user scripts. The tests measure things like editor start-up time and asset import times for different types of assets. Runtime Tests, on the other hand, can be run on all the supported runtime platforms. With Runtime Tests it is possible to measure things like frame time of rendered frames or to measure the time it takes to deserialize different types of assets. It is actually possible to measure anything you can do with user scripts in Unity.
What’s next
First of all, we will add more configurations, both hardware and software. OSX 10.9 is high on the list, and so are the remaining mobile platforms and consoles. We truly believe that the performance rig is a lifesaver when it comes to detecting performance regressions as early as possible. Further, the rig makes it very easy to compare performance across different platforms and versions of Unity. We will keep you posted as we find interesting results – and different usages of the rig.
(This guest blog post comes from our online service partner, Kii)
With Kii, game developers get a fast and scalable backend, powerful analytics and game distribution services, so they can focus on the stuff that matters — the game experience. Back in September when we released our Unity SDK, we briefly explained what Unity is and how your Unity games can benefit from using Kii Cloud by allowing you to focus on what matters to your players rather than developing the game backend.
We’ve learned a lot about game development since then, including that practically all successful games share some common components. They start with a great idea, have an awesome user experience, and have a scalable backend to support any spikes in growth and ongoing performance needs. The best games also gather insights into player behaviors and usage, and build a strong user acquisition machine.
While Unity is the gaming engine that transforms an idea into a beautiful experience, Kii provides the complete game backend — including tools, insights and a distribution package to help developers aggressively distribute their games.
So, we’re excited to announce a partnership with Unity that makes Kii Cloud available via Unity’s Asset Store. Game developers can get their hands on Kii early in the development process so they can add a solid, scalable gaming backend. That means powerful analytics, user management, flexible data storage and retrieval, geolocation and much more. It also means access to more targeted developer communities.
Using Kii Cloud in your game
If you are a game developer, chances are you’re already using Unity or have at least heard of it — it’s one of the best gaming engines out there. But it might be harder to see why you need a robust, carrier-grade backend built for games. So we’ll provide examples of how to leverage Kii in your games and also discuss in more detail how we’re taking our Unity commitment to the next level.
Why you need a game backend
As you build a game, you need to determine where you will store data. You can keep everything on the devices themselves, but that makes syncing data across devices and platforms challenging. You could move all data to a server and code all the logic to manage that data, but is that where you want to spend your time and resources? Finally, will it be fast enough to handle that data when your game scales to millions of users?
Building on a backend platform is the best choice for complex games. Our Unity-native SDK lets you easily manage data without sacrificing speed, security or scalability.
A carrier-grade backend also plays a huge role in your frontend user experience, which, as you know, is a determining factor in the success of your game. Response time is a huge part of this, and it depends greatly on your choice of backend.
Middleware that makes your games successful
Making your game competitive—and easier to deploy and manage—depends on the “middleware” pieces you deploy. Components like geolocation, data management, user management, push services and analytics are standard—but not necessarily pieces you should develop yourself. Kii provides these building blocks out of the box.
Track usage behaviors and iterate
You’ll need to understand game metrics like retention, engagement and player behavior to improve game design and monetization. Most popular analytics SDKs only offer “event-based” analytics — meaning when something you want to measure happens in the game, you have to fire an event that gets logged in the backend. This is difficult to maintain, since every time you want to measure a new metric you have to modify your game, redeploy and send new updates to your players. Kii Analytics supports event-based analytics but also leverages user-generated data that’s already being stored in the backend, so you can get deeper insights on the fly without ever touching the deployed game! In addition to standard analytics, you can create advanced metrics about player-to-player social interactions, user demographics, usage progress, dropoff points and more, so you can optimize for better design and increased game usage.
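The difference between the two styles can be shown with a toy model (Python, with invented names; this is not the actual Kii Analytics API, just the idea being described):

```python
class AnalyticsBackend:
    """Toy model contrasting event-based analytics with analytics
    computed over game data that is already stored server-side."""

    def __init__(self):
        self.events = []   # explicit events fired from the game client
        self.objects = []  # ordinary game data the backend already holds

    def fire_event(self, name, **attrs):
        # Event-based: the shipped game must call this for every metric
        # you want, so a new metric means a client update.
        self.events.append({"name": name, **attrs})

    def save(self, obj):
        # Normal backend usage: the game stores its data here anyway.
        self.objects.append(obj)

    def metric_from_stored_data(self, predicate):
        # Server-side: define a brand-new metric over existing data
        # without touching the deployed game.
        return sum(1 for obj in self.objects if predicate(obj))
```

For example, if player progress objects are already stored, a "players who died at least once on level 3" metric can be added purely on the backend.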
Increase the distribution of your game
You’ve built an awesome game, but how do you get people to try it? Through Kii to China, you can distribute your Unity-based games to the world’s largest smartphone market. And through Kii’s handset and carrier partnerships in Japan, you can also distribute your Unity-based games to the world’s best-monetized mobile gaming market.
Unity demos, code and tutorials for Kii Cloud
Since adding asynchronous call support to our Unity SDK (a feature that allows you to use our backend without your players ever noticing it), we have been working on a bunch of Unity demos:
- KiiUnitySDKSamples is a generic demo that systematically shows all Kii Cloud API calls via the Unity SDK. It’s not attached to a game, but it’s a Unity project that runs without modification, exposing a Unity-based GUI for interaction.
- UnityAngryBotsKii takes the official Unity 3D 4.3 AngryBots demo game and makes use of Kii Cloud via the Unity SDK. The demo is under development but it already showcases several Kii features.
- HelloKii-Unity is a skeleton project that shows basic user management and data management (including queries) in the context of a simple breakout game. It’s included with the Unity SDK package.
Getting started with Kii Cloud for Unity
Want to get started quickly? Check out our Unity Quick Start for both the Kii Cloud and Kii Analytics SDKs to get up and running in a snap! Alternatively, you can download the Unity Skeleton Project (an empty project with our SDKs already in place) when you create a Unity project on developer.kii.com. If you’re looking for more advanced examples, check out our demo section above. We will keep working closely with the Unity community to bring you the leanest and fastest cloud backend for your games, and we hope you enjoy all the resources now at your disposal. When you get rid of backend coding and maintenance, you can focus more resources on the things that matter, like design and playability.
Don’t hesitate to contact us on the Kii Developer Community to let us know about your Kii-powered games. We’ll be happy to showcase them through our channels.
Herb Sutter from the ISO C++ group reached out to the Cairo folks: “We are actively looking at the potential standardization of a basic 2D drawing library for ISO C++, and would like to base it on (or outright adopt, possibly as a binding) solid prior art in the form of an existing library.”
And also: “We are focused on current Cairo as a starting point, even though it's not C++ -- we believe Cairo itself is very well-written C (already in an OO style, already const-correct, etc.).”
Congratulations to the Cairo guys for designing such a pleasant-to-use 2D API.
But this would not be a Saturday blog post without pointing out that Cairo's C-based API is easier and simpler to use than many of those C++ libraries out there. The more sophisticated the use of the C++ language to get some performance benefit, the more unpleasant the API is to use.
The incredibly powerful Antigrain sports an insanely fast software renderer and also a quite hostile template-based API.
We got to compare Antigrain and Cairo back when we worked on Moonlight. Cairo was the clear winner.
We built Moonlight in C++ for all the wrong reasons (“better performance”, “memory usage”), and it was a decision we came to regret. Not only were the reasons wrong; it is not clear we got any performance benefit, and it is clear that we did worse on memory usage.
But that is a story for another time.
The following blog post was written by Jasin Bushnaief of Umbra Software to explain the updates to occlusion culling in Unity Pro 4.3.
This is the last post in a three-post series. In the first one, I described the new occlusion culling system in Unity 4.3 and went through the basic usage and parameters. In the second one, I gave a list of best practices and general recommendations for getting the most out of Umbra. This last post deals with troubleshooting some common issues people tend to encounter when using Umbra.
Visualizations
Unity offers a couple of helpers for figuring out what’s going on in occlusion culling. These visualizations may help you figure out why occlusion culling isn’t behaving quite as you’d expect. The visualizations can be found by enabling the Visualization pane in the Occlusion window and selecting the camera.
The individual visualizations can then be enabled and disabled in the Scene view, in the Occlusion Culling dialog.
Let’s take a look at what the different visualizations do.
Camera Volumes
The Camera Volumes visualization simply shows you, as a grey box, in which cell the camera is located. For more information on what the cells are, take a look at the first post. This is one way of figuring out how the value of smallest occluder changes the output resolution of the data, for instance. Also, if it looks like the cell bounds don’t make sense, for example when the cell incorrectly extends to the other side of what should be an occluding wall, something may be amiss.
Visibility Lines
The purpose of the Visibility Lines visualization is to show you the line of sight that Umbra sees. The way it works is that Umbra will project its depth buffer back into the scene and draw lines to the furthermost non-occluded points in the camera’s view. This may help you to figure out, for instance, which holes or gaps cause “leaks” in occlusion, ultimately causing some objects to become visible. This may also reveal some dubious situations where some object that clearly should be a good occluder doesn’t occlude anything because of, say, forgetting to enable the static occluder flag for the object.
Portals
The Portals visualization will draw all the traversed portals as semi-transparent axis-aligned quads. Not only will this help you get an idea of how many portals Umbra traverses and thus help you deal with occlusion culling performance tweaking, but it also provides another way of looking at what’s in Umbra’s line of sight. So you can see if there are some spots in the scene that don’t really cause occlusion, and how the portals get placed into the scene in general.
While occlusion culling should just work in Unity, sometimes things don’t go quite as you’d expect. I’ll go over the most common issues people tend to run into, and how to solve those issues in order to make your game run smoothly.
Hidden objects aren’t being culled!
Sometimes people wonder why some objects are reported visible by Umbra when in reality they seem to be occluded. There can be many reasons for this. The most important thing to understand is that Umbra is always conservative. This means that it always opts for objects being visible rather than invisible whenever there’s any uncertainty in the air. This applies to all tie-breaking situations as well.
Another thing to note is that the occlusion data represents a simplified version of the scene’s occluders. More specifically, it represents a conservatively simplified version, meaning some of the occlusion erodes and loses detail.
The level of detail that gets retained in the data is controlled by smallest occluder. Decreasing the value will produce higher-resolution data that should be less conservative, but at the same time, culling will lose some speed and the data will get larger.
Visible objects are being culled!
Probably the most puzzling problematic scenario is when something gets reported by Umbra as occluded even though it shouldn’t be. After all the promises of always being conservative and never returning false negatives, how can this happen?
Well, there can be a couple of things going on. The first and by far the most common case is that you’re looking at something through a hole, gap or crack which gets solidified by Umbra’s voxelization. So typically the first thing you should try is to reduce the value of smallest hole and see if that fixes the issue. You can try temporarily tuning it down even quite a bit just to test if that’s the issue.
There are situations where this may not be completely obvious. For instance, if you have a book shelf in your scene where individual books are marked as occluders, too large a smallest hole may cause some of the books to be occluded either by the shelf or by the other books. So again, just decreasing the value of smallest hole is probably the first thing you should try.
Another case where objects may disappear is when your backface limit has been set to something less than 100 and your camera is in the vicinity of back-facing triangles. Note that the camera doesn’t have to actually be looking at the triangles nor do the triangles have to be facing away from the camera at that particular spot. It is enough that there is a topologically connected place (i.e. not behind a wall or anything) close to the camera from which some back-facing triangles can be seen.
The first thing to do to remedy this is obviously to try with a backface limit of 100 and see if that fixes the issue. If it does, it may make sense to modify the geometry, either by re-modeling some of the assets so that they’re two-sided or solid, or by just removing the static occluder flag from the problematic objects. Or, if you don’t care about the occlusion data size or don’t get a huge benefit out of the backface optimization, disabling the backface test by setting the value to 100 is of course also an option.
Culling gets weird very close to or inside an occluder!
Culling may behave strangely if your camera goes inside an occluder, or infinitesimally close to one. Typically this may occur in a game with a 3rd person camera. Because Umbra considers occluders as solid objects, culling from inside one will typically mean that most of the stuff in your scene will get culled. On the other hand, if the backface test has been enabled, many of the locations inside occluders will have been removed from the data altogether, yielding undefined results. So you should not let the camera go inside occluders!
To be more specific, in general Umbra will be able to guarantee correct culling when the camera is further away from an occluder than the value of smallest hole. In most cases, going even closer will still work, but because of the limitations the voxel resolution imposes on the accuracy of the occlusion data, going super close to an occluder may in some cases result in the camera being incorrectly assigned to a location inside an occluder. Hint: use the “camera volumes” visualization to see in which cell the camera is located and what it looks like.
Generally, when the backface test is enabled (i.e. when backface threshold is something smaller than 100), Umbra will do a better job near occluders, because it is able to detect the insides of occluders, and correspondingly dilate all valid locations slightly towards them, so that you’ll get correct results even if you go arbitrarily close to an occluder. So if you cannot prevent your camera from going very close (or even slightly inside) an occluder, the first thing you may wish to try is to set backface threshold to something smaller than 100. This will help with dilation and may fix the issue.
If tweaking backface threshold does not help, or if your camera goes very deep inside an occluder, the only thing left to do is to simply remove the occluder flag from the object.
Culling is too slow!
The reason for slow culling is typically very simple. Umbra traverses too many portals, and thus the visibility query takes a long time. The parameter that controls the portal resolution in the occlusion data is smallest occluder. A larger value will produce a lower-resolution portal graph, which is generally faster to traverse, up to a point. There are some situations, however, where this is not the case. Specifically, when having to simplify the occluder data conservatively, sometimes the increased conservativity of a lower-resolution graph may cause the view distances to increase, and the total amount of traversed portals to increase with it as well. But this is not the most typical of situations. In general, a large smallest occluder value will produce data that is faster to process in the runtime, at the cost of reduced accuracy of the occlusion.
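To see why the portal count dominates query time, here is a much-simplified sketch of a traversal over a cell-and-portal graph (Python; real Umbra clips each portal against the view and accumulated occlusion rather than doing a blind breadth-first walk):

```python
from collections import deque

def visible_cells(graph, start):
    """Breadth-first walk over a cell-and-portal graph.

    `graph` maps each cell to the cells reachable through its portals.
    Returns the reached cells and the number of portals crossed; the
    latter is the quantity that grows with a finer-grained graph and
    makes the runtime visibility query slower.
    """
    visited = {start}
    queue = deque([start])
    portals_crossed = 0
    while queue:
        cell = queue.popleft()
        for neighbour in graph[cell]:
            portals_crossed += 1  # every portal touched costs work
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)
    return visited, portals_crossed
```

A coarser graph (larger smallest occluder) merges cells, so fewer portals are crossed per query; the trade-off is the reduced accuracy described above.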
Another, but obviously a bit more arduous way of making sure that the number of traversed portals doesn’t get out of hand is to modify the geometry of the scene so that the view distances don’t get too long in the problematic areas. Manually inserting occlusion into open areas will of course cause the traverse to terminate sooner, reducing the amount of processed portals and thus making occlusion culling faster.
Baking is too slow!
The speed of baking largely depends on one thing: the number of voxels that need to be processed. In turn, the number of processed voxels is defined by two factors: the dimensions of the scene and the voxel size. Assuming you can’t do much about the former, the latter you can easily control with the smallest hole parameter. A larger value will of course speed up baking. So, it may make sense to start with a relatively large value and then tune it down if your objects are incorrectly disappearing because of too aggressive occluder generation. A microscopic smallest hole may cause baking to take forever and/or to consume ridiculous amounts of memory.
Occlusion Data is too large!
If baking your scene produces too much occlusion data, there are a couple of things you can try. First, changing the value of backface limit to something smaller than 100, for instance 99, 50 or even 30 may be a good start. If you do this, make sure that culling works correctly in all areas your camera may be in. See the previous post for more information.
If changing backface limit is not an option, produces unpredictable results or doesn’t reduce the data size enough, you can try increasing the value of smallest occluder which determines the resolution of the occlusion data and thus has a very significant impact on the size. Note that increasing smallest occluder also increases the conservativity of the results.
Finally, it’s worth noting that huge scenes will naturally generate more occlusion data than small ones. The size of the occlusion data is displayed at the bottom of the Occlusion window.
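The scaling behind the bake-time and data-size advice is easy to see with some back-of-envelope arithmetic (a deliberately naive Python model; the real bake subdivides the scene into computation tiles and does far more than count voxels):

```python
def voxel_count(scene_size_m, smallest_hole_m):
    """Rough voxel count for a cubical scene of the given edge length.

    Each axis is divided by the voxel size, so the total number of
    voxels (and hence bake work) grows cubically as smallest hole
    shrinks or the scene grows.
    """
    voxels_per_axis = scene_size_m / smallest_hole_m
    return voxels_per_axis ** 3
```

Halving smallest hole therefore means roughly eight times the voxels to process, which is why a microscopic value makes baking take forever.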
In some rare cases, where the scene is vast in size and the smallest occluder parameter has been set to a super small value, baking may fail with the error “Failure in split phase”. This occurs because the initial step of the bake tries to subdivide the scene into computation tiles. The subdivision is based on the smallest occluder parameter, and when the scene is humongous in size (like, dozens of kilometers in each direction) too many computation tiles may be created, resulting in an out-of-memory error. This, in turn, manifests as “Failure in split phase” to the user. Increasing the value of smallest occluder and/or splitting the scene into smaller chunks will get rid of this error.
That’s it!
This concludes our three-post series on occlusion culling in Unity 4.3. For more information about Umbra, visit www.umbrasoftware.com.
Hello! My name is Sakari Pitkänen. I work as a developer on the Toolsmiths Team here at Unity. In this blogpost I will tell you about how we do automated performance monitoring of the Unity development branches.
With an ever-increasing number of test configurations (platforms, operating systems, versions), it gets increasingly difficult to keep track of everything that is going on. We need visibility, and to get it we need data. Our main reason for collecting performance data from Unity is to prevent performance regressions.
Finding performance regressions
As we do day-to-day development we are not likely to notice if performance is degrading as time goes by, which is a big problem. We want to always try to make the next version of Unity perform better than the current one – and we definitely don’t want anything to be slower without us noticing it.
Most of our current performance tests measure time over some specific functionality of Unity. For example, we can measure the frame time over some number of frames when exercising a specific piece of rendering functionality. The tests are designed to look for regressions, so the measurements are implemented in such a way that whenever a measurement increases significantly, we have a performance regression. Each time we run a test, we run it many times and use the median value of the samples, so that a single bad sample won’t show up as a regression. Besides time, we can measure other things like memory usage.
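In pseudo-Python, the comparison logic amounts to something like this (the 5% threshold is an illustrative number, not our actual criterion):

```python
from statistics import median

def is_regression(baseline_samples, new_samples, threshold=1.05):
    """Compare median measurements between two Unity versions.

    Using the median means a single bad sample (a GC pause, an OS
    hiccup) cannot show up as a regression on its own; the new version
    only flags when its typical measurement is clearly worse.
    """
    return median(new_samples) > median(baseline_samples) * threshold
```

Lower is better here, since the samples are frame times in milliseconds.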
Before we dig into the details of how we do this, let’s look at a concrete example: How we found a performance regression and used our data points to verify that it got fixed.
Performance regression and fix
In Unity version 4.3.0 we had a performance regression that affected a specific platform, Windows Standalone. Below is a table that has results for a limited set of tests run on four different platforms and two versions of Unity, 4.2 and 4.3. For all of these tests, the values are median values of measured frame times in milliseconds. The table is not showing performance per se; instead it lists the sample values. This means that an increase in a measurement value can be considered a performance regression (red) and a decrease can be considered an improvement (green).
The results show that the Windows Standalone platform has a significant performance regression that affects most of the selected tests. From the test names one can already assume the cause of the regression is probably graphics related. Unfortunately, we were still working on the test rig when 4.3.0 was released, so we didn’t get the data to catch this before shipping. That will surely not happen next time, and as we widen the coverage with more configurations we expect to significantly reduce the risk of shipping with performance regressions moving forward.
We did find the cause for this particular regression and promptly fixed it for Unity version 4.3.1. Then we ran the tests again, now comparing the last three released versions of Unity: 4.2, 4.3.0 and 4.3.1.
We could verify that the fix was effective and that these tests show no significant changes in performance between Unity versions 4.2 and 4.3.1.
In Part II of this post I’ll tell you about the rig we have built for performance testing.
I was pursuing a UnitySteer issue last week using a sample project provided by Martin Haeusler, and it made me realize that since we have no unified tutorial anywhere, there are some UnitySteer assumptions that might escape new users. Since I haven’t yet had time to sit down and write a full tutorial – too crazy busy with work – I’ll start listing assumptions and gotchas that you may need to be aware of when working with it.
In no particular order…
- A vehicle’s forces are not updated every frame. Instead, a ticked priority queue is used to control how often this happens. The default is at 10Hz (every 100ms) and can be changed using the vehicle’s _tickLength property.
- The same applies to radars, except that they are updated by default at only 2Hz. Detection can be an expensive endeavor if vehicles and obstacles are tightly packed together, so I do not recommend decreasing a radar’s tick length blindly.
- UnitySteer uses the Radar component to “see” other objects, both vehicles and obstacles. Radar itself uses Physics.OverlapSphere to check for nearby objects. Only items whose collider overlaps the vehicle’s radar radius will be detected. This means that you may want to extend an obstacle’s detection radius beyond the object itself, so that vehicles can see it coming instead of reacting at the last minute.
- Radar looks for DetectableObjects in the root, not across the whole hierarchy. This means that you should not group obstacles or vehicles under a parent, or Radar may not find the detectable object it expects, leading it to ignore some or all of your grouped items.
- If you are doing a space ship or car game, use AutonomousVehicle, not Biped – they apply the force in a different manner.
- When creating your vehicle prefabs, make sure that your AutonomousVehicle has a radius that is large enough to envelop the entire visuals of the object (you can check by enabling Draw Gizmos). This radius will be used for overlap calculations, so if it is not large enough chances are it’ll lead to cases where you get a visual collision when UnitySteer thinks there is no overlap.
- Make sure you assign enough force to your behaviors, particularly those that you need to have kick in for corrections (like the avoidance post-processing behaviors).
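The first two points above can be sketched as follows (an illustrative Python model of a ticked priority queue; UnitySteer’s actual implementation is in C# and differs in detail):

```python
import heapq

class TickedQueue:
    """Each agent is updated at its own interval instead of every
    frame, which is the idea behind _tickLength: a heap keeps agents
    ordered by the time their next update is due."""

    def __init__(self):
        self._heap = []  # entries: (next_due_time, id, agent)

    def add(self, agent, now=0.0):
        heapq.heappush(self._heap, (now + agent.tick_length, id(agent), agent))

    def update(self, now):
        """Tick every agent whose update is due, then reschedule it."""
        while self._heap and self._heap[0][0] <= now:
            due, key, agent = heapq.heappop(self._heap)
            agent.tick()
            heapq.heappush(self._heap, (due + agent.tick_length, key, agent))
```

Calling `update` once per frame then costs almost nothing for agents whose tick isn’t due, so a vehicle at 10Hz and a radar at 2Hz get exactly the update rates described above regardless of frame rate.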
Found any other non-obvious assumptions yourself? Please add them to the comments!
The following blog post was written by Jasin Bushnaief of Umbra Software to explain the updates to occlusion culling in Unity Pro 4.3.
This is the second post in a three-post series. In the previous post, I discussed how the new occlusion culling system works in Unity 4.3. I went over the basic usage and parameters that you need to know in order to get the best out of occlusion culling. In case you missed it, check it out here.
This post gives you a list of general recommendations and tips that should help you get the best results out of occlusion culling.

Good quality occlusion
It may seem obvious, but the first thing to make sure of is that your scene actually contains meaningful occlusion. Moreover, the occlusion should preferably come from good, large occluders, as opposed to fine details that only accumulate into occluders when looked at from a certain angle. Umbra will generally not be able to perform occluder fusion, so even if your lush forest with lots of foliage will occlude something behind it, it will do so only once the individual occluders are “accumulated”. In this sense, trees and forests in general will be pretty bad occluders from Umbra’s point of view. On the other hand, a mountain is a good occluder, and Umbra will certainly be able to capture it into the occlusion data as expected.
There are two main types of objects Umbra cares about: occluders and occludees. The former are just geometry, which Umbra treats basically as a single, solid model. The latter are the objects whose visibility Umbra actually tests using the occlusion data. Occluders consist of pretty much all geometry that has the “Occluder static” flag set; occludees, unsurprisingly, are the renderers with the “Occludee static” flag.
As a rule of thumb, and by default, you can and should set most if not all of your renderers as occludees in order for Umbra to cull them. Also by default, most of your static renderers can be occluders as well. Just make sure that if a renderer is non-opaque, it isn’t an occluder. (Unity will actually issue a warning if this is the case.) This naturally includes transparent objects and such.
But also, if your object contains very small holes (consider e.g. a cheese grater or a bushy plant) that you wish to see through, but reducing the value of smallest hole globally doesn’t make sense (see the previous post as to why), simply removing the occluder flag from the renderer is the correct thing to do.
Furthermore, because occluders are considered solid, correct culling can typically be guaranteed if the camera doesn’t intersect an occluder. This means that if e.g. the collision system cannot prevent the camera from flying inside an occluder, you should probably remove the occluder flag in order to get meaningful results.

Object granularity
Given the fact that Umbra does object-level occlusion culling, it doesn’t make a whole lot of sense to have objects of several kilometers in size. Such massive objects are very hard to cull, as some part of the object is almost always visible, especially combined with Umbra’s conservative culling. So, splitting up e.g. the terrain into multiple patches is typically a good idea, unless you want the entire terrain to always be visible.
In terms of occlusion culling, the best object subdivision is typically a natural one, meaning that naturally distinct objects should probably be kept separate for culling as well. Chunking objects together too aggressively typically doesn’t help; group only objects that are similar in terms of visibility. On the other hand, too fine-grained a subdivision may introduce some unnecessary per-object overhead. In reality, this becomes a problem only once there are tens of thousands of occludees in the scene.
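As a concrete illustration of the terrain-splitting advice above, here is a hedged Python sketch (the function name and flat tuple representation are mine, not a Unity API) that splits a terrain footprint into an N×N grid of patches that can then be culled independently:

```python
def split_terrain(min_x, min_z, max_x, max_z, patches_per_side):
    """Split a terrain footprint into an N x N grid of equal patches
    (returned as (min_x, min_z, max_x, max_z) tuples) so each patch
    can be culled independently, instead of keeping one huge object
    that is almost always partially visible."""
    w = (max_x - min_x) / patches_per_side
    d = (max_z - min_z) / patches_per_side
    return [(min_x + i * w, min_z + j * d,
             min_x + (i + 1) * w, min_z + (j + 1) * d)
            for i in range(patches_per_side)
            for j in range(patches_per_side)]
```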
Maybe it should be emphasized that only the object subdivision of occludees matters. Occluders are considered to be a single big bowl of polygon soup anyway.

Watertight models
In the previous post, I briefly described how Umbra first voxelizes the occluding geometry, groups these voxels into cells and then connects the cells with portals. In the process, Umbra is always conservative, meaning that in various cases Umbra considers the occluders slightly smaller than what they are in reality, or conversely, the empty areas slightly larger.
This means that if there happens to be an unintentional hole in your occluding geometry, one that survives voxelization rather than getting patched up, there’s a good chance it’ll become somewhat magnified in the final output data. This may lead to surprising “leaks” in occlusion: the camera may be looking at a seemingly solid wall, but things behind the wall don’t get occluded because there’s some unnoticeably small crack somewhere.
So, while voxelization does patch a lot of unintentional cracks and gaps in the occluding geometry, it’s still highly important to try to model the geometry as water-tightly as possible. In the next post, I’ll describe the Visibility Lines visualization, which may help you debug these kinds of issues.

Finding the right parameter values
Admittedly, the hardest part of using Umbra is finding the right parameter values. The default values in Unity do a good job as a starting point, assuming that one Unity unit maps to one meter in your game and the game’s scale is “human-like” (e.g. not modeled at the molecular level, nor is your typical object a planet or a gigantic killer-mech-robot).
A good rule of thumb is to start with relatively large values and work your way down. In the case of smallest hole, for instance, the larger the value you can use, the swifter the bake process. Thus, you should tune it down only if/when you start experiencing culling artifacts, i.e. false negatives. Similarly, starting with a relatively large value for smallest occluder typically makes sense.
Then you can start adjusting it downward and see how Umbra culls better. Stop when culling starts taking up too much time and/or the occlusion data becomes too large.
As for backface threshold, start with 100. If your occlusion data is too large, or if you happen to get weird results when the camera is very close to, or possibly even intersects, an occluder, try using 90 or even a smaller value. More on this in the next post.

To be continued…
About a year ago I uploaded the first release of SteerForSphericalObstacleRepulsion, a UnitySteer behavior that was meant to replace SteerForSphericalObstacleAvoidance and improve on it. See, avoidance only takes into account the closest obstacle, and my plan with Repulsion was to have it consider all surrounding obstacles and assign more weight to the ones we’d intersect with.
It was a somewhat hurried implementation, but it worked well enough on my test cases.
Turns out there was a stupid error that slipped by me. See this line:

avoidanceMultiplier = Vehicle.Radar.Obstacles.Count;
As you can see, if our path puts us on a collision course with an obstacle, we do increase the avoidance force. However, for some silly reason – don’t ask me why – I scaled it not by the distance but by the number of obstacles detected. That line should be something along these lines:

avoidanceMultiplier = Mathf.Clamp(Vehicle.DesiredSpeed / nextDistance, 1, 3);
The avoidance multiplier should likely be clamped anyway, which will probably be a different parameter for the behavior. I’ll do some more tests and expect I’ll have a fix up on github soon.
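For the curious, the logic of the corrected line, with the clamp bounds pulled out as parameters as suggested above, looks roughly like this in Python (a sketch of the intent, not the actual UnitySteer source):

```python
def avoidance_multiplier(desired_speed, next_distance,
                         min_mult=1.0, max_mult=3.0):
    """Scale the avoidance force by urgency: the closer the predicted
    intersection relative to our desired speed, the stronger the push,
    clamped so distant obstacles still register at full base force and
    very near ones don't produce absurd forces."""
    return max(min_mult, min(max_mult, desired_speed / next_distance))
```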
Over the past two years, Unity QA has expanded and built tools, frameworks, and test rigs for internal use, something we have previously blogged about. Through all this work, we have created a lot of value internally at Unity, and we want to give our users access to these awesome tools. Today we have released version one of the Unity Test Tools on the Asset Store. Get it here:
Thus we made the decision to make Unity Test Tools available to our users, and we hope they will help you attain high quality in your code while developing your games.

Unit Test
The lowest and most efficient level at which to do automation is the unit level. We have decided to use NUnit, a well-known framework for those already writing unit tests, as the basis for our unit test framework.
We have ensured that the integration with the editor is intuitive and simple, with the option of automatic execution on every recompile, so you have immediate feedback while you write your code. Another important aspect of a test framework is the ability to build a pipeline where unit tests are executed headlessly, and Unity Test Tools gives you this option as well.

Integration Test
In order for you to test the integration between components, objects and assets, you need a higher level of tests and for this we have made the Integration Test Framework. Integration tests are also the easiest way of starting to write automation tests on a game in development, since they don’t have requirements on the architecture of the code.
The simplest use of these would be to make a scene, use assets and gameobjects, and set up the conditions you want to verify. The execution is as simple as with the unit tests: the runner will cycle through the scenes containing tests and execute each test for you. This framework can also be integrated into a build pipeline seamlessly, so you can test all levels of your project from the command line.

Assertion Component
The assertion component is able to monitor the state of gameobjects while in playmode and pause the game if the condition is met. The primary use is debugging, where you can make the game stop in the exact frame the condition is met, thus enabling you to see the entire picture of when it occurs.
To help manage the assertion components, we have added an explorer, similar to a list of breakpoints in code, so you have an overview of the conditions and where the components are placed. The component can evaluate complex conditions at runtime and is thus more powerful than a normal breakpoint.
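As a rough mental model (plain Python, not the actual component's API; all names here are mine), the assertion component behaves like a per-frame condition check that can be switched off for release builds:

```python
class Assertion:
    """Per-frame condition check attached to an object; 'enabled'
    models stripping assertion components out of release builds."""
    def __init__(self, name, condition, enabled=True):
        self.name = name
        self.condition = condition
        self.enabled = enabled

    def fires(self, state):
        return self.enabled and self.condition(state)

def first_failing_frame(frames, assertions):
    """Step through frame states and return the index of the first
    frame where any enabled assertion fires (i.e. where the game
    would pause), or None if nothing trips."""
    for i, state in enumerate(frames):
        if any(a.fires(state) for a in assertions):
            return i
    return None
```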
In order to enable the build and release pipeline, we have made it possible for you to disable assertion components in release builds, thus making sure you don’t hit them in a released game.

Examples
In the package you download, you will find a full set of examples and comprehensive documentation on how each of the tools works.

The Future
Releasing the tools is just the beginning for us. We are committed to releasing tutorials and patterns that will help you structure your projects so that they are testable. We will also continue improving the tools and deepening their integration into Unity, all with the aim of making it easy for you to start testing your projects.
This is a featured guest article by Christian Thurau on game metrics, from essential to advanced, and their value in game development. Christian has a PhD in data mining and machine learning, and he is the CTO of GameAnalytics, one of the latest official Unity Asset Store Service Partners.
Facts about Christian:
Favorite games: Old-school shooters, mostly Quake 1, Team Fortress, and Bioshock
Favorite GA tool: Heatmaps
Favorite metric: Playtime
Analytics has become a much-discussed topic in game development in recent years, not only for F2P but across the entire industry. While it is still creativity, intuition, and experience that matter most in creating successful games, statistics on games and players have become an integral part of game development.
Unfortunately, analytics have become somewhat notorious as tools for monetization, used to influence player behavior to maximize revenue. What is less known is that metrics are just as useful in increasing fun and engagement. Hopefully, with this article, I can shed some light on the true nature of game analytics and break through the shadow of corporate greed that clouds its horizon.

It’s all About Engagement
The first and most important question with any analytics tool is what exactly you should track. While you would need to see into the future to know every metric you’ll ever need, there are some clear must-have measures that will provide a very solid starting point. Generally, these basic metrics can be placed in three broad categories: Acquisition, Monetization, and Engagement.
You may have already figured it out from the introduction, but of these three categories, I believe engagement is the most important, mainly for two reasons:
Some engagement metrics sit at the base of player lifetime value calculations, a metric used chiefly to ensure that user acquisition delivers a positive return on investment.
Engagement relates closely to the game’s “funness.” It is a measure of the degree to which your game fulfils, so to speak, its destiny. There’s no reason to attempt to improve on other metrics if engagement is low.
Of course, the priority of these metrics might change according to individual game needs.
If you have never tackled analytics before, this might be a lot to take in. The good news is that the free GameAnalytics SDK for Unity automatically takes care of all the basic tracking needs I mentioned. All you have to do is interpret the results.
For example, we estimate engagement by generating both retention and session metrics.
Day 1, day 3, day 7, day 14, day 28 retention: “Day X Retention” refers to the percentage of users who return to the game X days after installing it. This metric can be used to track how new builds or features of your game perform. The more often people return to the game, the more satisfied they are with the experience and the more likely they are to spend. On the other hand, low retention, depending on when it manifests itself, can indicate anything from a weak core loop to low production value or an endgame that offers insufficient content.
Daily active users (DAU), monthly active users (MAU), and the DAU/MAU ratio: The number of people who use your game on a given day and in a given month can give you key insights into its popularity, as well as allowing you to plan for growth and server loads.
Session length, average session length, and number of sessions per user: Sessions describe when and how long players engage with your game. Average session statistics can paint a clear picture of how your game fits in the player’s lifestyle—whether it’s played in short bursts on the bus or train, for example, or the player really likes to allocate a lot of time to get into the experience. Using session lengths, you can tweak the game experience to match the most popular (and natural) play styles. If you are using in-game advertisements, you can also look at sessions to estimate ad exposure and thus ad revenue.
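If you ever want to compute these engagement numbers yourself from raw install/session logs, the arithmetic is simple. A Python sketch (the data shapes and function names are my own assumptions, not the GameAnalytics SDK):

```python
def day_x_retention(install_day_by_user, active_days_by_user, x):
    """Percentage of users who were active exactly X days after install.

    install_day_by_user: {user_id: day number of install}
    active_days_by_user: {user_id: set of day numbers with a session}
    """
    installed = list(install_day_by_user)
    returned = [u for u in installed
                if install_day_by_user[u] + x in active_days_by_user.get(u, set())]
    return 100.0 * len(returned) / len(installed)

def dau_mau(users_by_day, day, month_days):
    """DAU, MAU and their ratio for a given day and the month's days."""
    dau = len(users_by_day.get(day, set()))
    mau = len(set().union(*(users_by_day.get(d, set()) for d in month_days)))
    return dau, mau, dau / mau
```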
Once you achieve solid engagement numbers, monetization and acquisition metrics come into play. GameAnalytics generates a number of fundamental metrics to allow you to estimate your income:
Revenue: the amount of currency, converted to USD, collected on a daily basis. You should always display revenues segmented by acquisition campaigns to identify your best-performing acquisition channels. Also, it is recommended that revenue from in-game purchases be distinguished from advertising revenue through custom event hierarchies.
Average revenue per daily active user (ARPDAU) and average revenue per paying user (ARPPU): how much users actually spend, both in absolute terms and compared to the total number of users. The bigger the ARPDAU, the better the chances your game will be self-sustaining.
Conversion rate: the percentage of users who make an in-game purchase for the first time on a given day. You can use conversion rates to assess the effectiveness of different special offers, for example.
Number of paying users and number of transactions: how many users pay, and how many transactions they make, in a given period. Flat numbers can sometimes be misleading; it’s helpful to look at the trend lines for the number of paying users versus the number of transactions in order to determine whether you need to adjust in-game product prices.
Acquisition metrics have only recently been introduced into the GameAnalytics tool, so they are a bit on the basic side:
Paid vs organic users: comparison of the number of users who came to the game in a natural manner (word of mouth, friend invite, or stumbling upon it in the Game Store) to the number of users acquired via an advertising campaign. This basic metric can point to what is known as the K-factor, or the number of additional users that each user introduces to the game. This is very difficult to calculate without tracking all in-game invites, but the paid vs. organic users metric can at least give you a rough estimate. The K-factor is essential to determine whether your user acquisition campaigns pay off.
Number of installs by country, build, acquisition campaign, or other factor: the number of times your game is installed. The installs metric can show what iteration of your game is most popular and where the game performs best. The number of installs also gives you a rough overview of where your game is in its life cycle.
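The rough K-factor estimate mentioned above can be sketched in one line of Python (this is only the coarse paid-vs-organic approximation described here, not a full viral-loop model):

```python
def k_factor_estimate(organic_users, paid_users):
    """Rough viral coefficient: organic installs attributed to each
    paid install. Above 1.0, every acquired user brings in more than
    one extra player, so paid acquisition compounds."""
    return organic_users / paid_users
```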
Remember, though, that you can segment all of the other predefined metrics by acquisition campaign parameters, as well. This ultimately allows you to make the best decisions when it comes to user acquisition.

Advancing to the Next Level
The more you get a feel for using analytics as part of game development, the more you’ll find the need to track data that is unique to your game. This is easily done with GameAnalytics, which is centered around the concept of abstract events. An event is simply anything that can happen inside your game: a rocket that gets fired, an item that is picked up, or a banner that gets clicked. No matter what is important to your game, you can track it using GameAnalytics events.
Note, however, that while the old mantra of “there’s no data like more data” is certainly true, selecting which custom events you should track is crucial. You should always design the desired metrics in advance; otherwise, you’ll risk wasting a lot of time trying to get your head around what is basically clutter.
As nothing beats solid statistics when it comes to optimizing in-game economies, you could start with tracking specific item purchases. Then, you can easily gather statistics on your best-selling virtual goods, find out what works and what doesn’t, and ultimately get rid of useless items and optimize item pricing.
For even more advanced uses, combine user progression events with the funnel tool to open up a new world of tracking possibilities:
New user flow: Having a perfect first-time flow greatly improves the retention of your game. Data on where users drop out is essential for tweaking the learning curve.
First purchase: Identifying what item is being purchased first by players can help you understand what motivates players to spend money. This ultimately allows you to maximize conversion rates.
Missions and achievements completion: Since achievements and missions are optional in most games, their completion rate is a good indicator of what type of content is most popular among players.
Level progression: The players’ progression through the game is a good indicator of how much of the original game content has been consumed, allowing you to time your content updates perfectly.
There are a ton of other useful analyses you can do with the free GameAnalytics tool, and we are continuously working on new, exciting features to help you build better games.

Our Gift to the Unity Community: 3D Heatmaps
Last but not least, I want to mention that there’s one GameAnalytics feature that is exclusively available inside Unity: 3D heatmaps. Heatmaps visually depict the frequency of events, directly over the game-level architecture. Let’s assume that you are already tracking players’ death events. With the 3D heatmap visualization, you can determine exactly where in the game players are dying. You may notice bottlenecks, such as narrow bridges where players tend to bump into each other or where they are simply more exposed. If you also track killer positions, you may also notice exploits, points where players camp out to ambush their enemies. Heatmaps are not only a great tool for fine tuning game balance; they also help in verifying that your game is played as intended.
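At its core, a heatmap is just binning event positions into grid cells and counting. A minimal Python sketch of the idea (GameAnalytics' actual implementation is certainly more sophisticated; the cell-size parameter and names are mine):

```python
def heatmap_bins(positions, cell_size):
    """Bin world-space event positions (e.g. player deaths) into a
    sparse 3D grid of counts keyed by integer cell coordinates."""
    bins = {}
    for x, y, z in positions:
        key = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        bins[key] = bins.get(key, 0) + 1
    return bins

def hottest_cell(bins):
    """The cell with the most events -- a candidate bottleneck."""
    return max(bins, key=bins.get)
```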
I’ve barely scratched the surface of heatmaps with this short example, but you can read a lot more about them on the GameAnalytics blog. There are some nice visual examples there, as well. Maybe we’ll be able to delve into the details in a future article.
You can read our blog for many kinds of insights into data analysis in games, from in-depth introductions to the basics to real-world case studies. We invite leading industry experts and researchers to contribute to our blog on a regular basis.

Get the GameAnalytics SDK—it’s all free, with no data limits!
The GameAnalytics SDK for Unity is right here in the Asset Store: https://www.assetstore.unity3d.com/#/content/6755
After embedding this code, all the necessary info like name, category, price and rating shows up, together with a picture and a link. When somebody’s done reading your post, they can hurry up and buy the asset. Needless to say, this is especially useful if you’re a publisher.
For example, paste this in (in the text view, if you’re using WordPress):
and you’ll get this:
To embed other assets, simply find their Content ID at the end of the URL of their asset store page. 11228 is the Content ID of our 2D platformer:
This is the general script:
It’s just a little convenience feature, but it can save a lot of time.
This should work with most content management systems for websites. The only condition is that you should be able to embed a <script> tag.
You can choose between different preview sizes and even embed multiple items. For a complete list of possibilities, head over to Keigo’s demo page.
Hello everybody! I’d like to tell you about Super BR Jam, an amazing and unique game jam in Brazil supported by Unity that was held November 22nd-24th.
A huge group of experienced Brazilian game developers, including companies such as Critical (Dungeonland), Behold Studios (Knights of Pen and Paper, Chroma Squad) and Swordtales (Toren), put their internal projects on hold and came together for an entire weekend to make brand new games.
And here’s where things start to get interesting: all games developed during Super BR Jam will be made available for purchase under a “pay what you want” model. However, if you pay more than 5 USD, you’ll also get:
- Dungeonland All-Access Pass
- Magicka: Wizard Wars
- The Showdown Effect
- Knights of Pen and Paper +1 Edition
- Project Tilt
- Out There Somewhere
- Qasir Al-Wasat: A night in between
The best part: all proceeds from the bundle sales will go to charity! The chosen charity is Solar Meninos de Luz, a private philanthropic organization that provides formal and complementary education and basic health care to 400 children in impoverished communities in Rio de Janeiro, Brazil.
This is what Mark Venturelli, former Game Designer at Critical Studios said about the event: “Wow. I was completely blown away by how amazing the people were. We wanted to show the world how strong our indie scene is here in Brazil, and how we are working together and helping each other. This is a very different game jam: we are selling the games and 100% of the profits will go to a school for poor children in Rio. I think this noble cause really gave everyone an extra incentive to come together and create things that are absolutely bursting with love and care. We still have a few days to go, so we are all still working together – this time to boost the publicity on the event and try to raise as much money as we can.”
Saulo Camarotti, CEO and Founder of Behold Studios said, “At first we thought this would be just another game jam, where we would get some friends together to make a game in very few hours. We decided to make a game that would force us to stretch our skills, to go beyond what we did before, and when we were done (after a lot of sweating) we realised that all that hard work that we put into proving ourselves was directly connected to the help that we would be providing to the children and volunteers at Solar. It was an incredible experience! We are definitely interested in participating again and helping other charities.”
After the event I was really glad to find out that 18 out of the 24 games were made with Unity and the results are very impressive!
You can find out more about Super BR Jam and purchase the bundle at www.superbrjam.com.
So wait no longer! The bundle will be available only until December 4.
It’s a great way to buy some cool games at a ridiculously low price and have a nice, warm feeling in your heart at the same time!
Link to site: http://www.universa.la/en/

More about the games made during the game jam:
Alien Kingdom by Behold Studios (PC + Mac)
A massively cooperative online strategy game in a beautiful alien world. This game cannot be described in words, just go play it!
Previous games from this developer: Knights of Pen and Paper, Chroma Squad
Kureizy Japanese TV Show by Swordtales
Like a Nintendo game if everyone at Nintendo had a stroke at the same time
Previous games from this developer: Toren
Legend of HUErule by BitCake Studio (PC + Mac)
A simple online RPG with adorable low-poly visuals where your party must work together to defeat increasingly difficult waves of bouncy monsters
Previous games from this developer: Project Tilt
Cinerea by Otus (PC + Mac)
A lone woman wandering through alleyways is the theme of yet another mysterious and atmospheric game from Otus.
Previous games from this developer: Niveus
Dinomancer by Critical Studio
Previous games from this developer: Dungeonland
Thunderstruck by Pocket Trap
Defend your castle using the power of lightning! A challenging but rewarding game that looks absolutely adorable
Previous games from this developer: Ninjin, Hell Broker
Final da Zuera by Hoplon
As the crowd of a soccer game, choose wisely how you will cheer for your team
Previous games from this developer: Taikodom, Heavy Metal Machines
Gory Creatures from Across the Yard by Pigasus
You are a crafty pig defending your yard in this polished 3D tower defense game
Previous games from this developer: Adventurezator
Seal League Gravity Ball by Critical Studio
Space seals play football with no gravity. Do not worry, they all have magnetic suits to hover around. And lasers. Can be played with 2 or 4 players locally.
Previous games from this developer: Dungeonland
The following blog post was written by Jasin Bushnaief of Umbra Software to explain the updates to occlusion culling in Unity Pro 4.3.
Unity 4.3 includes a plethora of improvements. One of the completely re-implemented subsystems is occlusion culling. Not only has the interface been simplified and the culling runtime itself revamped, a number of new features have also been added.
In this series of three posts, I’m going to go over how the new occlusion culling system works in Unity 4.3. This first post goes through the basics of how occlusion culling is done and what the basic usage is like with the user interface. The second post focuses on best practices to get the most out of occlusion culling. The third and final post focuses on some common problem scenarios and how to resolve them.
But let’s start with some basics. Occlusion culling refers to eliminating all objects that are hidden behind other objects. This means that resources will not be wasted on hidden stuff, resulting in faster and better-looking games. In Unity, occlusion culling is performed by a middleware component called Umbra, developed by Umbra Software. The UI from which Umbra is controlled in Unity can be found in Window -> Occlusion Culling, under the Bake tab.

How Umbra Works
Umbra’s occlusion culling process can be roughly divided into two distinct stages. In the editor, Umbra processes the game scene so that visibility queries can be performed in the game runtime, in the player. So first, Umbra needs to take the game scene as its input and bake it into a lightweight data structure. During the bake, Umbra first voxelizes the scene, then groups the voxels into cells and connects these cells with portals. This data, in addition to a few other important bits, is referred to as occlusion data in Unity.
In the runtime, Umbra then performs software portal rasterization into a depth buffer, against which object visibility can be tested. In practice, Unity gives Umbra a camera position, and Umbra gives back a list of visible objects. The visibility queries are always conservative, which means that false negatives are never returned. On the other hand, some objects may be deemed visible by Umbra even though in reality they are not.
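Conceptually, the conservative visibility query amounts to comparing an object's depth against the rasterized depth buffer and erring on the side of "visible". A toy Python sketch (the sparse dict-based depth buffer and names are purely illustrative, not Umbra's data structures):

```python
def visible(depth_buffer, obj_min_depth, obj_pixels):
    """Conservative visibility: an object passes if any of its screen
    pixels is closer than the occluder depth already rasterized there.
    Pixels not covered by any occluder count as infinitely far away,
    so objects there are always reported visible -- a false positive
    is acceptable, a false negative never happens."""
    return any(obj_min_depth < depth_buffer.get(p, float("inf"))
               for p in obj_pixels)
```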
It’s important to realize that, while this system appears similar to what was shipped with previous Unity versions, the entire system has been basically rewritten. A lot has changed for the better, both internally and externally!
How to Use Umbra
There are obviously a few considerations in getting the best out of occlusion culling. Ideally, you’d want the least conservative result as fast as possible. There are, however, tradeoffs involved. The more accurate (i.e. the least conservative) results you want, the higher-resolution data you need to generate. However, higher-resolution data is slower to traverse in the runtime, yielding slower occlusion culling. If occlusion culling requires more frame time than it saves by culling, it obviously doesn’t make a whole lot of sense. On the other hand, very quick culling isn’t of much help if only a few objects are culled. So it’s a balancing act.
The way Umbra lets you control this balance is by having you define a couple of bake parameters. The parameters determine what type of input the bake process should expect and what type of data is generated. In the runtime, using Umbra is as simple as it gets. If you’ve baked occlusion data and the camera has occlusion culling enabled in the Inspector, Unity will use Umbra automatically.
The input is controlled using the smallest hole parameter. When voxelizing the occluder geometry, smallest hole maps almost directly to the voxel size. This means that if your geometry contains intentional holes, gaps or cracks that you wish to see through, using a smallest hole smaller than these is a good idea. On the other hand, a lot of the time the geometry contains lots of unintentional cracks that you do not wish to see through. A reasonable voxel resolution will patch these up. It may help to think about smallest hole as the “input resolution” of the bake.
Note that setting smallest hole to a ridiculously small value means that baking will be unacceptably slow and/or take up a monumental amount of memory in the editor. In some rare cases, it may even cause the bake to fail due to insufficient memory. Then again, while using a larger value will be faster and more memory-friendly, it may cause Umbra to not see through things like grates or fences. So bigger isn’t always better either. In general, a smallest hole as large as possible without visible errors is desirable. In practice, we’ve found that values between 5 cm and 50 cm work fairly well for most games where the scale is “human-like”. The default value in Unity is 25 cm, and it’s a good starting point.
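To see why tiny smallest hole values blow up bake cost: if smallest hole maps roughly to the voxel size, voxel count grows cubically as the value shrinks. A back-of-the-envelope Python sketch (a uniform cubic scene is assumed purely for simplicity):

```python
def voxel_count(scene_size_m, smallest_hole_m):
    """Voxels needed to cover a cubic scene when the smallest hole
    parameter maps (roughly) to the voxel size. Halving smallest hole
    multiplies the total by eight, which is why very small values make
    the bake slow and memory-hungry."""
    per_axis = int(scene_size_m / smallest_hole_m)
    return per_axis ** 3
```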
While smallest hole mostly deals with what type of input geometry you have, smallest occluder determines what kind of output data is produced. In essence, you can think about smallest occluder as the output resolution of the data. The larger the value, the faster it is to perform occlusion culling in the runtime, but at the cost of increased conservativity (false positives). The smaller the value, the more accurate results are generated, but at the cost of more CPU time. Obviously higher-resolution data will mean a larger occlusion data size as well.
So, as the name implies, a small value means that very fine features are captured in the occlusion data. Under the hood, this directly maps to how large the cells Umbra creates are. Lots of small cells mean lots of small portals between them, and naturally it’s more expensive to rasterize a large number of small portals than vice versa.
The effects of changing smallest occluder can be seen in the picture below. Note how the depth buffer, which is essentially what Umbra sees, loses detail as smallest occluder increases.
In most games, keeping smallest occluder slightly larger than the player, so around a few meters, is a good default. So anywhere between 2 and 6 meters may make sense if your game’s scale isn’t microscopic or galactic. The default value in Unity is 5 meters.
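A rough back-of-the-envelope sketch of why this matters (hypothetical numbers, not Umbra’s real cost model): since smallest occluder acts like an output resolution, halving it roughly doubles the cell resolution per axis, so the total cell count — and with it data size and portal-rasterization work — grows roughly cubically:

```python
# Hypothetical scaling sketch: cells per axis ~ scene size / smallest occluder,
# so the total cell count grows roughly with the cube of the resolution.

def approx_cell_count(scene_size_m, smallest_occluder_m):
    cells_per_axis = max(1, round(scene_size_m / smallest_occluder_m))
    return cells_per_axis ** 3

# A 100 m scene at a few candidate smallest occluder values.
for so in (6.0, 5.0, 2.0):
    print(f"smallest occluder {so} m -> ~{approx_cell_count(100.0, so)} cells")
```

Going from 5 m to 2 m in this toy model multiplies the cell count by more than fifteen, which is why it pays to keep the value as large as your scene tolerates.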
Perhaps the most difficult parameter to grasp is called backface threshold. While in many cases you don’t really need to change it, there are some situations in which it may come in handy to understand how it affects the generated data.
First, it’s important to note that the parameter exists only for a single purpose: occlusion data size optimization. This means that if your occlusion data size is OK, you should probably just disregard backface threshold altogether. Second, the value is interpreted as a percentage, so a value of 90 means 90% and so on.
OK so what does backface threshold actually do then? Well, imagine a typical scene that consists mostly of solid objects. Furthermore, there may be a terrain mesh whose normal points upwards. Given such a scene, where do you want your camera to be? Well, certainly not underneath the terrain, that’s for sure. Also, you probably don’t want your camera to be inside solid objects either. (Your collision detection normally takes care of that.) These invalid locations are also ones from which you tend to “see” mostly back-facing triangles (although they may of course get backface-culled). So in many cases it’s safe to assume that any location in the scene, from which the camera sees a lot of back-facing triangles, is an “invalid” one, meaning that the in-game camera will never end up in those locations.
The backface threshold parameter helps you take advantage of this fact. By defining a limit of how much back-facing geometry can be seen from any valid camera location, Umbra is able to strip away all locations from the data that exceed this threshold. How this works in practice is that Umbra will simply do random-sampling in all cells (see the previous post) by shooting out rays, then see how many of those rays hit back-facing triangles. If the threshold is exceeded, the cell can be dropped from the data. It’s important to note that only occluders contribute to the backface test, and the facing of occludees doesn’t bear any relevance to it. A value of 100 disables the backface test altogether.
So, if you define the backface threshold as 70, for instance, to Umbra this means that all locations in the scene, from which over 70% of the visible occluder geometry doesn’t face the camera, can be stripped away from the occlusion data, because the camera will never end up there in reality. There’s naturally no need to be able to perform occlusion culling correctly from underneath the terrain, for instance, as the camera won’t be there anyway. In some cases, this may yield pretty significant savings in data size.
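The sampling test described above can be sketched in a few lines. This is a toy model of the mechanics as explained here, not Umbra’s implementation; the hit probabilities are invented for illustration:

```python
# Toy sketch of the backface-ratio test: shoot random rays from a cell, count
# how many first hit a back-facing occluder triangle, and drop the cell if the
# hit ratio exceeds the threshold percentage. (Assumed mechanics, not Umbra's
# actual code.)

import random

def cell_is_dropped(hits_backface, n_rays, threshold_percent, seed=0):
    """hits_backface(rng) -> True if one random ray from inside the cell
    first hits a back-facing occluder triangle."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_rays) if hits_backface(rng))
    return 100.0 * hits / n_rays > threshold_percent

# A cell buried under the terrain: ~95% of rays hit back faces.
under_terrain = lambda rng: rng.random() < 0.95
# A valid open-air cell: only ~10% of rays do.
open_air = lambda rng: rng.random() < 0.10

print(cell_is_dropped(under_terrain, 1000, 70))  # True -> stripped from data
print(cell_is_dropped(open_air, 1000, 70))       # False -> kept
```

With a threshold of 70, the under-terrain cell is stripped while the open-air cell survives, which is the data-size saving the parameter is after.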
It’s important to stress that stripping away these locations from the occlusion data means that occlusion culling is undefined in these locations. “Undefined”, in this context, means that the results may be correct, incorrect (pretty much random) or return an error. In the case of an error, all objects are simply frustum culled.
Of course in some cases, there just happens to be some amount of back-facing geometry in valid camera locations too. There may be a one-sided mesh that has been, possibly erroneously, tagged as an occluder. If it’s a large one, it may cause the backface test to trigger in nearby areas, resulting in culling artifacts (= errors). This is why the default value of backface threshold in Unity is 100, meaning the feature is disabled by default.
Feel free to experiment with the parameter. Try reducing the value to 90, which should drop a lot of data underneath terrains, for example. See whether it has any noticeable effect on the occlusion data size. You can go even lower if you want; just remember to do so at your own risk. If you start running into rendering artifacts, increase the value back to 100 and see if that fixes the problems.
To be continued…
In the next post, I’ll go into some best practices and recommendations for how to get optimal results out of occlusion culling. Please visit www.umbrasoftware.com for more information about Umbra.
So you’ve decided to download and check out this newfangled “Unity” thing. You open it up, create a project, and are presented with the Unity editor. At this point you’re probably thinking to yourself, “OK, now what?” That’s where we come in. We are the Unity Learn team and our goal in this crazy, hectic world is to labor tirelessly to bring you the best content to learn from and utilize in your games. Of course, now you are probably thinking to yourself, “That’s an interesting claim, faceless-internet-article-writing-man, but certainly your content can’t help me. I make special games!” Nay, I say to you. Our content can help anyone learn to use Unity to make all kinds of games. Just look at what we have to offer (the Learn portion of Unity’s website can be found at Unity3d.com/Learn).
Why Can’t I Hold All These Tutorials?
You like tutorials. We get that. That’s why we’ve made a bazillion of them and we keep adding more. We get that you’re too busy to read all sorts of words. That’s why our tutorials are video tutorials in stunning high definition! You never have to guess where an option is or what a button looks like. Just grab a bowl of popcorn and your favorite caffeinated beverage, queue up our videos, and follow along. You’ll be a pro in no time (warning: watching videos actually takes some time due to them not existing inside a black hole)!
We’ll Do It Live!
Come check out our newest initiative: Live Training! That’s correct, come hang out with a real person as we teach concepts and answer questions in real time! Right now you’re thinking, “Blargpfpfpf” (that’s the sound of your brain exploding). Not only do we deliver content live, we actually really enjoy it! Come check out what we have scheduled. You can also watch past sessions to see just what you’re getting yourself into. We look forward to seeing you there. I mean you specifically, John (your name is probably not John, but if it is, I just blew your mind).
Perhaps you’re the type of person that prefers to drudge through tons of documentation. Never fear, we have that too. Chances are most of your questions and curiosities can be resolved simply by flipping through the posted articles and examples. So feel free to dive right in. There’s nothing wrong with being a lone wolf. We won’t tell anyone.
That’s What Friends Are For
It can be very frustrating when you have a specific question that no resources seem to address directly. Luckily, Unity has a fantastic community. While not directly controlled by the Learn team, it’s still on our page and so it will be mentioned here (and there’s nothing you can do about it). Stop in and say “Hi” to fellow Unity users in the forums. Also, swing by the Unity Answers section to ask specific questions and get an amazing level of help! We’re all in this together!
TL;DR (Too Long; Didn’t Read)
The Unity Learn team is a giant collection of superhuman awesome-sauce. We are here to provide you with content and to lower the barrier to entry into game development. We recognise that you would rather make games than learn to make games, so let us expedite the process for you. Swing by our live sessions or check out our pre-recorded videos to start learning.
Got questions, comments, suggestions, snide remarks, or personal anecdotes? Hit us up on twitter:
@willgoldstone (He’s like a British Ryan Gosling)
@theantranch (a.k.a Little Angel)
@robotduck (He is an actual duck… no kidding)
@rodulator (Scottish and angry, best leave him alone)
Now go and make games. We expect to see progress!
Physically-based surface shaders in the Asset Store
So… physically-based surface shaders. Physically-based what? I hear you say. Even though we consider ourselves a company with its finger firmly on the gaming pulse, some of us had to have this explained (not the graphics team I hasten to add).
Physically-based surface shaders are a revolutionary (and really cool) way of doing shading that generates far more realistic and vibrant results. What distinguishes them from other shaders is that they actually mimic the behaviour of surfaces in the real world by following the law of energy conservation: the total amount of specular and diffuse reflection always equals the amount of incoming lighting energy.
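A minimal numeric illustration of that energy-conservation rule (this is a common technique in energy-conserving shaders generally, not Alloy’s actual shader code): the diffuse term is scaled down by whatever fraction of the energy the specular term already reflects, so the two together can never exceed the incoming light:

```python
# Sketch of energy conservation between diffuse and specular reflection:
# diffuse albedo is attenuated by the energy the specular term takes,
# so total reflected energy never exceeds incoming energy.

def energy_conserving_diffuse(diffuse_albedo, specular_reflectance):
    return tuple(d * (1.0 - specular_reflectance) for d in diffuse_albedo)

albedo = (0.8, 0.6, 0.4)   # a fairly bright diffuse colour
specular = 0.5             # half the energy reflects specularly

diffuse = energy_conserving_diffuse(albedo, specular)
total = max(diffuse) + specular  # worst-case channel

print(diffuse)           # (0.4, 0.3, 0.2)
print(total <= 1.0)      # True: no channel reflects more light than arrives
```

This is why such materials hold up under any lighting: they can never “invent” energy, no matter how bright or dim the scene is.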
The huge advantage of physically-based surface shaders isn’t just the exceptional visual fidelity they achieve, it’s that they look incredibly lifelike irrespective of the lighting conditions under which they’re used. Steel will look like steel whether it’s reflecting bright sunlight off a knight’s breastplate or used in a handrail on a dark and dreary New York backstreet. There’s no need to adjust your shader settings to get things looking ‘right’ when authoring different environments.
Physically based surfaces can have a wide range of reflective behaviour, making it easier to give shiny plastic, metal, wet stone and everything in between a unique visual identity. The Alloy Physical Shader Framework by RUST LTD, available now on the Unity Asset Store, brings this technology to Unity Pro.
Physically-based surface shaders are finding their way into many of the latest AAA productions like Killzone 3, Metal Gear Solid 5 and RYSE. RUST LTD’s Asset Store product is different in that it makes this technology accessible to small teams – or even individuals. By the way, RUST LTD are the team behind Museum of the Microstar – our DX11 contest winning entry. Museum of the Microstar used an earlier version of Alloy and showed its use scaled to intense heights.
The workaholic perfectionists at RUST LTD have spent 2 years working day and night to get the product to the point where they’re happy enough to release it, and somehow they’ve also found the time to come up with lots of learning material to help you guys get started. The Alloy package ships with a comprehensive sample scene, including over 30 complete materials with interactive controls so you can get up to speed fast on how to get the most out of the set. Also, if you find yourself needing even more out of Alloy, the set is configured to make extending it with your own variants as easy as possible.
Alloy is built around a Normalized Blinn-Phong BRDF, which provides a great balance of speed and sophistication, and is fully compatible with DX9 and the light pre-pass renderer of Unity’s deferred mode. The Alloy set contains variants for all the common visual characteristics you might need including rim lighting, separated occlusion maps, detail maps, and several types of translucency.
Lastly, for those who don’t want the headache of managing cubemaps in complex environments, Alloy includes a complete set that uses Radially Symmetric Reflection Maps (RSRMs), a general-purpose blurred reflection texture that makes baked surfaces look shiny, without looking like perfect mirrors.
In an upcoming post we’ll dive deeper into both the technical details of Alloy, and the implications of developing content for a physically-based pipeline… stay tuned!
So, whether you’re a large team looking for that AAA graphics punch for your next-gen console game, or an indie trying to add wow factor, Alloy is ready for you. Check out these downloadable projects and demos made with Alloy, or simply purchase directly from the Asset Store.
We want to enable you to search our known bugs easily, so you can identify problems in your games and find workarounds faster. As a user, you can also comment on bugs, vote on bugs that are important to you, suggest workarounds you found or add information. We will use this information in our ongoing prioritization of bugs, so you can have a very real impact on the future of Unity.
When you have logged in with your Unity Developer Network account, you will have 10 votes you can place on active bugs. If a bug is resolved or in other ways removed, the vote is returned to you. You can always see a list of your own votes on the site.
Privacy has been a huge concern for us, so ANY bug which reveals ANY information about the submitter will NOT be made public. It is a manual task for QA to mark bugs public, and the default is that a bug is NOT public. Likewise, we are not sharing projects, logs, attachments or communication on any bug, even though it might help others. We simply can’t risk anyone’s privacy being invaded.
The bugs you will see on the site are all confirmed internally in Unity. They are reproducible and acknowledged. If you have sent a report to us, it will not be visible before we have processed it, so it may not even become public, depending on how we decide to handle it. The fields we use are set specifically by us, so no part of what you write will be public without editing and your email address is nowhere on the site.
We are very excited about opening up this new way to communicate with you. We hope it will result in an even tighter and more relevant prioritization of bugs. Enjoy, and use your voting power!
Read this featured interview with Roadhouse Interactive’s CEO, James Hursthouse, on how they have taken advantage of the Unity Asset Store ecosystem to discover and integrate GameAnalytics. The GameAnalytics tool brought Roadhouse Interactive closer to their players and enabled them to get deep insights into their game performance with a data-driven approach to game development.
Tell us a bit about Roadhouse Interactive.
Roadhouse is an independent game developer based in Vancouver BC. Our focus is very much on free to play, but we’re always looking at other models that make sense.
Our talented team has years of development experience on multiple platforms, so we’re well equipped to create great games no matter the genre, from innovative social casino titles such as Trophy Bingo to established IP titles such as Warhammer 40,000: Carnage.
You may have also noticed that our Freeride: Ski Cross title was announced in Unity’s own Unity Games publishing initiative.
We did indeed notice Freeride: Ski Cross. You are one of the first studios to join the Unity Game publishing effort. Can you tell us more about this move?
Freeride: Ski Cross is the brainchild of Ian Verchere (SSX Tricky) and Jay Balmer (Skate) and is one of the first serious entries into the action sports category on mobile and tablet. The idea of blending free to play with high quality action sports has been germinating at Roadhouse since the early days of the company, and we’re happy that we’re partnering with Unity on this first iteration.
You mentioned that most of your games follow the freemium model. How do you usually monetize them?
We want to make sure that, first and foremost, our games are fun and rewarding for the gamer. That’s essential. While it’s possible to play through our games without needing to purchase anything, we also offer a number of options that enhance or accelerate the experience and give players who maybe don’t have huge amounts of time to devote to the game a little bit of help keeping up with their buddies.
This generally comes in the form of power-ups, equipment and other boosts that add to the depth of gameplay. We’re always keen to offer true value for the money, so if we do charge for extras, we make sure they’re worth it.
How did you discover GameAnalytics? Through the Unity Asset Store?
Actually, it was initially through a visit to Roadhouse by people on the GameAnalytics team. We’d also seen it available in the Unity store and finally decided to give it a try. We did an in-depth review of all of the tools available out there and found GameAnalytics to best complement the other tools we use on the game operations side of the business.
What do you usually track with GameAnalytics?
Everything from seeing where people are spending money to what they are doing in-game, where there are roadblocks, etc. Of course, we are also looking at retention and how we keep our customers coming back.
What’s your favorite GameAnalytics tool? Cohorts, Funnels, Heatmaps etc.?
Personally, we believe that cohort analysis is fascinating. It is providing us with some great insights that we’re continually taking on board and reacting to.
Also, the power of having access to our core game metrics out of the box with GameAnalytics is key.
For example, we usually look at session lengths and returning user data to get a general idea about user engagement.
Note: All funnels and charts have been anonymized.
With our custom events, we also rely a lot on funnels. They allow us to quickly test and isolate each tutorial flow, to see exactly where people drop out. This indicates exactly which tutorial steps are interesting enough to keep the players hooked and which steps are redundant. Looking at the lengths between the steps also offers great feedback on how smooth the learning curve is.
Did using GameAnalytics give you any pleasant surprises about your games?
We’ve learned a ton about our games – and have enhanced the experiences based on what we learned. As an example I can share that in one of our recently launched games, during beta testing, we were able to understand that ‘free’ items available in the store were not being collected; and they were essential to the gamers’ in-game success. Through GameAnalytics we identified this problem, and amended the game flow accordingly.
You probably saw this one coming, but how is development life after starting to use GameAnalytics?
I think as a group, we’re closer to our players than ever before – we know what features are resonating, where we’re making money directly, where it makes sense to look at alternative forms of revenue generation and so on.
Ultimately, data is at the core of the organization, and it informs our decision making for each game on an ongoing basis. A good analytics platform in the hands of talented game designers can be a very powerful tool to make the games more appealing. And today, more than 15 people from our company log in to GameAnalytics every day as one of the tools we use to check KPIs on everything.
How would you describe the process of integrating GameAnalytics into your Unity games?
Our team is pretty familiar with the tools that Unity provides, so it was fairly easy. For us, the deep integration between GameAnalytics and Unity meant that we were up and running with 45+ important metrics in a few hours of development time.
Do you have any advice for developers who are new to GameAnalytics?
I think one key piece of advice is to use analytics but not necessarily to react to every little change on a day by day basis. The tendency is to want to ‘fix’ immediately, but it sometimes makes sense to wait for a while to make sure what you are seeing is really a trend, and not a blip.
More about GameAnalytics
There you have it. So if you want to start understanding everything about your game, from player acquisition and engagement to in-game monetization, remember that the GameAnalytics SDK for Unity is right here in the Asset Store: https://www.assetstore.unity3d.com/#/content/6755
Be sure to stop by the GameAnalytics blog, where they share a wealth of insights and key learnings on data analysis in games, from in-depth introductions to the basics to further reading on optimizing virtual economies. It’s where leading industry experts and researchers contribute on a regular basis.
More about Roadhouse Interactive:
Roadhouse is an independent game developer and operator based in Vancouver BC. The company started in late 2009, initially focusing on high quality browser-based games. Recently, it has been accelerating its efforts on the mobile and tablet platforms and expects to publish four to five games on the iOS and Android market by early 2014.
Games in their Portfolio include:
Freeride: Ski Cross, Warhammer 40,000: Carnage, Trophy Bingo, Elemental Power, Mechwarrior Tactics, UFC: Undisputed Fight Nation, Family Guy Online
The following post was written by Jean-Christophe Cimetiere, Director, Partner Developer Marketing at Microsoft, to update the Unity community about the latest resources and news for Windows Phone and the Windows Store.
This past June, we announced a collaboration with Unity, through which Unity developers can publish their apps on Windows platforms. By porting your apps to Windows and Windows Phone, you can expand your market reach to millions of users with minimal effort.
Unity Developer Momentum on Windows
I’m thrilled to see the momentum you’ve demonstrated in launching your apps on Windows platforms. Since the release of Unity 4.2 we’ve seen exponential growth of apps made with Unity for Windows Store and Windows Phone; you have submitted more than one thousand Unity apps—and counting.
”This milestone is a powerful reminder of the incredible flexibility of the Unity technology and developer community,” said David Helgason, CEO, Unity Technologies. “The Windows Store apps add-on for Unity was just released in July and there are already over 1000 games powered by Unity available on Windows Phone 8 and Windows Store! That’s a mind-blowing achievement realized in less than four months.”
Thanks to your feedback, Unity continues to provide improved support for Windows platforms in the latest version, Unity 4.3, which you can download here.
The Porting labs: what happened…
As you know, Unity is designed to operate cross-platform and ensures a smooth recompiling experience; still, each platform is unique. So, in order to set you up for success in porting existing apps to Windows platforms, we gathered a crew of technical experts, and headed to the Unite 2013 event in Vancouver, British Columbia, where we worked hands-on with developers for three days in our Porting Lab.
Learning from experience: Porting Lab at Unite 2013
Our goal in hosting the Porting Lab was simple: to help you quickly port as many apps / games as possible and learn what we could do to make the process even easier.
And what we learned
For most, recompiling code was easy, but we also learned about the various aspects of porting that require extra care and attention. If you’ve gone beyond using standard Unity features, you may find extra work in incorporating your special plug-ins or APIs. For example, if you want to integrate In-App Purchase, use the Microsoft Ad Control, or manage the ‘back’ button, you’ll need to customize.
By the end of the lab, we had helped developers port about 80 apps from iOS, Android, and other platforms to Windows Phone or Windows Store.
New technical resources
To further empower you, we’ve developed technical guides that explain how to bring Unity apps to Windows Store and Windows Phone, including specifics of the platforms, what you need to get started, how to address platform-specific edge cases, and how to address roadblocks you may encounter as you build on Windows. The content will continue to evolve as we get more feedback from you.
Get Started: Join a porting lab or check the porting guides
Our porting lab was so successful that we are now working with Unity to host similar events in additional cities around the world.
“It’s been a pleasure to work with Microsoft in their efforts to help developers get their games up and running for Windows Store and Windows Phone 8,” said Carl Callewaert, Unity Evangelist, Unity Technologies. “The live porting labs have been a huge success, so much so that merely days after meeting Tic Toc Games at a recent porting event, I was playing their game on my Windows Phone 8. The new online porting guides will put these excellent resources in the hands of those unable to attend live events.”
Below is an initial schedule.
- London, UK: November 29, 2013
- San Francisco, US: December 13-14, 2013
- Finland (Kajaani, Helsinki, Tampere, Turku, Jyväskylä) January 13 to 17, 2014
Check for updates on the schedule here.
Watch this video to hear what developers had to say about their porting experiences:
Who won $100,000 in cash and prizes?
The response to the contest for the best Unity-authored content for Windows Store and Windows Phone was overwhelming. On November 14, we announced the 15 talented winners.
Take a look at the great projects.
Last but not least, I am going to leave you with advice from developers we interviewed during the Unite 2013 event. Watch this to hear what they have to say about becoming a game developer.