Content Pipeline

One of the issues MonoGame Mac (and Linux) developers face is not being able to build shaders. Getting a good HLSL shader compiler to work on a non-Windows platform is tricky. Work is under way to get this done, but it will take some time to get right.

That leaves Mac (and Linux) developers a bit stuck. Well, there is a solution: a custom Content Pipeline processor. This is where the Content Pipeline can show its pure awesomeness.

The Problem

Let's break it down: we have a shader we want to build, but we MUST build it on a Windows box. One way is to do it manually, but doing stuff manually is dull. Rather than opening a virtual machine and copying compiled .xnb files about, I wrote a pipeline extension. Its goal is to take the shader code, send it to the Windows box, compile it there, and send the result back to be packaged into an .xnb.

The Solution

MonoGame has a tool called 2MGFX. This is the underlying tool which takes HLSL .fx files and compiles them for DirectX or OpenGL (GLSL) platforms. So what I did was create a service which just shells out to that tool and collects the compiled code (or the errors). We then return the results and use the existing packaging process to produce the .xnb file, or throw an error. Then I went one step further and hosted the service in Azure, which saves me having to boot my VM each time I want to compile a shader.
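
To give a feel for what the service involves, here is a minimal sketch of such an endpoint. This is my own illustration, not the actual Infinitespace Studios code: it assumes an ASP.NET Web API controller, and the profile mapping and temp-file handling are assumptions. The Data and Result shapes mirror the client code shown below.

using System.Diagnostics;
using System.IO;
using System.Web.Http;

public class Data { public string Platform { get; set; } public string Code { get; set; } }
public class Result { public string Error { get; set; } public byte[] Compiled { get; set; } }

public class EffectController : ApiController
{
	public Result Post (Data data)
	{
		// Write the incoming HLSL source to a temporary .fx file for 2MGFX to read.
		var fx = Path.GetTempFileName () + ".fx";
		var output = Path.ChangeExtension (fx, ".mgfx");
		File.WriteAllText (fx, data.Code);
		// Assumed profile mapping: DirectX for Windows targets, OpenGL for everything else.
		var profile = data.Platform.Contains ("Windows") ? "DirectX_11" : "OpenGL";
		var psi = new ProcessStartInfo ("2MGFX.exe",
			string.Format ("\"{0}\" \"{1}\" /Profile:{2}", fx, output, profile)) {
			UseShellExecute = false,
			RedirectStandardError = true
		};
		using (var proc = Process.Start (psi)) {
			var errors = proc.StandardError.ReadToEnd ();
			proc.WaitForExit ();
			if (proc.ExitCode != 0)
				return new Result { Error = errors };
		}
		// Send the compiled effect bytes back for packaging into an .xnb.
		return new Result { Compiled = File.ReadAllBytes (output) };
	}
}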

The resulting processor code for this is quite simple. The new class derives from EffectProcessor, and if the processor is running on Windows it just falls back to the default EffectProcessor code. That means you can use the same processor on Mac and Windows.

One restriction at this time is that the .fx file needs to be self contained. In other words you cannot use #include directives or have code in external files. One thing I could do is plug in the MonoGame Effect pre-processor to pull all of those includes into one file, but that is a job for the future (or a PR 🙂 )

If you want to take a look at all the code you can find it here.

The Code

		// JsonSerializer/JsonDeSerializer and the Data class are helpers defined in the extension.
		public override CompiledEffectContent Process (EffectContent input, ContentProcessorContext context)
		{
			// On Windows, fall back to the built-in EffectProcessor.
			if (Environment.OSVersion.Platform != PlatformID.Unix) {
				return base.Process (input, context);
			}
			var code = input.EffectCode;
			var platform = context.TargetPlatform;
			var client = new HttpClient ();
			client.BaseAddress = new Uri (string.Format ("{0}://{1}:{2}/", Protocol, RemoteAddress, RemotePort));
			// Post the effect source and target platform to the remote compiler service.
			var response = client.PostAsync ("api/Effect", new StringContent (JsonSerializer (new Data () {
				Platform = platform.ToString (),
				Code = code
			}), Encoding.UTF8, "application/json")).Result;
			if (response.IsSuccessStatusCode) {
				string data = response.Content.ReadAsStringAsync ().Result;
				var result = JsonDeSerializer (data);
				if (!string.IsNullOrEmpty (result.Error)) {
					throw new Exception (result.Error);
				}
				if (result.Compiled == null || result.Compiled.Length == 0)
					throw new Exception ("There was an error compiling the effect");
				// Hand the compiled bytes back to the pipeline to be packaged into an .xnb.
				return new CompiledEffectContent (result.Compiled);
			} else {
				throw new Exception (response.StatusCode.ToString ());
			}
		}

Pretty simple code, isn't it? At some point I'll see if we can replace the .Result calls with async/await, but I'm not entirely sure how the Pipeline will respond to that.
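
For what it is worth, an async version might look something like the sketch below. This is untested; it reuses the same JsonSerializer/JsonDeSerializer helpers and Data class as above, and since Process itself is synchronous the pipeline would still have to block on the returned Task somewhere:

		async Task<CompiledEffectContent> ProcessRemoteAsync (EffectContent input, ContentProcessorContext context)
		{
			using (var client = new HttpClient ()) {
				client.BaseAddress = new Uri (string.Format ("{0}://{1}:{2}/", Protocol, RemoteAddress, RemotePort));
				var body = new StringContent (JsonSerializer (new Data () {
					Platform = context.TargetPlatform.ToString (),
					Code = input.EffectCode
				}), Encoding.UTF8, "application/json");
				// await rather than blocking on .Result.
				var response = await client.PostAsync ("api/Effect", body);
				if (!response.IsSuccessStatusCode)
					throw new Exception (response.StatusCode.ToString ());
				var result = JsonDeSerializer (await response.Content.ReadAsStringAsync ());
				if (!string.IsNullOrEmpty (result.Error))
					throw new Exception (result.Error);
				if (result.Compiled == null || result.Compiled.Length == 0)
					throw new Exception ("There was an error compiling the effect");
				return new CompiledEffectContent (result.Compiled);
			}
		}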

Using InfinitespaceStudios.Pipeline

Using this extension could not be easier.

If you want to use the default service:

  1. Open your project and find the Packages folder. Right click and select Add Packages.
  2. This will open the NuGet search dialog. Search for “InfinitespaceStudios.Pipeline” and add the package.
  3. Once the package has been added, open the Content.mgcb file in the Pipeline Editor.
  4. Select the “Content” node, then find the References property in the property grid. Double click the References property to bring up the Add References dialog.
  5. Search for “InfinitespaceStudios.Pipeline.dll” and add it by clicking the “Add” button. Note this should be located in the “packages\InfinitespaceStudios.Pipeline.X.X.X\Tools” folder. Once that is done, save the Content.mgcb, then close and re-open it (there is a bug in the Pipeline Tool). Then select the .fx file you want to change.
  6. Select the Processor property and in the drop down you should see “Remote Effect Processor – Infinitespace Studios”. Select this item.
  7. If you are using the defaults, just save the Content.mgcb, close the Pipeline Tool, then build and run your app. It should compile without any issues. If there is a problem with the .fx file the error will be reported in the build log.

If you are using a custom Azure site or the local service on a Windows box, you can use the RemoteAddress, RemotePort and Protocol properties to change the location of the server. Valid Protocol values are “http” and “https” (if you have a secured service). The RemoteAddress can be a CNAME or an IP address.
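
For illustration, the entry the Pipeline Editor writes into the Content.mgcb would look something like this. The processor and parameter names here are a sketch based on the property names above, so check what the editor actually writes out:

#begin Shaders/MyShader.fx
/importer:EffectImporter
/processor:RemoteEffectProcessor
/processorParam:Protocol=https
/processorParam:RemoteAddress=myserver.example.com
/processorParam:RemotePort=443
/build:Shaders/MyShader.fx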


Conclusion

Hopefully this post has shown you what a cool thing the Pipeline system is. One of my future posts will be about creating a pipeline extension from scratch, so if you are interested watch out for it. In the meantime, if you are a Mac user: get compiling those shaders!

In the past it might seem that Windows users of MonoGame get all the cool stuff, and Mac / Linux users are left out in the cold. To be honest, for a while that was true, but the great news is that more recently that has begun to change. Ongoing community efforts have resulted in both MacOS and Linux installers which will download and install templates into Xamarin Studio and MonoDevelop. They also install the Pipeline tool, which is a GUI you can use to build your content for your game.

All that was great, but again Windows had something that Mac and Linux developers just didn't have access to: automatic content building. This is where you just include the .mgcb file in your project, set its Build Action to “MonoGameContentReference”, and provided you created the project via one of the templates it would “just work”. Your .xnb files would appear as if by magic in your output directory, without all that messy manual linking of .xnbs.

So how does it work? Well, to fully understand we need to dig into MSBuild a bit 🙂 I know recently I've been talking about MSBuild a lot, but that's because in my day job (@Xamarin) I'm dealing with it A LOT! So it's topical from my point of view 😉

If you dig into a csproj created in Visual Studio via one of the MonoGame templates, you will see a number of things. The first is a <MonoGamePlatform> element. This element is used later to tell MGCB (the MonoGame Content Builder) which platform it needs to build for. Next up is the <MonoGameContentReference> element, which contains a link to the .mgcb file. This again is used later to tell MGCB which files to build. Note that you are not limited to just one of these. If you have multiple assets at different resolutions (e.g. @2x stuff for iOS) you can have a separate .mgcb file for those and include that in your project. The system will collect ALL the outputs (just make sure they build into different intermediate directories).
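
For reference, those elements look something like this in the csproj (the platform value and .mgcb paths are placeholders):

<PropertyGroup>
  <MonoGamePlatform>iOS</MonoGamePlatform>
</PropertyGroup>
<ItemGroup>
  <MonoGameContentReference Include="Content\Content.mgcb" />
  <MonoGameContentReference Include="Content\Content@2x.mgcb" />
</ItemGroup>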

The last piece of the system is the “MonoGame.Content.Builder.targets” file. This is the core of the system, and you should be able to see the Import for it near the bottom of your .csproj. This .targets file is responsible for going through ALL the MonoGameContentReference items in the csproj and calling MGCB.exe for each of them to build the content. It will also pass

/platform:$(MonoGamePlatform)

to the .exe so that it will build the assets for the correct platform. This is all done in the BeforeBuild MSBuild event, so it happens before the code is even built, just like the old XNA content references used to do. But this time you don't need to do any fiddling to get it to work on a command line; it will just work. Now, calling an .exe during a build from a .targets file isn't exactly magic. The magic bit is right here:


<Target Name="BuildContent" DependsOnTargets="Prepare;RunContentBuilder"
        Outputs="%(ExtraContent.RecursiveDir)%(ExtraContent.Filename)%(ExtraContent.Extension)">
  <CreateItem Include="$(ParentOutputDir)\%(ExtraContent.RecursiveDir)%(ExtraContent.Filename)%(ExtraContent.Extension)"
              AdditionalMetadata="Link=$(PlatformResourcePrefix)$(ContentRootDirectory)\%(ExtraContent.RecursiveDir)%(ExtraContent.Filename)%(ExtraContent.Extension);CopyToOutputDirectory=PreserveNewest"
              Condition="'%(ExtraContent.Filename)' != ''">
    <Output TaskParameter="Include" ItemName="Content" Condition="'$(MonoGamePlatform)' != 'Android' And '$(MonoGamePlatform)' != 'iOS' And '$(MonoGamePlatform)' != 'MacOSX'" />
    <Output TaskParameter="Include" ItemName="BundleResource" Condition="'$(MonoGamePlatform)' == 'MacOSX' Or '$(MonoGamePlatform)' == 'iOS'" />
    <Output TaskParameter="Include" ItemName="AndroidAsset" Condition="'$(MonoGamePlatform)' == 'Android'" />
  </CreateItem>
</Target>

This part is responsible for adding the resulting .xnb files to the appropriate item group for the platform we are targeting. So in the case of a desktop build like Windows or Linux we use Content, for iOS and Mac we use BundleResource, and for Android we use AndroidAsset. Because we do this just before the build process, when those target platforms actually build later they will pick up the items we added, in addition to any other items the projects themselves included.

Now the really interesting bit is that the code above is not how it originally looked. The problem with the old code was that it didn't work with xbuild, which is what is used on Mac and Linux, so it just wouldn't work there. But now the entire .targets file runs quite happily on Mac and Linux, and it has in fact been included in the latest unstable installers. So if you want to try it out, go and download the latest development installers and give it a go.

If you have an existing project and you want to upgrade to use the new content pipeline system you will need to do the following (a combined snippet is shown after the list):

  1. Open your application's .csproj in an editor.
  2. In the first <PropertyGroup> section add <MonoGamePlatform>$(Platform)</MonoGamePlatform>, where $(Platform) is the system you are targeting, e.g. Windows, iOS, Android.
  3. Add the following lines right underneath the <MonoGamePlatform /> element:
    <MonoGameInstallDirectory Condition="'$(OS)' != 'Unix' ">$(MSBuildProgramFiles32)</MonoGameInstallDirectory>
    <MonoGameInstallDirectory Condition="'$(OS)' == 'Unix' ">$(MSBuildExtensionsPath)</MonoGameInstallDirectory>
  4. Find the <Import/> element for the CSharp (or FSharp) targets and underneath it add <Import Project="$(MSBuildExtensionsPath)\MonoGame\v3.0\MonoGame.Content.Builder.targets" />
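
Putting those steps together, the relevant parts of the upgraded .csproj end up looking something like this (Windows is just an example platform value):

<PropertyGroup>
  <MonoGamePlatform>Windows</MonoGamePlatform>
  <MonoGameInstallDirectory Condition="'$(OS)' != 'Unix' ">$(MSBuildProgramFiles32)</MonoGameInstallDirectory>
  <MonoGameInstallDirectory Condition="'$(OS)' == 'Unix' ">$(MSBuildExtensionsPath)</MonoGameInstallDirectory>
</PropertyGroup>
...
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Import Project="$(MSBuildExtensionsPath)\MonoGame\v3.0\MonoGame.Content.Builder.targets" />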

Providing you have the latest development release, this should all just work. So if you have an old project, go ahead and give it a try; it's well worth it 🙂

The MonoGame team have been putting a lot of effort into a cross platform content pipeline. But given that for the most part we support loading native assets like .png, .mp3 and .wav, why bother? Well, it all boils down to a couple of words: performance and efficiency. Let's look at an example. Graphics are probably the biggest assets a game uses, and they are also a major resource hog. Textures will probably take up most of the room in your deployment, and will be taking up most of the memory on your device as well.

Textures

So let's say we have a 256×256 32 bit .png texture we are using in our game. We don't want to bother with all this compiling to .xnb rubbish that people do, so we just use the texture as a raw .png file. On disk .png is very impressive in its size; that image probably only takes up 2-5 KB, keeping your application package size down. Great!

Now let's go through what happens when we load this .png from storage on a device (like an iPhone). First it is loaded from storage into memory and decompressed/unpacked from its compressed .png format into raw bytes. This is done because the GPU on your device doesn't know how to use a .png image directly; it can only use certain types of compression. So we unpack the image into memory, which takes 262,144 bytes (256×256×4, the ×4 because we have 1 byte per channel for Red, Green, Blue and Alpha). Note that 262 KB is quite a bit bigger than the compressed size. The next thing to do is create a texture for that data, and because your device can't compress on the fly (yet) it has to use that data as is. So in creating the texture we used 262 KB of graphics memory on the GPU. That doesn't sound too bad, but if you are using larger textures, say 1024×1024, then you are using 4 MB of GPU memory for that one texture. Multiply that over the number of textures in your game and you soon run out of texture memory on the GPU. Then the GPU has to swap that data out into system memory (if it supports that) or throw an error when you try to create textures that won't fit into available memory. So to sum up:

.pngs = smaller package size & higher memory usage & fewer textures

Now let's look at a texture pre-processed using the content pipeline. Because we know we are targeting iOS, we know the GPUs on those devices support PVRTC texture compression directly. So let's take our sample .png and compress it using PVRTC. What we end up with is a 32 KB file (the size depends on the texture, alpha channel, etc.). Hmm, that is a lot bigger than the .png on disk, but that is not the whole story. The difference is there is no need to unpack/decompress it, which saves on load time, and we can create a texture from that data directly, so we only use 32 KB of texture memory on the GPU, not 262 KB. That is a massive saving.

compressed textures = larger package size (maybe) & lower memory usage & more textures
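
Both of the numbers above are easy to sanity-check. Here is a quick sketch of the arithmetic; the PVRTC figure assumes its 4 bits-per-pixel mode:

// Raw (uncompressed) RGBA8 texture: 4 bytes per pixel.
static long RawTextureBytes (int width, int height)
{
	return (long)width * height * 4;
}

// PVRTC at 4 bits per pixel: half a byte per pixel.
static long Pvrtc4TextureBytes (int width, int height)
{
	return (long)width * height / 2;
}

// RawTextureBytes (256, 256)    == 262,144 bytes (the 262 KB above)
// RawTextureBytes (1024, 1024)  == 4,194,304 bytes (4 MB)
// Pvrtc4TextureBytes (256, 256) == 32,768 bytes (the 32 KB above)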

Now we just looked at iOS, but the same applies to desktop environments. Most desktop GPUs support DXT texture compression, so the content pipeline will produce DXT compressed textures which can be loaded and used directly. The only platform which is a pain is Android. Because Android does not have consistent support for compressed textures, at the moment MonoGame has to decompress DXT on the device and use it directly. However, even Android will be getting compressed texture support. There is currently a piece of work happening where the Pipeline tool will automatically pick a texture format to use: for opaque textures it will use ETC1 (which is supported on all Android devices but doesn't support alpha channels), and for textures with an alpha channel it will use RGBA4444 (dithered), while also allowing the user to pick manually from a wide variety of compression options such as PVRTC, ATITC, DXT/S3TC, ETC1 and RGBA4444. This will give the developer the choice of what to use/support.

Audio

Now let's look at audio. Different platforms support different audio formats. If you are handling this yourself, you will need to manually convert all your files and include the right ones for each platform. Would a better option not be to keep one source file (be it .mp3, .wma, etc.) and convert it to a supported format for the target platform at build time? OK, it makes for longer build times, but at least we know the music will work. MonoGame uses ffmpeg to do the heavy lifting when converting between formats, as it can pretty much convert any type to any other type, which is really cool.
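
As a concrete example, a song entry in a .mgcb file looks something like this (Mp3Importer and SongProcessor are the stock MonoGame importer and processor names; the path is a placeholder):

#begin Music/theme.mp3
/importer:Mp3Importer
/processor:SongProcessor
/build:Music/theme.mp3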

Shaders

This is an area that causes real pain: custom shaders. There are a number of shading languages you can use depending on the platform you are targeting. For OpenGL based systems that is GLSL, for DirectX based systems it is HLSL, and there is also Cg from NVIDIA. The Effect system in XNA/MonoGame was designed around the HLSL language. It is based on the .fx format, which allows a developer to write both vertex and pixel shaders in one place. Historically both GLSL and HLSL have separate vertex and pixel shaders; HLSL until recently compiled and linked these at build time, while GLSL does this at runtime. Without a content pipeline or some form of tooling, a developer would need to write two shaders, one in HLSL and one in GLSL. The good news is the MonoGame MGFX.exe tool can take a shader in the .fx format and have it work on GLSL platforms. It does this by using an open source library called libmojoshader, which does some funky HLSL to GLSL instruction conversion to create OpenGL based shaders. Rather than doing that at runtime, we do it at build time, so we don't need to ship mojoshader with the OpenGL based games. All this saves you the hassle of having to write and maintain two shaders.
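
On a Windows box, invoking the tool directly looks something like the line below; the OpenGL profile is what triggers the mojoshader conversion (exact tool name and arguments may differ between MonoGame versions):

2MGFX.exe MyEffect.fx MyEffect.mgfx /Profile:OpenGL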

Now the drawback of MGFX is that it only runs on a Windows box at the time of writing. This is because it needs the DirectX shader tooling to compile the HLSL before passing it to libmojoshader for conversion (for OpenGL platform targets). There is a plan in place to create a version of the .fx file format which supports GLSL directly, so people who want to write custom shaders on a Mac or Linux can do so, but this is still under development, so for now you need to use a Windows box.

Models

For the most part the model support in XNA/MonoGame is pretty good. XNA supports .x and .fbx files for 3D models; MonoGame, thanks to the excellent assimp project, supports a much wider range of models including .3ds. However, some of these formats might produce some weirdness at render time; only .fbx has been fully tested. Also note that assimp does not support the very old format .fbx files which ship with most of the XNA samples, so you'll need to convert those to the new format manually. One nice trick I found was to open the old .fbx in Visual Studio 2012+ and then save it again under a new name. Oddly, VS seems to know about .fbx files and will save the model in the new format :).

Now, what happens when you use a 3D model is that it is converted by the pipeline into an optimised internal format which contains the vertices, texture coordinates and normals. The pipeline will also pull out the textures used in the model and put those through the pipeline too, so you automatically get optimised textures without having to do all of that stuff yourself.
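
At runtime the processed model and its textures then load through the normal ContentManager API, for example (the asset name "ship" is a placeholder for whatever your .mgcb builds):

		Model ship;

		protected override void LoadContent ()
		{
			// Loads the processed model (ship.xnb) along with its optimised textures.
			ship = Content.Load<Model> ("ship");
		}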

Summary

So hopefully you've got a good idea of why you should use the content pipeline in your games. Using raw assets is OK when you are putting together a simple demo or proof of concept, but sooner or later you will need to start optimising your content. My advice would be to use the Pipeline tooling from the outset so you get used to it.

Information on the Pipeline tool can be found here.

In a future post I will cover how to produce custom content pipeline extensions for MonoGame, which will allow you to optimise your own content or customise/extend existing content processors.

Until then, Happy Coding.