Steamroll Art Pipeline: From sketch to Blender, from Blender to UE4

Steamroll is the first real video game project I have worked on, so I’ve faced a lot of issues and doubts that I have resolved on the go. In this article I want to share the art pipeline we’ve developed during these months, not with the intention of being a tutorial or a ‘How to’, but a ‘How we did it’. So this post is a basic introduction that aims to show the overall process without going into too much step-by-step detail.

References

The art of Steamroll is being developed mostly with Blender, Photoshop (and Gimp) and Unreal Engine 4, but all the ideas come from pencil sketches in a notebook.

Steamroll is set in a steampunk world full of moving machinery, copper and rusty iron pieces and steam jets everywhere. To achieve an attractive overall look, we gathered tons of reference images from quite different sources: Jules Verne, The Time Machine, Hugo, Final Fantasy VI, Confrontation miniatures, François Schuiten, The Castle in the Sky, Steamboy, old underground pictures, City of Ember, Georges Méliès, The Room, lots of steampunk artifacts and illustrations… even Starcraft 2 or Metal Slug! And each of these was chosen for a different reason (the look, the color palette, the atmosphere, the movement of the machinery, the camera approach, the user interface…) in order to make the visual aspect of the game richer and more interesting.


Concept design

As I said, the design of any element starts with concept sketches, which are led by the needs of game design. The first sketches were done with pencil and then taken into Photoshop to add color, but once the color palette and general tone were established, most of the later concept art was only sketched in pencil and then taken directly into 3D with Blender.


Steamballs, the Scarabeus (the rolling vehicle) and the old mine were the first elements to be designed, but one of my favourites has been the mechanical 3D user interface, and because of this I will focus the article on the making of the main part of the Pressure and Damage Meter (PDM from now on).

I’ve decided to choose an element from this UI because, although the interface is still a work in progress, it complements the other article my programmer colleagues have written about developing the 3D HUD in Unreal Engine 4, which you can read here.

The PDM design

Like any good steampunk invention, the Scarabeus, your vehicle, has its own clockwork meter. It displays the steam pressure you have while you are a rolling ball, and the shot intensity when you are set in a deployment spot and can shoot Steamballs. It also shows how much damage the Scarabeus can still take. With this in mind, the first sketches were drawn and painted, and the PDM’s position on the screen was decided.


The design of the object calls for a round dial with numbered references, a damage indicator arc and a couple of moving hands. With this asset I started modeling without any more detailed sketches, as it is an important part of the game and needed to be up and running as soon as possible. So I opened Blender and got started…

Modeling the hi-poly mesh

At this stage of modeling I didn’t care about the number of polys, just about the visual design and how well it fit the function. I modeled most of the parts separately, especially small pieces like screws; by doing this the topology stays simple and details can be added easily.


Basic shapes and some nice subdivision surface modifiers made the job pretty simple. I left some minor details like the numbers and the radial steps for the texture and normal maps.
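For reference, this is roughly how the same kind of modifier can be added from Blender’s Python console; the object name is just an example, since in the real scene everything was done by hand in the viewport:

```python
import bpy

# Hypothetical object name used only for illustration.
obj = bpy.data.objects["PDM_dial_hipoly"]

# Add a Subdivision Surface modifier to smooth the basic shape.
subsurf = obj.modifiers.new(name="Subsurf", type='SUBSURF')
subsurf.levels = 2          # subdivision level shown in the viewport
subsurf.render_levels = 3   # subdivision level used when rendering/baking
```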

Retopo

The next step was to make things lighter by rebuilding the mesh with fewer polygons. That’s right, retopology time! Retopology, or retopo for friends, is the process in which the topology is reorganized in order to get a clean mesh for animation and, especially in video games, to reduce the poly count while preserving the main shape. In this case, since this is a hard-surface asset and no complicated animations were planned, the retopo was quite simple to do.

[Image: retopology of the PDM]

In Blender I usually keep the hi-poly meshes locked in a separate but visible layer and then, in the active layer, I add a basic shape such as a plane, a cylinder or a cube and start tracing the object by extruding with the snap (to face) tool activated.
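If you prefer setting this up from the Python console, the snapping options live in the scene’s tool settings; here is a minimal sketch of the same setup (Blender 2.7x property names assumed):

```python
import bpy

ts = bpy.context.scene.tool_settings

# Enable snapping so every extruded vertex sticks to the hi-poly surface.
ts.use_snap = True
ts.snap_element = 'FACE'     # snap to the faces of the visible hi-poly mesh
ts.use_snap_project = True   # project individual elements onto the surface
```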

A couple of things I discovered and now do when retopologizing: since one of the goals of retopo is to save polys, I decided not to create polys that are never going to appear on screen, which means most of the back faces of the asset.

Also, I found it easier to do the retopo by parts, so I separated the main parts of the hi-poly object and created their retopo separately as well. This also helps when unwrapping the low poly object, which will already be split up and easier to work with.


Unwrapping & UV Maps

When the low poly mesh was finished I unwrapped it into a 2D UV map. Unwrapping lays the 3D mesh out on a 2D plane so that texture maps can be projected onto it. These texture maps can be color maps, normal maps or specularity maps, for example. In Blender, the Mark Seam tool helps to unwrap and organize the mesh parts in the UV map.

[Image: marking seams]
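As an illustration of the same steps in script form, this minimal sketch marks the currently selected edges as seams and unwraps the mesh (it assumes you are in Edit Mode; the margin value is just an example):

```python
import bpy

# With the cut edges selected in Edit Mode:
bpy.ops.mesh.mark_seam(clear=False)   # Mark Seam on the selected edges

# Select everything and unwrap using the seams we just marked.
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.02)
```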

You can group objects and give some of them a single UV map and others two or more. By joining two meshes with two different UV maps you keep the two material channels separate and can then, in UE4, create two materials for one object and control them independently. So each UV map I created corresponds to a material that will be available later on.

[Image: UV map groups]

In this PDM asset, for example, the round dial has been kept in a separate UV map/material but inside a bigger mesh ‘group’. On the other hand, the moving hands, despite being separate objects, share the same UV map, meaning that the same material works for all of them.

[Image: the PDM UV map]

Baking maps

Once all the UV maps are set it’s time for some bakery. Another common process in 3D creation for video games is baking normal maps. In order to keep all the detail and smoothness we created in the high poly object, we can project its geometric information onto the low poly model and save it in the UV map as a normal map. So I picked the high and low poly objects in pairs and started baking the normal and ambient occlusion maps.

* One of the things we’ve added to the pipeline is triangulating the meshes just before baking. Not doing this has sometimes ended in normal mapping issues in Unreal, because it triangulates the mesh when it is imported. So it’s better if you do it manually before exporting.

In Blender you can find the Bake options at the bottom of the Render tab. There you can select which map you want and the different options depending on the meshes you built. Another free option for baking your normal and AO maps is the xNormal software.

[Image: baked normal and AO maps]
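For completeness, here is roughly what the triangulate-then-bake step looks like when scripted (a sketch only, assuming the Blender 2.7x API, the Cycles baker, and that the bake target image is already set up on the low poly material; the object names are made up for the example):

```python
import bpy

low = bpy.data.objects["PDM_dial_lowpoly"]    # example names, not the real assets
high = bpy.data.objects["PDM_dial_hipoly"]

# Triangulate the low poly mesh before baking, so Blender and Unreal
# agree on the triangulation used to compute the normal map.
low.modifiers.new(name="Triangulate", type='TRIANGULATE')

# Bake from the hi-poly (selected) onto the low poly (active) object.
bpy.context.scene.render.engine = 'CYCLES'
high.select = True
low.select = True
bpy.context.scene.objects.active = low
bpy.context.scene.render.bake.use_selected_to_active = True

bpy.ops.object.bake(type='NORMAL')   # repeat with type='AO' for the occlusion map
```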

The radial steps and the number maps were done in Photoshop by converting height maps into normal and ambient occlusion maps. Check out the xNormal filter options for Photoshop.
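The conversion such a height-to-normal filter performs is conceptually simple: take the slope of the height map in X and Y and pack it into the red and green channels. Here is a tiny NumPy sketch of that idea (an illustration, not the actual xNormal filter):

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Convert a 2D height map (values in 0..1) into a tangent-space normal map."""
    # Slopes of the height field in Y and X.
    dy, dx = np.gradient(height * strength)

    # Build un-normalized normals (-dx, -dy, 1), then normalize per pixel.
    normal = np.dstack((-dx, -dy, np.ones_like(height)))
    normal /= np.linalg.norm(normal, axis=2, keepdims=True)

    # Remap from [-1, 1] to [0, 1] so it can be stored as an RGB image.
    return normal * 0.5 + 0.5
```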

Texturing

I know most triple-A games have a really complicated process for texturing and creating materials, but for the complexity of our project the color texturing pipeline is the following: I open the UV maps in Photoshop or Gimp and paint a first plain color base; then the AO map is set to multiply with low opacity as a reference for the object’s shape and parts; with this in place, I paint some shape details, rust, worn edges, the screws and all the rest.

[Image: texturing process]

In Steamroll we also use a roughness map to create different reflection tones. By keeping the color map in separate layers it’s easy to desaturate it, edit the brightness and create the roughness map.

[Image: roughness map]
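The same desaturate-and-adjust step can be sketched outside Photoshop, for example with Pillow and NumPy (an illustrative sketch with example filenames and values, not our actual maps):

```python
import numpy as np
from PIL import Image

# Load the painted color map (example filename).
color = np.asarray(Image.open("pdm_color.png").convert("RGB"), dtype=np.float32) / 255.0

# Desaturate: a simple luminance-weighted grayscale of the color map.
gray = color @ np.array([0.299, 0.587, 0.114], dtype=np.float32)

# Tweak brightness so worn metal ends up rougher than polished brass,
# then save the result as the roughness map.
rough = np.clip(0.2 + 0.8 * gray, 0.0, 1.0)
Image.fromarray((rough * 255).astype(np.uint8)).save("pdm_roughness.png")
```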

Exporting

Ok, now we have our low poly meshes with their texture maps (color, normal, roughness and ambient occlusion) and we want to bring them to Unreal Engine: time to export!

If this asset were animated, this would be the time to create an armature and move the mesh around, but the PDM’s hands will be animated by code. Anyway, I’ll show the two export settings we use in Blender for static meshes (no armature) and skeletal meshes (armature).

We export meshes in FBX format by selecting the mesh you want to export (and the armature, if there is one) and going to File > Export > Autodesk FBX. (If you don’t have that option, check the Blender add-ons in User Preferences.)

Here are the two export settings. The main difference is the Armature button and the animation checkboxes. It’s also really important to check the Tangent Space option in both cases; this is another Blender-to-Unreal issue we ran into sometimes, and it’s better to calculate the tangent space in the same software that generated the normals.

[Image: export settings]

You can save these presets by clicking the ‘+’ next to ‘Operator Presets’ at the top of the export panel.
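The equivalent export call from Blender’s Python API looks roughly like this (Blender 2.7x FBX exporter options assumed; the file path is just an example):

```python
import bpy

# Export only the selected objects (mesh plus armature, if any) as FBX,
# letting Blender compute the tangent space so it matches the baked normals.
bpy.ops.export_scene.fbx(
    filepath="//export/pdm.fbx",
    use_selection=True,   # export only what is currently selected
    use_tspace=True,      # "Tangent Space": write tangents/binormals into the FBX
)
```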

Importing to Unreal Engine 4

Then in UE4 we have to import all the assets we created: some FBX files for the meshes and some PNGs for the texture maps. With FBX it’s important to check ‘Import Normals and Tangents’ inside the Mesh tab and uncheck ‘Import Materials’ and ‘Import Textures’.

When normal maps are imported we make sure the ‘Flip Green Channel’ checkbox is checked (because Blender and Unreal interpret the green channel of normal maps in opposite ways). Also, Compression Settings must be set to the ‘TC Normalmap’ option. All of this can be found inside the Details tab of each asset. Finally, we set all textures to ‘Never Stream’ due to our level composition and texture complexity.

[Image: import settings]
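‘Flip Green Channel’ simply inverts the G component of the normal map, since Blender bakes Y-up (OpenGL-style) normal maps while Unreal expects Y-down (DirectX-style). The importer does this for you when the checkbox is set; the sketch below only illustrates the operation on an example file:

```python
import numpy as np
from PIL import Image

normal = np.asarray(Image.open("pdm_normal.png").convert("RGB")).copy()

# Invert the green channel: 255 - G flips the Y direction of the normals.
normal[..., 1] = 255 - normal[..., 1]

Image.fromarray(normal).save("pdm_normal_directx.png")
```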

Creating Materials

This is the last step before we throw our little asset into the level or Blueprint editor. Epic has really nice video documentation on its YouTube channel and I highly recommend watching the Material video series.

Unreal Engine 4’s Material Editor is one of its great features and you can do pretty much anything you can imagine with it. In Steamroll we use a really basic level of shader complexity, but I encourage you to write your suggestions for improving Steamroll’s materials!

Any static or skeletal mesh in Unreal has its material ‘channels’ in the Mesh Details tab. This is where one or more materials can be linked to a mesh.

[Image: material channels]

By clicking the ‘Add New…’ button we create new materials for each object and then edit them with the imported texture maps. Inside the Material Editor, we drop in the texture assets and link and edit them according to the needs of each object.

[Image: Material Editor]

Once the materials are created and linked to the mesh, our PDM asset is ready to be used in the ancient art of coding. Remember that you can check out our previous post, where you can see the process we followed to set up this asset as a 3D user interface element.

Last words

That’s all folks! I hope this post gives a general idea of how we work and encourages Blender users to create projects and assets for Unreal and for video games. Keep in mind that this has been a ‘making of’ post about the development of our project; what is described here is just the path we’ve taken, and it can surely be improved. It would be nice if you shared your comments, criticisms and alternatives!

Thank you!

Steamroll’s 3D User Interface


Overview

 

Games normally display information such as scores and inventories and interact with the player through traditional 2D UIs, where all the information is displayed on top of the 3D world using 2D widgets painted with tools such as Photoshop or Gimp.

This document shows how Steamroll renders 3D objects on the player’s HUD to make an animated and interactive 3D User Interface (UI) using Unreal Engine 4.

Introduction

 

Steamroll is a 3D puzzle and strategy game set in an industrial steampunk universe where the player assembles and shoots steamballs to solve problems and overcome obstacles to get to the next level. A steamball can be fitted with up to 4 items, including walls, ramps and bombs, which are deployed according to how the player programmed them. A good example would be a steamball containing a wall deployed two seconds after launch, a ramp attached to the first thing the steamball bounced off, and a bomb activated when the ball comes to a halt.

In addition to the usual health and ammo (balls) displays, the game needs to show the user a fair amount of information regarding the ball’s contents and programming, the inventory of available items and the remaining steam pressure.

Early in the development of Steamroll we realized that it would be a challenge to present all this information in the simplest, most intuitive way possible, so we decided to take advantage of the game’s steampunk industrial atmosphere and use geared moving machines to show and hide everything of interest to the player.

This decision forced us to go 3D, since 2D animation would severely restrict us to simple panning, rotating and zooming of 2D widgets, given the huge number of combinations in the assembly of a steamball. We’re a small team and do not have the resources to produce an enormous number of animations, whereas a few skeletal meshes and a bit of coding can produce stunning results in a relatively short time.

Method description

 

This task is not as easy as attaching some actors in front of the camera, since the 3D environment around these actors would affect them, breaking the illusion and even their usefulness.

For instance, when the player gets too close to other objects, they might poke through the UI and occlude the player’s view. Lighting is also problematic because the level’s own lighting can and will bleed onto the 3D UI, which is something the artists will complain about since they cannot control it. The opposite is also true: the UI’s lighting can affect the level’s, often resulting in undesirable artifacts. Look at how the UI is partially covered and strangely lit in the following screenshot.

Unreal Engine 4’s UMG has an experimental widget called Viewport that can render a scene independently of the main level, but after testing it over several engine versions and seeing that it didn’t work well enough for us and never reached full stability, we decided to look for an alternative.

We use the old-fashioned chroma keying technique, where the objects of interest are rendered in front of a green background and then the captured scene is composited with the final background by removing the original shade of green.

Isolation and Instancing

 

The 3D UI actors and their supporting setup all have to be grouped together and then instanced in each of the game’s levels. That’s why we create a new level called “UI_Sublevel” for them and load it as a sublevel in each of the game’s real gameplay levels.

Just remember to place everything really far away from the origin, so that when it is included in a real level it doesn’t end up appearing in the middle of the action!

It is also important to remember to change the sublevel’s streaming method to “Always Loaded” in the Levels tab, otherwise you won’t see anything.

Scene Capture to Render Target

 

This is where we capture whatever is in the 3D UI with a special type of camera called a SceneCapture2D and render everything onto a Render Target, which is really a texture that we will use later. Create the SceneCapture2D in front of the chroma background.

Create a Texture Render Target in the content browser:

Edit the new Render Target and set its desired on-screen size and addressing:

Edit the SceneCapture2D details: set the FOV, set the texture target to the one just created and set the capture source to Final Color. Make sure to check the “Capture every frame” checkbox.

It is also important to uncheck the Sky Lighting checkbox to prevent the skylight from affecting the UI. Take a few minutes to uncheck every rendering feature you are sure you do not need, for a little performance boost.

Post-process

 

Now that the Scene Capture and the Render Target are linked, we need to ensure that the post-process applied to the Scene Capture doesn’t break the chroma keying. Unreal’s post-process includes a tone-mapping stage that subtly changes the render’s final color, including the keying color, thus breaking the system. Fortunately, we can replace the default tone mapper with a custom post-process material that essentially does nothing!

It is possible to bypass the whole post-process and tone-mapping pipeline in the scene capture details by selecting “Scene Color” instead of “Final Color” in the “Capture Source” property. It is certainly easier than what we are doing, but it has the disadvantage that you lose every post-process effect, such as bloom, which artists tend to like.

Create a new material and change its Material Domain property to “Post Process”:

It is also necessary to specify that this post-process material replaces the Tonemapper:

Then set up the material. We need “Input 0”, which is the scene’s final color, plus “Input 1” to add post-process effects such as bloom. Wire the result of the addition to the emissive color.

 

Chroma Material

 

This is the key step of the whole process. The green background has to be “erased” so that the level can be seen behind the 3D UI. Fortunately Unreal’s incredibly powerful and flexible materials make this task a piece of cake.

Just create a new material, which we called OverlayMaterial, and set the following properties to make it translucent and unlit:

And arrange the following nodes:

The UTexture node on the left is linked to the Render Target. If you have already played the level with everything set up as explained in the previous sections you should already see the UI over the chroma in this node.

The RGB color is passed directly to the material’s final emissive color, but the chroma background will be cut out using the material’s opacity or “alpha” channel. The chroma keying color is simply subtracted from the Scene Capture color, and the dot product of the difference with itself is computed to obtain a mask or stencil to use as the alpha channel. The multiply node at the end is used to tweak the borders.
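The same node math, written out in NumPy just to make the idea concrete (the keying color and the border multiplier are example values, not the exact ones we use):

```python
import numpy as np

def chroma_alpha(rgb, key=(0.0, 1.0, 0.0), border_boost=4.0):
    """Compute an opacity mask from a captured frame by keying out `key`.

    rgb: HxWx3 float array in 0..1 (the Render Target contents).
    """
    diff = rgb - np.asarray(key, dtype=np.float32)

    # Dot product of the difference with itself: 0 on pure chroma pixels,
    # larger the further a pixel's color is from the keying green.
    alpha = np.sum(diff * diff, axis=2)

    # The final multiply tweaks the border so thin edges don't stay half green.
    return np.clip(alpha * border_boost, 0.0, 1.0)
```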

UMG (Unreal Motion Graphics UI Designer)

 

Up to this point we have set up the UI scene in a sublevel located in a remote part of the level. Then we’ve captured it with a Scene Capture actor to a Texture Render target. The missing part is how to draw this texture to the screen.

Although we could draw it directly to the HUD mapped over a quad, we are going to use UMG because it gives us the ability to specify a resolution-independent adaptive layout in a simple and visual way.

First of all let’s create a new UMG Widget in the content browser:

In the newly created Widget Blueprint, drag and drop an image widget from the palette tab to the hierarchy tab to create an image. Here we call it ImageOverlay:

This image widget has to be sized and positioned on the viewport. The sizes have to be exactly the same as the Render Target’s so that there is no scaling. It is also a good idea to disable DPI scaling in the project’s settings by setting all resolutions to scale 1. Notice the anchors in the bottom-right corner of the viewport:

Now it’s time to finally link it to the texture target. The image’s brush can be linked to the chroma Overlay Material we created earlier:

Final set-up

 

The last remaining step is actually adding the UMG widget to the viewport. It would be impractical to manually add blueprints in every level to create and add the UI, so it is done in the UI_Sublevel blueprint instead. Open it by clicking on the gamepad icon next to UI_Sublevel in the Levels tab:

And add the following nodes to the blueprint:

Notice that the widget we are creating here is the UMG widget we made a few sections ago, and you’re all set! Hit Play and check that everything looks as it should.

Shortcomings

 

In its current state Steamroll’s 3D UI has a few shortcomings that are not critical but would be nice to improve:

- The tone mapper has to be disabled in the SceneCapture to make chroma keying work. This has the unintended side effect of disabling anti-aliasing too, because it is completely integrated into the post-process and tone-mapping pipeline. It makes the UI look a bit jagged around the edges.

- DPI scaling has to be disabled so that the UMG Image widget is rendered at exactly the same resolution as the Render Target. It would be nice to have DPI scaling for resolution independence, but whenever it resizes the UI it partially breaks the chroma keying around the edges, making the UI look outlined in chroma green.

- Having a UMG widget covering the whole viewport intercepts mouse click events, so clicks intended for the 3D UI have to be intercepted, converted from screen space to world space, translated into UI_Sublevel space and then re-cast with a line trace.

Any doubts? Suggestions? Improvements? Please leave a comment below!

Announcing Steamroll

We are very happy to announce that we are developing our first independent video game, Steamroll!

[Image: new level screenshot]

This is a true indie game in the sense that it follows our made-up indie game rules:

  • it has experimental gameplay elements
  • it is made by a team without influence from a publisher
  • the team fits in a sofa

We are running a Kickstarter campaign, so don’t hesitate to visit it and back our development!

Big news coming soon!

Yes, we are still alive! Many things have happened during the last year and we have been very busy working on several projects. Some of them are great projects for our partners; others are small internal projects you may hear news about soon.

We have shifted our company vision towards two different directions:

  • work closely with a few of our partners, helping them in their projects with our technology expertise.
  • develop our own games as an independent studio.

We decided to freeze Mutable support for Unity to be able to focus on the new projects, but we will keep it in the fridge and come back with an improved version sometime in the future. Meanwhile, we have been focusing on Unreal Engine 4.

We are still here, and we are very excited about the times to come!

“The Mandate” on Kickstarter

The Mandate is the kind of game we like at Anticto. It is a mixture of strategy, simulation and role-playing. The Escapist magazine calls it XCom + “Faster Than Light”. Who wouldn’t want that? So we decided to make our humble contribution by backing the project. Take a look at the Kickstarter page.

We actually hope that this is the kind of project that can benefit from our work and technology. Who knows? Maybe we can make a related announcement in the future!

Starting Kickstarter support: Candle

We are players at Anticto, and we love games. Especially those games that start humble, that take risks, that have a soul. We have decided to create a company account at Kickstarter and support projects regularly.

The first project to receive our humble support is the beautiful Candle from Teku Studios. Take a look at its page here and support it if you like it as much as we do!

Candle from Teku Studios