Tool Design with Python

Image 1: A view of the inside of a Daursynka longhouse, in-game. Some light artifacts are visible in the upper-left of the image; hopefully these can be overcome by tweaking the cascaded shadow map settings. Overall, however, we are very happy with the end result.

The Daursynka longhouse is done, and we are pretty happy with it (see the featured image for this post, and Image 1). Each piece in the kit has many variations, to reduce the 'tiling' effect that can be seen in some games; for example, there are twelve different variations of the main roof's tile sections alone. The human eye is good at spotting patterns, and repeating patterns are easiest to pick out when many copies of the same pattern sit right next to one another, so this approach was used for every piece in the kit. However, it also meant we ended up with over 460 pieces for this kit alone!

When looking at the prospect of exporting each piece, along with its collision geometry, it became clear very quickly that scripting the export process was going to be mandatory. In Blender, scripting means Python. I must admit, I have never been (nor will I ever be) a huge fan of Python. I don't like the language as a whole, and so I didn't have much experience with it. That meant first learning Python beyond a quick overview of the language's features. Even if the only scripts you ever write in Blender are small, relatively simple 'helper' scripts, the effort is absolutely worth it, especially while creating something as large as an entire modular kit.

After going beyond the introductory lessons on Python at W3Schools, I turned my attention to writing a script to help with the export. It became obvious very quickly that this was going to be much more than just a simple little script. I wanted to keep my collection hierarchy in Blender and mirror it as the folder structure on the hard drive when exporting the assets. That meant a recursive function. I want to be completely honest here: in all of the programming I have ever done, of any kind, I have never had to write a recursive function. So this was a first for me, and it was an…experience.

Recursive functions are usually short, which makes them look deceptively simple. At first, I found it very difficult to wrap my head around exactly how the function was going to work. I had to look at the problem in a completely new way (for me), different from anything I had done before. The hardest part was reasoning about how the function would 'walk' through the various collections, and the order in which it would run. I tried to visualize how the function would be called on the very top-level collection, and how it would call itself whenever it discovered a collection nested within that one. I did figure it out, but I can't say that I will ever be comfortable writing recursive functions.
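
To make the pattern concrete, here is a minimal sketch of that kind of recursive walk. This is not the actual export script, just the shape of the idea; the FBX settings and the export path are placeholders.

```python
import os
import bpy

def export_collection(collection, export_dir):
    """Recursively export every mesh in `collection`, mirroring
    the collection hierarchy as folders on disk."""
    os.makedirs(export_dir, exist_ok=True)

    for obj in collection.objects:
        if obj.type != 'MESH':
            continue
        # Select only this object so the exporter writes a single asset.
        bpy.ops.object.select_all(action='DESELECT')
        obj.select_set(True)
        bpy.ops.export_scene.fbx(
            filepath=os.path.join(export_dir, obj.name + ".fbx"),
            use_selection=True,
        )

    # The recursive step: each child collection becomes a subfolder.
    for child in collection.children:
        export_collection(child, os.path.join(export_dir, child.name))

export_collection(bpy.data.collections["Kit Root Collection"], "D:/Exports/LongHouse")
```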

Once I had a script that would export my objects, I realized I had only solved half of the problem. I would still have to import those assets into the Unreal Engine, so I was still staring down an enormous amount of work. Thankfully, the Unreal Engine editor can also be scripted with Python, so I knew I could write a script to handle the imports. I needed the editor to not only import the 3D objects themselves, but also create the materials for those objects first. It would do no good to import 460+ objects and then have to apply materials to each and every one. This led down a rabbit hole that resulted in the project import and export managers pictured below.

Image 2: The import manager in the Unreal Engine editor, showing all of the data for the longhouse modular kit. This is a read-only UI, used only to verify that all of the data is correct. Mouse-over popups show each asset's JSON entry, allowing a deeper dive into the data if the user wishes.
Image 3: The export manager in Blender allows the user to set up the export data needed to successfully export the assets and generate the JSON file needed by the import manager. Even in a project with 460+ assets, setting up the export data only took around an hour and a half at most.

This tool (or tools, if you want to count them separately) is why it took so long for a new post to be added to the website. They were a real trial to get working, but I feel the time spent will more than pay for itself when we make other modular kits for 'At the Crossroads', as well as any other games we create. All of the data entered into the export manager is saved in the .blend file that contains the modular kit assets. So, while it does take some time to enter that information, it only needs to be entered once.

When the data entry is complete, pressing the 'OK' button starts the export process. The tool uses the data entered for the materials and textures to generate part of a JSON file describing where these assets live on disk, as well as where they should be saved in the UE project when they are imported. It then 'walks' its way through the collections contained within the 'Kit Root Collection', discovering 3D assets as it goes. As it exports each 3D object, it finds that asset's collision geometry and exports it along with the asset, so when the import manager within the UE editor imports each 3D asset, it will have the correct collision. All of the 3D-asset-specific information is added to the JSON file as each asset is exported.
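
To give you an idea of what I mean, an entry in that JSON file might look something like this (an invented example to show the kind of data involved, not the exact format the tools write):

```json
{
  "name": "SM_Roof_Tile_Section_01",
  "fbx_path": "D:/Exports/LongHouse/Roof/SM_Roof_Tile_Section_01.fbx",
  "destination": "/Game/LongHouse/Roof",
  "material_instance": "MI_RoofTiles_A",
  "has_collision": true
}
```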

The import manager reads the JSON file and imports the textures, materials, master materials, and instanced materials. I say that it imports the materials, but it is more accurate to say that these are created from the data generated by the export manager and saved in the JSON file. The master materials that are defined in the export manager are created first, and then all of the instanced materials are inherited from these. Once this is complete, the 3D assets can be imported with all of the correct materials and/or instanced materials applied. The whole import process took just under five hours to complete; imagine how long it would take to do all of these imports individually. This will be a huge time saver in the long run, as there is very little work left to do from that point.
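
For those curious, the import side can be sketched with Unreal's Python API along these lines. The JSON fields match the invented example above, and all asset names and paths are placeholders, not our actual project layout:

```python
import json
import unreal

# Load the manifest generated by the Blender-side export manager.
with open("D:/Exports/LongHouse/longhouse_kit.json") as f:
    entries = json.load(f)

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()

# Create a material instance parented to a master material.
factory = unreal.MaterialInstanceConstantFactoryNew()
mi = asset_tools.create_asset(
    "MI_RoofTiles_A", "/Game/LongHouse/Materials",
    unreal.MaterialInstanceConstant, factory)
parent = unreal.EditorAssetLibrary.load_asset("/Game/LongHouse/Materials/M_Wood_Master")
unreal.MaterialEditingLibrary.set_material_instance_parent(mi, parent)

# Batch-import every mesh listed in the manifest.
tasks = []
for entry in entries:
    task = unreal.AssetImportTask()
    task.filename = entry["fbx_path"]
    task.destination_path = entry["destination"]
    task.automated = True   # suppress the import dialogs
    task.save = True
    tasks.append(task)

asset_tools.import_asset_tasks(tasks)
```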

After the import is complete, the only thing left to do is define the actual node networks for the base materials and the master materials. This has to be done by hand because, while it is possible to map some of Blender's material nodes to the UE editor, it would be extremely difficult to do well. Some nodes in Blender, like the math node, have equivalents in UE, but others do not map across the two applications at all. And once you get into the more complex materials in either program, making even the basic node mappings work becomes more complicated still. We decided it would be best to just define the node networks in the UE editor by hand. It doesn't take that much time, and we get to use our preferred workflow in Unreal without worrying about how everything has to be built in Blender to make the translation succeed.

After all of this, were these tools worth it? In the short term, no. I could have exported all of the assets from Blender by hand, and then imported them into the Unreal Engine, in less time than it took to write these tools. In the long term, though, these tools will pay for themselves many times over. While the export manager does force a specific workflow on the artist, it is not much of a constraint, and the time saved overall will make these tools valuable assets for us going forward.

Thank you for taking the time to read this, and I hope it has sparked some ideas for tools to improve your own workflow. Have a great day.

Procedural Modelling and More

Well, it's been quite some time since I last updated the site, and I've been hard at work during that time. I have completed the majority of the work on the modding code, which allows modders to create content of their own and package it all up into a mod that players can unzip into the "Mods" folder of the game. This is done via the UGC Plugin that Epic developed as part of their VR game, Robo Recall. However, there were some pretty glaring omissions in its functionality, owing to the plugin's intentionally narrow scope.

The UGC Plugin was meant to be very bare-bones, leaving the developer to decide what, exactly, they want to allow their modders to do. The plugin can enable anything from mods as small as re-meshing/skinning some of the in-game weapons to fully side-loading entire levels with custom game modes. The largest omission was the ability to easily get players to a new level provided by the mod, and to get them back to the main world afterwards. This needs to be done entirely within the mod, without the ability to place anything in the main world, and with as few other limitations as possible on the mod developer. I came up with a solution that I hope modders will find a good compromise…spawnable POIs.

A spawnable POI (or Point Of Interest) is a self-contained POI that can include a special trigger volume that will transport players to a separate map that the mod developer has created. This system can handle POIs that have multiple entry/exit points, without the mod developer having to jump through too many hoops to get it all to work. This would allow, for example, a cave complex with numerous entrances to it. When the player(s) enter the cave, they will spawn in at the correct entrance and, when they leave the cave complex, they will spawn back into the main world in the correct location. If they enter through a jungle cave and exit right back out the way they came in, they should be in the jungle. However, if they enter the jungle cave entrance and exit via the tundra cave entrance, they should spawn into the world in the tundra. This seems simple, but keep in mind that the mod developers will not have any way to directly place anything in the main world. Everything will have to be done via the spawnable POIs. We wouldn’t have a problem including the main world, but we are using quite a lot of licensed assets, and we do not have the legal right to distribute those assets to mod developers. Copyright issues are a tricky subject, and are best avoided whenever possible.

The last bit of functionality for our version of the UGC Plugin will be UI related. This functionality isn't worked out yet, but it really does need to be in place when the game ships. I realized it was missing while watching some videos on YouTube: I noticed a content creator using a mod that reorganized the UI of the game they were playing. This is something I will need to add to our version of the plugin, but I don't anticipate any major hurdles…famous last words, right?

Aside from all of the work on the UGC Plugin, I have been working on some procedurally generated models that are part of a modular kit for the game. This kit is for a longhouse used by some of the peoples that inhabit the tundra region of the crossroads: the Daursynka, who are loosely modelled on the Iroquois Confederacy of north-eastern North America and the Viking peoples of Scandinavia. These houses were a challenge due to their size and detail. They need to be large enough to house an entire extended family, and detailed enough to match the visual fidelity of the game's other assets. But I also needed a fair amount of variety in the pieces, because there will be multiple longhouses at each settlement. I want to avoid obvious repetition as much as possible while maintaining a reasonable degree of performance. That last part is key; performance must always be considered in any real-time application.

To create the kit pieces for the longhouses, I chose to procedurally build all of them from "building block" pieces that I could easily obtain from Quixel. For example, the roof tiles seen in the featured image for this article are all positioned via geometry nodes in Blender. This allows me to randomize the individual tiles and get nice variation between the roof sections. Please note, however, that I was lazy when creating the image above and just used an "Array" modifier in Blender to duplicate the roof sections (I am sufficiently ashamed of my laziness here). The modular kit features numerous variations of the roof section, not just a single section with a single tile pattern. This approach lets me use one set of textures for the tiles, wall slats, beams, and other individual pieces, and get a level of quality that would have required much more texture space in the RAM of the player's video card if I had gone with a more traditional approach. The traditional approach is to create all of the geometry in your software of choice (Maya, 3DS MAX, Houdini, Blender, etc.), and then import that geometry into Substance Painter or Quixel Mixer to "paint" the textures onto it. With that approach, we would need much larger textures to get the same visual quality. We are still using a not-insignificant amount of RAM, but nowhere near the amount that would be needed to get both this level of quality and this degree of variation in the kit.
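
To put rough numbers on that (my own back-of-the-envelope figures, assuming 4K targets): a single uncompressed 4096 × 4096 RGBA texture is 4096 × 4096 × 4 bytes = 64 MiB, and a full mip chain adds roughly another third. Block compression can cut that by 4:1 to 8:1, but uniquely texturing hundreds of kit pieces multiplies whatever per-texture cost you land on, while a handful of shared, tiling texture sets costs the same no matter how many pieces reuse them.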

Image 1: A side view of a simple render of a longhouse using this modular kit. The six roof sections are a single mesh duplicated with an array modifier in Blender (I am ashamed of myself for this). The roof tiles at the peak of the roof are duplicates of the same object as well…I really did get very lazy here. The ground plane is a very simple texture on a flat plane. The sky was added in GIMP using a very nice image provided by calibra on Pixabay.

In Image 1, you can see the picture used as the featured image of this post. Each element that makes up the modular kit pieces is an individual object placed via geometry nodes, with its rotation randomly tweaked ever so slightly to break up the uniformity of laying everything out with the modifiers available in Blender. This could also be done in an application like Houdini, which I have used before. Blender doesn't offer the same freedom in its procedural tools as Houdini, but the geometry nodes are quite powerful and allow an amazing amount to be done. Doing something like this by hand would take many, many more hours than I spent learning geometry nodes in Blender. The same basic approach was taken for the wall pieces, which are made up of slats with the gaps in between filled with tar-covered thatch. Geometry nodes can also affect the vertex colors of geometry, and this was used to blend between a "tar" material and the material used for the wooden slats. You can see the effect of this in Image 2. The tarred thatch is represented by a simple plane textured to look like thatch that has been dipped into a vat of tar. At least, that is what I hope it looks like.

Image 2: The entrance to one of the more extravagant longhouses that can be built with this modular kit.

The image above shows a simple rendering of the front of the example longhouse. If you look at the wall that is set further back, you can see that the slats have "smears" of tar where they come close to the plane representing the tarred thatch. However, to work with the vertex colors of the generated mesh, the modifier for the geometry node network that generates the wall piece has to be applied first. Only then was I able to add the vertex color map and use the geometry node network that alters the vertex colors. If you look closely at the wall at the front of the foyer, you will notice that it lacks the darker smears of tar that the back wall features. This is because the foyer's front wall hasn't had its generating node network applied yet. Without that, any vertex color map added to it is not accessible to the node network designed to change the vertex colors. At least, I couldn't get it to work, and I spent quite some time trying.
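
In script form, that ordering constraint looks something like this minimal sketch (the object, modifier, and attribute names are all made up):

```python
import bpy

wall = bpy.data.objects["WallPiece_Front"]  # hypothetical object name
bpy.context.view_layer.objects.active = wall

# 1. Apply the geometry nodes modifier, turning the generated
#    slats into real, editable mesh data.
bpy.ops.object.modifier_apply(modifier="GeometryNodes")

# 2. Only now can a vertex color layer be added to the result...
wall.data.vertex_colors.new(name="TarMask")

# 3. ...where a second node network (the one blending in the tar
#    material) can finally read and write it.
```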

What you don't see in the images above is the sheer volume of variety that can be obtained by creating the wall pieces (or any pieces, for that matter) via the geometry node network approach. Each slat type is a separate mesh with its material applied. There are six different slats, all held in a single collection, that the geometry node network chooses from when placing each individual slat. All of the wall slats for any wall piece can be randomized not only in the slat mesh chosen, but in position and rotation as well. Once the vertex color network is applied to the wall piece, it is hard to believe that it is made up of nothing more than six slat meshes randomly chosen and placed.
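
Written as plain Python instead of nodes, the pick-and-jitter logic amounts to something like the sketch below. This is just an illustration of the idea, not the actual node network; the collection name, slat count, and spacing values are invented:

```python
import random
from math import radians
import bpy

slats = list(bpy.data.collections["Slats"].objects)  # the six source slat meshes
random.seed(42)  # a fixed seed keeps the "random" wall reproducible

for i in range(40):  # forty slats along one wall piece
    src = random.choice(slats)
    slat = src.copy()  # instance the object; the mesh data stays shared
    bpy.context.collection.objects.link(slat)

    # Base placement along the wall, plus a small random jitter
    # in position and rotation to break up the uniformity.
    slat.location = (i * 0.22 + random.uniform(-0.005, 0.005), 0.0, 0.0)
    slat.rotation_euler.y = radians(random.uniform(-2.0, 2.0))
```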

Another feature of these longhouses that is not visible in the images above is the thatch cards placed on the plane representing the tarred thatch shoved in between the slats. It is highly unlikely that anyone stuffing thatch into these gaps would do so perfectly, which means there would be bits of thatch sticking out here and there to flutter in the breeze. That is where those little thatch cards come into play. Using a traditional modelling approach, each thatch card would be placed by hand, probably by an intern questioning their life choices as they positioned each one. But, through the power of procedural modelling via geometry nodes, we can easily place thatch cards in between the wall slats. The best part is that no matter how each slat is rotated and positioned, the geometry node network for the tarred thatch plane recalculates where a thatch card can be placed, so one is never positioned inside a slat.

Image 3: A closer look at the thatch cards and their placement via the Raycast node in Blender's geometry nodes. Notice that none of the thatch cards protrude from within a slat. The thatch cards dynamically reposition themselves as the placement of the slats changes.

In Image 3, you can see a small portion of the node network used to place the thatch cards on the plane representing the tarred thatch. The entire network isn't shown because the view would need to be zoomed so far out that you wouldn't be able to read or see anything of note. The key idea to take away from Image 3 is the Raycast node (you should be able to right-click the image above and view it in a separate tab, allowing you to zoom in and read the node names). First, I used a "Distribute Points on Faces" node to randomly place points on the tarred thatch plane. These points are where thatch cards could potentially be placed. With the Raycast node, we can do line traces and check for an intersection. In my case, I didn't want to place a card anywhere that intersected the wall slats; only where the raycast found no intersection should a thatch card be placed. The Raycast node takes some getting used to, because it doesn't work exactly the way a line trace does in, say, the Unreal Engine. So if you're interested in using this node, read the documentation and experiment with it a bit. It is worth your time to learn.
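
The accept/reject test itself can also be expressed outside the node editor, which may make the logic clearer. Here is a small sketch using the ray casting in Blender's BVH tree module rather than the Raycast node itself; the object name, ray direction, and placement numbers are placeholders:

```python
import random
import bpy
from mathutils import Vector
from mathutils.bvhtree import BVHTree

depsgraph = bpy.context.evaluated_depsgraph_get()
slat_wall = bpy.data.objects["WallSlats"]  # hypothetical combined slat mesh
slat_tree = BVHTree.FromObject(slat_wall, depsgraph)

def keep_clear_points(candidate_points, ray_dir=Vector((0.0, 1.0, 0.0))):
    """Keep only the candidate points whose ray does NOT hit a slat."""
    kept = []
    for p in candidate_points:
        location, normal, face_index, distance = slat_tree.ray_cast(p, ray_dir, 0.1)
        if location is None:  # no intersection: a thatch card fits here
            kept.append(p)
    return kept

# Hypothetical usage with random points scattered on the tarred thatch plane.
points = [Vector((random.uniform(0, 4), 0.0, random.uniform(0, 2))) for _ in range(200)]
card_positions = keep_clear_points(points)
```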

Well, that was a lot to take in. I hope my descriptions were clear, but the topics covered in this post are very complex, and without a large number of visual aids it can be difficult to get the point across. Modding is a huge feature that I feel will be a great benefit to the game: players will not be beholden solely to us for game content. If a mod developer wants to add a completely new dimension to the game (a new level of Hell, perhaps), they will be able to do so. And, with the power of procedural modelling via Blender (or other software like Houdini), they will be able to shorten the development time needed to make the custom assets they want. Thank you for taking the time to read this post, and I hope that you have a great day.

All the Modeling Tools

In game development today, there seems to be no shortage of tools vying for our attention. From programming to texturing to modeling, the selection of tools can be dizzying. When Epic announced the inclusion of the modeling tools plugin for the Unreal editor, I thought it was nothing more than a replacement for the BSPs already available: a nice addition, to be sure, but not a serious tool for creating game content. Then Quixel released their videos on the creation of their medieval village demo, and I found a new appreciation for the tools Epic has generously given us.

The obvious use is for blocking out a scene, and I have mixed feelings about the tools for this purpose. Every boolean operation on two objects creates a third object in the content folder. This can lead to a huge number of useless intermediary objects before you arrive at the final shape you want for your block-out. Worse, the created objects are all named something cryptic like 'Boolean_a2b3x092xi3202'; the editor appears to take the name of the operation and append a UUID to it. You can specify a name other than 'Boolean' in the tools, so you can use this to separate the final object you want from the intermediary objects you don't care about. That still leaves you with many unwanted objects named 'Boolean-xxxx' and one named with the value you provided in the UI. This is the approach I used, and while it isn't the most convenient, it does work. Still, this tool is far better than BSPs in my opinion, and is a welcome addition to the editor.

Where this toolkit really seems to shine is an application I wouldn't have thought it useful for, but one shown to great effect in Quixel's videos mentioned above: using preexisting assets, along with the tools to reshape them, allows for asset reuse in a way that would have been much harder otherwise. What I really like about this toolkit, and even BSPs to some extent, is that you are in the game level itself while using the tools. You can shape something to fit the exact position and placement you need, with the look you want. This could be done by creating all of your level's geometry in a separate DCC, but I have never liked that approach. I want to see what my level or asset looks like in the engine, not in the renderer that ships with the DCC. No matter what settings I tweak, I have never gotten MAX, Blender, or Houdini to render my assets the same way Unreal does. There is also the overhead of defining each material twice: once in the DCC of your choice, and again in your engine. We've all been there, and there is an element of this that cannot be escaped; it is a necessary evil. But it is nice that it can be lessened to a degree.

The finished hut, placed in the first level. While this image doesn't show the full detail of the hut, it gives a good impression of its look and feel. To better see the weathering written about below, see the featured image.

I have recently finished the bamboo hut where the player goes to start the level in Capuchin Capers. This allows the player to explore the island a bit and get familiar with the terrain…or just sightsee if they like. Once they are ready, they enter the hut and the level begins. Because of this, the hut will be included in every level and is the only structure in the game. It is likely to receive quite a bit of scrutiny from the player, so it has to match the visual fidelity of the rest of the level, with no strange issues of collision or scale. I decided to use the editor's modeling tools to block it out. Previously, I would have either used BSPs (if I could talk myself into enduring that experience), or used an exported mannequin model as a scale reference. The latter would have been a big mistake.

I wanted a ramp leading up to the entrance of the hut, but the ramp needs to be long enough to clip through the terrain, since I don't yet know every location where the hut will be placed; the model has to account for that. At the same time, I didn't want the hut as a whole to take up more space than absolutely necessary. I was able to make the angle of the ramp steep enough to keep it compact-ish while still being walkable (by default, UE4's character movement won't let you walk up slopes much past 45 degrees). This could have been done with BSPs, but that would have been a painful experience, to be sure. Aside from the ramp, I was easily able to get the overall shape of the hut the way I wanted it. I had a specific look in mind, and it was fairly easy to reach with the tools. I was still using them like a caveman, due to my experience with BSPs, so I could have refined the hut's shape far more than I did in-editor. But my block-out was complete, with all the windows where I wanted them and at the correct heights. I exported this block-out to Blender to build the actual geometry for the hut.

I used geometry from preexisting assets in an attempt to maintain some continuity in the materials. The tree trunks that make up the posts of the hut are palm trees actually found in the levels, and similar assets, such as the bamboo, were 'repurposed' in the same way. I then used the technique shown in Quixel's video on the creation of the houses in their demo: utilizing a separate UV channel to introduce mud, dirt, and grime to the hut really made all the difference. While most of the geometry used to build the hut has shared UVs or tiling textures, the approach Quixel demonstrates allowed me to break up the feeling that the materials are all shared. It gave each piece of geometry the feel of being a separate component of the hut, not just a bunch of copies of the same thing…which, of course, they are. I used Painter to bake out my AO, curvature, thickness, and other maps, and then to create the masks needed to achieve this effect in Unreal.

I could have used Unreal’s modeling tools for much more than I did. They are not just a toy, or a replacement for BSPs, as I originally thought. They are a valuable tool in the toolkit, and one I plan to explore further. Thanks for the read.

The Wearing of Many Hats

If you have done game development for any length of time, you know that you will have to play many different roles as a project moves along. From texturing to coding, you may be required to do a little bit of everything. That is where I have been for the last few weeks.

Completing the control rig in Unreal’s Control Rig plugin has been a challenge, but it has been an interesting experience. I definitely have a lot to learn when it comes to rigging. There are technical artists that spend the majority of their time creating rigs, and it is as much an artform as any other discipline. Getting Suzy’s bone structure to animate properly was difficult, and compromises were needed, but I am happy to say that she is done*.

I began work on her first animation, her walk cycle, when I discovered a serious problem with her tail. I had used a certain type of IK system in Control Rig (FABRIK, in case you are curious), and while it is a great IK system, it wasn't even close to what I was looking for. So, back to the drawing board. I have been working hard to replicate the Rigify control rig available in Blender. I haven't achieved such a lofty goal, which shouldn't be a surprise, but I was able to capture the overall feel of Rigify, if not its high-powered end result. The tail in particular was quite challenging, and I really wanted Suzy's tail to behave the way it does under Rigify. What I thought would take an hour or two turned into a full day's work; you just never know with these things. But I was able to get it to work and match the results I see in Blender. In some ways, this is unfortunate, because Suzy's tail isn't nearly as good as I had hoped it would be.

Weight painting is another part of rigging, and there are no unimportant parts of rigging; getting the weight painting correct is a perfect example of this. No matter what I did, I couldn't get her tail bones to deform the tail smoothly. No amount of weight blending would cure the problem. More bones in the tail seem to be the only solution, but I am way too far along to change the bone structure now. I will just have to make it work, and remember this lesson for the future. You live and you learn.

The last hat I have had to wear recently is that of a programmer. This isn't something that concerned me; after all, I have more experience as a C++ programmer than with any other skill I have used on these game development projects. But this situation was new: the developers of the gFur plugin decided to drop support for the free version. This means they are not updating the plugin for the current version of the UE4 engine (4.26.0 at the time of writing), or any future version to follow. Luckily, they include the plugin's source code, which allowed me to recompile it for the current engine version. A few little things needed to be changed to satisfy some dependencies, but nothing difficult. A change to the plugin's build file in VS2017, pointing it to where the FBX libraries and includes are located on my system, was all it took to get everything working nicely. Not a problem for an experienced programmer, but someone who doesn't code would have had a hard time. It just goes to show how many hats you have to wear in game development.

* Is anything in game development really ever done? No, not really.

Rigging is Hard

I am in the process of creating Suzy, the star Capuchin monkey of the game we have in development. She needs to have accurate movement and the ability to make complex facial animations (Capuchin monkeys are very animated, no pun intended). The rig needed to animate her is very complex, and is the most challenging rig I have ever attempted. This isn’t saying much, because I am an inexperienced technical artist and a terrible animator. Encouraging words, I know.

This has been a long learning process overall. Learning to create the hair effects has been challenging, but fun. It took more time than I anticipated, but that is to be expected when you have to learn not one, but two separate hair systems (Blender's and then Houdini's). But when it comes to creating the rig, I have essentially jumped head first into the deep end of the pool.

The skeletal structure of the bones needs to be defined so that a wide range of movement is possible without severely distorting the mesh in unrealistic ways. To that end, twist bones are used to make sure the mesh doesn't collapse on itself when the arms or legs are moved. There isn't nearly as much information on the correct usage of twist bones for rigging as there is for modeling, so some trial and error was necessary.

Hopefully, the bone structure I have defined will be suitable for a wide variety of animations, because we plan on selling this asset on the Unreal Engine Marketplace. If Suzy were only going to be used in this game, I wouldn't worry so much about her rigging…I could rig her for our specific use-case. But once she is 'in the wild', there is no telling what a developer might do with her.