The Path Most Followed

Well, I have just spent the better part of the last week working on a custom EQS generator that creates a cone-shaped graph of points and then uses the A* algorithm to plot a path through the graph. Having virtually no experience in this field of programming, I had no right to hope that I could produce anything of value, but thanks to Red Blob Games’ excellent articles on the subject I was able to implement this in Unreal.

To say that this was challenging (for me, at least) is an understatement of massive proportions. While I am an experienced C++ programmer, this field of programming is very demanding. I am very happy with the results I was able to achieve from studying the code examples and explanations on Red Blob Games’ site. If you have any interest in how pathfinding works in games or other applications, you owe it to yourself to read every word there.

My implementation uses a cone shape to generate the initial points of the graph. Next come the line traces to find any blocking volumes that may have points generated within them. These need to be marked as ‘wall’ points for the A* algorithm to plot a path around them. There is also terrain costing, so that path generation can take the type of terrain into account while plotting through the graph. While this hasn’t been fully implemented in my generator yet, there is a cost function that is already being called. It just returns a value of 1 for every point in the graph, but later a data table can be added to allow different terrain types to cost different amounts. This will require some rewriting of the code, because at the moment the generator isn’t doing any line traces to the landscape underneath each graph point to find the material at that location. This approach may not be possible for a variety of reasons, especially considering Unreal’s landscape material architecture, but even if that is the case, there is always a way to do something if you’re really determined.
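To make the A* portion concrete, here is a minimal, self-contained sketch of the algorithm in the style Red Blob Games presents it, with the two hooks described above: ‘wall’ points that the search routes around, and a terrain cost function that currently just returns 1. This is purely illustrative C++ on a small grid; it is not the actual EQS generator code, and all of the names here are my own.

```cpp
#include <algorithm>
#include <cassert>
#include <cstdlib>
#include <functional>
#include <queue>
#include <unordered_map>
#include <utility>
#include <vector>

// Illustrative grid types; not Unreal's EQS classes.
struct Point { int x, y; };

inline int Index(const Point& p, int width) { return p.y * width + p.x; }

// Placeholder terrain cost: always 1, exactly as described above. A data-table
// lookup keyed on the landscape material could replace this later.
inline float TerrainCost(const Point&) { return 1.0f; }

// Manhattan distance works as the heuristic on a 4-connected grid.
inline float Heuristic(const Point& a, const Point& b) {
    return static_cast<float>(std::abs(a.x - b.x) + std::abs(a.y - b.y));
}

// A* search: returns the path from start to goal, or an empty vector if the
// goal is unreachable (e.g. sealed off by 'wall' points).
std::vector<Point> FindPath(int width, int height,
                            const std::vector<bool>& isWall,
                            Point start, Point goal) {
    using Node = std::pair<float, int>; // (priority, point index)
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> frontier;
    std::unordered_map<int, int> cameFrom;
    std::unordered_map<int, float> costSoFar;

    frontier.emplace(0.0f, Index(start, width));
    costSoFar[Index(start, width)] = 0.0f;

    const int dx[4] = {1, -1, 0, 0};
    const int dy[4] = {0, 0, 1, -1};

    while (!frontier.empty()) {
        const int current = frontier.top().second;
        frontier.pop();
        if (current == Index(goal, width)) break;

        const Point cur{current % width, current / width};
        for (int i = 0; i < 4; ++i) {
            const Point next{cur.x + dx[i], cur.y + dy[i]};
            if (next.x < 0 || next.x >= width || next.y < 0 || next.y >= height)
                continue;
            const int ni = Index(next, width);
            if (isWall[ni]) continue; // skip points inside blocking volumes

            const float newCost = costSoFar[current] + TerrainCost(next);
            auto it = costSoFar.find(ni);
            if (it == costSoFar.end() || newCost < it->second) {
                costSoFar[ni] = newCost;
                cameFrom[ni] = current;
                frontier.emplace(newCost + Heuristic(next, goal), ni);
            }
        }
    }

    // Reconstruct the path by walking the cameFrom chain backwards.
    std::vector<Point> path;
    int cur = Index(goal, width);
    if (cameFrom.find(cur) == cameFrom.end() && cur != Index(start, width))
        return path; // goal never reached
    while (cur != Index(start, width)) {
        path.push_back({cur % width, cur / width});
        cur = cameFrom[cur];
    }
    path.push_back(start);
    std::reverse(path.begin(), path.end());
    return path;
}
```

With real terrain data, TerrainCost would return values from a data table instead of a constant, and A* would then automatically prefer cheaper ground without any changes to the search itself.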

Having a generator that will create paths that avoid blocking volumes is crucial, because I had written another EQS generator that would create its points along a straight line from Suzy to the target. It is crude, but effective in some instances…except where it sends Suzy running through the ocean. Obviously, this isn’t ideal. In the image for this post, you can see the EQS generator has created a short path for Suzy to follow. This image doesn’t show off what the generator can really do, but does show it in action. Just as an aside, the image also shows off the new female mannequin that Epic has given the community. The hair groom was a quick job in Blender to test out export settings to finally get a groom out of Blender. The tutorial is by Marvel Master and does what it claims to.

It has taken a lot of work to create this EQS generator, and while it isn’t perfect, it works very well and allows Suzy to run from one side of the test island all the way to the other. I don’t know how far that is in-game, but it has to be at least a kilometer and is likely more. This generator isn’t just a great asset to this project; it is now a permanent tool in our toolkit, and I have no doubt that the time spent creating it will be paid back ten-fold in the future. Hard work pays off.

Coming to Our Senses

It has been quite a while since the last post here, and that is because work has been moving forward on Suzy’s AI, along with some ancillary code development for the AI Perception system and the Environment Query System (or EQS for short). Also, as can be seen in the image for this post, a test level was constructed to better represent the conditions the AI will need to operate in. This gave me a much better idea of how this AI will perform “in the wild”, as some like to say.

Suzy’s AI has come a long way, and it is now close to being implemented to the point that the AI outline describes. Whether or not it will be sufficient to make the game challenging enough is yet to be seen, but I am encouraged by the progress. With a much better understanding of all of the moving pieces in the Behavior Tree/Blackboard approach to AI design, I have been able to build up a reasonably intelligent AI that will wander looking for fruit to pick up. But, once the AI reaches a predefined level of frustration, it will seek out the player to follow them in the hopes of stealing a piece of fruit that the player may lead it to.

To help Suzy find fruit more easily, and to make the AI more challenging for the player, a new sense had to be created for UE4’s AI Perception system: smell. With a sense of smell, the AI doesn’t have to actually see a piece of fruit to find it. This sense respects not only the direction of the wind, but also its intensity. By taking the wind vector used in the newer atmosphere system’s material and moving it into a material parameter collection, the wind’s values can be piped into the perception system as well. In this way, the player gets a visual cue as to how, or why, the AI can sense them even when they remain unseen by the AI. It isn’t perfect by any means, but I feel that it is a great addition.
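For anyone curious how a smell stimulus might be scored, here is a rough sketch of the core idea: the scent carries further downwind, and the wind’s intensity scales how much it carries. This is not the actual perception code; the vector type, the base range, and the linear falloff are all made-up illustration values.

```cpp
#include <cmath>

// A stand-in vector type; in the actual project this would be FVector.
struct Vec3 { float x, y, z; };

inline Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
inline float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
inline float Length(const Vec3& v) { return std::sqrt(Dot(v, v)); }
inline Vec3 Normalized(const Vec3& v) { const float l = Length(v); return {v.x / l, v.y / l, v.z / l}; }

// Returns a smell strength in [0, 1]. The base range and the linear falloff
// are assumptions for illustration, not numbers from the real sense.
float SmellStrength(const Vec3& source, const Vec3& listener,
                    const Vec3& windDir, float windIntensity,
                    float baseRange = 500.0f) {
    const Vec3 toListener = Sub(listener, source);
    const float dist = Length(toListener);
    if (dist < 1e-4f) return 1.0f; // standing on the source

    // How far downwind of the source is the listener? Range [-1, 1].
    const float downwind = Dot(Normalized(toListener), Normalized(windDir));

    // Wind stretches the effective range downwind and shrinks it upwind.
    const float range = baseRange * (1.0f + downwind * windIntensity);
    if (range <= 0.0f || dist > range) return 0.0f;
    return 1.0f - dist / range;
}
```

The appeal of this shape is that the same wind vector driving the material parameter collection drives the gameplay result, so what the player sees blowing through the level matches what the AI can smell.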

Finally, in this game Suzy is using a NavMesh Invoker to create a dynamic navigation mesh around her everywhere she goes. This is much better than trying to create a huge navmesh that encompasses the entire level. At best, that would be very time consuming during development due to the need to rebuild a huge navmesh whenever objects are moved in the level. At worst, the navmesh may be too big to generate at all, which would require an entirely new set of systems brought into the project (such as level streaming).

With a navmesh invoker, we can eliminate these issues. But, and you knew that ‘But’ was coming, navmesh invokers present their own set of issues. The largest is that the AI can’t be given a target location to move to if that location is outside of its generated navmesh. For example, if the AI’s navmesh has a radius of 3000 units (the default) and you were to specify a location that is 4500 units away, that ‘move-to’ command would simply fail. The location is unreachable because the AI can’t build a path from where it is to where you are directing it to go. A solution that is still being developed is to use the EQS to generate a set of vectors, passed as an array, from the AI’s current location to a target location. This will require multiple ‘move-to’ commands to go from the start to the end of the path, but hopefully it will mitigate, if not eliminate, this problem.
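The waypoint idea can be sketched very simply: break the straight line into hops short enough that each intermediate goal stays inside the invoker’s generated radius. This is a hypothetical illustration, not the project’s code, and the 0.5 safety margin is my own assumption.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Split the straight line from start to target into hops short enough that
// each intermediate goal stays inside the invoker's generated navmesh.
std::vector<Vec2> SplitIntoWaypoints(Vec2 start, Vec2 target,
                                     float invokerRadius, float margin = 0.5f) {
    const float dx = target.x - start.x;
    const float dy = target.y - start.y;
    const float dist = std::sqrt(dx * dx + dy * dy);

    // Each hop covers at most half the radius, so by the time the AI reaches
    // one waypoint, the navmesh around the next one has already been built.
    const float step = invokerRadius * margin;
    const int hops = std::max(1, static_cast<int>(std::ceil(dist / step)));

    std::vector<Vec2> waypoints;
    for (int i = 1; i <= hops; ++i) {
        const float t = static_cast<float>(i) / static_cast<float>(hops);
        waypoints.push_back({start.x + dx * t, start.y + dy * t});
    }
    return waypoints; // issue one move-to per waypoint, in order
}
```

For the 4500-unit example above with a 3000-unit radius, this yields three 1500-unit hops, each comfortably inside the navmesh that exists when the command is issued.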

There is an issue with the fact that the EQS is generating a straight line from point A to point B, and no tests can be used to score a better path. But, given the alternative, I feel that this is a good start, if not a great solution.

Animation and AI

With all of the modelling, materials, and rigging done, Suzy has finally started to move on her own. While the animation outline isn’t complete, there are always a few animations that you can be sure will be needed. An ‘Idle’, ‘Walk’, and ‘Run’ animation will always be needed for a character, so that is where I started. Once those were done, I moved on to the 1D blendspace that allows these animations to transition from one to another based on Suzy’s movement speed. While this is pretty standard, it felt like a big moment when Suzy was running around the test arena. With the basic blendspace set up, the animation Blueprint script could be started. This too was about as basic as it gets, but it still felt good to see her moving under her own, albeit very simple, control.
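For anyone new to blendspaces, the 1D case boils down to picking the two neighbouring animation samples for the current speed and computing a blend alpha between them. Here is a bare-bones sketch of that evaluation; the clip indices and sample speeds are made up for illustration, not values from Suzy’s actual blendspace.

```cpp
// A 1D blendspace reduced to its essentials: given the character's current
// speed and the speeds the animation samples sit at, pick the two
// neighbouring clips and a blend alpha between them.
struct BlendResult {
    int clipA;   // lower neighbour (e.g. 0 = Idle, 1 = Walk, 2 = Run)
    int clipB;   // upper neighbour
    float alpha; // 0 = fully clipA, 1 = fully clipB
};

BlendResult Evaluate1DBlendspace(const float* samples, int count, float speed) {
    // Clamp speeds outside the sampled range to the end clips.
    if (speed <= samples[0]) return {0, 0, 0.0f};
    if (speed >= samples[count - 1]) return {count - 1, count - 1, 0.0f};

    // Find the segment the speed falls into and interpolate linearly.
    for (int i = 0; i < count - 1; ++i) {
        if (speed < samples[i + 1]) {
            const float alpha = (speed - samples[i]) / (samples[i + 1] - samples[i]);
            return {i, i + 1, alpha};
        }
    }
    return {count - 1, count - 1, 0.0f};
}
```

Unreal’s blendspace asset does this (and much more) for you, but seeing the evaluation laid out this way made it clearer to me why the sample speeds you choose matter so much to how the transition feels.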

Much like the animation outline, the AI outline isn’t complete, but it is closer to being done than the animation outline. With a clearly defined expectation for what the AI needs to accomplish, Suzy’s AI should be much easier to create than if I had created the Behavior Tree and attempted to ‘wing it’. Preproduction planning will always pay off in the end. And, the AI outline will allow me to finish the animation outline with the confidence that I will have a definitive list of the animations needed in this project.

Being relatively inexperienced in Behavior Trees and AI creation in general, I thought it wise to complete one of the courses on AI featured on Epic’s learning portal. While I started “Introduction to AI with Blueprints” with some knowledge of Behavior Trees and the EQS system, the course was a big help. It was definitely worth the time that it took to properly study the material and go through the course. I had worked out a little of Suzy’s Behavior Tree before taking the course, but I had a suspicion that I wasn’t doing things correctly…and that suspicion turned out to be correct.

With Suzy’s animation Blueprint started, and her base animations and movement blendspace finished, I feel more prepared now to attempt to create Suzy’s AI. And creating the AI was the whole reason for this project to begin with.

The Wearing of Many Hats

If you have done game development for any length of time, you know that you will have to play many different roles as a project moves along. From texturing to coding, you may be required to do a little bit of everything. That is where I have been for the last few weeks.

Completing the control rig in Unreal’s Control Rig plugin has been a challenge, but it has been an interesting experience. I definitely have a lot to learn when it comes to rigging. There are technical artists that spend the majority of their time creating rigs, and it is as much an artform as any other discipline. Getting Suzy’s bone structure to animate properly was difficult, and compromises were needed, but I am happy to say that she is done*.

I began her first animation, her walk cycle, when I discovered a serious problem with her tail. I had used a certain type of IK system in Control Rig (FABRIK, in case you are curious), and while it is a great IK system, it just wasn’t even close to what I was looking for. So, back to the drawing board. I have been working hard to replicate the Rigify control rig available in Blender. While I haven’t achieved such a lofty goal, that shouldn’t be a surprise. But I was able to capture the overall feel of Rigify, if not the high-powered end result. The tail in particular was quite challenging, and I really wanted Suzy’s tail to behave the same way as Rigify’s. What I thought would take an hour or two turned into a full day’s work. You just never know with these things. But I was able to get it working and match the results I see in Blender. In some ways, this is unfortunate, because Suzy’s tail isn’t nearly as good as I had hoped it would be.

Weight painting is another part of rigging, and there are no unimportant parts to rigging. Getting the weight painting correct is a perfect example of this. No matter what I did, I couldn’t get her tail bones to deform the tail in a smooth way. No amount of weight blending would cure the problem. More bones in the tail seems like the only solution to this problem, but I am way too far along to change the bone structure now. I will just have to make it work, and remember this lesson in the future. You live and you learn.

The last hat that I have had to wear recently is that of a programmer. This isn’t something that concerned me. After all, I have more experience as a C++ programmer than with any other skill that I have used on these game development projects. But this situation was new. The developers of the gFur plugin decided to drop support for the free version. This means that they are not updating the plugin for the current version of the UE4 engine (4.26.0 at the time of writing), or any of the versions to follow. Luckily, they include the source code for the plugin, which allowed me to recompile it for the current version of the engine. There were a few little things that needed to be changed to satisfy some dependencies, but nothing difficult. A change to the plugin’s build file in VS2017, to point to the location where the FBX libraries and includes live on my system, was all that was necessary to get everything working nicely. Not a problem for an experienced programmer, but someone who doesn’t code would have had a hard time. It just goes to show how many hats you have to wear in game development.

* Is anything in game development really ever done? No, not really.

An Unreal Control Rig

After finally getting Suzy’s hair into a reasonable state, I have turned back to her rigging. Rigging a character is a challenging process, and there are people in the industry who work mainly, or solely, on rigging characters and creating animations with those rigs. I knew that this was going to be a difficult step in the ever-evolving process. So I devised a strategy for setting up Suzy, and if all went well (spoiler: it didn’t), we would use this process for all of the characters in our games.

Originally, animation was going to be performed in a combination of programs. For Suzy’s facial animations, I had planned on using Blender to take advantage of Rigify via the UE2Rigify plugin. That fell through when I discovered that the facial animation keys were not being baked to the source rig. No baked keys means no facial animations in the exported animation sequences. Not ideal, clearly.

For the ‘macro’ animations I had planned on using Cascadeur. This software not only allows you to rig your characters and animate them, but it also performs physics simulations to ensure that your animations are reasonably accurate. I was really excited about learning Cascadeur and animating Suzy with it. That fell through when I discovered that there is a limit of 128 bones per skeleton; Suzy’s skeleton has over 200. I thought I might be able to work around that by removing many of the bones in Suzy’s skeleton and using the simplified skeleton in Cascadeur. Not ideal, but it would have worked.

Not being able to animate Suzy’s face was a real show-stopper, though, and forced me to look into other technologies. That led me to take a closer look at ‘Control Rig’, a plugin developed by Epic that lets game developers fully rig and animate their characters directly in the Unreal editor. I had taken a brief look at it in the past, but foolishly dismissed it. “I’m going to use Blender or Cascadeur,” I thought to myself, “so what would I need this for?” On top of that, Control Rig was experimental, so there was no guarantee that Epic would continue to include the plugin with the editor. Definitely not ideal.

But, in desperation I turned back to Control Rig, and I am glad I did. After looking further into it, I can see that it will give us substantial benefits. Creating the rig IN the editor allows us to animate directly in the editor and save those animation sequences as separate assets. Because we are using the editor as a DCC tool, the scale and rotation issues that can crop up with external DCC software are a non-issue. But the biggest advantage of Control Rig is that a developer creating character assets to sell on the marketplace can include the rig along with the rest of the assets that make up the character. The customer will be able to create their own custom animations with the exact same rig used to generate the original animation sequences supplied with the asset pack. The deciding factor came when Epic’s developers stated in one of the live streams that Control Rig was not going to be dropped. It was being used in the creation of Fortnite, so it was safe to use in our workflow.

By using Control Rig, we will be sure that what we are seeing in our created animation sequences will be exactly what we will get. And, just as importantly, we will be able to supply our marketplace customers with a fully defined rig directly in the editor. That rig will be self-contained in the asset pack, which means we will not have to support rigging the character in multiple (external) DCC packages. The customer will get a full-featured rig and we will only have to support a single DCC package. Now, that is ideal.

Fur for Daze

Hair has dominated my thoughts over the last month or more. It has been a seemingly endless stream of information on different ways to represent hair, and all of the technological requirements to do so.

But finally the work on Suzy’s fur is done*. After doing more research, I discovered that the high triangle counts I was worried about were due to the way UE4 renders objects in passes. While I knew this already, I hadn’t thought about the consequences of that fact. The UV seams of a mesh also contribute to the overall triangle/vertex count of an object. This is why I was seeing drastically different triangle numbers for Suzy. I was expecting the roughly 60K triangle count reported by any DCC package, as well as by the information overlay in the UE4 editor while viewing the skeletal mesh. But when rendering in an actual scene, all of the multi-pass rendering overhead is accurately reported in the RHI stats.

With that explanation out of the way, we can move on to the results. Below are cropped images of both versions of Suzy. The first two are showing the results of the free gFur plug-in, with a brief description under each. The last two show the results of the new hair and fur system in Unreal.

Fig. 1) A front view of Suzy with the free gFur plug-in. The relatively low number of shells (16) for the close-up views leave some ‘stepping’ in the frill around her ears. But overall, the results are extremely good for the low-end option.
Fig. 2) A side view of Suzy shows off the free gFur plug-in and its ability to fairly accurately represent the flow of hair from the groom that was imported from Houdini. The ‘stepping’ around the ears, while tolerable, is the only area that shows a compromise in quality. We are very happy to have this result for our players to choose as the low-end option.
Fig. 3) Suzy sporting her fur in the new hair system in Unreal. The hairs that are part of what I have come to think of as her ear frill appear to be a bit wiry despite repeated efforts to make them slightly thinner. I have come to the conclusion that it is a combination of the hair color and the relatively low specular highlights that make it appear that way.
Fig. 4) Here we can see a side view of Suzy to better show off the highlighting and shadowing of the new hair system. What may be difficult to see is the addition of some random gray hairs throughout Suzy’s coat. This isn’t possible in the gFur system, but adds a nice, subtle effect.

While I am very happy with the results of the free gFur plug-in, there is a downside to using it in conjunction with the new Unreal hair system. UE4’s hair system allows the developer to provide a static mesh object to represent the hair cards to use with systems that can’t handle the hair system itself. Also, the new hair system appears to revert to the hair cards at certain LOD levels to save on system resources. But, the gFur system uses skeletal mesh objects as the fur emitter. This eliminates the possibility of passing the gFur asset to the hair system as an alternative. This is a shame too, because I believe that, at the proper distances, the transition between the new hair system and gFur (or its equivalent) would be nearly seamless. But without altering the engine code to accept and use the skeletal mesh, there isn’t much that can be done.

So in closing, these are the results that we will go with for this asset. Some of the limitations force a specific workflow onto us, but there isn’t much that can be done about that. The assets are, in my humble opinion, pretty good and should give us the visual fidelity that we are after. Now we are on to actually finishing the rigging and animating Suzy. Let’s get her moving!

* If we have learned nothing else, it is that “You never know”.

The Many Choices for Fur

Hair is a critical aspect of developing a character. Whether it be the protagonist, a villain, some random person on the street, or a dog in an alley, hair is everywhere. The importance of getting this right is reflected in the number of solutions that exist for this problem. I knew from the beginning that Suzy’s fur was going to take quite a bit of time to develop, but I had no idea it was going to consume this much development time.

From hair cards, to shell and fin techniques, to attempts at using parallax occlusion mapping, I have experimented with each approach to get the highest visual fidelity while maintaining acceptable performance levels.

Unreal’s new hair and fur system, which is featured in the image for this post, is the direction I knew we wanted to go in the long run. This system allows developers to create near photo-realistic hair in real time…with an experienced artist at the wheel, of course. My results are not quite there (some of the fur still looks a bit wiry), but I still have room for some fine tuning to get better results than those seen above. But there is a catch to Unreal’s hair system: it is pretty demanding on the hardware, so an alternative is required for lower-spec systems. That is where the rabbit hole started.

You must provide an alternative character mesh for Unreal to show the player when their computer is not capable of running the hair and fur system. In my quest to do this, I started to create hair cards in Houdini, only to see the triangle counts ballooning. Not only that, but hair cards have static textures for the hair color, among other things, so it was going to be very hard to get Suzy’s hair color right without an enormous amount of work. Seeing this technique’s limitations, I chose to look elsewhere first.

Parallax occlusion mapping (or POM for short) seemed like it may be a fairly lightweight way of providing a hair material for Suzy. But after creating the texture sets needed and bringing everything into Unreal I quickly realized that POM wasn’t the answer. It looked awful.

Next up was the shell and fin technique that has been used with a high degree of success by others. After hunting around I found gFur on the Unreal Engine Marketplace. There is a free version that lacks grooming tools and some other quality-of-life features. But what do I need groom tools for? I have Houdini with a groom already prepared. I will spare you the pain, but getting that groom out of Houdini in a way that gFur will use wasn’t a turn-key solution, that’s for sure. You can see the results below:

Fig. 1: Suzy with the gFur plugin generating the shells and fins for her fur.

While the results from gFur look fairly good, and would look even better once everything was dialed in, there is a secret that the above image is hiding. At this distance from Suzy, there are over 800,000 triangles from the shells that are extruded and textured to fake the hair. Over 800,000! That is without the hair cards around her face to blend everything together nicely, and those cards would easily be another 70,000-100,000 triangles. Maybe I am doing something wrong, or perhaps this won’t have the impact on performance that high triangle counts used to impose. But this doesn’t feel like the solution I had hoped for, so back to the hair cards I went.

As stated above, one of the major problems with using hair cards for Suzy is her hair color. Or should I say, colors. With characters whose hair color changes from one part of the groom to another, it is a challenge to create texture sets that work correctly without generating many large textures to represent the subtle color variations. That is a lot of work, and it eats away at VRAM that is certainly needed elsewhere. My solution was to take the color values from the hair groom created for export to Unreal and ‘push’ those values into the hair cards’ vertex colors. This was very challenging because I am still very new to Houdini and hadn’t worked with VEX before. VEX is a C-like language that can be used in Houdini to write snippets of code to automate a task or change some values programmatically. With that, I was able to get the groom’s colors and place those values onto the cards. So, if the hair color texture changes, no extra work will need to be done for the hair cards or the Unreal hair groom; a simple re-export is all that will be necessary. You can see a cropped image of the hair cards below:
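The VEX itself is Houdini-specific, but the transfer it performs is easy to describe in plain C++: for each hair-card vertex, find the nearest groom point and copy its color. Here is a brute-force sketch of that idea; this is my own illustration of the technique, not the actual VEX snippet.

```cpp
#include <cfloat>
#include <cstddef>
#include <vector>

struct P3 { float x, y, z; };
struct Color { float r, g, b; };

inline float Dist2(const P3& a, const P3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// For every hair-card vertex, copy the colour of the nearest groom point.
// Brute force is fine for a one-off export step; a spatial hash would help
// with very dense grooms.
std::vector<Color> TransferColors(const std::vector<P3>& cardVerts,
                                  const std::vector<P3>& groomPoints,
                                  const std::vector<Color>& groomColors) {
    std::vector<Color> out(cardVerts.size());
    for (std::size_t v = 0; v < cardVerts.size(); ++v) {
        float best = FLT_MAX;
        std::size_t bestIdx = 0;
        for (std::size_t g = 0; g < groomPoints.size(); ++g) {
            const float d = Dist2(cardVerts[v], groomPoints[g]);
            if (d < best) { best = d; bestIdx = g; }
        }
        out[v] = groomColors[bestIdx]; // card vertex inherits the groom colour
    }
    return out;
}
```

Because the cards sample the groom rather than carrying their own baked color textures, any change to the source hair color propagates automatically on the next export.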

Fig. 2: Suzy’s hair cards complete with vertex colors that can be used in Unreal to tint the hair texture. The hair color texture is applied to the mesh, and then ‘pushed’ to the hair groom itself (not shown). Then, the hair groom’s color values are ‘pushed’ onto the hair cards themselves, so the cards truly reflect the groom’s colors.

The image above shows Suzy with her hair cards without any type of material applied. The color that is seen is driven entirely by the vertex colors on the individual hair cards. With those vertex colors, I should be able to tint the base color values in Unreal’s material editor to get a nice transition of colors without needing many different texture maps to achieve a similar effect. The downside of this approach is the number of triangles. The above hair-card groom is around 650,000 triangles. This can be controlled through level-of-detail (commonly known as LOD) meshes, which Unreal does a great job of creating for us, but that is still a high number. I have no idea what effect this will have on performance.

The moral of the story is that hair is a challenging subject to take on. No matter what we do, there will be performance trade-offs that have to be made. And artistic trade-offs as well.

Capuchin Color

In a quick update on Suzy’s progress, you can see in the screenshot that her texturing is moving along. After working on the rigging for so long, a break was required. You can only stare at the bone structure of a character for so long before you just need a change of pace.

The updates to Trello haven’t been as disciplined as they should be. The whole point is to document the process of making this game, so flopping from one thing to another isn’t the best strategy.

As far as Suzy’s texturing is concerned, it is clear why Harrison Moore, one of Epic’s technical artists, states that you need to get work out of Substance Painter and into UE4’s editor as quickly as possible. The rendering system of Painter is far different from that of UE4, so getting something up in UE4’s editor quickly is vital. For more information on what I am referring to, you can watch this YouTube video. He can explain this much better than I could.

Using a color-calibrated lighting environment is very important. I was viewing Suzy in the default level for the third-person template, and her facial skin color was way off. Her skin color was much pinker than it appeared in a color calibrated level.

The specular highlights of her skin still need some work. The skin on her body should not be nearly as shiny as it is, and there is a little bit of tweaking that needs to be done on her facial skin. But I feel like this part of her development is going well (disregard her hair; that is a work in progress).

By using the approach that was used in Paragon, we can get a level of control that we wouldn’t be able to easily achieve otherwise. For example, the number of pores on her skin. I can very easily dial in more, or less, pores based on the look I need. On top of that, I can control how pronounced the pores are. Should they be deep pores, or fairly shallow? It is just a matter of changing a few numbers to see, in real-time, what the difference between the two are. It is clear why Epic chose this strategy when developing the Paragon characters.

Once the process for creating characters using the Paragon techniques and the new Hair Simulation system is clear, turn-around time for characters should be much, MUCH faster.

Rigging is Hard

I am in the process of creating Suzy, the star Capuchin monkey of the game we have in development. She needs to have accurate movement and the ability to make complex facial animations (Capuchin monkeys are very animated, no pun intended). The rig needed to animate her is very complex, and is the most challenging rig I have ever attempted. This isn’t saying much, because I am an inexperienced technical artist and a terrible animator. Encouraging words, I know.

This has been a long learning process overall. Learning to create the hair effects has been challenging, but fun. It took more time than I anticipated, but that is to be expected when you have to learn not one, but two separate hair systems (Blender’s and then Houdini’s). But when it comes to creating the rig, I have essentially jumped head first into the deep end of the pool.

The skeletal structure of the bones needs to be defined so that a wide range of movement is possible without severely distorting the mesh in unrealistic ways. To that end, twist bones are used to make sure that the mesh doesn’t collapse on itself when the arms or legs are moved. There isn’t nearly as much information on the correct usage of twist bones for rigging as there is for modeling, so some trial-and-error was necessary.

Hopefully the bone structure that I have defined will be suitable for a wide variety of animations because we plan on selling this asset on the Unreal Engine Marketplace. If Suzy were just going to be used in this game alone, I wouldn’t worry so much about her rigging…I could rig her for our specific use-case. But once she is ‘in the wild’ there is no telling what a developer might do with her.

Trello Board

To try to organize this project more, we have created a Trello board. Being new to formal project management software, and its philosophies, I didn’t follow the standard practices of using a system like Trello. So, if the structure of the board changes dramatically from one viewing to another, it is because I finally got around to re-organizing it to follow proper project management practices.

But, this has helped immensely in organizing the steps required to get the work done. This forces you to think through all of the steps required for the project. Some aspects of game development may not be as familiar to you…I know that there is a lot that I have little experience with. This makes you really think about what is going to be required to complete the project. While this may sound boring at first, it was (and is) exciting to me. I really got a better sense of what it was going to take to make this happen.

On top of organization, it really helps in terms of motivation. The project doesn’t feel so overwhelming when it is broken down into smaller steps. These steps can be completed within a day or two, giving a real sense of accomplishment and progress.

If you haven’t tried Trello, or a similar system, give it a try. It’s free for personal use (which is what this board is). But make sure to do a little research on the various approaches used in project management; it can save you a bit of time. Trello seems very straightforward and has the tools to help organize any project. If not Trello, then any project management system would be helpful.

By the way, this post was NOT sponsored by Trello. I just think this is a really nice system that is easy to use.