The Path Most Followed

Well, I have just spent the better part of the last week working on a custom EQS generator that creates a cone-shaped graph of points and then uses the A* algorithm to plot a path through that graph. Having virtually no experience in this field of programming, I had no right to hope that I could produce anything of value, but thanks to Red Blob Games' excellent articles on the subject I was able to implement it in Unreal.

To say that this was challenging (for me, at least) is a massive understatement. While I am an experienced C++ programmer, this field of programming is very demanding. I am very happy with the results I was able to achieve from studying the code examples and explanations on the Red Blob Games site. If you have any interest in how pathfinding works in games or other applications, you owe it to yourself to read every word on that site.

My implementation uses a cone as the shape of the graph to generate the initial points. Next come the line traces to find all of the blocking volumes that may have points generated within them. These need to be marked as ‘wall’ points so the A* algorithm can plot a path around them. There is also terrain costing, so that path generation can take the type of terrain into account while plotting through the graph. While this hasn’t been implemented in my generator yet, there is a cost function that is already being called. It just returns a value of 1 for every point in the graph, but later a data table can be added to allow different terrain types to cost different amounts. This will require some rewriting of the code, because at the moment the generator isn’t doing any line traces to the landscape underneath each graph point to find the material at that location. This approach may not be possible for a variety of reasons, especially considering Unreal’s landscape material architecture, but even if that is the case, there is always a way to do something if you’re really determined.
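To make the shape of this concrete, here is a minimal A* sketch in the spirit of the Red Blob Games articles. It is a plain standalone C++ toy on a small grid (not the actual Unreal EQS generator code): cells flagged as walls are impassable, and `Cost()` is the flat-1 placeholder described above, the hook where a data table of terrain costs could later plug in.

```cpp
#include <algorithm>
#include <cstdlib>
#include <map>
#include <queue>
#include <set>
#include <vector>

// A graph point. operator< lets Cell be a std::map/std::set key.
struct Cell {
    int x, y;
    bool operator<(const Cell& o) const { return x != o.x ? x < o.x : y < o.y; }
    bool operator==(const Cell& o) const { return x == o.x && y == o.y; }
};

// Placeholder terrain cost: every point costs 1 for now. A data table of
// terrain types would later replace this flat value.
int Cost(const Cell&) { return 1; }

// Manhattan-distance heuristic for 4-way grid movement.
int Heuristic(const Cell& a, const Cell& b) {
    return std::abs(a.x - b.x) + std::abs(a.y - b.y);
}

// A* search that routes around 'wall' points; returns the path start..goal,
// or an empty vector if the goal is unreachable.
std::vector<Cell> FindPath(const Cell& start, const Cell& goal,
                           int width, int height,
                           const std::set<Cell>& walls) {
    using Entry = std::pair<int, Cell>;  // (priority, cell)
    auto cmp = [](const Entry& a, const Entry& b) { return a.first > b.first; };
    std::priority_queue<Entry, std::vector<Entry>, decltype(cmp)> frontier(cmp);
    std::map<Cell, Cell> cameFrom;
    std::map<Cell, int> costSoFar;

    frontier.push({0, start});
    costSoFar[start] = 0;

    while (!frontier.empty()) {
        Cell current = frontier.top().second;
        frontier.pop();
        if (current == goal) break;

        const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
        for (int i = 0; i < 4; ++i) {
            Cell next{current.x + dx[i], current.y + dy[i]};
            if (next.x < 0 || next.y < 0 || next.x >= width || next.y >= height)
                continue;
            if (walls.count(next)) continue;  // 'wall' points are impassable
            int newCost = costSoFar[current] + Cost(next);
            if (!costSoFar.count(next) || newCost < costSoFar[next]) {
                costSoFar[next] = newCost;
                frontier.push({newCost + Heuristic(next, goal), next});
                cameFrom[next] = current;
            }
        }
    }

    // Walk back from goal to start to recover the path.
    std::vector<Cell> path;
    if (!cameFrom.count(goal) && !(start == goal)) return path;  // unreachable
    for (Cell c = goal; !(c == start); c = cameFrom[c]) path.push_back(c);
    path.push_back(start);
    std::reverse(path.begin(), path.end());
    return path;
}
```

Because the heuristic never overestimates the remaining distance on a uniform-cost grid, the first time the goal is popped the path is optimal, which is why the flat `Cost()` of 1 still produces sensible routes around walls.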

Having a generator that creates paths avoiding blocking volumes is crucial, because my previous EQS generator simply created its points along a straight line from Suzy to the target. It is crude, but effective in some instances…except where it sends Suzy running through the ocean. Obviously, this isn’t ideal. In the image for this post, you can see the EQS generator has created a short path for Suzy to follow. This image doesn’t show off what the generator can really do, but it does show it in action. As an aside, the image also shows off the new female mannequin that Epic has given the community. The hair groom was a quick job in Blender to test out export settings to finally get a groom out of Blender. The tutorial is by Marvel Master and does what it claims to.

It has taken a lot of work to create this EQS generator, and while it isn’t perfect, it works very well and allows Suzy to run from one side of the test island all the way to the other. I don’t know how far that is in-game, but it has to be at least a kilometer and is likely more. This generator isn’t just a great asset to this project; it is now a permanent tool in our toolkit, and I have no doubt that the time spent creating it will be paid back ten-fold in the future. Hard work pays off.

Coming to Our Senses

It has been quite a while since the last post here, and that is because work has been moving forward on Suzy’s AI, along with some ancillary code development for the AI Perception system and the Environment Query System (or EQS for short). Also, as can be seen in the image for this post, a test level was constructed to better represent the conditions the AI will need to operate in. This gave me a much better idea of how this AI will perform “in the wild”, as some like to say.

Suzy’s AI has come a long way, and it is now close to matching what the AI outline describes. Whether it will be sufficient to make the game challenging enough remains to be seen, but I am encouraged by the progress. With a much better understanding of all of the moving pieces in the Behavior Tree/Blackboard approach to AI design, I have been able to build up a reasonably intelligent AI that wanders looking for fruit to pick up. But once the AI reaches a predefined level of frustration, it will seek out the player and follow them in the hope of stealing a piece of fruit that the player may lead it to.

To help Suzy find fruit more easily, and to make the AI more challenging for the player, a new sense had to be created for UE4’s AI Perception system: smell. With a sense of smell, the AI doesn’t have to actually see a piece of fruit to find it. This sense respects not only the direction of the wind, but also its intensity. By taking the wind vector used in the newer atmosphere system’s material and converting it into a material parameter collection, the wind’s values can be piped into the perception system. In this way, the player gets a visual cue as to how, or why, the AI can sense them even when they remain unseen by the AI. It isn’t perfect by any means, but I feel that it is a great addition.
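The core of a wind-aware smell check can be sketched in a few lines. This is a simplified standalone illustration, not the actual UE4 sense implementation; the names (`CanSmell`, `baseRange`, etc.) are made up for the example. The idea: a scent source is only detectable when the sniffer is downwind of it, and stronger wind carries the scent farther.

```cpp
#include <cmath>

// Minimal 2D vector for the sketch (Unreal's FVector plays this role there).
struct Vec2 { float x, y; };

float Dot(const Vec2& a, const Vec2& b) { return a.x * b.x + a.y * b.y; }
float Length(const Vec2& v) { return std::sqrt(Dot(v, v)); }
Vec2 Normalize(const Vec2& v) { float l = Length(v); return {v.x / l, v.y / l}; }

// windDir: normalized direction the wind blows toward.
// windIntensity: scales how far the scent carries.
// Returns true when the scent from 'source' can reach 'sniffer'.
bool CanSmell(const Vec2& source, const Vec2& sniffer,
              const Vec2& windDir, float windIntensity, float baseRange) {
    Vec2 toSniffer{sniffer.x - source.x, sniffer.y - source.y};
    float dist = Length(toSniffer);
    if (dist < 1e-4f) return true;  // standing on the source

    // How well the wind carries scent toward the sniffer (-1..1):
    // 1 = directly downwind, <= 0 = upwind, so no scent arrives.
    float carry = Dot(Normalize(toSniffer), windDir);
    if (carry <= 0.f) return false;

    // Stronger wind and better alignment both extend the effective range.
    return dist <= baseRange * windIntensity * carry;
}
```

In the actual game the wind direction and intensity would come from the material parameter collection mentioned above, so the perception check and the visual wind cue stay in sync.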

Finally, in this game Suzy is using a NavMesh Invoker to create a dynamic navigation mesh around her everywhere she goes. This is much better than trying to create a huge navmesh that encompasses the entire level. At best, that would be very time-consuming during development due to the need to rebuild a huge navmesh whenever objects are moved in the level. At worst, the navmesh may be too big to generate at all, which would require an entirely new set of systems to be brought into the project (such as level streaming).

With a navmesh invoker, we can eliminate these issues. But, and you knew that ‘but’ was coming, navmesh invokers present their own set of issues. The largest is that the AI can’t be given a target location to move to if that location is outside of its generated navmesh. For example, if the AI’s navmesh has a radius of 3000 units (the default) and you specify a location that is 4500 units away, that ‘move-to’ command will simply fail. The location is unreachable because the AI can’t build a path from where it is to where you are directing it to go. A solution that is still being developed is to use the EQS to generate an array of vectors from the AI’s current location to the target location. This will require multiple ‘move-to’ commands to go from the start to the end of the path, but hopefully it will mitigate, if not eliminate, this problem.

There is still the issue that the EQS is generating a straight line from point A to point B, and no tests can be used to score a better path. But, given the alternative, I feel that this is a good start, if not a great solution.