It has been quite a while since the last post here, and that is because work has been moving forward on Suzy’s AI, along with some ancillary code development for the AI Perception system and the Environment Query System (or EQS for short). Also, as can be seen in the image for this post, a test level was constructed to better represent the conditions the AI will need to operate in. This gave me a much better idea of how this AI will perform “in the wild”, as some like to say.
Suzy’s AI has come a long way, and it is now close to being implemented to the point that the AI outline describes. Whether it will be sufficient to make the game challenging enough remains to be seen, but I am encouraged by the progress. With a much better understanding of all of the moving pieces in the Behavior Tree/Blackboard approach to AI design, I have been able to build up a reasonably intelligent AI that wanders the level looking for fruit to pick up. But once the AI reaches a predefined level of frustration, it will seek out the player and follow them, in the hope of stealing a piece of fruit that the player may lead it to.
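Under the hood, the frustration mechanic can be as simple as a float key on the Blackboard that gets nudged upward while the search branch runs, with a Blackboard decorator higher in the tree flipping over to the player-following branch once the value crosses a threshold. Below is a minimal sketch of one way to wire that up as a Behavior Tree service; the class and key names (UBTService_Frustration, FrustrationKey) are illustrative, not necessarily what the project actually uses.

```cpp
// BTService_Frustration.h -- a minimal sketch with illustrative names.
#pragma once
#include "CoreMinimal.h"
#include "BehaviorTree/BTService.h"
#include "BTService_Frustration.generated.h"

UCLASS()
class UBTService_Frustration : public UBTService
{
    GENERATED_BODY()
protected:
    virtual void TickNode(UBehaviorTreeComponent& OwnerComp,
                          uint8* NodeMemory, float DeltaSeconds) override;

    // Float Blackboard key holding the current frustration level.
    UPROPERTY(EditAnywhere, Category = "Blackboard")
    FBlackboardKeySelector FrustrationKey;

    // How quickly frustration builds while searching, per second.
    UPROPERTY(EditAnywhere, Category = "Blackboard")
    float FrustrationPerSecond = 1.0f;
};

// BTService_Frustration.cpp
#include "BTService_Frustration.h"
#include "BehaviorTree/BlackboardComponent.h"

void UBTService_Frustration::TickNode(UBehaviorTreeComponent& OwnerComp,
                                      uint8* NodeMemory, float DeltaSeconds)
{
    Super::TickNode(OwnerComp, NodeMemory, DeltaSeconds);

    if (UBlackboardComponent* BB = OwnerComp.GetBlackboardComponent())
    {
        // Frustration builds while the "search for fruit" branch is
        // active; a Blackboard decorator can switch branches once the
        // value crosses a designer-set threshold.
        const float Current = BB->GetValueAsFloat(FrustrationKey.SelectedKeyName);
        BB->SetValueAsFloat(FrustrationKey.SelectedKeyName,
                            Current + FrustrationPerSecond * DeltaSeconds);
    }
}
```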
To help Suzy find fruit more easily, and to make the AI more challenging for the player, a new sense had to be created for UE4’s AI Perception system: Smell. With a sense of smell, the AI doesn’t have to actually see a piece of fruit to find it. This sense of smell respects not only the direction of the wind, but also its intensity. By taking the wind vector used in the newer atmosphere system’s material and exposing it through a material parameter collection, the wind’s values can be piped into the perception system. In this way, the player gets a visual cue as to how, or why, the AI can sense them even while they remain unseen by it. It isn’t perfect by any means, but I feel that it is a great addition.
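A full custom sense means subclassing UAISense and handling all of its listener bookkeeping, which is too much boilerplate to show here. The core idea, though, is a downwind test like the sketch below: read the same wind vector the sky material uses out of the parameter collection, then let it stretch or shrink the smell radius. The collection and parameter names (WindParams, WindVector) are assumptions, and the falloff math is illustrative rather than the project’s exact formula.

```cpp
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Returns true if a smell emitted at SourceLocation could reach a sniffer
// at SensorLocation, given the wind currently driving the sky material.
static bool CanSmellSource(UObject* WorldContextObject,
                           UMaterialParameterCollection* WindParams,
                           const FVector& SourceLocation,
                           const FVector& SensorLocation,
                           float BaseSmellRadius)
{
    // Read the same wind vector the atmosphere material uses; its length
    // encodes the wind's intensity.
    const FLinearColor Raw = UKismetMaterialLibrary::GetVectorParameterValue(
        WorldContextObject, WindParams, TEXT("WindVector"));
    const FVector Wind(Raw.R, Raw.G, Raw.B);

    const FVector ToSensor = SensorLocation - SourceLocation;
    const float Distance = ToSensor.Size();
    if (Distance < KINDA_SMALL_NUMBER)
    {
        return true; // Standing right on top of the fruit.
    }

    // Positive when the sensor sits downwind of the source: a stronger
    // wind carries the smell farther that way and shortens it upwind.
    // In practice this term would be scaled into world units.
    const float DownwindBoost = FVector::DotProduct(Wind, ToSensor / Distance);

    return Distance <= BaseSmellRadius + DownwindBoost;
}
```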
Finally, in this game Suzy is using a NavMesh Invoker to create a dynamic navigation mesh around her wherever she goes. This is much better than trying to create a huge navmesh that encompasses the entire level. At best, that would be very time consuming during development due to the need to rebuild a huge navmesh whenever objects are moved in the level. At worst, the navmesh may be too big to generate at all, which would require an entirely new set of systems to be brought into the project (such as level streaming).
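For reference, hooking an invoker up is mostly a matter of adding the component to the pawn and enabling invoker-driven generation in the project’s navigation settings. Here is a minimal sketch, assuming a character class named ASuzyCharacter and the “NavigationSystem” module in the project’s Build.cs; note that the SetGenerationRadii setter only exists in more recent engine versions, while older ones expose the radii as editable properties on the component instead.

```cpp
// Requires "Generate Navigation Only Around Navigation Invokers" in
// Project Settings > Navigation Mesh, with Runtime Generation set to
// Dynamic on the RecastNavMesh actor.
#include "NavigationInvokerComponent.h"

ASuzyCharacter::ASuzyCharacter() // assumed class name
{
    UNavigationInvokerComponent* Invoker =
        CreateDefaultSubobject<UNavigationInvokerComponent>(TEXT("NavInvoker"));

    // Generate tiles within 3000 units of Suzy (the default radius), and
    // remove them once they fall more than 3500 units behind her.
    Invoker->SetGenerationRadii(/*TileGenerationRadius=*/3000.0f,
                                /*TileRemovalRadius=*/3500.0f);
}
```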
With a navmesh invoker, we can eliminate these issues. But, and you knew that ‘But’ was coming, navmesh invokers present their own set of issues. The largest is that the AI can’t be given a target location to move to if that location lies outside of its generated navmesh. For example, if the AI’s navmesh has a radius of 3000 units (the default) and you were to specify a location that is 4500 units away, that ‘move-to’ command would simply fail. The location is unreachable to the AI because it can’t build a path from where it is to where you are directing it to go. A solution that is still being developed is to use the EQS to generate an array of intermediate points between the AI’s current location and the target location. This will require multiple ‘move-to’ commands to go from the start to the end of the path (see the sketch below), but hopefully it will mitigate, if not eliminate, this problem.
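The chaining itself is straightforward: issue one request at a time and advance to the next waypoint when the previous move completes. Below is a hypothetical controller-side sketch; ASuzyAIController and its Waypoints (TArray&lt;FVector&gt;) and CurrentWaypoint (int32) members are placeholder names, the EQS query that produces the points is not shown, and each waypoint is assumed to fall within the invoker’s radius of the one before it.

```cpp
#include "AIController.h"
#include "Navigation/PathFollowingComponent.h"

void ASuzyAIController::FollowWaypoints(const TArray<FVector>& InWaypoints)
{
    Waypoints = InWaypoints;
    CurrentWaypoint = 0;
    MoveToNextWaypoint();
}

void ASuzyAIController::MoveToNextWaypoint()
{
    if (Waypoints.IsValidIndex(CurrentWaypoint))
    {
        // Each hop stays inside the dynamically generated navmesh, so the
        // request can actually find a path.
        MoveToLocation(Waypoints[CurrentWaypoint], /*AcceptanceRadius=*/100.0f);
    }
}

void ASuzyAIController::OnMoveCompleted(FAIRequestID RequestID,
                                        const FPathFollowingResult& Result)
{
    Super::OnMoveCompleted(RequestID, Result);

    if (Result.IsSuccess() && Waypoints.IsValidIndex(CurrentWaypoint))
    {
        ++CurrentWaypoint;
        MoveToNextWaypoint(); // Chain the next 'move-to' until the end.
    }
}
```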
One remaining issue is that the EQS is generating a straight line from point A to point B, and no tests are being used to score a better path. But, given the alternative, I feel that this is a good start, if not a great solution.