AI systems using Unreal Engine 4: part 2

Jun 03, 2020 at 10:00 am by nemirc


Welcome to the second part of this series. Previously, I briefly explained the different components that make up UE4’s AI system. This time, I will explain a little bit more about the Blackboard and the Behavior Tree.

First of all, maybe I should explain what kind of AI I am making. The design of your AI depends on the kind of game you are making, and an AI for an action game will be different from an AI for a stealth or horror game. “Just Let Me Go” is a horror game that puts a certain emphasis on stealth, so I want my AI to focus on that. I watched a good amount of AI videos and decided to design an AI more or less based on the explanations given in the first half of this video: https://www.youtube.com/watch?v=Ay-5g36oFfc

As I mentioned before, my stalker is able to see and hear the player, but it can also hear objects. That last ability is important in case I want to use noise to distract the AI. I have variables that tell whether the AI has seen or heard the player, and variables that tell the AI where the player was seen/heard. You can see them (and others that I will explain later) on my blackboard below:

The idea is this (using the functions I explained in my previous article): when the AI gets a stimulus from the AIPerception component, I can extract different information from that stimulus, including whether or not the player was sensed, the stimulus type, and the stimulus location. I output two variables: one that returns true if the player was sensed, and another that gives the player’s location when it was sensed.
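To make the data flow concrete, here is a minimal, engine-independent sketch of that stimulus-to-blackboard step. The struct and function names (`Stimulus`, `PerceptionResult`, `HandleStimulus`) are my own stand-ins, not UE4 types; in the engine this logic would live in the AIPerception event handler.

```cpp
// Minimal stand-ins for the data a perception stimulus carries.
// (Hypothetical names; the real UE4 type is FAIStimulus.)
struct Vector3 { float X, Y, Z; };

struct Stimulus {
    bool bSensed;       // was the target successfully sensed?
    Vector3 Location;   // world location where the stimulus occurred
};

// The two values I write to the blackboard: whether the player was
// sensed, and where it was sensed.
struct PerceptionResult {
    bool PlayerSensed;
    Vector3 SensedLocation;
};

PerceptionResult HandleStimulus(const Stimulus& S) {
    PerceptionResult R;
    R.PlayerSensed   = S.bSensed;
    R.SensedLocation = S.Location;  // only meaningful when sensed
    return R;
}
```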

Before I continue, I need to say that the stalker needs a navigation mesh to move. You create one with a NavMeshBoundsVolume: just drop the volume into the level, resize it, and you’re done. Now that I got that out of the way, let’s continue. Below you can see my Behavior Tree. It has four main branches that define the behavior of the character depending on different factors. The branches are (from left to right): idle, seen player, heard player, and heard object.

Right now, I will tell you how I designed the second branch (I’m skipping the first one since I’m still working on it). But first, I need to explain what my main node (the one below the ROOT) is doing. That first node has a Service (made in Blueprints) I’m using to check different variables. It checks if the player is seen or heard (and the corresponding location of that stimulus, as explained above), or if an object is heard (also with its location). It also checks whether the player is within attack distance and what the last stimulus type was (I use this for the idle branch I’m working on).

Branches on a Behavior Tree are executed based on the result of one or more Decorators (they are pretty much “if statements”). I am using the first node to get the state of the variables described above, and then I add Decorators to the main node of every branch, so the game knows which branch to execute. For the second branch, I am using a Blackboard Decorator that checks if PlayerSeen is true. If it is, the branch executes.
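The “Decorators are if statements” idea can be sketched in plain C++ like this. This is not the UE4 API, just a model of how a Selector picks the first branch whose Decorator passes; the branch names and the idle fallback ordering are illustrative assumptions.

```cpp
#include <functional>
#include <string>
#include <vector>

// Blackboard flags my Service keeps up to date.
struct Blackboard {
    bool PlayerSeen  = false;
    bool PlayerHeard = false;
    bool ObjectHeard = false;
};

// A Decorator is just a predicate over the blackboard.
struct Branch {
    std::string Name;
    std::function<bool(const Blackboard&)> Decorator;
};

// A Selector runs the first branch whose Decorator passes.
std::string SelectBranch(const Blackboard& BB,
                         const std::vector<Branch>& Branches) {
    for (const Branch& B : Branches)
        if (B.Decorator(BB))
            return B.Name;
    return "Idle";  // nothing sensed: fall back to the idle branch
}
```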

On that node you can also see another Service. I am using that Service to check the current value of the AlertLevel variable. The idea is this: rather than just swapping from idle to full alert mode on sight, I want to give players a few seconds to react while the stalker increases its “alert level” from its current value to 3 (this also means the stalker will go into full chase mode if it’s already at level 3). This node branches into two nodes: one that executes if the alert level is less than 3, and another that executes if it’s greater than or equal to 3.

If AlertLevel is less than 3, this branch executes. However, this one uses a “Sequence” node rather than a Selector. This is what the branch does: first, it waits for 0.5 seconds; then, if the stalker is farther away than the attack range, it starts increasing the alert level as it slowly moves toward the player (at walking speed).
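The key difference between the two composite nodes can be sketched as follows: a Sequence aborts on the first failing child, while a Selector stops at the first succeeding one. This is a plain C++ model of the Sequence semantics, not engine code.

```cpp
#include <functional>
#include <vector>

// A behavior tree child node reduced to "run and report success".
using Node = std::function<bool()>;

// A Sequence runs children in order and fails as soon as one fails
// (a Selector, by contrast, succeeds as soon as one succeeds).
bool RunSequence(const std::vector<Node>& Children) {
    for (const Node& Child : Children)
        if (!Child())
            return false;  // a failed child fails the whole sequence
    return true;
}
```

In my branch the children would be, in order: the 0.5-second wait, the “farther than attack range” check, and the move-while-raising-alert step; the distance check failing is what stops the approach.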

You can also see there’s a Service I’m using to increase the alert level. That Blueprint gradually raises the alert level until it reaches “full alert” (level 3), at which point the stalker switches into chase mode. Now I will explain the Blueprint. The “Event Receive Activation” node is the Service’s equivalent of “Event Begin Play.” First, I cast the owner to Stalker_AIController. The cast result feeds into a “Do Once” node (I didn’t really need it, but I use it as an extra check to make sure the flow works correctly), and both the Do Once and the cast output connect to a custom function called IncreaseAlert (I created this function in my Stalker_AIController, which is why I perform the cast first; I will explain it in a bit). I also read AlertLevel from the cast result and plug it, together with the output of IncreaseAlert, into a “Set Blackboard Value as Int” node so the Blackboard is updated (eventually causing a switch to another branch of the Behavior Tree). Lastly, a Delay node loops back to the start of the Blueprint. In other words, this increases the alert level by one every second, giving players up to three seconds to react.

My IncreaseAlert function is pretty simple. It just adds 1 to the AlertLevel value as long as the alert level is less than 3.
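The Blueprint function above boils down to a one-line clamp; here it is as plain C++ (the names mirror my Blueprint, but this is a sketch, not engine code).

```cpp
// "Full alert" threshold: at level 3 the stalker goes into chase mode.
const int FullAlert = 3;

// Adds 1 to the alert level as long as it is below full alert;
// the Service calls this once per second via the Delay loop.
int IncreaseAlert(int AlertLevel) {
    return (AlertLevel < FullAlert) ? AlertLevel + 1 : AlertLevel;
}
```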

When the alert level reaches 3, the tree switches to the other branch (which is also a Sequence), seen below. As you can see, there’s a Service called BTService_RunToPlayer. This Service basically increases the pawn’s Max Walk Speed so the stalker starts running toward the player.

The two branches of this main node are selected based on whether or not the stalker is within attack distance. If the stalker is far away, it runs to the player; if not, it attacks.
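That distance-based split is a simple threshold check; a plain C++ sketch follows. The `AttackRange` and `RunSpeed` values are made-up placeholders (my actual tuning lives in the Blueprint), and the enum is mine, not a UE4 type.

```cpp
// Chase branch: outside attack range, run to the player (with the
// raised Max Walk Speed); inside it, attack instead.
enum class ChaseAction { RunToPlayer, Attack };

const float AttackRange = 150.0f;  // hypothetical, in UE units (cm)
const float RunSpeed    = 600.0f;  // hypothetical raised walk speed

ChaseAction ChooseChaseAction(float DistanceToPlayer) {
    return (DistanceToPlayer > AttackRange) ? ChaseAction::RunToPlayer
                                            : ChaseAction::Attack;
}
```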

BTTask_Attack is a custom Behavior Tree Task used to call the attack function in the AI pawn (not the AIController). Right now, I only have a function on my pawn Blueprint that switches the stalker’s color to represent the attack. One thing you can see is that I am performing a double Cast To. That really complicates things, so I will just move the attack function to the Stalker_AIController so it can be reused by any AI NPC.
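The refactor idea — let the task talk to the controller, and let the controller forward to whatever pawn it owns — can be sketched like this. These classes are plain C++ stand-ins for illustration, not the UE4 `AAIController`/`APawn` API; the point is that the task never needs to cast to a concrete pawn class.

```cpp
#include <string>

// Any NPC pawn defines its own attack behavior.
struct Pawn {
    virtual ~Pawn() = default;
    virtual std::string Attack() = 0;
};

// My stalker's placeholder attack (in-game it just changes color).
struct StalkerPawn : Pawn {
    std::string Attack() override { return "stalker-attack"; }
};

// The controller owns the attack entry point, so BTTask_Attack only
// needs one cast (to the controller), and any pawn type is reusable.
struct AIController {
    Pawn* ControlledPawn = nullptr;
    std::string PerformAttack() {
        return ControlledPawn ? ControlledPawn->Attack() : "no-pawn";
    }
};
```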

Since I still need to define the actual attack sequence (I’m torn between the stalker hitting the player or grabbing the player, and those two require very different approaches), I’ve stopped there. I will revisit this branch when I define the actual attack. In the meantime, feel free to experiment with Behavior Trees to make your own AI-driven NPCs. Next, I am going to explain how the first hearing branch works.

Get Unreal Engine: https://www.unrealengine.com/ 





