AI systems using Unreal Engine 4: part 3

Jun 03, 2020 at 02:00 pm by nemirc


Last time I told you about the sight branch. This time I will explain the hearing branch, which allows the stalker to hear the player and react to player-generated sounds. The stalker will move to the spot the sound came from, and the AlertLevel increases by one every time a sound is heard. I also decided that hearing alone would not push the AlertLevel higher than 2 (the stalker only reaches level 3 when it actually sees the player).

This branch uses two Blackboard Decorators. The first checks that PlayerSeen is “Not Set”, with “Observer aborts” set to “Self”; the second checks that PlayerHeard is “Set”. The idea is that this branch should run when the stalker hears the player, but only while the player is not seen. The reason is simple: if the enemy has already seen the player and knows where the player is, it does not matter whether the enemy can hear the player, since it already has visual contact. Because the PlayerSeen Decorator is set to abort “Self”, this branch is interrupted as soon as the stalker sees the player.
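Both Decorators are configured in the editor, but the same gate can be expressed as a custom Decorator in C++ if you prefer. This is only a sketch of the equivalent condition, assuming both keys are bool Blackboard keys; the built-in Blackboard Decorators already give you the “Observer aborts” behavior, which this sketch does not replicate.

```cpp
#include "BehaviorTree/BTDecorator.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "BTDecorator_HeardNotSeen.generated.h"

UCLASS()
class UBTDecorator_HeardNotSeen : public UBTDecorator
{
	GENERATED_BODY()

public:
	virtual bool CalculateRawConditionValue(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) const override
	{
		const UBlackboardComponent* BB = OwnerComp.GetBlackboardComponent();
		// Allow the branch only while the player has been heard but not seen.
		return BB && BB->GetValueAsBool(TEXT("PlayerHeard"))
		          && !BB->GetValueAsBool(TEXT("PlayerSeen"));
	}
};
```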

The first node of the branch also has a Service that increases the alert level. However, the level is increased only once per sound stimulus. For example, say the enemy has an AlertLevel of 0 and hears the player for the first time: the AlertLevel rises to 1 and stays there until the enemy hears the player a second time, at which point it goes up to 2 (the highest value hearing can produce).
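As a rough illustration, here is how that Service could look as a C++ UBTService instead of a Blueprint. The class and key names are assumptions for this sketch; using OnBecomeRelevant (rather than ticking) is what makes the level rise only once each time the branch is entered.

```cpp
#include "BehaviorTree/BTService.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "BTService_RaiseAlertOnHearing.generated.h"

UCLASS()
class UBTService_RaiseAlertOnHearing : public UBTService
{
	GENERATED_BODY()

public:
	UBTService_RaiseAlertOnHearing()
	{
		// OnBecomeRelevant is only called when this flag is set.
		bNotifyBecomeRelevant = true;
	}

protected:
	virtual void OnBecomeRelevant(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
	{
		Super::OnBecomeRelevant(OwnerComp, NodeMemory);
		UBlackboardComponent* BB = OwnerComp.GetBlackboardComponent();
		if (!BB)
		{
			return;
		}
		// Raise AlertLevel once per heard stimulus, capped at 2;
		// only actually seeing the player can push the level to 3.
		const int32 Alert = BB->GetValueAsInt(TEXT("AlertLevel"));
		BB->SetValueAsInt(TEXT("AlertLevel"), FMath::Min(Alert + 1, 2));
	}
};
```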

Note that I am changing AlertLevel from both the PlayerSeen and PlayerHeard branches. This variable is mostly used to decide what the enemy will do during the Idle branch. Will the enemy be relaxed, or will it be patrolling and looking for the player? That is defined by this variable.

Let’s continue. As you can see, this main node is also a Sequence. From this main node hang 3 different tasks. The first is a custom Blueprint task called BTTask_GetPointAroundHeardPlayer. Basically, this task picks a point around the heard location (a value obtained from the stalker’s AIPerception component) and sets it as the “Search Location” value on the Blackboard.

First, I perform a Cast To AIController to get the Player Heard Location, and then I use Get Random Point In Navigable Radius to get a random point within a range of 500 units. I then perform a distance check between the Heard Location and the selected point to make sure they are at least 50 units apart. I do this because I want to keep a “void” around the Heard Location, producing a search area that resembles a donut around it. If the distance between the points is less than 50 units, another point is selected; otherwise the task ends and the picked point is set as the Search Location.
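For readers who prefer C++, a minimal sketch of the same task could look like the following. The capped retry loop, the PlayerHeardLocation member on the controller, and the header names are assumptions; the Blueprint version simply keeps re-picking until a valid point is found.

```cpp
#include "BehaviorTree/BTTaskNode.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "NavigationSystem.h"
#include "Stalker_AIController.h"   // assumed header for the controller
#include "BTTask_GetPointAroundHeardPlayer.generated.h"

UCLASS()
class UBTTask_GetPointAroundHeardPlayer : public UBTTaskNode
{
	GENERATED_BODY()

public:
	virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
	{
		AStalker_AIController* AI = Cast<AStalker_AIController>(OwnerComp.GetAIOwner());
		UBlackboardComponent* BB = OwnerComp.GetBlackboardComponent();
		UNavigationSystemV1* NavSys =
			FNavigationSystem::GetCurrent<UNavigationSystemV1>(OwnerComp.GetWorld());
		if (!AI || !BB || !NavSys)
		{
			return EBTNodeResult::Failed;
		}

		const FVector Heard = AI->PlayerHeardLocation; // assumed public member

		// Pick random navigable points within 500 units until one lands
		// outside the 50-unit "void", giving a donut-shaped search area.
		FNavLocation Candidate;
		for (int32 Tries = 0; Tries < 20; ++Tries)
		{
			if (NavSys->GetRandomPointInNavigableRadius(Heard, 500.f, Candidate) &&
			    FVector::Dist(Heard, Candidate.Location) >= 50.f)
			{
				BB->SetValueAsVector(TEXT("SearchLocation"), Candidate.Location);
				return EBTNodeResult::Succeeded;
			}
		}
		return EBTNodeResult::Failed;
	}
};
```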

The next task makes the stalker move to the Search Location point, and then the stalker remains there for 5 seconds.

This task is only performed once, since PlayerHeard resets to false after a few seconds (how many seconds depends on the “Max Age” setting in your AIPerception configuration for that sense).
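In C++, that Max Age is set on the hearing sense configuration. A minimal sketch from the controller’s constructor, assuming a PerceptionComp member and placeholder range/age values:

```cpp
#include "Stalker_AIController.h"
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISenseConfig_Hearing.h"

AStalker_AIController::AStalker_AIController()
{
	// PerceptionComp is an assumed UAIPerceptionComponent* member.
	PerceptionComp = CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

	UAISenseConfig_Hearing* HearingConfig =
		CreateDefaultSubobject<UAISenseConfig_Hearing>(TEXT("HearingConfig"));
	HearingConfig->HearingRange = 3000.f; // assumed range
	HearingConfig->SetMaxAge(5.f);        // heard stimuli expire after 5 seconds

	PerceptionComp->ConfigureSense(*HearingConfig);
}
```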

The next branch is the ObjHeard branch. As I explained in the previous article, this will make the enemy move to a sound emitter, and I will use it to distract the stalkers with sounds. It will also be useful in cases where the player interacts with something that makes a loud noise (for example, dropping a heavy object). Compared to the PlayerHeard branch, this one has 3 Decorators: one for PlayerSeen set to false, another for PlayerHeard set to false (both with “Observer aborts” set to “Self”), and a last one for ObjHeard set to true. The node also has a Service that increases the AlertLevel from 0 to 1 (if the AlertLevel is already 1 or higher, this Service has no effect). This Service is not set in stone, and I might change it to go up to 2 if enough stimuli happen. The rest of the branch is pretty much the same as the PlayerHeard one (pick a point, move to the point, wait, reset).
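Reusing the Service pattern shown earlier, the ObjHeard variant only changes the clamping rule. Again a sketch with assumed names:

```cpp
#include "BehaviorTree/BTService.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "BTService_RaiseAlertOnObjHeard.generated.h"

UCLASS()
class UBTService_RaiseAlertOnObjHeard : public UBTService
{
	GENERATED_BODY()

public:
	UBTService_RaiseAlertOnObjHeard()
	{
		bNotifyBecomeRelevant = true;
	}

protected:
	virtual void OnBecomeRelevant(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
	{
		Super::OnBecomeRelevant(OwnerComp, NodeMemory);
		if (UBlackboardComponent* BB = OwnerComp.GetBlackboardComponent())
		{
			// Raise AlertLevel from 0 to 1; no effect when already 1 or higher.
			const int32 Alert = BB->GetValueAsInt(TEXT("AlertLevel"));
			BB->SetValueAsInt(TEXT("AlertLevel"), FMath::Max(Alert, 1));
		}
	}
};
```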

To make this work, though, I changed the AIPerception functions a little bit. If you go back to the On Target Perception Updated event of my AIPerception (see the first part of the series), you will see there is a macro that checks whether the sensed actor is the player. From the False output of the macro, I connect a couple of nodes: one that sets the sensed actor as the object, and another that calls a custom event called HeardOtherObject.
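In C++ the same split could live in the handler bound to the perception component’s OnTargetPerceptionUpdated delegate. HandlePlayerStimulus and the HeardObject member are assumed names in this sketch; HeardOtherObject matches the article’s custom event:

```cpp
#include "Stalker_AIController.h"
#include "Perception/AIPerceptionComponent.h"
#include "Kismet/GameplayStatics.h"

// Assumed to be bound in BeginPlay with:
// PerceptionComp->OnTargetPerceptionUpdated.AddDynamic(this, &AStalker_AIController::OnTargetPerceptionUpdated);
void AStalker_AIController::OnTargetPerceptionUpdated(AActor* Actor, FAIStimulus Stimulus)
{
	if (Actor == UGameplayStatics::GetPlayerPawn(GetWorld(), 0))
	{
		// True path of the macro: the player was sensed (sight or hearing,
		// handled as in the earlier parts of the series).
		HandlePlayerStimulus(Stimulus); // assumed helper
	}
	else
	{
		HeardObject = Actor;           // store the sensed actor as the object
		HeardOtherObject(Stimulus);    // False path: a non-player noise source
	}
}
```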

The HeardOtherObject event is similar to the heard-player one: it sets the ObjectIsHeard variable, the ObjectHeardLocation vector variable, and the LastStimulusType variable. If you remember the previous article, these variables are read from the Stalker_AIController and set on the Blackboard.
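A sketch of what that event could look like in C++. The member types are assumptions, and writing the Blackboard values directly here is an alternative to the controller-to-Blackboard sync the article describes:

```cpp
#include "Stalker_AIController.h"
#include "BehaviorTree/BlackboardComponent.h"

void AStalker_AIController::HeardOtherObject(const FAIStimulus& Stimulus)
{
	// These members are assumed to be declared in the header
	// (bool, FVector, and FName respectively).
	ObjectIsHeard = true;
	ObjectHeardLocation = Stimulus.StimulusLocation;
	LastStimulusType = TEXT("ObjHeard");

	// The article syncs these values to the Blackboard from the controller;
	// writing them directly works just as well.
	if (UBlackboardComponent* BB = GetBlackboardComponent())
	{
		BB->SetValueAsBool(TEXT("ObjHeard"), true);
		BB->SetValueAsVector(TEXT("ObjectHeardLocation"), ObjectHeardLocation);
	}
}
```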

Lastly, to make an object produce a noise, I use a simple Blueprint built around the Report Noise Event node.
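That Blueprint node maps to UAISense_Hearing::ReportNoiseEvent in C++. A minimal sketch on a hypothetical ANoisyProp actor, with assumed loudness and range values:

```cpp
#include "NoisyProp.h"   // hypothetical actor representing a noisy prop
#include "Perception/AISense_Hearing.h"

void ANoisyProp::EmitNoise()
{
	// Registers a hearing stimulus at this actor's location; any AIPerception
	// component configured for hearing within range will receive it.
	UAISense_Hearing::ReportNoiseEvent(GetWorld(), GetActorLocation(),
	                                   /*Loudness=*/1.f, /*Instigator=*/this,
	                                   /*MaxRange=*/0.f);
}
```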

This ends this part of the article. Next, I will explain the Idle branch. In the meantime, feel free to experiment with the Behavior Tree to see what else you can do with sound-based gameplay.

Get Unreal Engine: https://www.unrealengine.com/





