Unity user explores Unreal Engine 4: part 8

May 04, 2020 at 01:30 pm by nemirc

In previous parts of this series, I've covered different aspects of UE4, including hair rendering, materials, landscape and foliage creation, and character animation. I've continued learning more about those things, but I also need to see other features to figure out how they would fit my needs, so I decided to take a look at the AI tools.

In the past, I've made my own AI-driven enemies in Unity, using PlayMaker in combination with NavMeshes. UE4's approach is very different: in Unity you need to create everything from scratch (unless you use a third-party asset that adds behavior tree functionality to your game), including tools to make the AI "see" or "hear" your player character, and a routine that defines what the AI does based on certain conditions.

In UE4, you have three tools that work together to create the AI: a Blackboard, a Behavior Tree, and programming (either C++ or Blueprints) to drive AI-specific functions for pawn "sensing."

The Blackboard is basically a table used to keep track of variables. For example, you can use it to track the player character's last known position, whether the player is currently visible, and what the AI is currently doing.


The Behavior Tree is literally a tree made of different states. Each state is “what the AI is doing” at that moment. For example, a state could be “patrolling” and another state could be “searching.” The Behavior Tree uses conditions to transition from one state to the next, making it easy to know what may be going wrong, and also allowing you to debug each state separately.

You put this all together with your own programming. As you know, I am using Blueprints at the moment (although I think I will learn some C++ in the near future), so everything I do is created there. In this case, several elements make up the programming system: the AI controller, the pawn, and the AI components. The AI controller is what "drives" the character, and you can use it to attach the Blackboard and Behavior Tree to your pawn (among other things, obviously).

The nicest part is the “pawn sensing” system.

UE4 includes AI components that can be used to see, hear, and sense the player. Those components can also try to guess where the player is going to be next, but I haven't checked those yet. To configure how to see the player, you need to create a “sight sense” on your pawn, and configure the sight radius and maximum distance.

Hearing works pretty much the same, and the AI can locate your character by the position of the sound. The two senses are configured differently on the player's side, though: sight is automatic, but hearing only works if the player has a function to make noise.

The nice thing about hearing is that you can even make noise from objects, which is really cool and can be useful to create interesting gameplay scenarios. For example, I was testing generating a noise from a cube that could be used to distract the AI.

In the future, I can go into more detail on how to create a basic AI for an enemy. In the meantime, I will continue exploring the engine as I transition my horror game from Unity to UE4.

Get Unreal Engine: https://www.unrealengine.com/en-US/get-now
