Game Development
Unity
Project Settings
When I first opened the Unity editor I was a bit overwhelmed by the many options
available, and it can be hard to get going without knowing how to configure at
least the most basic settings for a Unity project. In the sections below, I'll
cover some simple settings that are worth considering when creating a new
project in Unity.
Playmode Tint
This option is not found in Project Settings, but I think it is something everyone
entering Unity for the first time should consider. Navigate to the menu bar at
the top of your editor, select Edit->Preferences->Colors, and adjust the Playmode
Tint to something very noticeable. This helps you avoid forgetting you are in Play
Mode and making changes, only to lose them all once you exit Play Mode.
For the rest of these sections, we will be working in the Project Settings panel
opened with Edit->Project Settings... in the menu bar of the Unity editor.
Project Name
Don't assume Unity will distribute builds of your game using the local project
name you defined when first creating the project. In fact, Unity requires us to
specify these details within the Player section of Project Settings. Make sure the
section below is adjusted to suit the needs of your project.
It's important to change things like this from the default settings, otherwise even
a finished project can end up looking incomplete. Navigate to the Player section
and scroll down to adjust icon settings. It's important to be consistent across all
platforms, and this can easily be done by checking the Override for PC, Mac & Linux
Standalone tick box at the top of the panel. This will apply your Icon settings on
all platforms.
Splash Screen
Within the Player panel we can find the below settings for modifying the splash
screen of a game or application created with Unity.
Quality Settings
You can rename quality levels, add new ones, and adjust platform-specific modes as
well. It's important to note that clicking the name of a quality setting in this
table (just left of the check-marks) will apply those settings within your editor
for testing. The Default drop-down arrows correspond with each platform at the top
level of the table.
Graphics Settings
This is where you'll define the preconfigured graphics settings available to the
player. It's important to adjust these to suit the platform the build will be
running on. As an example, this feature could be useful when distributing a test
build of a Unity game with WebGL: we could reduce the settings to improve
performance within the browser and make the game much less demanding. This allows
us to build more efficiently for WebGL without creating an unnecessarily demanding
or slow-performing game on such a limited platform.
Curious what WebGL is or looks like in use? I host some archived examples on my
website, where you can check out some Unity WebGL games that I've already built and
hosted online for playing.
Input Manager
This section is very useful for configuring controls for your game that can then be
used when scripting. For example, in the section below I have defined a button for
Fire, which is triggered when the player clicks the left mouse button.
By using a custom script that defines global constants, we can reduce the task of
changing these values later on. Below, I'll cover an example of using the Input
Manager paired with a few C# scripts to define controls in global variables which
can be easily modified in a central location. This avoids a scenario where we have
built a complex game and want to change controls later in development, requiring us
to change static values across numerous scripts. This is not only tedious but also
makes the project more prone to errors.
using UnityEngine;

// Constants used within the game to handle passing control settings to builtin
// Unity functions with string parameters. (Values here are examples; each must
// match an axis or button name defined in the Input Manager.)
public static class Controls
{
    public const string c_LookMouseHorizontal = "Mouse X";
    public const string c_LookGamePadHorizontal = "Look Horizontal";
    public const string c_PrimaryFire = "Fire";
    public const string c_ModJump = "Jump";
    // ... c_ModCrouch, c_ModSprint, c_PrimaryAim, c_PrimaryHide, etc.
    // UI controls
}
These constants can then be used in a related PlayerInput class, which can handle
receiving input from the player at a higher level, so we won't need to refactor all
of our scripts in the scenario that we want to modify our controls.
// PlayerInput.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PlayerInput : MonoBehaviour
{
    PlayerController playerController;

    void Start()
    {
        playerController = GetComponent<PlayerController>();
        Cursor.lockState = CursorLockMode.Locked;
        Cursor.visible = false;
    }

    void Update()
    {
        // Always check if the player wants to unbind their cursor lock state
        UpdateLockState();
    }

    public Vector3 GetMoveInput()
    {
        Vector3 move = Vector3.zero;
        switch (playerController.targetPov)
        {
            case Kamera.pov.Mounted:
                // Read movement input for the mounted camera...
                break;
            default:
                // Read movement input for any other camera...
                break;
        }
        return move;
    }

    public float GetLookInputsHorizontal()
    {
        return GetLookAxis(Controls.c_LookMouseHorizontal,
                           Controls.c_LookGamePadHorizontal);
    }

    void UpdateLockState()
    {
        // Toggle Cursor.lockState and Cursor.visible on some keybind...
    }

    bool CanProcessInput()
    {
        // Only process input while the cursor is locked to the game window
        return Cursor.lockState == CursorLockMode.Locked;
    }

    // Checks whether the look input is via mouse or gamepad and returns a float
    // 0.0f-1.0f of the strength
    float GetLookAxis(string mouseAxis, string gamePadAxis)
    {
        if (CanProcessInput())
        {
            // Check if there is any input from a gamepad controller on the given axis
            bool isGamePad = Input.GetAxis(gamePadAxis) != 0f;
            // If we are using a gamepad use stickLook's strength, otherwise use mouse input
            float str = isGamePad ? Input.GetAxis(gamePadAxis) : Input.GetAxis(mouseAxis);
            if (isGamePad)
            {
                // since mouse input is already deltaTime-dependant, only scale input
                // with frame time if it's coming from sticks
                str *= Time.deltaTime;
            }
            else
            {
                str *= 0.01f;
#if UNITY_WEBGL
                // str *= webglLookSensitivityMultiplier;
#endif
            }
            return str;
        }
        return 0f;
    }

    public bool GetCrouchInputDown() { return Input.GetButtonDown(Controls.c_ModCrouch); }
    public bool GetCrouchInputUp() { return Input.GetButtonUp(Controls.c_ModCrouch); }
    public bool GetSprintInputHeld() { return Input.GetButton(Controls.c_ModSprint); }
    public bool GetJumpInputDown() { return Input.GetButtonDown(Controls.c_ModJump); }
    public bool GetFireInputDown()
    {
        return Input.GetButtonDown(Controls.c_PrimaryFire) ||
               Input.GetButtonDown(Controls.c_PrimaryGamepadFire);
    }
    public bool GetAimInputDown()
    {
        return Input.GetButtonDown(Controls.c_PrimaryAim) ||
               Input.GetButtonDown(Controls.c_PrimaryGamepadAim);
    }
    public bool GetHideInputDown() { return Input.GetButtonDown(Controls.c_PrimaryHide); }

    // Select a weapon slot with the number keys; 0 maps to slot index 9
    public int GetSelectWeaponInput()
    {
        // ... KeyCode.Alpha1 through KeyCode.Alpha9 map to slots 0-8 ...
        if (Input.GetKeyDown(KeyCode.Alpha0)) return 9;
        return -1;
    }
}
If you want to actually be able to apply damage, we need a Target script. See the
simple example below for a script which enables this. Later, within
WeaponControl.cs, we will check if the object we hit has this script attached, and
if it does we can deal damage to the HP amount given to the Target.
// Target.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Target : MonoBehaviour
{
    [SerializeField]
    float health = 50f;

    [SerializeField]
    [Tooltip("The GameObject to spawn when this object is broken")]
    GameObject brokenObject;    // Spawn this before destroying, if desired

    public void TakeDamage(float amount)
    {
        health -= amount;
        if (health <= 0)
            Destroy(gameObject);
    }
}
We could then use this PlayerInput.cs script within a WeaponControl.cs script to
check if the player presses this button in Update, which is called once per frame.
If they have tried to shoot and have a weapon equipped, we can call the relevant
weapon's ShootWeapon() function. Note that you will have to attach the playerInput
and playerWeapons variables to the relevant scripts within your editor.
// WeaponControl.cs
using UnityEngine;

public class WeaponControl : MonoBehaviour
{
    [SerializeField] PlayerInput playerInput;
    [SerializeField] PlayerWeapons playerWeapons;
    [SerializeField] float damage = 10f;        // Illustrative damage per hit
    [SerializeField] AudioSource fireSFX;
    [SerializeField] GameObject projectileObj;
    [SerializeField] Camera mainCamera;
    [SerializeField] ParticleSystem muzzleFlash;
    [SerializeField] GameObject hitEffect;

    void Update()
    {
        // Fire if the player pressed the fire button and a weapon is equipped
        // (GetFireInputDown / HasWeaponEquipped are illustrative method names)
        if (playerInput.GetFireInputDown() && playerWeapons.HasWeaponEquipped())
            ShootWeapon();
    }

    void ShootWeapon()
    {
        fireSFX.Play();
        muzzleFlash.Play();
        // Launch the projectile forward
        GameObject projectile = Instantiate(projectileObj, transform.position, transform.rotation);
        projectile.GetComponent<Rigidbody>().AddForce(transform.forward * 100);
        // Check what the shot hits with a raycast from the camera
        RaycastHit hit;
        if (Physics.Raycast(mainCamera.transform.position, mainCamera.transform.forward, out hit))
        {
            Debug.Log(hit.transform.name);
            // Deal damage if the object we hit has a Target script attached
            Target target = hit.transform.GetComponent<Target>();
            if (target != null) target.TakeDamage(damage);
            // Spawn a temporary hit effect where the shot landed
            GameObject hitObject = Instantiate(hitEffect, hit.point, Quaternion.LookRotation(hit.normal));
            Destroy(hitObject, 1f);
        }
    }
}
Scripting
Scripting in Unity uses C# and is very well documented. In the sections below, I'll
provide examples and edge cases where possible, and link to the relevant
documentation for quick reference.
For a collection of classes and structs that are required for Unity to function,
which means they will always be available to you when scripting in Unity, head over
to the UnityEngine.CoreModule documentation.
Transform
Local Space
Local space is the transform relative to the object's parent. An example of this
can be seen below, where I have selected an object and the transform controls are
centered on the exact transform of that object relative to its local position.
Take notice of three things in the above screenshot. First, we have selected Local
position in the top-left near our transform controls; clicking this button again
will toggle between Local and World space. Second, take note of the World Space
axes shown at the top-right of the scene view. Third, in contrast to the World
Space axes, notice that the GameObject's axes shown in the scene view differ in
orientation. The transform axes shown on the GameObject modify and refer to the
GameObject's transform within Local space.
World Space
World space is the position of the GameObject rooted within the scene. An example
of this can be seen by selecting the exact same object in the editor and toggling
the world space transform view. This makes the transform controls match the World
Space axes, instead of referring directly to the transform of the local object.
Again, take notice of three things in the above screenshot. First, we have selected
Global position in the top-left near our transform controls; clicking this button
again will toggle between Local and World space. Second, take note of the World
Space axes shown at the top-right of the scene view. Third, notice that the
GameObject's axes shown in the scene view now match the World Space axes in
orientation. The transform axes shown on the GameObject modify and refer to the
GameObject's transform within World space.
Vector3
Axis
In Unity 3D you will use the X, Y, and Z axes frequently, both when positioning
within the editor and when scripting. It helps to have a clear understanding of the
names these axes can be referred to with, as it will greatly improve your ability
to access and modify these values without overcomplicating things.
The X axis can be accessed with the right keyword when accessing any class which
stores axis information.
The Y axis can be accessed with the up keyword when accessing any class which
stores axis information.
The Z axis can be accessed with the forward keyword when accessing any class which
stores axis information.
Similarly, when modifying a Vector, we can easily flip these axes by accessing the
opposite of these keywords -
The negative X axis can be accessed with the left keyword when accessing any class
which stores axis information.
The negative Y axis can be accessed with the down keyword when accessing any class
which stores axis information.
The negative Z axis can be accessed with the back keyword when accessing any class
which stores axis information.
Quaternion
Shortcuts
Since Unity has many features and shortcuts available that widen the gap between an
experienced developer and a beginner, I'll list some of my most frequently used
shortcuts and tricks here. Though these can all be viewed and modified by opening
the panel below via Edit->Shortcuts..., there are a huge number of shortcuts and
this can be a lot to look at.
Transform Controls
At the top-left of your Unity editor, you'll notice the transform control buttons
where you can switch between Hand, Move, Rotate, Scale, Rect, and Universal
controls. Each of these can also be toggled by pressing Q, W, E, R, T, and Y,
respectively.
Snapping to Collision
There will be many cases where you want to place an object on a table or the ground
within your scene. You should not need to manually fumble with axes to do this;
instead, given that both objects have collision of some kind, you can simply hold
Shift+Ctrl while using the Move tool and dragging the grey box that appears in the
center of the object, NOT the axes themselves. This will immediately snap the
object to the collision nearest to your cursor as you drag it around the scene.
There may be minor adjustments needed, but overall this should do the trick for
most basic items.
You will frequently want to move an object to the position and rotation of your
current scene view in World space. You could manually drag the object across the
scene in Unity, adjusting each axis as needed. Alternatively, you can fly to a
position near your desired location for the object, select the object, then press
Shift+Ctrl+F to move the object to your exact position and rotation. This is very
useful when setting up cameras: just fly to the view you want the camera to
display, select the camera, and press Shift+Ctrl+F to set it to that exact
position with a lot less fumbling around.
Unclickable Objects
Tired of clicking in the scene view and selecting the terrain or some other
GameObject? Within the scene hierarchy you can toggle whether or not an object
should be clickable. Simply click the small hand next to the object's name in the
hierarchy.
You can also toggle hiding and showing objects by clicking the eye icon just to the
left of this setting.
Prefabs
Since the Unity workflow is built around prefabs, I figured I'd document some
specific use cases for the many features introduced in the Unity 2019 LTS release,
which added support for prefab variants and nested prefabs. On this page, I'll
cover some good practices and the features these prefab tools provide.
I'd highly recommend heading over to devassets.com to grab some of the assets you
see featured across the Unity pages on Knoats. They are entirely free and give you
a lot to work with when learning. If you can afford it, I would recommend donating
to the developers. Not only does this unlock more assets you can get with the
package you donated to, but it shows support to the developer that organized all of
these great assets in one place for you to learn with.
Positioning Prefabs
Being relatively new to Unity, I began by grabbing some assets off the Unity Store.
Like most free assets on the store, these did not come entirely assembled for me
and required me to work a bit to get things in a state that is usable for even the
most basic games. This has been a good learning experience, and required no
scripting, so if you are new to Unity and not quite ready to script, doing this
will give you experience creating prefabs, working with materials, shaders,
lighting, textures, and much more.
At first, when creating a prefab of an object that exists within your scene, you
may see something like the below when opening the prefab to edit.
This is clearly not the orientation that we expect this SciFi_Rover vehicle to have
when initially placed in our scene. To fix this, be sure you are editing the prefab
in the prefab editor and NOT directly within your scene. Then adjust the transform
to be in the orientation desired.
First, set everything but the Scale of your object to 0. As shown in the screenshot
below, there will be many cases where this does not produce the desired results, so
we still need to modify the transform further.
After making some adjustments, the object's final orientation within the prefab
editor is seen below
And the final transform properties of the root GameObject are now much cleaner -
Post Processing
Good graphics are good. That's why I was excited to find adding Post Processing to
my Unity 3D project was not only easy to do, but a huge improvement to the visuals
within my scene. This enables common modern graphics features like Motion Blur,
Ambient Occlusion, Depth of Field, and more.
Post Processing is added to each scene individually, not to a project as a whole.
To add this to a Unity 3D project, we first need to add the Post-process Layer
component to our scene's main camera. Next we'll add a Post-process Volume that
globally affects our entire scene. Then we can add a PostProcessing_Profile for our
scene and add new visual effects accordingly.
It's important that the camera the player views the game from contains this
component. Otherwise, if the player can toggle between a camera which has the Post-
process Layer and one that does not, the effects gained by post processing will
only be rendered in one view and not the other.
Once we've added the above component to the scene's main camera, we need to adjust
the layer of both this component and our main camera to reflect this. Set the Layer
field of the Post-process Layer component to the Postprocessing layer.
Now change the layer of the camera itself to the Postprocessing layer as well.
At a glance, there is not much here. But once we add a Post Process Profile and
finish configuring our scene we will use this component to adjust some pretty neat
looking visuals.
Be sure to apply the Post Processing layer to the Camera and Volume GameObjects
within your scene before continuing or the effects will not be applied
Within the GameObject created for our Volume, click New to the right of the Profile
field in the new Post-process Volume component.
That's it! See the glow coming from the lights in the pictures below for an example
of how this can be used to add the Bloom effect to an emissive light source.
Post Processing on
Setup
Using the default configuration for a keyboard & mouse / gamepad Input Actions
asset in Unity, we can implement universal controls across various input devices.
This was a weird bug for me to figure out, so I thought it was worth a mention. In
my case, I had to do this in order to get Unity to accept input from my Corsair
gaming mouse. I read a lot of information on this new input system thinking I was
using it wrong, and later found that my mouse was not passing input to Unity and my
code was correct.
Use
// PlayerController.cs
using UnityEngine;
using UnityEngine.InputSystem;

public class PlayerController : MonoBehaviour
{
    PlayerControls controls;    // C# class generated from the Input Actions asset
    Rigidbody playerRigidbody;
    Vector2 playerVelocity;

    void Awake() {
        controls = new PlayerControls();
        playerRigidbody = GetComponent<Rigidbody>();
        // Register callbacks for the Move and Fire actions
        controls.Player.Move.performed += context => playerVelocity = context.ReadValue<Vector2>();
        controls.Player.Fire.performed += context => print("Bang");
    }
    void OnEnable() { controls.Player.Enable(); }
    void OnDisable() { controls.Player.Disable(); }

    void Update()
    {
        // Apply the most recent move input each frame
        playerRigidbody.velocity = new Vector3(playerVelocity.x, 0, playerVelocity.y);
    }
}
Or we can let Unity call functions defined using the naming convention void
On[ActionName](InputValue value).
// PlayerController.cs
using UnityEngine;
using UnityEngine.InputSystem;

public class PlayerController : MonoBehaviour
{
    PlayerControls controls;    // C# class generated from the Input Actions asset
    Rigidbody playerRigidbody;
    Vector2 playerVelocity;

    void Awake() {
        controls = new PlayerControls();
        playerRigidbody = GetComponent<Rigidbody>();
    }
    void OnEnable() { controls.Player.Enable(); }
    void OnDisable() { controls.Player.Disable(); }

    // Called automatically for the Move action when Behavior is set to Send Messages
    void OnMove(InputValue value) { playerVelocity = value.Get<Vector2>(); }

    void Update()
    {
        playerRigidbody.velocity = new Vector3(playerVelocity.x, 0, playerVelocity.y);
    }
}
Broadcast Messages works the same as Send Messages, except broadcasting invokes the
same methods on all child objects that have a component with function definitions
matching this naming convention.
Unreal Engine 5
Linux Setup
The setup process for UE4 on Linux is pretty straightforward, and the official
instructions are very well documented. Follow those steps, and return here once
you're done. Below, I just outline some of the issues I encountered using UE4 on
Linux post-installation, and how I solved them.
Marketplace Assets
Epic Games doesn't seem to provide any support for the UE4 Marketplace for Linux.
As a result, Epic also does not support downloading and adding any assets to your
project. Bummer.
I would like to note a blog post on this same issue by alexandra-zaharia. I would
have much preferred her solution, but after attempting it I could not get it to
work. I could install Epic Games through Lutris, but symlinking my projects (or
copying them) into the Wine Windows filesystem did not result in the Epic Games
launcher detecting the projects.
What I ended up doing was using the unofficial nmrugg/UE4Launcher on GitHub. It
works great, but I do miss some things like asset engine version information and
browsing the marketplace.
You can install assets with it though, as long as you own them and they're linked
to the Epic account you sign in with. That's all I need for now, I'm just playing
with the idea of learning some C++ for UE4.
UI Scaling
Related question on StackOverflow
Initially the UI for Unreal was very large and practically unusable. This could be
due to the fact that I'm running on integrated graphics, but I'm not sure. To fix
this, I just opened a project, navigated to Edit->Editor Preferences, and unchecked
the Enable High DPI Support option under the General/Appearance settings. After
restarting UE4, everything was good and the UI was normal again.
Application Launchers
The shortcuts initialized by Unreal during the build process didn't work for me. I
couldn't figure out why, so I just made my own. I placed these files into
~/.local/share/applications/
You will need to change the Path value to point to the same directories on your
local system for the following configurations.
For the Unreal Engine Editor, the following UnrealEngine.desktop file will launch a
window to open or create a UE4 project
#!/usr/bin/env xdg-open
[Desktop Entry]
Version=1.0
Type=Application
Exec=UE4Editor
Path=/home/kapper/Code/Clones/UnrealEngine/Engine/Binaries/Linux
Name=Unreal Engine Editor
Icon=ubinary
Terminal=false
StartupWMClass=UE4Editor
MimeType=application/uproject
Comment=Open or create UE4 projects
For the unofficial UE4Launcher, this second .desktop file starts the launcher with
npm.
[Desktop Entry]
Version=1.0
Type=Application
Exec=npm start
Path=/home/kapper/Code/Clones/UE4Launcher/
Name=Unreal Engine Launcher
Icon=ubinary
Terminal=false
StartupWMClass=UE4Editor
MimeType=application/uproject
Comment=UE4 Project and asset management
Debugging
See the Build Configuration documentation for more information on the various build
targets that are made available when generating a C++ project solution with UE4.
Also see Compiling Projects for how to build and debug with Visual Studio.
I watched a video on YouTube to learn how to debug UE4 projects using VSCode. The
instructions are for Windows, but the process was very similar on Linux; I just had
to install VSCode on Linux first.
Once you have vscode installed, open the UE4 editor as you normally would to edit
your project. Then, go to Edit->Editor Preferences and navigate to the Source Code
settings menu from the left. Select Visual Studio Code from the drop down. You'll
get a prompt to restart the editor, which you should do before continuing.
Once the UE4 editor has been restarted, navigate to File->Generate Visual Studio
Code Project. A loading screen will appear, and once it finishes you can go to
File->Open Visual Studio Code.
Once inside our UE4 vscode project, push CTRL+SHIFT+B and run the
<YOUR_PROJECT_NAME>Editor (Development) build task. This task was defined for us by
UE4 when we generated our vscode project in the previous step, and the entry within
my launch.json is seen below.
{
"name": "ThirdPersonEditor (Development)",
"request": "launch",
"preLaunchTask": "ThirdPersonEditor Linux Development Build",
"program":
"/home/kapper/Code/Clones/UnrealEngine/Engine/Binaries/Linux/UE4Editor",
"args": [
"/home/kapper/Code/GameDev/UnrealProjects/unrealgame/ThirdPerson.uproject"
],
"cwd": "/home/kapper/Code/Clones/UnrealEngine",
"type": "lldb"
},
Running this task for the first time could take a bit of time depending on the size
of your project. When this finishes building, it should automatically start a new
UE4 editor with your project open and in debug mode. If it doesn't, just select the
<YOUR_PROJECT_NAME>Editor (Development) debug configuration in the sidebar and
click the run button.
I also had to run the following command to install a missing dependency. The first
time I tried to build there was an error, and after reading it I traced it to this
package and installed it. The development build succeeded after installing.
sudo apt install libopenvr-dev
Now you can run the same debug configuration and once the UE4 editor launches make
sure whatever script you're debugging is active within your scene or your code
breakpoint will not be hit.
Debugging on Windows under Visual Studio is very well documented, so while I do
also have a Windows development setup for UE4, I won't cover that here.
I had no luck with CLion; I saw memory usage during the build jump from 6GB to
16GB, and an additional 4GB of swap space was eaten up as well.
Possibly a better solution: Running UE4 from Qt Creator
Since writing this, I've been using Rider for Unreal Engine. It's in early access
right now, so you have to apply for access, but I was accepted almost immediately
so you shouldn't have any issues.
Gameplay Ability System
Within the editor, go to Edit->Plugins... and enable the Gameplay Abilities
plugin. You will need to restart the editor for the changes to apply.
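Enabling the plugin covers the editor side, but C++ classes that include GAS headers also need the GAS modules listed in your project's Build.cs. This step is an assumption based on the standard GAS setup rather than the original text; the module names are real, but the file name here (unrealgame5.Build.cs) and your existing dependency list will differ:

```cs
// unrealgame5.Build.cs (hypothetical file name -- use your project's Build.cs)
PublicDependencyModuleNames.AddRange(new string[] {
    "Core", "CoreUObject", "Engine", "InputCore",
    // Modules required by the Gameplay Ability System
    "GameplayAbilities", "GameplayTags", "GameplayTasks"
});
```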
So, to complete this setup you will at least need to define a few new classes for
your project.
For my project I follow the same naming convention used in GASDocumentation. I'll
paste it below, but the first few files required to set up the Gameplay Ability
System (GAS) define the backend of the system, so none of these conventions apply
to them. For these files, I added a GAS_ prefix.
Prefix Asset Type
GA_ GameplayAbility
GC_ GameplayCue
GE_ GameplayEffect
GAS_ GameplayAbilitySystem (Core Configurations)
Ability Enumeration
First, we need to modify the contents of the header file for our unreal project. My
project is named unrealgame5 so the file is unrealgame5.h, and the contents are
below. If you already have information here, just make sure the EGASAbilityInputID
enumeration is added to the header file and save your changes. This enumeration is
used to correlate input actions to activate certain abilities in our game.
Note: Attack below must match (or be made to match) some keybind within your
Edit->Project Settings->Input options menu.
// Copyright Epic Games, Inc. All Rights Reserved.
#pragma once
#include "CoreMinimal.h"
UENUM(BlueprintType)
enum class EGASAbilityInputID : uint8
{
None,
Confirm,
Cancel,
Attack
};
Click next and name your class; I'll name this class GAS_AbilitySystemComponent.
The generated files are seen below. You don't need to put anything else in here for
now. Note that UE5 prefixed our original class name GAS_AbilitySystemComponent with
a U - its name in the source code is UGAS_AbilitySystemComponent. This is normal
and to be expected.
// GAS_AbilitySystemComponent.h
// All content (c) Shaun Reed 2021, all rights reserved
#pragma once
#include "CoreMinimal.h"
#include "AbilitySystemComponent.h"
#include "GAS_AbilitySystemComponent.generated.h"
/**
*
*/
UCLASS()
class UNREALGAME5_API UGAS_AbilitySystemComponent : public UAbilitySystemComponent
{
GENERATED_BODY()
};
// GAS_AbilitySystemComponent.cpp
// All content (c) Shaun Reed 2021, all rights reserved
#include "GAS_AbilitySystemComponent.h"
Gameplay Abilities
Next we'll set up the base class that we will use for adding abilities to our game.
To do this we need to create another new C++ source file like we did in the
previous step, only this time we will inherit from the GameplayAbility class
provided by the GameplayAbilities UE5 plugin.
I named this class GAS_GameplayAbility, and the source code is seen below.
// GAS_GameplayAbility.h
// All content (c) Shaun Reed 2021, all rights reserved
#pragma once
#include "../unrealgame5.h"
#include "CoreMinimal.h"
#include "Abilities/GameplayAbility.h"
#include "GAS_GameplayAbility.generated.h"
UCLASS()
class UNREALGAME5_API UGAS_GameplayAbility : public UGameplayAbility
{
	GENERATED_BODY()
public:
	UGAS_GameplayAbility();

	// Input ID from our enumeration, used to activate this ability
	UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Ability")
	EGASAbilityInputID AbilityInputID = EGASAbilityInputID::None;
};

// GAS_GameplayAbility.cpp
// All content (c) Shaun Reed 2021, all rights reserved
#include "GAS_GameplayAbility.h"

UGAS_GameplayAbility::UGAS_GameplayAbility() { }
Attribute Sets
Next, we need to create an AttributeSet for our game. Repeat the process of
creating a new C++ source file for your UE5 project, but this time inherit from
AttributeSet.
I named this class GAS_AttributeSet, and the files generated are below.
// GAS_AttributeSet.h
// All content (c) Shaun Reed 2021, all rights reserved

#pragma once

#include "AbilitySystemComponent.h"
#include "CoreMinimal.h"
#include "AttributeSet.h"
#include "GAS_AttributeSet.generated.h"

UCLASS()
class UNREALGAME5_API UGAS_AttributeSet : public UAttributeSet
{
	GENERATED_BODY()

	UGAS_AttributeSet();

public:
	virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override;

	/*
	 * Attribute Definitions
	 */

	// Health
	UPROPERTY(BlueprintReadOnly, Category = "Attributes", ReplicatedUsing = OnRep_Health)
	FGameplayAttributeData Health;
	UFUNCTION()
	virtual void OnRep_Health(const FGameplayAttributeData& OldHealth);

	// Stamina
	UPROPERTY(BlueprintReadOnly, Category = "Attributes", ReplicatedUsing = OnRep_Stamina)
	FGameplayAttributeData Stamina;
	UFUNCTION()
	virtual void OnRep_Stamina(const FGameplayAttributeData& OldStamina);

	// Attack Power
	UPROPERTY(BlueprintReadOnly, Category = "Attributes", ReplicatedUsing = OnRep_AttackPower)
	FGameplayAttributeData AttackPower;
	UFUNCTION()
	virtual void OnRep_AttackPower(const FGameplayAttributeData& OldAttackPower);
};
// GAS_AttributeSet.cpp
// All content (c) Shaun Reed 2021, all rights reserved
#include "GAS_AttributeSet.h"
#include "Net/UnrealNetwork.h"

UGAS_AttributeSet::UGAS_AttributeSet()
{
}

void UGAS_AttributeSet::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
	Super::GetLifetimeReplicatedProps(OutLifetimeProps);

	// Replicate each attribute, notifying clients on every change
	DOREPLIFETIME_CONDITION_NOTIFY(UGAS_AttributeSet, Health, COND_None, REPNOTIFY_Always);
	DOREPLIFETIME_CONDITION_NOTIFY(UGAS_AttributeSet, Stamina, COND_None, REPNOTIFY_Always);
	DOREPLIFETIME_CONDITION_NOTIFY(UGAS_AttributeSet, AttackPower, COND_None, REPNOTIFY_Always);
}
Character Setup
In the files below, my character is named ThirdPersonCharacter, so any appearances
of this string may need to be replaced with your character's name instead. To set
up your character that inherits from the ACharacter base class, make the following
changes to your files.
In the ThirdPersonCharacter.h file, make sure you're inheriting from public
IAbilitySystemInterface. The start of your class should look like this. Pay
attention to the includes.
// All content (c) Shaun Reed 2021, all rights reserved
#pragma once
// GAS includes
#include "AbilitySystemInterface.h"
#include <GameplayEffectTypes.h>
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "ThirdPersonCharacter.generated.h"
UCLASS()
class UNREALGAME5_API AThirdPersonCharacter : public ACharacter, public
IAbilitySystemInterface
{
GENERATED_BODY()
public:
// more code....
Character Components
Next, we add an instance of our GAS_AbilitySystem class using the
UGAS_AbilitySystemComponent typename, and we also add an instance of our
GAS_AttributeSet class using the UGAS_AttributeSet type.
UCLASS()
class UNREALGAME5_API AThirdPersonCharacter : public ACharacter, public
IAbilitySystemInterface
{
	GENERATED_BODY()
public:
	// GAS declarations
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "GAS")
	class UGAS_AbilitySystemComponent* AbilitySystemComponent;

	UPROPERTY()
	class UGAS_AttributeSet* Attributes;
	// more code....
Now we need to modify the character's constructor to add the new components we've
declared. I removed the code from my constructor that wasn't related to the GAS.
The additions are below.
// ThirdPersonCharacter.cpp
AThirdPersonCharacter::AThirdPersonCharacter()
{
	// Initializing any components unrelated to GAS...
	// ...

	// Create the GAS components declared in the header
	AbilitySystemComponent = CreateDefaultSubobject<UGAS_AbilitySystemComponent>(TEXT("AbilitySystemComponent"));
	Attributes = CreateDefaultSubobject<UGAS_AttributeSet>(TEXT("Attributes"));
}
So we have the components we need, and the next step is to provide the required
definitions for the virtual functions we've inherited from IAbilitySystemInterface.
Virtual Functions
To start, we declare the required virtual functions that we will need to define to
use the GAS.
UCLASS()
class UNREALGAME5_API AThirdPersonCharacter : public ACharacter, public
IAbilitySystemInterface
{
	GENERATED_BODY()
public:
	// GAS declarations
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "GAS")
	class UGAS_AbilitySystemComponent* AbilitySystemComponent;

	// Required by IAbilitySystemInterface
	virtual class UAbilitySystemComponent* GetAbilitySystemComponent() const override;

	// Apply default attributes and grant default abilities
	virtual void InitializeAttributes();
	virtual void GiveAbilities();

	// Server- and client-side GAS initialization
	virtual void PossessedBy(AController* NewController) override;
	virtual void OnRep_PlayerState() override;
	// more code....
Define InitializeAttributes()
// ThirdPersonCharacter.cpp
void AThirdPersonCharacter::InitializeAttributes()
{
	// If the ASC and DefaultAttributeEffect objects are valid
	if (AbilitySystemComponent && DefaultAttributeEffect)
	{
		// Create context object for this gameplay effect
		FGameplayEffectContextHandle EffectContext = AbilitySystemComponent->MakeEffectContext();
		EffectContext.AddSourceObject(this);

		// Create an outgoing effect spec using the effect to apply and the context
		FGameplayEffectSpecHandle SpecHandle = AbilitySystemComponent->MakeOutgoingSpec(DefaultAttributeEffect, 1, EffectContext);
		if (SpecHandle.IsValid())
		{
			// Apply the effect using the derived spec
			// + Could be ApplyGameplayEffectToTarget() instead if we were shooting a target
			FActiveGameplayEffectHandle GEHandle = AbilitySystemComponent->ApplyGameplayEffectSpecToSelf(*SpecHandle.Data.Get());
		}
	}
}
Similar to how we defined default attributes, we define default abilities for our
character by overriding the GiveAbilities() function. We also add a
DefaultAbilities array to store the default abilities for the character.
Notice that we use the UPROPERTY macro to apply EditDefaultsOnly to our components.
This will later allow us to modify these components in the UE5 editor via our
character's blueprint, so we can dynamically add and remove attributes and
abilities for our player without modifying the code each time.
```cpp
// ThirdPersonCharacter.h
UCLASS()
class UNREALGAME5_API AThirdPersonCharacter : public ACharacter, public IAbilitySystemInterface
{
    GENERATED_BODY()

public:
    // GAS declarations
    virtual void InitializeAttributes();
    virtual void GiveAbilities();
};
```
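Since GiveAbilities() is only declared here, a sketch of what its definition might look like, assuming DefaultAbilities is a TArray of TSubclassOf<UGAS_GameplayAbility> (that declaration is my assumption; it isn't shown in this excerpt):

```cpp
// ThirdPersonCharacter.cpp -- a sketch, not necessarily the author's exact implementation
void AThirdPersonCharacter::GiveAbilities()
{
    // Abilities should only be granted on the server
    if (HasAuthority() && AbilitySystemComponent)
    {
        for (TSubclassOf<UGAS_GameplayAbility>& StartupAbility : DefaultAbilities)
        {
            // Grant each default ability at level 1
            AbilitySystemComponent->GiveAbility(
                FGameplayAbilitySpec(StartupAbility, 1, INDEX_NONE, this));
        }
    }
}
```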
```cpp
void AThirdPersonCharacter::OnRep_PlayerState()
{
    Super::OnRep_PlayerState();
    AbilitySystemComponent->InitAbilityActorInfo(this, this);
    InitializeAttributes();
}
```
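OnRep_PlayerState() only fires on clients, so the server needs its own initialization path. In a typical GAS setup the same calls are made in a PossessedBy() override; this isn't shown in the excerpt, so treat the following as a sketch of the usual pattern rather than the author's code:

```cpp
// ThirdPersonCharacter.cpp -- server-side counterpart, sketched from the usual GAS pattern
void AThirdPersonCharacter::PossessedBy(AController* NewController)
{
    Super::PossessedBy(NewController);

    // Initialize the ASC on the server, then apply default attributes and abilities
    AbilitySystemComponent->InitAbilityActorInfo(this, this);
    InitializeAttributes();
    GiveAbilities();
}
```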
And just to be sure we have all the headers we need, here are the final includes for my character.

```cpp
// ThirdPersonCharacter.h
#include "CoreMinimal.h"
#include "GameFramework/Character.h"

// GAS includes
#include "AbilitySystemInterface.h"
#include <GameplayEffectTypes.h>
#include "GAS_AbilitySystemComponent.h"
#include "GAS_GameplayAbility.h"
#include "GAS_AttributeSet.h"

#include "ThirdPersonCharacter.generated.h"
```

```cpp
// ThirdPersonCharacter.cpp
// All content (c) Shaun Reed 2021, all rights reserved
#include "ThirdPersonCharacter.h"

// Engine includes
#include "Kismet/GameplayStatics.h" // For spawning fireball static mesh
#include "Camera/CameraComponent.h"
#include "GameFramework/SpringArmComponent.h"
#include "GameFramework/CharacterMovementComponent.h"
```
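One requirement that's easy to miss: implementing IAbilitySystemInterface means the class must override GetAbilitySystemComponent(). If your character doesn't already have it, it would look roughly like this (a sketch using the member declared earlier):

```cpp
// ThirdPersonCharacter.cpp
// Required by IAbilitySystemInterface so GAS utilities can find our ASC
UAbilitySystemComponent* AThirdPersonCharacter::GetAbilitySystemComponent() const
{
    return AbilitySystemComponent;
}
```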
Defining Abilities
At this point we have configured GAS for our project and our character, so we're
ready to start defining our abilities!
In my assets folder, I just created an Abilities subdirectory and continued with
the steps below, creating the assets within this directory.
Default Abilities
First I created a new Blueprint Class using the editor and derived from the
GameplayEffect class. Applying this effect will result in the player or character
being granted a set of default abilities.
I named this GE_CharacterDefaults, and opened it for editing. The screenshot below
contains all settings I modified under the Class Defaults panel. If it isn't in
this screenshot, I didn't change it.
Damage Effect
To prove the system is working, create a new blueprint that derives from GameplayEffect and apply the settings below.
Then, in the event graph for your BP_ThirdPersonCharacter, add a BeginPlay node and apply the damage when the game starts.

Hit play, then open a console and type showdebug abilitysystem to see that your HP should now be 80 on the left-hand side. Remove the damage to your player when you're done testing, but you can keep the GE_Damage asset around to use later.
Attack Ability
First, open the animation blueprint for your character and add a montage Slot 'DefaultSlot' to your anim graph. After making the changes, my screen looked like the screenshot below. Make sure to save and apply these changes.
Make a new blueprint deriving from the GAS_GameplayAbility class that we defined earlier.
Next, open GA_Attack for editing and add the following blueprint nodes. Add the GetActorInfo node from the context menu shown in the screenshot; be sure to uncheck 'Context Sensitive' if it isn't appearing at first.
Now right-click the GetActorInfo node and select Split Struct Pin to split the actor into its components.

Then connect the skeletal mesh pins to finish the blueprint for GA_Attack.

Under the Montage To Play pin on the Play Montage node, you may not have a montage available for your skeleton. If you also don't have an animation, check out Mixamo for a free animation and see the page on Retargeting Skeleton Animations.
Then create a montage by watching this quick YouTube video. If you're doing a simple punch animation, you probably just need to create a montage, click and drag the animation into the center of the screen, and save. It's pretty simple, but you can use Montages to do some pretty neat things. For your first montage, maybe try making a one-off animation that doesn't loop, like a punch or a grab motion for interactions.
Once you have the montage made, select it here in this node, and then play the
game. You'll now be able to see your character performing the attack!
Additional Abilities
At a high level, adding a new ability repeats the same pattern: create a new blueprint deriving from GAS_GameplayAbility, build out its logic in the event graph, and add it to the character's default abilities so it is granted on spawn.
Debugging
To see useful information on the GAS, enter play mode and hit the tilde (~) key to open a console. Then type showdebug abilitysystem, and you'll notice you can see your character stats even if there are no UI elements to represent them yet.
Mixamo to UE5
Animation Retargeting UE5
These systems have changed with the official release of UE5. See the links above for updated tutorials.
If you're not an animator, you might also get use out of sites like Mixamo that
provide some free animations. To use these though, you need to retarget the
animation for your skeleton in UE5.
First, open the Skeleton asset for your character. For the UE4 Mannequin, it should
be called something like UE4_Mannequin_Skeleton. Once it's open for editing, click
Retarget Manager at the top of the screen.
It will reveal a new side panel like the one below. Select the Humanoid option from the dropdown menu, and it should automatically populate all the bones for your skeleton if you are using the UE4 mannequin. If not, you will probably have to manually make these assignments, as I did for the next skeleton in this section.
Next, click the Save button in the above Set up Rig dialog, and save the rig
someplace safe.
Open the other Skeleton asset that was used for the animation you want to use, and
again click the Retarget Manager button at the top of the screen. Then select the
Humanoid option from the dropdown, and assign the bones to their respective members
on the skeleton.
When you're finished, it should look like the screenshot below. Click the Show
Advanced button and do the same for as many bones as you can. Don't worry if you
can't figure some of them out, you can still produce a good result most of the
time. It might seem like overkill to assign each finger, but if you're doing a
punching animation like I was it will look more like a slap if you don't.
Click the Save button and store this Rig asset somewhere. Then make sure you also save changes to both Skeleton assets, and right-click the animation you want to retarget. Select Duplicate Anim Assets and Retarget from the context menu as seen in the screenshot below.
Select your skeleton on the left hand side and then set a name for the file and
choose a location to save the retargeted animation.
If the Source and Target skeletons don't have similar poses in the above dialog, open the skeleton with the abnormal pose in the Retarget Manager and adjust it to match the more normal pose. Then, in the Set up Rig dialog, select Modify Pose->Use Current Pose as seen in the screenshot below.
Once you're happy with the poses of the skeletons, you can retarget the animation
and see the results. If you aren't happy with the retargeted animation, try
defining more bones in the Rig assets for your skeletons, or maybe select a
different animation.