Senses Pack for Behavior Designer Pro

The Senses Pack for Behavior Designer Pro provides a comprehensive set of sensors and tasks for creating intelligent AI behaviors based on environmental perception. The Senses Pack is built around the following concepts:

Sense
A Sense is the fundamental unit of environmental perception. It represents an AI agent’s ability to detect and interpret specific types of environmental stimuli. Each Sense (implemented as a Sensor component) is responsible for gathering specific types of data from the environment, such as:

  • Visual information (Visibility, Luminance).
  • Auditory information (Sound).
  • Environmental conditions (Temperature, Surface type).
  • Spatial awareness (Distance, Tracing).
Senses continuously monitor their environment and provide real-time data that can be used to make decisions in the behavior tree.
Emitter
An Emitter is a component that generates environmental stimuli that can be detected by Senses. Emitters create the signals or conditions that Senses can detect, such as:

  • Sound sources that can be heard by the Sound sensor.
  • Temperature sources that can be detected by the Temperature sensor.
  • Light sources that affect the Luminance sensor.
  • Surface properties that can be detected by the Surface sensor.
Emitters work in conjunction with Senses to create a complete perception system where agents can both detect and be detected.
Task
A Task is a behavior tree node that uses the data provided by Senses to make decisions and perform actions. Tasks are the bridge between perception (Senses) and behavior. They can:

  • Check if specific conditions are met (WithinRange, CanDetectObject).
  • Query sensor data (GetSensorAmount).
  • Execute behaviors based on sensor input (FollowTraceTrail).
  • Combine multiple sensor inputs to make complex decisions.
Tasks combine the environmental data gathered by Senses with behavior tree logic to create sophisticated behaviors. A task can also specify a Detection Mode, which determines how the Sensor retrieves Emitters. The following Detection Modes can be used (see the sketch after this list):
  • Object: Allows sensors to detect a specific GameObject. This mode is ideal for tracking individual targets or objects of interest.
  • Object Array: Enables sensors to detect multiple GameObjects simultaneously. This mode is useful for scenarios where an agent needs to be aware of several objects at once, such as tracking multiple targets or monitoring a group of items.
  • Tag: Allows sensors to detect all GameObjects with a specific tag. This mode is particularly useful for detecting objects of a certain type or category, such as all enemies, collectibles, or interactive objects. Note: for performance reasons, this detection mode is not recommended for production use.
  • LayerMask: Enables sensors to detect objects on specific layers. This mode is essential for filtering detection based on object categories defined by Unity’s layer system.
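
To make the four modes concrete, here is a rough sketch of how candidate objects could be gathered for each mode using standard Unity APIs. The class, field names, and the "Enemy" tag are hypothetical; the pack’s sensors perform this work internally and may do it differently.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical illustration of the four Detection Modes; not the pack's actual API.
public class DetectionModeExample : MonoBehaviour
{
    public GameObject singleTarget;      // Object mode: one specific GameObject.
    public GameObject[] targetArray;     // Object Array mode: several specific GameObjects.
    public string targetTag = "Enemy";   // Tag mode: every GameObject with this tag.
    public LayerMask targetLayers;       // LayerMask mode: objects on specific layers.
    public float detectionRadius = 10f;

    public List<GameObject> GatherCandidates()
    {
        var candidates = new List<GameObject>();

        // Object: track a single object of interest.
        if (singleTarget != null) {
            candidates.Add(singleTarget);
        }

        // Object Array: track several objects at once.
        if (targetArray != null) {
            candidates.AddRange(targetArray);
        }

        // Tag: every object with the tag. FindGameObjectsWithTag scans the scene,
        // which is why Tag mode is discouraged for production use.
        candidates.AddRange(GameObject.FindGameObjectsWithTag(targetTag));

        // LayerMask: an overlap query filtered by Unity layers.
        foreach (var hitCollider in Physics.OverlapSphere(transform.position, detectionRadius, targetLayers)) {
            candidates.Add(hitCollider.gameObject);
        }

        return candidates;
    }
}
```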

Objects can be detected using several types of physics casts, illustrated in the sketch after this list:

  • Raycast: Uses a single ray to detect objects in a straight line, ideal for precise line-of-sight checks.
  • Sphere Cast: Uses a spherical cast to detect objects in a volume, useful for detecting objects in a specific radius.
  • Circle Cast: Uses a circular cast for 2D detection, perfect for 2D games and side-scrollers.
  • Capsule Cast: Uses a capsule-shaped cast to detect objects, great for character-sized detection areas.
  • Box Cast: Uses a box-shaped cast to detect objects, useful for detecting objects in a rectangular area.
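
The cast shapes map onto Unity’s standard physics queries. The sketch below shows roughly what each cast corresponds to; the parameters are arbitrary examples and the sensors’ actual queries may be configured differently.

```csharp
using UnityEngine;

// Simplified examples of the physics queries behind each cast type. These use
// standard Unity APIs; the sensors' actual queries may be configured differently.
public class CastExamples : MonoBehaviour
{
    public float maxDistance = 10f;
    public float radius = 0.5f;

    void Update()
    {
        Vector3 origin = transform.position;
        Vector3 forward = transform.forward;
        RaycastHit hit;

        // Raycast: a single ray for precise line-of-sight checks.
        if (Physics.Raycast(origin, forward, out hit, maxDistance)) {
            Debug.Log("Raycast hit " + hit.collider.name);
        }

        // Sphere Cast: sweeps a sphere along the ray, detecting anything within the radius.
        if (Physics.SphereCast(origin, radius, forward, out hit, maxDistance)) {
            Debug.Log("Sphere cast hit " + hit.collider.name);
        }

        // Capsule Cast: sweeps a capsule, roughly matching a character-sized volume.
        if (Physics.CapsuleCast(origin, origin + Vector3.up, radius, forward, out hit, maxDistance)) {
            Debug.Log("Capsule cast hit " + hit.collider.name);
        }

        // Box Cast: sweeps a box, useful for rectangular detection areas.
        if (Physics.BoxCast(origin, Vector3.one * radius, forward, out hit, transform.rotation, maxDistance)) {
            Debug.Log("Box cast hit " + hit.collider.name);
        }

        // Circle Cast: the 2D equivalent of a sphere cast, for 2D games and side-scrollers.
        RaycastHit2D hit2D = Physics2D.CircleCast(origin, radius, Vector2.right, maxDistance);
        if (hit2D.collider != null) {
            Debug.Log("Circle cast hit " + hit2D.collider.name);
        }
    }
}
```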

Included Sensors

Distance
The Distance sensor enables AI agents to measure and track distances to specific objects or points in their environment. It provides configurable parameters for maximum range, update frequency, and target filtering. This sensor is essential for creating behaviors that require spatial awareness, such as maintaining formation, keeping a safe distance, or measuring approach distances. The sensor can track multiple targets simultaneously and provides real-time distance measurements.
Luminance
The Luminance sensor allows AI agents to detect and respond to light levels in their environment. It provides configurable parameters for light sensitivity, detection range, and update frequency. This sensor is useful for creating behaviors that respond to lighting conditions, such as seeking well-lit areas or avoiding dark spaces. The sensor can detect both ambient and direct light sources, allowing for complex light-based behaviors.
Surface
The Surface sensor allows AI agents to detect and identify the type of surface they are on or near. It provides configurable parameters for surface type detection, detection radius, and update frequency. This sensor is particularly useful for creating behaviors that adapt to different terrains or materials, such as adjusting movement speed on different surfaces or triggering specific animations based on ground type.
Sound
The Sound sensor allows AI agents to detect and respond to audio sources in their environment. It features configurable parameters for hearing range, minimum volume threshold, and layer filtering. This sensor is ideal for creating behaviors where agents need to respond to sounds, such as investigating noises or being alerted by gunshots. The sensor can filter sounds based on their volume and distance, allowing for realistic hearing mechanics.
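
As a rough illustration of this kind of hearing check, the sketch below attenuates a source’s volume over distance and compares the result against a minimum threshold. The linear falloff and the method names are assumptions, not the Sound sensor’s actual formula.

```csharp
using UnityEngine;

// Illustrative hearing check: attenuate the source volume by distance and
// compare against a threshold. The linear falloff is an assumption.
public static class HearingExample
{
    public static bool CanHear(Vector3 listenerPosition, AudioSource source,
                               float hearingRange, float minimumVolume)
    {
        float distance = Vector3.Distance(listenerPosition, source.transform.position);
        if (distance > hearingRange) {
            return false; // Outside the hearing range entirely.
        }

        // Simple linear falloff from full volume at the source to silence at the edge of the range.
        float perceivedVolume = source.volume * (1f - distance / hearingRange);
        return perceivedVolume >= minimumVolume;
    }
}
```
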
Temperature
The Temperature sensor enables AI agents to detect temperature sources and zones in their environment. It provides configurable parameters for temperature range, detection radius, and update frequency. This sensor is useful for creating behaviors where agents need to respond to temperature changes, such as seeking warmth in cold environments or avoiding heat sources. The sensor can detect temperature gradients and respond to both ambient and direct temperature sources.
Tracer
The Tracer sensor enables AI agents to follow trails or paths in their environment. It provides configurable parameters for trail detection radius, persistence time, and update frequency. This sensor is ideal for creating tracking behaviors, such as following scent trails or tracking footprints. The sensor can detect different types of trails and maintain awareness of their freshness or strength.
Visibility
The Visibility sensor enables AI agents to detect objects within their line of sight and field of view. It provides configurable parameters for the field of view angle (both horizontal and vertical), detection range, and layer filtering. If DebugDrawRay is enabled, a line will be drawn in the Scene view indicating the status of the sensor (a simplified sketch of these checks follows the list):
  • Red: The target is too far away from the agent.
  • Magenta: The target is outside the horizontal field of view.
  • Cyan: The target is outside the vertical field of view.
  • Yellow: Another object is blocking the target.
  • Green: The target can be seen and is within range.
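
A minimal sketch of the checks behind this color legend is shown below: range, horizontal and vertical field of view, and an occlusion test, with a debug ray drawn in the matching color. The structure and names are illustrative assumptions, not the pack’s implementation.

```csharp
using UnityEngine;

// Minimal sketch of the checks behind the color legend above: range, horizontal
// and vertical field of view, then an occlusion test. Not the pack's implementation.
public class VisibilityCheckExample : MonoBehaviour
{
    public Transform target;
    public float detectionRange = 20f;
    public float horizontalFieldOfView = 90f; // Total angle in degrees.
    public float verticalFieldOfView = 60f;
    public bool debugDrawRay = true;

    public bool CanSeeTarget()
    {
        Vector3 toTarget = target.position - transform.position;
        Vector3 localDirection = transform.InverseTransformDirection(toTarget.normalized);
        float horizontalAngle = Mathf.Abs(Mathf.Atan2(localDirection.x, localDirection.z) * Mathf.Rad2Deg);
        float verticalAngle = Mathf.Abs(Mathf.Asin(Mathf.Clamp(localDirection.y, -1f, 1f)) * Mathf.Rad2Deg);

        Color status;
        bool visible = false;
        RaycastHit hit;
        if (toTarget.magnitude > detectionRange) {
            status = Color.red;        // Too far away.
        } else if (horizontalAngle > horizontalFieldOfView * 0.5f) {
            status = Color.magenta;    // Outside the horizontal field of view.
        } else if (verticalAngle > verticalFieldOfView * 0.5f) {
            status = Color.cyan;       // Outside the vertical field of view.
        } else if (Physics.Linecast(transform.position, target.position, out hit) &&
                   hit.transform != target) {
            status = Color.yellow;     // Another object is blocking the target.
        } else {
            status = Color.green;      // Visible and within range.
            visible = true;
        }

        if (debugDrawRay) {
            Debug.DrawRay(transform.position, toTarget, status);
        }
        return visible;
    }
}
```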

Included Tasks

Can Detect Object
The Can Detect Object task enables AI agents to verify if they can detect a specific object using their sensors. This versatile task can work with multiple sensor types and provides configurable detection parameters. It’s essential for creating behaviors that require object recognition, such as identifying targets or finding specific items in the environment.
Can Detect Surface
The Can Detect Surface task enables AI agents to verify if they can detect a specific type of surface in their environment. Working in conjunction with the Surface sensor, this task allows for configurable surface type matching and detection radius. It’s essential for creating behaviors that need to adapt to different terrains, such as slowing down on slippery surfaces or triggering specific animations when walking on different materials.
Follow Trace Trail
The Follow Trace Trail task empowers AI agents to follow detected trails or paths in their environment. Designed to work with the Tracer sensor, this task provides configurable parameters for following speed and update frequency. It’s ideal for creating tracking behaviors, such as following scent trails or tracking footprints. The task can be configured with success and failure conditions to handle various tracking scenarios, such as losing the trail or reaching the end of the path. This task requires the Movement Pack.
Get Sensor Amount
The Get Sensor Amount task provides a way to retrieve and evaluate the current value from any sensor. This task is particularly useful for creating conditional checks and threshold-based behaviors. It can compare sensor values against specific thresholds and trigger different behaviors based on the comparison results. The task’s flexibility makes it valuable for creating complex behaviors that depend on precise sensor measurements, such as responding to specific light levels or temperature ranges.
Within Range
The Within Range task allows AI agents to check if a target object or position is within a specified range. This versatile task can work with any sensor type and provides configurable parameters for the range threshold and update frequency. It’s particularly useful for scenarios such as detecting whether an object is within a specified distance. The task can be configured with success and failure conditions to trigger different behaviors based on the range check results.
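
Conceptually, the Get Sensor Amount and Within Range checks boil down to simple comparisons, roughly as sketched below. The method names are hypothetical; the actual tasks read their values from the attached sensors and report success or failure through the behavior tree.

```csharp
using UnityEngine;

// Hypothetical illustration of the comparisons these tasks perform; the pack's
// tasks obtain their values from the attached sensors rather than from parameters.
public static class ComparisonExamples
{
    // Get Sensor Amount style check: compare a sensor reading against a threshold.
    public static bool ExceedsThreshold(float sensorValue, float threshold)
    {
        return sensorValue >= threshold;
    }

    // Within Range style check: is the target inside the configured range?
    public static bool WithinRange(Vector3 agentPosition, Vector3 targetPosition, float range)
    {
        return Vector3.Distance(agentPosition, targetPosition) <= range;
    }
}
```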

Task Setup

Each task can be set up by following this workflow:
  1. Add the desired task. For this example we are going to use the Can Detect Object task.
  2. Select the Sensor type.
  3. Specify how the Sensor should detect the emitter. In the screenshot below we’ve selected the Visibility Sensor with the Object Detection Mode.

The Follow Trace Trail task uses a Movement Pack workflow to detect the trace.

Emitter Setup

The following emitters require setup within the scene:

Luminance

Each light source must have the Luminance Emitter component added to it for the light to be included in the luminance calculation. The Luminance Manager is a singleton component that will be added to the scene automatically if it hasn’t already been added. This manager allows you to specify if ambient light should be used in the calculation.
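
As a rough illustration of what such a calculation might look like, the sketch below sums an ambient term with a per-light distance falloff. The falloff model and names are assumptions; the Luminance Manager performs its own calculation.

```csharp
using UnityEngine;

// Rough approximation of a luminance reading: an ambient term plus a simple
// linear distance falloff per light. Assumes point/spot lights with a finite
// range; the Luminance Manager performs its own calculation.
public static class LuminanceExample
{
    public static float EstimateLuminance(Vector3 position, Light[] emitterLights, bool includeAmbient)
    {
        float luminance = includeAmbient ? RenderSettings.ambientIntensity : 0f;

        foreach (var emitterLight in emitterLights) {
            float distance = Vector3.Distance(position, emitterLight.transform.position);
            if (distance > emitterLight.range) {
                continue; // Outside the light's range, so it contributes nothing.
            }
            // Linear falloff toward the edge of the light's range.
            luminance += emitterLight.intensity * (1f - distance / emitterLight.range);
        }
        return luminance;
    }
}
```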

Surface

New Surface Types can be created by right-clicking within the Project tab and selecting Create -> Opsive -> Behavior Designer Pro -> Senses Pack -> Surface Type. The singleton Surface Manager component can be used to specify a corresponding Surface Type per texture. Alternatively, the Surface Identifier component can be added to a collider, which allows for manual Surface Type specification.
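
The sketch below shows one assumed way a surface type could be resolved at runtime: raycast downward and read an identifier component on the collider that was hit. The class names and fields are placeholders, not the pack’s actual types.

```csharp
using UnityEngine;

// Hypothetical stand-ins for the Surface Type asset and Surface Identifier component,
// showing how a surface could be resolved from a downward raycast. Names and fields
// are illustrative only.
[CreateAssetMenu(menuName = "Example/Surface Type")]
public class ExampleSurfaceType : ScriptableObject { }

public class ExampleSurfaceIdentifier : MonoBehaviour
{
    public ExampleSurfaceType surfaceType; // Manually assigned per collider.
}

public static class SurfaceLookupExample
{
    // Raycast down from the agent and read the identifier on whatever it is standing on.
    public static ExampleSurfaceType GetSurfaceBelow(Vector3 position, float maxDistance)
    {
        RaycastHit hit;
        if (Physics.Raycast(position, Vector3.down, out hit, maxDistance)) {
            var identifier = hit.collider.GetComponent<ExampleSurfaceIdentifier>();
            if (identifier != null) {
                return identifier.surfaceType;
            }
        }
        return null;
    }
}
```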

Temperature

The Temperature Volume should be used to specify a temperature within the attached trigger. On this component the temperature can be Absolute or Relative. Relative temperatures are added to the base Scene Temperature. The Scene Temperature is a singleton component that allows you to specify fluctuating temperatures based on the time of day.
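
The distinction between Absolute and Relative can be summarized as a one-line calculation, sketched below with assumed names.

```csharp
// Illustration of the Absolute vs. Relative distinction: a relative value is
// added to the base scene temperature, while an absolute value replaces it.
// An assumed sketch, not the component's actual code.
public static class TemperatureExample
{
    public static float EffectiveTemperature(float sceneTemperature, float volumeTemperature, bool isRelative)
    {
        return isRelative ? sceneTemperature + volumeTemperature : volumeTemperature;
    }
}
```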

Trace

The Trace Manager is a singleton component that specifies the maximum world bounds. If this component is selected while the game is playing you’ll see a visualization of the traces that have been added along with the bounding octree node. Individual traces can be added with the Trace Emitter component, as is done for concepts such as blood trails. The Trace Emitter can also be used for scent with a specified Dissipation Time.
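
As an assumed illustration of how a dissipating trace might behave, the sketch below models a trace point whose strength fades linearly over its Dissipation Time. The struct and the linear fade are illustrative only.

```csharp
using UnityEngine;

// Minimal sketch of a trace point that fades over its dissipation time, in the
// spirit of a scent left by the Trace Emitter. The struct and linear fade are assumptions.
public struct ExampleTracePoint
{
    public Vector3 Position;
    public float EmitTime;        // Time.time when the trace was emitted.
    public float DissipationTime; // Seconds until the trace fades completely.

    // Remaining strength in the 0-1 range; 0 means the trace has fully dissipated.
    public float Strength(float currentTime)
    {
        float age = currentTime - EmitTime;
        return Mathf.Clamp01(1f - age / DissipationTime);
    }
}
```

A Tracer-style query could then skip points whose strength has dropped to zero when reconstructing the trail.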