Sensor-Based Tracking Processing

Grid Studio also supports processing sensor-based tracking data, such as LiDAR or other spatial tracking systems.

Objects used for sensor processing are created and organized in the Project Tree.

Unlike object-based tracking systems, where a specific object is driven directly, sensor-based tracking often works with dynamic entities detected by the sensor. Because the number and identity of tracked entities can change continuously, additional objects are used to evaluate and process the incoming tracking data.

Typical Sensor Tracking Flow

1. Sensor Object

The Sensor Object receives raw tracking data from the external sensor system. This object acts as the entry point for tracking data within the project.
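Conceptually, each sensor frame is just a set of detected points with no stable identity attached. The sketch below is not Grid Studio's API; the `SensorFrame` type and its fields are hypothetical, illustrating the kind of raw data a Sensor Object receives per frame.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """Hypothetical raw tracking frame: a timestamp plus a bag of
    detected positions, with no entity identity yet."""
    timestamp: float                    # seconds since sensor start
    points: list[tuple[float, float]]   # detected (x, y) positions in meters

# One frame with three raw detections; identities are assigned later,
# during filtering and clustering.
frame = SensorFrame(timestamp=0.04, points=[(1.2, 0.5), (1.25, 0.55), (4.0, 2.0)])
print(len(frame.points))
```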

2. Filtering and Clustering

Inside the Sensor Object, incoming sensor data can be processed using filters and clustering methods. These operations help identify and track individual entities detected by the sensor.
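To illustrate the idea behind clustering, here is a minimal greedy sketch that groups nearby points into entities. This is a simplified stand-in, not the method Grid Studio uses; production systems typically rely on more robust algorithms such as DBSCAN.

```python
import math

def cluster_points(points, max_dist=0.5):
    # Greedy clustering sketch: each point joins the first existing
    # cluster whose centroid lies within max_dist; otherwise it starts
    # a new cluster. Parameter names are illustrative.
    clusters = []  # each cluster is a list of (x, y) points
    for x, y in points:
        for c in clusters:
            cx = sum(px for px, _ in c) / len(c)
            cy = sum(py for _, py in c) / len(c)
            if math.hypot(x - cx, y - cy) <= max_dist:
                c.append((x, y))
                break
        else:
            clusters.append([(x, y)])
    return clusters

# Three nearby detections collapse into one entity; the distant
# point becomes a second entity.
raw = [(1.2, 0.5), (1.25, 0.55), (1.3, 0.5), (4.0, 2.0)]
entities = cluster_points(raw)
print(len(entities))  # → 2
```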

3. Volume Object

A Volume Object can receive the processed tracking data through an internal Map Input connection.

Volumes evaluate the spatial position of tracked entities and can detect when objects enter, leave, or move within a defined area.

This approach allows the system to work with dynamic tracking data, where entities may appear or disappear over time.
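The enter/leave logic described above can be sketched as a per-frame comparison of which entity IDs fall inside the volume. The box shape, function names, and ID scheme here are assumptions for illustration, not Grid Studio's implementation.

```python
def in_volume(pos, volume):
    # Axis-aligned box test; volume = (xmin, ymin, xmax, ymax).
    x, y = pos
    xmin, ymin, xmax, ymax = volume
    return xmin <= x <= xmax and ymin <= y <= ymax

def volume_events(prev_ids, curr_positions, volume):
    """Compare which entity IDs were inside the volume last frame
    versus this frame, and report enter/leave events."""
    curr_ids = {eid for eid, pos in curr_positions.items()
                if in_volume(pos, volume)}
    entered = curr_ids - prev_ids
    left = prev_ids - curr_ids
    return curr_ids, entered, left

# Entity "a" was inside last frame but has moved out; "b" has moved in.
vol = (0.0, 0.0, 2.0, 2.0)
inside, entered, left = volume_events(
    prev_ids={"a"},
    curr_positions={"a": (3.0, 3.0), "b": (1.0, 1.0)},
    volume=vol,
)
print(sorted(entered), sorted(left))  # → ['b'] ['a']
```

Because the comparison is keyed by ID sets rather than fixed objects, entities can appear or disappear between frames without breaking the evaluation.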

4. Volume Map Output

The Volume Object can generate signals through its Map Outputs, which can be used to trigger workflows, control parameters, or drive other objects within the project.

In many cases, the data generated by volumes is also used internally, for example to create position-based triggers or dynamic targets for other tracking or interactive systems, without routing it through a Map Output.
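As a rough sketch of the kind of signals a volume might expose, the function below derives an occupancy flag, an entity count, and a centroid from the entities currently inside a volume. The output names are hypothetical, not Grid Studio's actual Map Outputs.

```python
def map_outputs(entities_inside):
    # entities_inside: list of (x, y) positions currently inside the
    # volume. Derive simple downstream signals from it; the dictionary
    # keys here are illustrative, not real output names.
    count = len(entities_inside)
    occupied = count > 0
    if occupied:
        cx = sum(x for x, _ in entities_inside) / count
        cy = sum(y for _, y in entities_inside) / count
        centroid = (cx, cy)
    else:
        centroid = None  # no meaningful position when the volume is empty
    return {"occupied": occupied, "count": count, "centroid": centroid}

signals = map_outputs([(1.0, 1.0), (1.5, 0.5)])
print(signals)
```

Signals of this shape could drive a workflow trigger (`occupied`), scale a parameter (`count`), or position another object (`centroid`).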
