Landmark-based human action recognition
In landmark-based human action recognition we are given sequences of points (the landmarks) representing the positions of major human body parts over time. Each sequence shows a person performing an action, as can be seen in the animation. The task is to train a classifier that labels the sequences with their action class. Our path signature methodology for this task combines several path transformations with the path signature to create a robust feature set, on which we can learn a linear classifier that is competitive with the state of the art. In this demo notebook we present a simplified version which, for ease of presentation, can be trained to good performance within a few minutes on a laptop CPU.
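The pipeline described above — transform each landmark trajectory, compute its truncated path signature, and use the result as a fixed-length feature vector for a linear classifier — can be sketched as follows. This is a minimal illustration under assumptions, not the notebook's actual feature set: the helper names `sig2` and `time_augment` are hypothetical, the signature is truncated at depth 2, and only one path transformation (time augmentation) is shown; in practice a dedicated library such as iisignature or esig would compute higher-order signatures.

```python
import numpy as np

def sig2(path):
    """Truncated path signature (depth 2) of a piecewise-linear path.

    path: array of shape (n_points, d). Returns the d level-1 terms
    followed by the d*d level-2 terms, accumulated segment by segment
    via Chen's identity for concatenation of paths.
    """
    d = path.shape[1]
    level1 = np.zeros(d)
    level2 = np.zeros((d, d))
    for dx in np.diff(path, axis=0):
        # Chen's identity: combine the signature so far with a linear segment,
        # whose own level-2 signature is outer(dx, dx) / 2.
        level2 += np.outer(level1, dx) + np.outer(dx, dx) / 2.0
        level1 += dx
    return np.concatenate([level1, level2.ravel()])

def time_augment(path):
    """One common path transformation: prepend a monotone time coordinate."""
    t = np.linspace(0.0, 1.0, len(path))[:, None]
    return np.hstack([t, path])

# A toy 2-d "landmark" trajectory becomes a fixed-length feature vector,
# ready for a linear classifier.
trajectory = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.0]])
features = sig2(time_augment(trajectory))
```

Because the signature has fixed length regardless of the number of frames, sequences of different durations all map to vectors of the same dimension, which is what makes a simple linear classifier applicable.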
Drone identification from reflected radio pulses

In this notebook we construct a classifier for identifying drones. Our assumption is that when a radio pulse is reflected off a drone, the signal received back by the observer is a combination of the reflection from the drone's body and the reflection from the drone's propeller. We compute path signatures for several thousand simulated radio pulses reflected off drone objects with varying propeller locations, and then average the path signatures. This use of expected path signatures aims to characterise the random behaviour of the reflected signals. Taking estimates of expected path signatures as our feature vectors, we consider the task of distinguishing drones from non-drone objects, as well as predicting the number of rotations per minute (RPM) of the drone's propeller.
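The expected-signature construction can be sketched as follows: simulate many pulses whose random component (here, the propeller's phase) varies, compute a truncated signature for each, and take the Monte Carlo average as the feature vector. Everything here is a toy sketch under stated assumptions, not the notebook's simulation: `simulated_pulse` uses a hypothetical sinusoidal model in which the propeller modulation frequency scales with RPM, and the signature is truncated at depth 2.

```python
import numpy as np

def sig2(path):
    """Truncated path signature (depth 2) of a piecewise-linear path,
    accumulated segment by segment via Chen's identity."""
    d = path.shape[1]
    level1, level2 = np.zeros(d), np.zeros((d, d))
    for dx in np.diff(path, axis=0):
        level2 += np.outer(level1, dx) + np.outer(dx, dx) / 2.0
        level1 += dx
    return np.concatenate([level1, level2.ravel()])

t = np.linspace(0.0, 1.0, 64)

def simulated_pulse(rpm, rng):
    """Hypothetical toy model: a deterministic body return plus a weaker
    propeller return whose frequency scales with RPM and whose phase is
    random from pulse to pulse."""
    phase = rng.uniform(0.0, 2.0 * np.pi)
    amplitude = (np.cos(2.0 * np.pi * 3.0 * t)
                 + 0.5 * np.cos(2.0 * np.pi * (rpm / 60.0) * t + phase))
    # Embed the 1-d signal as a 2-d path (time, amplitude).
    return np.column_stack([t, amplitude])

def expected_signature(rpm, n_pulses, rng):
    """Monte Carlo estimate of the expected signature over the random phase."""
    return np.mean([sig2(simulated_pulse(rpm, rng)) for _ in range(n_pulses)],
                   axis=0)

rng = np.random.default_rng(0)
feature_low = expected_signature(250.0, n_pulses=200, rng=rng)
feature_high = expected_signature(800.0, n_pulses=200, rng=rng)
```

Averaging the signatures, rather than the raw signals, is what captures the distribution of the random reflections: the estimated expected signatures then serve as feature vectors for the drone/non-drone and RPM prediction tasks.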