Our paper titled “Unsupervised Learning of Qualitative Motion Behaviours by a Mobile Robot” was accepted to the main track of the International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2016), taking place in Singapore.
The success of mobile robots in daily living environments depends on their ability to understand human movements and interact safely. This paper presents a novel unsupervised, qualitative-relational framework for learning human motion patterns using a single mobile robot platform. The framework learns human motion patterns in real-world environments in order to predict future behaviours.
This previously untackled task is challenging because of the limited field of view of a single mobile robot: it can only observe one location at a time, resulting in incomplete and partial human detections and trajectories. Central to the success of the presented framework is mapping the detections into an abstract qualitative space, then characterising motion in a way that is invariant to exact metric position.
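To give a flavour of this kind of qualitative abstraction, the sketch below maps a metric trajectory of (x, y) detections to a sequence of region labels and collapses consecutive repeats, so two paths through the same regions produce the same qualitative sequence regardless of their exact coordinates. This is a minimal illustration only: the landmark set and the nearest-landmark labelling are hypothetical choices for this example, not the paper's actual qualitative representation.

```python
from math import hypot

# Hypothetical landmarks for illustration; the paper's actual
# qualitative spatial representation may differ.
LANDMARKS = {
    "desk": (1.0, 1.0),
    "door": (5.0, 1.0),
    "printer": (5.0, 5.0),
}

def nearest_region(point):
    """Label a metric detection with its closest landmark."""
    x, y = point
    return min(LANDMARKS, key=lambda name: hypot(x - LANDMARKS[name][0],
                                                 y - LANDMARKS[name][1]))

def qualitative_sequence(trajectory):
    """Abstract a metric trajectory into a sequence of region labels,
    collapsing consecutive repeats so the result is invariant to the
    exact metric position within each region."""
    labels = [nearest_region(p) for p in trajectory]
    return [lab for i, lab in enumerate(labels)
            if i == 0 or lab != labels[i - 1]]

# Two metrically different tracks yield the same qualitative sequence.
track = [(1.1, 0.9), (1.4, 1.2), (3.8, 1.0), (4.8, 1.1), (5.1, 4.7)]
print(qualitative_sequence(track))  # desk -> door -> printer
```

Sequences in this abstract space can then be compared and clustered even when the underlying detections are partial or noisy, which is the property the framework exploits.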
This framework was used by a physical robot autonomously patrolling an office environment during a six week deployment. Experimental results from this deployment demonstrate the effectiveness and applicability of the system.
P. Duckworth, Y. Gatsoulis, F. Jovan, N. Hawes, D. C. Hogg and A. G. Cohn. Unsupervised Learning of Qualitative Motion Behaviours by a Mobile Robot. In Proc. of the Intl. Conf. on Autonomous Agents and Multiagent Systems (AAMAS). Singapore. May 2016.