Summary: This paper addresses the problem of segmenting a physical robot's perceptual stream into meaningful events over time. In structured environments this problem can be approached with domain-specific techniques, but in the general case, such as when facing unknown environments, it becomes non-trivial. We propose a dynamical systems approach, consisting of simultaneously learning a model of the robot's interaction with the environment (robot and world seen as a single, coupled dynamical system) and deriving predictions about its short-term evolution. Event boundaries are detected, by means of a simple statistical test, once synchronization between prediction and observation is lost. An experimental proof of concept of the proposed framework is presented, simulating a simple active perception task of a robot following a ball. The reported results corroborate the approach, in the sense that event boundaries are correctly detected.
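The boundary-detection idea described above can be sketched as a running statistical test on the model's prediction error: a boundary is declared when the current error departs sharply from its recent statistics, signalling loss of synchronization. This is only an illustrative sketch; the paper's exact test, and the window, threshold, and synthetic data below, are assumptions.

```python
import numpy as np

def detect_event_boundaries(observations, predictions, window=20, z_thresh=3.0):
    """Flag time steps where prediction error deviates from its recent history.

    A simple z-score test: a boundary is declared at time t when the
    prediction error exceeds the mean of the last `window` errors by more
    than `z_thresh` standard deviations.
    """
    errors = np.linalg.norm(observations - predictions, axis=-1)
    boundaries = []
    for t in range(window, len(errors)):
        recent = errors[t - window:t]
        mu, sigma = recent.mean(), recent.std()
        if sigma > 0 and (errors[t] - mu) / sigma > z_thresh:
            boundaries.append(t)
    return boundaries

# Synthetic 1-D example: predictions track observations closely until t=60,
# where the dynamics change and synchronization is lost (an event boundary).
rng = np.random.default_rng(0)
obs = np.cumsum(rng.normal(0, 0.01, size=(100, 1)), axis=0)
pred = obs + rng.normal(0, 0.01, size=(100, 1))
pred[60:] += 1.0  # desynchronization after the event
print(60 in detect_event_boundaries(obs, pred))  # → True
```

In practice the predictions would come from the learned model of the coupled robot-environment dynamics, and the window and threshold would be tuned to the noise level of the robot's sensors.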