The first MobileMii platform, released two years ago, could detect the behavior of people in their homes in real time, using location and posture to identify specific actions. The platform has recently gained advanced new features: it now uses state-of-the-art video analysis to identify typical household activities such as cleaning, cooking, eating a meal, and working.
The new features rely on machine learning techniques based on the statistical analysis of different activities in a database of 45-minute videos produced specifically for the project. Around 50 volunteers performed a range of predetermined activities so that the videos would capture variations in a given task from one person to another. The initial prototype, the only one of its kind in the world, is already performing well, with a 75% successful-recognition rate based on the analysis of data from a single camera.
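The article does not describe MobileMii's actual model, but the general approach it outlines (learning the statistics of labeled activity recordings, then classifying new camera data against them) can be sketched with a toy nearest-centroid classifier. Everything here is illustrative: the feature names, the training clips, and the classifier choice are all assumptions, not details from the project.

```python
# Hypothetical sketch of activity recognition from per-frame feature
# vectors (e.g. posture/location descriptors). NOT the MobileMii model.
from collections import defaultdict

def train_centroids(clips):
    """clips: list of (label, [feature_vector, ...]) training examples.
    Returns one mean feature vector (centroid) per activity label."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for label, frames in clips:
        for vec in frames:
            if sums[label] is None:
                sums[label] = [0.0] * len(vec)
            sums[label] = [s + v for s, v in zip(sums[label], vec)]
            counts[label] += 1
    return {lbl: [s / counts[lbl] for s in vec] for lbl, vec in sums.items()}

def classify(centroids, frames):
    """Average the clip's frame features, then pick the nearest centroid."""
    mean = [sum(col) / len(frames) for col in zip(*frames)]
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(mean, c))
    return min(centroids, key=lambda lbl: sq_dist(centroids[lbl]))

# Toy features (hand height, motion energy) -- purely hypothetical values.
training = [
    ("cooking", [[1.2, 0.8], [1.1, 0.9]]),
    ("eating",  [[0.9, 0.2], [0.8, 0.3]]),
]
centroids = train_centroids(training)
print(classify(centroids, [[1.15, 0.85]]))  # → cooking
```

A real system would replace the hand-crafted toy features with descriptors extracted by video analysis, but the train-on-labeled-clips, classify-new-footage loop is the same.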
The ongoing research is now focused on improving the algorithm, integrating object recognition, and adding higher-level reasoning. The innovation could be used in in-home services, an application currently being investigated under the ITEA3 Emospaces project.