TY - JOUR
AU - Albert Clapes
AU - Alex Pardo
AU - Oriol Pujol
AU - Sergio Escalera
PY - 2018//
TI - Action detection fusing multiple Kinects and a WIMU: an application to in-home assistive technology for the elderly
T2 - MVAP
JO - Machine Vision and Applications
SP - 765
EP - 788
VL - 29
IS - 5
KW - Multimodal activity detection
KW - Computer vision
KW - Inertial sensors
KW - Dense trajectories
KW - Dynamic time warping
KW - Assistive technology
N2 - We present a vision-inertial system that combines two RGB-Depth devices with a wearable inertial movement unit to detect activities of daily living. From the multi-view videos, we extract dense trajectories enriched with a histogram-of-normals description computed from the depth cue and bag them into multi-view codebooks. In the subsequent classification step, a multi-class support vector machine with an RBF-χ2 kernel combines the descriptions at the kernel level. Action detection in the videos is performed with a sliding-window approach. From the inertial data collected by the wearable placed on the user's dominant wrist, we extract acceleration, rotation-angle, and jerk features. For gesture spotting, dynamic time warping is applied and the alignment costs to a set of pre-selected gesture sub-classes are thresholded to determine possible detections. The outputs of the two modules are combined in a late-fusion fashion. The system is validated in a real-case scenario with elderly users in an elder home. Learning-based fusion results improve on those of the single modalities, demonstrating the success of such a multimodal approach.
UR - https://link.springer.com/article/10.1007/s00138-018-0931-1
L1 - http://158.109.8.37/files/CPP2018.pdf
N1 - HUPBA; no proj
ID - Albert Clapes2018
ER -