Open Access Research Article
Imposing temporal consistency on deep monocular body shape and pose estimation
Computational Visual Media 2023, 9 (1): 123-139
Published: 18 October 2022

Accurate and temporally consistent modeling of human bodies is essential for a wide range of applications, including character animation, understanding human social behavior, and AR/VR interfaces. Capturing human motion accurately from a monocular image sequence remains challenging; modeling quality is strongly influenced by the temporal consistency of the captured body motion. Our work presents an elegant solution for integrating temporal constraints during fitting, which increases both temporal consistency and robustness during optimization. In detail, we derive the parameters of a sequence of body models representing the shape and motion of a person. We optimize these parameters over the complete image sequence, fitting a single consistent body shape while imposing temporal consistency on the body motion, assuming body joint trajectories to be linear over short time intervals. Our approach enables the derivation of realistic 3D body models from image sequences, including jaw pose, facial expression, and articulated hands. Our experiments show that our approach accurately estimates body shape and motion, even for challenging movements and poses. Further, we apply it to the particular application of sign language analysis, where accurate and temporally consistent motion modeling is essential, and show that the approach is well suited to this kind of application.
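The linear-trajectory assumption mentioned in the abstract can be illustrated with a simple smoothness term: if a joint moves linearly over a short window, its second finite difference in time is zero. The sketch below is an illustrative Python example under that assumption; the function names temporal_consistency_loss and shape_consistency_loss, and the array shapes, are hypothetical placeholders rather than the authors' implementation.

```python
import numpy as np

def temporal_consistency_loss(joints):
    """Penalize deviation from locally linear joint trajectories.

    joints: array of shape (T, J, 3) with 3D joint positions over T frames.
    The second finite difference along time vanishes for linear motion,
    so this term only penalizes accelerations (non-linear motion).
    """
    accel = joints[:-2] - 2.0 * joints[1:-1] + joints[2:]
    return np.sum(accel ** 2)

def shape_consistency_loss(betas):
    """Encourage a single consistent body shape over the sequence.

    betas: array of shape (T, B) with per-frame shape coefficients;
    deviations from the sequence mean are penalized.
    """
    return np.sum((betas - betas.mean(axis=0, keepdims=True)) ** 2)
```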

Open Access Research Article
Surface tracking assessment and interaction in texture space
Computational Visual Media 2018, 4 (1): 3-15
Published: 15 June 2017

In this paper, we present a novel approach for assessing and interacting with surface tracking algorithms targeting video manipulation in post-production. As tracking inaccuracies are unavoidable, we enable the user to provide small hints to the algorithms instead of correcting erroneous results afterwards. Based on 2D mesh warp-based optical flow estimation, we visualize results and provide tools for user feedback in a consistent reference system: texture space. In this space, accurate tracking results are reflected by a static appearance, and errors can easily be spotted as apparent change. A variety of established tools can be utilized to visualize and assess the change between frames. User interaction to improve tracking results becomes more intuitive in texture space, as it can focus on a small region rather than a moving object. We show how established tools can be implemented for interaction in texture space, providing an interface that allows more effective and accurate user feedback.
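The texture-space assessment idea can be sketched as follows: each frame is resampled into texture space using the tracked surface, where accurate tracking leaves the appearance static and drift shows up as per-texel change. The Python sketch below assumes a precomputed backward mapping (map_x, map_y) from texel to image coordinates, e.g., rasterized from the tracked mesh's UV layout; it is an illustrative example, not the paper's implementation.

```python
import cv2
import numpy as np

def frame_to_texture_space(frame, map_x, map_y):
    """Resample a video frame into texture space.

    map_x, map_y: float32 arrays at texture resolution giving, for each
    texel, the image coordinates of the corresponding surface point under
    the current tracking result (assumed to be precomputed).
    """
    return cv2.remap(frame, map_x, map_y,
                     interpolation=cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)

def tracking_error_map(texture_ref, texture_cur):
    """Per-texel difference between two texture-space images.

    With accurate tracking the texture-space appearance stays static, so
    large differences highlight drift or sliding of the tracked surface.
    """
    diff = cv2.absdiff(texture_ref, texture_cur)
    return diff.mean(axis=-1) if diff.ndim == 3 else diff
```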
