Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
408944 | 679048 | 2016 | 7-page PDF | Free download |
Reaching is one of the most important behaviors in daily life and has attracted many researchers in both computer animation and robotics. However, existing methods either lack flexibility or produce unconvincing results. In this paper, we present a novel controller-based framework for reaching motion synthesis. Our framework consists of four stationary controllers that generate concrete reaching motion and three transition controllers that stitch the stationary controllers together automatically. Each stationary controller can be applied alone or combined with other stationary controllers. This design allows our method to effectively imitate the inherent tentative process of human reaching. Our controllers can also generate continuous reaching motion based on the virtual character's previous state, with no need to start from the same initial pose. Moreover, we incorporate a gaze simulation model into each controller, which guarantees consistency between head and hand movement. Experiments show that our framework is easy to implement and generates natural-looking reaching motion in real time.
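The abstract describes an architecture of stationary controllers handed off by transition controllers. The following is a minimal, hypothetical sketch of that stitching pattern; the controller names, durations, and hand-off rule are illustrative assumptions, not details from the paper, which models transitions as separate controllers rather than a fixed duration rule.

```python
# Hypothetical sketch of controller stitching: stationary controllers
# produce motion while active, and control hands off to the next one
# when the current one finishes. All names here are illustrative.

class StationaryController:
    def __init__(self, name, duration):
        self.name = name
        self.duration = duration  # number of frames this controller runs

    def step(self, frame):
        # Placeholder for per-frame pose generation
        # (e.g., IK toward the reach target in the real system).
        return f"{self.name}:frame{frame}"

def run_reach(controllers):
    """Run stationary controllers in sequence.

    The transition logic is modeled here as the simple rule that control
    passes on once a controller's duration elapses; the paper instead
    uses dedicated transition controllers for this hand-off."""
    motion = []
    for ctrl in controllers:
        for frame in range(ctrl.duration):
            motion.append(ctrl.step(frame))
    return motion

# Example pipeline: orient toward target, extend the arm, then grasp.
pipeline = [StationaryController("orient", 2),
            StationaryController("extend", 3),
            StationaryController("grasp", 1)]
motion = run_reach(pipeline)
```

Because each controller only needs the preceding frame's state, a sequence like this can start from wherever the character currently is rather than from a fixed initial pose, which is the property the abstract emphasizes.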
Journal: Neurocomputing - Volume 177, 12 February 2016, Pages 26–32