Animation database
An animation database is a database which stores fragments of animations or human movements and which can be accessed, analyzed and queried to develop and assemble new animations.[1][2] Given that the manual generation of large amounts of animation can be time-consuming and expensive, an animation database can assist users in building animations by reusing existing components and by sharing animation fragments.[2]
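To make the idea concrete, the following is a minimal sketch in Python of how animation fragments might be stored with metadata and queried to assemble a new sequence. The schema, clip names and fields are illustrative assumptions, not taken from any of the cited systems.

```python
import sqlite3

# Illustrative sketch: clips (animation fragments) stored with descriptive
# metadata so they can be queried and reassembled. Schema and data are
# hypothetical, not from any cited system.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE clips (
        id INTEGER PRIMARY KEY,
        name TEXT,
        action TEXT,      -- e.g. 'walk', 'wave'
        duration REAL     -- seconds
    )
""")
conn.executemany(
    "INSERT INTO clips (name, action, duration) VALUES (?, ?, ?)",
    [("walk_slow", "walk", 2.4), ("walk_fast", "walk", 1.6), ("wave_hand", "wave", 1.1)],
)

# Query for reusable fragments of a given action, then assemble a new
# animation by concatenating them in order.
walks = conn.execute(
    "SELECT name, duration FROM clips WHERE action = ? ORDER BY duration", ("walk",)
).fetchall()
sequence = [name for name, _ in walks]
print("assembled sequence:", sequence)
```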
Early examples of animation databases include the MOVE system, which used an object-oriented database.[1] Modern animation databases can be populated by extracting skeletal animations from motion capture data.[3]
Other examples include crowd simulation, in which many characters are animated together as a crowd. Since some applications require the characters to walk at different speeds, say on a sidewalk, the animation database can be used to retrieve and merge different animated figures.[4] This method is commonly known as "motion graphs".[5]
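The following is a toy sketch of the motion graph idea: clips are nodes, and directed edges mark transitions whose poses are similar enough to blend, so varied motion can be synthesized by walking the graph. The clip names and graph structure are illustrative assumptions.

```python
import random

# Toy motion graph: nodes are animation clips; an edge means the end pose
# of one clip can be blended into the start pose of the next.
graph = {
    "idle":      ["walk_slow"],
    "walk_slow": ["walk_slow", "walk_fast", "idle"],
    "walk_fast": ["walk_fast", "walk_slow"],
}

def synthesize(start, steps, rng=random.Random(0)):
    """Random walk through the motion graph, e.g. to vary pedestrian
    motion in a crowd simulation."""
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(graph[path[-1]]))
    return path

print(synthesize("idle", 6))
# The returned clip names would then be blended together in order.
```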
Animation databases can also be used for "interactive storytelling", in which fragments of animation are retrieved from the database and recycled into new stories. For instance, the animation database Animebase is used within the system Words Anime to help generate animations from recycled components.[2] In this approach, the user inputs words which form parts of a story, and queries against the database help select suitable animation fragments. Such a system may use two databases: an animation database and a story knowledge database. The story knowledge database may use subjects, predicates and objects to refer to story fragments, and the system then assists the user in matching story fragments with animation fragments.[2]
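A minimal sketch of this two-database approach is shown below: a story knowledge base holds (subject, predicate, object) triples, while the animation database maps predicates to reusable clips. All names and the matching rule are illustrative assumptions and do not reproduce the actual Animebase or Words Anime schemas.

```python
# Hypothetical story knowledge base: (subject, predicate, object) triples.
story_knowledge = [
    ("girl", "walks_to", "school"),
    ("girl", "greets", "teacher"),
]
# Hypothetical animation database: predicate -> reusable animation clip.
animation_db = {
    "walks_to": "clip_walk_cycle",
    "greets":   "clip_bow",
}

def assemble(words):
    """Select story triples that mention the user's input words, then
    fetch a matching animation fragment for each predicate."""
    shots = []
    for s, p, o in story_knowledge:
        if {s, o} & set(words) and p in animation_db:
            shots.append((f"{s} {p} {o}", animation_db[p]))
    return shots

print(assemble(["girl", "school"]))
# [('girl walks_to school', 'clip_walk_cycle'), ('girl greets teacher', 'clip_bow')]
```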
Animation databases can also be used to generate visual scenes with humanoid models.[6] An example application is an animated humanoid-based sign language system developed to assist people with disabilities.[6]
Another application of an animation database is the synthesis of idle motion for human characters.[7] Humans are never perfectly still and move in individual ways, so presenting a consistent and realistic set of idle motions for each character between animation segments has been a challenge: each person has, for example, a unique way of standing, and this needs to be portrayed realistically throughout an animation. One of the problems is that idle motion affects all joints, and simply applying statistically independent movements to each joint produces unrealistic results. One approach is to use an animation database containing a large set of pre-recorded human movements and to obtain suitable motion patterns from it through statistical analysis.[7]
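The sketch below illustrates one possible statistical treatment (here PCA via SVD, an assumption rather than the cited method): instead of jittering each joint independently, variation is sampled along correlated full-body modes learned from a database of recorded poses. The recorded data is random stand-in data.

```python
import numpy as np

# Stand-in "database" of recorded poses: 500 frames x 15 joint angles.
rng = np.random.default_rng(0)
poses = rng.normal(size=(500, 15))

mean = poses.mean(axis=0)
centered = poses - mean
# Principal components capture how joints move *together* in the recordings,
# avoiding the unrealistic look of independent per-joint noise.
_, _, components = np.linalg.svd(centered, full_matrices=False)

def sample_idle_pose(strength=0.5, n_modes=3):
    """Generate a new idle pose by perturbing the mean pose along the
    dominant correlated modes of the recorded motion."""
    coeffs = rng.normal(scale=strength, size=n_modes)
    return mean + coeffs @ components[:n_modes]

print(sample_idle_pose().shape)  # (15,) -> one full-body pose
```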
References
1. S. Kuroki, "Walkthrough using Animation database MOVE", in Database and Expert Systems Applications, Volume 4, edited by Vladimír Marík, 1994, ISBN 3-540-57234-1, pages 760–763.
2. Kaoru Sumi, "Interactive Storytelling System Using Recycle-Based Story Knowledge", in Interactive Storytelling: Second Joint International Conference on Interactive Digital Storytelling, ICIDS 2009, edited by Ido A. Iurgel, 2009, ISBN 3-642-10642-0, pages 74–85.
3. G. Rogez, "Exploiting Spatio-temporal Constraints for Robust 2D Pose Tracking", in Human Motion: Understanding, Modeling, Capture and Animation: HumanMotion 2007, Rio de Janeiro, Brazil, October 20, 2007, pages 58–72.
4. Daniel Thalmann and Soraia Raupp Musse, Crowd Simulation, 2007, ISBN 1-84628-824-X, pages 59–64.
5. Michael Gleicher, "Motion Graphs", in ACM SIGGRAPH 2008 Classes (SIGGRAPH '08), 2008.
6. Takaya Yuizona et al., "Cognitive Development Environment of Sign Language Animation System Using Humanoid Model", in Knowledge-Based Intelligent Information Engineering Systems, edited by Ernesto Damiani, 2002.
7. Arjan Egges et al., "Personalized Real-time Idle Motion Synthesis", in Proceedings of the 12th Pacific Graphics Conference, pages 121–130, October 2004.