UC Merced Electronic Theses and Dissertations

Title: Data-Based Motion Planning for Full-Body Virtual Human Interaction with the Environment
Permalink: https://escholarship.org/uc/item/2314d59w
Author: Juarez Perez, Alain
Publication Date: 2018
Peer reviewed | Thesis/dissertation

UNIVERSITY OF CALIFORNIA, MERCED

Data-Based Motion Planning for Full-Body Virtual Human Interaction with the Environment

A Dissertation submitted in partial satisfaction of the requirements for the Ph.D. degree in Electrical Engineering and Computer Science

by

Alain Juarez Perez

Committee in charge:
Professor Marcelo Kallmann, Chair
Professor Stefano Carpin
Professor David Noelle

2018

Copyright © Alain Juarez Perez, 2018
All rights reserved

The dissertation of Alain Juarez Perez is approved, and it is acceptable in quality and form for publication on microfilm and electronically:

Approved: Professor David Noelle
Approved: Professor Stefano Carpin
Approved: Professor Marcelo Kallmann, Committee Chair

University of California, Merced
2018

To my family and Sandy

Contents

List of Figures
List of Tables
Acknowledgments
Vita
Abstract

Chapter 1. Introduction
1.1. Definitions
1.2. Character Animation
1.3. Data Acquisition
1.4. Overview

Chapter 2. Literature Review
2.1. Character Animation
2.2. Motion Graphs
2.3. Physics-Based Animation and Control
2.4. Data-Based Approaches
2.5. Motion Planning
2.6. Environment Interaction
2.7. Conclusion

Chapter 3. Locomotion
3.1. Related Work
3.2. Data-Based Mobility Controllers
3.3. Coverage-Quality Maps
3.4. Results and Discussion
3.5. Conclusion

Chapter 4. Locomotion Planning
4.1. Introduction
4.2. Related Work
4.3. Outline
4.4. Locomotion Behaviors
4.5. Behavioral Path Planner
4.6. Behavioral Layered Path Planner
4.7. Discussion
4.8. Conclusion

Chapter 5. Coordination
5.1. Introduction
5.2. Related Work
5.3. Overview
5.4. Encoding Coordination Features
5.5. Real-Time Motion Synthesis with the Coordination Model
5.6. Results and Discussion
5.7. Conclusion

Chapter 6. Conclusions
6.1. Limitations
6.2. Future Work

References

List of Figures

2.1 Running animation by Hodgins et al. [1995]. © 1995 ACM
2.2 Motion Graph generated animation by Kovar et al. [2002]. © 2002 ACM
2.3 Different physical models used by Zimmermann et al. [2015]. © 2015 ACM
2.4 Peng and van de Panne [2017]. © 2017 ACM
2.5 Agrawal and van de Panne [2016]. © 2016 ACM
3.1 Deformations over a cycle. The left-most diagram corresponds to the original clip cycle; the following diagrams represent the outcome of each deformation operation.
3.2 Blending operations according to cycle phases.
3.3 Coverage-quality map for one forward step cycle motion. The heatmap coloring represents the overall computed validity loss over the original data.
3.4 Final body orientation (after forward step) coverage-quality map.
3.5 Coverage-quality map of one right lateral step.
3.6 Volumetric coverage-quality map for a frontal step. The horizontal layers are planar maps equivalent to the one in Figure 3, with the vertical axis representing changes to the final body orientation deformation parameter.
3.7 Example of real-time control of locomotion forming a circular trajectory.
4.1 Parameterization coverage maps for the frontal regular walking (top) and lateral walking (bottom) locomotion behaviors. The maps are in polar coordinates: the angular coordinate represents the motion deformation controlling the direction of the motion, and the radial coordinate represents the orientation deformation applied to specify the final orientation of the character's body at the end of one step cycle. The black boundary delimits the radius representing no deformation of the final character orientation.
4.2 The blue positions represent the projected joint positions of the limbs that collided with obstacles. The green positions are the projected character root positions on the floor at the moment of each collision. The red positions are the point-obstacles that were inserted into the LCT navigation mesh as additional constraints to be considered in subsequent path queries. The final path (marked by the dark red corridor) takes the new point-obstacles into account.
4.3 Example solution path found by our method. The blue path sections can be executed with the preferred behavior B3, the green section with B2, and the red sections with B1. The obstacles are colored cyan.
4.4 System overview.
4.5 Left: the obstacle was avoided by adding extra point-obstacles to steer the path away. Right: the corridor is narrow, so an arm avoidance behavior was needed.
4.6 Behavior transition in a narrow corridor. From top to bottom: the arm avoidance behavior is employed in the first section; in the middle section regular walking could be used because no collisions were detected, even though the path has low clearance; in the final section lateral side-stepping is required.
4.7 Top: dense environment. Bottom: normal environment. Computed solution corridors are shown in both environments for the same goal point.
4.8 Performance evaluation. The graph presents the standard deviation and average times required to compute paths of different lengths in dense and normal environments.
4.9 Left: the frontal walking behavior requires a torso clearance larger than the bottom clearance. Center: with arms constrained, frontal walking requires only slightly more torso clearance than legs clearance. Right: lateral walking requires the least clearance, and the torso and legs clearances are the same because arm movement occurs only along the motion direction.
4.10 Two layers are used in this method.
The bottom layer (bottom images) represents all obstacles, while the top layer (top images) represents only the tall obstacles, which are the most important for detecting torso collisions.
4.11 Example solution path found by our method. The blue path sections can be executed with B3, the cyan sections with B2, and the red sections with B1.
4.12 Apartment scenario.
4.13 Narrow corridor behavior selection. From top to bottom: arm avoidance is employed in the first section, regular walking is correctly employed in the middle section given the high clearance at the torso layer, and lateral side-stepping is required in the last, narrowest, section.
4.14 Dense (top) and normal (bottom) environments.
4.15 Average times, deviation, and outliers when computing paths of different lengths in a dense environment.
5.1 This sequence of opening and passing through a door synthesizes the upper-body motion in coordination with a generic walking controller. The result demonstrates complex anticipatory spine and arm movements, which are key to successfully performing the action without collisions.
5.2 Overall method overview. Green boxes correspond to locomotion operations and blue boxes correspond to the main modules of the coordination model.
5.3 Left: set of minimum vectors to the environment considering all body parts independently. Right: set of minimum vectors with respect to only the body sections of interest.
5.4 GPLVM visualization of different CFV encodings. A: joint angles; B: full feature vector; C: proposed feature vector. The motion used in this validation example is a door opening motion (bottom image). The color bar on top of the motion is used to label each of the poses by vertically projecting the skeleton's root position at each pose onto the bar. These colors identify the poses corresponding to each curve point in images A, B, and C.
5.5 The image represents, for every simulated frame, the distance value v_j of each element in the set of k-nearest neighbors. Every non-black pixel in a column of the image corresponds to one of the selected k-nearest neighbors for the corresponding simulated real-time frame. The image shows that the k-nearest neighbors are most often composed of adjacent frames in the training data, indicating high correlation between the training data and the action being synthesized.
5.6 In left-right order, the example motions had approach angles of 0°, 90°, and 180°. The postures in this image were synthesized with our coordination method, since differences in the character size and door dimensions lead to collisions with the considered environment when displaying the original data.
5.7 By allowing obstacles to be placed anywhere in the scene, the approach direction to open a door can be arbitrary (left) instead of always front-facing (right). Our example motions cover different approach directions, and the most suitable coordination model is automatically selected by the system.
5.8 Drawing sequence for a generic shape. This example was synthesized with a coordination model built from straight-line drawing examples in attentive style.
5.9 Left: speed control turned off. Center: all modules turned on. Right: spine deformations turned off.