Edinburgh Research Archive

Interactive control of multi-agent motion in virtual environments

dc.contributor.advisor: Komura, Taku [en]
dc.contributor.advisor: Herrmann, Michael [en]
dc.contributor.author: Henry, Joseph William Roger [en]
dc.contributor.sponsor: Engineering and Physical Sciences Research Council (EPSRC) [en]
dc.date.accessioned: 2017-02-08T16:09:18Z
dc.date.available: 2017-02-08T16:09:18Z
dc.date.issued: 2016-06-27
dc.description.abstract: With the increased use of crowd simulation in animation, specifying crowd motion can be very time-consuming, requiring substantial user input. To alleviate this cost, we wish to allow a user to interactively manipulate the many degrees of freedom in a crowd, whilst accounting for the limitations of the low-dimensional signals available from standard input devices. In this thesis we present two approaches for achieving this: 1) combining shape deformation methods with a multitouch input device, allowing a user to control the motion of the crowd in dynamic environments, and 2) applying a data-driven approach to learn the mapping between a crowd's motion and the corresponding user input, enabling intuitive control of a crowd.

In our first approach, we represent the crowd as a deformable mesh, allowing a user to manipulate it using a multitouch device. The user controls the shape and motion of the crowd by altering the mesh, and the mesh in turn deforms according to the environment. We handle congestion and perturbation by having agents dynamically reassign their goals in the formation using a mass transport solver. Our method allows control of a crowd in a single pass, improving on the time taken by previous multi-stage approaches. We validate our method with a user study comparing our control algorithm against a common mouse-based controller. We develop a simplified version of motion data patches to model character-environment interactions that are largely ignored in previous crowd research, and we design an environment-aware cost metric for the mass transport solver that considers how these interactions affect a character's ability to track the user's commands. Experimental results show that our system can produce realistic crowd scenes with minimal, high-level input signals from the user.

In our second approach, we argue that crowd simulation control algorithms inherently impose restrictions on how user input affects the motion of the crowd. To bypass this, we investigate a data-driven approach for creating a direct mapping between low-dimensional user input and the resulting high-dimensional crowd motion. Results show that the crowd motion can be inferred directly from variations in a user's input signals, providing the user with greater freedom to define the animation. [en]
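The goal reassignment described in the abstract is at heart an assignment problem: each agent must be matched to exactly one formation slot so that a total cost is minimised. The following is a minimal illustrative sketch, not the thesis's implementation: the agent and slot positions are hypothetical, and plain squared travel distance stands in for the thesis's environment-aware cost metric.

```python
from itertools import permutations

# Hypothetical 2D positions: three crowd agents and three formation slots.
agents = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
slots = [(1.0, 1.0), (0.0, 0.0), (2.0, 0.0)]

def sq_dist(a, b):
    """Squared Euclidean distance between two 2D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def reassign(agents, slots):
    """Brute-force the one-to-one agent-to-slot assignment that
    minimises total squared travel distance (a stand-in for an
    environment-aware cost). Returns assignment[i] = slot for agent i."""
    best = min(permutations(range(len(slots))),
               key=lambda p: sum(sq_dist(agents[i], slots[p[i]])
                                 for i in range(len(p))))
    return list(best)

assignment = reassign(agents, slots)
```

Brute force over permutations is only feasible for a handful of agents; a real mass transport solver (e.g. the Hungarian algorithm) finds the same optimum in polynomial time for crowds of hundreds of agents.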
dc.identifier.uri: http://hdl.handle.net/1842/19580
dc.language.iso: en
dc.publisher: The University of Edinburgh [en]
dc.relation.hasversion: Henry, Joseph, Shum, Hubert P. H., & Komura, Taku. 2012. Environment-aware realtime crowd control. Pages 193–200 of: Proceedings of the 11th ACM SIGGRAPH/Eurographics Conference on Computer Animation. EUROSCA'12. Aire-la-Ville, Switzerland: Eurographics Association. [en]
dc.relation.hasversion: Henry, Joseph, Shum, Hubert P. H., & Komura, Taku. 2014. Interactive Formation Control in Complex Environments. IEEE Transactions on Visualization and Computer Graphics, 20(2), 211–222. [en]
dc.subject: crowd [en]
dc.subject: animation [en]
dc.subject: simulation [en]
dc.subject: control [en]
dc.title: Interactive control of multi-agent motion in virtual environments [en]
dc.type: Thesis or Dissertation [en]
dc.type.qualificationlevel: Doctoral [en]
dc.type.qualificationname: PhD Doctor of Philosophy [en]

Files

Original bundle

Name: Henry2016 images.zip | Size: 51.38 MB | Format: Unknown data format
Name: Henry2016 videos.zip | Size: 209.37 MB | Format: Unknown data format
Name: Henry2016.pdf | Size: 5.48 MB | Format: Adobe Portable Document Format
