2016-03-08 at 11:30
Conference room, UCL R+0, 13 rue Moreau, 75012 Paris
Visually-Controlled Locomotion: From Pedestrian to Crowd Dynamics
We ordinarily think of vision as something that occurs inside the head. But it can also be viewed as a coupling to the outside world, creating a larger agent-environment system. In this talk, I will develop the behavioral dynamics approach, which seeks to explain stable, adaptive behavior as emerging from the dynamics of this coupled system.

At the level of an individual agent, consider guiding locomotion through a cluttered environment. Based on experiments in virtual reality, we have developed a pedestrian model that captures basic behaviors such as steering, obstacle avoidance, and pedestrian interactions. Locomotor trajectories emerge online from the agent-environment interaction, without appeal to a predictive world model or explicit path planning.

At the collective level, crowd behavior is thought to emerge from local interactions between individual pedestrians. Multi-agent simulations of the pedestrian model reproduce patterns in crowd data from key scenarios such as Grand Central Station, swarm, and counterflow. Individual trajectories and crowd dynamics can thus be modeled with just a few basic behaviors. The results support the view that pedestrian and crowd dynamics emerge from local interactions, without internal models or plans, consistent with principles of self-organization.
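To give a concrete flavor of how trajectories can "emerge online" from such a coupling, here is a minimal sketch of a steering-dynamics pedestrian, in the spirit of published behavioral-dynamics models (e.g. goal attraction and obstacle repulsion acting as angular spring-like terms on heading). All parameter values, the specific functional forms, and the scenario below are illustrative assumptions, not the model presented in the talk.

```python
import math

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def simulate(goal, obstacle, steps=4000, dt=0.01, speed=1.0,
             b=3.25, k_g=7.5, k_o=198.0, c1=6.5, c2=0.8):
    """Integrate heading phi: attracted to the goal direction,
    repelled from the obstacle direction, with repulsion decaying
    with angular error and distance. Parameters are illustrative."""
    x, y = 0.0, 0.0          # start position
    phi, dphi = 0.0, 0.0     # heading and angular velocity
    path = [(x, y)]
    for _ in range(steps):
        # bearing of the goal and of the obstacle from the agent
        psi_g = math.atan2(goal[1] - y, goal[0] - x)
        psi_o = math.atan2(obstacle[1] - y, obstacle[0] - x)
        d_o = math.hypot(obstacle[0] - x, obstacle[1] - y)
        err_g = wrap(phi - psi_g)
        err_o = wrap(phi - psi_o)
        # damped angular acceleration: goal "spring" + obstacle repulsion
        ddphi = (-b * dphi
                 - k_g * err_g
                 + k_o * err_o * math.exp(-c1 * abs(err_o))
                             * math.exp(-c2 * d_o))
        dphi += ddphi * dt
        phi += dphi * dt
        x += speed * math.cos(phi) * dt
        y += speed * math.sin(phi) * dt
        path.append((x, y))
        if math.hypot(goal[0] - x, goal[1] - y) < 0.2:
            break
    return path

# Example: walk toward a goal with an obstacle nearly on the straight line.
# The detour is not planned; it falls out of the coupled dynamics.
path = simulate(goal=(10.0, 0.0), obstacle=(5.0, 0.1))
```

Note that no path is ever represented: each step only uses the currently visible angles and distances, and the curved, obstacle-avoiding trajectory is a product of the agent-environment interaction. A crowd simulation repeats this loop per agent, treating neighbors as moving goals/obstacles.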