Jehee Lee

Professor
School of Computer Science and Engineering
Seoul National University
1 Kwanakro Kwanakgu
Seoul 151-744, Republic of Korea
Office : 302-325
Phone : (02) 880-1845
Fax : (02) 871-4912
E-mail : jehee at cse dot snu dot ac dot kr
 
 

 

Short Biosketch

 

I am a professor in the School of Computer Science and Engineering at Seoul National University. My research area is computer graphics. I am particularly interested in developing new ways of understanding, representing, and animating human movement. This work spans full-body motion analysis and synthesis, motion capture, path planning, biped locomotion, controller design, animal locomotion, facial animation, and interactive techniques for animation. I lead the Movement Research Lab.

 

( Publications  |  Research  |  Courses  |  Google Scholar )

 

Postdoctoral research positions available

 

Recent Publications and Images

 

Generating and Ranking Diverse Multi-Character Interactions, SIGGRAPH Asia 2014.

 

 

Locomotion Control for Many-Muscle Humanoids, SIGGRAPH Asia 2014.

 

 

Interactive Manipulation of Large-Scale Crowd Animation, SIGGRAPH 2014.

 

Recent Talk Slides

 

 

 

Data-Driven Control of Flapping Flight. Presented at the SIAT workshop in Shenzhen, 2014. slide

Abstract: The animation and simulation of human and animal behavior is an important problem in computer animation, games, robotics, and virtual environments. Studies of human movement and animal locomotion have revealed various principles grounded in physics, biomechanics, physiology, and psychology. In this talk, we discuss the design of a physically based controller that simulates the flapping behavior of a bird in flight. We recorded the motion of doves and parrots using marker-based optical motion capture and high-speed video cameras. The bird flight data thus acquired allow us to parameterize natural wingbeat cycles and provide the simulated bird with reference trajectories to track in physics simulation. Our controller simulates the articulated rigid bodies of a bird's skeleton together with deformable feathers to reproduce the aerodynamics of bird flight. Motion capture from live birds is harder than human motion capture because the subjects do not cooperate, so the flight data we could acquire were limited. We therefore developed a new method to learn wingbeat controllers even from sparse, biased observations of real bird flight. Our simulated bird imitates the lifelike flapping of a flying bird while actively maintaining its balance. The flight is interactively controllable and resilient to external disturbances.
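The tracking idea in the abstract can be sketched with a toy example: a single hinge joint driven by PD torques toward a parameterized wingbeat reference. Every name, gain, and the sinusoidal cycle here is an illustrative assumption of mine, not the talk's controller, which simulates a full articulated skeleton with deformable feathers and aerodynamics.

```python
import math

def wingbeat_reference(t, freq=4.0, amp=0.9, offset=0.1):
    # Hypothetical parameterized wingbeat cycle: target wing
    # elevation angle (radians) as a function of time.
    return offset + amp * math.sin(2.0 * math.pi * freq * t)

def pd_torque(theta, theta_dot, theta_ref, kp=40.0, kd=2.0):
    # PD tracking torque pulling the joint toward the reference.
    return kp * (theta_ref - theta) - kd * theta_dot

def simulate(duration=1.0, dt=1e-3, inertia=0.05):
    # Integrate one hinge joint with semi-implicit Euler while it
    # tracks the reference trajectory, as in trajectory-tracking
    # physics simulation.
    theta, theta_dot = 0.0, 0.0
    trajectory = []
    for i in range(int(duration / dt)):
        tau = pd_torque(theta, theta_dot, wingbeat_reference(i * dt))
        theta_dot += (tau / inertia) * dt
        theta += theta_dot * dt
        trajectory.append(theta)
    return trajectory

traj = simulate()  # joint angle over one second of flapping
```

In a full simulation the reference would come from the captured wingbeat parameterization rather than a fixed sine, and balance control would adjust the cycle parameters online.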

 

 

PrinciplesVsObservations

 

 

Principles vs. Observations: How do people move? The linked slide set is based on my research as of 2010. slide

Abstract: The animation and simulation of human behavior is an important problem in computer animation, games, robotics, and virtual environments. Studies of human movement have revealed various principles grounded in physics, biomechanics, physiology, and psychology. Many existing animation techniques rely on those principles, which may be described as mathematical equations, algorithms, or procedures. Another stream of research, called data-driven animation, makes use of human motion data captured from live actors and has developed a variety of techniques to edit, manipulate, segment, and splice motion capture clips. The current trend in animation research is to combine the two approaches so that they complement each other. Over the past few years, we have explored several methods for simulating human behaviors in virtual environments. Each solution relies on different principles of human movement and on motion data captured at different scales. We found that principles and observed data can interact in several ways. Sometimes motion data drive physically simulated bipeds that walk, turn, and spin. Sometimes physics principles guide interactive motion editing to make a canned jump higher or wider and a spin longer. Group and crowd behavior can be captured from video, analyzed, interpolated, and re-synthesized to create a larger crowd of virtual humans over an extended period of time. And sometimes simply adding more flexibility to motion data allows our animated characters to navigate highly constrained, cluttered environments interactively.
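As a concrete illustration of the "edit, manipulate, segment, and splice" toolbox the abstract mentions, here is a minimal cross-fade that splices two motion clips. The function name and ease curve are my own sketch, not a specific method from the talk; real pipelines blend joint quaternions and align root trajectories rather than scalar values.

```python
import math

def blend_clips(clip_a, clip_b, blend_frames):
    # Cross-fade the tail of clip_a into the head of clip_b using a
    # smooth ease-in/ease-out weight. Frames here are plain floats
    # for brevity; a real system would blend per-joint rotations.
    out = clip_a[:-blend_frames]
    for i in range(blend_frames):
        t = (i + 1) / blend_frames
        w = 0.5 - 0.5 * math.cos(math.pi * t)  # ease curve, 0 -> 1
        a = clip_a[len(clip_a) - blend_frames + i]
        b = clip_b[i]
        out.append((1.0 - w) * a + w * b)
    out.extend(clip_b[blend_frames:])
    return out

# Example: splice a 10-frame clip of 0.0s into a 10-frame clip of 1.0s.
spliced = blend_clips([0.0] * 10, [1.0] * 10, blend_frames=4)
```

The ease curve avoids the velocity discontinuity that a hard cut between clips would introduce, which is the basic reason splicing works at all in data-driven animation.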

 

DDA_CourseNote

 

Introduction to Data-Driven Animation: Programming with Motion Capture, SIGGRAPH Asia 2010 Course, course web

 

DataDrivenCrowd

 

Data-Driven Crowd and Group Behaviors (part of the SIGGRAPH Asia 2010 course Simulating Believable Crowd and Group Behaviors), course note

 

SIGGRAPH_decision_process

 

The SIGGRAPH decision process, Korea Computer Graphics Society 2008, slide

 

 

Media Coverage

 

KBS News (Dec 15, 2014) Many-Muscle Humanoids, AVI

 

Korean daily newspaper (Dec 5, 2014) Previsualization, PDF

 

Korean science news (Dec 2014), Link

 

Korean newspaper (Dec 2014) article on 3D data, PDF

 

Korean newspaper (Dec 2014) article, PDF

 

MBC News (Oct 24, 2013) Bird flight simulation, AVI

 

YTN Science (Oct 24, 2013) Bird flight simulation, Link

 

Personal Stuff

 

My kids, Sungho and Eunje, in motion capture in 2005.

 

 

[Last update: Dec 2014]