Jehee Lee / 이제희
Professor, Department of Computer Science and Engineering, Seoul National University
I am a professor in the Department of Computer Science and Engineering at Seoul National University. My research interests are in the areas of computer graphics, animation, biomechanics, and robotics. I am particularly interested in developing new ways of understanding, representing, planning, and simulating human and animal movements. This involves full-body motion analysis and synthesis, biped control and simulation, clinical gait analysis, motion capture, motion planning, data-driven and physically based techniques, interactive avatar control, crowd simulation, and controller design. I co-chaired the ACM/EG Symposium on Computer Animation in 2012 and have served on numerous program committees, including ACM SIGGRAPH, ACM SIGGRAPH Asia, the ACM/EG Symposium on Computer Animation, Pacific Graphics, CGI, and CASA. I am currently an associate editor of IEEE Transactions on Visualization and Computer Graphics. I lead the SNU Movement Research Lab.
Postdoctoral research positions are available. Please contact me if you are interested.
(Publications | Research | Courses | Google Scholar)
Push-Recovery Stability of Biped Locomotion, SIGGRAPH Asia 2015. Link
Generating and Ranking Diverse Multi-Character Interactions, SIGGRAPH Asia 2014. Link
Locomotion Control for Many-Muscle Humanoids, SIGGRAPH Asia 2014. Link
Interactive Manipulation of Large-Scale Crowd Animation, SIGGRAPH 2014. Link
Data-Driven Control of Flapping Flight, ACM Transactions on Graphics. Link
Data-Driven Control of Flapping Flight. Presented at SIAT workshop in Shenzhen in 2014. Slides
Abstract: The animation and simulation of human and animal behavior is an important issue in the context of computer animation, games, robotics, and virtual environments. The study of human movements and animal locomotion has revealed various principles based on physics, biomechanics, physiology, and psychology. In this talk, we will discuss the design of a physically based controller that simulates the flapping behavior of a bird in flight. We recorded the motion of doves and parrots using marker-based optical motion capture and high-speed video cameras. The bird flight data thus acquired allow us to parameterize natural wingbeat cycles and provide the simulated bird with reference trajectories to track in physics simulation. Our controller simulates the articulated rigid bodies of a bird's skeleton and deformable feathers to reproduce the aerodynamics of bird flight. Motion capture from live birds is not as easy as human motion capture because of the lack of cooperation from the subjects, so the flight data we could acquire were limited. We developed a new method to learn wingbeat controllers even from sparse, biased observations of real bird flight. Our simulated bird imitates the life-like flapping of a flying bird while actively maintaining its balance. The bird flight is interactively controllable and resilient to external disturbances.
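The actual controller in this work drives a full articulated skeleton with deformable feathers and aerodynamic forces; as a much-simplified sketch of the tracking idea only, the snippet below shows a generic proportional-derivative (PD) servo driving a single hinge joint toward a reference trajectory. All names, gains, and the one-degree-of-freedom dynamics are illustrative assumptions, not taken from the paper.

```python
import math

def pd_torque(theta, theta_dot, theta_ref, kp=40.0, kd=1.0):
    """PD servo: torque proportional to the angle error, damped by velocity."""
    return kp * (theta_ref - theta) - kd * theta_dot

def track(ref, steps=2000, dt=0.001, inertia=0.01):
    """Drive a 1-DOF joint toward ref(t) using semi-implicit Euler integration."""
    theta, theta_dot = 0.0, 0.0
    for i in range(steps):
        t = i * dt
        tau = pd_torque(theta, theta_dot, ref(t))
        theta_dot += (tau / inertia) * dt  # update velocity first, then position
        theta += theta_dot * dt
    return theta

# A toy "wingbeat" reference: 0.5 rad amplitude at 5 Hz.
wingbeat = lambda t: 0.5 * math.sin(2.0 * math.pi * 5.0 * t)
```

A full-body controller applies the same tracking idea per joint, with gains and reference trajectories derived from the captured wingbeat data and with aerodynamic force models acting on the feathers.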
Principles vs. Observation: How do people move? This slide set is based on my research as of 2010. Slides
Abstract: The animation and simulation of human behavior is an important issue in the context of computer animation, games, robotics, and virtual environments. The study of human movements has revealed various principles based on physics, biomechanics, physiology, and psychology. Many existing animation techniques rely on those principles, which may be described as mathematical equations, algorithms, or procedures. Another stream of research, called data-driven animation, makes use of human motion data captured from live actors. Research on data-driven animation has developed a variety of techniques to edit, manipulate, segment, and splice motion capture clips. The current trend in animation research is to combine these two approaches so that they complement each other. Over the past few years, we have explored several methods that address the problem of simulating human behaviors in virtual environments. Each solution relies on different principles of human movement and on motion data captured at different scales. We found that principles and observed data can interact with each other in several ways. Sometimes, motion data drive physically simulated bipeds that walk, turn, and spin. Sometimes, physics principles guide interactive motion editing to make a canned jump higher or wider and a spin longer. Group and crowd behavior can be captured from video, analyzed, interpolated, and re-synthesized to create a larger crowd of virtual humans over an extended period of time. Sometimes, simply adding more flexibility to motion data allows our animated characters to navigate highly constrained, cluttered environments interactively.
Introduction to Data-Driven Animation: Programming with Motion Capture, SIGGRAPH Asia 2010 Course. Course Web
Abstract: Data-driven animation using motion capture data has become a standard practice in character animation. A number of techniques have been developed to add flexibility to captured human motion data by editing joint trajectories, warping motion paths, blending a family of parameterized motions, splicing motion segments, and adapting motion to new characters and environments. Even with the abundance of motion capture data and the popularity of data-driven animation techniques, programming with motion capture data is still not easy. A single clip of motion data encompasses a lot of heterogeneous information, including joint angles, the position and orientation of the skeletal root, their temporal trajectories, and a number of coordinate systems. Due to this complexity, even simple operations on motion data, such as linear interpolation, are rarely described as succinct mathematical equations in articles. This course provides not only a solid mathematical background but also a practical guide to programming with motion capture data. The course will begin with a brief review of affine geometry and coordinate-invariant (conventionally called coordinate-free) geometric programming, which generalizes incrementally to deal with three-dimensional rotations/orientations, the poses of an articulated figure, and full-body motion data. This leads to a collection of coordinate-invariant operations on full-body motion data and their object-oriented implementation. Finally, we will discuss the practical use of our programming framework in a variety of contexts, ranging from data-driven manipulation and interpolation to state-of-the-art biped locomotion control.
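To give a concrete taste of what even "simple" interpolation of motion data involves, here is a minimal sketch of blending two character poses: the root position is linearly interpolated while every orientation is spherically interpolated (slerp) as a unit quaternion. The pose representation and function names are illustrative assumptions, not the course's actual framework.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                       # negate one endpoint to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    theta = math.acos(min(dot, 1.0))
    if theta < 1e-8:                    # nearly identical orientations: plain lerp
        return tuple((1 - t) * a + t * b for a, b in zip(q0, q1))
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def interpolate_pose(pose0, pose1, t):
    """Blend two poses (root_position, [joint_quaternions]):
    lerp the root position, slerp every joint orientation."""
    (p0, joints0), (p1, joints1) = pose0, pose1
    root = tuple((1 - t) * a + t * b for a, b in zip(p0, p1))
    joints = [slerp(a, b, t) for a, b in zip(joints0, joints1)]
    return root, joints
```

Note that joint orientations cannot simply be averaged component-wise: slerp respects the geometry of the rotation group, which is exactly the kind of coordinate-invariant reasoning the course formalizes for full-body motion data.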
Data-Driven Crowd and Group Behaviors, Part of SIGGRAPH Asia 2010 Course. Course Web
How Are SIGGRAPH Papers Decided?, Korea Computer Graphics Society, 2008. Slides
KBS News (Dec 15, 2014) Many-Muscle Humanoids, AVI
동아일보 (Dec 5, 2014) Previsualization, PDF
동아사이언스 (Dec 2014) "I Like Engineering School 2," Episode 10, Link
과학동아 (Dec 2014) Making a Bird That Flies with a 3D Printer, PDF
수학동아 (Oct 2014) Reproducing the Movements of Living Creatures, PDF
MBC News (Oct 24, 2013) Bird flight simulation, AVI
YTN 사이언스투데이 (Oct 24, 2013) Bird flight simulation, Link
My kids, Sungho and Eunje, in motion capture in 2005.
[Last update: December 2015]