

Spotlight Poster

H-GAP: Humanoid Control with a Generalist Planner

Zhengyao Jiang · Yingchen Xu · Nolan Wagener · Yicheng Luo · Michael Janner · Edward Grefenstette · Tim Rocktäschel · Yuandong Tian

Halle B #73
[ Project Page ]
Tue 7 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

Humanoid control is an important research challenge offering avenues for integration into human-centric infrastructures and enabling physics-driven humanoid animations. The daunting challenges in this field stem from the difficulty of optimizing in high-dimensional action spaces and the instability introduced by the bipedal morphology of humanoids. However, the extensive collection of human motion-capture data and the derived datasets of humanoid trajectories, such as MoCapAct, pave the way to tackle these challenges. In this context, we present the Humanoid Generalist Autoencoding Planner (H-GAP), a state-action trajectory generative model trained on humanoid trajectories derived from human motion-capture data, capable of adeptly handling downstream control tasks with Model Predictive Control (MPC). For a humanoid with 56 degrees of freedom, we empirically demonstrate that H-GAP learns to represent and generate a wide range of motor behaviors. Further, without any learning from online interactions, it can flexibly transfer these behaviors to solve novel downstream control tasks via planning. Notably, H-GAP outperforms established MPC baselines with access to the ground-truth model, and is superior or comparable to offline RL methods trained for individual tasks. Finally, we conduct a series of empirical studies on the scaling properties of H-GAP, showing the potential for performance gains via additional data but not additional compute.
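To make the planning procedure concrete, below is a minimal sketch of sampling-based MPC with a learned trajectory prior, in the spirit of the approach the abstract describes. All names and interfaces here (RandomTrajectoryPrior, step_fn, reward_fn, the hyperparameters) are hypothetical placeholders rather than the paper's actual implementation; H-GAP itself samples trajectories from a learned generative model rather than the random prior used in this toy.

```python
"""Minimal sketch of sampling-based MPC with a trajectory prior.
Everything here (class names, the dynamics/reward interface,
hyperparameters) is a hypothetical placeholder, not H-GAP's
actual architecture or code."""
import numpy as np


class RandomTrajectoryPrior:
    """Stand-in for a generative model over action trajectories.
    H-GAP would sample from a model trained on MoCapAct-style
    humanoid data; here we simply draw random continuous actions."""

    def __init__(self, action_dim, rng=None):
        self.action_dim = action_dim
        self.rng = rng or np.random.default_rng(0)

    def sample(self, state, horizon, n_candidates):
        # Shape: (n_candidates, horizon, action_dim).
        return self.rng.uniform(
            -1.0, 1.0, size=(n_candidates, horizon, self.action_dim)
        )


def mpc_step(prior, state, step_fn, reward_fn, horizon=16, n_candidates=64):
    """One planning step: sample candidate action sequences from the
    prior, roll them out with a dynamics function, score them under
    the downstream task reward, and return the first action of the
    best-scoring candidate. The rest of the plan is discarded and
    re-planned at the next step, as is standard in MPC."""
    candidates = prior.sample(state, horizon, n_candidates)
    returns = np.zeros(n_candidates)
    for i, actions in enumerate(candidates):
        s = state
        for a in actions:
            s = step_fn(s, a)
            returns[i] += reward_fn(s, a)
    return candidates[np.argmax(returns)][0]


# Toy usage: steer a 1-D point mass (position, velocity) to the origin.
if __name__ == "__main__":
    prior = RandomTrajectoryPrior(action_dim=1)
    step = lambda s, a: np.array([s[0] + s[1], s[1] + 0.1 * a[0]])
    reward = lambda s, a: -abs(s[0])
    state = np.array([1.0, 0.0])
    for _ in range(50):
        state = step(state, mpc_step(prior, state, step, reward))
    print("final position:", state[0])
```

In this framing, the generative prior restricts the search to plausible motor behaviors, which is what lets planning scale to a 56-degree-of-freedom action space where naive random shooting would fail.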
