# MetaHuman Motion Matching
Category: Animation & Motion

Build Pose Search databases for MetaHumans by turning retargeted third-party animation libraries into responsive motion systems.
## What this skill does
MetaHuman Motion Matching gives you expert guidance on building Pose Search (Motion Matching) databases specifically for MetaHuman characters using retargeted third-party animation. This skill bridges the gap between importing raw retargeted clips and producing seamless, responsive motion systems with natural transitions. You will learn how to construct Pose Search Schemas tuned for MetaHuman proportions, organize clips into multi-database strategies for different movement contexts, integrate with the MetaHuman Animation Blueprint, tune trajectory prediction, configure inertialization blending, and add post-process foot IK via Control Rig for ground contact accuracy.
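To make the core idea concrete, here is a minimal conceptual sketch of what a Pose Search query does every frame: the database stores one feature vector per indexed pose (bone positions, velocities, trajectory samples, as defined by the Schema), and the query returns the pose whose weighted distance to the current character state is smallest. All names below are illustrative; this is not the Unreal PoseSearch API, and the real system uses optimized indexing rather than a brute-force scan.

```python
def pose_cost(query, candidate, weights):
    """Weighted squared distance between two pose feature vectors."""
    return sum(w * (q - c) ** 2 for q, c, w in zip(query, candidate, weights))

def find_best_pose(query, database, weights):
    """Return (index, cost) of the database pose closest to the query."""
    best_index, best_cost = -1, float("inf")
    for i, candidate in enumerate(database):
        cost = pose_cost(query, candidate, weights)
        if cost < best_cost:
            best_index, best_cost = i, cost
    return best_index, best_cost

# Tiny example: three poses with two features each
# (e.g. foot height and root speed), equally weighted.
database = [[0.0, 1.0], [0.5, 2.0], [1.0, 3.0]]
query = [0.6, 2.1]
weights = [1.0, 1.0]
index, cost = find_best_pose(query, database, weights)  # index 1 is closest
```

Schema tuning in Unreal amounts to choosing which features go into these vectors and how heavily each is weighted, which is why a schema built for MetaHuman proportions matches better than a generic one.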
## Covers
- Pose Search Schema for MetaHuman proportions
- Database construction from categorized clips
- Multi-database strategy (locomotion, combat, conversation, idle)
- MetaHuman Animation Blueprint integration
- Trajectory prediction tuning
- Inertialization blending configuration
- Runtime database switching
- Post-process foot IK via Control Rig
- Motion Warping integration
- AgentUX automation workflows
## Does not cover
- Animation import and retargeting → metahuman-motion-library
- Basic Motion Matching concepts → motion-matching
- Facial animation → face-capture
- MetaHuman character setup → metahuman-setup
## How to use
Invoke directly in Claude Code:

`/skill metahuman-motion-matching`

This skill is also auto-detected when your prompt mentions MetaHuman motion matching, pose search MetaHuman, motion flow, or seamless animation intent. AgentUX will automatically activate MetaHuman Motion Matching when it recognizes you are building responsive motion systems for MetaHuman characters.
## Key Unreal Engine concepts
| Concept | Description |
|---|---|
| UPoseSearchDatabase | The indexed animation database searched every frame to find the best pose match for the character's current state and predicted trajectory. |
| UPoseSearchSchema | Defines which features (bone positions, velocities, trajectory points) are compared during the search, tuned here for MetaHuman skeleton proportions. |
| Inertialization Blending | A transition technique that decays the difference between old and new poses over time, providing seamless clip switches without cross-fade latency. |
| Motion Warping | Runtime root motion adjustment that aligns character movement with gameplay targets, ensuring animations land precisely where needed. |
| Multi-Database Strategy | Organizing animations into separate Pose Search Databases by movement context (locomotion, combat, conversation) and switching between them at runtime. |
| Trajectory Prediction | The system that projects future character movement from input and velocity, enabling Motion Matching to select animations that match intended direction and speed. |
| Control Rig Post-Process | A Control Rig applied after animation evaluation to add procedural corrections like foot IK for ground contact, preventing foot sliding on uneven terrain. |
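The inertialization row above can be illustrated with a small numeric sketch. At the moment of a clip switch, record the offset between the old and new pose, then decay that offset toward zero over the blend duration while playing only the new clip. This is a simplification: the offset here is a single scalar and the decay is a cubic ease-out, whereas Unreal inertializes per bone with a higher-order polynomial, but the principle is the same and it shows why there is no cross-fade latency (the new clip drives the pose from frame one).

```python
def inertialize(offset_at_switch, t, blend_time):
    """Remaining pose offset t seconds after the switch (cubic ease-out)."""
    if t >= blend_time:
        return 0.0
    x = t / blend_time          # normalized time in [0, 1)
    decay = (1.0 - x) ** 3      # 1 at the switch, 0 at blend end
    return offset_at_switch * decay

def blended_pose(new_pose_value, offset_at_switch, t, blend_time):
    """The new clip's pose plus the decaying offset from the old clip."""
    return new_pose_value + inertialize(offset_at_switch, t, blend_time)

# At the switch the character still looks like the old pose (offset fully
# applied); by the end of the blend only the new clip remains.
start = blended_pose(2.0, 1.0, 0.0, 0.2)   # 3.0: new pose + full offset
end = blended_pose(2.0, 1.0, 0.2, 0.2)     # 2.0: offset fully decayed
```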
## Related skills
- `metahuman-motion-library`: Acquire and retarget third-party animation
- `motion-matching`: Motion Matching via Pose Search
- `metahuman-setup`: Import and configure MetaHuman characters
- `control-rig`: Control Rig node-based rigging
- `enhanced-input`: Enhanced Input System
- `python-editor-scripting`: Automate the editor with Python
## What you'll learn
- How to build Pose Search databases tuned for MetaHuman skeleton proportions
- Organizing animations into multi-database strategies for different movement contexts
- Integrating Motion Matching into MetaHuman Animation Blueprints for responsive motion
- Adding post-process foot IK via Control Rig for accurate ground contact
- Switching databases at runtime for seamless transitions between locomotion, combat, and conversation
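The runtime database switching mentioned above can be sketched as a simple gameplay-state-to-database mapping. This is illustrative only: the `MovementContext` enum, the priority order, and the `PSD_MetaHuman_*` asset names are hypothetical, and in Unreal this selection is typically driven by a Chooser table or branching logic in the Animation Blueprint rather than code like this.

```python
from enum import Enum, auto

class MovementContext(Enum):
    LOCOMOTION = auto()
    COMBAT = auto()
    CONVERSATION = auto()
    IDLE = auto()

# Hypothetical Pose Search Database identifiers, keyed by movement context.
DATABASES = {
    MovementContext.LOCOMOTION: "PSD_MetaHuman_Locomotion",
    MovementContext.COMBAT: "PSD_MetaHuman_Combat",
    MovementContext.CONVERSATION: "PSD_MetaHuman_Conversation",
    MovementContext.IDLE: "PSD_MetaHuman_Idle",
}

def select_database(is_in_combat, is_talking, speed):
    """Derive the movement context from gameplay state, then map it to
    the database the Motion Matching node should search this frame.
    Combat takes priority over conversation, which takes priority over
    plain locomotion."""
    if is_in_combat:
        context = MovementContext.COMBAT
    elif is_talking:
        context = MovementContext.CONVERSATION
    elif speed > 0.0:
        context = MovementContext.LOCOMOTION
    else:
        context = MovementContext.IDLE
    return DATABASES[context]
```

Because inertialization handles the blend at the moment of the switch, changing the active database mid-motion still produces a seamless transition.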