An Automated Pipeline for Generating Run-Time Rigs

Between catching the flu at GDC and then digging out from all the work that piled up while I was away, I’ve fallen a little behind, but I wanted to make sure I posted a write-up of my main conference presentation from GDC this year.

GDC and a New Year’s Resolution

Well, 2010 is now done, and I think I’ve selected some important goals for myself this year: learn to say “no” to interesting projects, and finish my own stuff! I have a TKO update forging ahead behind the scenes, and in the process I’ve built another tool I wanted to show, since it will be part of my complete Maya-Unity tools release coming soon.

Alongside the TKO update, I’m promising a 0.1 release of the Maya-Unity tools by GDC! The board has finally approved a submission from me, so I’ll be doing a session on automated pipelines for generating run-time rigs. I want to have a version of the tools ready to go by then so that anyone interested can download them and have a look. For anyone who was there, my GDC session will essentially be an expanded, much more detailed look at some of the material I showed during my talk on advanced editor scripting at Unite 2010.

Unite is actually a good segue into the video I have to share today, too. At the conference, I showed some examples of using editor scripts to generate sparse blend shape data offline. The point was to demonstrate how editor scripts can reduce run-time computation, but unfortunately I didn’t have a good pipeline for it yet. So this week I put together a decent first pass at native blend shape support. It, too, will be part of the 0.1 release of my Maya-Unity workflow, so hopefully a couple of you out there are looking forward to it. Have a great 2011!
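To make the offline-baking idea concrete, here is a minimal, language-agnostic sketch in Python (not the actual tool, which operates on Maya and Unity mesh data; the function names are hypothetical). The editor-time step records only the vertices a shape actually moves, so the run-time step does work proportional to that sparse set rather than the whole mesh:

```python
# Sketch: bake sparse blend-shape deltas offline, then apply them at run time.
# Illustrative only; a real Unity editor script would read Mesh vertex arrays
# rather than plain tuples, and bake_sparse_deltas is a hypothetical name.

EPSILON = 1e-6  # ignore vertices the shape doesn't actually move

def bake_sparse_deltas(base_verts, target_verts):
    """Offline step: keep only (index, delta) pairs for vertices that moved."""
    deltas = []
    for i, (b, t) in enumerate(zip(base_verts, target_verts)):
        d = (t[0] - b[0], t[1] - b[1], t[2] - b[2])
        if max(abs(c) for c in d) > EPSILON:
            deltas.append((i, d))
    return deltas

def apply_blend_shape(verts, sparse_deltas, weight):
    """Run-time step: touch only the stored vertices, scaled by the weight."""
    out = list(verts)
    for i, (dx, dy, dz) in sparse_deltas:
        x, y, z = out[i]
        out[i] = (x + weight * dx, y + weight * dy, z + weight * dz)
    return out

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
shape = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (0.0, 1.0, 0.0)]  # only vertex 1 moves
deltas = bake_sparse_deltas(base, shape)      # one (index, delta) pair
blended = apply_blend_shape(base, deltas, 0.5)
```

The payoff is that a face shape touching a few dozen vertices on a several-thousand-vertex head costs only a few dozen adds per frame.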

Biped Editor Released

If you were at Unite, you already know about this. Unity announced their Asset Store at the keynote yesterday, which lets developers sell or share assets, editor extensions, and so on, right inside the Unity editor, for users to download directly into their projects. As such, I wrapped up a first release of my Biped Editor and put it up with full source code. You can get it from the Asset Store now if you download Unity 3.1. I also put together a quick tutorial video so anyone who downloads it can get an idea of what the feature set for this release looks like. Hope some of you out there find it useful!

Note: for best viewing, I recommend turning the resolution on the video up to 1080p and watching full-screen, since that’s my native resolution and it will be easier to read some of the buttons.

Biped Editor for Unity 3.0

I’ve been really head-down with work for a while, the latest of which has involved some pretty substantial refactoring of Touch KO for a big update this summer (finally!). Since I’ve been working in Unity 3.0, I thought I’d share a short clip of a tool I developed during the refactoring process. It’s a biped component, along with an editor for interactively adjusting collision shapes and sizes as well as joint limits. The component can also perform an automatic mass distribution based on human values, and it provides interfaces for entering and exiting ragdoll. If I manage to get caught up on things after the update, I’ll probably share the code on here eventually, but, as anyone who follows the site may have guessed, it’s been really busy lately O_o.
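The automatic mass distribution could be sketched roughly like this (illustrative Python, not the component’s actual code; the per-segment fractions below are ballpark anthropometric figures I’ve chosen for the example, not the values the tool ships with):

```python
# Sketch: distribute a character's total mass across ragdoll rigid bodies
# using per-segment fractions of total body mass. The fractions are rough
# example values, not the Biped Editor's actual table.

SEGMENT_MASS_FRACTIONS = {
    "pelvis": 0.14, "torso": 0.34, "head": 0.08,
    "upper_arm": 0.03, "forearm": 0.02, "hand": 0.01,   # per side
    "thigh": 0.10, "calf": 0.045, "foot": 0.015,        # per side
}

def distribute_mass(total_mass, sides=("left", "right")):
    """Return a per-body-part mass dict whose values sum to total_mass."""
    masses = {}
    for part, fraction in SEGMENT_MASS_FRACTIONS.items():
        if part in ("pelvis", "torso", "head"):
            masses[part] = fraction * total_mass
        else:
            for side in sides:  # limb segments exist on both sides
                masses[side + "_" + part] = fraction * total_mass
    return masses

masses = distribute_mass(80.0)  # an 80 kg character
```

In Unity terms, each resulting value would be assigned to the corresponding body part’s Rigidbody mass, so tweaking one total-mass field keeps the whole ragdoll plausibly proportioned.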

Note: for best viewing, I recommend turning the resolution on the video up to 1080p and watching full-screen, since that’s my native resolution and it will be easier to read some of the buttons.

Root Motion Computer for Unity

I recently finished up a new component for Unity as part of a contract job with Mixamo. For those of you who are unaware, Mixamo provides an online motion capture download and retargeting service powered by HumanIK. It lets you use sliders to creatively adjust a piece of motion capture data and then automatically retarget it onto your own hierarchy and download it for use in your game. It’s really slick and pretty affordable, so it’s definitely worth checking out, particularly for any indie developers out there.

At any rate, the component that I created is designed to let animation data, rather than procedural velocity values, drive a character’s motion in space. The component sits on top of Unity’s animation API, so you can simply play, crossfade, and blend animations using any of the existing API methods, and the computer takes care of everything else after the fact. It works by tracking the position and rotation of the pelvis in the space of the character’s root node for each active AnimationState, applying that motion back to the root node itself, and snapping the pelvis into its position hovering over the root. Since the source code is all available in the project, I won’t belabor the details too much here, but you can certainly ask me if you have any questions. (The one thing perhaps worth mentioning, as an addendum to the video, is that the pelvis forward axis is not strictly necessary for computing output, only for displaying debug information; for computation, the character’s rotation is determined using the pelvis right axis.)
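For a feel of the per-frame bookkeeping, here is a heavily simplified sketch in Python (not the component’s source; the class and method names are hypothetical). It handles translation only, on the ground plane, and omits the rotation handling and per-AnimationState blending the real component performs:

```python
import math

# Sketch of root-motion transfer: the animation moves the pelvis in the
# root node's local space; each frame we move the root by that delta in
# world space, leaving the pelvis hovering over the root. Illustrative
# only; translation on the XZ plane, no rotation or blending.

class RootMotionComputer:
    def __init__(self):
        self.root_pos = [0.0, 0.0]   # root node position in world space (x, z)
        self.root_angle = 0.0        # root yaw in radians
        self.prev_pelvis = None      # pelvis position in root space, last frame

    def update(self, pelvis_in_root_space):
        """Feed the animated pelvis position (in root space) once per frame."""
        if self.prev_pelvis is not None:
            dx = pelvis_in_root_space[0] - self.prev_pelvis[0]
            dz = pelvis_in_root_space[1] - self.prev_pelvis[1]
            # Rotate the local-space delta into world space, then move the root.
            c, s = math.cos(self.root_angle), math.sin(self.root_angle)
            self.root_pos[0] += c * dx - s * dz
            self.root_pos[1] += s * dx + c * dz
        self.prev_pelvis = pelvis_in_root_space
        # The pelvis itself stays over the root, so the visible character
        # ends up wherever the animation carried it.

computer = RootMotionComputer()
for frame in range(5):
    computer.update((0.0, 0.1 * frame))  # a walk cycle carries the pelvis forward
```

The real component also has to handle a looping clip wrapping back to its first frame, and it derives the character’s rotation from the pelvis right axis as noted above; both are left out here for brevity.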

You can download an example project from Mixamo that contains the component as well as a sample character with some animations. Because the tutorial video on the Mixamo website is compressed pretty substantially, I have also uploaded a copy to my Vimeo account in case you would like to watch it in full HD resolution.