On Tuesday I did my first test with iPi Soft - markerless motion capture. This is the main piece of equipment I'll be using for the practical side of my extended essay. I had previously tested it out during summer, however I wasn't able to export any data because it was a trial version. That earlier test did mean I knew the basics of setting everything up and recording, though. When setting up the Kinect for recording, you have to sort out the background first. Choosing depth mode in the recorder lets you see the scene in different colours; the main thing you want to get rid of is any areas of yellow, so the Kinect can get the best scan possible. Another thing to consider is clothing: wear something slim-fitting and, if you have long hair, tie it back. (I didn't tie my hair back, so the head on the pure motion capture test was very wonky.)
For this initial test I just used one Kinect. The more Kinects you use, the better the scan, but the longer the processing takes. There wasn't much movement happening in this test, so one Kinect was more than enough for what I wanted to capture. Before recording you must take a snapshot of the background (once this has been done the camera cannot be moved), then you can go ahead and record. The actor must strike a T-pose before commencing any main action, as this helps with the building and tracking of the character.
Once this was all recorded, there were a few options within iPi Mocap Studio to edit the data and fit it onto a character. Using tools such as Refit Pose (which fits the model to the motion capture data) and Track Forward (which tracks the data onto the character) will get you what you need to export the BVH file and take it into another programme. It is also possible to import a pre-rigged character into Mocap Studio and retarget the data to it there, which eliminates the need for another piece of software such as MotionBuilder.
I tried taking the BVH file into MotionBuilder and editing it there, however I had some issues and wasn't able to pin the feet down very well. I thought flattening out the F-Curves would hold the pose, yet this wasn't the case. This test had a lot of issues with the feet, and this may have been because the floor was shiny - the Kinect may not have coped very well with the reflections. Hopefully the next test will work out better, as I plan to do it on carpet.
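To show what I mean by "flattening" a curve to hold a pose, here is a rough sketch of the idea outside MotionBuilder. Real F-Curves have tangents and live inside the software; in this hypothetical example keys are just (frame, value) pairs, and "flattening" means holding one value across a range so a foot stays planted:

```python
# Hypothetical sketch of flattening an F-Curve segment to hold a foot pose.
# Keys are plain (frame, value) tuples, not real MotionBuilder F-Curve keys.

def flatten_segment(keys, start, end):
    """Hold the value found at `start` across every key in [start, end]."""
    held = dict(keys).get(start)
    if held is None:
        raise ValueError("no key at the start frame")
    return [(f, held if start <= f <= end else v) for f, v in keys]

# A foot translate-Y channel that drifts while the foot should be planted:
foot_y = [(0, 0.0), (5, 0.4), (10, 0.9), (15, 0.2), (20, 0.0)]
print(flatten_segment(foot_y, 0, 15))
# → [(0, 0.0), (5, 0.0), (10, 0.0), (15, 0.0), (20, 0.0)]
```

In my test this approach still didn't pin the feet, probably because the underlying capture data itself was drifting, not just the curve in between keys.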
I took the FBX file into Maya and retargeted it onto a character rig, following a Digital Tutors tutorial which helped me greatly with this stage. Within MotionBuilder this is a much simpler process; in Maya it takes a lot longer. Luckily a script was provided to speed the whole thing up - without it you would need to group each control and create further groups for retargeting, so all in all it wouldn't have been as simple as MotionBuilder. To clean up the data I had to bake the animation and then simplify the curves. Simplifying reduced the number of keyframes, and I was then able to go in and edit them. I found this stage quite easy and had no issues with the clean-up. Because of the restricted space I worked in, the jump I performed was very unenthusiastic, with not much energy, so when cleaning up I decided to make it more obviously a jump and pushed it a bit further (without making it too exaggerated, as this is what I want to keep for the pure keyframe animation version).
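The curve simplification step works roughly like this: baked motion capture puts a key on every frame, and simplifying throws away any key that the surrounding keys already predict well enough. Maya's own simplify is more sophisticated, but a minimal stand-in sketch (my own illustration, not Maya's actual algorithm) would be:

```python
# A crude stand-in for curve simplification after baking: drop any key
# that linear interpolation between its neighbours reproduces within `tol`.
# Keys are hypothetical (frame, value) pairs, not real Maya anim curves.

def simplify(keys, tol=0.05):
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        (t0, v0) = kept[-1]
        (t1, v1) = keys[i]
        (t2, v2) = keys[i + 1]
        # value the curve would have at t1 if this key were removed
        interp = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(interp - v1) > tol:
            kept.append(keys[i])  # key carries real shape, keep it
    kept.append(keys[-1])
    return kept

# A baked ramp with one corner: the straight run collapses, the corner stays.
baked = [(0, 0.0), (1, 1.0), (2, 2.0), (3, 3.0), (4, 2.0)]
print(simplify(baked))
# → [(0, 0.0), (3, 3.0), (4, 2.0)]
```

Fewer keys is what makes the hand editing practical: instead of nudging a value on every single frame, you only adjust the handful of keys that define the motion.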
As you can see below, the first video is the pure motion capture data retargeted to a character rig (provided by Digital Tutors). Underneath that is the cleaned-up motion capture, which I keyframed in order to correct some of the movements. I had to do a lot more clean-up and keyframe editing than I had anticipated - this may have been due to the environment affecting the data capture. Maybe next time it will take less time, but as it stands, motion capture and keyframing could take about the same amount of time.