Week 4 Update and Thoughts

Topic Shift

After thinking it over, I'm going to move my research in more of a live-action mocap direction. The most stunning example I can think of using a live-action performance has to be Hellblade: Senua's Sacrifice and its tech demo. But the topic of mocap is pretty wide, so I want to narrow my focus to accurate facial expression capture, since facial expression is the most effective way for people to read emotion.


I think many of the resources Ninja Theory used are far out of reach for most indie developers. So I wanted to find more budget-conscious ways to capture facial animation, starting with an iPhone X. Though the iPhone is pricey on its own, it's a very portable device that doesn't require multiple cameras or sensors, and I already own one (and don't own a webcam), so I might as well work with what I have for now!

Since I'm shifting gears late in the game, what I could do this week was more limited, but I do have a more solid direction for where I want to look.

FaceRig

One of the first things I tested out, just in its basic form, was FaceRig for iOS. Its biggest attractor: it's technically free, with in-app purchases. FaceRig on the phone has the benefit of letting you record your own live-action mocap performance and save it on your phone, upload it to Google Drive, email it, and so on. But you are limited to the avatars FaceRig provides: the six or seven free ones, with the rest purchased individually or in a pack.

[raccoon test] [cartoon lady test] [Live2D dog test]

My conclusion on FaceRig for the iPhone is that the facial capture itself doesn't seem too bad, though after it automatically maps your face you can't edit the tracking points it places to improve the mocap's accuracy, which obviously hurts the final result. The auto-capture definitely misplaced the centers of my eyes, and if I drastically shifted or turned my head, an eye would stay out of place until I stopped the recording.

Another downside is that the models provided are very cartoony and limited in their facial expressions, mostly focused on mouth movements and eyebrows. This may be to accommodate the Live2D rigs the app is also compatible with, but it results in mocap that isn't very expressive and relies heavily on the actor's voice.

It did have the benefit of no limit on recording length, which is nice considering all the app's other limitations. That said, it seems to be more of a party-trick app that serves as an advertisement for the desktop product. So I'll have to look into the desktop version, which is far more customizable, and test facial rigs that can express more emotion.

Pros:

  • it's free with its own rigs
  • it's kind of customizable
  • there's no limit on recording time

Cons:

  • its models are very limited
  • you can't export the data outside of the app
  • basically, it's a party trick; you'd need the studio version to do anything original

LIVE Face


LIVE Face is another free iOS app, created by Reallusion, that does nothing but capture your facial mocap. It's the same tech that Animoji uses for performance capture. However, to use the data outside of the app, you must have a compatible program to send it to. For now, I was having trouble getting it to connect to one of its sister programs, iClone 7. I didn't realize you needed the absolute newest version of iClone to connect to the app, so that was great.
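
Quick aside on what that information actually is: under the hood, the iPhone's TrueDepth tracking (what ARKit exposes, and what these apps are built on) boils each video frame down to a set of named blendshape coefficients between 0 and 1, plus a head transform. Here's a rough Python sketch of what one captured frame looks like; the values are made up, and the idea that LIVE Face forwards exactly this layout is my assumption based on ARKit's documented blendshapes, not Reallusion's spec.

    # Rough illustration of one frame of TrueDepth facial capture, as ARKit
    # exposes it: named blendshape coefficients in [0, 1] plus head rotation.
    # Values are made up; the exact payload LIVE Face streams is proprietary.
    frame = {
        "timestamp": 0.033,           # seconds since capture started
        "blendshapes": {
            "eyeBlinkLeft":  0.05,    # 0 = eye open, 1 = fully closed
            "eyeBlinkRight": 0.04,
            "jawOpen":       0.62,    # mid-vowel mouth shape
            "browInnerUp":   0.30,    # inner brows raised
            # ...ARKit defines around 50 such coefficients in total
        },
        "head_rotation": (2.0, -5.5, 0.3),  # pitch, yaw, roll in degrees
    }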

So instead I tested a simpler program, CrazyTalk Animator, which only works in 2D. CrazyTalk itself is another program that helps you set up assets for animation, but Animator is the one compatible with LIVE Face, so I figured I'd test that instead.



I tested some of the base resources included in the Animator program itself, like this lovely cartoon lady. I also quickly rigged a picture of my younger brother to test in the program, and... well.

The Results

Mildly terrifying so far. That said, I didn't clean up the picture at all, since I was just testing the waters. Done properly, the eyes and mouth could look much more convincing, but I also didn't want to test this program any further; just attempting to rig this up showed me a lot of its limitations. CrazyTalk Animator is only intended to be a 2D animator, so the results were always going to be a bit flawed, though it was fun to experiment. Based on the more cartoony example, I do think this program could make mocap for Flash/vector-style animation, or even retro low-poly games, far easier, but I wanted to look into real-time 3D rendering.

Honestly, getting LIVE Face to work with any program was a bit of a headache. I'm glad I mostly used the assets they provided, because I quickly rigged my brother's head in 3D through CrazyTalk itself, only to find out it wasn't compatible with CrazyTalk Animator, making that effort entirely pointless.

I then wanted to take LIVE Face's mocap outside of Reallusion's own programs, only to find it takes a whole chain of them to finally export the data.


To be honest, I found this out late in the weekend, so I couldn't work on it more, but I'd like to try this data on 3D models in the coming week. I'll have to get the free trial of the newest iClone to test 3D animation with LIVE Face, because the app really does seem to capture subtle details; I just couldn't quite get the 2D programs to work with it.



I did find there's a way to use LIVE Face's mocap without all these extra programs and bring it into Houdini, so I might look into that, although all the notes on it are in Chinese. That may be easier and cheaper for indie developers than buying and learning a whole suite of Reallusion programs.
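
I haven't dug into those notes yet, but conceptually the receiving side doesn't need to be complicated. Below is a minimal Python sketch of a listener that could be adapted into a Houdini Python node: it assumes the phone streams one JSON dictionary of blendshape weights per UDP packet on port 9000. The port, packet format, and field names are all my placeholders; LIVE Face's real wire protocol isn't documented here.

    # Hypothetical receiver sketch: read streamed facial-capture frames.
    # ASSUMPTIONS: one JSON object per UDP packet, shaped like
    # {"jawOpen": 0.42, "eyeBlinkLeft": 0.90, ...}; port 9000 is made up.
    import json
    import socket

    PORT = 9000  # placeholder; whatever port the capture app streams to

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))

    while True:
        packet, _addr = sock.recvfrom(65535)
        weights = json.loads(packet.decode("utf-8"))
        # In Houdini, this loop could live in a Python SOP, driving
        # blendshape inputs or channels instead of printing.
        for name, value in sorted(weights.items()):
            print(f"{name}: {value:.2f}")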

Pros:

  • Can connect to other programs; the mocap data can be transferred to things like Unity, Blender, Maya, Unreal, etc. (see the Blender sketch after the cons list)
  • Surprisingly great facial capture using the iPhone's TrueDepth camera
  • Does let you try out the Motion LIVE capture before you buy!

Cons:

  • Requires a lot of programs to actually get all the mocap data outside of the app; you'd be down at least $500 just for the iPhone capture data
  • Might as well go for the full-body mocap at that point

That said, I really couldn't test this one properly because of how many extra programs it takes to export the facial capture data into Maya or Blender. I'd like to look into that more this week; since I need to update iClone anyway, I'll probably see if I can make full use of their trial version.
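
To make the "transfer to Blender" idea concrete: once the capture is out of Reallusion's ecosystem as per-frame blendshape weights, applying it is just keyframing shape keys. Here's a minimal Blender Python sketch; it assumes a mesh object named "Face" whose shape keys share the capture's blendshape names (ARKit-style, like jawOpen), and the two frames of data are stand-ins, not a real export.

    # Minimal Blender sketch: keyframe captured blendshape weights onto
    # shape keys. ASSUMPTIONS: a mesh object named "Face" exists with shape
    # keys matching the capture's blendshape names; `frames` is stand-in data.
    import bpy

    frames = [
        {"jawOpen": 0.10, "eyeBlinkLeft": 0.00},  # frame 1
        {"jawOpen": 0.55, "eyeBlinkLeft": 0.95},  # frame 2
    ]

    key_blocks = bpy.data.objects["Face"].data.shape_keys.key_blocks

    for frame_number, weights in enumerate(frames, start=1):
        for name, value in weights.items():
            key = key_blocks.get(name)
            if key is None:
                continue  # skip blendshapes this rig doesn't have
            key.value = value
            key.keyframe_insert(data_path="value", frame=frame_number)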

Conclusion

So, to be perfectly honest, my focus on iPhone mocap resulted in a whole lot of nothing this week. It's to be expected; the iOS versions of these facial mocap apps are generally advertisements for the more powerful desktop products. But it does show there's strong potential in using an iPhone to capture motion data. There just aren't many consumer apps that let you send that mocap anywhere other than programs made by the same developer.

There are more options to look into, like webcam and Kinect mocap, as other cheap alternatives, but for now it seems like iPhone mocap tech is still largely experimental.
