See the Manual's "Lip syncing" chapter for details on AC's lip-syncing features.
With lip-syncing enabled, a Sprites Unity Complex character can be assigned a Float parameter that is mapped to the normalized phoneme frame.
If you're using a custom animation engine such as Spine, a custom script can read the character's GetLipSyncNormalised value in an Update function and apply it to a "Phoneme" float parameter in your Animator, for example:
using UnityEngine;

public class LipSyncToAnimator : MonoBehaviour
{
    public Animator _animator;
    public AC.Char character;

    void Update ()
    {
        // Pass AC's normalized lip-sync value to the Animator each frame
        _animator.SetFloat ("Phoneme", character.GetLipSyncNormalised ());
    }
}
This parameter can then be used to drive, for example, a Blend Tree that controls which phoneme frame is displayed.
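If your setup can't use a Blend Tree (for example, when swapping Spine mouth attachments directly), the same normalized value can be converted back to a discrete frame index instead. A minimal sketch, assuming GetLipSyncNormalised spreads its 0-1 value evenly across the phoneme frames; the frame count and the SetMouthShape helper are hypothetical placeholders for your own Spine logic:

```csharp
using UnityEngine;

public class SpineLipSync : MonoBehaviour
{
    public AC.Char character;
    public int numPhonemeFrames = 9; // assumption: e.g. Rhubarb's nine mouth shapes

    void Update ()
    {
        // Map the 0-1 normalized value back to a discrete phoneme frame index
        float normalised = character.GetLipSyncNormalised ();
        int frameIndex = Mathf.Clamp (Mathf.RoundToInt (normalised * (numPhonemeFrames - 1)), 0, numPhonemeFrames - 1);
        SetMouthShape (frameIndex);
    }

    void SetMouthShape (int frameIndex)
    {
        // Hypothetical: swap the active mouth attachment in your Spine skeleton here
    }
}
```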
Comments
https://github.com/DanielSWolf/rhubarb-lip-sync/blob/master/extras/EsotericSoftwareSpine/README.adoc