About thirty years ago, when I was involved with computer animation at Imperial College, one of the projects we looked at was automated lipsync.
In principle it was fairly easy: a set of lip shapes was matched to a set of phonemes in a look-up table, and you typed in the syllables and gave each one a number of frames.
In those days there was no instant playback, so if you got the timing wrong it took some time to check. But the approach wasn't practical for other reasons: different faces need different shapes, and size, gender, age, and accent all affect the shape of the lips.
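The lookup-table approach described above can be sketched in a few lines of code. This is a minimal illustration, not the original system: the phoneme symbols, shape names, and frame counts here are all invented for the example.

```python
# Map each phoneme to a mouth shape (a "viseme" in modern terms).
# These pairings are illustrative only.
PHONEME_TO_SHAPE = {
    "AH": "open",
    "EE": "wide",
    "OO": "rounded",
    "M": "closed",
    "F": "teeth-on-lip",
}

def build_frame_list(syllables):
    """Expand (phoneme, frame_count) pairs into one mouth shape per frame."""
    frames = []
    for phoneme, count in syllables:
        # Fall back to a neutral "rest" shape for unknown phonemes.
        shape = PHONEME_TO_SHAPE.get(phoneme, "rest")
        frames.extend([shape] * count)
    return frames

# Typing in syllables and giving each a number of frames, as described:
frames = build_frame_list([("M", 2), ("AH", 4), ("M", 2)])
print(frames)
```

The catch the text mentions is built into this design: one fixed table serves every character, so a different face (or age, gender, or accent) would need its own `PHONEME_TO_SHAPE` table.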
I am not an animator but a scriptwriter, and my interest was the idea that one day I would be able to type in a script and both see and hear it played back. I thought it might always be a pipe dream, but I have recently seen some examples that come close to doing just that. One such program is CrazyTalk (http://www.reallusion.com/go/crazytalk/default.htm), which lets you record your voice and have it spoken by an animated character.
I recently searched for "lipsync for animation" and got over 300,000 hits, so developments have moved on.
I haven't tested them, but here are a couple of examples you may like to look at to see the current state of the art.
There are also some other sites worth a look: Lip Sync