Summary: | 碩士 (Master's) === 國立中山大學 (National Sun Yat-sen University) === Department of Computer Science and Engineering (資訊工程學系研究所) === 88 === 3D animation has developed rapidly in multimedia, appearing in computer games, virtual reality, and film. How to build a 3D model that is truly lifelike, especially in its facial expressions, and capable of vivid motion is therefore a significant issue. At present, methods for constructing 3D facial models fall into two categories: one is based on computer graphics techniques, such as geometric functions, polygons, or simple geometric primitives; the other measures a real face with hardware, such as a laser scanning system or a three-dimensional digitizer. The principal methods for acquiring 3D facial expressions are keyframing, motion capture, and simulation.
The research covers two areas:
1. Use two CCD cameras to digitize a real person's facial expressions simultaneously from the left and right sides, and save the captured images as standard images. Then extract matched feature points from the two standard images in the space domain, and apply stereo matching to obtain the depth information used to build the 3D facial model.
2. Use one CCD camera to digitize two successive facial expressions, and obtain the coordinates of the matched feature points in the time domain, from which the motion vectors are calculated. Both steps are sketched right after this list.
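The two steps above can be illustrated with a short sketch. The depth relation Z = f*b/d for a rectified stereo pair and the frame-difference motion vectors below are standard techniques consistent with this description; the function names and the parameters f (focal length in pixels) and b (baseline) are illustrative assumptions, since the abstract does not give the actual camera setup.

    def depth_from_disparity(x_left, x_right, f, b):
        # Space-domain step: classic triangulation for a rectified stereo
        # pair, Z = focal_length * baseline / disparity (assumed model).
        disparity = x_left - x_right
        if disparity <= 0:
            raise ValueError("matched points must have positive disparity")
        return f * b / disparity

    def motion_vectors(points_t0, points_t1):
        # Time-domain step: 2D displacement of each tracked feature point
        # between two successive frames from the single CCD camera.
        return [(x1 - x0, y1 - y0)
                for (x0, y0), (x1, y1) in zip(points_t0, points_t1)]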
By combining the depth information from the space domain with the motion vectors from the time domain, the motion sequence of the 3D facial model can therefore be obtained.
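One simple way to realize this combination, assuming each reconstructed vertex carries a stereo depth and a tracked 2D motion vector, is to displace the vertex by its image-plane motion while keeping its depth fixed. This per-vertex scheme is only a sketch, not the thesis's exact deformation model.

    def animate_vertex(vertex, motion):
        # vertex: (x, y, z) from stereo reconstruction (space domain);
        # motion: (vx, vy) from feature tracking (time domain).
        x, y, z = vertex
        vx, vy = motion
        return (x + vx, y + vy, z)  # depth held fixed in this sketch

    def step_model(vertices, motions):
        # Applying one frame's motion vectors to every model vertex gives
        # one step of the 3D facial model's motion sequence.
        return [animate_vertex(v, m) for v, m in zip(vertices, motions)]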
If enough digitized facial expressions are processed into such motion sequences, a database can be built. By matching feature points between a 2D test image and a 2D standard image in the database, the standard image's depth information and motion vectors can be transferred to the test image, turning it into a 3D model that imitates the expression sequences of the standard images. This feature-point matching between the test image and the standard images can be performed entirely by computer, eliminating unnecessary manual labor; a minimal sketch of the transfer step follows.
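The sketch below assumes feature points are matched by nearest image-plane distance; the abstract does not specify the actual matching criterion, so the nearest-neighbour rule and the helper names here are hypothetical stand-ins.

    import math

    def nearest_feature(point, db_points):
        # Index of the standard image's feature point closest to a test
        # image's feature point (illustrative matching criterion).
        return min(range(len(db_points)),
                   key=lambda i: math.dist(point, db_points[i]))

    def lift_test_image(test_points, db_points, db_depths):
        # Borrow each matched standard point's depth so the 2D test image
        # becomes a 3D model that can reuse the database motion vectors.
        return [(x, y, db_depths[nearest_feature((x, y), db_points)])
                for (x, y) in test_points]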
|