
Research on Key Techniques for Realistic Facial Expression Synthesis

Published: 2018-09-08 17:38
[Abstract]: As an important branch of computer graphics, facial expression animation has long been a research hotspot. The field has produced a large body of results that are widely applied in the film, television, advertising, and game industries. Productions such as King Kong, The Lord of the Rings, and Avatar used large amounts of computer-synthesized facial expression, showing audiences the appeal of expression animation, and facial expression synthesis has become widely familiar. As technology advances, so do the demands on both the realism and the speed of synthesized expression animation; the broad application prospects and technical feasibility of this field will continue to attract growing investment and attention.
This paper reviews the state of the art in facial expression synthesis, classifies the existing methods, and analyzes their respective strengths and weaknesses in detail. On this basis, we examine several key problems in realistic facial expression synthesis and propose systematic solutions, covering the acquisition of facial motion data and extraction of facial expressions, the synthesis of realistic facial expressions, and the editing of facial expressions. Specifically, the work of this paper includes the following aspects:
A high-precision facial expression acquisition and extraction scheme is presented. For facial motion data collected by an optical motion capture system, we use Radial Basis Function (RBF) interpolation to map the data into the coordinate system of a neutral face model, obtaining facial motion data in the neutral model's space. Using markers that are calibrated at acquisition time and unaffected by expression changes, we extract the performer's facial expression information and, at the same time, recover the corresponding rigid head motion.
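The mapping step can be sketched with a plain Gaussian-kernel RBF fit. This is a minimal illustration only: the thesis does not specify its kernel, and production RBF warps typically add a polynomial term to capture the affine part of the motion.

```python
import numpy as np

def fit_rbf_warp(src, dst, eps=1.0):
    """Fit a Gaussian-kernel RBF warp taking marker positions `src` (n, 3),
    measured in capture space, to their positions `dst` (n, 3) in the
    neutral face model's coordinate system."""
    d = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    phi = np.exp(-(d / eps) ** 2)            # (n, n) kernel matrix
    weights = np.linalg.solve(phi, dst)      # per-axis RBF weights, (n, 3)

    def warp(pts):
        """Map arbitrary capture-space points into neutral-model space."""
        d = np.linalg.norm(pts[:, None, :] - src[None, :, :], axis=-1)
        return np.exp(-(d / eps) ** 2) @ weights

    return warp
```

Once fitted on the calibrated markers, the same `warp` is applied to every captured frame to express the whole motion sequence in the neutral model's space.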
A Laplacian-based expression synthesis technique is proposed that preserves the existing detail features of the face model during deformation, guaranteeing the realism of the synthesized expression. For a given face model, we first compute the Laplacian coordinates of every vertex. During synthesis, these Laplacian coordinates are held fixed; from the displacements of the expression feature points and a set of selected fixed points, the new positions of all other vertices on the face model are computed, yielding the new facial expression. Combined with the extracted rigid head motion, we obtain a target face model whose expression resembles the performer's and whose head pose matches.
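The constrained solve can be sketched as a dense least-squares system with uniform (umbrella) Laplacian weights. This is an assumption for illustration: the thesis's actual discretization is not given here, and practical implementations typically use cotangent weights and sparse solvers.

```python
import numpy as np

def laplacian_deform(verts, neighbors, handles, w=100.0):
    """Deform a mesh while keeping per-vertex Laplacian coordinates fixed.

    verts     : (n, d) rest positions
    neighbors : {i: [adjacent vertex indices]}
    handles   : {i: target position} -- expression feature points and the
                chosen fixed points, imposed as soft constraints of weight w
    """
    n = len(verts)
    L = np.eye(n)
    for i, nbrs in neighbors.items():
        for j in nbrs:
            L[i, j] = -1.0 / len(nbrs)       # uniform (umbrella) weights
    delta = L @ verts                        # Laplacian coordinates, held fixed

    C = np.zeros((len(handles), n))
    t = np.zeros((len(handles), verts.shape[1]))
    for r, (i, pos) in enumerate(handles.items()):
        C[r, i] = w
        t[r] = w * np.asarray(pos)

    A = np.vstack([L, C])                    # detail term + handle term
    b = np.vstack([delta, t])
    new_verts, *_ = np.linalg.lstsq(A, b, rcond=None)
    return new_verts
```

Because the rows of `L` sum to zero, the Laplacian coordinates are translation-invariant, so moving all handles rigidly moves the whole model rigidly while surface detail is preserved.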
A new facial expression synthesis method based on geodesic distance and RBF interpolation is proposed. Because the face model contains hole regions such as the mouth and eyes, the Euclidean distance can differ greatly from the geodesic distance measured along the surface, and conventional Euclidean-distance RBF interpolation tends to stretch these hole regions. This paper introduces an approximate geodesic distance computation that measures the geodesic distance from the expression feature points to the other vertices of the face model. Using geodesic distance to weigh the mutual influence between vertices, combined with RBF interpolation, realistic facial expressions are synthesized.
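The thesis defines its own approximate geodesic rule; a common stand-in, shown here purely for illustration, is Dijkstra's algorithm over the mesh edge graph with edge lengths as weights. Substituting these distances for the Euclidean ones in the RBF kernel is what keeps vertices on opposite sides of the mouth or eyes from dragging each other.

```python
import heapq
import numpy as np

def geodesic_distances(verts, edges, source):
    """Approximate geodesic distance from `source` to every vertex by
    running Dijkstra over the mesh edge graph (edge weight = edge length)."""
    adj = {i: [] for i in range(len(verts))}
    for i, j in edges:
        d = float(np.linalg.norm(verts[i] - verts[j]))
        adj[i].append((j, d))
        adj[j].append((i, d))

    dist = [float("inf")] * len(verts)
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                      # stale queue entry
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist
```

On a square "hole" whose interior is not meshed, the path distance between opposite corners is 2.0 while the Euclidean distance is only √2, which is exactly the discrepancy the method exploits.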
Expression editing is an important step in realistic facial expression animation, and this paper proposes a spatio-temporal editing method for facial expression animation. We use Laplacian deformation to propagate the user's edits of the expression feature points across the whole face model in the spatial domain. At the same time, an edit made to one frame is propagated in the temporal domain to neighboring expression frames of the animation with a Gaussian falloff. During editing, the user may specify the temporal propagation range, providing local control over the extent of the edit.
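The temporal side of the propagation can be sketched as follows (the exact weighting in the thesis is not reproduced here; this assumes a per-vertex displacement `delta` already produced by the spatial Laplacian step, and a user-chosen `radius` bounding the propagation range):

```python
import numpy as np

def propagate_edit(anim, frame, delta, sigma=2.0, radius=5):
    """Spread an edit applied at `frame` to neighbouring frames with a
    Gaussian falloff; `radius` bounds the temporal propagation range.

    anim  : (num_frames, n, d) vertex trajectories of the animation
    delta : (n, d) displacement the user applied at `frame`
    """
    out = anim.copy()
    lo = max(0, frame - radius)
    hi = min(len(anim), frame + radius + 1)
    for f in range(lo, hi):
        w = np.exp(-((f - frame) ** 2) / (2.0 * sigma ** 2))
        out[f] += w * delta                  # full edit at `frame`, decaying away
    return out
```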
A facial expression editing technique based on two-dimensional deformation is proposed. During editing, we preserve the shape and proportions of each triangle of the face model so that the total deformation is minimized. At the same time, based on observations of how facial expressions change, we constrain the total edge length of the face's outer contour to remain constant during deformation, synthesizing natural, realistic expressions. The technique can also be applied in the early stages of garment design to automatically compute a garment's shape under new poses, sparing designers the simple, repetitive redrawing of similar garments and providing an aid both for design work and for communicating ideas with others, improving efficiency. We use the human skeleton as the driving element of garment deformation: the control points are obtained automatically from the skeleton in its initial pose, their target positions are computed from the skeleton in the new pose, and these drive the garment's deformation into the new pose.
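The triangle-shape-preserving idea can be illustrated with a compact 2D "as-similar-as-possible" least-squares sketch using complex coordinates, where a similarity transform is a complex-linear map. This is an illustration only: the thesis's actual formulation also fixes the total edge length of the outer contour, which is omitted here.

```python
import numpy as np

def asap_deform(pts, tris, handles, w=1e3):
    """Similarity-preserving ("as-similar-as-possible") 2D deformation.

    pts     : (n, 2) rest positions
    tris    : list of (i, j, k) triangle index triples
    handles : {vertex_index: (x, y)} target positions (control points)
    """
    z = pts[:, 0] + 1j * pts[:, 1]
    n = len(z)
    rows, rhs = [], []
    for i, j, k in tris:
        # A similarity maps the triangle so that this ratio is preserved.
        c = (z[j] - z[i]) / (z[k] - z[i])
        row = np.zeros(n, dtype=complex)
        row[i], row[j], row[k] = c - 1.0, 1.0, -c
        rows.append(row)
        rhs.append(0.0)
    for v, (x, y) in handles.items():        # soft positional constraints
        row = np.zeros(n, dtype=complex)
        row[v] = w
        rows.append(row)
        rhs.append(w * (x + 1j * y))
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return np.c_[sol.real, sol.imag]
```

With two handles of a single triangle translated rigidly, the free vertex follows the same translation, since that is the unique similarity satisfying both constraints.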
We tested each of the above methods on multiple face models and obtained good experimental results. Finally, we summarize the research, analyze the remaining problems, and point out possible directions for future work.
【Degree-granting institution】: Zhejiang University
【Degree level】: Doctorate
【Year awarded】: 2012
【Classification number】: TP391.41



