Research on Stable Virtual Viewpoint Generation for 3D Video Based on Optical Flow
Published: 2018-04-14 16:01
Topics: 3D; optical flow tracking. Source: Master's thesis, Beijing University of Posts and Telecommunications, 2016.
[Abstract]: The rapid development of 3D technology has made 3D video widely popular. Compared with glasses-based 3D, autostereoscopic (glasses-free) 3D offers a more comfortable viewing experience. Autostereoscopic display schemes typically generate multiple intermediate viewpoints from an existing binocular 3D video and then synthesize the generated views into the final 3D video. However, while generating multiple intermediate viewpoints, jitter of varying severity often appears because of factors such as unstable stereo matching. How to generate stable intermediate viewpoints effectively and guarantee their continuity is therefore a meaningful research problem. Building on a study of virtual viewpoint generation techniques, this thesis analyzes the causes of instability in virtual viewpoint generation and, targeting the main cause, uses optical flow tracking to suppress jitter. Two schemes for stable virtual viewpoint generation are proposed. The specific research contents and results are as follows:

1. A stable virtual viewpoint generation scheme based on sparse optical flow tracking. The scheme ensures stability in three respects: feature point extraction, intermediate viewpoint generation, and longitudinal (frame-to-frame) tracking of the feature points. The SIFT algorithm yields a relatively stable set of feature points; combining Delaunay triangulation with Direct3D enables effective, fast generation of intermediate viewpoints; and, to preserve stable continuity between viewpoints generated across successive frames, the Lucas-Kanade sparse optical flow method is used to track the feature point set, with an image pyramid incorporated so that the method better handles large, discontinuous motion. To keep the tracking reliable, a tracking period is further introduced. Experiments show that this scheme is fast, produces intermediate viewpoints of relatively high quality, and effectively reduces instability and discontinuity of the intermediate viewpoints. Because it sidesteps difficult problems such as depth map generation and hole filling, it is convenient for practical use. (A sketch of the tracking step appears below.)

2. A stable virtual viewpoint generation scheme based on dense optical flow tracking. The scheme comprises optical flow tracking between the left and right images of a stereo pair, intermediate viewpoint generation, inter-frame tracking, and horizontal disparity computation. The TV-L1 optical flow method is used for tracking, and combining it with an image pyramid substantially improves its stability. Applying this optical flow method between the left and right images yields the horizontal disparity of the stereo pair, and multiple stable intermediate viewpoints are generated with a disparity-shift method. The same optical flow method is also applied between video frames, and the horizontal disparity between the current frame's left and right images is computed from the motion information between consecutive frames. To keep inter-frame tracking reliable, a tracking period is likewise introduced. To address the high computational cost of dense tracking, the original images are first downsampled; after the intermediate viewpoints are generated, interpolation restores them to the original size. Experiments show that this scheme produces good intermediate viewpoints, effectively compensates for the blurring that the sparse method exhibits in some regions, and also provides a degree of stable continuity between video frames. (A sketch of the view synthesis step appears below.)

The two schemes each have their own strengths: both can effectively generate stable, continuous intermediate viewpoints, and they complement each other in certain scenarios, which gives the work practical significance.
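To make the sparse scheme concrete, here is a minimal sketch of the tracking step it describes: SIFT feature points tracked frame to frame with pyramidal Lucas-Kanade optical flow, refreshed every tracking period. It assumes OpenCV (opencv-python 4.4+ for `cv2.SIFT_create`); the window size, pyramid depth, and period length are illustrative values, not parameters taken from the thesis.

```python
import cv2
import numpy as np

TRACK_PERIOD = 10  # re-detect SIFT features every N frames (illustrative value)
LK_PARAMS = dict(
    winSize=(21, 21),
    maxLevel=3,  # maxLevel > 0 enables the coarse-to-fine image pyramid
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01),
)

def detect_sift_points(gray):
    """Detect SIFT keypoints and return them as an (N, 1, 2) float32 array for LK."""
    sift = cv2.SIFT_create()
    keypoints = sift.detect(gray, None)
    return np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)

def track_sequence(frames):
    """Track feature points through a frame sequence, refreshing them each tracking period."""
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    points = detect_sift_points(prev_gray)
    tracks = [points]
    for i, frame in enumerate(frames[1:], start=1):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if i % TRACK_PERIOD == 0 or len(points) == 0:
            points = detect_sift_points(gray)  # start a new tracking period
        else:
            next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, points, None, **LK_PARAMS)
            points = next_pts[status.ravel() == 1].reshape(-1, 1, 2)  # keep tracked points
        tracks.append(points)
        prev_gray = gray
    return tracks
```

A deeper pyramid (larger `maxLevel`) is what lets the Lucas-Kanade tracker cope with the large, discontinuous motion mentioned above, while re-detecting features each tracking period bounds the drift that accumulates along a track.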
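For the dense scheme, the following sketch strings together the steps listed above: TV-L1 optical flow between the left and right images, the horizontal flow component used as disparity, and disparity-shift synthesis of several intermediate viewpoints, with downsampling before the flow computation and interpolation back to the original resolution afterwards. It assumes opencv-contrib-python for the TV-L1 implementation (`cv2.optflow.createOptFlow_DualTVL1`); the backward-warp approximation, the scale factor, and the omission of occlusion/hole handling are simplifications for illustration, not details from the thesis.

```python
import cv2
import numpy as np

def intermediate_views(left, right, n_views=3, scale=0.5):
    """Synthesize n_views intermediate viewpoints between a left/right pair via TV-L1 flow."""
    # 1. Downsample both images to reduce the cost of dense tracking.
    small_l = cv2.resize(left, None, fx=scale, fy=scale, interpolation=cv2.INTER_AREA)
    small_r = cv2.resize(right, None, fx=scale, fy=scale, interpolation=cv2.INTER_AREA)
    gray_l = cv2.cvtColor(small_l, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(small_r, cv2.COLOR_BGR2GRAY)

    # 2. Dense TV-L1 optical flow from left to right; its x component acts as disparity.
    tvl1 = cv2.optflow.createOptFlow_DualTVL1()
    flow = tvl1.calc(gray_l, gray_r, None)
    disparity = flow[..., 0]

    # 3. Disparity-shift synthesis: warp the left image by a fraction of the disparity.
    h, w = gray_l.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    views = []
    for i in range(1, n_views + 1):
        alpha = i / (n_views + 1)  # 0 = left view, 1 = right view
        # Approximate backward warp: sample the left image at x - alpha * d(x).
        # Occlusions and holes are ignored in this sketch.
        map_x = xs - alpha * disparity
        view = cv2.remap(small_l, map_x, ys, cv2.INTER_LINEAR)
        # 4. Interpolate back up to the original resolution.
        views.append(cv2.resize(view, (left.shape[1], left.shape[0]),
                                interpolation=cv2.INTER_CUBIC))
    return views
```

Intermediate positions are parameterized by alpha = i / (n_views + 1), so alpha = 0 corresponds to the left view and alpha = 1 to the right; downsampling by `scale` before the TV-L1 call is where the runtime saving of the dense scheme comes from in this sketch.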
[Degree-granting institution]: Beijing University of Posts and Telecommunications
[Degree level]: Master's
[Year conferred]: 2016
[CLC number]: TP391.41
Article ID: 1750008
Link: https://www.wllwen.com/kejilunwen/ruanjiangongchenglunwen/1750008.html