
Design and Implementation of a Precise Timing System Based on a Dual-Mode Navigation Receiver


Title keywords: Design and Implementation of a Precise Timing System Based on a Dual-Mode Navigation Receiver. Source: North China University of Technology, 2014 master's thesis. Type: degree thesis.


Related topics: satellite navigation receiver timing; clock bias; phase-locked loop; prediction; pulse-per-second (PPS)


[Abstract]: With the rapid development of modern science and technology, industries are demanding ever higher accuracy in time and frequency. As the major global navigation satellite systems mature, satellite timing, with its wide coverage and high precision, plays an important role in the timing field. A timing receiver works by receiving satellite signals referenced to atomic clock sources and outputting a time-scale signal (pulse-per-second, PPS) synchronized to the satellite clock source for use by other systems. Most timing receivers currently deployed are GPS-only; to improve timing reliability and avoid over-reliance on GPS, this thesis proposes a dual-mode BDS/GPS timing system.

The thesis aims to improve timing accuracy. It analyzes the factors that degrade timing accuracy and proposes measures to mitigate them. In a timing receiver, the PPS is adjusted using the receiver clock bias, so the accuracy of the clock-bias solution directly determines timing accuracy. First, based on the satellite timing principle and the pseudorange equations, the error sources affecting the clock-bias solution are analyzed, and their influence is reduced through error-correction models or other countermeasures. Second, to further improve the accuracy of the clock bias and the precision of PPS adjustment, a clock-bias calibration method based on a phase-locked loop is adopted: the clock-bias solution equation serves as the phase detector, the solved clock bias passes through a third-order loop filter, the filtered clock bias steers the PPS generation, and the resulting local sampling epoch is fed back as the phase-detector input. Finally, so that the receiver can maintain timing accuracy in holdover mode when satellite signals are abnormal, the short-term stability of the crystal oscillator is exploited: a prediction model built from a sequence of clock-bias estimates forecasts the clock bias over short intervals, and the predicted clock bias steers the PPS to produce the holdover time-scale signal. Tests show that, compared with steering directly from the solved clock bias, the closed timing loop nearly doubles timing accuracy, to within 50 ns. Simulations further verify that an AR(p) model can predict the clock bias accurately over short intervals, with errors within 5 ns.
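The closed-loop calibration described above can be illustrated with a small simulation. This is a minimal sketch, not the thesis's implementation: the solved clock bias acts as the phase-detector output, a proportional + integral + double-integral filter stands in for the third-order loop filter, and the accumulated steering of the PPS epoch closes the loop. All gains and the simulated oscillator parameters (1 µs initial bias, 5 ns/s drift) are illustrative assumptions.

```python
class ThirdOrderLoopFilter:
    """Proportional + integral + double-integral filter: an illustrative
    stand-in for the thesis's third-order loop filter (gains are guesses)."""

    def __init__(self, kp=0.5, ki=0.05, ka=0.002):
        self.kp, self.ki, self.ka = kp, ki, ka
        self.i1 = 0.0  # first integrator state
        self.i2 = 0.0  # second integrator state

    def update(self, err):
        self.i1 += err
        self.i2 += self.i1
        return self.kp * err + self.ki * self.i1 + self.ka * self.i2


def steer_pps(n_epochs=500):
    """Simulate steering the local PPS epoch from the solved clock bias.

    The local oscillator is modeled as a fixed offset plus linear drift
    (both in ns). Each epoch, the 'phase detector' is the solved clock
    bias minus the accumulated steering; the filtered bias updates the
    steering, and the residual is what would remain on the output PPS.
    """
    offset, drift = 1000.0, 5.0  # ns initial bias, ns/s oscillator drift
    correction = 0.0             # accumulated PPS steering, ns
    filt = ThirdOrderLoopFilter()
    residuals = []
    for k in range(n_epochs):
        true_bias = offset + drift * k
        measured = true_bias - correction  # phase-detector output
        correction += filt.update(measured)
        residuals.append(measured)
    return residuals


residuals = steer_pps()
```

Because the loop contains three integrators in total (the correction accumulator plus the two filter integrators), it tracks a linearly drifting clock with zero steady-state error, which is why a third-order filter suits an oscillator with frequency drift.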
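The holdover prediction can be sketched in the same spirit. This is a minimal illustration of AR(p) fitting by least squares and short-horizon extrapolation, not the thesis's model: the synthetic clock-bias series (100 ns offset, 5 ns/s drift, no noise) and the choice p = 2 are assumptions for demonstration only.

```python
import numpy as np


def fit_ar(x, p):
    """Least-squares fit of AR(p): x[t] ~ a[0]*x[t-1] + ... + a[p-1]*x[t-p]."""
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])  # lag rows
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a


def predict_ar(x, a, steps):
    """Extrapolate the series `steps` epochs ahead with fitted coefficients."""
    hist = list(x)
    preds = []
    for _ in range(steps):
        nxt = sum(a[i] * hist[-1 - i] for i in range(len(a)))
        hist.append(nxt)
        preds.append(nxt)
    return preds


# Illustrative clock-bias series (ns): 100 ns offset plus 5 ns/s drift.
# Real oscillator data would also contain noise and ageing terms.
bias = np.array([100.0 + 5.0 * t for t in range(50)])
coeffs = fit_ar(bias, p=2)               # recovers x[t] = 2*x[t-1] - x[t-2]
forecast = predict_ar(bias, coeffs, steps=10)
```

On real data the model order p and the length of the clock-bias history would be chosen so that predictions stay accurate only over the short interval where the oscillator is stable, which matches the thesis's use of the prediction purely for holdover.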
[Degree-granting institution]: North China University of Technology
[Degree level]: Master's
[Year conferred]: 2014
[CLC number]: P228.4

