
Research on 2-D Magnetotelluric Forward and Inverse Modeling with Finite Elements Based on Biquadratic Interpolation for Complex Structures

[Abstract]: Magnetotelluric (MT) sounding is an effective method that exploits the electrical property contrasts of rocks and ores and uses the natural alternating electromagnetic field to study the electrical structure of the Earth. Scholars at home and abroad have proposed many forward and inverse numerical modeling schemes for this method and obtained good results. Addressing some open issues in current 2-D MT forward and inverse modeling, and accounting for both resistivity and magnetic permeability anomalies throughout the forward and inverse process, this thesis investigates MT numerical simulation based on a secondary-field algorithm. Starting from Maxwell's equations, secondary-field equations of the magnetotelluric method that include both resistivity and permeability anomalies are derived. To improve accuracy and efficiency and to simplify the generation of computational nodes, Green's theorem is applied to the source term of the secondary-field equations, and model trial calculations are carried out with the secondary-field algorithm. A finite element method with triangular elements organized by a binary-tree structure is adopted, element stiffness matrix expressions for bilinear and biquadratic interpolation are derived, and resistivity-anomaly and permeability-anomaly models are computed separately. The numerical results show that the secondary-field algorithm requires no meshing of the source region, which effectively reduces the computation domain and improves efficiency; compared with the total-field algorithm its boundary treatment is simpler; it also helps improve accuracy and removes the instability of low-frequency calculations. The trial calculations further show that even a small change in magnetic permeability has a clear effect on the controlled-source apparent resistivity, increasing the apparent resistivity.
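The secondary-field formulation needs a primary (background) field. In a generic 2-D TE-mode form, assuming an e^{+iwt} convention and uniform permeability, the decomposition E = E_p + E_s gives Laplacian(E_s) - iwμσE_s = iwμ(σ - σ_p)E_p, so the anomalous conductivity acting on the primary field plays the role of the source term; the thesis' equations additionally carry permeability-anomaly terms and may use different conventions. The primary field is commonly taken from a homogeneous or layered half-space, and the following Python sketch (not the thesis code; the layer values are hypothetical) shows the standard 1-D MT impedance recursion and the resulting apparent resistivity and phase:

# Minimal sketch (not the thesis code): 1-D layered-earth MT response via the
# standard impedance recursion, assuming an e^{+i*omega*t} time convention and
# uniform free-space permeability. In a secondary-field scheme a background
# model like this typically supplies the primary field.
import numpy as np

MU0 = 4e-7 * np.pi  # free-space magnetic permeability (H/m)

def mt1d_apparent_resistivity(freqs, resistivities, thicknesses):
    """Apparent resistivity (ohm-m) and impedance phase (deg) of a layered half-space.

    resistivities : ohm-m, one value per layer (last layer = basement half-space)
    thicknesses   : m, one value per layer except the basement
    """
    rho_a, phase = [], []
    for f in np.atleast_1d(freqs):
        omega = 2.0 * np.pi * f
        sigma = 1.0 / np.asarray(resistivities, dtype=float)
        k = np.sqrt(1j * omega * MU0 * sigma)   # wavenumber in each layer
        z_intr = 1j * omega * MU0 / k           # intrinsic impedance of each layer
        Z = z_intr[-1]                          # start the recursion at the basement
        for j in range(len(thicknesses) - 1, -1, -1):
            t = np.tanh(k[j] * thicknesses[j])
            Z = z_intr[j] * (Z + z_intr[j] * t) / (z_intr[j] + Z * t)
        rho_a.append(abs(Z) ** 2 / (omega * MU0))
        phase.append(np.degrees(np.angle(Z)))
    return np.array(rho_a), np.array(phase)

# Hypothetical 3-layer example: 100 ohm-m over 10 ohm-m over a 1000 ohm-m basement.
rho, phi = mt1d_apparent_resistivity(np.logspace(-3, 3, 7),
                                     [100.0, 10.0, 1000.0],   # ohm-m per layer
                                     [1000.0, 2000.0])        # m, all layers but basement
print(np.round(rho, 1))
print(np.round(phi, 1))

For a homogeneous half-space the recursion returns an apparent resistivity equal to the true resistivity and a 45-degree phase, which is a convenient sanity check before such a 1-D response is used as the primary field.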
Building on the regularized inversion theory proposed by Tikhonov, the thesis then addresses the minimization of the objective function and the determination of the regularization factor: several commonly used optimization methods are analyzed, and a quasi-Newton method is chosen to minimize the objective function. A modified "L-Curve" method is then used to determine the regularization factor, which improves the accuracy with which it is selected and raises both the accuracy and the speed of the inversion, as verified on numerical examples.
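As a generic illustration of this inversion machinery, the sketch below solves a toy linear problem with a Tikhonov-regularized objective, a quasi-Newton minimizer (SciPy's L-BFGS-B), and a regularization factor picked from the corner of an L-curve. The operator G, the roughness matrix L, the noise level, and the plain maximum-curvature corner rule are illustrative assumptions; they stand in for the thesis' MT forward operator, model constraints, and modified L-curve criterion.

# Minimal sketch (not the thesis implementation): Tikhonov-regularized inversion of a
# toy linear problem, minimized with a quasi-Newton method (SciPy L-BFGS-B), with the
# regularization factor chosen at the L-curve corner (maximum curvature).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 40
G = np.exp(-0.05 * (np.arange(n)[:, None] - np.arange(n)[None, :]) ** 2)  # smoothing "forward operator"
m_true = np.zeros(n); m_true[10:20] = 1.0                                 # blocky "true model"
d = G @ m_true + 0.01 * rng.standard_normal(n)                            # noisy "data"
L = (np.eye(n) - np.eye(n, k=1))[:-1]                                     # first-difference roughness matrix

def invert(lam):
    """Minimize ||G m - d||^2 + lam * ||L m||^2 with a quasi-Newton method."""
    def phi(m):
        r, s = G @ m - d, L @ m
        return r @ r + lam * (s @ s)
    def grad(m):
        return 2.0 * (G.T @ (G @ m - d) + lam * (L.T @ (L @ m)))
    return minimize(phi, np.zeros(n), jac=grad, method="L-BFGS-B").x

# L-curve: scan regularization factors and pick the point of maximum curvature
lams = np.logspace(-4, 2, 25)
models = [invert(lam) for lam in lams]
x = np.log10([np.linalg.norm(G @ m - d) for m in models])   # data-misfit axis
y = np.log10([np.linalg.norm(L @ m) for m in models])       # model-roughness axis
t = np.log10(lams)
dx, dy = np.gradient(x, t), np.gradient(y, t)
ddx, ddy = np.gradient(dx, t), np.gradient(dy, t)
curvature = np.abs(dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5
print("chosen regularization factor:", lams[np.argmax(curvature)])

The chosen factor balances data misfit against model roughness; the thesis' modified L-curve targets the same corner but, as stated in the abstract, with improved selection accuracy for the MT problem.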
【Degree-granting institution】: East China University of Technology
【Degree level】: Master's
【Year of degree conferral】: 2015
【CLC number】: P631.325



Permalink: https://www.wllwen.com/kejilunwen/diqiudizhi/2408257.html

