
Research and Implementation of a Parallel Algorithm for the ATMI Simulator on the GPU Platform

Published: 2018-09-06 17:26
[Abstract]: The steadily rising temperature of modern processors has become one of the main bottlenecks in chip design. The most common methods for solving the heat equation are the finite difference method (FDM) and the finite element method (FEM), both of which can model a physical system in detail. However, modeling microprocessor temperature with FDM or FEM requires a large number of mesh nodes, which in turn means long computation times. The ATMI simulator instead solves the heat equation with the earliest of these approaches, the analytical method (AM), whose advantage is that it reduces the number of nodes required and thereby speeds up the computation to some extent. ATMI's efficiency, reliability, and ease of use have made it widely adopted in research. Experiments show, however, that when ATMI simulates the temperature of a processor with a dozen or so cores, the thermal-response phase of the computation takes about a week. As technology advances and processors continue to evolve, ATMI is therefore no longer suitable for simulating the temperature of future processors.

This thesis uses CUDA to carry out the preliminary work of porting ATMI to the GPU platform, covering the following aspects. First, building on an in-depth study of ATMI's serial algorithm, we focus on the algorithm of its thermal-response phase and, drawing on the characteristics of the GPU architecture and of CUDA programming, optimize the original serial thermal-response algorithm to make it better suited to GPU programming. Second, we implement integration and Bessel-function routines on the GPU; because of limited thread-level parallelism and a complex program structure, their measured performance falls short of the original serial versions. These two routines can nonetheless be called directly as library functions when solving the thermal-response equation on the GPU platform. Finally, we propose a parallel algorithm for the thermal-response phase on the GPU and build the framework of the complete parallel algorithm. The parallel program is designed at the task level, with each thread responsible for the computation between one pair of heat sources.

This thesis is devoted to the study and exploration of simulators on the GPU platform. It aims to address the problems encountered in developing multithreaded GPU applications and to provide reusable experience for future research on temperature-related problems in multi-core processors.
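To make the abstract's contrast concrete: the analytical method avoids volumetric meshing by superposing closed-form solutions for individual heat sources. The sketch below is a minimal, hypothetical illustration — not ATMI's actual formulas, which are considerably more elaborate — using the steady-state point-source Green's function T = q / (4πkr) in an infinite medium; all names and parameters here are assumptions for illustration only.

```python
import math

def point_source_temp(q, k, dx, dy, dz):
    """Steady-state temperature rise (K) from a point source of power q (W)
    in an infinite medium of thermal conductivity k (W/m/K), at offset
    (dx, dy, dz) from the source: T = q / (4 * pi * k * r)."""
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    return q / (4.0 * math.pi * k * r)

def superpose(sources, probe, k):
    """Temperature at `probe` is the sum of each source's closed-form
    contribution -- no mesh nodes are needed, only one evaluation per
    source, which is why the analytical method needs so few 'nodes'."""
    x, y, z = probe
    return sum(point_source_temp(q, k, x - sx, y - sy, z - sz)
               for (sx, sy, sz, q) in sources)
```

With q = 4πk and r = 1 the formula gives exactly 1 K of temperature rise, which makes the superposition easy to sanity-check by hand.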
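The abstract mentions porting integration and Bessel-function routines to the GPU as reusable library functions. As an illustration of what such routines compute, here is a pure-Python stand-in; the thesis's versions are CUDA kernels, and the particular series and quadrature choices below are assumptions, not ATMI's code.

```python
import math

def bessel_j0(x, terms=30):
    """Ascending series for the Bessel function J0(x):
    sum over m of (-1)^m * (x/2)^(2m) / (m!)^2."""
    s = 0.0
    for m in range(terms):
        s += (-1.0) ** m * (x / 2.0) ** (2 * m) / math.factorial(m) ** 2
    return s

def integrate(f, a, b, n=1000):
    """Composite trapezoidal rule over [a, b] with n subintervals --
    a simple example of the kind of quadrature a thermal kernel calls."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h
```

Both routines are embarrassingly parallel in principle (independent series terms, independent quadrature points), but as the abstract notes, a naive GPU port can still lose to a tuned serial version when per-thread work is small and control flow is complex.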
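The task-level decomposition described above — one thread per pair of heat sources — amounts to mapping a flat CUDA thread index onto a source pair. The Python below emulates that index mapping on the host, assuming unordered pairs (i < j); this is a guess at the pairing scheme, since the abstract does not specify it.

```python
def pair_from_index(t, n):
    """Map a flat thread index t in [0, n*(n-1)/2) to an unordered pair
    (i, j) with i < j over n heat sources, by walking the rows of the
    strictly upper-triangular pair matrix (row i holds n-1-i pairs)."""
    i = 0
    row = n - 1          # number of pairs in row i
    while t >= row:
        t -= row
        i += 1
        row -= 1
    return (i, i + 1 + t)

def all_pairs(n):
    """Enumerate every pair exactly once, as n*(n-1)/2 'threads' would."""
    return [pair_from_index(t, n) for t in range(n * (n - 1) // 2)]
```

In the CUDA version, `t` would come from `blockIdx.x * blockDim.x + threadIdx.x`, and each thread would evaluate the thermal interaction between its two sources independently, which is what makes this decomposition a natural fit for the GPU.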
[Degree-granting institution]: Inner Mongolia University
[Degree level]: Master's
[Year conferred]: 2012
[CLC number]: TP332






Link to this article: https://www.wllwen.com/kejilunwen/jisuanjikexuelunwen/2227042.html


