Research on Assessing Detailed Building Earthquake Damage Losses by Applying Cloud Computing Technology to Big Data
Topic: building earthquake damage loss assessment + cloud computing; Source: master's thesis, Institute of Engineering Mechanics, China Earthquake Administration, 2017
【Abstract】: Building earthquake damage loss assessment is a crucial part of earthquake disaster loss assessment and provides important guidance and reference for post-earthquake rescue and reconstruction in the affected area. Traditional building damage loss assessment collects data by sampling and then bases the assessment on these representative samples; substituting sample data for the full population, however, is a compromise that weakens the accuracy of the results to some extent. With the arrival of the information age, the volume of information in every industry is growing rapidly and data types are becoming increasingly complex; the concept of big data was proposed against this background. Big data is characterized not only by its huge volume but also by its heterogeneity and value. Extracting that value, however, requires convenient, fast, and economical computing tools. Cloud computing is a computing model that achieves low-cost, high-throughput computation by sharing a pool of cloud resources and is currently recognized as the best tool for processing big data. Data relating to earthquake disaster assessment are also growing dramatically, and traditional computing and storage methods are increasingly unable to keep up in processing speed. The "Several Opinions of the CPC Central Committee and the State Council on Further Strengthening Urban Planning and Construction Management" call for "promoting smart urban management: strengthening the intelligent construction of urban management and service systems, promoting the integration of modern information technologies such as big data, the Internet of Things, and cloud computing with urban management services, and raising the level of urban governance and services." Earthquake disaster assessment likewise needs to move toward smart management by introducing modern information technology into this and related fields.
The main work of this thesis is as follows. 1) The domestic and international status of big data and cloud computing applications in seismic data processing is reviewed. After extensive literature research and comparative analysis, Hadoop is selected as the cloud development platform, and, based on the thesis's requirements for data storage and computing speed, three cloud computing technologies are adopted and introduced: the MapReduce programming model, the HDFS distributed file system, and the HBase non-relational database. 2) The shortcomings of current traditional building damage loss assessment methods are analyzed in detail and, in light of the continuing progress in building damage data collection, an idealized method for assessing detailed building earthquake damage losses with cloud computing on big data is designed, with the emphasis on using the full data set in the assessment. The design covers: a) data storage, where an HBase table structure convenient for storage and data operations is designed and a program for batch loading data into HBase is implemented; b) data processing, where the whole assessment procedure is decomposed into steps, an efficient and reliable algorithm flow is designed, and the flow is implemented as MapReduce programs. 3) A large volume of data is imported into the HBase database and, with the database held identical, building damage loss values are computed both on a cloud computing cluster and in traditional single-machine mode, so as to compare the computation speed of the two computing tools under this assessment method.
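As a concrete illustration of design item 2a, the sketch below shows one possible way to lay out per-building damage records in HBase and to batch-load them with the Java client API. The table name building_damage, the info column family, the column qualifiers, and the row-key pattern are illustrative assumptions rather than the thesis's actual schema, and the table is assumed to have been created in advance (for example from the HBase shell).

// Minimal sketch of one possible HBase layout for per-building damage records.
// All names (table, column family, qualifiers, row-key pattern) are assumptions
// for illustration; the thesis's real table design is not reproduced here.
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.BufferedMutator;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class DamageRecordLoader {
    private static final byte[] CF_INFO = Bytes.toBytes("info");

    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             // BufferedMutator batches Puts on the client side, which suits bulk loading
             BufferedMutator mutator = conn.getBufferedMutator(TableName.valueOf("building_damage"))) {

            List<Put> batch = new ArrayList<>();
            // In the real system the records would come from survey files on HDFS;
            // one hard-coded record stands in for that input here.
            String rowKey = "region01_bldg000123";          // regionId + buildingId
            Put put = new Put(Bytes.toBytes(rowKey));
            put.addColumn(CF_INFO, Bytes.toBytes("structure"), Bytes.toBytes("masonry"));
            put.addColumn(CF_INFO, Bytes.toBytes("damage_grade"), Bytes.toBytes("moderate"));
            put.addColumn(CF_INFO, Bytes.toBytes("area_m2"), Bytes.toBytes("85.0"));
            batch.add(put);

            mutator.mutate(batch);   // queue the batched Puts
            mutator.flush();         // force the writes out to the region servers
        }
    }
}

Keying rows by region plus building identifier is one simple choice that keeps the records of a survey area physically close together, which helps the region-wise scans used in the processing step.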
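Design item 2b and the speed comparison in item 3 can be pictured with the following minimal MapReduce sketch, which scans the hypothetical building_damage table, turns each record into a per-region loss contribution (floor area × an assumed replacement cost × an assumed loss ratio for its damage grade), and sums the contributions in the reducer. The column names, cost figure, loss ratios, and row-key convention are assumptions for illustration only and are not taken from the thesis.

// Minimal sketch of a MapReduce job that computes building damage loss per region
// from the (hypothetical) building_damage HBase table. Cost and loss-ratio values
// below are illustrative assumptions, not the thesis's actual parameters.
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DamageLossJob {

    public static class LossMapper extends TableMapper<Text, DoubleWritable> {
        private static final byte[] CF = Bytes.toBytes("info");
        private static final double COST_PER_M2 = 1500.0;  // assumed replacement cost (yuan per m2)
        private static final Map<String, Double> LOSS_RATIO = new HashMap<>();
        static {
            LOSS_RATIO.put("slight", 0.1);
            LOSS_RATIO.put("moderate", 0.3);
            LOSS_RATIO.put("severe", 0.7);
            LOSS_RATIO.put("collapse", 1.0);
        }

        @Override
        protected void map(ImmutableBytesWritable rowKey, Result row, Context context)
                throws IOException, InterruptedException {
            String grade = Bytes.toString(row.getValue(CF, Bytes.toBytes("damage_grade")));
            double area = Double.parseDouble(
                    Bytes.toString(row.getValue(CF, Bytes.toBytes("area_m2"))));
            double loss = area * COST_PER_M2 * LOSS_RATIO.getOrDefault(grade, 0.0);
            // Row keys are assumed to start with the region id, e.g. "region01_bldg000123".
            String region = Bytes.toString(rowKey.get(), rowKey.getOffset(), rowKey.getLength())
                    .split("_")[0];
            context.write(new Text(region), new DoubleWritable(loss));
        }
    }

    public static class SumReducer extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text region, Iterable<DoubleWritable> losses, Context context)
                throws IOException, InterruptedException {
            double total = 0.0;
            for (DoubleWritable l : losses) {
                total += l.get();
            }
            context.write(region, new DoubleWritable(total)); // total loss for this region
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "building-damage-loss");
        job.setJarByClass(DamageLossJob.class);
        TableMapReduceUtil.initTableMapperJob(
                "building_damage", new Scan(), LossMapper.class,
                Text.class, DoubleWritable.class, job);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileOutputFormat.setOutputPath(job, new Path(args[0]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Submitting the same jar first to a single-node Hadoop installation and then to a multi-node cluster over an identical HBase database is one straightforward way to reproduce the kind of speed comparison described in item 3 of the abstract.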
【Degree-granting institution】: Institute of Engineering Mechanics, China Earthquake Administration
【Degree level】: Master's
【Year of award】: 2017
【CLC number】: P315.9; TP311.13