Retrieved 20 similar documents; search time 171 ms.
1.
Proposes MVSMg (Multilevel Visualization and Storage Model based on grid), a grid-based visualization-sharing model for computational combustion, together with a formal definition. The model divides sharing into static sharing and dynamic sharing. Data visualization for computational combustion in a grid environment is organized into multiple levels, helping domain experts conduct research at different depths. Well-performing application modules are registered and stored hierarchically together with the data they produce, so that other users can share them. A dynamic-sharing and registration-management algorithm is also proposed. The result is shared access to both data and application modules, avoiding redundant development and computation.
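The abstract describes registering application modules and caching their outputs so other users avoid recomputation. A minimal sketch of that idea follows; the class and method names are assumptions for illustration, not the paper's actual interfaces:

```python
# Hypothetical sketch: a shared registry that stores application modules
# and caches their results per level, so other users reuse both the
# module and previously computed data instead of recomputing.

class SharedRegistry:
    def __init__(self):
        self.modules = {}          # module name -> callable
        self.results = {}          # (module, args, level) -> cached output
        self.compute_count = 0     # counts actual (non-cached) computations

    def register(self, name, func):
        """Register an application module for other users to share."""
        self.modules[name] = func

    def run(self, name, args, level="raw"):
        """Run a registered module, reusing a cached result when available."""
        key = (name, args, level)
        if key not in self.results:
            self.compute_count += 1
            self.results[key] = self.modules[name](args)
        return self.results[key]

registry = SharedRegistry()
registry.register("average", lambda xs: sum(xs) / len(xs))

first = registry.run("average", (1.0, 2.0, 3.0))
second = registry.run("average", (1.0, 2.0, 3.0))   # served from cache
```

The second call hits the cache, so the module body runs only once.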
5.
Discusses modeling methods for three-dimensional cities and proposes a representation of 3D spatial data based on SOAP (Simple Object Access Protocol) messages in a grid environment. Dynamic grid services supply the data sources for client-side 3D visualization: parsing the SOAP message yields the vector data, attribute data, and texture data needed for 3D visualization, and Java and Java3D are then used to render the city model and support client-side interaction. Finally, an experimental example of grid-service-based 3D city model visualization is given in a Globus grid environment.
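The abstract's key step is extracting vector, attribute, and texture data by parsing a SOAP message. A minimal sketch of that parsing step follows; the payload element names (`Model3D`, `Vectors`, `Attributes`, `Texture`) are assumptions, not the paper's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical SOAP envelope whose body carries vector coordinates,
# attribute records, and a texture reference for one city model.
SOAP_MESSAGE = """\
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Model3D>
      <Vectors>0,0,0 1,0,0 1,1,0</Vectors>
      <Attributes building="library" floors="3"/>
      <Texture href="textures/brick.png"/>
    </Model3D>
  </soap:Body>
</soap:Envelope>
"""

def parse_model(message):
    """Extract vertex list, attributes, and texture path from the envelope."""
    root = ET.fromstring(message)
    model = root.find(".//Model3D")
    vectors = [tuple(float(c) for c in v.split(","))
               for v in model.findtext("Vectors").split()]
    attributes = dict(model.find("Attributes").attrib)
    texture = model.find("Texture").get("href")
    return vectors, attributes, texture

vectors, attributes, texture = parse_model(SOAP_MESSAGE)
```

A client would hand `vectors` and `texture` to the rendering layer (Java3D in the paper) and use `attributes` for interactive queries.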
7.
Analyzes the state of the art in spatial information sharing and, to eliminate "information islands", proposes a grid-based spatial information sharing model. A metadata registration service, a metadata query service, and a spatial data service are designed, and the key techniques are verified experimentally. The results show that the model is technically feasible and can integrate distributed, heterogeneous data resources.
8.
A Grid File Resource Sharing Model Based on Virtual Organizations (cited 1 time: 0 self-citations, 1 by others)
Grid computing focuses on large-scale, highly controlled resource sharing. To address file resource sharing and management in a grid environment, this paper proposes FsvGrid, a grid file resource sharing model. The model introduces a registration-notification mechanism and a message-passing scheme combining deterministic and non-deterministic algorithms, enabling efficient cooperation among grid nodes. A layered structure hides the diversity of file resources; sharing is made more secure and controllable; and a virtual-organization-based management scheme is proposed to tackle the difficulty of managing distributed resources.
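The registration-notification mechanism mentioned in the abstract can be sketched as a registry that notifies subscribed nodes whenever a new file resource is registered. This is an illustrative reading, not FsvGrid's actual design; all names are assumptions:

```python
# Hypothetical sketch of a registration-notification mechanism: nodes
# subscribe to the registry and receive a callback whenever another
# node registers a new shared file resource.

class FileRegistry:
    def __init__(self):
        self.resources = {}        # file name -> owning node
        self.subscribers = []      # notification callbacks

    def subscribe(self, callback):
        """Ask to be notified of future registrations."""
        self.subscribers.append(callback)

    def register(self, filename, node):
        """Register a shared file and notify all subscribers."""
        self.resources[filename] = node
        for notify in self.subscribers:
            notify(filename, node)

seen = []
registry = FileRegistry()
registry.subscribe(lambda f, n: seen.append((f, n)))
registry.register("results.dat", "node-7")
```

After `register` runs, every subscriber knows which node owns `results.dat` without polling.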
9.
Reviews the concepts and latest progress of grid computing, and on that basis proposes the concept, characteristics, and architecture of a geological point-source information system grid that merges the computational grid, data grid, and service grid. The aim is to offer a new solution for point-source geological and mineral information, whose sharing lags far behind the pace of network technology and therefore goes underused, and to provide new ideas for tackling related problems such as dynamic heterogeneity, resource sharing, information islands, massive data volumes, and processing speed.
10.
Based on the grid-architecture principle that node resources can be shared and used cooperatively, designs a grid data storage system that stores and transfers data across distributed nodes. The system lets grid users upload data locally to the grid; a grid management node administers the nodes contributing shared storage, distributes data blocks of a predetermined size to the appropriate nodes, and responds to user requests. Using hash-table-based routing information, it finds the best path to the requested grid data and activates a grid thread to transfer the data intact between nodes. Development and application on remote sensing data, using the Alchemi grid middleware and the .NET framework, show that dynamic desktop-grid storage is a feasible grid computing application.
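The hash-based routing described above can be sketched as follows: hashing a block identifier deterministically selects the responsible node, so any node can locate a block without broadcasting. The node names and block-ID format are assumptions, and the paper's actual routing table is more elaborate:

```python
import hashlib

# Hypothetical sketch of hash-table routing: fixed-size data blocks are
# assigned to storage nodes by hashing the block identifier.

NODES = ["node-a", "node-b", "node-c"]

def route(block_id, nodes=NODES):
    """Deterministically pick the node responsible for a block."""
    digest = hashlib.sha256(block_id.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

def store(table, block_id, data):
    """Place a block on the node the routing function selects."""
    table.setdefault(route(block_id), {})[block_id] = data

def fetch(table, block_id):
    """Retrieve a block by recomputing its route, with no lookup broadcast."""
    return table[route(block_id)][block_id]

table = {}
store(table, "scene-042/block-0", b"...")
```

Because `route` is a pure function of the block ID, store and fetch always agree on the owning node.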
11.
When visually analyzing the results of large-scale numerical simulations, users must manage large volumes of scientific data. Many excellent visualization and analysis packages exist at home and abroad, but they support only postprocessing and provide no facilities for effectively organizing and managing intermediate outputs and final results. To improve the efficiency of scientific data analysis, this paper designs and implements a visualization, analysis, and management system for scientific data that monitors the intermediate and final results of numerical simulations in real time and organizes them in a standardized way.
12.
Grid computing is a network resource-sharing model that has developed rapidly in recent years, aiming at full sharing of network resources. Compared with traditional C/S and Web GIS approaches, it advances network computing, data processing, resource sharing, and task coordination. Because remote sensing image processing and understanding involve many time-consuming operations such as network computation and transmission, applying grid computing to remote sensing image processing is of particular practical value. Designing and developing a grid computing environment and software suited to remote sensing image processing, exploiting grid computing's strengths in network computation and resource sharing, is one effective way to handle today's massive remote sensing data volumes, and also a development trend for remote sensing image processing software. Drawing on our research results, this paper first presents a design workflow for grid-based remote sensing image processing and then discusses its implementation techniques. Building on an analysis of applications in image processing, it analyzes and designs a network-oriented, intelligent grid processing system for remote sensing images and presents some key techniques for its implementation.
13.
Marian Petre, Journal of Visual Languages and Computing, 2010, 21(3): 171-183
This paper considers the relationship between mental imagery and software visualization in professional, high-performance software development. It presents overviews of four empirical studies of professional software developers in high-performing teams: (1) expert programmers’ mental imagery, (2) how experts externalize their mental imagery as part of teamwork, (3) experts’ use of commercially available visualization software, and (4) what tools experts build themselves, how they use the tools they build for themselves, and why they build tools for themselves. Through this series of studies, the paper provides insight into a relationship between how experts reason about and imagine solutions, and their use of and requirements for external representations and software visualization. In particular, it provides insight into how experts use visualization in reasoning about software design, and how their requirements for the support of design tasks differ from those for the support of other software development tasks. The paper draws on theory from other disciplines to explicate issues in this area, and it discusses implications for future work in this field.
14.
Peng Yanjie (彭彦洁), Digital Design: Surface (数码设计:surface), 2009, (3): 21-23
As the information age matures, information design, defined by interactivity and the graphical presentation of data, is about to undergo a fundamental transformation. Yet information design theory remains rooted in early concepts of professional use and largely ignores the needs of new, non-professional audiences. This paper therefore treats information design as a public-facing practice and explores how design by professionals for professionals can shift toward design for the general public.
15.
Visualization of High-Performance Computing in a Network Environment (cited 8 times: 0 self-citations, 8 by others)
This paper presents a method for scientific visualization in the "Advanced Computational Infrastructure" environment. That environment places high demands on visualization: on one hand, clients observe the results over the Internet without any pre-installed software; on the other, the visualization data produced by high-performance computing are very large and must be presented as dynamic 3D animation. Traditional visualization techniques cannot satisfy both goals at once. This paper builds a 3D visualization environment on Java and Java 3D that installs the client runtime automatically over the Internet and quickly generates visualization images on the client by transmitting code that controls the 3D model. In addition, by combining Applet-Servlet interaction, the effective data are decomposed and transmitted to drive the subsequent motion of the 3D model while the user views the current results, bringing the dynamic quality of cross-network visualization close to that of local execution.
16.
Tide-riding water level computation is an important part of marine environmental information processing, characterized by heavy computation, high complexity, and long run times. Implementing it on a traditional computing cluster suffers from high cost and poor scalability and interactivity. To address these problems, this paper proposes a tide-riding water level computation and visualization platform based on the Spark framework. Building on a study of Spark task scheduling algorithms, it designs and implements a task scheduling algorithm based on node computing capability, enabling parallel retrieval, acquisition, numerical computation, and feature visualization of long-time-series, multi-task tide-riding water level data, and thereby achieving computation and visualization of massive marine environmental data. Experiments show that the platform effectively improves the efficiency of distributed parallel processing of massive tide-riding water level data and offers a new approach to faster and more efficient tide-riding water level computation.
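The abstract mentions a task scheduling algorithm based on node computing capability but does not specify it. One plausible reading is a greedy rule that sends each task to the node with the lowest load-to-capacity ratio, so faster nodes receive proportionally more work. The sketch below illustrates that rule in plain Python (not the actual Spark scheduler; all names and the rule itself are assumptions):

```python
import heapq

# Hypothetical capacity-aware greedy scheduler: assign each task (largest
# first) to the node whose current load/capacity ratio is smallest.

def schedule(tasks, capacities):
    """Assign task costs to nodes, balancing load relative to capacity."""
    heap = [(0.0, node) for node in sorted(capacities)]  # (load/capacity, node)
    heapq.heapify(heap)
    loads = {node: 0.0 for node in capacities}
    assignment = {node: [] for node in capacities}
    for task, cost in sorted(tasks.items(), key=lambda kv: -kv[1]):
        _, node = heapq.heappop(heap)          # least-loaded node (relative)
        loads[node] += cost
        assignment[node].append(task)
        heapq.heappush(heap, (loads[node] / capacities[node], node))
    return assignment

caps = {"fast": 4.0, "slow": 1.0}
plan = schedule({"t1": 3.0, "t2": 2.0, "t3": 1.0}, caps)
```

Here the "fast" node (4x the capacity) absorbs two of the three tasks, keeping the relative loads balanced.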
18.
We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on industry-size software repositories. In each study we use the framework to give answers to one or several software engineering questions addressing a specific project. Next, we validate the answers by comparing them with existing project documentation, by interviewing domain experts and by detailed analyses of the source code. The results show that our framework can be used both for supporting case studies on mining software repository techniques and for building end-user tools for software maintenance support.
Lucian Voinea received a Professional Doctorate in Engineering degree (PDEng) from the Eindhoven University of Technology (Netherlands) in 2003 and a PhD degree in computer science from the same university in 2007. Starting from 1999, he worked as a freelance contractor for companies in Romania, the Netherlands, and the US. His research interests include methods, technologies and tools for the analysis of quality attributes of large software systems, and in particular the analysis of software evolution. He recently co-founded SolidSource, a start-up company specializing in tools and services for the maintenance of software systems. Alexandru Telea received his PhD in 2000 from the Eindhoven University of Technology in the Netherlands. He worked at the same university as an assistant professor in data visualization until 2007, when he received an adjunct professor position in software visualization from the University of Groningen, the Netherlands. He has pioneered several innovative methods for visualizing complex information related to software systems, reverse engineering, and software evolution. He has been the lead developer and architect of several software systems for reverse engineering, data visualization, visual programming, and component-based development. He has published over 100 articles and one book in the above fields.
19.
Scientific computing at the petascale level enables us to answer many difficult scientific questions, but the resulting data are too large to store and study directly with conventional postprocessing visualization tools. This problem will only become more severe as we reach exascale computing. A plausible, attractive solution involves processing data in situ with the simulation to reduce the data that must be transferred over networks and stored and to prepare the data for more cost-effective postprocessing visualization. The data could be reduced with compression, feature extraction, and visualization methods. This article discusses critical issues in realizing in situ visualization and data reduction and suggests important research directions.
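The in situ reduction idea above can be illustrated with a toy simulation loop: rather than writing every full field to disk, each timestep is summarized while still in memory. The simulation stand-in and the choice of min/max/mean as the reduction operator are assumptions purely for illustration:

```python
# Hypothetical sketch of in situ data reduction: summarize each
# timestep's field to a few scalars instead of storing the full array.

def simulate_step(t, n=1000):
    """Stand-in for one simulation timestep producing a large field."""
    return [((i * 7 + t) % 100) / 10.0 for i in range(n)]

def reduce_in_situ(field):
    """Summarize a field to a few scalars while it is still in memory."""
    return {"min": min(field), "max": max(field),
            "mean": sum(field) / len(field)}

# Only the summaries survive the run; the full fields are discarded,
# shrinking each 1000-value timestep to 3 numbers.
summaries = [reduce_in_situ(simulate_step(t)) for t in range(5)]
```

Real in situ pipelines would apply richer operators (compression, feature extraction, rendered images), but the structural point is the same: reduction happens inside the simulation loop, before any I/O.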