Similar Documents
20 similar documents found (search time: 193 ms)
1.
Visualized Computation of Chaotic Solutions of the van der Pol Equation on a Torus   Cited: 2 (self: 0, others: 2)
This paper studies graphical modeling and visualized computation of the nonlinear van der Pol equation on a torus. Within a graphical environment, the whole process for the toroidal van der Pol equation, from modeling and experimentation to result analysis, is modeled and computed visually, and a visual simulation framework for comprehensive automatic analysis of the system's trajectories is established. The method requires no traditional program code for the model or algorithms, and it conveniently supports automatic multi-parameter iteration experiments and intelligent analysis of the system.
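A minimal sketch, not taken from the cited paper, of the kind of computation such a visual framework automates: numerically integrating a forced van der Pol oscillator and sampling a Poincaré section of its trajectory. The damping, forcing amplitude, and forcing frequency used here are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: integrate a forced van der Pol oscillator and sample the
# trajectory once per forcing period (Poincare section), as a visualization
# framework for chaotic solutions might do. Parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

mu, A, omega = 5.0, 5.0, 2.463   # damping, forcing amplitude, forcing frequency (assumed)

def van_der_pol(t, y):
    x, v = y
    return [v, mu * (1.0 - x**2) * v - x + A * np.cos(omega * t)]

sol = solve_ivp(van_der_pol, (0.0, 200.0), [0.1, 0.0],
                max_step=0.05, dense_output=True)

# Sample once per forcing period, skipping the initial transient
T = 2.0 * np.pi / omega
t_samples = np.arange(50.0, 200.0, T)
points = sol.sol(t_samples)              # shape (2, n_samples): (x, v) section points
print(points.T[:5])
```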

2.
Graphical Modeling and Visual Simulation of Chaotic Dynamical Systems   Cited: 3 (self: 1, others: 2)
This paper studies graphical modeling and visual simulation of nonlinear chaotic dynamical systems. By establishing a unified visual simulation experiment framework, a highly integrated environment for graphical modeling and visual simulation of chaotic systems is realized. Compared with computer simulation of chaos based on conventional high-level programming languages, the method avoids the traditional programming and debugging of chaos-equation algorithms, and it also supports automatic perturbation analysis and optimal design of chaos parameters.

3.
This paper studies graphical modeling and visual simulation of nonlinear chaotic dynamical systems. By establishing a unified visual simulation experiment framework, a highly integrated environment for graphical modeling and visual simulation of chaotic systems is realized. Compared with computer simulation of chaos based on conventional high-level programming languages, the method avoids the traditional programming and debugging of chaos-equation algorithms, and it also supports automatic perturbation analysis and optimal design of chaos parameters.

4.
Research on Implementing Simulation Visualization Based on RSView32 Configuration Technology   Cited: 1 (self: 0, others: 1)
Yu Bin, Li Renfa. Computer Simulation, 2005, 22(6): 210-213
Visual simulation is one of the research directions of system simulation. The configuration (SCADA) software commonly used in industrial control offers rich graphics capabilities, while dedicated simulation tools offer powerful modeling and computing capabilities; both are indispensable when designing the visualization of simulation results. This paper first describes a scheme for implementing visual simulation in detail, including the steps for ActiveX communication between the configuration software and the simulation tool, and the process of implementing animation control in the configuration software's built-in VBA editor. Finally, through an example, a well-designed human-machine interface is built within the configuration software to visualize the simulation. The work explores a new approach to realizing simulation visualization.

5.
By constructing a unified graphical automatic solving environment, the whole process for strongly nonlinear system equations, from modeling and experimentation to result analysis, is solved and visualized. The approach emphasizes graphical processing of data and is capable of intelligent, automatic multi-parameter iterative computation for complex systems.

6.
Visual Simulation of MC-CDMA Systems   Cited: 1 (self: 0, others: 1)
CDMA based on parallel multicarrier transmission is attracting wide attention for its many advantages and has become a hot topic in mobile communication research. Based on interactive graphical modeling techniques and focusing on the multicarrier transmission mechanism, this paper studies graphical modeling and visual simulation of MC-CDMA systems and presents system simulation results.
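A minimal sketch, not taken from the cited paper, of the MC-CDMA transmit/receive principle the abstract refers to: one data symbol is spread across subcarriers by a user's Walsh code and modulated with an IFFT. The code length, subcarrier count, and ideal-channel receiver are illustrative assumptions.

```python
# Minimal sketch of the MC-CDMA principle: one data symbol is copied onto N
# subcarriers, each copy weighted by one chip of a user's spreading code, then
# an IFFT forms the time-domain multicarrier signal. Sizes are illustrative.
import numpy as np

def walsh_hadamard(n):
    """Walsh-Hadamard spreading codes of length n (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 8                                  # subcarriers = spreading-code length (assumed)
codes = walsh_hadamard(N)
user_code = codes[3]                   # one user's spreading sequence

symbol = 1 - 2 * np.random.randint(2)  # random BPSK data symbol (+1 / -1)
freq_domain = symbol * user_code       # frequency-domain spreading across subcarriers
tx_signal = np.fft.ifft(freq_domain)   # multicarrier (OFDM-like) time-domain signal

# Matched receiver over an ideal channel: FFT back and despread with the same code
rx = np.fft.fft(tx_signal)
decision = np.real(rx @ user_code) / N
print(symbol, int(np.sign(decision)))  # decision recovers the transmitted symbol
```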

7.
The growing adoption of computer visualization has turned virtual reality into a new discipline. 3D scene modeling is widely applied in scientific computing, artificial-intelligence simulation, 3D graphics production, and other areas. OpenGL is a software interface to graphics hardware and a recognized high-performance graphics standard; OpenGL-based graphical modeling combined with 3DSmax makes it possible to build 3D scenes quickly. To show how to use this technology efficiently, this paper first analyzes visual virtual-reality technology and then discusses OpenGL-based 3D scene construction.

8.
This work studies visual modeling techniques and software implementation for a parameter-optimization environment for complex nonlinear systems. Applying advanced simulation technology with object-oriented and structured design methods, a highly visual, integrated environment for automatic parameter optimization of complex systems is realized on general-purpose software. Compared with parameter-optimization methods based on conventional high-level programming languages, this method requires no traditional program code for the algorithms, and it conveniently supports automatic multi-parameter iterative optimization experiments and intelligent analysis of the system.
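A minimal sketch, not taken from the cited paper, of what automatic multi-parameter iterative optimization of a simulated system can look like; the toy second-order step-response model, its parameters, and the cost function are hypothetical stand-ins for the general-purpose environment described above.

```python
# Minimal sketch of automatic multi-parameter iterative optimization of a
# simulated system response; the simulator, parameters, and target are toys.
import numpy as np
from scipy.optimize import minimize

def simulate_step_response(params, t):
    """Toy approximate second-order step response parameterized by (wn, zeta)."""
    wn, zeta = params
    wd = wn * np.sqrt(max(1.0 - zeta**2, 1e-9))
    return 1.0 - np.exp(-zeta * wn * t) * np.cos(wd * t)

t = np.linspace(0.0, 10.0, 200)
target = np.ones_like(t)                      # desired response: settle quickly to 1

def cost(params):
    y = simulate_step_response(params, t)
    return np.mean((y - target) ** 2)         # tracking error to be minimized

# Iterative multi-parameter search, here via Nelder-Mead
result = minimize(cost, x0=[1.0, 0.3], method="Nelder-Mead")
print("optimized (wn, zeta):", result.x, "cost:", result.fun)
```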

9.
This work studies visual modeling techniques and software implementation for a parameter-optimization environment for complex nonlinear systems. Applying advanced simulation technology with object-oriented and structured design methods, a highly visual, integrated environment for automatic parameter optimization of complex systems is realized on general-purpose software. Compared with parameter-optimization methods based on conventional high-level programming languages, this method requires no traditional program code for the algorithms, and it conveniently supports automatic multi-parameter iterative optimization experiments and intelligent analysis of the system.

10.
Liu Guangzhi, Liu Jun. Microcomputer Applications, 2007, 28(11): 1225-1228
The general-purpose 3D visualization system for railway simulation is a common visualization platform on which users view railway simulation data by roaming through a 3D scene. Based on an analysis of the simulation business requirements and the data characteristics of railway 3D scenes, this paper studies the system's modeling, designs the system framework and modules, and presents the control flow for 3D scene roaming. Preliminary laboratory experiments show that this outline model is feasible.

11.
Quantum Chromodynamics (QCD) is an application area that requires access to large supercomputing resources and generates large amounts of raw data. The UK's national lattice QCD collaboration UKQCD currently stores and requires access to around five Tbytes of data, a figure that is growing dramatically as the collaboration's purpose-built supercomputing system, QCDOC [P.A. Boyle, D. Chen, N.H. Christ, M. Clark, S.D. Cohen, C. Cristian, Z. Dong, A. Gara, B. Joo, C. Jung, C. Kim, L. Levkova, X. Liao, G. Liu, R.D. Mawhinney, S. Ohta, K. Petrov, T. Wettig and A. Yamaguchi, “Hardware and software status of QCDOC”, arXiv: hep-lat/0309096, Nuclear Physics B, Proceedings Supplement, Vol. 838, pp. 129–130, 2004. See: http://www.ph.ed.ac.uk/ukqcd/community/qcdoc/; P.A. Boyle, D. Chen, N.H. Christ, M.A. Clark, S.D. Cohen, C. Cristian, Z. Dong, A. Gara, B. Joo, C. Jung, C. Kim, L.A. Levkova, X. Liao, R.D. Mawhinney, S. Ohta, K. Petrov, T. Wettig and A. Yamaguchi, “Overview of the QCDSP and QCDOC computers”, IBM Journal of Research and Development, Vol. 49, No. 2/3, p. 351, 2005] came into full production service towards the end of 2004. This data is stored on QCDgrid, a data Grid currently composed of seven storage elements at five separate UK sites.

12.
Density-based clustering algorithms are attractive for the task of class identification in spatial databases. However, in many cases very different local-density clusters exist in different regions of the data space, so the DBSCAN method [M. Ester, H.-P. Kriegel, J. Sander, X. Xu, A density-based algorithm for discovering clusters in large spatial databases with noise, in: E. Simoudis, J. Han, U.M. Fayyad (Eds.), Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, Portland, OR, AAAI, Menlo Park, CA, 1996, pp. 226–231], which uses a global density parameter, is not suitable. Although OPTICS [M. Ankerst, M.M. Breunig, H.-P. Kriegel, J. Sander, OPTICS: ordering points to identify the clustering structure, in: A. Delis, C. Faloutsos, S. Ghandeharizadeh (Eds.), Proceedings of ACM SIGMOD International Conference on Management of Data, Philadelphia, PA, ACM, New York, 1999, pp. 49–60] provides an augmented ordering of the database to represent its density-based clustering structure, it only generates clusters whose local density exceeds certain thresholds, not clusters of similar local density; in addition, it does not produce the clusters of a data set explicitly. Furthermore, the parameters required by almost all major clustering algorithms are hard to determine, although they significantly affect the clustering result. In this paper, a new clustering algorithm, LDBSCAN, relying on a local-density-based notion of clusters is proposed. In this technique the selection of appropriate parameters is not difficult; compared with other density-based clustering algorithms, it also takes advantage of LOF [M.M. Breunig, H.-P. Kriegel, R.T. Ng, J. Sander, LOF: identifying density-based local outliers, in: W. Chen, J.F. Naughton, P.A. Bernstein (Eds.), Proceedings of ACM SIGMOD International Conference on Management of Data, Dallas, TX, ACM, New York, 2000, pp. 93–104] to detect noise. The proposed algorithm has potential applications in business intelligence.
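A minimal sketch, not the LDBSCAN algorithm itself, of the two ingredients the abstract contrasts: LOF-based noise detection and DBSCAN clustering with a single global density parameter, applied to synthetic data whose clusters have very different local densities. The data set, neighbor count, eps, and min_samples values are illustrative assumptions.

```python
# Minimal sketch (not LDBSCAN): flag noise with LOF, then cluster the remaining
# points with plain DBSCAN and one global eps. Data and parameters are toys.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
dense = rng.normal(loc=(0, 0), scale=0.2, size=(200, 2))    # tight cluster
sparse = rng.normal(loc=(5, 5), scale=1.0, size=(200, 2))   # much looser cluster
noise = rng.uniform(-3, 9, size=(20, 2))                    # scattered noise points
X = np.vstack([dense, sparse, noise])

lof = LocalOutlierFactor(n_neighbors=20)
inlier_mask = lof.fit_predict(X) == 1          # -1 marks LOF outliers

# A single global eps struggles when local densities differ this much,
# which is exactly the motivation stated in the abstract.
labels = DBSCAN(eps=0.6, min_samples=5).fit_predict(X[inlier_mask])
print("clusters found:", len(set(labels)) - (1 if -1 in labels else 0))
```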

13.
Employment placement is important work for vocational colleges, and an employment information management platform is a key instrument for a school's career guidance, management, and services. The effectiveness of such a platform hinges on the quantity and quality of posted positions; its core is effective job recommendation based on students' employment expectations. At the same time, it should build channels for rapid communication among government, enterprises, schools, and students, improve the level of career guidance and services, and provide employment survey feedback and employment statistics to support the optimization of professional talent training. This paper analyzes the functions of an employment information management platform and designs and implements it using software engineering ideas and a unified modeling tool, together with Web technology, crawler technology, database technology, big data processing, and recommendation algorithms. Practice shows that the platform can effectively meet personalized employment needs and improve the quality and service level of employment management.
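A minimal sketch, not taken from the cited paper, of one plausible form of the job-recommendation step described above: matching a student's stated employment expectation to postings by TF-IDF cosine similarity. The postings, the preference text, and the choice of TF-IDF are hypothetical; the abstract does not specify the platform's actual recommendation algorithm.

```python
# Minimal content-based recommendation sketch: rank job postings by TF-IDF
# cosine similarity to a student's stated preference. All texts are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

postings = [
    "Java backend developer, Spring, MySQL, Hangzhou",
    "Embedded software engineer, C, RTOS, automotive",
    "Data analyst, SQL, Python, reporting dashboards",
]
student_preference = "Python data analysis and database work"

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(postings + [student_preference])
scores = cosine_similarity(doc_matrix[-1], doc_matrix[:-1]).ravel()

# Recommend postings in descending order of similarity to the stated preference
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {postings[idx]}")
```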

14.
Wang and Feng (IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 5, p. 846, May 2006) pointed out that the deduction in (Z. Lin and H. Y. Shum, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, no. 1, pp. 83-97, Jan. 2004) overlooked the validity of the perturbation theorem used in (Z. Lin and H. Y. Shum, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, no. 1, pp. 83-97, Jan. 2004). In this paper, we show that, when the perturbation theorem is invalid, the probability of successful superresolution is very low. Therefore, we only have to derive the limits under the condition that validates the perturbation theorem, as done in (Z. Lin and H. Y. Shum, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, no. 1, pp. 83-97, Jan. 2004).

15.
A new method that exploits shape to localize the auroral oval in satellite imagery is introduced. The core of the method is driven by the linear least-squares (LLS) randomized Hough transform (RHT). The LLS-RHT is a new fast variant of the RHT suitable when not all necessary conditions of the RHT can be satisfied. The method is also compared with the three existing methods for aurora localization, namely the histogram-based k-means [C.C. Hung, G. Germany, K-means and iterative selection algorithms in image segmentation, IEEE Southeastcon 2003 (Session 1: Software Development)], adaptive thresholding [X. Li, R. Ramachandran, M. He, S. Movva, J.A. Rushing, S.J. Graves, W. Lyatsky, A. Tan, G.A. Germany, Comparing different thresholding algorithms for segmenting auroras, in: Proceedings of the International Conference on Information Technology: Coding and Computing, vol. 6, 2004, pp. 594-601], and pulse-coupled neural network-based [G.A. Germany, G.K. Parks, H. Ranganath, R. Elsen, P.G. Richards, W. Swift, J.F. Spann, M. Brittnacher, Analysis of auroral morphology: substorm precursor and onset on January 10, 1997, Geophys. Res. Lett. 25 (15) (1998) 3042-3046] methods. The methodologies and their performance on real image data are both considered in the comparison. These images include complications such as random noise, low contrast, and moderate levels of key obscuring phenomena.
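The LLS-RHT itself is not specified in the abstract, so the sketch below, which is not from the cited paper, only illustrates one plausible ingredient: an algebraic (linear least-squares) circle fit applied to a random subset of candidate boundary points, in the spirit of a randomized Hough-transform vote. The synthetic data, the subset size, and the use of a circle as a stand-in for the auroral oval are all assumptions.

```python
# Minimal sketch of a randomized, linear least-squares shape vote: fit a circle
# (crude stand-in for the auroral oval) to a random subset of boundary points.
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 300)
cx, cy, r = 64.0, 60.0, 30.0                     # "true" oval, roughly circular
pts = np.column_stack([cx + r * np.cos(theta), cy + r * np.sin(theta)])
pts += rng.normal(scale=1.5, size=pts.shape)     # boundary noise

def lls_circle(points):
    """Algebraic circle fit: x^2 + y^2 + D x + E y + F = 0 solved by least squares."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    center = (-D / 2.0, -E / 2.0)
    radius = np.sqrt(center[0]**2 + center[1]**2 - F)
    return center, radius

subset = pts[rng.choice(len(pts), size=30, replace=False)]   # one randomized "vote"
print(lls_circle(subset))    # should land near center (64, 60), radius 30
```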

16.
We present a method for efficiently providing algebraic correctness proofs for communication systems. It is described in the setting of μCRL [J.F. Groote, A. Ponse, The syntax and semantics of μCRL, in: A. Ponse, C. Verhoef, S.F.M. van Vlijmen (Eds.), Algebra of Communicating Processes, Workshops in Computing, Springer, Berlin, 1994, pp. 26–62] which is, roughly, ACP [J.C.M. Baeten, W.P. Weijland, Process Algebra, Cambridge Tracts in Theoretical Computer Science, vol. 18, Cambridge University Press, Cambridge 1990, J.A. Bergstra, J.W. Klop, The algebra of recursively defined processes and the algebra of regular processes, in: Proceedings of the 11th ICALP, Antwerp, Lecture Notes in Computer Science, vol. 172, Springer, Berlin, 1984, pp. 82–95] extended with a formal treatment of the interaction between data and processes. The method incorporates assertional methods, such as invariants and simulations, in an algebraic framework, and centers around the idea that the state spaces of distributed systems are structured as a number of cones with focus points. As a result, it reduces a large part of algebraic protocol verification to the checking of a number of elementary facts concerning data parameters occurring in implementation and specification. The resulting method has been applied to various non-trivial case studies of which a number have been verified mechanically with the theorem checker PVS. In this paper the strategy is illustrated by several small examples and one larger example, the Concurrent Alternating Bit Protocol (CABP).

17.
Book reviews     
CALCULUS AND THE COMPUTER (An Approach to Problem Solving) by T. V. Fossum and R. W. Gatterdam, 1980, pub. by Scott, Foresman & Co., Glenview, Illinois, 217pp+Index. $6.95 (only U.S. price available).

KNOWLEDGE BASED PROGRAM CONSTRUCTION, by David R. Barstow, The Computer Science Library, Programming Languages Series No. 6. North-Holland, 1979. $10.

NUMERICAL ANALYSIS OF SEMICONDUCTOR DEVICES. Proceedings of the NASECODE 1 Conference held at Trinity College, Dublin, from 27th-29th June 1979, edited by B.T. Browne and J. J. H. Miller, pub. by Boole Press Ltd., P.O. Box No. 5, 51 Sandycove Road, Dun Laoghaire, Co. Dublin, Ireland, August 1979, XII + 303 pages, Cloth £20 (U.S. $42) ISBN 0-906783-003.

18.
At iGrid 2005 we demonstrated the transparent operation of a biology experiment on a test-bed of globally distributed visualization, storage, computational, and network resources. These resources were bundled into a unified platform by utilizing dynamic lambda allocation, high bandwidth protocols for optical networks, a Distributed Virtual Computer (DVC) [N. Taesombut, A. Chien, Distributed Virtual Computer (DVC): Simplifying the development of high performance grid applications, in: Proceedings of the Workshop on Grids and Advanced Networks, GAN 04, Chicago, IL, April 2004 (held in conjunction with the IEEE Cluster Computing and the Grid (CCGrid2004) Conference)], and applications running over the Scalable Adaptive Graphics Environment (SAGE) [L. Renambot, A. Rao, R. Singh, B. Jeong, N. Krishnaprasad, V. Vishwanath, V. Chandrasekhar, N. Schwarz, A. Spale, C. Zhang, G. Goldman, J. Leigh, A. Johnson, SAGE: The Scalable Adaptive Graphics Environment, in: Proceedings of WACE 2004, 23–24 September 2004, Nice, France, 2004]. Using these layered technologies we ran a multi-scale correlated microscopy experiment [M.E. Martone, T.J. Deerinck, N. Yamada, E. Bushong, M.H. Ellisman, Correlated 3D light and electron microscopy: Use of high voltage electron microscopy and electron tomography for imaging large biological structures, Journal of Histotechnology 23 (3) (2000) 261–270], where biologists imaged samples with scales ranging from 20X to 5000X in progressively increasing magnification. This allows the scientists to zoom in from entire complex systems such as a rat cerebellum to individual spiny dendrites. The images used spanned multiple modalities of imaging and specimen preparation, thus providing context at every level and allowing the scientists to better understand the biological structures. This demonstration attempts to define an infrastructure based on OptIPuter components which would aid the development and design of collaborative scientific experiments, applications and test-beds and allow the biologists to effectively use the high resolution real estate of tiled displays.

19.
Applied Soft Computing, 2008, 8(1): 337-349
In many real-world applications of evolutionary algorithms, the fitness of an individual has to be derived using complex models and time-consuming computations. Especially in the case of multiple objective optimisation problems, the time needed to evaluate these individuals increases exponentially with the number of objectives due to the ‘curse of dimensionality’ [J. Chen, D.E. Goldberg, S. Ho, K. Sastry, Fitness inheritance in multi-objective optimization, in: W.B. Langdon et al. (Eds.), GECCO 2002: Proceedings of the Genetic and Evolutionary Computation Conference, July 9–13, Morgan Kaufmann Publishers, New York, 2002, pp. 319–326]. This in turn leads to a slower convergence of the evolutionary algorithms. It is not feasible to use time-consuming models with large population sizes unless the time to evaluate the objective functions is reduced. Fitness inheritance is an efficiency enhancement technique that was originally proposed by Smith et al. [R.E. Smith, B.A. Dike, S.A. Stegmann, Fitness inheritance in genetic algorithms, in: Proceedings of the 1995 ACM Symposium on Applied Computing, February 26–28, ACM, Nashville, TN, USA, 1995] to improve the performance of genetic algorithms. Sastry et al. [K. Sastry, D.E. Goldberg, M. Pelikan, Don’t evaluate, inherit, in: L. Spector et al. (Eds.), GECCO 2001: Proceedings of the Genetic and Evolutionary Computation Conference, Morgan Kaufmann Publishers, San Francisco, 2001, pp. 551–558] and Chen et al. [J. Chen, D.E. Goldberg, S. Ho, K. Sastry, Fitness inheritance in multi-objective optimization, in: W.B. Langdon et al. (Eds.), GECCO 2002: Proceedings of the Genetic and Evolutionary Computation Conference, July 9–13, Morgan Kaufmann Publishers, New York, 2002, pp. 319–326] have developed analytical models for fitness inheritance. In this paper, the usefulness of fitness inheritance for a set of popular and separable multiple objective test functions as well as a non-separable real-world problem is evaluated based on unary performance measures testing closeness to the Pareto-optimal front, uniform distribution along and extent of the obtained Pareto front. A statistical evaluation of the performance of an NSGA-II like algorithm on the basis of these unary performance measures suggests that especially for non-convex or non-continuous problems the use of fitness inheritance negatively affects the closeness to the Pareto-optimal front.
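A minimal sketch, not taken from the cited paper, of the fitness-inheritance idea being evaluated: with some probability a child's objective values are inherited (here, simply averaged) from its parents instead of being computed with the expensive model. The toy objectives, the averaging rule, and the inheritance probability are illustrative assumptions.

```python
# Minimal sketch of fitness inheritance: with probability p_inherit, a child's
# objective values are the average of its parents' values instead of a call to
# the expensive evaluation. Objectives and settings are toys.
import random

def expensive_objectives(x):
    """Stand-in for a costly multi-objective evaluation (two objectives)."""
    return (x * x, (x - 2.0) ** 2)

def child_fitness(parent_a, parent_b, child_x, p_inherit=0.5):
    """parent_a / parent_b are (genotype, objectives) pairs; child_x is the offspring genotype."""
    if random.random() < p_inherit:
        fa, fb = parent_a[1], parent_b[1]
        return tuple((a + b) / 2.0 for a, b in zip(fa, fb))   # inherited, no evaluation
    return expensive_objectives(child_x)                       # true (expensive) evaluation

pa = (0.5, expensive_objectives(0.5))
pb = (1.5, expensive_objectives(1.5))
print(child_fitness(pa, pb, child_x=1.0))
```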

20.
Book Reviews     
Deborah Donovan. EDPACS, 2013, 47(10): 11-13
Abstract

EDP CONTROLS AND AUDITING, by W. Thomas Porter, Wadsworth Publishing Company (10 Davis Dr., Belmont, CA, 94002); 1974, 240 pages, $4.95. Reviewed by Donald L. Adams.

DATA PROCESSING SYSTEMS: THEIR PERFORMANCE, EVALUATION, MEASUREMENT, AND IMPROVEMENT, by Saul Stimler; (Motivational Learning Programs, Inc., Stimler Associates, 33 W. Second St., Moorestown, NJ, 08057); 1974, 183 pages, $15.00. Reviewed by Donald L. Adams.
