Similar Articles
20 similar articles found.
1.
In this paper we report on the LCG Monte-Carlo Data Base (MCDB) and the software which has been developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC Collaborations by experts. In many cases, modern Monte-Carlo simulation of physical processes requires expert knowledge of Monte-Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase mainly dedicated to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make the sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

Program summary

Program title: LCG Monte-Carlo Data Base
Catalogue identifier: ADZX_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZX_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public Licence
No. of lines in distributed program, including test data, etc.: 30 129
No. of bytes in distributed program, including test data, etc.: 216 943
Distribution format: tar.gz
Programming language: Perl
Computer: CPU: Intel Pentium 4, RAM: 1 Gb, HDD: 100 Gb
Operating system: Scientific Linux CERN 3/4
RAM: 1 073 741 824 bytes (1 Gb)
Classification: 9
External routines:
perl >= 5.8.5;
Perl modules
DBD-mysql >= 2.9004,
File::Basename,
GD::SecurityImage,
GD::SecurityImage::AC,
Linux::Statistics,
XML::LibXML > 1.6,
XML::SAX,
XML::NamespaceSupport;
Apache HTTP Server >= 2.0.59;
mod_auth_external >= 2.2.9;
edg-utils-system RPM package;
gd >= 2.0.28;
rpm package CASTOR-client >= 2.1.2-4;
arc-server (optional)
Nature of problem: Often, different groups of experimentalists prepare similar samples of particle collision events, or turn to the same group of Monte-Carlo (MC) generator authors to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed either in SM analyses (as signal) or in searches for new phenomena in Beyond the Standard Model analyses (as background). If the samples are made publicly available and equipped with comprehensive documentation, cross checks of the samples themselves and of the physical models applied can be sped up. Some event samples require substantial computing resources to prepare, so a central storage of the samples prevents the waste of researcher time and computing resources that would otherwise be spent preparing the same events many times.
Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realized as a separate web server (http://mcdb.cern.ch). All event samples are kept on tapes at CERN. Documentation describing the events is the main content of MCDB. Users can browse the knowledgebase, read and comment on articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit their own articles.
Restrictions: The software is adapted to solve the problems described in the article; there are no additional restrictions.
Unusual features: The software provides a framework to store and document large files with a flexible authentication and authorization system. Different external storage systems with large capacity can be used to keep the files. The web content management system provides all of the necessary interfaces for the authors of the files, end users and administrators.
Running time: Real-time operation.
References:
[1] The main LCG MCDB server, http://mcdb.cern.ch/.
[2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241.
[3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti, O. Barring, CASTOR: status and evolution, cs.oh/0305047.

2.
Consideration was given to the control of a complex object whose motion obeys a multivariable nonlinear nonstationary mathematical model. Rigid constraints were imposed on the object's dynamic precision. The paper considered computer-aided generation of the current equations of object motion, taking into account actuators which differ from subsystem to subsystem. The object is controlled adaptively, with its computer-based realization taken into account. Algorithms of control system operation that maintain the guaranteed precision of object motion were constructed, and conditions for the solvability of the problem were formulated. A free-flying space robot was discussed by way of example.

3.
Barak and Lindell showed that there exist constant-round zero-knowledge arguments of knowledge with strict polynomial-time extractors. This leaves the open problem of whether it is possible to obtain an analogous result regarding constant-round zero-knowledge proofs of knowledge for NP. This paper focuses on this problem and gives a positive answer by presenting a construction of constant-round zero-knowledge proofs of knowledge with strict polynomial-time extractors for NP.

4.
In the field of experimental data acquisition and evaluation, a need arises for some kind of "expert system" to provide support for sophisticated instruments and data-evaluation applications. Previous attempts to develop an expert system for such goals in X-ray Photoelectron Spectroscopy (XPS) were based on different external expert-system shells. The paper presents a simple reasoning expert-system engine which can be built directly into data acquisition and evaluation software. Some problems arising from the lack of human intelligence in the inferencing process are also discussed. The feasibility of the realized system is demonstrated by implementing a real-life rule set, an example (the carbon contamination rules) taken from the field of XPS. Apart from the field-specific rules, the package can be used in any field.
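The abstract does not give the engine's interface, but a forward-chaining rule engine of the kind described can be sketched in a few lines. The facts and rule contents below (loosely themed after the carbon-contamination example) are illustrative assumptions, not the paper's actual rule set.

# Minimal forward-chaining rule engine sketch (illustrative; not the paper's code).
# A rule fires when all its condition facts are present; it then asserts its conclusion.

def infer(facts, rules):
    """Apply rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical XPS-style rules, loosely inspired by the "carbon contamination" example.
rules = [
    ({"C 1s peak present", "sample not carbon-based"}, "carbon contamination suspected"),
    ({"carbon contamination suspected"}, "recommend surface cleaning before quantification"),
]

print(infer({"C 1s peak present", "sample not carbon-based"}, rules))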

5.
6.
The Particle Flow Analysis (PFA) is currently under intense study as the most promising way to achieve the precision jet energy measurements required at the future linear e+e− collider. In order to optimize detector configurations and to tune the PFA, it is crucial to identify the factors that limit the PFA performance and to clarify the fundamental limits on the jet energy resolution that remain even with a perfect PFA and an infinitely granular calorimeter. This necessitates a tool to connect each calorimeter hit in a particle shower to its parent charged track, if any, and eventually all the way back to its corresponding primary particle, while identifying possible interactions and decays along the way. In order to realize this within a realistic memory space, we have developed a set of C++ classes that facilitates history keeping of particle tracks within the framework of Geant4. This software tool, hereafter called J4HistoryKeeper, comes in handy in particular when one needs to stop the history keeping, for memory economy, at multiple geometrical boundaries beyond which a particle shower is expected to start. In this paper the software tool is described and applied to a generic detector model to demonstrate its functionality.
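The abstract describes the bookkeeping idea (each hit points back through its parent tracks to a primary particle); the actual tool is a set of C++ classes inside Geant4. Below is a language-neutral sketch in Python of that parent-pointer walk; the class and field names are assumptions for illustration, not J4HistoryKeeper's API.

# Illustrative sketch of hit-to-primary history keeping (not J4HistoryKeeper's C++ API).
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackRecord:
    track_id: int
    pdg_code: int
    parent: Optional["TrackRecord"]  # None for a primary particle

def primary_of(track: TrackRecord) -> TrackRecord:
    """Walk parent pointers back to the primary particle that started the shower."""
    while track.parent is not None:
        track = track.parent
    return track

# Toy chain: primary pi0 -> decay photon -> conversion electron making a calorimeter hit.
primary  = TrackRecord(1, 111, None)
photon   = TrackRecord(5, 22, primary)
electron = TrackRecord(9, 11, photon)
print(primary_of(electron).track_id)   # -> 1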

7.
An open source software system called GaussDal for management of results from quantum chemical computations is presented. Chemical data contained in output files from different quantum chemical programs are automatically extracted and incorporated into a relational database (PostgreSQL). The Structured Query Language (SQL) is used to extract combinations of chemical properties (e.g., molecules, orbitals, thermo-chemical properties, basis sets, etc.) into data tables for further data analysis, processing and visualization. This type of data management is particularly suited for projects involving a large number of molecules. In the current version of GaussDal, parsers for Gaussian and Dalton output files are supported; however, future versions may also include parsers for other quantum chemical programs. For visualization and analysis of the data tables generated by GaussDal we have used the locally developed open source software SciCraft.

Program summary

Title of program: GaussDal
Catalogue identifier: ADVT
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVT
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computers: Any
Operating system under which the system has been tested: Linux
Programming language used: Python
Memory required to execute with typical data: 256 MB
No. of bits in word: 32 or 64
No. of processors used: 1
Has the code been vectorized or parallelized?: No
No. of lines in distributed program, including test data, etc.: 543 531
No. of bytes in distributed program, including test data, etc.: 7 718 121
Distribution format: tar.gzip file
Nature of physical problem: Handling of large amounts of data from quantum chemistry computations.
Method of solution: Use of an SQL-based database and quantum chemistry software specific parsers.
Restriction on the complexity of the problem: Program is currently limited to Gaussian and Dalton output, but expandable to other formats. Generates subsets of multiple data tables from output files.
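As a rough illustration of the parse-then-store workflow the summary describes, the sketch below extracts one property from a Gaussian-style log line and stores it with plain SQL. It uses SQLite instead of PostgreSQL to stay self-contained, and the table layout and column names are assumptions, not GaussDal's actual schema or parser.

# Illustrative parse-and-store sketch (assumed schema; not GaussDal code).
import re
import sqlite3

def parse_scf_energy(log_text: str):
    """Pull the last 'SCF Done:  E(...) = <value>' energy from a Gaussian-style log."""
    matches = re.findall(r"SCF Done:\s+E\([^)]*\)\s*=\s*(-?\d+\.\d+)", log_text)
    return float(matches[-1]) if matches else None

log = "SCF Done:  E(RB3LYP) =  -76.4089533  A.U. after   8 cycles"
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE molecule (name TEXT, scf_energy REAL)")
conn.execute("INSERT INTO molecule VALUES (?, ?)", ("water", parse_scf_energy(log)))

# SQL is then used to pull property combinations into data tables for analysis.
print(conn.execute("SELECT name, scf_energy FROM molecule").fetchall())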

8.
Consideration was given to the design of the top-level system of a nuclear power-plant process control system, taking into account its impact on safety. This system is aimed at centralizing process instrumentation and control. The top-level system is a sophisticated hardware complex, supported by a computer-aided design and adjustment system, intended to integrate all subsystems of the process control system. It executes the informational, control, service, and auxiliary functions of the nuclear power-plant process control system. Consideration was given to the structure of the top-level system, the structure of its software, and the design principles providing high performance of the top-level system.

9.
10.
11.
Overall Design of the Multimedia Intelligent Database System MIDS/BUAA   (Total citations: 1; self-citations: 0; by others: 1)
1. Introduction. Over the past two decades, traditional database technology (relational databases in particular) has seen tremendous development and wide application. Since the 1980s, however, with the continued decline in hardware costs, users have placed more and higher demands on databases, which has prompted…

12.
A library for reading and writing data in the SUSY Les Houches Accord 2 format is presented. The implementation is in native Fortran 77. The data are contained in a single array conveniently indexed by preprocessor statements.

Program summary

Program title: SLHA2Lib
Catalogue identifier: AEDY_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDY_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 7550
No. of bytes in distributed program, including test data, etc.: 160 123
Distribution format: tar.gz
Programming language: Fortran
Computer: For the build process, a Fortran 77 compiler in a Unix-like environment (make, shell) is required
Operating system: Linux, Mac OS, Windows (Cygwin), Tru64 Unix
RAM: The SLHA record is currently 88 944 bytes long
Classification: 4.14, 11.6
Nature of problem: Exchange of SUSY parameters and decay information in an ASCII file format.
Solution method: The SLHA2Lib provides routines for reading and writing files in the SUSY Les Houches Accord 2 format, a common interchange format for SUSY parameters and decay data.
Restrictions: The fixed-size array that holds the SLHA2 data necessarily limits the amount of decay data that can be stored. This limit can be enlarged by editing and re-running the SLHA2.m program.
Unusual features: Data are transported in a single "double complex" array in Fortran, indexed through preprocessor macros. This is about the simplest conceivable container and needs neither dynamic memory allocation nor Fortran extensions such as structures.
Running time: Both reading and writing an SLHA file typically take a few milliseconds.
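The SLHA format itself is a plain-text block structure (for example, a BLOCK MASS section listing PDG codes and mass values). SLHA2Lib is Fortran; purely as a conceptual illustration of what reading such a block involves, here is a minimal Python sketch. The example block content is invented and the parser handles only simple "index value # comment" lines.

# Minimal, conceptual reader for one SLHA-style block (illustrative; SLHA2Lib is Fortran).
def read_block(lines, block_name):
    """Return {index: value} for simple 'index value # comment' entries of one block."""
    entries, in_block = {}, False
    for line in lines:
        bare = line.split("#", 1)[0].strip()        # drop comments
        if not bare:
            continue
        if bare.upper().startswith("BLOCK"):
            in_block = bare.split()[1].upper() == block_name.upper()
        elif in_block:
            idx, value = bare.split()[:2]
            entries[int(idx)] = float(value)
    return entries

sample = [
    "Block MASS   # invented example values",
    "   25        1.20000000e+02  # h0",
    "   1000022   9.70000000e+01  # ~chi_10",
]
print(read_block(sample, "MASS"))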

13.
14.
The performance of programming approaches and languages used for the development of software for numerical simulation of granular material dynamics by the discrete element method (DEM) is investigated. The granular material considered represents a space filled with discrete spherical visco-elastic particles, and the behaviour of the material under imposed conditions is simulated using the DEM. The object-oriented programming approach (implemented in C++) was compared with the procedural approach (using FORTRAN 90 and OBJECT PASCAL) in order to test their efficiency. The identical neighbour-searching algorithm, contact force model and time integration method were implemented in all versions of the code. Two identical representative examples of the dynamic behaviour of granular material were solved on a personal computer (IBM PC compatible). The results show that the software based on the procedural approach runs faster than the software based on OOP, and that the software developed in FORTRAN 90 runs faster than the software developed in OBJECT PASCAL.
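The abstract does not spell out its contact model; as an illustration of the per-pair computation a DEM code of this kind performs, here is a sketch of a linear spring-dashpot normal contact force between two visco-elastic spheres with one explicit Euler time step. The model choice and all parameter values are placeholder assumptions, not the paper's.

# Illustrative DEM normal-contact step for two visco-elastic spheres (linear spring-dashpot
# model assumed; the paper's actual force model is not specified in the abstract).
import numpy as np

K, C, RADIUS, MASS, DT = 1.0e4, 5.0, 0.01, 0.05, 1.0e-5  # placeholder parameters (SI units)

def normal_force(x1, x2, v1, v2):
    """Spring-dashpot force on particle 1 from particle 2; zero if the spheres do not touch."""
    delta = 2 * RADIUS - np.linalg.norm(x2 - x1)          # overlap
    if delta <= 0.0:
        return np.zeros(3)
    n = (x1 - x2) / np.linalg.norm(x1 - x2)               # unit normal pointing 2 -> 1
    rel_vn = np.dot(v1 - v2, n)                           # normal relative velocity
    return (K * delta - C * rel_vn) * n

# One explicit-Euler step for particle 1 approaching particle 2.
x1, x2 = np.array([0.0, 0.0, 0.0]), np.array([0.019, 0.0, 0.0])
v1, v2 = np.array([1.0, 0.0, 0.0]), np.zeros(3)
a1 = normal_force(x1, x2, v1, v2) / MASS
v1 = v1 + a1 * DT
x1 = x1 + v1 * DT
print(v1, x1)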

15.
Design and Implementation of the Databases in an Advanced Control Software Platform   (Total citations: 3; self-citations: 2; by others: 3)
This paper describes the design ideas and implementation methods of the relational database and the real-time database in an advanced control software platform. The relational database, implemented with Microsoft SQL Server 7.0, is used to manage and operate on variables, parameter information and historical data. In addition, a Windows-based real-time database was developed for the PC, containing control-law parameters, model parameters, observer parameters, system commands, system status and process-variable information. Both databases have been successfully applied in the Windows-based advanced control software platform, providing strong support for the development of advanced control system software and the simulation of algorithms, while ensuring the consistency, independence and reliability of the data.

16.
To make better use of the CERN data management and information sharing system platform in providing CERN ecological data resources to researchers, CERN needs to continuously improve the platform's performance, including the efficiency and reliability with which users search CERN data resources. This paper analyzes the advantages and disadvantages of three retrieval modes: navigational search, subject search and keyword search, and focuses on how thesaurus techniques can be introduced into keyword search to improve the recall, precision and response speed of retrieval. It introduces the concept of a thesaurus and the method used to build the CERN ecological thesaurus, as well as how the open-source thesaurus management system TemaTres was localized into Chinese, covering keyword browsing, keyword expansion, keyword auto-completion, and the use of expanded keywords to search the metadata of CERN ecological data resources. By building and running the localized TemaTres thesaurus management information system, the controllability and standardization of keyword compilation in CERN ecological metadata were enhanced, and simple semantic relations between keywords (hierarchical relations, equivalence relations, i.e. synonyms, and associative relations) were introduced into the retrieval of CERN data resource metadata, improving search effectiveness and laying a good foundation for building an ecological ontology in the next step.
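The keyword-expansion idea described above, using thesaurus relations such as synonyms, narrower terms and related terms to expand a query before searching metadata, can be sketched very compactly. The terms and relations below are invented examples; they are not drawn from the CERN ecological thesaurus or from TemaTres.

# Illustrative thesaurus-based query expansion (invented terms; not the CERN thesaurus).
THESAURUS = {
    "soil moisture": {
        "synonyms": ["soil water content"],
        "narrower": ["surface soil moisture", "root-zone soil moisture"],
        "related": ["precipitation"],
    }
}

def expand(keyword):
    """Expand a query keyword with its synonyms, narrower terms and related terms."""
    entry = THESAURUS.get(keyword, {})
    terms = [keyword]
    for relation in ("synonyms", "narrower", "related"):
        terms += entry.get(relation, [])
    return terms

def search(metadata_records, keyword):
    """Return records whose keyword list contains any expanded term."""
    terms = set(expand(keyword))
    return [r for r in metadata_records if terms & set(r["keywords"])]

records = [{"title": "Dataset A", "keywords": ["soil water content", "2003-2010"]}]
print(search(records, "soil moisture"))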

17.
In this paper we present a compact library for the analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data.

Program summary

Program title: SpecAnalysLib 1.1
Catalogue identifier: AEDZ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 42 154
No. of bytes in distributed program, including test data, etc.: 2 379 437
Distribution format: tar.gz
Programming language: C++
Computer: Pentium 3 PC 2.4 GHz or higher, Borland C++ Builder v. 6. A precompiled Windows version is included in the distribution package
Operating system: Windows, 32-bit versions
RAM: 10 MB
Word size: 32 bits
Classification: 17.6
Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to giving physicists the possibility to use the advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for the analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting.
Solution method: The background estimation algorithms are based on the Sensitive Non-linear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and on algorithms using discrete Markov chains. The peak searching algorithms use smoothed second differences and can search for peaks of general form. The deconvolution (decomposition, unfolding) functions use the Gold iterative algorithm, its improved high-resolution version, and the Richardson-Lucy algorithm. In the peak fitting algorithms we have implemented two approaches. The first is based on the algorithm without matrix inversion (AWMI), which allows fitting of large blocks of data and a large number of parameters. The other is based on solving the system of linear equations using the Stiefel-Hestenes method; it converges faster than AWMI but is not suitable for fitting a large number of parameters.
Restrictions: The dimensionality of the analyzed data is limited to two.
Unusual features: A dynamically loadable library (DLL) of processing functions which users can call from their own programs.
Running time: Most processing routines execute interactively or in a few seconds. Computationally intensive routines (deconvolution, fitting) take longer, depending on the number of iterations specified and the volume of the processed data.
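Of the algorithms listed, SNIP background clipping is simple enough to sketch in a few lines: each channel is repeatedly replaced by the minimum of its value and the average of its two neighbours at increasing distance. The sketch below is a common textbook formulation in Python, not the library's C++ implementation, and the iteration count is an arbitrary choice.

# Illustrative SNIP-style background estimate for a 1-D spectrum (not SpecAnalysLib's code).
def snip_background(spectrum, iterations=20):
    """Clip peaks by comparing each channel with the mean of neighbours p channels away."""
    bkg = list(spectrum)
    n = len(bkg)
    for p in range(1, iterations + 1):
        clipped = bkg[:]
        for i in range(p, n - p):
            clipped[i] = min(bkg[i], 0.5 * (bkg[i - p] + bkg[i + p]))
        bkg = clipped
    return bkg

# Toy spectrum: flat background of 10 counts with a peak around channel 50.
spectrum = [10.0] * 100
for i, extra in zip(range(47, 54), [5, 20, 60, 100, 60, 20, 5]):
    spectrum[i] += extra
background = snip_background(spectrum)
net = [s - b for s, b in zip(spectrum, background)]
print(round(background[50], 1), round(net[50], 1))   # -> 10.0 100.0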

18.
The detection, in a modern interferometric detector like Virgo, of a gravitational-wave signal from a coalescing binary stellar system is a computationally intensive task for both the on-line and off-line computer systems. A parallel computing scheme using the Message Passing Interface (MPI) is described. Performance results on a small-scale cluster are reported.
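The abstract gives no detail of how the work is split; a common way to parallelize coalescing-binary searches is to distribute the template bank across MPI ranks, each rank filtering the same data segment against its share of templates. The mpi4py sketch below illustrates that pattern; it is an assumption about the decomposition, not the scheme of the paper, and the matched-filter function is a placeholder.

# Illustrative MPI decomposition over a template bank (assumed scheme; run with mpiexec).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

templates = list(range(1000))                 # stand-in for a bank of waveform templates
my_templates = templates[rank::size]          # round-robin split across ranks

def matched_filter_snr(template_id):
    """Placeholder for the matched-filter SNR of one template against the data segment."""
    return float(template_id % 7)

local_best = max((matched_filter_snr(t), t) for t in my_templates)
all_best = comm.gather(local_best, root=0)    # collect each rank's best (SNR, template) pair
if rank == 0:
    print("best SNR, template:", max(all_best))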

19.
Visualization in spherical geometry is ubiquitous in geophysical data processing. For spherical visualization, the commonly used spherical polar coordinate system is not ideal because its grid lines converge near the poles. We propose to use a spherical overset grid system called the Yin-Yang grid as the base grid system for spherical visualization. The convergence-free nature of the Yin-Yang grid leads to a balanced data distribution and effective visualization processing in a sphere. The Yin-Yang grid is already used in various geophysical simulations, including geodynamo and mantle convection simulations in spherical geometry. Data produced on the Yin-Yang grid can be, and should be, visualized directly on the same Yin-Yang grid system without any data remapping. Since each component grid of the Yin-Yang grid is a part (the low-latitude region) of the standard spherical polar coordinate system, it is straightforward to convert an existing spherical visualization tool based on spherical polar coordinates into a tool based on the Yin-Yang grid.
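The two component grids of a Yin-Yang grid are identical low-latitude patches of the ordinary spherical polar grid related by a fixed rotation; in the commonly published convention the Cartesian axes are mapped as (x, y, z) → (−x, z, y), which is its own inverse. The sketch below applies that mapping to convert a point between the two component grids. The convention is stated here as an assumption rather than taken from this particular paper.

# Illustrative Yin <-> Yang coordinate conversion (assumed convention (x,y,z) -> (-x,z,y)).
import math

def to_other_component(theta, phi):
    """Map colatitude/longitude (radians) on one component grid to the other component grid."""
    # spherical -> Cartesian on the unit sphere
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    # fixed rotation relating the two component grids (applying it twice is the identity)
    xo, yo, zo = -x, z, y
    # Cartesian -> spherical
    return math.acos(zo), math.atan2(yo, xo)

theta, phi = math.radians(60.0), math.radians(40.0)
t2, p2 = to_other_component(theta, phi)
t3, p3 = to_other_component(t2, p2)          # maps back to the original point
print(round(math.degrees(t3), 6), round(math.degrees(p3), 6))   # -> 60.0 40.0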

20.
This paper discusses the concept, application, and usefulness of software design patterns for scientific programming in Fortran 90/95. An example from the discipline of object-oriented design patterns, that of a game based on navigation through a maze, is used to describe how some important patterns can be implemented in Fortran 90/95 and how the progressive introduction of design patterns can usefully restructure Fortran software as it evolves. This example is complemented by a discussion of how design patterns have been used in a real-life Particle-in-Cell plasma physics simulation. The following patterns are mentioned in this paper: Factory, Strategy, Template, Abstract Factory and Facade.

Program summary

Program title: mazev1, mazev2, mazev3
Catalogue identifier: AEAI_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAI_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 1958
No. of bytes in distributed program, including test data, etc.: 17 100
Distribution format: tar.gz
Programming language: Fortran 95
Computer: PC/Mac
Operating system: Unix/Linux/Mac (FreeBSD)/Windows (Cygwin)
RAM: These are interactive programs with small (KB) memory requirements
Classification: 6.5, 20
Nature of problem: A sequence of programs which demonstrate the use of object-oriented design patterns for the restructuring of Fortran 90/95 software. The programs implement a simple maze game similar to that described in [1].
Solution method: Restructuring uses versions of the Template, Strategy and Factory design patterns.
Running time: Interactive.
References:
[1] E. Gamma, R. Helm, R. Johnson, J. Vlissides, Design Patterns: Elements of Reusable Object Oriented Software, Addison-Wesley, 1995, ISBN 0201633612.
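The abstract names the Strategy and Factory patterns in the maze-game context; as a language-neutral illustration of what such a restructuring looks like (the paper's programs are Fortran 95, not Python), here is a tiny Strategy-plus-Factory sketch for choosing a maze-navigation behaviour at run time. The class and method names are invented for illustration only.

# Illustrative Strategy + Factory sketch (invented names; the paper's programs are Fortran 95).
import random
from abc import ABC, abstractmethod

class MoveStrategy(ABC):
    """Strategy: an interchangeable rule for choosing the next move in the maze."""
    @abstractmethod
    def next_move(self, position): ...

class AlwaysLeft(MoveStrategy):
    def next_move(self, position):
        return "turn left"

class RandomWalk(MoveStrategy):
    def next_move(self, position):
        return random.choice(["forward", "turn left", "turn right"])

def make_strategy(name: str) -> MoveStrategy:
    """Factory: build the requested strategy without exposing concrete classes to callers."""
    return {"left": AlwaysLeft, "random": RandomWalk}[name]()

player = make_strategy("left")
print(player.next_move(position=(0, 0)))   # -> "turn left"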
