Similar Documents
20 similar documents found (search time: 15 ms)
1.
We present a study of Probability Collectives Multi-agent Systems (PCMAS) for combinatorial optimization problems in biology. This framework for distributed optimization is deeply connected with both game theory and statistical physics. In contrast to traditional biologically inspired algorithms, Probability Collectives (PC) based methods do not update populations of solutions; instead, a collective of agents updates an explicitly parameterized probability distribution p over the space of solutions. That updating of p arises as the optimization of a functional of p. The functional is chosen so that any p that optimizes it should be peaked about good solutions. In this paper we demonstrate PCMAS as a promising combinatorial optimization method for biological network construction. This computational approach to response networks enables robust prediction of the crucial sub-networks activated in biological systems in the presence of specific drugs, thereby facilitating the identification of important nodes as potential drug targets and furthering hypotheses about biological and medical problems at the systems level. The application of PCMAS in this context therefore sheds light on how this multi-agent learning methodology advances the current state of research in agent-based models for combinatorial optimization problems in biology.
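The core PC idea, updating a parameterized distribution p so it concentrates on good solutions, can be sketched on a toy discrete search space. This is a minimal illustration of the principle, not the paper's agent decomposition; the annealing schedule, damping factor, and cost values are all made up:

```python
import math

def pc_update(p, costs, temperature, alpha=0.5):
    """One probability-collectives-style step: re-weight candidate
    solutions by exp(-cost/T) and move the current distribution p
    part-way toward that Boltzmann target, which peaks on low-cost
    (good) solutions as T shrinks."""
    weights = [math.exp(-c / temperature) for c in costs]
    z = sum(weights)
    target = [w / z for w in weights]
    # Damped update: convex combination keeps p a valid distribution.
    return [(1 - alpha) * pi + alpha * ti for pi, ti in zip(p, target)]

# Toy search space: four candidate solutions with known costs.
costs = [3.0, 1.0, 2.0, 5.0]
p = [0.25] * 4
for t in [2.0, 1.0, 0.5, 0.25]:   # illustrative annealing schedule
    p = pc_update(p, costs, t)
best = max(range(4), key=lambda i: p[i])
```

After a few steps, p assigns the most mass to the lowest-cost solution (index 1) while remaining a proper probability distribution.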

2.
Understanding Science Through the Computational Lens
This article explores the changing nature of the interaction between computer science and the natural and social sciences. After briefly tracing the history of scientific computation, the article presents the concept of the computational lens, a metaphor for a new relationship that is emerging between the world of computation and the world of the sciences. Our main thesis is that, in many scientific fields, the processes being studied can be viewed as computational in nature, in the sense that the processes perform dynamic transformations on information represented as digital data. Viewing natural or engineered systems through the lens of their computational requirements or capabilities provides new insights and ways of thinking. A number of examples are discussed in support of this thesis. The examples are from various fields, including quantum computing, statistical physics, the World Wide Web and the Internet, mathematics, and computational molecular biology.

3.
The purpose of this article is to provide a starter kit for multicanonical simulations in statistical physics. Fortran code for the q-state Potts model in d=2,3,… dimensions can be downloaded from the Web, and this paper describes simulation results that are reproducible in every detail by running the prepared programs. To allow for comparison with exact results, the internal energy, the specific heat, the free energy and the entropy are calculated for the d=2 Ising (q=2) and the q=10 Potts model. Analysis programs, relying on an all-log jackknife technique suitable for handling sums of very large numbers, are introduced to calculate our final estimators.
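The two analysis ingredients named above, adding numbers held in logarithmic form and jackknife error bars, can be sketched as follows. This is a generic illustration of the techniques, not the paper's Fortran analysis programs:

```python
import math

def log_sum(log_a, log_b):
    """Add two positive numbers given only by their logarithms, without
    overflow: log(a + b) = log_a + log1p(exp(log_b - log_a)) for
    log_a >= log_b. This is the core trick of an all-log analysis."""
    if log_a < log_b:
        log_a, log_b = log_b, log_a
    return log_a + math.log1p(math.exp(log_b - log_a))

def jackknife(data, estimator):
    """Jackknife error bars: recompute the estimator on the N
    leave-one-out samples and expand their spread by a factor (N-1)."""
    n = len(data)
    full = estimator(data)
    loo = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    var = (n - 1) / n * sum((x - mean_loo) ** 2 for x in loo)
    return full, math.sqrt(var)

# Numbers far beyond double-precision range, represented by their logs:
# exp(710) alone would already overflow a 64-bit float.
logs = [710.0, 709.0, 708.0]
total = logs[0]
for l in logs[1:]:
    total = log_sum(total, l)
```

For the mean estimator the jackknife error reduces to the usual standard error of the mean, which is a convenient sanity check.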

4.
Efficient qualitative simulators are crucial to continued progress in qualitative physics. Assumption-based truth maintenance systems (ATMS) were developed in part to simplify writing such programs. This paper identifies several abstractions for organizing ATMS-based problem-solvers which are especially useful for envisioning. In particular, we describe the many-worlds database, which avoids complex temporal reference schemes; how to organize problem-solving into justify/assume/interpret cycles which successively construct and extend partial solutions; and closed-world tables, which provide a mechanism for making closed-world assumptions. We sketch the design of the Qualitative Process Engine, QPE, an implementation of Qualitative Process theory, to illustrate the utility of these abstractions. On the basis of our experience in developing QPE and analysing its performance, we draw some general conclusions about the advantages and disadvantages of assumption-based truth maintenance systems.

5.
6.
7.
8.
Theories of qualitative physics are crucial in developing knowledge-based systems in engineering. Basic models for diagnostic reasoning are based on causal connections between the system parameters. In this paper a knowledge representation and problem solving technique is presented as a ground-laying step. The technique is based on the concepts introduced by Iwasaki and Simon [9] and employs the methods of analysis from the field of control engineering. The causal analysis of systems with feedback is done by using knowledge about their block diagrams. A constraint representing the feed-forward path is modified in order to eliminate the feedback parameter. This approach supports the confluence heuristic in de Kleer and Brown's qualitative physics [4]. Causal dependencies that partially describe the behaviour of a system are used to generate a search space for faulty components. This approach is based on reasoning about counterfactuals using modal categorizations, as proposed by Nicolas Rescher and Herbert Simon [12]. The scope of application of this method in real-time monitoring and diagnosis of large industrial processes is discussed.

9.
Encryption in wireless communication systems is an extremely important factor to protect information and prevent fraud. In this paper, we propose a new encryption system for use in stream cipher applications. The design proposed is intended for hardware implementation and based on (n+1) feedback shift registers interconnected in such a way that one register controls the clocking of the other n registers. The aim of this construction is to allow the production of a large family of distinct keystreams when the initial states and feedback functions of the feedback shift registers are unchanged. The produced keystreams are shown to possess the basic security requirements for cryptographic sequences such as long period, high linear complexity and good statistical properties, provided that suitable parameters are chosen. Furthermore, the design is shown to resist various types of cryptanalytic attacks. These characteristics and properties enhance its use as a suitable encryption system for stream cipher applications.
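The one-controls-n clocking idea can be sketched with plain Fibonacci LFSRs. This is a hypothetical toy in the spirit of the proposed design, not its exact circuit; the tap positions and initial states are illustrative:

```python
from functools import reduce

class LFSR:
    """Fibonacci linear feedback shift register over GF(2)."""
    def __init__(self, taps, state):
        self.taps = taps            # positions XORed into the feedback bit
        self.state = list(state)    # bit list; index 0 is the output end

    def step(self):
        out = self.state[0]
        fb = reduce(lambda a, b: a ^ b, (self.state[t] for t in self.taps))
        self.state = self.state[1:] + [fb]
        return out

def keystream(control, registers, nbits):
    """Clock-controlled generator: each cycle, the control register's
    output bit decides whether the data registers advance; the keystream
    bit is the XOR of the data registers' current output bits."""
    bits = []
    for _ in range(nbits):
        if control.step():                          # irregular clocking
            outs = [r.step() for r in registers]
        else:
            outs = [r.state[0] for r in registers]
        bits.append(reduce(lambda a, b: a ^ b, outs))
    return bits

# x^3 + x + 1 is primitive, so this 3-bit register has maximal period 7.
r = LFSR([0, 1], [1, 0, 0])
seq = [r.step() for _ in range(7)]
ks = keystream(LFSR([0, 1], [1, 1, 0]),
               [LFSR([0, 1], [1, 0, 0]), LFSR([0, 2, 3], [1, 0, 1, 1])],
               16)
```

Irregular clocking is what decouples the keystream from the individual registers' regular sequences; the security properties claimed in the abstract depend on parameter choices this sketch does not attempt.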

10.
This paper describes the architecture, the development and the implementation of Janus II, a new generation application-driven number cruncher optimized for Monte Carlo simulations of spin systems (mainly spin glasses). This domain of computational physics is a recognized grand challenge of high-performance computing: the resources necessary to study in detail theoretical models that can make contact with experimental data are by far beyond those available using commodity computer systems. On the other hand, several specific features of the associated algorithms suggest that unconventional computer architectures, implementable with available electronics technologies, may lead to order-of-magnitude increases in performance, reducing to humanly acceptable values the time needed to carry out simulation campaigns that would take centuries on commercially available machines. Janus II is one such machine, recently developed and commissioned, that builds upon and improves on the successful JANUS machine, which has been used for physics since 2008 and is still in operation today. This paper describes in detail the motivations behind the project, the computational requirements, the architecture and the implementation of this new machine and compares its expected performance with that of currently available commercial systems.

11.
This paper investigates the H∞ filtering problem for a class of linear continuous-time systems with both time delay and saturation. Such systems have time delay in their state equations and saturation in their output equations, and their process and measurement noises have unknown statistical characteristics and bounded energies. Based on the Lyapunov-Krasovskii stability theorem and the linear matrix inequality (LMI) technique, a generalized dynamic filter architecture is proposed and a filter design method is developed. The linear H∞ filter designed by this method guarantees the H∞ performance, and its parameters can be obtained by solving an LMI of a certain form. An illustrative example shows that the proposed design method is very effective.

12.
13.
Generally, phenomena of spontaneous pattern formation are random and repetitive, whereas elaborate devices are the deterministic product of human design. Yet, biological organisms and collective insect constructions are exceptional examples of complex systems (CS) that are both architectured and self-organized. Can we understand their precise self-formation capabilities and integrate them with technological planning? Can physical systems be endowed with information, or informational systems be embedded in physics, to create autonomous morphologies and functions? To answer these questions, we launched in 2009, and developed through a series of workshops and a collective book, a new field of research called morphogenetic engineering. It is the first initiative of its kind to rally and promote models and implementations of complex self-architecturing systems. Particular emphasis is placed on the programmability and computational abilities of self-organization, properties that are often underappreciated in CS science, while, conversely, the benefits of self-organization are often underappreciated in engineering methodologies. [This paper is an extended version of Doursat, Sayama and Michel (2012b) (Chapter 1, in Doursat R et al. (eds.) Morphogenetic engineering: toward programmable complex systems. Understanding complex systems. Springer, 2012a).]

14.
Fractional Brownian motion (fBm) has emerged as a useful model for self-similar and long-range dependent aggregate Internet traffic. Asymptotic or approximate performance measures are known for single queueing systems with fBm through traffic. In this paper, end-to-end performance bounds for a through flow in a network of tandem queues under open-loop fBm cross traffic are derived. To this end, a rigorous sample-path envelope for fBm is proven that complements previous approximate results. The sample-path envelope and the concept of leftover service curves are employed to model the service remaining after scheduling fBm cross traffic at a queueing system. Using composition results for tandem systems from the stochastic network calculus, end-to-end statistical performance bounds for individual flows in networks under fBm cross traffic are derived. The discovery is that these bounds grow as O(n (log n)^(1/(2-2H))) for n systems in series, where H is the Hurst parameter of the cross traffic. Explicit results on the impact of the variability and burstiness of through and cross traffic on network performance are shown. Our analysis has direct implications for fundamental questions in network planning and service management.
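The scaling law is easy to evaluate numerically. This drops all constant factors from the paper's bound and keeps only the growth term, so the function name and absolute values are ours:

```python
import math

def bound_scaling(n, hurst):
    """Growth of the end-to-end bound with the number n of tandem
    queues: n * (log n)^(1/(2-2H)), where H is the Hurst parameter of
    the fBm cross traffic. Constant factors are omitted."""
    return n * math.log(n) ** (1.0 / (2.0 - 2.0 * hurst))
```

For H = 0.5 (no long-range dependence) the exponent is 1 and the bound grows as n log n; as H approaches 1 the exponent blows up, so stronger long-range dependence makes the end-to-end bound grow much faster in the path length.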

15.
We present a methodology for learning a taxonomy from a set of text documents that each describes one concept. The taxonomy is obtained by clustering the concept definition documents with a hierarchical approach to the Self-Organizing Map. In this study, we compare three different feature extraction approaches with varying degree of language independence. The feature extraction schemes include fuzzy logic-based feature weighting and selection, statistical keyphrase extraction, and the traditional tf-idf weighting scheme. The experiments are conducted for English, Finnish, and Spanish. The results show that while the rule-based fuzzy logic systems have an advantage in automatic taxonomy learning, taxonomies can also be constructed with tolerable results using statistical methods without domain- or style-specific knowledge.
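Of the three feature extraction schemes, tf-idf is the simplest to state precisely. A minimal sketch (the токenized toy documents are made up; real pipelines add normalization and smoothing variants):

```python
import math
from collections import Counter

def tfidf(docs):
    """tf-idf weights: raw term frequency scaled by log(N / document
    frequency), so a term occurring in every document gets weight zero
    and rare discriminative terms are boosted."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))          # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return weights

docs = [["taxonomy", "learning", "som"],
        ["fuzzy", "learning", "weighting"],
        ["keyphrase", "learning", "statistics"]]
w = tfidf(docs)
```

Note how "learning", present in all three documents, is zeroed out, which is exactly the behavior that makes tf-idf language independent: no stop-word list is needed for terms the corpus itself marks as uninformative.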

16.
Experimental multi-objective Quantum Control is an emerging topic within the broad physics and chemistry applications domain of controlling quantum phenomena. This realm offers cutting edge ultrafast laser laboratory applications, which pose multiple objectives, noise, and possibly constraints on the high-dimensional search. In this study we introduce the topic of multi-observable quantum control (MOQC), and consider specific systems to be Pareto optimized subject to uncertainty, either experimentally or by means of simulated systems. The latter include a family of mathematical test-functions with a practical link to MOQC experiments, which are introduced here for the first time. We investigate the behavior of the multi-objective version of the covariance matrix adaptation evolution strategy (MO-CMA-ES) and assess its performance on computer simulations as well as on laboratory closed-loop experiments. Overall, we propose a comprehensive study on experimental evolutionary Pareto optimization in high-dimensional continuous domains, draw some practical conclusions concerning the impact of fitness disturbance on algorithmic behavior, and raise several theoretical issues in the broad evolutionary multi-objective context.
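The Pareto relation that underlies MO-CMA-ES ranking can be shown with a minimal dominance filter. Minimization is assumed and the point set is made up; this is the concept only, not the strategy's actual selection machinery:

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): a is no worse in every
    objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep the non-dominated points of a finite set."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

pts = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
front = pareto_front(pts)
```

Here (3.0, 3.0) is dominated by (2.0, 2.0) and drops out; the remaining three points are mutually incomparable, which is precisely what makes noisy laboratory objectives hard: measurement disturbance can flip dominance decisions near the front.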

17.
We treat the problem of evaluating the statistical characteristics of the output signal of automatic control systems containing nonlinear elements with zero memory, excited by a random input with a normal distribution.

To analyse these statistical characteristics, we propose a statistical linearization technique determined by the condition that the variances of the output signals of the true nonlinear control system and of an equivalent linearized one are equal. Several examples are presented to illustrate the proposed method.
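The variance-matching condition can be sketched by Monte Carlo for a saturation element: pick the gain k so that k·x has the same output variance as the nonlinearity under Gaussian input. The estimator, sample count and element are illustrative assumptions, not the paper's analytical derivation:

```python
import math
import random

def equivalent_gain(nonlinearity, sigma, samples=200000, seed=1):
    """Statistical linearization by variance matching: return the gain k
    such that the linear element k*x has the same output variance as the
    zero-memory nonlinearity driven by zero-mean Gaussian noise
    N(0, sigma^2). Estimated here by plain Monte Carlo."""
    rng = random.Random(seed)
    out_sq = 0.0
    for _ in range(samples):
        x = rng.gauss(0.0, sigma)
        out_sq += nonlinearity(x) ** 2
    # Var[k*x] = k^2 sigma^2 must equal E[f(x)^2] (zero-mean output).
    return math.sqrt(out_sq / samples) / sigma

saturation = lambda x: max(-1.0, min(1.0, x))
k_small = equivalent_gain(saturation, sigma=0.2)  # rarely saturates
k_large = equivalent_gain(saturation, sigma=5.0)  # deeply saturated
```

For small input variance the saturation is almost never active and k is close to 1; for large input variance the output is clipped and the equivalent gain collapses, which is the qualitative behavior the method is designed to capture.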

18.
A new computer system with an entirely new processor design is described and demonstrated on a very small trial lattice. The new computer simulates systems of differential equations on the order of 10^4 times faster than present-day computers, and we describe how the machine can be applied to lattice models in theoretical physics. A brief discussion is also given of the various mathematical approaches for studying a lattice model. We used the computer on the X-Y model. For an actual QCD program, an improved computer of this kind is designed to be 10^2 times faster than ordinary machines.

19.
Modern particle physics experiments observing collisions of particle beams generate large amounts of data. Complex trigger and data acquisition systems are built to select the most interesting events on-line and write them to persistent storage. The final stage of this selection process nowadays often happens on large computer clusters. Stable and reliable operation of such event filter clusters is critical to the success of these experiments. Operating the event filter cluster must ensure dead-time-free processing of large amounts of data, requiring continuous 24-hour status monitoring of each processing node and fast detection and resolution of problems. Ideally, problems should be recognized before they degrade system performance. Process control of the event filter cluster is currently performed exclusively by a human operator, placing high demands on the operator that are difficult to meet. In this paper, a hybrid system based on expert systems technology and statistical tools and methods is proposed to address this issue. The system is built upon a scalable modular architecture, and a design overview is given. The proposed hybrid system is designed and tested in a real environment, with an event filter cluster prototype based on the architecture of the Compact Muon Solenoid experiment at CERN. The system test results are provided together with an analysis. Finally, future possibilities are discussed.

20.
This paper studies the statistical behavior of errors involved in fundamental geometric computations. We first present a statistical model of noise in terms of the covariance matrix of the N-vector. Using this model, we compute the covariance matrices of N-vectors of lines and their intersections. Then, we determine the optimal weights for the least-squares optimization and compute the covariance matrix of the resulting optimal estimate. The result is then applied to line fitting to edges and to the computation of vanishing points and foci of expansion. We also point out that statistical biases exist in such computations and present a scheme called renormalization, which iteratively removes the bias by automatically adjusting to noise without knowing the noise characteristics. Random number simulations are conducted to confirm our analysis.
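The weighted least-squares step can be sketched for straight-line fitting in the plane. Here the weights are plain per-point scalars standing in for reliabilities; in the paper's setting they would be derived from the covariance matrices of the N-vectors, which this toy does not model:

```python
def weighted_line_fit(points, weights):
    """Weighted least squares fit of y = a*x + b: minimize
    sum_i w_i (y_i - a*x_i - b)^2 via the weighted normal equations."""
    sw = sum(weights)
    mx = sum(w * x for w, (x, _) in zip(weights, points)) / sw
    my = sum(w * y for w, (_, y) in zip(weights, points)) / sw
    sxx = sum(w * (x - mx) ** 2 for w, (x, _) in zip(weights, points))
    sxy = sum(w * (x - mx) * (y - my) for w, (x, y) in zip(weights, points))
    a = sxy / sxx
    return a, my - a * mx

pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
a, b = weighted_line_fit(pts, [1.0, 1.0, 1.0, 1.0])
```

With uniform weights this reduces to ordinary least squares and recovers y = 2x + 1 exactly on noise-free data; non-uniform weights shift the fit toward the more reliable points.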
