Similar documents
Found 20 similar documents (search time: 31 ms)
1.
How to efficiently handle uncertain information is still an open issue. In this paper, a new method to deal with uncertain information, called the two-dimensional belief function (TDBF), is presented. A TDBF has two components, T = (m1, m2); both m1 and m2 are classical belief functions, while m2 serves as a measure of the reliability of m1. The definition of the TDBF and its discounting algorithm are proposed. Compared with the classical discounting model, the proposed TDBF is more flexible and reasonable. Numerical examples show the efficiency and applicability of the proposed method.
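For context, the classical discounting operation that the TDBF is compared against can be sketched as follows. This is a minimal illustration of Shafer's discounting rule only; the dict-based mass-function representation and the function name are our own, not the paper's:

```python
def discount(m, alpha, frame):
    """Classical Shafer discounting of a belief function.

    m:     dict mapping frozenset (focal element) -> mass, summing to 1
    alpha: reliability of the source in [0, 1] (1 = fully reliable)
    frame: frozenset, the frame of discernment
    """
    out = {}
    # scale every focal element's mass by the reliability alpha
    for focal, mass in m.items():
        out[focal] = out.get(focal, 0.0) + alpha * mass
    # transfer the remaining mass (1 - alpha) to total ignorance (the frame)
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out
```

With alpha = 0.5 and a mass function {{a}: 0.6, {a,b}: 0.4}, the result is {{a}: 0.3, {a,b}: 0.7}, still summing to one.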

2.
In recent years, many researchers have used CPUs for quantum computing simulation. In practice, however, the efficiency of large-scale simulators on a single node is low, so improving single-node simulator efficiency has become a serious challenge. Through extensive experiments, we found that computational redundancy and frequent memory accesses are the main factors hindering efficient use of the CPU. This paper proposes a new, powerful, and simple quantum computing simulator: PAS (power and simple). Compared with existing simulators, PAS introduces four novel optimization methods: efficient hybrid vectorization, fast bitwise operation, memory access filtering, and quantum tracking. In our experiments, we tested QFT (quantum Fourier transform) and RQC (random quantum circuit) workloads of 21 to 30 qubits, selecting the state-of-the-art simulator QuEST (quantum exact simulation toolkit) as the baseline. We conclude that PAS achieves a consistent mean speedup over QuEST on both QFT and RQC workloads on the Intel Xeon E5-2670 v3 CPU.
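The bitwise index pairing that state-vector simulators such as PAS and QuEST build on can be illustrated with a naive single-qubit gate update. This is a sketch only, written by us for illustration; it deliberately omits the vectorization, memory-access filtering, and other optimizations that are the paper's actual contribution:

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to the `target` qubit of an n-qubit state vector.

    Amplitude indices i and i | (1 << target) differ only in the target
    bit; each such pair is mixed by the 2x2 gate matrix.
    """
    mask = 1 << target
    new_state = state.copy()
    for i in range(1 << n_qubits):
        if i & mask:           # visit each pair once, from its low member
            continue
        j = i | mask
        a, b = state[i], state[j]
        new_state[i] = gate[0, 0] * a + gate[0, 1] * b
        new_state[j] = gate[1, 0] * a + gate[1, 1] * b
    return new_state
```

Applying a Hadamard gate to |0> with this routine yields the equal superposition (1/sqrt(2), 1/sqrt(2)), as expected.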

3.
4.
5.
The constrained shortest path tour problem (CSPTP) is an NP‐hard combinatorial optimization problem defined on a connected directed graph G = (V, A), where V is the set of nodes and A is the set of nonnegatively weighted arcs. Given two distinct nodes s, t ∈ V, an integer N, and node-disjoint subsets V1, …, VN of V, the CSPTP aims at finding the shortest trail from s to t that visits at least one node in every subset Vk, in this order. In this paper, we perform a comparative analysis of two integer programming (IP) models for the problem. We also propose valid inequalities and a Lagrangian‐based heuristic framework. Branch‐and‐bound algorithms from the literature, as well as a metaheuristic approach, are used for comparison. Extensive computational experiments carried out on benchmark data sets show the effective use of valid inequalities and the quality of the bounds obtained by the Lagrangian framework. Because benchmark instances do not require great computational effort from the IP models, in the sense that optimality is reached at the root node of the CPLEX branch‐and‐cut search tree, we introduce new challenging CSPTP instances for which our solution approaches outperform existing ones.
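Dropping the trail (no repeated arc) requirement turns the problem into the polynomial shortest path tour relaxation, solvable by N + 1 successive multi-source Dijkstra sweeps, and its value is a valid lower bound for the CSPTP. A sketch under that relaxation (the adjacency-dict representation and function names are ours, not from the paper):

```python
import heapq

def dijkstra(adj, sources):
    """Multi-source Dijkstra. adj: {u: [(v, weight), ...]};
    sources: list of (node, starting_distance) pairs."""
    dist = {u: float('inf') for u in adj}
    pq = []
    for s, d0 in sources:
        if d0 < dist[s]:
            dist[s] = d0
            heapq.heappush(pq, (d0, s))
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

def path_tour_lower_bound(adj, s, t, subsets):
    """Shortest walk s -> t touching each subset in order, ignoring the
    trail constraint; a lower bound for the CSPTP objective."""
    dist = dijkstra(adj, [(s, 0.0)])
    for sub in subsets:
        # restart from every node of the subset, seeded with its distance
        dist = dijkstra(adj, [(v, dist[v]) for v in sub])
    return dist[t]
```

On a four-node example with subsets = [{'a'}], the tour s -> a -> t of cost 2 is found even when a cheaper direct route exists.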

6.
7.
8.
Vulnerability metrics play a key role in understanding cascading failures and targeted/random attacks on a network. The graph fragmentation problem (GFP) is the result of a worst‐case analysis of a random attack. A fixed number of nodes can be chosen for protection, and a nonprotected target node immediately destroys all nodes reachable from it. The goal is to minimize the expected number of destroyed nodes in the network. In this paper, we address the GFP by several approaches: metaheuristics, approximation algorithms, polytime methods for specific instances, and exact methods for small instances. The computational complexity of the GFP is included in our analysis, where we formally prove that the corresponding decision version of the problem is NP‐complete. Furthermore, a strong inapproximability result holds: there is no polynomial approximation algorithm with factor lower than 5/3, unless P = NP. This promotes the study of specific instances of the problem for tractability and/or exact methods in exponential time. As a synthesis, we propose new vulnerability/connectivity metrics and an interplay with game theory using a closely related combinatorial problem called component order connectivity.
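Under this attack model, the objective for a fixed protected set is straightforward to evaluate: each nonprotected node is an equally likely target and destroys exactly its connected component in the residual graph, so the expectation is the sum of squared component sizes divided by the number of nonprotected nodes. A sketch of that evaluation (graph representation and names are ours):

```python
def expected_destroyed(adj, protected):
    """Expected number of nodes destroyed by a uniformly random attack
    on a nonprotected node. adj: {u: set(neighbors)}, undirected."""
    alive = set(adj) - set(protected)
    seen = set()
    total_sq = 0
    for u in alive:
        if u in seen:
            continue
        # DFS over u's component in the graph minus the protected nodes
        comp_size = 0
        stack = [u]
        seen.add(u)
        while stack:
            x = stack.pop()
            comp_size += 1
            for y in adj[x]:
                if y in alive and y not in seen:
                    seen.add(y)
                    stack.append(y)
        total_sq += comp_size * comp_size
    return total_sq / len(alive) if alive else 0.0
```

On the path a - b - c, protecting b splits the survivors into two singletons, so the expectation drops from 3.0 (no protection) to 1.0.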

9.
10.
Cover Image: Background image © Mopic - Fotolia.com. Cover design by SCHULZ Grafik-Design.



11.
12.
Cognition, Technology & Work - A correction to this paper has been published: https://doi.org/10.1007/s10111-021-00676-x

13.
14.
The basic principles of meta-modelling are now well established for individual models. Activities such as the MOF QVT [QVT-Merge Group, "Revised submission for MOF 2.0 Query/Views/Transformations RFP (ad/2002-04-10)", OMG Document ad/04-04-01, URL: http://www.omg.org/docs/ad/04-04-01.pdf] are now extending these principles to transformations between models. However, meta-model incompatibilities between transformations reduce opportunities for effective re-use, hindering wide-scale adoption. We introduce a pattern, the Side Transformation Pattern, that arises naturally as transformations are made re-usable, and present a series of examples showing how its use can bring clarity and robustness to complex transformation problems.

15.
16.
Cover Image: Concept of proteomic-evidence-based personalized medicine: Individual disease progression can be monitored in a timely fashion using clinical proteomic approaches. Network biology integrating computer learning and modeling techniques will lead to intelligent patient stratifications. Such a proteomic-evidence-based medical approach holds great promise for deriving sustainable therapies tailored to each individual patient. Cover image provided by Dr. Lei Mao, Dept. of Life Science Engineering, University of Applied Sciences, Berlin (Germany); cover design by SCHULZ Grafik-Design. Note: The protein network was generated using Cytoscape. The 2DE gel image originated from the authors' own work (Mao et al., 2007, PLoS ONE, DOI: 10.1371/journal.pone.0001218).



17.
18.
19.

Economists have been aware of the mapping between an Input-Output (I-O, hereinafter) table and the adjacency matrix of a weighted digraph for several decades (Solow, Econometrica 20(1):29–46, 1952). An I-O table may be interpreted as a network in which edges measure money flows to purchase inputs that go into production, whilst vertices represent economic industries. However, only recently have the language and concepts of complex networks (Newman 2010) been more intensively applied to the study of interindustry relations (McNerney et al. Physica A Stat Mech Appl, 392(24):6427–6441, 2013). The aim of this paper is to study sectoral vulnerabilities in I-O networks by connecting the formal structure of a closed I-O model (Leontief, Rev Econ Stat, 19(3):109–132, 1937) to the constituent elements of an ergodic, regular Markov chain (Kemeny and Snell 1976) and its chance-process specification as a random walk on a graph. We provide an economic interpretation of a local, sector-specific vulnerability index based on mean first passage times, computed by means of the Moore-Penrose inverse of the asymmetric graph Laplacian (Boley et al. Linear Algebra Appl, 435(2):224–242, 2011). Traversing from the most central to the most peripheral sector of the economy in 60 countries between 2005 and 2015, we uncover cross-country salient roles for certain industries, pervasive features of structural change, and (dis)similarities between national economies in terms of their sectoral vulnerabilities.

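The mean first passage times underlying such a vulnerability index can be computed directly from the Kemeny-Snell fundamental matrix of an ergodic chain. A minimal sketch using plain matrix inversion rather than the Moore-Penrose Laplacian route of the paper (all names are ours; assumes P is ergodic):

```python
import numpy as np

def mean_first_passage(P):
    """Mean first passage time matrix of an ergodic Markov chain with
    transition matrix P, via the Kemeny-Snell fundamental matrix
    Z = (I - P + W)^(-1), where every row of W is the stationary pi."""
    n = P.shape[0]
    # stationary distribution: left eigenvector of P for eigenvalue 1
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()
    W = np.tile(pi, (n, 1))
    Z = np.linalg.inv(np.eye(n) - P + W)
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                # Kemeny-Snell formula: M_ij = (Z_jj - Z_ij) / pi_j
                M[i, j] = (Z[j, j] - Z[i, j]) / pi[j]
    return M
```

For the two-state chain with P = [[0.5, 0.5], [0.5, 0.5]], the passage time between states is geometric with success probability 1/2, so M[0, 1] = M[1, 0] = 2, which the routine reproduces.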

20.