Sort order: 67 query results in total; search took 15 ms
51.
The incidence of malignant melanoma (MM) in orthodox and non-orthodox Jews in Israel during the years 1970-1982 was compared through an assessment of incidence rates in orthodox and non-orthodox neighborhoods of the city of Jerusalem and by a comparison of the rates in two cities on the outskirts of Tel Aviv, one of which has a strongly orthodox population. Within Jerusalem, the incidence of MM among the European-American born in the orthodox neighborhoods was significantly lower than in the rest of the city for males, and of borderline significance for females. A similar pattern was seen in a comparison of the orthodox city of Bnei Brak with its neighboring city, Givatayim. The findings are consistent with the role of sun exposure in the etiology of malignant melanoma.
52.
53.
A New Heuristic for Bad Cycle Detection Using BDDs   (total citations: 1; self-citations: 0; citations by others: 1)
We describe a new heuristic for detecting bad cycles (reachable cycles that are not confined within any one of the designated sets of model states), a fundamental operation in model-checking algorithms. It is a variation on a standard implementation of the Emerson-Lei algorithm, and our experimental data suggest it can yield a significant speed-up for verification runs that pass. We conclude that this heuristic can be used to advantage on mature designs for which the anticipated verification result is a pass.
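The Emerson-Lei algorithm referred to above computes, as a greatest fixpoint, the states from which a fair (bad) cycle is reachable. The following minimal sketch illustrates that fixpoint explicitly; plain Python sets stand in for the BDDs a symbolic model checker would use, and the function names (`eu`, `emerson_lei`) are ours, not from the paper.

```python
def eu(pre, S, T):
    # States satisfying E[S U T]: in T, or in S with a path inside S to T.
    # pre(X) must return the set of states with some successor in X.
    result, frontier = set(T), set(T)
    while frontier:
        frontier = (pre(frontier) & S) - result
        result |= frontier
    return result

def emerson_lei(states, pre, fair_sets):
    # Greatest fixpoint Z = nu Z. AND_i EX E[Z U (Z & F_i)]:
    # states with an infinite path visiting every fairness set F_i infinitely often.
    Z = set(states)
    while True:
        Z_new = set(Z)
        for F in fair_sets:
            Z_new &= pre(eu(pre, Z_new, Z_new & F))
        if Z_new == Z:
            return Z
        Z = Z_new
```

On a small kripke structure with transitions 0→1, 1→2, 2→1, 1→3, 3→3 and fairness set {2}, the fixpoint retains {0, 1, 2}: the 1–2 cycle is fair, and 0 can reach it, while the self-loop at 3 never visits 2.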
54.
Symbolic Techniques in Satisfiability Solving   (total citations: 1; self-citations: 0; citations by others: 1)
Recent work has shown how to use binary decision diagrams for satisfiability solving. The idea of this approach, which we call symbolic quantifier elimination, is to view an instance of propositional satisfiability as an existentially quantified propositional formula. Satisfiability solving then amounts to quantifier elimination; once all quantifiers have been eliminated, we are left with either 1 or 0. Our goal in this work is to study the effectiveness of symbolic quantifier elimination as an approach to satisfiability solving. To that end, we conduct a direct comparison with the DPLL-based ZChaff, as well as evaluate a variety of optimization techniques for the symbolic approach. In comparing the symbolic approach to ZChaff, we evaluate scalability across a variety of classes of formulas. We find that no approach dominates across all classes. While ZChaff dominates for many classes of formulas, the symbolic approach is superior for other classes of formulas. Once we have demonstrated the viability of the symbolic approach, we focus on optimization techniques for this approach. We study techniques from constraint satisfaction for finding a good plan for performing the symbolic operations of conjunction and of existential quantification. We also study various variable-ordering heuristics, finding that while no heuristic seems to dominate across all classes of formulas, the maximum-cardinality search heuristic seems to offer the best overall performance. ★A preliminary version of the paper was presented at SAT'04. Supported in part by NSF grants CCR-9988322, CCR-0124077, CCR-0311326, IIS-9908435, IIS-9978135, EIA-0086264, ANI-0216467, and by BSF grant 9800096.
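A rough sketch of symbolic quantifier elimination under a simplifying assumption: instead of BDDs, the current function is represented as the set of its satisfying assignments over the "live" variables. Clauses are conjoined in order, and a variable is existentially quantified out (projected away) as soon as no remaining clause mentions it, which is the early-quantification idea; the function name and the clause format (DIMACS-style signed integers) are our choices for illustration.

```python
def sat_by_quantifier_elimination(clauses):
    # clauses: lists of nonzero ints, DIMACS style (-2 means "x2 negated").
    # live: ordered list of variables still in the support;
    # states: set of assignments (tuples of booleans aligned with live).
    live, states = [], {()}
    remaining = list(clauses)
    while remaining:
        clause = remaining.pop(0)
        # extend assignments with the clause's not-yet-seen variables
        for v in {abs(l) for l in clause}:
            if v not in live:
                live.append(v)
                states = {s + (b,) for s in states for b in (False, True)}
        idx = {v: i for i, v in enumerate(live)}
        # conjoin: keep only assignments satisfying the clause
        states = {s for s in states
                  if any(s[idx[abs(l)]] == (l > 0) for l in clause)}
        # early quantification: project away variables no later clause uses
        dead = {v for v in live
                if all(v not in {abs(l) for l in c} for c in remaining)}
        if dead:
            keep = [i for i, v in enumerate(live) if v not in dead]
            live = [v for v in live if v not in dead]
            states = {tuple(s[i] for i in keep) for s in states}
    return bool(states)
```

After all clauses are processed, every variable has been quantified away, leaving either the empty assignment (satisfiable, i.e. 1) or the empty set (unsatisfiable, i.e. 0), mirroring the "1 or 0" outcome described above.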
55.
Network-on-chip (NoC) is a promising paradigm for efficient communication between the processing elements of a multi-core system-on-chip (SoC) or a general-purpose chip multiprocessor. Choosing an appropriate topology for NoCs with predefined application characteristics plays a pivotal role in improving power and area metrics. To date, various irregular topologies optimizing different objective parameters have been proposed. In this paper, we propose a novel heuristic topology-synthesis method for creating application-specific NoCs from a set of use cases that describe the application's characteristics. The approach comprises application clustering, which assigns cores to specific routers; topology construction, which finds a routing path for every flow; and link insertion, which produces the final topology by interconnecting the routers. To validate the method, an industrial smartphone SoC and several generic benchmarks are used as case studies. Experimental results demonstrate the benefits of the proposed method compared to state-of-the-art approaches.
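The first phase, application clustering, groups cores onto routers so that heavy traffic stays local. The paper's actual heuristic is not reproduced here; the following is a hypothetical greedy sketch of that phase, merging the heaviest communication flows first under a router-capacity bound, with union-find tracking the clusters.

```python
def cluster_cores(flows, max_cluster_size):
    # flows: dict {(src_core, dst_core): bandwidth}; cores absent from every
    # flow are not clustered by this sketch.
    parent, size = {}, {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:          # path-halving union-find
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # contract the heaviest flows first, respecting the capacity bound
    for (s, d), bw in sorted(flows.items(), key=lambda kv: -kv[1]):
        rs, rd = find(s), find(d)
        merged = size.get(rs, 1) + size.get(rd, 1)
        if rs != rd and merged <= max_cluster_size:
            parent[rs] = rd
            size[rd] = merged

    clusters = {}
    for core in parent:
        clusters.setdefault(find(core), []).append(core)
    return list(clusters.values())
```

With flows {(0,1): 10, (2,3): 8, (1,2): 5, (3,4): 1} and at most two cores per router, the heavy pairs (0,1) and (2,3) are merged and core 4 is left alone, keeping the two largest flows router-internal.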
56.
As a result of the extensive research in view-based query processing, three notions have been identified as fundamental, namely rewriting, answering, and losslessness. Answering amounts to computing the tuples satisfying the query in all databases consistent with the views. Rewriting consists in first reformulating the query in terms of the views and then evaluating the rewriting over the view extensions. Losslessness holds if we can answer the query by solely relying on the content of the views. While the mutual relationship between these three notions is easy to identify in the case of conjunctive queries, the terrain of notions becomes considerably more complicated beyond this query class. In this paper, we revisit the notions of answering, rewriting, and losslessness and clarify their relationship in the setting of semistructured databases, and in particular for the basic query class in this setting, i.e., two-way regular path queries. Our first result is a clean explanation of the relationship between answering and rewriting, in which we characterize rewriting as a “linear approximation” of query answering. We show that applying this linear approximation to the constraint-satisfaction framework yields an elegant automata-theoretic approach to query rewriting. As for losslessness, we show that there are indeed two distinct interpretations for this notion, namely with respect to answering and with respect to rewriting. We also show that the constraint-theoretic approach and the automata-theoretic approach can be combined to give an algorithmic characterization of the various facets of losslessness. Finally, we deal with the problem of coping with loss by considering mechanisms aimed at explaining lossiness to the user.
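A regular path query asks for all node pairs connected by a path whose edge labels spell a word in a regular language, and it can be evaluated by searching the product of the graph with the query automaton. The sketch below shows this for plain (one-way) regular path queries; the two-way queries of the paper would additionally traverse edges backwards. All names here are illustrative.

```python
from collections import deque

def eval_rpq(edges, nfa_delta, start, accept):
    # edges: iterable of (node, label, node); nfa_delta: iterable of
    # (state, label, state). Returns pairs (u, v) such that some path
    # from u to v spells a word accepted by the NFA.
    trans = {}
    for q, a, q2 in nfa_delta:
        trans.setdefault((q, a), set()).add(q2)
    succ = {}
    for u, a, v in edges:
        succ.setdefault(u, []).append((a, v))
    nodes = set(succ) | {v for _, _, v in edges}
    result = set()
    for u in nodes:
        # BFS over the product of the graph with the automaton
        seen = {(u, q) for q in start}
        frontier = deque(seen)
        while frontier:
            v, q = frontier.popleft()
            if q in accept:
                result.add((u, v))
            for a, w in succ.get(v, ()):
                for q2 in trans.get((q, a), ()):
                    if (w, q2) not in seen:
                        seen.add((w, q2))
                        frontier.append((w, q2))
    return result
```

For the graph 1 –a→ 2 –b→ 3 and the query a·b*, the evaluation returns {(1, 2), (1, 3)}.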
57.
Constrained sampling and counting are two fundamental problems arising in domains ranging from artificial intelligence and security to hardware and software testing. Recent approaches to approximate solutions for these problems rely on employing SAT solvers and universal hash functions that are typically encoded as XOR constraints of length n/2 for an input formula with n variables. As the runtime performance of SAT solvers heavily depends on the length of XOR constraints, recent research effort has focused on reducing the length of XOR constraints. Consequently, a notion of Independent Support was proposed, and it was shown that constructing XORs over an independent support (if known) can lead to a significant reduction in the length of XOR constraints without losing the theoretical guarantees of sampling and counting algorithms. In this paper, we present the first algorithmic procedure (and a corresponding tool, called MIS) to determine a minimal independent support for a given CNF formula by employing a reduction to group minimal unsatisfiable subsets (GMUS). By utilizing minimal independent supports computed by MIS, we provide new, tighter bounds on the length of XOR constraints for constrained counting and sampling. Furthermore, the universal hash functions constructed from independent supports computed by MIS provide two to three orders of magnitude performance improvement in state-of-the-art constrained sampling and counting tools, while still retaining theoretical guarantees.
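To see why XORs over an independent support suffice, consider a toy formula (ours, not from the paper) in which x3 is defined as x1 XOR x2: the variables {x1, x2} form an independent support, since their values determine x3 in every model. A parity constraint over just that support still partitions the solution space into hash cells, while being shorter than an XOR over all variables.

```python
from itertools import product

def models():
    # toy formula with x3 <-> (x1 XOR x2); {x1, x2} is an independent
    # support: the values of x1 and x2 fix x3 in every model
    return [dict(zip((1, 2, 3), bits))
            for bits in product((0, 1), repeat=3)
            if bits[2] == (bits[0] ^ bits[1])]

def xor_holds(model, xor_vars, parity):
    # evaluate one XOR (parity) hash constraint on a model
    return sum(model[v] for v in xor_vars) % 2 == parity

# a hash built only over the independent support {1, 2} still splits the
# four models into two cells of two models each
cell0 = [m for m in models() if xor_holds(m, [1, 2], 0)]
cell1 = [m for m in models() if xor_holds(m, [1, 2], 1)]
```

Approximate counters estimate the total model count by counting the models in one such randomly chosen cell and scaling up by the number of cells.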
58.
59.
We propose HyDICE, Hybrid Discrete Continuous Exploration, a multi-layered approach for hybrid-system falsification that combines motion planning with discrete search and discovers safety violations by computing witness trajectories to unsafe states. The discrete search uses discrete transitions and a state-space decomposition to guide the motion planner during the search for witness trajectories. Experiments on a nonlinear hybrid robotic system with over one million modes and experiments with an aircraft conflict-resolution protocol with high-dimensional continuous state spaces demonstrate the effectiveness of HyDICE. Comparisons to related work show computational speedups of up to two orders of magnitude. Work supported in part by NSF CNS 0615328 (EP, LK, MV), NSF 0713623 (EP, LK), a Sloan Fellowship (LK), and NSF CCF 0613889 (MV). Equipment supported by NSF CNS 0454333 and NSF CNS 0421109 in partnership with Rice, AMD, and Cray. A preliminary version of this work was published in the Proceedings of the 19th International Conference on Computer Aided Verification (CAV 2007). Lecture Notes in Computer Science, eds. W. Damm and H. Hermanns, vol. 4590, pp. 468–481. This work contains substantial improvements to the overall computational method introduced in the preliminary work and new experiments that were not included in the preliminary work.
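One way the discrete layer can guide the continuous layer is by scoring each mode with its discrete distance to an unsafe mode and extending the search tree from the closest modes first. This is only an illustrative sketch of that guidance idea, not HyDICE's actual selection scheme, which also uses the state-space decomposition; all names are ours.

```python
from collections import deque

def discrete_distances(mode_edges, unsafe_modes):
    # BFS backwards from the unsafe modes over the discrete transition graph,
    # giving each mode its discrete distance to a violation
    rev = {}
    for a, b in mode_edges:
        rev.setdefault(b, []).append(a)
    dist = {m: 0 for m in unsafe_modes}
    queue = deque(unsafe_modes)
    while queue:
        m = queue.popleft()
        for p in rev.get(m, []):
            if p not in dist:
                dist[p] = dist[m] + 1
                queue.append(p)
    return dist

def pick_mode(frontier_modes, dist):
    # the motion planner would next extend a trajectory in this mode;
    # modes that cannot reach an unsafe mode get infinite distance
    return min(frontier_modes, key=lambda m: dist.get(m, float("inf")))
```

With mode transitions 0→1, 1→2, 2→3, 1→3 and unsafe mode 3, the distances are {3: 0, 2: 1, 1: 1, 0: 2}, so a frontier containing modes 0 and 2 is extended at mode 2.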
60.
In this work we study hybrid approaches to LTL symbolic model checking; that is, approaches that use explicit representations of the property automaton, whose state space is often quite manageable, and symbolic representations of the system, whose state space is typically exceedingly large. We compare the effects of using, respectively, (i) a purely symbolic representation of the property automaton, (ii) a symbolic representation, using logarithmic encoding, of an explicitly compiled property automaton, and (iii) a partitioning of the symbolic state space according to an explicitly compiled property automaton. We apply this comparison to three model-checking algorithms: the doubly-nested fixpoint algorithm of Emerson and Lei, the reduction of emptiness to reachability of Biere et al., and the singly-nested fixpoint algorithm of Bloem et al. for weak automata. The picture emerging from our study is quite clear: hybrid approaches outperform pure symbolic model checking, and partitioning generally performs better than logarithmic encoding. The conclusion is that the hybrid approaches benefit from state-of-the-art techniques in semantic compilation of LTL properties. Partitioning gains further from the fact that image computation is applied to smaller sets of states.
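The partitioned representation keeps, for each explicit automaton state, its own symbolic set of system states, so each image step works on smaller sets. A minimal sketch of one such partitioned image step, with plain Python sets standing in for the BDDs a symbolic model checker would use for the system part (function names are ours):

```python
def partitioned_image(partition, system_post, aut_delta, label):
    # partition: {automaton_state: set of system states};
    # system_post(s): successors of system state s;
    # aut_delta(q, l): automaton successors of q on label l;
    # label(t): the (propositional) label of system state t.
    image = {}
    for q, block in partition.items():
        for s in block:                      # symbolically: one image per block
            for t in system_post(s):
                for q2 in aut_delta(q, label(t)):
                    image.setdefault(q2, set()).add(t)
    return image
```

With a three-state system 0→1→2→0 labeled by parity and a two-state automaton that moves from 'a' to the absorbing state 'b' on an odd label, the image of {'a': {0}} is {'b': {1}}: the single block keeps the system-side set small, which is the source of the speedups reported above.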