  Paid full text: 1,844 articles
  Free: 24 articles
  Free (domestic): 3 articles
Electrical engineering: 14 articles
Chemical industry: 210 articles
Metalworking: 21 articles
Machinery and instrumentation: 25 articles
Building science: 42 articles
Mining engineering: 1 article
Energy and power: 18 articles
Light industry: 97 articles
Water conservancy engineering: 3 articles
Petroleum and natural gas: 6 articles
Radio electronics: 206 articles
General industrial technology: 176 articles
Metallurgical industry: 730 articles
Atomic energy technology: 7 articles
Automation technology: 315 articles
  2022: 13 articles
  2020: 8 articles
  2019: 13 articles
  2018: 10 articles
  2017: 12 articles
  2016: 19 articles
  2015: 21 articles
  2014: 28 articles
  2013: 57 articles
  2012: 39 articles
  2011: 54 articles
  2010: 47 articles
  2009: 59 articles
  2008: 48 articles
  2007: 71 articles
  2006: 62 articles
  2005: 50 articles
  2004: 52 articles
  2003: 40 articles
  2002: 43 articles
  2001: 35 articles
  2000: 37 articles
  1999: 35 articles
  1998: 185 articles
  1997: 115 articles
  1996: 90 articles
  1995: 57 articles
  1994: 46 articles
  1993: 43 articles
  1992: 25 articles
  1991: 10 articles
  1990: 32 articles
  1989: 20 articles
  1988: 29 articles
  1987: 21 articles
  1986: 24 articles
  1985: 35 articles
  1984: 22 articles
  1983: 27 articles
  1982: 23 articles
  1981: 19 articles
  1980: 11 articles
  1979: 12 articles
  1978: 20 articles
  1977: 28 articles
  1976: 44 articles
  1975: 9 articles
  1974: 8 articles
  1973: 13 articles
  1972: 8 articles
A total of 1,871 results were found; search time: 31 ms.
51.
Synthesis is the automated construction of a system from its specification. The system has to satisfy its specification in all possible environments. The environment often consists of agents that have objectives of their own. Thus, it makes sense to soften the universal quantification on the behavior of the environment and take the objectives of its underlying agents into account. Fisman et al. introduced rational synthesis: the problem of synthesis in the context of rational agents. The input to the problem consists of temporal logic formulas specifying the objectives of the system and the agents that constitute the environment, and a solution concept (e.g., Nash equilibrium). The output is a profile of strategies, for the system and the agents, such that the objective of the system is satisfied in the computation that is the outcome of the strategies, and the profile is stable according to the solution concept; that is, the agents that constitute the environment have no incentive to deviate from the strategies suggested to them. In this paper we continue the study of rational synthesis. First, we suggest an alternative definition of rational synthesis, in which the agents are rational but not cooperative. We call this problem strong rational synthesis. In the strong rational synthesis setting, one cannot assume that the agents that constitute the environment take into account the strategies suggested to them. Accordingly, the output is a strategy for the system only, and the objective of the system has to be satisfied in all the computations that are the outcome of a stable profile in which the system follows this strategy. We show that strong rational synthesis is 2ExpTime-complete, and is thus no more complex than traditional synthesis or rational synthesis. Second, we study a richer specification formalism, in which the objectives of the system and the agents are not Boolean but quantitative. In this setting, the objective of the system and the agents is to maximize their outcome. The quantitative setting significantly extends the scope of rational synthesis, making the game-theoretic approach much more relevant. Finally, we enrich the setting to one that allows coalitions of agents that constitute the system or the environment.
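As a toy illustration of the stability requirement only (not taken from the paper), the sketch below checks whether a strategy profile in a small one-shot two-agent game is a Nash equilibrium; rational synthesis applies the same no-profitable-deviation test to infinite-duration games with temporal-logic objectives. The game, actions, and payoffs are hypothetical.

```python
from itertools import product

# Hypothetical 2-agent game: each agent picks 'a' or 'b'.
# payoffs[profile] = (payoff of agent 0, payoff of agent 1)
payoffs = {
    ('a', 'a'): (2, 2),
    ('a', 'b'): (0, 3),
    ('b', 'a'): (3, 0),
    ('b', 'b'): (1, 1),
}
actions = ['a', 'b']

def is_nash(profile):
    """True iff no agent can improve its own payoff by deviating unilaterally."""
    for agent in range(len(profile)):
        current = payoffs[profile][agent]
        for alt in actions:
            deviated = list(profile)
            deviated[agent] = alt
            if payoffs[tuple(deviated)][agent] > current:
                return False
    return True

equilibria = [p for p in product(actions, repeat=2) if is_nash(p)]
print(equilibria)  # [('b', 'b')] -- the prisoner's-dilemma-style equilibrium
```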
52.
In classical deterministic scheduling problems, it is assumed that all jobs have to be processed. However, in many practical cases, mostly in highly loaded make-to-order production systems, accepting all jobs may cause a delay in the completion of orders, which in turn may lead to high inventory and tardiness costs. Thus, in such systems, the firm may wish to reject the processing of some jobs, either by outsourcing them or by rejecting them altogether. The field of scheduling with rejection provides schemes for coordinating sales and production decisions by grouping them into a single model. Since scheduling problems with rejection are very interesting from both a practical and a theoretical point of view, they have received a great deal of attention from researchers over the last decade. The purpose of this survey is to offer a unified framework for offline scheduling with rejection by presenting an up-to-date survey of the results in this field. Moreover, we highlight the close connection between scheduling with rejection and other fields of research, such as scheduling with controllable processing times and scheduling with due date assignment, and include some new results that we obtained for open problems.
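A minimal worked example of the accept/reject trade-off (illustrative only, not from the survey): on a single machine with the objective "makespan of accepted jobs plus total rejection penalty", the makespan is just the sum of accepted processing times, so each job can be decided independently; job j is accepted exactly when its processing time is below its rejection penalty. The job data below are hypothetical; for richer objectives such as total weighted completion time the decisions interact and many variants become NP-hard.

```python
# Hypothetical jobs: (name, processing_time, rejection_penalty)
jobs = [
    ("J1", 4, 6),
    ("J2", 5, 2),
    ("J3", 3, 3),
    ("J4", 7, 10),
]

# Accept a job iff processing it is cheaper than paying its rejection penalty.
accepted = [name for name, p, e in jobs if p < e]
rejected = [name for name, p, e in jobs if p >= e]
total_cost = (sum(p for _, p, e in jobs if p < e)      # makespan of accepted jobs
              + sum(e for _, p, e in jobs if p >= e))  # rejection penalties

print(accepted, rejected, total_cost)  # ['J1', 'J4'] ['J2', 'J3'] 16
```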
53.
Synthesis is the automated construction of a system from its specification. In classical temporal synthesis algorithms, it is always assumed that the system is "constructed from scratch" rather than "composed" from reusable components. This, of course, rarely happens in real life. In practice, almost every non-trivial commercial system, whether in hardware or in software, relies heavily on libraries of reusable components. Furthermore, other contexts, such as web-service orchestration, can be modeled as synthesis of a system from a library of components. In this work, we define and study the problem of LTL synthesis from libraries of reusable components. We define two notions of composition: data-flow composition, for which we prove the problem is undecidable, and control-flow composition, for which we prove the problem is 2EXPTIME-complete. As a side benefit, we derive an explicit characterization of the information the synthesizer needs about the underlying components. This characterization can be used as a specification formalism between component providers and integrators.
54.
55.
OBJECTIVES: This study assessed the incremental prognostic implications of normal and equivocal exercise technetium-99m (Tc-99m) sestamibi single-photon emission computed tomography (SPECT) and sought to determine its incremental prognostic value, impact on patient management and cost implications. BACKGROUND: The prognostic implications of Tc-99m sestamibi SPECT are not well defined, and risk stratification using this test has not been explored. METHODS: We studied 1,702 patients referred for exercise Tc-99m sestamibi SPECT who were followed up for a mean (+/- SD) of 20 +/- 5 months. Patients with previous percutaneous transluminal coronary angioplasty or coronary artery bypass surgery were excluded. The SPECT studies were assessed using semiquantitative visual analysis. Cardiac death and myocardial infarction were considered "hard" events, and coronary angioplasty and bypass surgery > 60 days after testing were considered "soft" events. RESULTS: Of the 1,702 patients studied, 1,131 had normal or equivocal scan results. A total of 10 events occurred in this group (1 cardiac death and 1 myocardial infarction [0.2% hard events]; 4 coronary angioplasty and 4 bypass surgery procedures [0.7% soft events]). The rates of hard events and referral to catheterization after SPECT were similarly low in patients with a low (< 0.15), intermediate (0.15 to 0.85) and high (> 0.85) post-exercise treadmill test (ETT) likelihood of coronary artery disease. With respect to scan type, patients with normal, probably normal or equivocal scan results had similarly low hard event rates. In the 571 patients with abnormal scan results, there were 43 hard events (7.5%) and 42 soft events (7.4%) (p < 0.001 vs. 1,131 patients with normal scan results for both). When the complete spectrum of scan responses was considered, SPECT provided incremental prognostic value in all patient subgroups analyzed. However, the nuclear scan was cost-effective only in patients with interpretable exercise ECG responses and an intermediate to high post-ETT likelihood of coronary artery disease and in those with uninterpretable exercise ECG responses and an intermediate to high pre-ETT likelihood of coronary artery disease. CONCLUSIONS: Normal or equivocal exercise Tc-99m sestamibi study results are associated with a benign prognosis, even in patients with a high likelihood of coronary artery disease. Although incremental prognostic value is added by nuclear testing in all patient subgroups, a testing strategy incorporating nuclear testing proved to be cost-effective only in the groups with an intermediate to high likelihood of coronary artery disease before scanning.
56.
Kainic acid (KA) induces status epilepticus and delayed neurodegeneration of CA3 hippocampal neurons. Downregulation of glutamate receptor 2 (GluR2) subunit mRNA [the alpha-amino-3-hydroxy-5-methyl-4-isoxazole-propionic acid (AMPA) subunit that limits Ca2+ permeability] is thought to play a role in this neurodegeneration, possibly by increased formation of Ca2+-permeable AMPA receptors. The present study examined early hippocampal decreases in GluR2 mRNA and protein following kainate-induced status epilepticus and correlated expression changes with the appearance of dead or dying cells by several histological procedures. At 12 h, in situ hybridization followed by emulsion dipping showed nonuniform decreases in GluR2 mRNA hybridization grains overlying morphologically healthy-appearing CA3 neurons. GluR1 and N-methyl-D-aspartate receptor mRNAs were unchanged. At 12-16 h, when little argyrophilia or cells with some features of apoptosis were detected by silver impregnation or electron microscopy, single immunohistochemistry with GluR2 and GluR2/3 subunit-specific antibodies demonstrated a pattern of decreased GluR2 receptor protein within CA3 neurons that appeared to predict a pattern of damage, similar to the mRNA observations. Double immunolabeling showed that GluR2 immunofluorescence was depleted and that GluR1 immunofluorescence was sustained in clusters of the same CA3 neurons. Quantitation of Western blots showed increased GluR1:GluR2 ratios in CA3 but not in CA1 or dentate gyrus subfields. Findings indicate that the GluR1:GluR2 protein ratio is increased in a population of CA3 neurons prior to significant cell loss. Data are consistent with the "GluR2 hypothesis" that reduced expression of GluR2 subunits will increase formation of AMPA receptors permeable to Ca2+ and predict vulnerability of a particular subset of pyramidal neurons following status epilepticus.
57.
Friedman Y, Schweitzer N. Applied Optics, 1998, 37(31): 7229-7234.
We have studied the stability of systems of plane mirrors by using a new way to describe the ray transformations caused by such systems. All stable systems comprising up to three mirrors are described and classified. Besides the well-known corner cube, infinitely many stable retroreflecting and direction-preserving three-mirror systems have been found.
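A quick numerical check of the textbook fact underlying such ray-transformation analyses (my sketch, not the authors' code): a plane mirror with unit normal n acts on ray directions as R = I - 2nn^T, and composing three mutually perpendicular mirrors (the corner cube) reverses every incoming direction.

```python
import numpy as np

def mirror(normal):
    """Householder-type reflection of ray directions in a plane mirror."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return np.eye(3) - 2.0 * np.outer(n, n)

# Corner cube: three mutually perpendicular mirrors.
corner_cube = mirror([1, 0, 0]) @ mirror([0, 1, 0]) @ mirror([0, 0, 1])
print(np.allclose(corner_cube, -np.eye(3)))  # True: every direction d -> -d
```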
58.
Friedman RP, Gordon JM. Applied Optics, 1996, 35(34): 6684-6691.
A new class of optical designs is developed for attaining ultrahigh flux in infrared and solar energy concentrators. These concentrators are required to satisfy three criteria simultaneously: (1) being monolithic, i.e., comprising a single piece of dielectric such that no mirrored surfaces or air spaces between concentrator elements are introduced; (2) attaining at least 90% of the thermodynamic limit to concentration; and (3) being relatively compact, e.g., having aspect ratios of the order of unity or less. Our inventions are rooted in the recently developed formalism of tailored edge-ray concentrators.
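For context, criterion (2) refers to the standard thermodynamic (sine-law) concentration limit, which for acceptance half-angle theta_a and a dielectric of refractive index n is (standard étendue result, not quoted from the abstract):

```latex
C^{\mathrm{3D}}_{\max} \;=\; \left(\frac{n}{\sin\theta_a}\right)^{2},
\qquad
C^{\mathrm{2D}}_{\max} \;=\; \frac{n}{\sin\theta_a}.
```

As a rough worked number, with n = 1.5 and the solar half-angle theta_a ≈ 0.27°, the 3D limit is about 2.25 / sin^2(0.27°) ≈ 1.0 × 10^5, so "at least 90% of the thermodynamic limit" corresponds to a geometric concentration of roughly 9 × 10^4.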
59.
From object-process analysis to object-process design   (total citations: 1; self-citations: 1; citations by others: 0)
The object-process methodology incorporates the system's static-structural and dynamic-procedural aspects into a single, unified model. This unification bridges the gap that separates the static object model from the dynamic behavior-, state-, or function-oriented models found in many current object-oriented methodologies. In this work we concentrate on the transition from object-process analysis to design within the development of information systems. We use a detailed case study as a running example throughout the paper to demonstrate how the structure-behavior unification, which characterizes object-process analysis, is carried over to object-process design. The case study first applies object-process analysis to perform the analysis stage. The sequence of steps that constitutes the design is then discussed and demonstrated through the case study. The design is divided into two phases: the analysis refinement phase and the implementation-dependent phase. Analysis refinement is concerned with adding to the analysis results details that are beyond the scope of the analysis itself, yet are not tied to a particular implementation. The implementation-dependent phase is concerned with code-level design, which takes place after specific implementation decisions, such as programming language, data organization, and user interface, have been made during the strategic design.
60.
In this paper we demonstrate how genetic algorithms can be used to reverse-engineer an evaluation function's parameters for computer chess. Our results show that, using an appropriate expert (or mentor), we can evolve a program that is on par with top tournament-playing chess programs, outperforming a two-time World Computer Chess Champion. This performance gain is achieved by evolving a program that mimics the behavior of a superior expert. The resulting evaluation function of the evolved program consists of a much smaller number of parameters than the expert's. The extended experimental results provided in this paper include a report on our successful participation in the 2008 World Computer Chess Championship. In principle, our expert-driven approach could be used in a wide range of problems for which appropriate experts are available.
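A schematic sketch of the expert-mimicking idea (my illustration, not the authors' implementation): a genetic algorithm evolves a weight vector so that its linear evaluation of synthetic "positions" agrees with a hidden "expert" evaluator. The feature vectors, the expert weights, and all GA parameters below are hypothetical stand-ins for real chess position features and a real mentor program.

```python
import random

random.seed(0)
N_FEATURES = 6
EXPERT_W = [1.0, 3.0, 3.2, 5.0, 9.0, 0.1]            # hidden "mentor" weights (hypothetical)
POSITIONS = [[random.uniform(-1, 1) for _ in range(N_FEATURES)]
             for _ in range(200)]                     # synthetic position feature vectors
TARGETS = [sum(w * f for w, f in zip(EXPERT_W, pos)) for pos in POSITIONS]

def fitness(weights):
    """Negative mean squared disagreement with the expert's scores."""
    err = 0.0
    for pos, target in zip(POSITIONS, TARGETS):
        score = sum(w * f for w, f in zip(weights, pos))
        err += (score - target) ** 2
    return -err / len(POSITIONS)

def crossover(a, b):
    # Uniform crossover: each gene taken from one of the two parents.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(w, rate=0.2, scale=0.5):
    # Gaussian mutation applied to each gene with probability `rate`.
    return [x + random.gauss(0, scale) if random.random() < rate else x
            for x in w]

population = [[random.uniform(-10, 10) for _ in range(N_FEATURES)]
              for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    elite = population[:10]                           # truncation selection
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(40)]
    population = elite + children

best = max(population, key=fitness)
print([round(x, 2) for x in best])                    # should approach EXPERT_W
```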