Search results: 1,572 articles in total (1,538 subscription full text, 34 free); query time 15 ms. Results 31-40 are listed below.

By subject area: electrical engineering 15; chemical industry 285; metalworking 25; machinery and instrumentation 30; building science 79; mining engineering 17; energy and power 36; light industry 133; hydraulic engineering 7; petroleum and natural gas 9; weapons industry 1; radio and electronics 109; general industrial technology 187; metallurgy 388; atomic energy technology 11; automation technology 240.

By year: 2022: 9; 2021: 9; 2020: 9; 2019: 11; 2018: 17; 2017: 12; 2016: 19; 2015: 26; 2014: 29; 2013: 90; 2012: 46; 2011: 71; 2010: 50; 2009: 73; 2008: 56; 2007: 63; 2006: 72; 2005: 66; 2004: 64; 2003: 48; 2002: 54; 2001: 45; 2000: 33; 1999: 33; 1998: 43; 1997: 43; 1996: 42; 1995: 25; 1994: 30; 1993: 28; 1992: 31; 1991: 12; 1990: 29; 1989: 12; 1988: 10; 1987: 15; 1986: 18; 1985: 24; 1984: 25; 1983: 22; 1982: 15; 1981: 13; 1980: 12; 1979: 14; 1978: 11; 1977: 13; 1976: 21; 1975: 10; 1974: 9; 1973: 10.
31.
This paper presents a parameter sensitivity study of the Nelder-Mead simplex method for unconstrained optimization. The Nelder-Mead simplex method is easy to implement in practice because it does not require gradient computation; however, it is very sensitive to the choice of initial points. Fan and Zahara conducted a sensitivity study on a select set of test cases and suggested the best parameter values based on the highest rate of successful minimization. Begambre and Laier used a strategy, based on the Nelder-Mead simplex method, to control Particle Swarm Optimization parameters when identifying structural damage. The main purpose of this paper is to extend those parameter sensitivity studies to better understand the parameters' behavior. A comprehensive sensitivity study was conducted on seven test functions (the B2, Beale, Booth, Wood, Rastrigin, Rosenbrock, and Sphere functions) to search for common patterns in how each parameter affects the optimum solution. The results reveal important relationships among the Nelder-Mead parameters (reflection, expansion, contraction, and simplex size) and show how they influence the optimum solutions. This study matters because a better understanding of the parameters' behavior can inform current and future research that uses the Nelder-Mead simplex method to build intelligent algorithms that are more effective, more efficient, and less computationally expensive.
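As background to the parameter study, the following is a minimal sketch of a Nelder-Mead iteration with the reflection (alpha), expansion (gamma), contraction (beta), and shrink (delta) coefficients exposed as explicit arguments; the function name, default values, and the Rosenbrock example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nelder_mead(f, x0, alpha=1.0, gamma=2.0, beta=0.5, delta=0.5,
                step=0.1, max_iter=5000, tol=1e-8):
    """Minimal Nelder-Mead sketch with explicit reflection (alpha), expansion
    (gamma), contraction (beta), and shrink (delta) coefficients."""
    n = len(x0)
    # Initial simplex: x0 plus n vertices perturbed along each coordinate axis.
    simplex = [np.array(x0, dtype=float)]
    for i in range(n):
        v = np.array(x0, dtype=float)
        v[i] += step
        simplex.append(v)
    fvals = [f(v) for v in simplex]

    for _ in range(max_iter):
        order = np.argsort(fvals)
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)          # centroid excluding the worst vertex

        xr = centroid + alpha * (centroid - simplex[-1])  # reflection
        fr = f(xr)
        if fvals[0] <= fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
        elif fr < fvals[0]:
            xe = centroid + gamma * (xr - centroid)       # expansion
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        else:
            # Contraction: toward xr if it beats the worst vertex, else toward the worst vertex.
            target = xr if fr < fvals[-1] else simplex[-1]
            xc = centroid + beta * (target - centroid)
            fc = f(xc)
            if fc < min(fr, fvals[-1]):
                simplex[-1], fvals[-1] = xc, fc
            else:
                # Shrink every vertex toward the current best vertex.
                simplex = [simplex[0] + delta * (v - simplex[0]) for v in simplex]
                fvals = [f(v) for v in simplex]

    best = int(np.argmin(fvals))
    return simplex[best], fvals[best]

# Example on the Rosenbrock test function used in the study.
rosen = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
x_min, f_min = nelder_mead(rosen, [-1.2, 1.0])
```

Sweeping alpha, gamma, beta, and delta over ranges and recording which combinations reach the known minimum is the kind of sensitivity experiment the paper describes.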
32.
This paper presents results for a new hybrid optimization method that combines the best features of four traditional optimization methods with an intelligent adjustment algorithm to speed convergence on unconstrained and constrained optimization problems. It is believed that this is the first time such a broad array of methods has been employed to facilitate synergistic enhancement of convergence. Particle swarm optimization is based on swarm intelligence, inspired by swarming theory and the social behavior and movement dynamics of bird flocking and fish schooling; it has been applied to structural damage identification, neural network training, and reactive power optimization. It is also believed that this is the first time an intelligent parameter adjustment algorithm has been applied to maximize the effectiveness of the individual component algorithms within a hybrid method. A comprehensive sensitivity analysis of the traditional optimization methods within the hybrid group demonstrates how the relationships among the design variables of a given problem can be used to adjust algorithm parameters. The new method is benchmarked on 11 classical test functions, and the results show that it outperforms eight of the most recently published search methodologies.
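For context on the particle swarm component, here is a minimal, self-contained PSO sketch in which the inertia and cognitive/social coefficients (w, c1, c2) are the kind of parameters an intelligent adjustment scheme might tune online; the defaults and the sphere-function example are illustrative assumptions, not the paper's hybrid algorithm.

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm sketch; w (inertia), c1 (cognitive), and c2 (social)
    are the coefficients a hybrid method might adapt during the search."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g, g_f = pbest[np.argmin(pbest_f)].copy(), pbest_f.min()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        if fx.min() < g_f:
            g, g_f = x[fx.argmin()].copy(), fx.min()
    return g, g_f

# Example on the sphere function.
best_x, best_f = pso(lambda p: float(np.sum(p ** 2)), ([-5.0, -5.0], [5.0, 5.0]))
```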
33.
Given a language L and a non-deterministic finite automaton M, we consider whether we can determine efficiently (in the size of M) if M accepts at least one word in L, or infinitely many such words. Given that M accepts at least one word in L, we consider how long a shortest such word can be. The languages L that we examine include the palindromes, the non-palindromes, the k-powers, the non-k-powers, the powers, the non-powers (also called primitive words), the words matching a general pattern, the bordered words, and the unbordered words.
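As a baseline for these decision problems, the sketch below handles the simplest case L = Σ* for an NFA given as a transition relation: whether the automaton accepts at least one word, and whether it accepts infinitely many. The automaton encoding (a dict mapping (state, symbol) to a set of successor states) is an assumption for illustration; the constructions for the specific languages studied in the paper are more involved.

```python
from collections import deque

def accepts_some_and_infinitely_many(states, alphabet, delta, start, accepting):
    """delta maps (state, symbol) -> set of successor states.  Returns a pair:
    (accepts at least one word, accepts infinitely many words)."""
    # States reachable from the start state.
    reach, queue = {start}, deque([start])
    while queue:
        q = queue.popleft()
        for a in alphabet:
            for r in delta.get((q, a), ()):
                if r not in reach:
                    reach.add(r)
                    queue.append(r)
    # States from which some accepting state is reachable (co-reachable states).
    coreach, queue = set(accepting), deque(accepting)
    while queue:
        r = queue.popleft()
        for q in states:
            for a in alphabet:
                if r in delta.get((q, a), ()) and q not in coreach:
                    coreach.add(q)
                    queue.append(q)
    useful = reach & coreach          # states lying on some accepting path
    accepts_some = bool(useful)

    def on_cycle(s):
        # Can we return to s while staying inside the useful states?
        seen = set()
        frontier = deque(r for a in alphabet for r in delta.get((s, a), ()) if r in useful)
        while frontier:
            q = frontier.popleft()
            if q == s:
                return True
            if q in seen:
                continue
            seen.add(q)
            frontier.extend(r for a in alphabet for r in delta.get((q, a), ()) if r in useful)
        return False

    # The language is infinite iff some useful state lies on a cycle of useful states.
    return accepts_some, accepts_some and any(on_cycle(s) for s in useful)
```

For nontrivial choices of L (including non-regular ones such as the palindromes and the k-powers), the decision procedures must exploit additional structure, which is what the paper investigates.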
34.
Security countermeasures help ensure the confidentiality, availability, and integrity of information systems by preventing or mitigating asset losses from cybersecurity attacks. Because of uncertainty, the financial impact of threats attacking assets is often difficult to measure quantitatively, and it is therefore difficult to prescribe which countermeasures to employ. In this research, we describe a decision support system for calculating the uncertain risk faced by an organization under cyber attack as a function of uncertain threat rates, countermeasure costs, and impacts on its assets. The system uses a genetic algorithm to search for the best combination of countermeasures, allowing the user to determine the preferred tradeoff between the cost of the portfolio and the resulting risk. Data collected from manufacturing firms provide an example of results under realistic input conditions.
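To make the search concrete, here is a hedged sketch of a genetic algorithm over binary countermeasure portfolios, where fitness trades residual risk against portfolio cost. The fitness weighting, operators, parameter values, and example data are illustrative assumptions, not the decision support system's actual implementation.

```python
import random

def ga_portfolio(costs, risk_reduction, cost_weight=1.0,
                 pop_size=60, generations=200, p_mut=0.02, seed=1):
    """Each chromosome is a list of 0/1 genes: deploy countermeasure i or not.
    Fitness penalizes both residual risk and total portfolio cost."""
    random.seed(seed)
    n = len(costs)
    total_risk = sum(risk_reduction)

    def fitness(bits):
        cost = sum(c for c, b in zip(costs, bits) if b)
        residual = total_risk - sum(r for r, b in zip(risk_reduction, bits) if b)
        return -(residual + cost_weight * cost)        # higher is better

    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]              # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)               # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]   # bit-flip mutation
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

# Hypothetical data: per-countermeasure cost and expected loss reduction (e.g., in $k/year).
costs = [20, 35, 10, 50, 15, 25]
risk_reduction = [60, 80, 20, 120, 30, 45]
portfolio, score = ga_portfolio(costs, risk_reduction)
```

Varying cost_weight and re-running the search traces out the cost-versus-risk tradeoff that the abstract describes the user exploring.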
35.
In many clinical scenarios, medical data visualization and interaction are important for physicians exploring inner anatomical structures and extracting meaningful diagnostic information. Real-time high-quality volume rendering, artifact-free clipping, and rapid scalar-value classification are important techniques in this process, yet in practice it is still difficult to achieve an optimal balance among them. In this paper, we present strategies to address this issue, based on the calculation of segment-based post color attenuation and dynamic ray-plane intersection (RPI), respectively. When implemented within our visualization system, the new classification algorithm delivers real-time performance while avoiding the “color over-accumulation” artifacts suffered by commonly used acceleration algorithms that employ pre-integrated classification; our strategy thus achieves an optimized balance between image quality and classification speed. Next, the RPI algorithm is used with an opacity adjustment technique to effectively remove the “striping” artifacts on the clipping plane caused by the nonuniform integration length. Furthermore, we present techniques for multiple transfer function (TF) based anatomical feature enhancement and a “keyhole”-based endoscopic view of inner structures. Finally, the algorithms are evaluated subjectively by radiologists and compared quantitatively using image power-spectrum analysis.
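As a small illustration of the geometric primitive behind RPI-based clipping, the sketch below computes a single ray-plane intersection; the function name and interface are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal, eps=1e-9):
    """Return (t, hit_point) for the ray origin + t * direction against the plane
    through plane_point with normal plane_normal, or None if there is no hit."""
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    plane_normal = np.asarray(plane_normal, float)
    denom = float(np.dot(direction, plane_normal))
    if abs(denom) < eps:
        return None                    # ray is (nearly) parallel to the clipping plane
    t = float(np.dot(np.asarray(plane_point, float) - origin, plane_normal)) / denom
    if t < 0.0:
        return None                    # intersection lies behind the ray origin
    return t, origin + t * direction

# Example: a ray cast along +z hits the clipping plane z = 5 at (0, 0, 5).
hit = ray_plane_intersection([0, 0, 0], [0, 0, 1], [0, 0, 5], [0, 0, 1])
```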
36.
This paper recasts the Friesz et al. (1993) measure-theoretic model of dynamic network user equilibrium as a controlled variational inequality problem involving Riemann integrals. This restatement makes the model and its foundations accessible to a wider audience by removing the need for a background in functional analysis. Our exposition relies on previously unavailable necessary conditions for optimal control problems with state-dependent time lags. These necessary conditions, derived in an Appendix, are employed to show that a particular variational inequality control problem has solutions that are dynamic network user equilibria. Our analysis also shows that the use of proper flow propagation constraints obviates the need to explicitly employ the arc exit time functions that have heretofore complicated numerical implementations of the Friesz et al. (1993) model. We close by describing the computational implications of numerically determining dynamic user equilibria from formulations based on state-dependent time lags.
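For readers new to this class of models, a dynamic network user-equilibrium variational inequality generically takes the following form (the notation is illustrative and not copied from the paper): find h* in the feasible set Λ such that

```latex
\sum_{p \in P} \int_{t_0}^{t_f} \Psi_p\!\left(t, h^{*}\right)
\left( h_p(t) - h_p^{*}(t) \right) \, dt \;\ge\; 0
\qquad \text{for all } h \in \Lambda ,
```

where h_p(t) is the departure-rate (path-flow) trajectory on path p, Ψ_p is the effective path delay, and Λ encodes the flow conservation and propagation constraints; solutions h* are flow patterns at which no traveler can reduce effective delay by unilaterally changing route or departure time.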
37.
Genomics, proteomics, clinical, and drug discovery laboratories have a growing need to maintain valuable samples at ultra-low (−80°C) temperatures in a validated, secure environment. Automated sample processing systems have until now required manual (off-line) storage of samples at −80°C, reducing system reliability and speed. Both of these needs are addressed by the Sample Process Management System being introduced by BIOPHILE Inc. Conventional sample management processes, such as storage, retrieval, and cataloging, are increasingly strained by growing sample populations; sample types, access requirements, and storage requirements vary, and security and inventory procedures are implemented manually. The evolving technologies present in the laboratory cannot interface with conventional manual storage techniques. Addressing these limitations, the primary benefits of BIOPHILE's solutions are:
• Fully validated sample management process that coordinates the life-cycles of samples and their related data.
• Robotic technology to securely store and retrieve samples, improving their accessibility and stability. Thermal shock is reduced, improving sample longevity and quality. The robotic technology allows integration with larger automation systems.
• A process program to develop a Sample Management Strategy. This strategy is developed by analyzing long-term research goals, current baseline processes, and identification of current sample life cycles. A full validation documentation package can be generated, providing a high level of quality assurance.
• Improved sample visibility and quality assurance: automated sample population cataloging, with controlled sample management access and security.
38.
Linear models of synaptic plasticity provide a useful starting point for examining the dynamics of neuronal development and learning, but their inherent problems are well known. Models of synaptic plasticity that embrace the demands of biological realism are therefore typically nonlinear. Viewed from a more abstract perspective, nonlinear models of synaptic plasticity are a subset of nonlinear dynamical systems and may therefore exhibit bifurcations under the variation of control parameters, including noise and errors in synaptic updates. One source of noise or error is the cross-talk that occurs during otherwise Hebbian plasticity: under cross-talk, stimulation of a set of synapses can induce or modify plasticity in adjacent, unstimulated synapses. Here, we analyze two nonlinear models of developmental synaptic plasticity and a model of independent component analysis in the presence of a simple model of cross-talk. We show that cross-talk does indeed induce bifurcations in these models, entirely destroying their ability to acquire either developmental or learning-related patterns of fixed points. Importantly, the critical level of cross-talk required to induce bifurcations is very sensitive to the statistics of the afferents' activities and to the number of afferents synapsing on a postsynaptic cell; in particular, the critical level can be made arbitrarily small. Because bifurcations are inevitable in nonlinear models, our results likely apply to many nonlinear models of synaptic plasticity, although the precise details vary by model. Hence, many nonlinear models of synaptic plasticity are potentially fatally compromised by the toxic influence of cross-talk and, more generally, other sources of noise and error. We conclude by arguing that biologically realistic models of synaptic plasticity must be robust against noise-induced bifurcations and that biological systems may have evolved strategies to circumvent their possible dangers.
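To make the notion of cross-talk concrete, here is a minimal sketch of Oja's Hebbian rule in which the update intended for each synapse leaks partially into the others through a mixing matrix; the mixing scheme, leak level, and input statistics are illustrative assumptions, not any of the specific models analyzed in the paper.

```python
import numpy as np

def oja_with_crosstalk(X, n_steps=20000, eta=1e-3, crosstalk=0.05, seed=0):
    """Oja's rule with a simple cross-talk model: the error-free update dw is
    multiplied by a mixing matrix E that leaks a fraction of each synapse's
    update into the other (unstimulated) synapses."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    w = rng.normal(scale=0.1, size=n)
    b = crosstalk
    # (1 - b) of each update stays on the intended synapse; b is spread evenly
    # over the remaining n - 1 synapses.
    E = (1.0 - b) * np.eye(n) + (b / (n - 1)) * (np.ones((n, n)) - np.eye(n))
    for _ in range(n_steps):
        x = X[rng.integers(len(X))]          # one input pattern per update
        y = w @ x                            # postsynaptic activity
        dw = eta * y * (x - y * w)           # ideal (error-free) Oja update
        w += E @ dw                          # cross-talk mixes the update
    return w

# Example: inputs with one dominant principal direction; with b = 0 the weight
# vector converges toward that direction, and increasing b degrades the fixed point.
rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 10)) @ np.diag([3.0] + [1.0] * 9)
w = oja_with_crosstalk(X)
```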
39.
Tomorrow's marine electrical systems will be vastly different from today's. Power electronics has a major influence on the advancement of shipboard systems, including propulsion, power distribution, backup power, sonar, and radar. Newly emerging materials, devices, and system concepts, such as wide-bandgap semiconductor materials, silicon carbide-based power semiconductor devices, power electronic building blocks (PEBB), and integrated power systems, are making, and will continue to make, future marine systems as different from today's as internal-combustion ships are from steamships. However, these emerging technologies and related concepts are not yet widely known and remain difficult to understand in places. This paper introduces these new concepts and technologies, points out their potential impact, and presents new design approaches to promote the development of marine electrical systems.
40.
Construction accidents are broadly categorized into five basic groups: falls (from elevation), shock (electrical), caught-in/between, struck-by, and other. “Struck-by” accidents accounted for 22% of all construction-related fatalities recorded by the Occupational Safety and Health Administration between 1985 and 1989, and more recent data (1997 to 2000) show that struck-by accidents constituted 24.6% of fatalities and serious construction worker injuries. Struck-by accidents primarily involve workers struck by equipment, private vehicles, falling materials, vertically hoisted materials, horizontally transported materials, and trench cave-ins. Determining possible causation factors for these accident types is often difficult because of the broad categories used in the accident coding system. This study yielded insights into the root causes of struck-by injuries; by identifying root causes, effective methods for accident prevention can be developed.