241.
242.
243.
This work processes the linear prediction (LP) residual in the time domain at three different levels, extracts speaker information at each level, and demonstrates both its significance and its distinct nature for text-independent speaker recognition. The subsegmental analysis processes the LP residual in blocks of 5 msec with a shift of 2.5 msec. The segmental analysis extracts speaker information from blocks of 20 msec with a shift of 2.5 msec. The suprasegmental speaker information is extracted from blocks of 250 msec with a shift of 6.25 msec. Speaker identification and verification studies performed on the NIST-99 and NIST-03 databases demonstrate that the segmental analysis provides the best performance, followed by the subsegmental analysis, while the suprasegmental analysis performs worst. However, the evidence from the three levels of processing is complementary and combines well to improve performance, demonstrating that different speaker information is captured at each level. Finally, combining the evidence from all three levels with vocal tract information further improves speaker recognition performance.
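As a rough illustration of the three levels of processing described in this abstract, the Python sketch below computes an LP residual by inverse filtering and then frames it into subsegmental (5 ms), segmental (20 ms), and suprasegmental (250 ms) blocks with the stated shifts. The sampling rate of 8 kHz and the LP order of 10 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

FS = 8000          # assumed sampling rate (Hz); not specified in the abstract
LP_ORDER = 10      # assumed LP order; a typical choice for 8 kHz speech

def lp_residual(speech, order=LP_ORDER):
    """Inverse-filter the speech signal with its LP coefficients (Yule-Walker)."""
    r = np.correlate(speech, speech, mode="full")[len(speech) - 1:]
    a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])
    return lfilter(np.concatenate(([1.0], -a)), [1.0], speech)

def frame(signal, block_ms, shift_ms, fs=FS):
    """Split a signal into overlapping blocks of block_ms with a shift_ms hop."""
    blk, hop = int(fs * block_ms / 1000), int(fs * shift_ms / 1000)
    starts = range(0, max(len(signal) - blk, 0) + 1, hop)
    return np.stack([signal[s:s + blk] for s in starts])

if __name__ == "__main__":
    speech = np.random.randn(FS)            # stand-in for one second of speech
    residual = lp_residual(speech)
    subseg = frame(residual, 5, 2.5)        # subsegmental level
    seg = frame(residual, 20, 2.5)          # segmental level
    supra = frame(residual, 250, 6.25)      # suprasegmental level
    print(subseg.shape, seg.shape, supra.shape)
```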
244.
In this paper we consider an adaptive control problem for finite-DoF worm-like locomotion systems (WLLS) that contact the ground with Coulomb dry friction. Using a rough mathematical friction law, the system is shown to belong to a class that admits adaptive control. Gaits from the kinematic theory can be tracked by means of adaptive controllers. To this end we introduce two different adaptive controllers for λ-tracking and focus on the one that does not rely on the derivative of the output. We analyze such systems and present theoretical control investigations, including proofs. Numerical simulations of tracking different reference signals under an arbitrary choice of the system parameters demonstrate that the simple adaptive controllers introduced here work successfully and effectively. Current experiments aim at the experimental justification of the theoretical results.
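A λ-tracking controller that avoids the output derivative is, in its simplest scalar form, a high-gain feedback u = -k(t)·e(t) whose gain grows only while the tracking error lies outside a λ-tube around the reference. The sketch below uses a generic first-order plant, not the WLLS model from the paper; the plant parameters, adaptation gain, and λ are illustrative assumptions.

```python
import numpy as np

LAM = 0.05      # half-width of the lambda-tube (assumed)
GAMMA = 1.0     # adaptation gain (assumed)
DT = 1e-3       # integration step

def reference(t):
    return np.sin(t)                      # reference signal to be tracked

def simulate(T=20.0):
    """Lambda-tracking of an uncertain first-order plant x' = a*x + b*u."""
    a, b = 1.3, 0.7                       # plant parameters, unknown to the controller
    x, k = 0.0, 0.0                       # state and adaptive gain
    log = []
    for step in range(int(T / DT)):
        t = step * DT
        e = x - reference(t)              # tracking error (output feedback only)
        u = -k * e                        # high-gain feedback; no derivative of e used
        # gain adaptation: grow only while |e| exceeds the lambda-tube
        # (one common form of the lambda-tracking adaptation law)
        k += DT * GAMMA * max(abs(e) - LAM, 0.0) * abs(e)
        x += DT * (a * x + b * u)         # explicit Euler step of the plant
        log.append((t, e, k))
    return np.array(log)

if __name__ == "__main__":
    hist = simulate()
    print("final |error| = %.4f, final gain = %.2f" % (abs(hist[-1, 1]), hist[-1, 2]))
```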
245.
The rapidly increasing complexity of multi-body system models in applications such as vehicle dynamics, robotics, and biomechanics requires qualitatively new solution methods to cut computing times for dynamical simulation.
246.
Recently, it was shown how the convergence of a class of multigrid methods for computing the stationary distribution of sparse, irreducible Markov chains can be accelerated by adding an outer iteration based on iterant recombination. The acceleration selects a linear combination of previous fine-level iterates, subject to probability constraints, that minimizes the two-norm of the residual using a quadratic programming method. In this paper we investigate the alternative of minimizing the one-norm of the residual. This gives rise to a nonlinear convex program that must be solved at each acceleration step. To solve this minimization problem we propose a deep-cuts ellipsoid method for nonlinear convex programs. The main purpose of this paper is to investigate whether an iterant recombination approach obtained in this way is competitive in terms of execution time and robustness. We derive formulas for subgradients of the one-norm objective function and the constraint functions, show how an initial ellipsoid can be constructed that is guaranteed to contain the exact solution, and give conditions for its existence. We also investigate using the ellipsoid method to minimize the two-norm. Numerical tests show that the one-norm and two-norm acceleration procedures yield a similar reduction in the number of multigrid cycles. The tests also indicate that the one-norm ellipsoid acceleration is competitive with the two-norm quadratic programming acceleration in running time, with improved robustness.
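As a concrete illustration of the recombination step itself (not of the deep-cuts ellipsoid method proposed in the paper), the sketch below minimizes the one-norm of the residual ||A·X·w||₁ over convex combinations of previous iterates stored in the columns of X, recast as a linear program and solved with scipy.optimize.linprog. The problem sizes and data are made up for the example.

```python
import numpy as np
from scipy.optimize import linprog

def recombine_one_norm(A, X):
    """Find convex-combination weights w (w >= 0, sum w = 1) over the previous
    iterates in the columns of X that minimize ||A @ X @ w||_1, via an LP."""
    n, m = A.shape[0], X.shape[1]
    B = A @ X                                    # residual map restricted to the iterates
    # variables z = [w (m entries), t (n entries)], objective: sum(t)
    c = np.concatenate([np.zeros(m), np.ones(n)])
    # |B w| <= t  <=>  B w - t <= 0  and  -B w - t <= 0
    A_ub = np.block([[B, -np.eye(n)], [-B, -np.eye(n)]])
    b_ub = np.zeros(2 * n)
    A_eq = np.concatenate([np.ones(m), np.zeros(n)]).reshape(1, -1)
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.random((50, 50)); P /= P.sum(axis=1, keepdims=True)   # random Markov chain
    A = np.eye(50) - P.T                     # stationary distribution solves A @ pi = 0
    X = rng.random((50, 4)); X /= X.sum(axis=0)                   # mock previous iterates
    w = recombine_one_norm(A, X)
    x_new = X @ w
    print("weights:", np.round(w, 3), " residual 1-norm:", np.linalg.norm(A @ x_new, 1))
```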
247.
To improve the efficiency of decision making about emergency events, this paper proposes a novel concept, the Agile-Delphi Method, an integration of agile decision making and the Delphi Method in which decision makers instantly deliver, respond to, process, and use information via the Delphi process while conducting group decision making about an emergency event. The paper details the mechanism of group decision making about emergency events based on network technology and the Agile-Delphi Method. Finally, the paper conducts an empirical analysis taking the "111 event", i.e., the liquid ammonia spill that occurred on November 1, 2006 at a phosphorus chemical company in China, as an example.
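The abstract does not give the computational details of the Agile-Delphi process; purely as an illustration of the underlying Delphi mechanics (anonymous estimates, feedback of group statistics, revision until consensus), here is a minimal sketch in which the revision model, the stopping rule, and all numbers are assumptions.

```python
import numpy as np

def delphi_rounds(initial_estimates, pull=0.4, tol=0.05, max_rounds=10, rng=None):
    """Iterate Delphi-style feedback rounds: each expert sees the group median
    and moves a fraction `pull` of the way toward it, until the interquartile
    range relative to the median falls below `tol` (consensus)."""
    rng = rng or np.random.default_rng(0)
    estimates = np.asarray(initial_estimates, dtype=float)
    for round_no in range(1, max_rounds + 1):
        median = np.median(estimates)
        iqr = np.subtract(*np.percentile(estimates, [75, 25]))
        if iqr <= tol * abs(median):
            return round_no - 1, estimates        # consensus reached before this round
        noise = rng.normal(0.0, 0.02 * abs(median), size=estimates.shape)
        estimates = estimates + pull * (median - estimates) + noise
    return max_rounds, estimates

if __name__ == "__main__":
    # hypothetical expert estimates, e.g. an evacuation radius in km after a spill
    rounds, final = delphi_rounds([1.0, 2.5, 0.8, 3.0, 1.6, 2.2])
    print("rounds used:", rounds, " final estimates:", np.round(final, 2))
```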
248.
In the IT industry, de facto standards emerge from standards competition as firms offer incompatible technologies and user choices determine the outcome of the competition. The standards literature suggests that strong network effects create a bias toward a standard with a large installed base, leading to a winner-take-all outcome. More recently, several researchers have shown that the dynamics of standardization are much more complex than the explanation offered by the economic theory of networks. Markets do not always exhibit tipping behavior, so there is not always a single winner in de facto standardization, and the size of the overall installed base does not always exert a strong influence on adoption decisions. In contrast, network effects arising from local social influence may be more salient to user adoption decisions. We ask: (1) Do we always observe a winner-take-all outcome in de facto standards competition? (2) What are the different technology adoption patterns observed in de facto standards competition? (3) What are the implications of network effects, switching costs, pricing, and functionality-enhancement strategies for the outcome of de facto standards competition in different user network structures? Drawing on the economic theory of networks, complex network theory, and previous work in the standards literature, we examine the influence of network effects, switching costs, price, and technology functionality on user adoption decisions using agent-based simulation. We incorporate the underlying user network structures frequently observed in the real world as an important determinant of user adoption decisions. Our results suggest that the de facto standardization process does not always follow a three-phase S-shaped pattern, that winner-take-all is not a necessary outcome of standards competition, and that user network structures have a significant impact on the dynamics and outcomes of standards competition.
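To make the simulation setup concrete, here is a minimal sketch of the kind of agent-based model the abstract describes, not the authors' specification: each agent periodically reconsiders its choice between two incompatible standards based on price, local network effects among its neighbors, and a switching cost. The utility form, parameter values, and the use of a Watts-Strogatz small-world network are illustrative assumptions, and functionality-enhancement strategies are omitted for brevity.

```python
import numpy as np
import networkx as nx

N_AGENTS, STEPS = 500, 50
PRICE = {"A": 1.0, "B": 0.8}      # assumed prices of the two standards
NET_EFFECT = 2.0                  # weight on local network effects (assumed)
SWITCH_COST = 0.5                 # cost of abandoning the current standard (assumed)

def simulate(seed=0):
    rng = np.random.default_rng(seed)
    g = nx.watts_strogatz_graph(N_AGENTS, k=6, p=0.1, seed=seed)  # small-world user network
    choice = {i: rng.choice(["A", "B"]) for i in g.nodes}          # initial adoption
    shares = []
    for _ in range(STEPS):
        for i in rng.permutation(N_AGENTS):
            neighbors = list(g.neighbors(i))
            utility = {}
            for s in ("A", "B"):
                # local network effect: share of neighbors already using standard s
                local_share = np.mean([choice[j] == s for j in neighbors]) if neighbors else 0.0
                utility[s] = NET_EFFECT * local_share - PRICE[s]
                if s != choice[i]:
                    utility[s] -= SWITCH_COST                      # switching is costly
            choice[i] = max(utility, key=utility.get)
        shares.append(np.mean([c == "A" for c in choice.values()]))
    return shares

if __name__ == "__main__":
    share_a = simulate()
    print("final share of standard A: %.2f" % share_a[-1])
```

Swapping the small-world graph for a random or scale-free network is one way to probe how user network structure changes the adoption pattern.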
249.
250.
According to the human factors paradigm for patient safety, health care work systems and innovations such as electronic medical records do not have direct effects on patient safety. Instead, their effects are contingent on how the clinical work system, whether computerized or not, shapes health care providers' performance of cognitive work processes. An application of the human factors paradigm to interview data from two hospitals in the Midwestern United States yielded numerous examples of the performance-altering effects of electronic medical records, electronic clinical documentation, and computerized provider order entry. Findings describe improvements as well as decrements in the ease and quality of cognitive performance, for the interviewed clinicians as well as for their colleagues and patients. Changes in cognitive performance appear to have desirable and undesirable implications for patient safety, as well as for quality of care and other important outcomes. Cognitive performance can also be traced to interactions between work system elements, including new technology, allowing problems with "fit" to be discovered and addressed through design interventions.