991.
The advance of technology proceeds through an evolutionary process, with many different new departures in competition with each other and with prevailing practice, and with ex-post selection determining the winners and losers. In modern times what gives power to the process is the strong base of scientific and technological understanding and technique that guides the efforts of those seeking to advance the technology. Most of that base is part of a commons open to all who have expertise in a field. The proprietary aspects of technology traditionally have comprised a small topping on the commons. But recently parts of the commons have become privatized. While the justification for the policies and actions that have spurred privatization of the commons is that this will spur technological progress, the argument here is that the result can be just the opposite.
992.
Modern humans possess an enormous amount of 'know-how' that enables them to do things that early humans could not dream of doing. This paper explores some promising connections between two bodies of empirical research and theorizing that bear on technological know-how and its advance. Cognitive science is concerned with the nature and mechanisms of human knowing. The focus traditionally has not been on the knowledge involved in complex technologies, nor on the processes through which human know-how has advanced over time. However, certain recent developments in this field provide a nice linkage with the implicit cognitive theorizing used by scholars who study the advance of technology. And a number of the debates in the two fields are similar.
993.
W. Grey Walter built robotic systems to improve understanding of biological systems. In that tradition, this paper reports ongoing work on a robot model of cricket sound localization. The main advances are the inclusion of a much larger range of neuroethological detail, and the investigation of multimodal influences on the behaviour. The former allows exploration of the functionality of identified neurons in the insect, including the possible roles of multiple sensory fibres, mutually inhibitory connections, and brain neurons with pattern-filtering properties. The latter focuses on the inclusion of an optomotor stabilization response, and how this might improve tracking, particularly under conditions of random disturbance.
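The mutually inhibitory connections mentioned above can be illustrated with a toy simulation: two leaky-integrator units, one per ear, each suppressing the other, so the side receiving the louder input wins the competition and determines the turn direction. All names, constants and dynamics here are our illustrative assumptions, not the paper's actual neural model.

```python
# Toy cross-inhibition sketch (illustrative assumptions, not the paper's model):
# two leaky-integrator units compete; the louder ear's unit wins and steers.

def simulate(left_input, right_input, leak=0.5, inhibition=0.8, steps=50):
    """Return final activations of the two units for constant ear inputs."""
    l = r = 0.0
    for _ in range(steps):
        # each unit integrates its own ear input and is inhibited by the other
        l_new = (1 - leak) * l + left_input - inhibition * max(r, 0.0)
        r_new = (1 - leak) * r + right_input - inhibition * max(l, 0.0)
        l, r = max(l_new, 0.0), max(r_new, 0.0)   # rectify: no negative firing
    return l, r

def steering(left_input, right_input):
    """Map the winning unit to a turn command."""
    l, r = simulate(left_input, right_input)
    if l > r:
        return "turn left"
    if r > l:
        return "turn right"
    return "straight"
```

Because of the cross-inhibition, even a small interaural difference is amplified into a clear winner, which is the functional role often attributed to such connections.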
994.
In today's highly competitive global manufacturing marketplace, the pressure for right-first-time manufacture has never been higher. New emerging data standards, combined with machine data collection methods such as in-process verification, lead the way to a complete paradigm shift from traditional manufacturing and inspection to intelligent networked process control. Low-level G and M codes offer very limited information on machine capabilities or workpiece characteristics; consequently, no information on manufacturing processes, inspection plans, workpiece attributes (such as tolerances) or design features is available to computer numerically controlled (CNC) machines. One solution to these problems is the STEP-NC (ISO 14649) suite of standards, which aims to provide higher-level information for process control. In this paper, the authors provide a definition for process control in CNC manufacturing and identify the challenges in achieving process control in the current CNC manufacturing scenario. The paper then introduces a STEP-compliant framework that makes use of self-learning algorithms, enabling the manufacturing system to learn from previous data so as to eliminate errors and consistently produce quality products. The framework relies on knowledge discovery methods, such as data mining encapsulated in a process analyser, to derive rules for corrective measures to control the manufacturing process. The design of the knowledge-based process analyser and the various process control mechanisms conclude the paper.
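The corrective-rule idea can be sketched in miniature: mine past in-process measurements for a systematic drift and, if it exceeds tolerance, emit a compensating tool offset. This is only an illustration of the feedback principle; the function name, threshold and rule form are assumptions, not the paper's analyser.

```python
# Illustrative sketch (not the paper's process analyser): derive a simple
# corrective rule from past in-process measurements. If a dimension drifts
# consistently in one direction, suggest a tool offset cancelling the drift.

def derive_correction(deviations, tolerance=0.01):
    """deviations: measured-minus-nominal values (mm) from previous parts.
    Returns a suggested tool offset (mm), or 0.0 if the process is on target."""
    mean_dev = sum(deviations) / len(deviations)
    if abs(mean_dev) <= tolerance:
        return 0.0          # within tolerance: no corrective action needed
    return -mean_dev        # offset the tool to cancel the systematic drift
```

For example, a run of parts all measuring 0.03–0.05 mm oversize would yield a −0.04 mm offset, while random scatter around nominal yields no action.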
995.
In organisations where information security has historically been a part of management, and for which risk assessment methodologies have been designed, there are established methods for communicating risk. This is the case, for example, in the banking and military sectors. However, in organisations where information security is not embedded in management thinking, and where the relationship between information security and the business is less clear-cut, communicating risks to the business is less straightforward. In such circumstances it has been observed during field research that information security risk assessments frequently output findings to which the business cannot relate, and the process is consequently often viewed as a “tick box” exercise, as opposed to one that provides real value to the business. In such a situation the information security risk assessment is divorced from the business process and not embedded into the organisation’s processes or thinking. The research for this paper was undertaken in order to identify what needs to be done to ensure that businesses of this type find the risk assessment process valuable in practice. Lizzie Coles-Kemp is a postgraduate research student in Computer Science and Richard E. Overill is a Senior Lecturer in Computer Science.
996.
Prior knowledge of input–output problems often leads to supervised learning restrictions that can hamper the multi-layered perceptron’s (MLP) capacity to find an optimal solution. Restrictions such as fixing weights and modifying input variables may influence the potential convergence of the back-propagation algorithm. This paper shows mathematically how to handle such constraints in order to obtain a modified version of the traditional MLP capable of solving targeted problems. More specifically, it is shown that fixing particular weights according to prior information, as well as transforming incoming inputs, can enable the user to limit the MLP search to a desired type of solution. The ensuing modifications to the learning algorithm are established. Moreover, four supervised improvements offer insight into how to control the convergence of the weights towards an optimal solution. Finally, applications involving packing and covering problems are used to illustrate the potential and performance of this modified MLP.
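The weight-fixing constraint can be sketched with a gradient mask: weights pinned by prior knowledge receive zero gradient, so back-propagation updates only the free weights. This is a minimal numpy illustration of the mechanism, not the paper's formulation; the architecture, pinned value and data are assumptions.

```python
# Minimal sketch of freezing selected MLP weights during back-propagation:
# a mask zeroes the gradient on weights fixed by prior knowledge.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)   # toy target

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

# Assumed prior knowledge for illustration: pin W1[0, 0] at a fixed value.
W1[0, 0] = 1.0
mask = np.ones_like(W1); mask[0, 0] = 0.0   # zero gradient => weight never moves

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
for _ in range(500):
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    dL = (p - y) / len(X)                   # gradient of mean cross-entropy loss
    dW2 = h.T @ dL; db2 = dL.sum(0)
    dh = (dL @ W2.T) * (1 - h ** 2)         # back-propagate through tanh
    dW1 = (X.T @ dh) * mask                 # masked update: pinned weight untouched
    db1 = dh.sum(0)
    W2 -= 0.5 * dW2; b2 -= 0.5 * db2
    W1 -= 0.5 * dW1; b1 -= 0.5 * db1
```

After training, `W1[0, 0]` still holds its pinned value while the remaining weights have adapted, which is exactly the restricted search the abstract describes.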
997.
Silence resides in the gaps between the known islands of explicit knowledge. Rather than expecting to build systems with complete information, we take a human-centred approach. Individual citizens need to be active, engage in dialogue and be aware of the importance of tacit knowledge. As societies, we recognise the incompleteness and inconsistency of our discourse.
998.
Smooth relevance vector machine: a smoothness prior extension of the RVM
Enforcing sparsity constraints has been shown to be an effective and efficient way to obtain state-of-the-art results in regression and classification tasks. Unlike the support vector machine (SVM), the relevance vector machine (RVM) explicitly encodes the criterion of model sparsity as a prior over the model weights. However, the lack of an explicit prior structure over the weight variances means that the degree of sparsity is to a large extent controlled by the choice of kernel (and kernel parameters). This can lead to severe overfitting or oversmoothing, possibly even both at the same time (e.g. for the multiscale Doppler data). We detail an efficient scheme to control sparsity in Bayesian regression by incorporating a flexible noise-dependent smoothness prior into the RVM. We present an empirical evaluation of the effects of the choice of prior structure on a selection of popular data sets and elucidate the link between Bayesian wavelet shrinkage and RVM regression. Our model encompasses the original RVM as a special case, but our empirical results show that we can surpass RVM performance in terms of goodness of fit and achieved sparsity, as well as computational performance, in many cases. The code is freely available. Action Editor: Dale Schuurmans.
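For context, the original RVM that this work extends prunes basis functions by re-estimating one precision hyperparameter per weight: weights whose precision diverges are driven to zero. The sketch below shows those standard update equations on toy data (kernel width, noise level and thresholds are illustrative assumptions; the paper's smoothness prior is not included).

```python
# Sketch of the standard RVM hyperparameter re-estimation loop (the baseline
# the paper extends). Large alpha_i => weight i is effectively pruned.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 40)
t = np.sinc(3 * x) + 0.05 * rng.normal(size=40)        # noisy toy targets

Phi = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.1)    # RBF design matrix
alpha = np.ones(40)           # one precision hyperparameter per weight
beta = 1.0 / 0.05 ** 2        # assumed known noise precision

for _ in range(100):
    A = np.diag(alpha)
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + A)      # posterior covariance
    mu = beta * Sigma @ Phi.T @ t                      # posterior mean weights
    gamma = 1.0 - alpha * np.diag(Sigma)               # "well-determinedness"
    alpha = np.clip(gamma / (mu ** 2 + 1e-12), 0, 1e6) # re-estimate precisions

pruned = (alpha > 1e3).sum()   # basis functions driven towards zero weight
```

The fit uses only the surviving basis functions; the abstract's point is that *which* ones survive depends heavily on the kernel width chosen above, which is what the smoothness prior is designed to control.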
999.
We present a methodology for managing outsourcing projects from the vendor's perspective, designed to maximize the value to both the vendor and its clients. The methodology is applicable across the outsourcing lifecycle, providing the capability to select and target new clients, manage the existing client portfolio and quantify the realized benefits to the client resulting from the outsourcing agreement. Specifically, we develop a statistical analysis framework to model client behavior at each stage of the outsourcing lifecycle, including: (1) a predictive model and tool for white space client targeting and selection (opportunity identification), (2) a model and tool for client risk assessment and project portfolio management (client tracking), and (3) a systematic analysis of outsourcing results (impact analysis) to gain insights into the potential benefits of IT outsourcing as part of a successful management strategy. Our analysis is formulated in a logistic regression framework, modified to allow for non-linear input–output relationships, auxiliary variables, and small sample sizes. We provide examples to illustrate how the methodology has been successfully implemented for targeting, tracking, and assessing outsourcing clients within the IBM Global Services division.
Scope and purpose: The predominant literature on IT outsourcing often examines various aspects of the vendor–client relationship, strategies for successful outsourcing from the client perspective, and key sources of risk to the client, generally ignoring the risk to the vendor. However, in a rapidly changing market, a significant share of risks and responsibilities falls on the vendor, as outsourcing contracts are often renegotiated, providers replaced, or services brought back in house. With the transformation of outsourcing engagements, the risk on the vendor's side has increased substantially, driving the vendor's financial and business performance and eventually affecting the value delivered to the client. As a result, only well-run vendor firms with robust processes and tools that allow identification and active management of risk at all stages of the outsourcing lifecycle are able to deliver value to the client. This paper presents a framework and methodology for managing a portfolio of outsourcing projects from the vendor's perspective, throughout the entire outsourcing lifecycle. We address three key stages of the outsourcing process: (1) opportunity identification and qualification (i.e. selection of the most likely new clients), (2) client portfolio risk management during engagement and delivery, and (3) quantification of benefits to the client throughout the life of the deal.
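The logistic-regression backbone of such a targeting model can be sketched on synthetic data: fit a sigmoid probability of "client signs a deal" to a few features and score prospects by predicted probability. The features, data and scoring rule are illustrative assumptions; the paper's extensions for non-linearity and small samples are omitted.

```python
# Toy sketch of logistic regression for client targeting (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
# Assumed features per prospect, e.g. standardised client size and prior IT spend.
X = rng.normal(size=(200, 2))
true_w = np.array([1.5, -1.0])                     # synthetic ground truth
y = (1 / (1 + np.exp(-(X @ true_w))) > rng.uniform(size=200)).astype(float)

w = np.zeros(2); b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))             # predicted sign-up probability
    g = p - y                                      # gradient of log-loss wrt logits
    w -= 0.1 * (X.T @ g) / len(X)
    b -= 0.1 * g.mean()

def score(client):
    """Predicted probability that a prospective client signs a deal."""
    return float(1 / (1 + np.exp(-(client @ w + b))))
```

Ranking prospects by `score` gives the "white space" targeting list; the same fitted model, applied over time, supports the tracking and impact-analysis stages.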
1000.
This paper presents a new class of estimators for speech enhancement in the discrete Fourier transform (DFT) domain, where we consider a multidimensional normal inverse Gaussian (MNIG) distribution for the speech DFT coefficients. The MNIG distribution can model a wide range of processes, from heavy-tailed to less heavy-tailed. Under the MNIG distribution, complex DFT and amplitude estimators are derived. In contrast to other estimators, the suppression characteristics of the MNIG-based estimators can be adapted online to the underlying distribution of the speech DFT coefficients. Compared to noise suppression algorithms based on preselected super-Gaussian distributions, the MNIG-based complex DFT and amplitude estimators lead to a performance improvement in terms of segmental signal-to-noise ratio (SNR) of 0.3 to 0.6 dB and 0.2 to 0.6 dB, respectively.
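The DFT-domain pipeline such estimators plug into can be sketched with a much simpler stand-in gain. The skeleton below uses a plain Wiener gain per frequency bin, not the MNIG-based estimators (which require the MNIG posterior); the signal, noise level and SNR estimate are illustrative assumptions.

```python
# Generic DFT-domain suppression skeleton with a Wiener gain standing in
# for the paper's MNIG-based estimators: frame -> DFT -> per-bin gain -> iDFT.
import numpy as np

rng = np.random.default_rng(3)
n = 256
clean = np.sin(2 * np.pi * 8 * np.arange(n) / n)       # toy "speech" frame
noise = 0.3 * rng.normal(size=n)
noisy = clean + noise

S = np.fft.rfft(noisy)
noise_psd = 0.3 ** 2 * n                               # assumed known flat noise PSD
snr_prio = np.maximum(np.abs(S) ** 2 / noise_psd - 1, 0)   # crude a priori SNR
gain = snr_prio / (snr_prio + 1)                       # Wiener gain per bin
enhanced = np.fft.irfft(gain * S, n)                   # back to the time domain
```

The MNIG estimators of the paper replace the fixed Wiener gain with a suppression rule derived from the MNIG posterior, which is what lets the suppression adapt to the speech coefficient distribution.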