By access type:
  Paid full text: 1,484 articles
  Free: 63 articles
  Free (domestic): 2 articles
By subject area:
  Electrical engineering: 4 articles
  Chemical industry: 304 articles
  Metalworking: 13 articles
  Machinery and instrumentation: 20 articles
  Building science: 106 articles
  Mining engineering: 5 articles
  Energy and power: 18 articles
  Light industry: 195 articles
  Hydraulic engineering: 14 articles
  Petroleum and natural gas: 8 articles
  Radio and electronics: 86 articles
  General industrial technology: 244 articles
  Metallurgical industry: 314 articles
  Atomic energy technology: 1 article
  Automation technology: 217 articles
By publication year:
  2023: 18 articles
  2022: 39 articles
  2021: 59 articles
  2020: 30 articles
  2019: 51 articles
  2018: 29 articles
  2017: 45 articles
  2016: 54 articles
  2015: 47 articles
  2014: 58 articles
  2013: 113 articles
  2012: 70 articles
  2011: 113 articles
  2010: 94 articles
  2009: 79 articles
  2008: 101 articles
  2007: 66 articles
  2006: 58 articles
  2005: 50 articles
  2004: 42 articles
  2003: 51 articles
  2002: 41 articles
  2001: 18 articles
  2000: 16 articles
  1999: 15 articles
  1998: 18 articles
  1997: 17 articles
  1996: 15 articles
  1995: 12 articles
  1994: 13 articles
  1993: 10 articles
  1992: 8 articles
  1991: 5 articles
  1990: 7 articles
  1989: 4 articles
  1988: 6 articles
  1987: 5 articles
  1986: 6 articles
  1985: 4 articles
  1984: 9 articles
  1983: 7 articles
  1982: 4 articles
  1981: 4 articles
  1980: 4 articles
  1979: 4 articles
  1978: 6 articles
  1977: 5 articles
  1976: 5 articles
  1966: 2 articles
  1963: 2 articles
A total of 1,549 results were retrieved (search time: 12 ms).
101.
Although many recent systems have been built to support Information Capture and Retrieval (ICR), these have generally not been successful. This paper presents studies that evaluate two different hypotheses for this failure: first, that systems fail to address user needs, and second, that they provide only rudimentary support for ICR. After presenting a taxonomy of different systems built to support ICR, we describe a study that attempts to identify user needs for ICR. On the basis of that study, we conducted two user-oriented evaluations. In the first, a task-based evaluation of a state-of-the-art ICR system, we found that the system failed to provide users with abstract ways to view meetings data and did not present users with information categories that they considered important. In the second study, we introduce a new method for the comparative evaluation of different techniques for accessing meetings data; this study showed that simple interface techniques that extracted key information from meetings were effective in allowing users to extract the gist of meetings data. We conclude with a discussion of outstanding issues and future directions for ICR research.
102.
103.
Both physicians and patients often take for granted that a given physician is qualified to treat patients of certain ages and with certain conditions. Neurological conditions are diagnosed and treated by neurologists, pediatric neurologists, and neurosurgeons. What is the difference, and why does it matter? It is a balancing act of risk and benefit, and the physician has an obligation to know her or his limitations. Telemedicine may play a key role in enabling greater access to pediatric neurologists; if telemedicine becomes more widely available, access may become possible even in remote areas without child neurologists. As this article discusses, there may be certain limitations to using telemedicine, which is distinguishable from telehealth, in relation to pediatric neurology. This article examines which type of neurological physician is most appropriate, the availability of and access to pediatric neurologists, and the applicability of telemedicine to the diagnosis and treatment of pediatric neurological conditions.
104.
105.
A linear program was developed to help seismically active communities decide: (1) how much to spend on pre-earthquake mitigation that aims to reduce future losses versus waiting until after an event and paying for reconstruction, and (2) which of the many possible mitigation activities to fund so as to minimize overall risk. The mitigation alternatives considered are structural upgrading policies for groups of buildings. The benefits of mitigation are the losses avoided in future earthquakes, including structural, non-structural, contents, and time-related losses, and casualties. The model is intended to be used as a tool to support the public regional mitigation planning process. In realistic applications, the model includes millions of variables, thus requiring a special solution method. This paper focuses on two efficient solution algorithms for the model: a Dantzig–Wolfe decomposition algorithm and a greedy heuristic algorithm. A comprehensive numerical study compares the two algorithms in terms of solution quality and solution time. The study shows that, compared to the Dantzig–Wolfe algorithm, the heuristic algorithm is much faster, as expected, and provides comparable solution quality.
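The paper's full model and its Dantzig–Wolfe decomposition are not reproduced in the abstract. As a rough, hypothetical illustration of the greedy idea only (ranking candidate building-group upgrades by expected loss avoided per dollar and funding them until a mitigation budget is exhausted), here is a minimal Python sketch; the option names, costs, and benefit figures are invented, and this is not the authors' algorithm.

```python
from dataclasses import dataclass

@dataclass
class UpgradeOption:
    """One candidate mitigation action: upgrade a group of buildings (illustrative only)."""
    name: str
    cost: float                   # upfront mitigation cost
    expected_loss_avoided: float  # discounted expected future losses avoided

def greedy_mitigation_plan(options, budget):
    """Pick upgrade options by benefit-cost ratio until the budget is spent.

    A generic knapsack-style greedy sketch, not the paper's heuristic.
    """
    plan, spent = [], 0.0
    for opt in sorted(options, key=lambda o: o.expected_loss_avoided / o.cost,
                      reverse=True):
        if spent + opt.cost <= budget:
            plan.append(opt)
            spent += opt.cost
    return plan, spent

if __name__ == "__main__":
    # Hypothetical candidate policies with made-up numbers.
    candidates = [
        UpgradeOption("retrofit unreinforced masonry", 40.0, 120.0),
        UpgradeOption("brace soft-story apartments", 25.0, 90.0),
        UpgradeOption("upgrade pre-code concrete frames", 60.0, 110.0),
    ]
    chosen, spent = greedy_mitigation_plan(candidates, budget=70.0)
    print([c.name for c in chosen], spent)
```

In the paper's setting, the benefit term would aggregate structural, non-structural, contents, time-related, and casualty losses avoided across future earthquake scenarios rather than a single number per option.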
106.
107.
Electron tomography (ET) combines electron microscopy and the principles of tomographic imaging in order to reconstruct the three-dimensional structure of complex biological specimens at molecular resolution. Weighted back-projection (WBP) has long been the method of choice since the reconstructions are very fast. It is well known that iterative methods produce better images, but at a very costly time penalty. In this work, it is shown that efficient parallel implementations of iterative methods, based primarily on data decomposition, can speed up such methods to an extent that they become viable alternatives to WBP. Precomputation of the coefficient matrix has also turned out to be important to substantially improve the performance regardless of the number of processors used. Matrix precomputation has made it possible to speed up the block-iterative component averaging (BICAV) algorithm, which has been studied before in the context of computerized tomography (CT) and ET, by a factor of more than 3.7. Component-averaged row projections (CARP) is a recently introduced block-parallel algorithm, which was shown to be a robust method for solving sparse systems arising from partial differential equations. It is shown that this algorithm is also suitable for single-axis ET, and is advantageous over BICAV both in terms of runtime and image quality. The experiments were carried out on several datasets of ET of various sizes, using the blob model for representing the reconstructed object.
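For intuition only, the following is a minimal, serial sketch of a CARP-style block-iterative solver for a linear system Ax = b: rows are split into blocks, each block runs an independent Kaczmarz sweep from the current iterate, and the block results are component-averaged over the blocks that actually use each component. It omits everything that matters for ET in practice (the blob basis, the precomputed projection matrix, and real parallel execution) and is not the authors' implementation.

```python
import numpy as np

def kaczmarz_sweep(A_block, b_block, x, relax=1.0):
    """One sequential Kaczmarz sweep over the rows of a block."""
    x = x.copy()
    for a_i, b_i in zip(A_block, b_block):
        norm2 = a_i @ a_i
        if norm2 > 0:
            x += relax * (b_i - a_i @ x) / norm2 * a_i
    return x

def carp_like_solve(A, b, n_blocks=4, iters=200, relax=1.0):
    """Toy component-averaged row projections (CARP-style) for A x = b.

    A real implementation would run the block sweeps in parallel and work
    with a sparse, precomputed projection matrix.
    """
    m, n = A.shape
    x = np.zeros(n)
    row_blocks = np.array_split(np.arange(m), n_blocks)
    # touches[k, j] is True if block k has a row with a nonzero coefficient for x_j
    touches = np.array([[np.any(A[rows][:, j] != 0) for j in range(n)]
                        for rows in row_blocks])
    s = np.maximum(touches.sum(axis=0), 1)  # number of blocks touching each component
    for _ in range(iters):
        block_results = np.array([kaczmarz_sweep(A[rows], b[rows], x, relax)
                                  for rows in row_blocks])
        # Component averaging over touching blocks; untouched components keep their value.
        avg = (block_results * touches).sum(axis=0) / s
        x = np.where(touches.any(axis=0), avg, x)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(40, 10))      # stand-in for a small projection system
    x_true = rng.normal(size=10)
    b = A @ x_true
    x_est = carp_like_solve(A, b)
    print("reconstruction error:", np.linalg.norm(x_est - x_true))
```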
108.
This research explores the reliance behaviours of decision-makers using a decision aid. Objective and subjective task characteristics, in the form of task complexity and task difficulty respectively, are examined, along with the effect of the individual characteristic of expertise. A total of 130 subjects (65 novices and 65 experienced practitioners) completed a lab experiment using a decision aid (Insolve-DG) to help them make decisions for two insolvency tasks with differing levels of complexity. The research finds that the objective task characteristic (task complexity) and the individual characteristic (expertise) both affect reliance behaviours; however, their effects are fully mediated by the subjective task characteristic (task difficulty). Expertise and task complexity are both associated with the degree of task difficulty experienced by an individual user: increasing task complexity increases task difficulty, and increasing expertise reduces task difficulty. Task difficulty and task complexity are established as distinct constructs, and, importantly, it is task difficulty, not task complexity, that ultimately affects reliance.
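As a hypothetical illustration of what "fully mediated" means here, the sketch below simulates data in which task complexity and expertise influence reliance only through task difficulty, then runs the usual regression steps: the total effects are non-zero, the predictors affect the mediator, and the direct effects shrink toward zero once difficulty is controlled for. The coefficients and data are invented; this is not the study's analysis.

```python
import numpy as np

def ols(y, X):
    """Ordinary least squares with an intercept; returns all coefficients."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
n = 130  # the study used 130 subjects; the data here are simulated, not real
complexity = rng.integers(0, 2, n).astype(float)  # low/high task complexity
expertise = rng.integers(0, 2, n).astype(float)   # novice/experienced
# Full mediation by construction: both predictors act on reliance only via difficulty.
difficulty = 0.8 * complexity - 0.6 * expertise + rng.normal(0, 0.5, n)
reliance = 1.2 * difficulty + rng.normal(0, 0.5, n)

# Step 1: total effects of complexity and expertise on reliance
print("total effects: ", ols(reliance, np.column_stack([complexity, expertise]))[1:])
# Step 2: effects of complexity and expertise on the mediator (task difficulty)
print("on mediator:   ", ols(difficulty, np.column_stack([complexity, expertise]))[1:])
# Step 3: direct effects are near zero once difficulty is included (full mediation)
print("direct effects:", ols(reliance, np.column_stack([complexity, expertise, difficulty]))[1:3])
```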
109.
This study is an empirical investigation of problematic instant messaging (IM) use among university students in Singapore. It adapts Caplan's (2005) theoretical framework of problematic Internet use (PIU) to the context of problematic IM use by linking pre-existing human dispositions to cognitive-behavioral symptoms and negative outcomes of improper IM use. Four new factors (oral communication apprehension, polychronicity, perceived inconvenience of using offline communication means, and trait procrastination) were tested as predictors of problematic IM use. The results provided strong support for Caplan's theoretical framework of PIU and indicated that oral communication apprehension and perceived inconvenience of using offline means were significant predictors of problematic IM use, whereas polychronicity and trait procrastination were not. The implications of these findings are discussed.
110.
Peers in a peer-to-peer data management system often have heterogeneous schemas and no mediated global schema. To translate queries across peers, we assume each peer provides correspondences between its schema and a small number of other peer schemas. We focus on query reformulation in the presence of heterogeneous XML schemas, including data–metadata conflicts. We develop an algorithm for inferring precise mapping rules from informal schema correspondences. We define the semantics of query answering in this setting and develop a query translation algorithm. Our translation handles an expressive fragment of XQuery and works both along and against the direction of the mapping rules. We describe the HePToX heterogeneous P2P XML data management system, which incorporates our results. We report the results of extensive experiments on HePToX on both synthetic and real datasets, and demonstrate the system's utility and scalability under different P2P distributions.
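HePToX's mapping rules and XQuery translation are far richer than this, but as a toy illustration of the basic idea (rewriting a query posed against one peer's schema into another peer's schema via path correspondences), here is a minimal sketch; the element paths and the rule format are invented for illustration.

```python
# Toy path-level query reformulation between two peer schemas.
# The paths below are hypothetical; HePToX infers precise mapping rules
# from informal correspondences and translates full XQuery, not just paths.

MAPPING_RULES = {
    "/hospital/patient/name": "/clinic/record/patientName",
    "/hospital/patient/diagnosis": "/clinic/record/dx/code",
}

def reformulate(path_query: str) -> str:
    """Rewrite a source-schema path query using the longest matching correspondence."""
    for src in sorted(MAPPING_RULES, key=len, reverse=True):
        if path_query.startswith(src):
            return MAPPING_RULES[src] + path_query[len(src):]
    raise ValueError(f"no correspondence covers {path_query!r}")

print(reformulate("/hospital/patient/name"))
print(reformulate("/hospital/patient/diagnosis/text()"))
```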