21.
We propose and study a new type of location optimization problem, the min-dist location selection problem: given a set of clients and a set of existing facilities, we select a location from a given set of potential locations for establishing a new facility, so that the average distance between a client and her nearest facility is minimized. The problem has a wide range of applications in urban development simulation, massively multiplayer online games, and decision support systems. We also investigate a variant of the problem where we consider replacing (instead of adding) a facility while achieving the same optimization goal; we call this variant the min-dist facility replacement problem. We explore two common approaches to location optimization problems and present methods based on those approaches for solving the min-dist location selection problem. However, those methods either need to maintain an extra index or fall short in efficiency. To address their drawbacks, we propose a novel method (named MND), which performs nearly as well as the fastest method but does not need an extra index. We then apply the key idea behind MND to the min-dist facility replacement problem, which results in two algorithms named MSND and RID. We provide a detailed comparative cost analysis and conduct extensive experiments on the various algorithms. The results show that MND and RID outperform their competitors by orders of magnitude.
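To make the objective concrete, here is a minimal brute-force sketch of min-dist location selection: for each candidate location, it recomputes the average client-to-nearest-facility distance and keeps the best candidate. This only illustrates the problem definition under assumed toy data; it is not the MND algorithm, and all coordinates and helper names are hypothetical.

```python
# Brute-force illustration of the min-dist location selection objective.
# Not the MND method; all data and names are hypothetical.
from math import hypot

def min_dist_location_selection(clients, facilities, candidates):
    """Return the candidate that minimizes the average
    client-to-nearest-facility distance after adding it."""
    def nearest_dist(client, sites):
        return min(hypot(client[0] - s[0], client[1] - s[1]) for s in sites)

    best_candidate, best_avg = None, float("inf")
    for c in candidates:
        total = sum(nearest_dist(p, facilities + [c]) for p in clients)
        avg = total / len(clients)
        if avg < best_avg:
            best_candidate, best_avg = c, avg
    return best_candidate, best_avg

# Example usage with toy coordinates.
clients = [(0, 0), (4, 0), (10, 2)]
facilities = [(5, 5)]
candidates = [(0, 1), (9, 1)]
print(min_dist_location_selection(clients, facilities, candidates))
```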
22.
Some volatile acids were found to dramatically reduce the effectiveness of additives based on 2,2,6,6-tetramethylpiperidine as photostabilizers for polypropylene films. Strong acids such as HCl, HBr, and HNO3 had the largest effect, with sulfurous acid somewhat less detrimental. Weak organic acids did not impair the effectiveness of the hindered amine light stabilizers. The roles of acid concentration and contact time were explored for the HCl–piperidyl additive system. Secondary and tertiary amines were included in the study, as well as oligomeric additives and an N-oxyl derivative. The latter is less basic than the free amines and was correspondingly less affected by acid exposures. The possibilities for acid exposure during the compounding, fabrication, and use of stabilized polyolefin articles are discussed, as well as the effects of acids in terms of proposed stabilization mechanisms for the hindered amines.
23.
The formulation and control of margarine oils and margarines is based on an understanding of the relation between various physical measurements and the composition of the oils and margarines. Solid-to-liquid-fat ratios are determined by dilatometry or by nuclear magnetic resonance spectroscopy. Oils are chosen for their crystal habit under conditions of processing and finishing. Some margarine test methods involve appearance, oral melting characteristics, oil-off, slump or collapse, get-away, penetration, and spreadability. Many measurements are effective only when they describe conditions over a range of temperatures; these include dilatometry and consistency determinations, which require multipoint measurements. Presented at the American Oil Chemists' Society Short Course on "Processing and Quality Control of Fats and Oils," Aug. 29–Sept. 1, 1966, Michigan State University, East Lansing, Mich.
24.
Proteomic analysis of cerebrospinal fluid (CSF) holds great promise for understanding the progression of neurodegenerative diseases, including Alzheimer's disease (AD). As one of the primary reservoirs of neuronal biomolecules, CSF provides a window into the biochemical and cellular aspects of the neurological environment. CSF can be drawn from living participants, allowing the potential alignment of clinical changes with these biochemical markers. Using cutting-edge mass spectrometry technologies, we perform a streamlined proteomic analysis of CSF. We quantify greater than 700 proteins across 10 pairs of age- and sex-matched participants in approximately one hour of analysis time each. Using the paired participant study structure, we identify a small group of biologically relevant proteins that show substantial changes in abundance between cognitively normal and AD participants, which were then analyzed at the peptide level using parallel reaction monitoring experiments. Our findings suggest the utility of fractionating a single sample and using matching to increase proteomic depth in cerebrospinal fluid, as well as the potential power of an expanded study.
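As a hedged illustration of how a paired study design can flag abundance changes, the sketch below runs a per-protein paired t-test on log-transformed intensities for matched control/AD participants. The protein labels and intensity values are synthetic placeholders, and this is not the authors' actual quantification or parallel reaction monitoring workflow.

```python
# Illustrative paired differential-abundance check on synthetic data.
# Not the authors' pipeline; protein names and intensities are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
proteins = ["ProteinA", "ProteinB", "ProteinC"]        # hypothetical labels
control = rng.lognormal(mean=10, sigma=0.3, size=(len(proteins), 10))
ad = control * rng.lognormal(mean=0.2, sigma=0.1, size=control.shape)

for name, ctrl_row, ad_row in zip(proteins, control, ad):
    log2_fc = np.mean(np.log2(ad_row) - np.log2(ctrl_row))   # paired fold change
    t_stat, p_value = stats.ttest_rel(np.log2(ad_row), np.log2(ctrl_row))
    print(f"{name}: log2FC={log2_fc:+.2f}, p={p_value:.3f}")
```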
25.
26.
Color quantization is a common image processing technique where full color images are to be displayed using a limited palette of colors. The choice of a good palette is crucial, as it directly determines the quality of the resulting image. Standard quantization approaches aim to minimize the mean squared error (MSE) between the original and the quantized image, which does not correspond well to how humans perceive image differences. In this article, we introduce a color quantization algorithm that combines an optimization scheme with an image quality metric that mimics the human visual system. Rather than minimizing the MSE, its objective is to maximize image fidelity as evaluated by S-CIELAB, an image quality metric that has been shown to work well for various image processing tasks. In particular, we employ a variant of simulated annealing with the objective function describing the S-CIELAB image quality of the quantized image compared with its original. Experimental results based on a set of standard images demonstrate the superiority of our approach in terms of achieved image quality.
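A minimal sketch of the general scheme follows: simulated annealing over the palette, accepting worse palettes with a temperature-dependent probability. It substitutes negative MSE as a stand-in for the S-CIELAB fidelity score (computing S-CIELAB is beyond this snippet), and helper names such as quantize() and anneal_palette() are hypothetical, not the paper's implementation.

```python
# Simulated-annealing palette optimization, with a stand-in fidelity score.
import numpy as np

def quantize(image, palette):
    """Map each pixel to its nearest palette color (Euclidean distance in RGB)."""
    dists = np.linalg.norm(image[:, :, None, :] - palette[None, None, :, :], axis=-1)
    return palette[np.argmin(dists, axis=-1)]

def quality(original, quantized):
    """Stand-in fidelity score (negative MSE); the paper maximizes S-CIELAB fidelity."""
    return -np.mean((original - quantized) ** 2)

def anneal_palette(image, n_colors=16, steps=2000, t0=1.0, cooling=0.995, seed=0):
    rng = np.random.default_rng(seed)
    palette = rng.uniform(0, 255, size=(n_colors, 3))
    score = quality(image, quantize(image, palette))
    temp = t0
    for _ in range(steps):
        candidate = palette.copy()
        i = rng.integers(n_colors)
        candidate[i] = np.clip(candidate[i] + rng.normal(0, 10, 3), 0, 255)
        cand_score = quality(image, quantize(image, candidate))
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if cand_score > score or rng.random() < np.exp((cand_score - score) / temp):
            palette, score = candidate, cand_score
        temp *= cooling
    return palette
```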
27.
The purpose of the paper is to illuminate the costs and benefits of crossing firm boundaries in inbound open innovation (OI) by determining the relationships among partner types, knowledge content, and performance. The empirical part of the study is based on a survey of OI collaborations answered by R&D managers in 415 Italian, Finnish, and Swedish firms. The results show that the depth of collaboration with different partners (academic/consultants, value chain partners, competitors, and firms in other industries) is positively related to innovation performance, whereas the number of different partners and size have negative effects. The main result is that the knowledge content of the collaboration moderates the performance outcomes and the negative impact of having too many different kinds of partners. This illustrates how successful firms use selective collaboration strategies, characterized by linking explorative and exploitative knowledge content to specific partners, to leverage the benefits and limit the costs of knowledge boundary-crossing processes.
28.
Getting enough quality sleep is a key part of a healthy lifestyle. Many people track their sleep through mobile and wearable technology, together with contextual information that may influence sleep quality, such as exercise, diet, and stress. However, there is limited support to help people make sense of this wealth of data, i.e., to explore the relationship between sleep data and contextual data. We strive to bridge this gap between sleep-tracking and sense-making through the design of SleepExplorer, a web-based tool that helps individuals understand sleep quality through multi-dimensional sleep structure and explore correlations between sleep data and contextual information. Based on a two-week field study with 12 participants, this paper offers a rich understanding of how technology can support sense-making of personal sleep data: SleepExplorer organizes a flux of sleep data into sleep structure, guides sleep-tracking activities, highlights connections between sleep and contributing factors, and supports individuals in taking action. We discuss challenges and opportunities to inform the work of researchers and designers creating data-driven health and well-being applications.
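As a simple illustration of the kind of sleep-versus-context connection such a tool can surface, the sketch below computes Pearson correlations between nightly deep-sleep duration and a few contextual factors. The column names and values are hypothetical and do not reflect SleepExplorer's actual data model.

```python
# Toy correlation of a sleep metric with contextual factors (hypothetical data).
import pandas as pd

nights = pd.DataFrame({
    "deep_sleep_min": [55, 70, 40, 80, 60, 35, 75],
    "exercise_min":   [20, 45,  0, 60, 30,  0, 50],
    "caffeine_mg":    [200, 100, 300, 50, 150, 350, 80],
    "stress_level":   [3, 2, 4, 1, 3, 5, 2],
})

# Pearson correlation of each contextual factor with deep sleep duration.
correlations = nights.corr()["deep_sleep_min"].drop("deep_sleep_min")
print(correlations.sort_values())
```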
29.
In this article, two field experiments conducted in an automotive assembly plant evaluate how computer-based training of operational sequences and related quality information can support the assembly performance of operators. The experiments were performed during the launch of a new vehicle. Learning progress and quality performance were compared between a reference group of operators that had only regular training and a test group for which some of the regular training was replaced with individual computer-based training. Both quantitative measures of quality output and questionnaires and observations were used to evaluate the effects of computer-based training. The results show a clear positive difference in learning progress and improvements in quality output for the test group compared with the reference group. This, combined with the positive attitudes expressed by the operators and their team leaders, shows that this type of training is an effective way to train operators during launches of new vehicles in automotive production.
30.
Matrix models are ubiquitous for constraint problems. Many such problems have a matrix of variables $\mathcal{M}$, with the same constraint C defined by a finite-state automaton $\mathcal{A}$ on each row of $\mathcal{M}$ and a global cardinality constraint $\mathit{gcc}$ on each column of $\mathcal{M}$. We give two methods for deriving, by double counting, necessary conditions on the cardinality variables of the $\mathit{gcc}$ constraints from the automaton $\mathcal{A}$. The first method yields linear necessary conditions and simple arithmetic constraints. The second method introduces the cardinality automaton, which abstracts the overall behaviour of all the row automata and can be encoded by a set of linear constraints. We also provide a domain consistency filtering algorithm for the conjunction of lexicographic ordering constraints between adjacent rows of $\mathcal{M}$ and (possibly different) automaton constraints on the rows. We evaluate the impact of our methods in terms of runtime and search effort on a large set of nurse rostering problem instances.
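The double-counting idea can be illustrated with a toy check: occurrences of each value counted row by row must equal occurrences counted column by column, so the sum of column-gcc cardinalities must fit within the per-row occurrence bounds implied by the row automaton, scaled by the number of rows. The sketch below assumes those per-row bounds are already given (deriving them from the automaton is omitted) and is only an illustrative check, not the paper's filtering algorithm.

```python
# Toy double-counting necessary condition for matrix models with row automata
# and column gcc constraints. Per-row occurrence bounds are assumed given.

def necessary_conditions_ok(n_rows, row_occ_bounds, column_cardinality_sums):
    """row_occ_bounds[v] = (min_occ, max_occ) of value v in any single row.
    column_cardinality_sums[v] = sum over columns of the gcc cardinality of v.
    Counting occurrences of v row-wise and column-wise must agree, so the
    column total must lie within n_rows * [min_occ, max_occ]."""
    for v, (lo, hi) in row_occ_bounds.items():
        total = column_cardinality_sums.get(v, 0)
        if not (n_rows * lo <= total <= n_rows * hi):
            return False
    return True

# Toy example: 5 rows, each row must contain value 1 between 1 and 2 times.
print(necessary_conditions_ok(5, {1: (1, 2)}, {1: 7}))   # True  (5 <= 7 <= 10)
print(necessary_conditions_ok(5, {1: (1, 2)}, {1: 12}))  # False (12 > 10)
```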