Paid full text: 216 articles
Free full text: 1 article
Electrical engineering: 2 articles
Chemical industry: 35 articles
Metalworking: 1 article
Machinery and instrumentation: 11 articles
Building science: 5 articles
Energy and power: 9 articles
Light industry: 17 articles
Radio and electronics: 38 articles
General industrial technology: 33 articles
Metallurgical industry: 12 articles
Atomic energy technology: 8 articles
Automation technology: 46 articles
2022: 3 articles
2021: 9 articles
2020: 5 articles
2019: 4 articles
2018: 5 articles
2017: 2 articles
2016: 4 articles
2015: 2 articles
2014: 12 articles
2013: 9 articles
2012: 17 articles
2011: 18 articles
2010: 6 articles
2009: 16 articles
2008: 7 articles
2007: 8 articles
2006: 7 articles
2005: 4 articles
2004: 4 articles
2003: 2 articles
2002: 3 articles
2001: 1 article
2000: 3 articles
1999: 3 articles
1998: 8 articles
1997: 6 articles
1996: 3 articles
1995: 1 article
1994: 1 article
1993: 2 articles
1992: 2 articles
1990: 3 articles
1989: 2 articles
1988: 3 articles
1987: 5 articles
1986: 2 articles
1985: 2 articles
1984: 4 articles
1982: 1 article
1981: 1 article
1980: 1 article
1979: 3 articles
1977: 2 articles
1976: 2 articles
1975: 2 articles
1973: 1 article
1971: 1 article
1968: 1 article
1967: 1 article
1966: 1 article
A total of 217 results were found (search time: 15 ms).
211.
Today’s IP networks must serve many applications with different needs and goals, and the traditional best-effort approach is not sufficient to provide the varying degrees of Quality of Service (QoS) required by these heterogeneous applications. In previous work, we proposed a user-centric QoS management scheme within a game-theoretic framework. In this paper, our goal is to model user behavior and to study the relationship between network performance, user satisfaction, and choice of service class. Our results show that our user behavior model is realistic in the context of user-controlled QoS: users can obtain satisfactory service by choosing their service classes themselves, and their satisfaction is higher when they can choose classes dynamically rather than through a static allocation. The model also highlights the impact of user actions on network stability.
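A minimal Python sketch of the kind of dynamic class-selection behavior described above, in which each user repeatedly switches to the class that currently maximizes their own satisfaction. The class names, capacities, and utility function are illustrative assumptions, not the paper's game-theoretic model.

```python
# Toy best-response dynamic: users pick the service class that maximizes a
# congestion-discounted quality. All names and parameters are hypothetical.
import random

CLASSES = {"gold": 10, "silver": 20, "best_effort": 40}   # hypothetical capacities

def satisfaction(cls, load):
    """Toy utility: class quality discounted by congestion relative to capacity."""
    quality = {"gold": 1.0, "silver": 0.7, "best_effort": 0.4}[cls]
    return quality / (1.0 + load / CLASSES[cls])

def simulate(n_users=60, rounds=20, seed=0):
    rng = random.Random(seed)
    choice = {u: rng.choice(list(CLASSES)) for u in range(n_users)}
    load = {c: sum(1 for v in choice.values() if v == c) for c in CLASSES}
    for _ in range(rounds):
        for u in range(n_users):   # asynchronous best response: users move one at a time
            load[choice[u]] -= 1
            best = max(CLASSES, key=lambda c: satisfaction(c, load[c] + 1))
            load[best] += 1
            choice[u] = best
    return load

if __name__ == "__main__":
    print(simulate())   # final distribution of users across service classes
```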
212.
Scale adaptation, where authors alter the wording of an already published scale, is a deeply rooted social practice in IS research. This paper argues that the time is ripe to question this activity, as well as the beliefs that have progressively formed around it. We identify and challenge five fallacious scale adaptation beliefs that hinder the development of more robust measure development norms. Contributing to this area of research, we offer a conceptual definition of cognitive validity: the extent to which a scale is free of problematic item characteristics (PICs) that bias the survey response process and subsequent empirical results. Building on this conceptualization, a new methodological process for assessing the cognitive validity of adapted IS measures is introduced. Through a series of three programmatic studies, we find converging evidence that the method can benefit the IS field by making the scale adaptation process more robust, transparent, and consistent. Along with the method, we introduce a new index that IS scholars can use to benchmark the cognitive quality of their scales against venerable IS measures. We discuss the implications of our work for IS research, including detailed implementation guidelines, and provide directions for future research on measurement in IS.
213.
The disambiguation of named entities is a challenge in many fields, such as scientometrics, social networks, record linkage, citation analysis, and the semantic web. Name ambiguities can arise from misspellings, typographical or OCR errors, abbreviations, and omissions, so searching for the names of persons or organizations becomes difficult as soon as a single name can appear in many different forms. This paper proposes two approaches to disambiguating the affiliations of authors of scientific papers in bibliographic databases: the first assumes that a training dataset is available and uses a Naive Bayes model; the second assumes that no learning resource exists and uses a semi-supervised approach mixing soft clustering and Bayesian learning. The results are encouraging, and the approach is already partially applied in a scientific survey department. However, our experiments also highlight a limitation: the approach cannot efficiently process highly unbalanced data. Alternative solutions are possible for future developments, particularly the use of a recent clustering algorithm relying on feature maximization.
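A minimal sketch of the supervised variant described above, framing affiliation disambiguation as Naive Bayes text classification over noisy affiliation strings. The example strings, labels, and character n-gram features are illustrative assumptions and do not reproduce the paper's corpus or its semi-supervised soft-clustering step.

```python
# Illustrative only: Naive Bayes over character n-grams of affiliation strings.
# The training data below is invented; character n-grams are used because they
# are robust to misspellings, abbreviations, and OCR noise.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_strings = [
    "Univ. of Lorraine, Nancy, France",
    "Universite de Lorraine, LORIA, Nancy",
    "MIT CSAIL, Cambridge MA",
    "Massachusetts Inst. of Technology, Cambridge",
]
train_labels = ["lorraine", "lorraine", "mit", "mit"]   # canonical affiliation IDs

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    MultinomialNB(),
)
model.fit(train_strings, train_labels)

# A misspelled variant is still mapped to the right canonical affiliation.
print(model.predict(["Universty of Loraine, Nancy"]))   # -> ['lorraine']
```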
214.
We report the development of a straightforward, sensitive, and quantitative NMR-based method for high-throughput characterization of carbohydrate structure and screening of carbohydrate-active enzyme (CAZyme) specificity. Automated assays, from gene library expression to carbohydrate structure determination directly in crude reaction media, have been established and successfully used to screen a library of 4032 CAZymes obtained by combinatorial engineering, at a rate of 480 enzyme variants per day. This made it possible to accurately discriminate 303 enzyme variants with altered specificity. The results demonstrate the potential of high-throughput NMR technology in glycomics for mining artificial and natural enzyme diversity for novel biocatalysts.
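A generic sketch of how quantitative spectra could be screened in bulk, assuming each variant is judged by integrating a product signal in a fixed chemical-shift window. The ppm window, threshold, and synthetic spectra are assumptions for illustration, not the authors' automated NMR pipeline.

```python
# Generic batch screening sketch: flag variants whose crude-medium spectrum shows
# a product signal above a cutoff. All numeric parameters are hypothetical.
import numpy as np

ppm = np.linspace(0.0, 10.0, 4096)      # chemical-shift axis of each 1D spectrum
PRODUCT_WINDOW = (5.1, 5.4)             # hypothetical product signal region (ppm)
THRESHOLD = 0.1                         # hypothetical integral cutoff

def product_integral(spectrum, ppm_axis, window):
    """Crude integral of the intensity inside the product's chemical-shift window."""
    lo, hi = window
    mask = (ppm_axis >= lo) & (ppm_axis <= hi)
    dx = ppm_axis[1] - ppm_axis[0]
    return float(spectrum[mask].sum() * dx)

def screen(spectra, ppm_axis):
    """Return the indices of variants whose product integral exceeds the cutoff."""
    return [i for i, s in enumerate(spectra)
            if product_integral(s, ppm_axis, PRODUCT_WINDOW) > THRESHOLD]

def gauss(center, width, height=1.0):
    """Synthetic Gaussian line used to build demo spectra."""
    return height * np.exp(-((ppm - center) / width) ** 2)

# Demo: variant 1 shows a product peak inside the window, variant 0 does not.
spectra = [gauss(1.2, 0.05), gauss(1.2, 0.05) + gauss(5.25, 0.03, height=3.0)]
print(screen(spectra, ppm))   # -> [1]
```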
215.
216.
Hexavalent chromium (Cr(VI)) in soils is generally determined using an extraction step that transfers it to the liquid phase, where it is more easily detected and quantified. In this work, the performance of the most common extraction procedure, EPA Method 3060A, which uses NaOH-Na2CO3 solutions, is evaluated using X-ray absorption near-edge structure (XANES) spectroscopy, which enables the quantification of Cr(VI) directly in the solid state. Results obtained with the two methods were compared for three solid samples with different matrices: a soil containing chromite ore processing residue (COPR), a loamy soil, and a paint sludge. The Cr(VI) contents determined by the two methods differ significantly, and EPA Method 3060A underestimated the Cr(VI) content in all studied samples, most markedly for COPR. The low extraction yield of EPA Method 3060A was found to be the main reason, and the Cr(VI) present in COPR was found to be more concentrated in magnetic phases. This work also provides new XANES analyses of SRM 2701 and its extraction residues for the purpose of benchmarking the performance of EPA Method 3060A.
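A brief sketch of linear-combination fitting, a standard way to estimate the Cr(VI) fraction of a normalized XANES spectrum from Cr(III) and Cr(VI) reference spectra. The synthetic references and sample below stand in for measured data; the authors' exact analysis procedure may differ.

```python
# Linear-combination fit of a "measured" spectrum onto two reference end-members
# using non-negative least squares. All spectra here are synthetic placeholders.
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(5980.0, 6060.0, 400)            # eV, around the Cr K-edge

def edge(e0):                                         # toy absorption edge shape
    return 1.0 / (1.0 + np.exp(-(energy - e0) / 1.5))

def pre_edge_peak(center, height):                    # toy Cr(VI) pre-edge feature
    return height * np.exp(-((energy - center) / 1.0) ** 2)

ref_cr3 = edge(6002.0)                                # synthetic Cr(III) reference
ref_cr6 = edge(6005.0) + pre_edge_peak(5993.0, 0.8)   # synthetic Cr(VI) reference

# Synthetic sample: 30 % Cr(VI), 70 % Cr(III), plus noise.
rng = np.random.default_rng(0)
sample = 0.7 * ref_cr3 + 0.3 * ref_cr6 + rng.normal(0, 0.01, energy.size)

A = np.column_stack([ref_cr3, ref_cr6])
coeffs, _ = nnls(A, sample)                           # non-negative mixing weights
fractions = coeffs / coeffs.sum()
print(f"Estimated Cr(VI) fraction: {fractions[1]:.2f}")   # ~0.30
```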
217.
Most graphics cards in standard personal computers are now equipped with several pixel pipelines running shader programs. Taking advantage of this technology by transferring parallel computations from the CPU to the GPU increases the overall computational power, even in non-graphical applications, by freeing the main processor from heavy work. A generic library is presented to show how anyone can benefit from modern hardware by combining various techniques with little hardware-specific programming skill. Its shader implementation is applied to retinal and cortical simulation. The purpose of this sample application is not to provide an accurate approximation of real center-surround ganglion or middle temporal cells, but to illustrate how easily intertwined spatiotemporal filters can be applied to raw input pictures in real time. Requirements and interconnection complexity depend heavily on the vision framework adopted; therefore, several hypotheses that may benefit from such a library are introduced.
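A CPU-side sketch of the kind of center-surround spatiotemporal filtering such shaders implement: a difference of Gaussians for the spatial receptive field followed by a first-order temporal low-pass over frames. The sigma values and smoothing factor are assumptions; this is not the paper's GPU library.

```python
# CPU illustration of a center-surround (difference-of-Gaussians) filter chained
# with a recursive temporal low-pass, applied frame by frame to fake video data.
import numpy as np
from scipy.ndimage import gaussian_filter

def center_surround(frame, sigma_center=1.0, sigma_surround=3.0):
    """Difference-of-Gaussians approximation of a center-surround receptive field."""
    return gaussian_filter(frame, sigma_center) - gaussian_filter(frame, sigma_surround)

def temporal_lowpass(prev_state, frame, alpha=0.3):
    """First-order recursive low-pass filter over successive frames."""
    return alpha * frame + (1.0 - alpha) * prev_state

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frames = rng.random((5, 64, 64))       # fake grayscale video: 5 frames of 64x64
    state = np.zeros((64, 64))
    for frame in frames:                   # spatial filtering, then temporal smoothing
        state = temporal_lowpass(state, center_surround(frame))
    print(state.shape, float(state.mean()))
```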