1.
The authenticity and traceability of meat products are issues of primary importance for ensuring food safety. Unfortunately, food adulteration (e.g. the addition of inexpensive cuts to minced meat products) and mislabelling (e.g. the inclusion of meat from species other than those declared) happen frequently worldwide. The aim of this study was to apply a droplet digital PCR assay for the detection and quantification (copies μL⁻¹) of beef, pork, horse, sheep, chicken and turkey DNA in meat products. The analysis conducted on commercial meat products showed the presence of traces of DNA from animal species other than those declared. We show that the method is highly sensitive, specific and accurate (accuracy = 100%). This method could be adopted by competent food safety authorities to verify compliance with the labelling of meat products and to ensure quality and safety throughout the meat supply chain, from primary production to consumption.
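Droplet digital PCR quantification in copies μL⁻¹ rests on Poisson statistics over the positive/negative droplet counts. A minimal sketch of that standard calculation is below; the function name, the example counts, and the nominal droplet volume of 0.85 nL are illustrative assumptions, not values from the study:

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
    """Estimate target concentration (copies per uL) from droplet counts.

    Poisson model: lambda = -ln(fraction of negative droplets) is the mean
    number of copies per droplet; dividing by the droplet volume (in uL)
    gives copies per uL of the partitioned reaction.
    """
    n_negative = n_total - n_positive
    if n_negative <= 0:
        raise ValueError("all droplets positive: target too concentrated to quantify")
    lam = -math.log(n_negative / n_total)  # mean copies per droplet
    return lam / droplet_volume_ul

# e.g. 4,000 positive droplets out of 18,000 (hypothetical counts)
conc = ddpcr_concentration(4000, 18000)
```

With no positive droplets the estimate is zero; saturation (all droplets positive) is rejected because the Poisson estimator diverges there.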
3.
We consider the problem of generating a large state space in a distributed fashion. Unlike previously proposed solutions that partition the set of reachable states according to a hashing function provided by the user, we explore heuristic methods that completely automate the process. The first step is an initial random walk through the state space to initialize a search tree, which is duplicated in each processor. The reachability graph is then built in a distributed way, using the search tree to assign each newly found state to a class, with classes mapped to the available processors. Furthermore, we explore two remapping criteria that attempt to balance memory usage and future workload, respectively. We show how the cost of computing the global snapshot required for remapping scales for system sizes in the foreseeable future. An extensive set of results is presented to support our conclusion that remapping is extremely beneficial.
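The scheme above has two moving parts: a search tree (replicated on every processor) that maps a state to a class, and a class-to-processor table that can be remapped to rebalance load. A hypothetical sketch follows; the class name, the sorted-pivot stand-in for the search tree, and the greedy memory-balancing remap are all my assumptions, not the paper's algorithm:

```python
import bisect

class StatePartitioner:
    """Toy version of class-based state partitioning: pivot states sampled
    by an initial random walk define classes; a separate table maps classes
    to processors and can be recomputed (remapped) at any time."""

    def __init__(self, pivots, n_procs):
        self.pivots = sorted(pivots)      # the "search tree", duplicated everywhere
        self.n_procs = n_procs
        n_classes = len(self.pivots) + 1
        # initial round-robin mapping of classes to processors
        self.class_to_proc = [c % n_procs for c in range(n_classes)]

    def classify(self, state):
        # binary search: which interval between pivots does the state fall in?
        return bisect.bisect_right(self.pivots, state)

    def owner(self, state):
        return self.class_to_proc[self.classify(state)]

    def remap(self, class_loads):
        """Greedy remapping: heaviest classes first, each to the currently
        least-loaded processor (balances per-processor memory usage)."""
        loads = [0] * self.n_procs
        for c in sorted(range(len(class_loads)), key=lambda c: -class_loads[c]):
            p = loads.index(min(loads))
            self.class_to_proc[c] = p
            loads[p] += class_loads[c]

p = StatePartitioner(pivots=[10, 20, 30], n_procs=2)
```

The key property is that `classify` is deterministic and identical on every processor, so any worker can route a newly discovered state without communication; only `remap` needs a global snapshot of per-class loads.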
4.
Criteria for evaluating the classification reliability of a neural classifier, and for accordingly making a reject option, are proposed. Such an option, implemented by means of two rules that can be applied independently of the topology, size, and training algorithm of the neural classifier, allows one to improve classification reliability. It is assumed that a performance function P is defined which, taking into account the requirements of the particular application, evaluates the quality of the classification in terms of recognition, misclassification, and reject rates. Under this assumption the optimal reject threshold, determining the best trade-off between reject rate and misclassification rate, is the one for which P reaches its absolute maximum. No constraints are imposed on the form of P other than those necessary for P to actually measure the quality of the classification process. The reject threshold is evaluated on the basis of statistical distributions characterizing the behavior of the classifier when operating without the reject option; these distributions are computed once the training phase of the net has been completed. The method has been tested with a neural classifier devised for handprinted and multifont printed characters, using a database of about 300,000 samples. Experimental results are discussed.
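The idea of choosing the reject threshold as the argmax of an application-defined P can be sketched directly. Since the paper deliberately leaves P open, the particular form used below (recognition rate minus a cost-weighted misclassification rate) is only an illustrative choice, and the function and variable names are invented:

```python
import numpy as np

def best_reject_threshold(confidences, correct, cost_misclass=3.0):
    """Sweep candidate reject thresholds over the classifier's confidence
    values and return the one maximizing an example performance function
    P = recognition_rate - cost_misclass * misclassification_rate.
    Samples with confidence below the threshold are rejected."""
    best_t, best_p = 0.0, -np.inf
    n = len(confidences)
    for t in np.unique(confidences):          # every distinct confidence is a candidate
        accepted = confidences >= t
        rec = np.sum(accepted & correct) / n   # recognition rate
        mis = np.sum(accepted & ~correct) / n  # misclassification rate
        p = rec - cost_misclass * mis
        if p > best_p:
            best_t, best_p = t, p
    return best_t, best_p

conf = np.array([0.9, 0.8, 0.6, 0.4])
correct = np.array([True, True, False, False])
t, p = best_reject_threshold(conf, correct, cost_misclass=3.0)
```

In practice the confidence distributions would be estimated once on a validation set after training, exactly as the abstract describes, and the sweep run offline.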
6.
The incorporation of decatungstate in polymeric membranes provides new heterogeneous photocatalysts for the oxidation of organic substrates under an oxygen atmosphere at 25 °C. Photocatalytic membranes have been prepared yielding polymeric films (PVDF, PDMS, Hyflon) with high thermal, chemical and mechanical stability. Surface spectroscopy techniques, including transmittance and reflectance UV-Vis and FT-IR, have been used to assess the photocatalyst's integrity within the polymeric support. Catalyst screening has been performed under both homogeneous and heterogeneous photooxygenation conditions. The photocatalyst activity has been evaluated in terms of substrate conversion, turnover numbers, and recycling experiments. A membrane-induced selectivity behavior has been evidenced by comparison with homogeneous oxidations.
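One of the activity metrics mentioned, the turnover number, is simple arithmetic: moles of substrate converted per mole of catalyst. A tiny sketch with invented figures (none of these quantities come from the study):

```python
def turnover_number(substrate_mol, conversion_fraction, catalyst_mol):
    """Turnover number: moles of substrate converted per mole of
    photocatalyst. All inputs must use the same mole units."""
    return substrate_mol * conversion_fraction / catalyst_mol

# hypothetical run: 1 mmol substrate, 80% conversion, 2 umol decatungstate
ton = turnover_number(substrate_mol=1e-3, conversion_fraction=0.8, catalyst_mol=2e-6)
```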
7.
This paper describes the results of site investigations, monitoring, stability analyses, and soil-pipe interaction modeling of a built-up slope located near Pineto (Abruzzo, Central Italy), where a gas pipeline exploded on March 6th, 2015, after heavy rains induced slope movements. The slope is formed by overconsolidated (OC) clay, covered with an upper 10- to 14-m-thick clayey-sandy silt colluvial layer. The explosion in the upper portion of the slope caused extensive damage to existing buildings and threatened human lives. Soon after the event, a site investigation and monitoring program was carried out. A detailed topographic survey and hydrological data were analyzed in order to characterize possible critical rainfall events. The stability of the slope was analyzed in both pre- and post-explosion conditions. The profiles of the DMT horizontal stress index K_D helped to identify multiple slip surfaces. The results of the site investigation and stability analyses were then used to implement a simplified finite element model aimed at describing the soil-pipeline interaction, taking into account the role of the observed wrinkle in the pipeline. The numerical simulations reveal the crucial role played by the slope movements, and by the wrinkle as well, in inducing the collapse of the pipe.
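As a rough illustration of the kind of rainfall-sensitive limit-equilibrium check that typically precedes refined finite element analyses of shallow colluvial covers, here is the textbook infinite-slope factor of safety with seepage parallel to the surface. This is not the paper's analysis, and every parameter value below is invented:

```python
import math

def infinite_slope_fs(c_kpa, phi_deg, gamma_kn_m3, z_m, beta_deg, m=1.0,
                      gamma_w=9.81):
    """Factor of safety of an infinite slope with steady seepage parallel
    to the slope surface. m = (water table height above slip surface) /
    (slip surface depth z); m rises after heavy rain, lowering FS."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c_kpa + (gamma_kn_m3 - m * gamma_w) * z_m * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma_kn_m3 * z_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

# hypothetical colluvial layer: c' = 5 kPa, phi' = 22 deg, 12 m deep, 12 deg slope
fs_wet = infinite_slope_fs(5.0, 22.0, 19.0, 12.0, 12.0, m=0.8)
fs_dry = infinite_slope_fs(5.0, 22.0, 19.0, 12.0, 12.0, m=0.0)
```

The dry, cohesionless limit collapses to the familiar FS = tan φ′ / tan β, which is a quick sanity check on the implementation.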
8.
Despite the burgeoning number of studies of public sector information systems, very few scholars have focussed on the relationship between e-Government policies and information systems choice and design. Drawing on Fountain's (2001) technology enactment framework, this paper conducts an in-depth investigation of the intricacies characterising the choice and design of new technologies in the context of e-Government reforms. By claiming that technologies are carriers of e-Government reform aims, this study investigates the logics embedded in the design of new technology and the extant political interests and values inscribed in e-Government policies. The e-Government enactment framework is proposed as a theoretical and analytical approach to understand and study the complexity of these relationships, which shape e-Government policies.
9.
In 1952 Markowitz first formalized the portfolio optimization problem in terms of mean return and variance. Since then, the mean-variance model has played a crucial role in single-period portfolio optimization theory and practice. In this paper we study the optimal portfolio selection problem in a multi-period framework, considering fixed and proportional transaction costs and evaluating how much they affect a re-investment strategy. Specifically, we modify the single-period portfolio optimization model, based on Conditional Value at Risk (CVaR) as the measure of risk, to introduce portfolio rebalancing. The aim is to provide investors and financial institutions with an effective tool to better exploit new information made available by the market. We then suggest a procedure for using the proposed optimization model in a multi-period framework. Extensive computational results based on different historical data sets from the German Stock Exchange (XETRA) are presented.
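The risk measure underlying the model, CVaR, has a simple discrete estimator: the mean loss in the worst (1 − α) tail of a scenario sample. A minimal sketch (the estimator below is the standard empirical one, not the paper's specific optimization formulation):

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Empirical Conditional Value at Risk at level alpha: the average of
    the worst (1 - alpha) fraction of scenario losses."""
    losses = np.sort(np.asarray(losses, dtype=float))  # ascending
    k = int(np.ceil(alpha * len(losses)))              # index where the tail starts
    return losses[k:].mean() if k < len(losses) else losses[-1]

# 100 equally likely scenario losses 1..100: the worst 5% are 96..100
c = cvar(np.arange(1, 101), alpha=0.95)
```

In a rebalancing model, scenario losses would be computed net of the fixed and proportional transaction costs incurred by the trade, so the cost terms directly penalize aggressive rebalancing.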
10.
The determinization of a nondeterministic finite automaton (FA) N is the process of generating a deterministic FA (DFA) D equivalent to (sharing the same regular language as) N. The minimization of D is the process of generating the minimal DFA equivalent to D. Classical algorithms for determinization and minimization have been available in the literature for several decades. However, they operate monolithically, assuming that the FA to be either determinized or minimized is given once and for all. By contrast, we consider determinization and minimization in a dynamic context, where N augments over time: after each augmentation, determinization and minimization of N into D is required. Using classical monolithic algorithms to solve this problem is bound to poor performance. An algorithm for incremental determinization and minimization of acyclic finite automata, called IDMA, is proposed. Despite being conceived within the narrow domain of model-based diagnosis and monitoring of active systems, the algorithm is general-purpose in nature. Experimental evidence indicates that IDMA is far more efficient than classical algorithms in solving incremental determinization and minimization problems. Copyright © 2015 John Wiley & Sons, Ltd.
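The monolithic baseline that IDMA improves on is the classical subset construction, in which each DFA state is a set of NFA states. A self-contained sketch of that baseline (this is the textbook algorithm, not IDMA itself; the example NFA is invented):

```python
from collections import deque

def determinize(alphabet, delta, start, accepting):
    """Classical monolithic subset construction. delta maps
    (nfa_state, symbol) -> set of nfa_states; the resulting DFA states
    are frozensets of NFA states."""
    start_set = frozenset([start])
    dfa_delta, dfa_accepting = {}, set()
    queue, seen = deque([start_set]), {start_set}
    while queue:
        S = queue.popleft()
        if S & accepting:                  # any accepting NFA state makes S accepting
            dfa_accepting.add(S)
        for a in alphabet:
            T = frozenset(t for s in S for t in delta.get((s, a), ()))
            dfa_delta[(S, a)] = T
            if T not in seen:              # discover each subset-state once
                seen.add(T)
                queue.append(T)
    return dfa_delta, dfa_accepting, start_set

# example NFA over {a, b} accepting exactly the strings ending in "ab"
nfa_delta = {(0, 'a'): {0, 1}, (0, 'b'): {0}, (1, 'b'): {2}}
dfa_delta, dfa_acc, q0 = determinize('ab', nfa_delta, 0, {2})
```

The cost of rerunning this from scratch after every augmentation of the NFA is precisely what motivates an incremental approach: most subset-states are unaffected by a small extension of the input automaton.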