Sorted results: 871 matches found (search time: 15 ms)
61.
The analysis of air quality and the continuous monitoring of air pollution levels are important subjects of environmental science and research. The problem has a real impact on human health and quality of life. Determining the conditions that favor high concentrations of pollutants and, above all, forecasting such cases in time is crucial, as it enables civil protection authorities to impose specific protection and prevention measures. This paper discusses HISYCOL, an innovative threefold intelligent hybrid system of combined machine learning algorithms. First, it correlates the conditions under which high pollutant concentrations emerge. Second, it proposes an ensemble system that combines machine learning algorithms to forecast the values of air pollutants. What gives this modeling effort its hybrid nature is its use of clustered datasets: the approach improves the accuracy of existing forecasting models by applying unsupervised machine learning to cluster the data vectors and trace hidden knowledge. Finally, it employs a Mamdani fuzzy inference system for each air pollutant to forecast its concentrations even more effectively.
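The cluster-then-forecast idea can be sketched as follows. This is a hypothetical illustration, not the paper's actual pipeline: the feature names, cluster count, and choice of KMeans plus a random-forest regressor per cluster are all assumptions standing in for HISYCOL's unsupervised clustering and ensemble forecasters.

```python
# Hypothetical sketch: cluster the data vectors with unsupervised
# learning, then train one forecasting model per cluster and route
# new vectors to their cluster's model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Toy dataset: rows are illustrative (temperature, wind speed,
# humidity) vectors; the target is a pollutant concentration.
X = rng.normal(size=(300, 3))
y = X[:, 0] * 2.0 - X[:, 1] + rng.normal(scale=0.1, size=300)

# Step 1: unsupervised clustering to expose hidden regimes in the data.
clusterer = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Step 2: one forecaster per cluster (the "hybrid" part).
models = {}
for c in range(3):
    mask = clusterer.labels_ == c
    models[c] = RandomForestRegressor(random_state=0).fit(X[mask], y[mask])

def forecast(x):
    """Assign a new vector to its cluster, then use that cluster's model."""
    c = int(clusterer.predict(x.reshape(1, -1))[0])
    return float(models[c].predict(x.reshape(1, -1))[0])

print(forecast(np.array([0.5, -0.2, 0.1])))
```

Per-cluster models let each forecaster specialize in one regime (e.g. calm vs. windy conditions), which is the intuition behind clustering the datasets before training.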
62.
The existence of good probabilistic models for the job arrival process and the delay components introduced at different stages of job processing in a Grid environment is important for an improved understanding of the Grid computing concept. In this study, we present a thorough analysis of the job arrival process in the EGEE infrastructure and of the time a job spends in different states of the EGEE environment. We define four delay components of the total job delay and model each component separately. We observe that the job inter-arrival times at the Grid level can be adequately modelled by a rounded exponential distribution, while the total job delay (from the time a job is generated until the time it completes execution) is dominated by the computing element's (CE) register and queuing times and the worker node's (WN) execution times. Further, we evaluate the efficiency of the EGEE environment by comparing its total job delay performance with that of a hypothetical ideal super-cluster, and conclude that we would obtain similar performance by submitting the same workload to a super-cluster whose size equals 34% of the total average number of CPUs participating in the EGEE infrastructure. We also analyze the job inter-arrival times, the CE's queuing times, the WN's execution times, and the data sizes exchanged at the kallisto.hellasgrid.gr cluster, which is a node in the EGEE infrastructure. In contrast to the Grid level, we find that at the cluster level the job arrival process exhibits self-similarity/long-range dependence. Finally, we propose simple and intuitive models for the job arrival process and the execution times at the cluster level.
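Fitting an exponential model to inter-arrival times, as the study does for Grid-level arrivals, reduces to a one-line maximum-likelihood estimate. The sketch below uses synthetic data (the 30 s mean is an arbitrary illustration, not a figure from the paper):

```python
# Minimal sketch: MLE fit of an exponential distribution to job
# inter-arrival times (synthetic data, not the EGEE traces).
import numpy as np

rng = np.random.default_rng(1)
# Synthetic inter-arrival gaps with a true mean of 30 seconds.
inter_arrivals = rng.exponential(scale=30.0, size=10_000)

# For an exponential distribution, the MLE of the rate is simply
# 1 / (sample mean of the gaps).
rate = 1.0 / inter_arrivals.mean()
print(f"estimated mean inter-arrival: {1.0 / rate:.1f} s")
```

Note that such a memoryless model is adequate only when arrivals are uncorrelated; the abstract's finding of self-similarity at the cluster level is precisely the case where it breaks down.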
63.
Content distribution networks (CDNs) improve scalability and reliability by replicating content to the "edge" of the Internet. Apart from the pure networking issues of CDNs relevant to establishing the infrastructure, some crucial data management issues must be resolved to exploit the full potential of CDNs to reduce "last mile" latencies. A very important issue is the selection of the content to be prefetched to the CDN servers. All the approaches developed so far assume the existence of adequate content-popularity statistics to drive the prefetch decisions. Such information, though, is not always available, or it is extremely volatile, making such methods problematic. To address this issue, we develop self-adaptive techniques for selecting the outsourced content in a CDN infrastructure, which require no a priori knowledge of request statistics. We identify clusters of "correlated" Web pages in a site, called Web site communities, and make these communities the basic outsourcing unit. Through a detailed simulation environment, using both real and synthetic data, we show that the proposed techniques are robust and effective in reducing the user-perceived latency, performing very close to an infeasible, off-line policy that has full knowledge of the content popularity.
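One simple way to picture "communities of correlated pages" is to link pages that frequently co-occur in request sessions and take connected components as the outsourcing units. The sketch below is only an illustration of that intuition; the paper's actual community-detection algorithm, the session data, and the threshold are all assumptions here.

```python
# Illustrative sketch: group pages that co-occur in the same request
# sessions into "communities" (connected components of a co-occurrence
# graph), which then serve as the units of content outsourcing.
from collections import defaultdict
from itertools import combinations

sessions = [
    ["/home", "/news", "/sports"],
    ["/home", "/news"],
    ["/shop", "/cart"],
    ["/shop", "/cart", "/checkout"],
]

# Count how often each pair of pages appears in the same session.
co = defaultdict(int)
for s in sessions:
    for a, b in combinations(sorted(set(s)), 2):
        co[(a, b)] += 1

# Link pages whose co-occurrence meets a (hypothetical) threshold.
adj = defaultdict(set)
for (a, b), n in co.items():
    if n >= 2:
        adj[a].add(b)
        adj[b].add(a)

def communities(nodes):
    """Connected components of the co-occurrence graph."""
    seen, out = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        out.append(comp)
    return out

pages = sorted({p for s in sessions for p in s})
print(communities(pages))
```

Because the grouping is driven by observed request correlation rather than popularity counts, it needs no a priori statistics, which is the key property the abstract emphasizes.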
64.
Smart card technology has evolved over the last few years following notable improvements in the underlying hardware and software platforms. Advanced smart card microprocessors, along with robust smart card operating systems and platforms, contribute towards a broader acceptance of the technology. These improvements have eliminated some of the traditional smart card security concerns. However, researchers and hackers are constantly looking for new issues and vulnerabilities. In this article we provide a brief overview of the main smart card attack categories and their corresponding countermeasures. We also provide examples of well-documented attacks on systems that use smart card technology (e.g. satellite TV, EMV, proximity identification) in an attempt to highlight the importance of the security of the overall system rather than just the smart card.
65.
We propose a probabilistic variant of the pi-calculus as a framework to specify randomized security protocols and their intended properties. In order to express and verify the correctness of the protocols, we develop a probabilistic version of the testing semantics. We then illustrate these concepts on an extended example: the Partial Secret Exchange, a protocol which uses a randomized primitive, the Oblivious Transfer, to achieve fairness of information exchange between two parties.
66.
67.
68.
69.
There has been little research investigating the impact of software designed to support the care of older people with dementia. This article reports the evaluation of software adapted to support one key person-centered task in the care of older residents with dementia: recording and sharing daily care notes. An evaluation on the dementia wing of one residential home over six months revealed that use of the software on mobile devices carried by the carers increased the number and volume of daily care notes recorded, but only for the types of content that carers were already recording. Carers reported more advantages arising from daily care notes once in digital form than from the documenting task itself, as well as barriers to using mobile digital software to record daily care notes.
70.
Universal Access in the Information Society - Pervasive technologies such as Artificial Intelligence, Virtual Reality and the Internet of Things, despite their great potential for improved...