Full-text access type
Paid full text | 145 articles |
Free | 3 articles |
Subject classification
Electrical engineering | 2 articles |
Chemical industry | 22 articles |
Metalworking | 1 article |
Machinery and instruments | 1 article |
Building science | 3 articles |
Energy and power engineering | 6 articles |
Light industry | 4 articles |
Hydraulic engineering | 3 articles |
Radio and electronics | 32 articles |
General industrial technology | 6 articles |
Metallurgical industry | 2 articles |
Automation technology | 66 articles |
Publication year
2023 | 1 article |
2022 | 8 articles |
2021 | 6 articles |
2020 | 1 article |
2019 | 2 articles |
2018 | 7 articles |
2017 | 3 articles |
2016 | 3 articles |
2015 | 4 articles |
2013 | 12 articles |
2012 | 11 articles |
2011 | 9 articles |
2010 | 9 articles |
2009 | 13 articles |
2008 | 5 articles |
2007 | 19 articles |
2006 | 11 articles |
2005 | 7 articles |
2004 | 6 articles |
2003 | 1 article |
2002 | 3 articles |
2001 | 2 articles |
1999 | 1 article |
1998 | 1 article |
1993 | 1 article |
1989 | 1 article |
1987 | 1 article |
Sort order: 148 results found; search took 15 ms
1.
2.
Michalis Koutinas, Ludmila G. Peeva, Andrew G. Livingston. 《Journal of Chemical Technology and Biotechnology (Oxford, Oxfordshire : 1986)》, 2005, 80(11): 1252-1260
This study presents a comparison of the efficiency of a bioscrubber and a biotrickling filter (BTF) for the removal of ethyl acetate (EA) vapour from a waste gas stream under the same operating conditions. The maximum EA elimination capacity achieved in the bioscrubber was 550 g m⁻³ h⁻¹ with a removal efficiency higher than 96%. At higher EA loadings the bioscrubber was oxygen limited, which caused incomplete EA biodegradation. When pure oxygen was fed to the bioscrubber at a rate of 0.02 L min⁻¹, the bioscrubber recovered and could treat higher EA loadings without any oxygen limitation. The BTF achieved an EA elimination capacity of 600 g m⁻³ h⁻¹ with a removal efficiency higher than 97%, and the dissolved oxygen concentration remained substantially higher than in the bioscrubber. However, severe channelling and blockage of the spray nozzle occurred due to excessive biomass growth. Overall, the bioscrubber system was easier to operate and control than the BTF, while enhancing oxygen mass transfer in the bioscrubber could potentially increase its performance by up to three times. Copyright © 2005 Society of Chemical Industry
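For orientation, the two headline metrics in this abstract, elimination capacity and removal efficiency, follow directly from the inlet and outlet concentrations, the gas flow rate, and the reactor volume. A minimal sketch (the function names are ours, and the numeric values used below are illustrative, not taken from the study):

```python
def elimination_capacity(q_m3_h, c_in_g_m3, c_out_g_m3, volume_m3):
    """Elimination capacity in g m^-3 h^-1: pollutant mass removed
    per unit reactor volume per hour."""
    return q_m3_h * (c_in_g_m3 - c_out_g_m3) / volume_m3

def removal_efficiency(c_in_g_m3, c_out_g_m3):
    """Fraction of the inlet pollutant that is removed."""
    return (c_in_g_m3 - c_out_g_m3) / c_in_g_m3
```

With an assumed gas flow of 10 m³/h, inlet 6 g/m³, outlet 0.2 g/m³, and a 0.1 m³ reactor, this gives an elimination capacity of 580 g m⁻³ h⁻¹ at about 97% removal, the same order of magnitude as the values reported above.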
3.
Georgios Kontaxis, Michalis Polychronakis, Evangelos P. Markatos. 《International Journal of Information Security》, 2012, 11(5): 321-332
Over the past few years, a large and ever-increasing number of Web sites have incorporated one or more social login platforms and have encouraged users to log in with their Facebook, Twitter, Google, or other social networking identities. Research results suggest that more than two million Web sites have already adopted Facebook's social login platform, and the number is increasing sharply. Although one might theoretically refrain from such social login features and cross-site interactions, usage statistics show that more than 250 million people might not fully realize the privacy implications of opting in. To make matters worse, certain Web sites do not offer even minimal functionality unless users meet their demands for information and social interaction. At the same time, in a large number of cases, it is unclear why these sites require all that personal information for their purposes. In this paper, we mitigate this problem by designing and developing a framework for minimum information disclosure in social login interactions with third-party sites. Our example case is Facebook, which combines a very popular single sign-on platform with information-rich social networking profiles. Whenever users want to browse to a Web site that requires authentication or social interaction using a Facebook identity, our system employs, by default, a Facebook session that reveals the minimum amount of information necessary. Users have the option to explicitly elevate that Facebook session in a manner that reveals more or all of the information tied to their social identity. This enables users to disclose the minimum possible amount of personal information during their browsing experience on third-party Web sites.
4.
Nikolaos Petsas, Giorgos Kouzilos, Giorgos Papapanos, Michalis Vardavoulias, Angeliki Moutsatsou. 《Journal of Thermal Spray Technology》, 2007, 16(2): 214-219
The purpose of the present work was the investigation and characterization of air quality in a thermal spray industry in Greece. The activities that take place in the specific facility, as well as in most other similar industries, include thermal spraying and several mechanical and metallurgical tasks that generate airborne particles, such as grit-blasting, cutting, and grinding of metallic components. Since the main focus of this work was the workers' exposure to airborne particles and heavy metals, portable air samplers with quartz fiber filters were used daily for 8 h. Three samplers, carried by different employees, were used for a period of 1 month. Results showed that both particle and heavy-metal concentrations were low, even at the production site, which was the most susceptible area. The only exceptions were observed during cleaning and maintenance activities in the thermal spray booth and during spraying outside the booth. The main reason for the low concentrations is that most of the activities that could produce high particle concentrations are conducted in closed, well-ventilated systems. Statistical analysis of the results showed that particle concentrations are correlated with Ni, Cu, and Co, and likewise with Fe and Mn; these correlations indicate possible common sources.
5.
Traditional ant colony optimization (ACO) algorithms have difficulty in addressing dynamic optimization problems (DOPs). This is because once the algorithm converges to a solution and a dynamic change occurs, it is difficult for the population to adapt to the new environment, since high levels of pheromone will have accumulated on a single trail and force the ants to follow it even after the change. A good way to address this problem is to increase diversity by transferring knowledge from previous environments to the pheromone trails using immigrants schemes. In this paper, an ACO framework for dynamic environments is proposed where different immigrants schemes, including random immigrants, elitism-based immigrants, and memory-based immigrants, are integrated into ACO algorithms for solving DOPs. From this framework, three ACO algorithms, where immigrant ants are generated using the aforementioned immigrants schemes and replace existing ants in the current population, are proposed and investigated. Moreover, two novel types of dynamic travelling salesman problems (DTSPs) with traffic factors, i.e., under random and cyclic dynamic environments, are proposed for the experimental study. The experimental results based on different DTSP test cases show that each proposed algorithm performs well on different environmental cases and that the proposed algorithms outperform several other peer ACO algorithms.
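The random-immigrants scheme described above can be sketched in a few lines: after a dynamic change, the worst ants in the population are replaced by freshly generated random tours. This is only an illustration of the general idea; the function name, the replacement ratio, and the (tour, cost) representation are our assumptions, not the paper's code:

```python
import random

def apply_random_immigrants(population, ratio, num_cities, rng=random):
    """Replace the worst `ratio` fraction of ants with random tours.

    `population` is a list of (tour, cost) pairs; lower cost is better.
    Immigrants get infinite cost until they are re-evaluated, which
    injects diversity after an environmental change.
    """
    n_replace = max(1, int(len(population) * ratio))
    # Sort so the worst (highest-cost) ants sit at the end of the list.
    population.sort(key=lambda ant: ant[1])
    survivors = population[:-n_replace]
    for _ in range(n_replace):
        tour = list(range(num_cities))
        rng.shuffle(tour)
        survivors.append((tour, float("inf")))  # cost re-evaluated later
    return survivors
```

The elitism-based and memory-based variants differ only in how the immigrant tours are generated (mutating the best ant of the previous environment, or replaying stored solutions, respectively).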
6.
A memetic ant colony optimization algorithm for the dynamic travelling salesman problem (Total citations: 1; self-citations: 1; cited by others: 0)
Michalis Mavrovouniotis, Shengxiang Yang. 《Soft Computing - A Fusion of Foundations, Methodologies and Applications》, 2011, 15(7): 1405-1425
Ant colony optimization (ACO) has been successfully applied for combinatorial optimization problems, e.g., the travelling salesman problem (TSP), under stationary environments. In this paper, we consider the dynamic TSP (DTSP), where cities are replaced by new ones during the execution of the algorithm. Under such environments, traditional ACO algorithms face a serious challenge: once they converge, they cannot adapt efficiently to environmental changes. To improve the performance of ACO on the DTSP, we investigate a hybridized ACO with local search (LS), called Memetic ACO (M-ACO) algorithm, which is based on the population-based ACO (P-ACO) framework and an adaptive inver-over operator, to solve the DTSP. Moreover, to address premature convergence, we introduce random immigrants to the population of M-ACO when identical ants are stored. The simulation experiments on a series of dynamic environments generated from a set of benchmark TSP instances show that LS is beneficial for ACO algorithms when applied on the DTSP, since it achieves better performance than other traditional ACO and P-ACO algorithms.
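The inver-over operator at the heart of M-ACO's local search inverts a segment of one tour, guided by where the chosen city's successor sits in a second tour. A simplified, deterministic sketch of the operator family (the paper's version is adaptive and picks the guiding city at random; here the city is passed in explicitly so the move is reproducible):

```python
def inver_over_step(tour, other_tour, city):
    """One simplified inver-over move on `tour` (modified in place).

    Finds the city that follows `city` in `other_tour` and inverts the
    segment of `tour` lying strictly after `city` up to and including
    that successor, so `tour` locally borrows an edge from the guide.
    """
    i = tour.index(city)
    nxt = other_tour[(other_tour.index(city) + 1) % len(other_tour)]
    j = tour.index(nxt)
    lo, hi = sorted((i, j))
    tour[lo + 1:hi + 1] = reversed(tour[lo + 1:hi + 1])
    return tour
```

For example, applying the move to tour [0, 1, 2, 3, 4] with guide [0, 2, 4, 1, 3] and city 2 inverts the segment [3, 4], yielding [0, 1, 2, 4, 3], which now contains the guide's edge 2-4.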
7.
Michalis Savelonas Dimitris Maroulis Manolis Sangriotis 《Computer methods and programs in biomedicine》2009,96(1):25-32
In this paper, a novel computer-based approach is proposed for malignancy risk assessment of thyroid nodules in ultrasound images. The proposed approach is based on boundary features and is motivated by the correlation which has been addressed in medical literature between nodule boundary irregularity and malignancy risk. In addition, local echogenicity variance is utilized so as to incorporate information associated with local echogenicity distribution within the nodule boundary neighborhood. Such information is valuable for the discrimination of high-risk nodules with blurred boundaries from medium-risk nodules with regular boundaries. Analysis of variance is performed, indicating that each boundary feature under study provides statistically significant information for the discrimination of thyroid nodules in ultrasound images, in terms of malignancy risk. k-nearest neighbor and support vector machine classifiers are employed for the classification tasks, utilizing feature vectors derived from all combinations of features under study. The classification results are evaluated with the use of the receiver operating characteristic. The proposed approach is shown to be capable of discriminating between medium-risk and high-risk nodules, obtaining an area under the curve reaching 0.95.
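The classification stage described above can be illustrated with a minimal k-nearest-neighbour classifier. The feature vectors and labels below are placeholders standing in for the paper's boundary features and local echogenicity variance, not the actual features:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector `x` by majority vote among its k nearest
    training vectors (Euclidean distance). A sketch of the classifier
    only; the real work in the paper is in the feature extraction."""
    dists = sorted(
        (math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

With hypothetical training vectors clustered around (0, 0) for medium-risk and (5, 5) for high-risk nodules, a query near either cluster is assigned the corresponding label; the ROC analysis in the paper then sweeps the vote threshold rather than taking a hard majority.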
8.
Christos Doulkeridis, Akrivi Vlachou, Kjetil Nørvåg, Yannis Kotidis, Michalis Vazirgiannis. 《Peer-to-Peer Networking and Applications》, 2010, 3(1): 67-79
The advent of the World Wide Web has made an enormous amount of information available to everyone, and the widespread use of digital equipment enables end-users (peers) to produce their own digital content. This vast amount of information requires scalable data management systems. Peer-to-peer (P2P) systems have so far been well established in several application areas, with file-sharing being the most prominent. The next challenge that needs to be addressed is (more complex) data sharing, management, and query processing, thus facilitating the delivery of a wide spectrum of novel data-centric applications to the end-user, while providing high Quality-of-Service. In this paper, we propose a self-organizing P2P system that is capable of identifying peers with similar content and intentionally assigning them to the same super-peer. During content retrieval, fewer super-peers need to be contacted, and therefore efficient similarity search is supported, in terms of reduced network traffic and contacted peers. Our approach increases the responsiveness and reliability of a P2P system, and we demonstrate its advantages using large-scale simulations.
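The assignment step, placing a peer under the super-peer whose content profile it most resembles, might look like the following sketch. Representing each peer's content as a numeric vector compared by cosine similarity is our assumption for illustration; the paper's actual similarity measure and profile representation may differ:

```python
import math

def assign_to_superpeer(peer_vec, superpeers):
    """Return the id of the super-peer whose profile vector has the
    highest cosine similarity to `peer_vec`.

    `superpeers` maps super-peer id -> profile vector.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    return max(superpeers, key=lambda sp: cos(peer_vec, superpeers[sp]))
```

Because similar peers end up under the same super-peer, a similarity query only needs to visit the few super-peers whose profiles match the query vector, which is the source of the reduced network traffic reported above.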
9.
Bayesian feature and model selection for Gaussian mixture models (Total citations: 1; self-citations: 0; cited by others: 1)
Constantinopoulos C, Titsias MK, Likas A. 《IEEE Transactions on Pattern Analysis and Machine Intelligence》, 2006, 28(6): 1013-1018
We present a Bayesian method for mixture model training that simultaneously treats the feature selection and the model selection problem. The method is based on the integration of a mixture model formulation that takes into account the saliency of the features and a Bayesian approach to mixture learning that can be used to estimate the number of mixture components. The proposed learning algorithm follows the variational framework and can simultaneously optimize over the number of components, the saliency of the features, and the parameters of the mixture model. Experimental results using high-dimensional artificial and real data illustrate the effectiveness of the method.
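The paper's variational-Bayesian machinery does not fit in a snippet, but the classical EM update that it extends, shown here for a two-component, one-dimensional Gaussian mixture with fixed variances, gives the flavour of mixture learning. This is only the non-Bayesian baseline, not the paper's method:

```python
import math

def em_step(data, pi, mu, sigma):
    """One EM iteration for a two-component 1-D Gaussian mixture.

    `pi` is the weight of component 0; `mu` and `sigma` are pairs of
    means and standard deviations. Variances are held fixed for brevity.
    Returns the updated weight and pair of means.
    """
    def pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    # E-step: responsibility of component 0 for each data point.
    r = []
    for x in data:
        p0 = pi * pdf(x, mu[0], sigma[0])
        p1 = (1 - pi) * pdf(x, mu[1], sigma[1])
        r.append(p0 / (p0 + p1))

    # M-step: re-estimate the mixing weight and the two means.
    n0 = sum(r)
    pi_new = n0 / len(data)
    mu_new = (
        sum(ri * x for ri, x in zip(r, data)) / n0,
        sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n0),
    )
    return pi_new, mu_new
```

The variational approach in the paper replaces these point estimates with distributions over the parameters, which is what lets it prune unnecessary components and down-weight non-salient features automatically.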
10.
Antonis Papadogiannakis, Giorgos Vasiliadis, Demetres Antoniades, Michalis Polychronakis, Evangelos P. Markatos. 《Computer Communications》, 2012, 35(1): 129-140
Passive network monitoring is the basis for a multitude of systems that support the robust, efficient, and secure operation of modern computer networks. Emerging network monitoring applications are more demanding in terms of memory and CPU resources due to the increasingly complex analysis operations that are performed on the inspected traffic. At the same time, as the traffic throughput in modern network links increases, the CPU time that can be devoted to processing each network packet decreases. This leads to a growing demand for more efficient passive network monitoring systems in which runtime performance becomes a critical issue. In this paper we present locality buffering, a novel approach for improving the runtime performance of a large class of CPU- and memory-intensive passive monitoring applications, such as intrusion detection systems, traffic characterization applications, and NetFlow export probes. Using locality buffering, captured packets are reordered by clustering packets with the same port number before they are delivered to the monitoring application. This results in improved code and data locality and, consequently, in an overall increase in the packet processing throughput and a decrease in the packet loss rate. We have implemented locality buffering within the widely used libpcap packet capturing library, which allows existing monitoring applications to transparently benefit from the reordered packet stream without modifications. Our experimental evaluation shows that locality buffering significantly improves the performance of popular applications, such as the Snort IDS, which exhibits a 21% increase in packet processing throughput and is able to handle 67% higher traffic rates without dropping any packets.
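The core idea of locality buffering, gathering captured packets into per-port buckets and flushing them bucket by bucket, can be sketched as follows. The (port, payload) tuple layout and the simple size-triggered flush policy are simplifications for illustration; the real implementation lives inside libpcap's packet delivery path:

```python
from collections import defaultdict

def locality_buffer(packets, buffer_size=8):
    """Yield packets reordered so that packets sharing a port are
    delivered consecutively, improving code/data locality downstream.

    `packets` is an iterable of (port, payload) tuples. Packets are
    buffered into per-port buckets and flushed, one bucket at a time,
    whenever `buffer_size` packets have accumulated.
    """
    buckets = defaultdict(list)
    buffered = 0
    for pkt in packets:
        buckets[pkt[0]].append(pkt)
        buffered += 1
        if buffered >= buffer_size:
            for port in sorted(buckets):
                yield from buckets[port]
            buckets.clear()
            buffered = 0
    for port in sorted(buckets):  # flush any remainder at end of stream
        yield from buckets[port]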