Paid full text: 673
Free: 63
Free within China: 3
By subject:
  Electrical engineering: 12
  General: 4
  Chemical industry: 131
  Metalworking: 7
  Machinery and instruments: 18
  Building science: 11
  Mining engineering: 2
  Energy and power: 39
  Light industry: 102
  Water conservancy engineering: 8
  Petroleum and natural gas: 6
  Radio and electronics: 74
  General industrial technology: 166
  Metallurgical industry: 10
  Atomic energy technology: 7
  Automation technology: 142
By year:
  2024: 4
  2023: 30
  2022: 83
  2021: 103
  2020: 61
  2019: 62
  2018: 60
  2017: 63
  2016: 30
  2015: 20
  2014: 42
  2013: 41
  2012: 31
  2011: 28
  2010: 17
  2009: 11
  2008: 5
  2007: 6
  2006: 4
  2005: 6
  2004: 2
  2003: 5
  2001: 2
  2000: 2
  1999: 3
  1998: 3
  1997: 5
  1996: 4
  1995: 1
  1994: 1
  1992: 1
  1989: 1
  1987: 1
  1977: 1
A total of 739 results were found (search time: 0 ms).
81.
Header Detection to Improve Multimedia Quality Over Wireless Networks   (total citations: 1, self-citations: 0, citations by others: 1)
Wireless multimedia studies have revealed that forward error correction (FEC) on corrupted packets yields better bandwidth utilization and lower delay than retransmissions. To facilitate FEC-based recovery, corrupted packets should not be dropped, so that the maximum number of packets is relayed to a wireless receiver's FEC decoder. Previous studies proposed to mitigate wireless packet drops with a partial checksum that ignores payload errors. Such schemes require modifications to both transmitters and receivers and still incur packet losses due to header errors. In this paper, we introduce a receiver-based scheme which uses the history of active multimedia sessions to detect the transmitted values of corrupted packet headers, thereby improving wireless multimedia throughput. Header detection is posed as the decision-theoretic problem of multihypothesis detection of known parameters in noise. The performance of the proposed scheme is evaluated using trace-driven video simulations on an 802.11b local area network. We show that header detection with application-layer FEC provides significant throughput and video-quality improvements over the conventional UDP/IP/802.11 protocol stack.
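The decision rule can be illustrated with a minimal sketch (the header layout, session history, and bit-error model below are assumptions for illustration, not taken from the paper): under independent bit errors, maximum-likelihood multihypothesis detection reduces to picking the previously observed header closest in Hamming distance to the corrupted one.

```python
# Hypothetical sketch: ML header recovery by Hamming distance over session history.
# Assumes bit errors are i.i.d. with probability < 0.5, so the closest candidate
# in Hamming distance is the maximum-likelihood hypothesis.

def hamming_distance(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def detect_header(corrupted: bytes, session_history: list) -> bytes:
    """Pick the previously observed header closest to the corrupted one."""
    return min(session_history, key=lambda h: hamming_distance(corrupted, h))

# Toy usage: two candidate headers remembered from packets of active sessions.
history = [bytes.fromhex("45000054a1b2"), bytes.fromhex("45000054c3d4")]
received = bytes.fromhex("45000055a1b2")        # one bit flipped in transit
print(detect_header(received, history).hex())   # -> 45000054a1b2
```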
82.
In this paper, we propose a novel route maintenance scheme for IEEE 802.11 wireless mesh networks. Despite the absence of mobility and energy constraints, reactive routing protocols such as AODV and DSR suffer from frequent route breakages in 802.11-based infrastructure wireless mesh networks. In these networks, if an intermediate node fails to transmit a packet to the next-hop node after a certain number of retransmissions, the link layer reports a transmission problem to the network layer. Reactive routing protocols systematically treat this as a link breakage (and therefore a route breakage). Transmission failures, however, can be caused by a number of factors, e.g. interference or noise, and can be transient in nature. Frequent route breakages result in significant performance degradation. The proposed mechanism considers multiple factors to differentiate links with transient transmission problems from those with permanent problems and takes a coherent decision on link breakage. The mechanism is implemented in AODV for single-radio, single-channel mesh networks, and an extension is incorporated for multi-radio, multi-channel scenarios. Simulation results show substantial performance improvement compared with classical AODV and local route repair schemes.
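A minimal sketch of such a link-breakage decision, assuming a sliding window of transmission outcomes and illustrative thresholds (the factors and values actually used by the paper are not reproduced here):

```python
from collections import deque

class LinkMonitor:
    """Decide whether a link's transmission problem looks transient or permanent.

    Hypothetical heuristic: track the last `window` transmission attempts and
    declare a breakage only when the recent failure ratio stays above a
    threshold for several consecutive link-layer failure reports.
    """

    def __init__(self, window=20, fail_ratio=0.6, persist_reports=3):
        self.history = deque(maxlen=window)   # True = success, False = failure
        self.fail_ratio = fail_ratio
        self.persist_reports = persist_reports
        self.consecutive_reports = 0

    def on_success(self):
        self.consecutive_reports = 0
        self.history.append(True)

    def on_link_layer_failure(self):
        """Called when the MAC reports a failed transmission after max retries.

        Returns True if the failure should be treated as a real link breakage
        (triggering route maintenance), False if it looks transient.
        """
        self.consecutive_reports += 1
        self.history.append(False)
        ratio = self.history.count(False) / len(self.history)
        if ratio >= self.fail_ratio and self.consecutive_reports >= self.persist_reports:
            return True      # persistent problem: report route breakage
        return False         # likely interference or noise: keep the route
```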
83.
Nature-inspired algorithms take inspiration from living things and imitate their behaviours to build robust systems in the engineering and computer science disciplines. The symbiotic organisms search (SOS) algorithm is a recent metaheuristic inspired by the symbiotic interactions between organisms in an ecosystem. Organisms develop symbiotic relationships such as mutualism, commensalism, and parasitism for their survival in the ecosystem. SOS was introduced to solve continuous benchmark and engineering problems. SOS has been shown to be robust and to converge faster than traditional metaheuristic algorithms such as the genetic algorithm, particle swarm optimization, differential evolution, and the artificial bee colony algorithm. Researchers' interest in using SOS for optimization problems is growing, owing to its successful application to optimization problems in science and engineering. This paper therefore presents a comprehensive survey of SOS advances and its applications, which will benefit researchers engaged in the study of the SOS algorithm.
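A compact sketch of the three SOS phases on a continuous minimization problem follows; the benefit factors and update equations reflect the commonly described formulation and may differ in detail from the original algorithm.

```python
import numpy as np

def sos(objective, bounds, pop_size=30, iterations=200, seed=0):
    """Minimal symbiotic organisms search sketch for continuous minimization.

    `bounds` is a list of (low, high) pairs, one per dimension. Benefit
    factors in {1, 2} follow the commonly cited formulation.
    """
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds).T
    dim = len(bounds)
    eco = rng.uniform(low, high, (pop_size, dim))          # the ecosystem
    fit = np.apply_along_axis(objective, 1, eco)

    def clip(x):
        return np.clip(x, low, high)

    def try_replace(i, candidate):
        f = objective(candidate)
        if f < fit[i]:                                     # greedy acceptance
            eco[i], fit[i] = candidate, f

    for _ in range(iterations):
        best = eco[fit.argmin()]
        for i in range(pop_size):
            # --- Mutualism: organisms i and j both benefit from interacting.
            j = rng.choice([k for k in range(pop_size) if k != i])
            mutual = (eco[i] + eco[j]) / 2
            bf1, bf2 = rng.integers(1, 3, size=2)          # benefit factors
            try_replace(i, clip(eco[i] + rng.random(dim) * (best - mutual * bf1)))
            try_replace(j, clip(eco[j] + rng.random(dim) * (best - mutual * bf2)))
            # --- Commensalism: i benefits from j, j is unaffected.
            j = rng.choice([k for k in range(pop_size) if k != i])
            try_replace(i, clip(eco[i] + rng.uniform(-1, 1, dim) * (best - eco[j])))
            # --- Parasitism: a mutated copy of i tries to displace a random j.
            j = rng.choice([k for k in range(pop_size) if k != i])
            parasite = eco[i].copy()
            mask = rng.random(dim) < 0.5
            parasite[mask] = rng.uniform(low, high, dim)[mask]
            try_replace(j, parasite)
    return eco[fit.argmin()], fit.min()

# Toy usage: minimize the sphere function in 5 dimensions.
best_x, best_f = sos(lambda x: float(np.sum(x * x)), [(-5, 5)] * 5)
print(best_f)
```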
84.
A rule-based expert system for earthquake prediction   (total citations: 2, self-citations: 0, citations by others: 2)
An earthquake is a natural disaster that causes extensive damage as well as the death of thousands of people. Earthquake professionals have for many decades recognized the benefits to society of reliable earthquake predictions. Techniques such as mathematical modelling, hydrology analysis, ionosphere analysis, and even animal responses have been used to forecast a quake. Most of these techniques rely on certain precursors, such as stress or seismic activity. Data mining techniques can also be used for prediction of this natural hazard. Data mining consists of an evolving set of techniques, such as association rule mining, that can be used to extract valuable information and knowledge from massive volumes of data. The aim of this study is to predict a subsequent earthquake from the data of the previous earthquake. This is achieved by applying association rule mining to earthquake data from 1979 to 2012. These associations are refined using predicate-logic techniques to derive production rules for use in a rule-based expert system. The prediction process is performed by the expert system, which takes only the attributes of the current earthquake to predict a subsequent one. The rules generated for predicting earthquakes are mathematically validated as well as tested on real-life earthquake data. Results from our study show that the proposed rule-based expert system is able to detect 100% of the earthquakes that actually occurred within at most 15 hours, within a defined range, depth, and location. This work relies solely on previous earthquake data for predicting the next.
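A toy sketch of the final stage, production rules fired on the current earthquake's attributes, is shown below; the attribute names, thresholds, and consequents are invented for illustration, whereas the paper mines its rules from the 1979–2012 catalogue.

```python
# Hypothetical production rules of the form "IF conditions THEN prediction".
# The attributes, thresholds, and consequents are illustrative only.

RULES = [
    ({"magnitude": lambda m: m >= 6.0, "depth_km": lambda d: d < 70},
     "subsequent event likely within 15 hours in the same region"),
    ({"magnitude": lambda m: 4.5 <= m < 6.0, "depth_km": lambda d: d >= 70},
     "no significant subsequent event expected"),
]

def predict(current_quake):
    """Fire every rule whose conditions all hold on the current quake's attributes."""
    fired = []
    for conditions, consequent in RULES:
        if all(test(current_quake[attr]) for attr, test in conditions.items()):
            fired.append(consequent)
    return fired

print(predict({"magnitude": 6.4, "depth_km": 22.0}))
```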
85.
Multimedia Tools and Applications - Analysis of facial images to decode familial features has been attracting the attention of researchers toward developing a computerized system interested in determining...
86.
Scalability is one of the most important quality attributes of software-intensive systems, because it maintains effective performance under large, fluctuating, and sometimes unpredictable workloads. To achieve scalability, the thread pool system (TPS), also known as an executor service, has been used extensively as a middleware service in software-intensive systems. TPS optimization is the challenging problem of determining the optimal thread pool size dynamically at runtime. In the case of a distributed TPS (DTPS), another issue is load balancing between the available TPSs running on backend servers. Existing DTPSs become overloaded either because of an inappropriate TPS optimization strategy at the backend servers or because of an improper load-balancing scheme that cannot quickly recover from an overload. Consequently, the performance of the software-intensive system suffers. In this paper, we therefore propose a new DTPS that follows collaborative round-robin load balancing, which acts as a double-edged sword. On the one hand, it effectively balances load (in an overload situation) among the available TPSs through a fast overload-recovery procedure that decelerates the load on the overloaded TPSs down to their capacities and shifts the remaining load towards other, gracefully running TPSs. On the other hand, the load-deceleration technique applied to an overloaded TPS sets an appropriate upper bound on the thread pool size, because the pool size of each TPS is kept equal to the request rate on it, thereby dynamically optimizing the TPS. We evaluated the proposed system against state-of-the-art DTPSs using a client-server-based simulator and found that our system outperformed them by sustaining smaller response times.
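A simplified sketch of the dispatch idea, with invented capacities and a unit-granularity round-robin spill-over (the paper's actual recovery procedure is not reproduced here): each backend TPS absorbs load up to its capacity, excess load is spread round-robin over pools with headroom, and each pool's thread count tracks the request rate it serves.

```python
# Simplified sketch; capacities, rates, and policy details are assumptions.

class BackendTPS:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity      # max requests/sec this TPS can absorb
        self.assigned = 0             # requests/sec currently routed to it
        self.pool_size = 0            # threads; kept equal to the assigned rate

    def free_capacity(self):
        return max(self.capacity - self.assigned, 0)

def dispatch(request_rates, pools):
    """Assign per-pool request rates, spilling overload to other pools round-robin."""
    overflow = 0
    for pool, rate in zip(pools, request_rates):
        accepted = min(rate, pool.capacity)
        pool.assigned = accepted
        overflow += rate - accepted
    # Collaborative round robin: spread the leftover across pools with headroom.
    i = 0
    while overflow > 0 and any(p.free_capacity() > 0 for p in pools):
        pool = pools[i % len(pools)]
        take = min(1, pool.free_capacity(), overflow)
        pool.assigned += take
        overflow -= take
        i += 1
    for pool in pools:
        pool.pool_size = pool.assigned   # thread count follows the request rate
    return overflow                      # load that no pool could absorb

pools = [BackendTPS("tps-a", 100), BackendTPS("tps-b", 150), BackendTPS("tps-c", 80)]
dropped = dispatch([160, 90, 40], pools)
for p in pools:
    print(p.name, p.assigned, p.pool_size)
print("unabsorbed:", dropped)
```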
87.
A computer-aided diagnostic (CAD) system for effective and accurate pulmonary nodule detection is required to detect nodules at an early stage. This paper proposes a novel technique to detect and classify pulmonary nodules based on statistical features of intensity values using a support vector machine (SVM). The significance of the proposed technique is that it uses nodule features in both 2D and 3D, together with an SVM, which is well suited to classifying the nodules extracted from the image. The lung volume is extracted from the lung CT using thresholding, background removal, hole filling, and contour correction of the lung lobes. Candidate nodules are extracted and pruned using rules based on the nodule ground truth. Statistical features of the intensity values are then extracted from the candidate nodules. The nodule data are up-sampled to reduce bias, and the SVM classifier is trained on the resulting samples. The efficiency of the proposed CAD system is tested and evaluated using the Lung Image Database Consortium (LIDC) database, a standard dataset used in CAD systems for lung nodule classification. The results obtained from the proposed CAD system compare favourably with previous CAD systems, achieving a sensitivity of 96.31%.
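The classification stage can be sketched with scikit-learn as below; the feature set, synthetic candidate volumes, and SVM parameters are assumptions for illustration rather than the paper's exact pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def intensity_features(voxels):
    """First-order statistics of a candidate nodule's intensity values."""
    v = voxels.ravel().astype(float)
    return np.array([v.mean(), v.std(), v.min(), v.max(),
                     np.percentile(v, 25), np.percentile(v, 75)])

# Toy stand-in for candidate-nodule volumes and labels (1 = nodule, 0 = non-nodule).
rng = np.random.default_rng(0)
candidates = [rng.normal(loc=200 * y + 50, scale=30, size=(9, 9, 9)) for y in (0, 1) * 50]
labels = np.array([0, 1] * 50)

X = np.vstack([intensity_features(c) for c in candidates])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```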
88.
There is significant interest in the network management and industrial security communities in identifying the "best" and most relevant features of network traffic in order to properly characterize user behaviour and predict future traffic. The ability to eliminate redundant features is an important machine learning (ML) task because it helps to identify the best features, improving classification accuracy and reducing the computational complexity of constructing the classifier. In practice, feature selection (FS) techniques can be used as a preprocessing step to eliminate irrelevant features and as a knowledge discovery tool to reveal the "best" features in many soft computing applications. In this paper, we investigate the advantages and disadvantages of such FS techniques using newly proposed metrics (namely goodness, stability, and similarity). We continue our efforts toward developing an integrated FS technique built on the key strengths of existing FS techniques. A novel way is proposed to identify the "best" features efficiently and accurately by first combining the results of several well-known FS techniques to find consistent features, and then using the proposed concept of support to select the smallest set of features that covers the data optimally. An empirical study over ten high-dimensional network traffic data sets demonstrates significant gains in accuracy and improved run-time performance of a classifier compared with the individual results produced by several well-known FS techniques.
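A minimal sketch of the combination idea, using three common scikit-learn selectors and a support threshold of two (both choices are assumptions, not the paper's):

```python
import numpy as np
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, f_classif, mutual_info_classif
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=400, n_features=40, n_informative=8, random_state=0)
X = MinMaxScaler().fit_transform(X)          # chi2 needs non-negative inputs
k = 10                                       # features kept by each base technique

selections = []
for score_fn in (chi2, f_classif, mutual_info_classif):
    selector = SelectKBest(score_fn, k=k).fit(X, y)
    selections.append(set(np.flatnonzero(selector.get_support())))

# "Support" of a feature = how many base techniques selected it.
support = Counter(f for chosen in selections for f in chosen)
consistent = sorted(f for f, s in support.items() if s >= 2)   # assumed threshold
print("features selected by at least two techniques:", consistent)
```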
89.
A mathematical model is developed to permit study of the behavior of fuel-oil droplets in a combustion chamber, and results are presented from a computer calculation performed for the 300-MW model TGMP-314P boiler of a power plant. Translated from Inzhenerno-Fizicheskii Zhurnal, Vol. 53, No. 3, pp. 450–458, September 1987.
90.
Water Resources Management - Climate change will modify the spatio-temporal variation of hydrological variables worldwide, potentially leading to more extreme events and hydraulic infrastructure...