By access type (number of articles):
Paid full text: 621
Free: 22
Free within China: 3
By subject area:
Electrical engineering: 3
Chemical industry: 121
Metalworking: 21
Mechanical engineering and instrumentation: 25
Building science: 18
Energy and power engineering: 18
Light industry: 34
Hydraulic engineering: 1
Radio and electronics: 57
General industrial technology: 153
Metallurgical industry: 89
Nuclear technology: 2
Automation technology: 104
By year of publication:
2023: 6; 2022: 13; 2021: 20; 2020: 22; 2019: 21; 2018: 19; 2017: 28; 2016: 26; 2015: 22; 2014: 12;
2013: 47; 2012: 31; 2011: 48; 2010: 20; 2009: 26; 2008: 23; 2007: 16; 2006: 19; 2005: 22; 2004: 19;
2003: 18; 2002: 10; 2001: 9; 2000: 6; 1999: 6; 1998: 13; 1997: 3; 1996: 14; 1995: 8; 1994: 9;
1993: 7; 1992: 15; 1991: 7; 1990: 11; 1989: 3; 1988: 6; 1987: 10; 1986: 5; 1985: 8; 1984: 2;
1983: 1; 1982: 1; 1981: 2; 1980: 2; 1979: 1; 1978: 3; 1977: 1; 1976: 2; 1969: 1; 1966: 1
A total of 646 results were found (search time: 0 ms).
1.
We report the synthesis of BaMgAl10O17:Eu2+ (BAM) phosphors by the sol-gel method and their luminescence properties. The blue-light-emitting BAM was synthesized using citric acid and ethylene glycol as chelating agents, and blue emission was obtained from these phosphors. The luminescent intensity increases as the heat-treatment temperature is raised. This study investigated the effect of the molar ratio of ethylene glycol to citric acid (the Φ value) on the phase formation and luminescence properties of BAM. Varying the Φ value changed the sol-gel reaction mechanism and the microstructure of the resulting powders. An increase in the Φ value leads to a faster rate of BAM phase formation, and the photoluminescent intensity of the prepared phosphors increases with heating temperature because of enhanced crystallization.
3.
When a circuit is tested using random or pseudorandom patterns, it is essential to determine the amount of time (test length) required to test it adequately. We present a methodology for predicting different statistics of random-pattern test length. While earlier methods allowed estimation only of upper bounds on test length, and only for exhaustive fault coverage, the technique presented here provides estimates of all statistics of interest (including expected value and variance) for all coverage specifications. Our methodology is based on sampling models developed for fault coverage estimation [1]. Test length is viewed as a waiting time on fault coverage, and from this relation we derive the distribution of test length as a function of fault coverage. Methods of approximating the expected value and variance of test length are presented; the accuracy of these approximations can be controlled by the user. A practical technique for predicting expected test length is developed. It is based on clustering faults into equal-detectability subsets, and a simple and effective algorithm for fault clustering is also presented. The sampling model is applied to each cluster independently and the results are then aggregated to yield test lengths for the whole circuit. Results of experiments with several circuits (both ISCAS '85 benchmarks and other practical circuits) are also provided. This work was done while the author was with the Department of Electrical Engineering, Southern Illinois University, Carbondale, IL 62901.
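As a back-of-the-envelope illustration of the "waiting time on fault coverage" view (this is not the authors' sampling model), the Python sketch below finds the smallest number of independent random patterns whose expected fault coverage reaches a target, and shows a naive way to group faults into roughly equal-detectability clusters; all detection probabilities are invented for the example.

```python
import math

def expected_coverage(detect_probs, t):
    # Expected number of faults detected by t independent random patterns,
    # where detect_probs[i] is fault i's per-pattern detection probability.
    return sum(1.0 - (1.0 - p) ** t for p in detect_probs)

def expected_test_length(detect_probs, target_coverage, max_patterns=10**6):
    # Smallest t whose *expected* coverage reaches target_coverage (a fraction
    # strictly below 1); returns None if the cap is hit first.
    need = target_coverage * len(detect_probs)
    for t in range(1, max_patterns + 1):
        if expected_coverage(detect_probs, t) >= need:
            return t
    return None

def cluster_by_detectability(detect_probs, n_clusters=3):
    # Naive "equal detectability" grouping: sort faults by detection
    # probability and cut the sorted list into n_clusters chunks.
    ranked = sorted(detect_probs)
    size = math.ceil(len(ranked) / n_clusters)
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

# Hypothetical per-fault detection probabilities for a tiny fault list.
probs = [0.5, 0.3, 0.1, 0.05, 0.01, 0.005]
print(expected_test_length(probs, target_coverage=0.95))
print(cluster_by_detectability(probs))
```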
4.
The initial rate of colloid deposition onto semi-permeable membranes is largely controlled by the coupled influence of permeation drag and particle-membrane colloidal interactions. Recent studies show that particle-membrane interactions are subject to immense local variation due to the inherent morphological heterogeneity (roughness) of reverse osmosis (RO) and nanofiltration (NF) membranes. This experimental investigation reports the effect of membrane roughness on the initial deposition of polystyrene latex particles onto a rough NF membrane during cross-flow membrane filtration under different operating pressures and solution chemistries. Atomic force microscopy (AFM) was used to characterize the roughness of the membrane and observe the structure of particle deposits. At the initial stages of fouling, the AFM images show that more particles preferentially accumulate near the "peaks" than in the "valleys" of the rough NF membrane surface.
5.
Multiversion databases store both current and historical data. Rows are typically annotated with timestamps representing the period when the row is/was valid. We develop novel techniques to reduce index maintenance in multiversion databases, so that indexes can be used effectively for analytical queries over current data without being a heavy burden on transaction throughput. To achieve this end, we re-design persistent index data structures in the storage hierarchy to employ an extra level of indirection. The indirection level is stored on solid-state disks that can support very fast random I/Os, so that traversing the extra level of indirection incurs a relatively small overhead. The extra level of indirection dramatically reduces the number of magnetic disk I/Os that are needed for index updates and localizes maintenance to indexes on updated attributes. Additionally, we batch insertions within the indirection layer in order to reduce physical disk I/Os for indexing new records. In this work, we further exploit SSDs by introducing novel DeltaBlock techniques for storing the recent changes to data on SSDs. Using our DeltaBlock, we propose an efficient method to periodically flush the recently changed data from SSDs to HDDs such that, on the one hand, we keep track of every change (or delta) for every record, and, on the other hand, we avoid redundantly storing the unchanged portion of updated records. By reducing the index maintenance overhead on transactions, we enable operational data stores to create more indexes to support queries. We have developed a prototype of our indirection proposal by extending the widely used generalized search tree open-source project, which is also employed in PostgreSQL. Our working implementation demonstrates that we can significantly reduce index maintenance and/or query processing cost by a factor of 3. For the insertion of new records, our novel batching technique can save up to 90% of the insertion time. For updates, our prototype demonstrates that we can significantly reduce the database size by up to 80% even with a modest space allocated for DeltaBlocks on SSDs.
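The following Python sketch is only a toy model of the indirection idea: secondary indexes map attribute values to logical record IDs, and a separate (conceptually SSD-resident) table maps each logical ID to the record's current physical location, so an update that moves a record touches only that table plus the indexes on attributes that actually changed. It is not the paper's GiST/PostgreSQL implementation, and all identifiers and "physical locations" are hypothetical.

```python
class IndirectionLayer:
    """Toy indirection layer: indexes hold logical record IDs (LIDs); a
    LID-to-physical table stands in for the SSD-resident mapping."""

    def __init__(self):
        self.lid_to_physical = {}   # LID -> current physical location
        self.indexes = {}           # attribute -> {value -> set of LIDs}

    def insert(self, lid, physical_loc, record):
        self.lid_to_physical[lid] = physical_loc
        for attr, value in record.items():
            self.indexes.setdefault(attr, {}).setdefault(value, set()).add(lid)

    def update(self, lid, new_physical_loc, changed_attrs):
        # Only indexes on *changed* attributes need maintenance; indexes on
        # unchanged attributes still resolve through the indirection entry.
        self.lid_to_physical[lid] = new_physical_loc
        for attr, (old_value, new_value) in changed_attrs.items():
            index = self.indexes.setdefault(attr, {})
            index.get(old_value, set()).discard(lid)
            index.setdefault(new_value, set()).add(lid)

    def lookup(self, attr, value):
        # Index probe followed by one indirection hop per matching LID.
        return [self.lid_to_physical[lid]
                for lid in self.indexes.get(attr, {}).get(value, set())]

store = IndirectionLayer()
store.insert(lid=1, physical_loc="hdd:page7:slot2", record={"name": "a", "qty": 5})
store.update(lid=1, new_physical_loc="hdd:page9:slot1", changed_attrs={"qty": (5, 6)})
print(store.lookup("name", "a"))   # the index on 'name' was untouched by the update
```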
6.
In cognitive radio networks, cognitive nodes operate on a common pool of spectrum, opportunistically accessing and using parts of the spectrum not being used by others. Although cooperation among nodes is desirable for efficient network operation and performance, there may be malicious nodes whose objective is to hinder communications and disrupt network operations. The absence of a central authority or any policy-enforcement mechanism makes these kinds of open-access networks more vulnerable and susceptible to attacks. In this paper, we analyze a common form of denial-of-service attack: collaborative jamming. We consider a network in which a group of jammers tries to jam the channels being used by legitimate users, who in turn try to evade the jammed channels. First, we compute the distribution of the jamming signal that a node experiences under a random deployment of jammers. Then, we propose different jamming and defending schemes employed by the jammers and the legitimate users, respectively. In particular, we model and analyze channel availability when the legitimate users randomly choose available channels and the jammers jam different channels at random. We propose a multi-tier proxy-based cooperative defense strategy that exploits temporal and spatial diversity for legitimate secondary users in an infrastructure-based centralized cognitive radio network. Illustrative results on spectrum availability rates show how to improve resiliency in cognitive radio networks in the presence of jammers.
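As a toy version of the random-selection setting described above (not the paper's analytical derivation), the sketch below estimates by Monte Carlo the probability that a legitimate user who picks a channel uniformly at random escapes a group of jammers that each jam a few randomly chosen channels; the channel and jammer counts are made up.

```python
import random

def availability(num_channels, num_jammers, channels_per_jammer=1, trials=100_000):
    """Monte Carlo estimate of the probability that a randomly chosen channel
    is not jammed when each jammer jams channels_per_jammer random channels."""
    safe = 0
    for _ in range(trials):
        jammed = set()
        for _ in range(num_jammers):
            jammed.update(random.sample(range(num_channels), channels_per_jammer))
        user_channel = random.randrange(num_channels)
        if user_channel not in jammed:
            safe += 1
    return safe / trials

# Hypothetical scenario: 30 channels, 5 jammers each jamming 2 channels.
print(availability(num_channels=30, num_jammers=5, channels_per_jammer=2))
```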
7.

With the exponential growth of end users and web data, the internet is undergoing a paradigm shift from a user-centric model to a content-centric one, popularly known as information-centric networking (ICN). Current ICN research revolves around three key issues: (i) content request searching, (ii) content routing, and (iii) in-network caching schemes to deliver the requested content to the end user. These improve the user experience in obtaining requested content because they lower download delay and provide higher throughput. Existing research has mainly focused on on-path congestion or the expected delivery time of a content to determine an optimized path towards the custodian. However, this ignores the cumulative effect of the link-state parameters and the state of the caches, and consequently degrades delay performance. To overcome this shortfall, we consider both the congestion of a link and the state of on-path caches to determine the best possible routes. We introduce a generic term, entropy, to quantify the effects of link congestion and the state of on-path caches. We then develop a novel entropy-dependent algorithm, ENROUTE, for searching a content request triggered by any user, routing that content, and caching it for delivery to the user. The entropy value of an intra-domain node indicates how many popular contents are already cached in the node, which in turn signifies how enriched that node is with popular contents. The entropy of a link indicates how congested the link is with the traversal of contents. To reduce delay, we enhance the entropy of caches in nodes and use low-entropy paths for downloading contents. We evaluate the performance of the proposed ENROUTE algorithm against state-of-the-art schemes for various network parameters and observe an improvement of 29–52% in delay, 12–39% in hit rate, and 4–39% in throughput.
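The sketch below is only a loose illustration of an entropy-guided next-hop choice: each candidate neighbor is scored by combining its link entropy (higher meaning more congested) with its cache entropy (higher meaning more popular contents cached), and the lowest-scoring neighbor wins. The weighting, the numeric values, and the way the entropies would be measured are assumptions, not the ENROUTE definitions.

```python
def route_score(link_entropy, cache_entropy, alpha=0.5):
    """Lower is better: penalize congested links (high link entropy) and
    reward content-rich next hops (high cache entropy). The linear weighting
    is an assumption for illustration only."""
    return alpha * link_entropy - (1.0 - alpha) * cache_entropy

def pick_next_hop(candidates, alpha=0.5):
    """candidates: {next hop: (link entropy, cache entropy)} for one request;
    returns the hop with the lowest combined score."""
    return min(candidates, key=lambda hop: route_score(*candidates[hop], alpha))

# Hypothetical per-neighbor measurements at an intra-domain node.
neighbors = {
    "r1": (0.9, 0.6),   # congested link, content-rich cache
    "r2": (0.3, 0.2),   # lightly loaded link, sparse cache
}
print(pick_next_hop(neighbors))   # -> 'r2' with alpha=0.5
```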
9.
We propose a new link metric called normalized advance (NADV) for geographic routing in multihop wireless networks. NADV selects neighbors with the optimal trade-off between proximity and link cost. Coupled with the local next-hop decision in geographic routing, NADV enables an adaptive and efficient cost-aware routing strategy. Depending on the objective or message priority, applications can use the NADV framework to minimize various types of link cost. We present efficient methods for link-cost estimation and perform detailed experiments in simulated environments. Our results show that NADV outperforms current schemes in many aspects: for example, in high-noise environments with frequent packet losses, the use of NADV leads to an 81% higher delivery ratio. When compared to centralized routing under certain settings, geographic routing using NADV finds paths whose cost is close to the optimum. We also conducted experiments on the Emulab testbed, and the results demonstrate that our proposed approach performs well in practice.
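As a hedged sketch of the idea (the paper's exact cost models and estimation methods are not reproduced here), the code below scores each neighbor by its geographic advance toward the destination divided by an estimated link cost, and forwards to the highest-scoring neighbor; the positions and link-cost values are invented.

```python
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nadv_next_hop(current, neighbors, dest):
    """Pick the neighbor maximizing normalized advance: geographic advance
    toward the destination divided by an estimated link cost (e.g. expected
    transmissions or delay). Link-cost estimation itself is not shown."""
    best, best_score = None, float("-inf")
    d_cur = distance(current, dest)
    for node, (pos, link_cost) in neighbors.items():
        advance = d_cur - distance(pos, dest)   # progress toward destination
        if advance <= 0 or link_cost <= 0:
            continue                            # skip backward moves
        score = advance / link_cost             # normalized advance
        if score > best_score:
            best, best_score = node, score
    return best

# Hypothetical neighbor table: position and estimated link cost.
neighbors = {
    "a": ((4.0, 0.0), 1.8),   # larger advance, lossier link
    "b": ((2.0, 1.0), 1.0),   # smaller advance, reliable link
}
print(nadv_next_hop(current=(0.0, 0.0), neighbors=neighbors, dest=(10.0, 0.0)))
```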
10.
In this paper, we present a new variant of Particle Swarm Optimization (PSO) for image segmentation using optimal multilevel thresholding. Some objective functions that are very efficient for bi-level thresholding are not suitable for multilevel thresholding because of the exponential growth of computational complexity. The paper also proposes an iterative scheme that is practically more suitable for obtaining initial values of candidate multilevel thresholds; this self-iterative scheme finds the suitable number of thresholds that should be used to segment an image, and is based on the well-known Otsu method, which shows a linear growth of computational complexity. The thresholds resulting from the iterative scheme are taken as initial thresholds, and the particles of the proposed PSO variant are created randomly around them. The proposed PSO algorithm makes a new contribution by adapting the 'social' and 'momentum' components of the velocity equation for particle move updates. The proposed segmentation method is applied to four benchmark images, and its performance outperforms results obtained with well-known methods such as the Gaussian-smoothing method (Lim, Y. K., & Lee, S. U. (1990). On the color image segmentation algorithm based on the thresholding and the fuzzy c-means techniques. Pattern Recognition, 23, 935–952; Tsai, D. M. (1995). A fast thresholding selection procedure for multimodal and unimodal histograms. Pattern Recognition Letters, 16, 653–666), the symmetry-duality method (Yin, P. Y., & Chen, L. H. (1993). New method for multilevel thresholding using the symmetry and duality of the histogram. Journal of Electronics and Imaging, 2, 337–344), a GA-based algorithm (Yin, P.-Y. (1999). A fast scheme for optimal thresholding using genetic algorithms. Signal Processing, 72, 85–95), and the basic PSO variant employing a linearly decreasing inertia weight factor.
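The sketch below is a plain PSO over candidate multilevel thresholds, seeded around user-supplied initial thresholds and maximizing Otsu-style between-class variance on a grey-level histogram. It does not implement the paper's adapted 'social'/'momentum' velocity components or its self-iterative scheme for choosing the number of thresholds; the histogram, initial thresholds, and PSO parameters are arbitrary.

```python
import random

def between_class_variance(hist, thresholds):
    # Otsu-style objective: weighted between-class variance of the classes
    # induced on the grey-level histogram by the given thresholds.
    total = sum(hist)
    cuts = [0] + sorted(int(t) for t in thresholds) + [len(hist)]
    mu_total = sum(i * h for i, h in enumerate(hist)) / total
    var = 0.0
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        w_k = sum(hist[lo:hi]) / total
        if w_k == 0:
            continue
        mu_k = sum(i * hist[i] for i in range(lo, hi)) / (w_k * total)
        var += w_k * (mu_k - mu_total) ** 2
    return var

def pso_thresholds(hist, init, iters=200, particles=20, w=0.7, c1=1.5, c2=1.5):
    # Plain global-best PSO started near the initial thresholds `init`.
    dim, levels = len(init), len(hist)
    swarm = [[min(levels - 1, max(1, t + random.uniform(-10, 10))) for t in init]
             for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in swarm]
    gbest = max(pbest, key=lambda p: between_class_variance(hist, p))[:]
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - p[d])
                             + c2 * r2 * (gbest[d] - p[d]))
                p[d] = min(levels - 1, max(1, p[d] + vel[i][d]))
            if between_class_variance(hist, p) > between_class_variance(hist, pbest[i]):
                pbest[i] = p[:]
        gbest = max(pbest, key=lambda p: between_class_variance(hist, p))[:]
    return sorted(int(t) for t in gbest)

# Hypothetical 256-bin histogram with two bright plateaus, and rough initial guesses.
hist = [5] * 64 + [40] * 64 + [5] * 64 + [30] * 64
print(pso_thresholds(hist, init=[80, 190]))
```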