981.
Based on the well-known Schnorr signature scheme, we propose a new chameleon hash scheme that enjoys all the advantages of previous schemes: collision resistance, message hiding, semantic security, and key-exposure freeness.
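The abstract does not give the construction itself. As an illustration of what a chameleon hash does, here is a toy discrete-log chameleon hash in the classic Krawczyk–Rabin style, not the paper's Schnorr-based scheme; all parameters are tiny, insecure, and chosen only for demonstration:

```python
# Toy discrete-log chameleon hash (Krawczyk-Rabin style), illustrative only.
# The paper's Schnorr-based scheme differs; parameters here are tiny and insecure.
p = 467          # small prime (real schemes use >= 2048-bit groups)
q = 233          # prime with q | p - 1
g = 4            # generator of the order-q subgroup
x = 57           # trapdoor (secret key)
y = pow(g, x, p) # public key

def ch(m, r):
    """Chameleon hash CH(m, r) = g^m * y^r mod p."""
    return (pow(g, m % q, p) * pow(y, r % q, p)) % p

def collide(m, r, m2):
    """With the trapdoor x, find r2 so that CH(m2, r2) == CH(m, r)."""
    # m + x*r = m2 + x*r2 (mod q)  =>  r2 = (m - m2) * x^{-1} + r (mod q)
    return ((m - m2) * pow(x, -1, q) + r) % q

m, r = 42, 99
m2 = 123
r2 = collide(m, r, m2)
assert ch(m, r) == ch(m2, r2)  # trapdoor holder finds a collision at will
```

Collision resistance holds for anyone without `x`; the trapdoor holder can open the hash to any message, which is exactly the "chameleon" property.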
982.
Text categorization based on combination of modified back propagation neural network and latent semantic analysis (Total citations: 1; self-citations: 1; other citations: 0)
This paper proposes a new text categorization model based on the combination of a modified back propagation neural network (MBPNN) and latent semantic analysis (LSA). The traditional back propagation neural network (BPNN) trains slowly and easily becomes trapped in a local minimum, which leads to poor performance and efficiency. We propose the MBPNN to accelerate training and improve categorization accuracy. LSA overcomes the problems of word-based indexing by using statistically derived conceptual indices instead of individual words: it constructs a conceptual vector space in which each term or document is represented as a vector. This not only greatly reduces the dimensionality but also uncovers important associative relationships between terms. We test our categorization model on the 20 Newsgroups and Reuters-21578 corpora. Experimental results show that the MBPNN trains much faster than the traditional BPNN and also improves on its performance, and that applying LSA yields a dramatic dimensionality reduction while maintaining good classification accuracy.
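The LSA step described above can be sketched in a few lines: project a term-document count matrix onto its top-k singular vectors to obtain low-dimensional "conceptual" document vectors. The matrix values below are invented for illustration; this is not the paper's MBPNN classifier:

```python
import numpy as np

# Minimal LSA sketch: truncated SVD of a term-document count matrix.
X = np.array([           # rows = terms, columns = documents (made-up counts)
    [2, 1, 0, 0],        # "stock"
    [1, 2, 0, 0],        # "market"
    [0, 0, 3, 1],        # "neural"
    [0, 0, 1, 2],        # "network"
], dtype=float)

k = 2                                   # number of latent concepts to keep
U, s, Vt = np.linalg.svd(X, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dimensional vector per document

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents 0 and 1 share "stock"/"market" terms, so their concept
# vectors are far more similar than those of documents 0 and 2.
assert cos(doc_vecs[0], doc_vecs[1]) > cos(doc_vecs[0], doc_vecs[2])
```

In a full pipeline these k-dimensional document vectors, rather than raw term counts, would be fed to the classifier, which is where the dimensionality reduction the abstract mentions comes from.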
983.
Towards capacity and profit optimization of video-on-demand services in a peer-assisted IPTV platform (Total citations: 1; self-citations: 0; other citations: 1)
Yih-Farn Chen, Yennun Huang, Rittwik Jana, Hongbo Jiang, Michael Rabinovich, Jeremy Rahe, Bin Wei, Zhen Xiao 《Multimedia Systems》2009,15(1):19-32
This paper studies the conditions under which peer-to-peer (P2P) technology may be beneficial in providing IPTV services over typical network architectures. It makes three major contributions. First, we contrast two network models used to study the performance of such a system: a commonly used logical "Internet as a cloud" model and a "physical" model that reflects the characteristics of the underlying network. Specifically, we show that the cloud model overlooks important architectural aspects of the network and may drastically overstate the benefits of P2P technology. Second, we propose an algorithm called Zebra that pre-stripes content across multiple peers during idle hours to speed up P2P content delivery in an IPTV environment with limited upload bandwidth. We also perform simulations to measure Zebra's effectiveness at reducing load on the content server during peak hours. Third, we provide a cost-benefit analysis of P2P video content delivery, focusing on the profit trade-offs of different pricing/incentive models rather than purely on capacity maximization. In particular, we find that under a high volume of video demand, a P2P built-in incentive model performs better than any other model, while the conventional no-P2P model generates more profit when the request rate is low. The flat-reward model generally falls between the usage-based and built-in models in terms of profitability, except at low request rates. We also find that the built-in and flat-reward models are more profitable than the usage-based model for a wide range of subscriber community sizes.
Funding for J. Rahe's research has been provided by AT&T Labs, the State of California under the MICRO program, and by the Toshiba Corporation.
Zhen Xiao is partially supported by China MOST project (2006BAH02A10).
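The pre-striping idea behind Zebra can be sketched in its simplest form: during idle hours, chunks of a video are spread round-robin across peers so that a later viewer can fetch stripes from many peers in parallel instead of loading the content server. This round-robin layout is an illustrative assumption, not the paper's actual placement algorithm:

```python
# Illustrative round-robin pre-striping sketch (not Zebra's actual algorithm).
def stripe(chunks, peers):
    """Assign chunk i to peer i mod len(peers); return peer -> chunk list."""
    layout = {p: [] for p in peers}
    for i, c in enumerate(chunks):
        layout[peers[i % len(peers)]].append(c)
    return layout

chunks = [f"chunk{i}" for i in range(10)]
layout = stripe(chunks, ["peerA", "peerB", "peerC"])
assert layout["peerA"] == ["chunk0", "chunk3", "chunk6", "chunk9"]
assert sum(len(v) for v in layout.values()) == len(chunks)
```

With the chunks spread this way, a viewer's download rate is limited by the sum of the peers' upload links rather than by any single peer, which matters in the limited-upload-bandwidth setting the abstract describes.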
984.
Model checking is a formal technique used to verify communication protocols against given properties. In this paper, we propose a new model checking algorithm that aims at verifying systems designed as a set of autonomous interacting agents. These software agents are equipped with knowledge and beliefs and interact with each other according to protocols governed by a set of logical rules. We present a tableau-based version of this algorithm and provide soundness, completeness, termination, and complexity results. A case study of an agent-based negotiation protocol and its implementation is also described.
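The core of explicit-state model checking is a reachability search over the protocol's state space. As a much simpler cousin of the tableau-based algorithm above, here is a generic safety check run on a toy two-agent handshake protocol; the protocol, its states, and the invariant are invented for illustration:

```python
from collections import deque

# Generic explicit-state safety check: BFS over all reachable states,
# reporting the first state that violates the invariant (None if safe).
def check(init, step, invariant):
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s
        for t in step(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None

# Toy protocol: states are (agent1, agent2) in {"idle", "offer", "accept"}.
def step(s):
    a, b = s
    out = []
    if a == "idle":
        out.append(("offer", b))            # agent1 makes an offer
    if a == "offer" and b == "idle":
        out.append((a, "accept"))           # agent2 accepts a pending offer
    return out

# Safety property: agent2 never accepts while no offer has been made.
violation = check(("idle", "idle"), step, lambda s: s != ("idle", "accept"))
assert violation is None
```

A tableau-based algorithm differs in that it decomposes the property formula syntactically while exploring states, but the exhaustive-exploration skeleton is the same.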
985.
Agent trust research is becoming increasingly important because trust ensures good interactions among software agents in large-scale open systems. Moreover, individual agents often interact with long-term coalitions, such as e-commerce web sites, so an agent should choose a coalition based on both utility and trust. Unfortunately, few studies have examined coalition credit, and there is a need to do so in detail. To this end, a long-term coalition credit model (LCCM) is presented, and the relationship between coalition credit and coalition payoff is also examined. LCCM consists of internal trust, based on an agent's direct interactions, and external reputation, based on an agent's direct observations. The generality of LCCM is demonstrated through experiments in both cooperative and competitive domain environments. Experimental results show that LCCM computes coalition credit efficiently and properly reflects the effect of various factors on coalition credit. Another important advantage, a useful and basic property of credit, is that LCCM can effectively filter out inaccurate or deceptive information from interactions.
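The abstract names two ingredients but not their formulas, so the following is only a hedged sketch of how internal trust (from direct interactions) and external reputation (from observation) might be blended; the exponential decay and the weight `alpha` are invented, not LCCM's actual definitions:

```python
# Hedged sketch of a trust/reputation blend; decay and alpha are assumptions,
# not the paper's LCCM formulas.
def internal_trust(outcomes, decay=0.9):
    """Exponentially decayed average of past interaction outcomes in [0, 1]."""
    num = den = 0.0
    for age, outcome in enumerate(reversed(outcomes)):  # newest first
        w = decay ** age
        num += w * outcome
        den += w
    return num / den if den else 0.5  # 0.5 = neutral prior, no history

def coalition_credit(outcomes, reputation, alpha=0.7):
    """Weighted blend of internal trust and external reputation."""
    return alpha * internal_trust(outcomes) + (1 - alpha) * reputation

credit = coalition_credit([1.0, 1.0, 0.0, 1.0], reputation=0.6)
assert 0.0 <= credit <= 1.0
```

Decaying old outcomes lets recent behavior dominate, which is one common way such models limit the influence of stale or misleading interaction reports.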
986.
Conventional time series models have been applied to many forecasting problems, such as financial, economic, and weather forecasting. In stock markets, correct stock predictions can bring huge profits to investors. However, conventional time series models produce forecasts under strict statistical assumptions about data distributions and are therefore not well suited to forecasting financial datasets. This paper proposes a new forecasting model that uses adaptive learning techniques to predict the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) from multiple stock indexes (the NASDAQ and Dow Jones indexes). For verification, this paper employs a seven-year period of the TAIEX, from 1997 to 2003, as the experimental dataset, with the root mean square error (RMSE) as the evaluation criterion. The performance comparison shows that the proposed model outperforms the listed methods in forecasting the Taiwan stock market. Moreover, statistical tests show that the volatility of the Dow Jones and the NASDAQ significantly affects the TAIEX.
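The evaluation criterion named above, RMSE, is straightforward to compute; the index values below are made up for illustration and are not TAIEX data:

```python
import math

# RMSE between an actual and a predicted series (toy numbers, not TAIEX data).
def rmse(actual, predicted):
    """Root mean square error between two equal-length series."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

actual    = [100.0, 102.0, 101.0, 105.0]
predicted = [101.0, 101.0, 102.0, 104.0]
assert rmse(actual, predicted) == 1.0  # every prediction is off by exactly 1
```

A lower RMSE means predictions sit closer to the realized index values, which is how the comparison against the listed methods is scored.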
987.
Grid computing has become a practical computing paradigm and solution for distributed systems and applications. Increasing numbers of resources are involved in Grid environments, and a large number of applications run on computational Grids. Unfortunately, Grid computing technologies remain far out of reach for inexperienced application users, e.g., computational scientists and engineers; a software layer is required to provide an easy interface to Grids for end users. To meet this requirement, the HEAVEN (Hosting European Application Virtual ENvironment) upperware is proposed, built on top of Grid middleware. This paper presents the HEAVEN philosophy of virtual computing for Grids, which combines simulation and emulation approaches. The concept of the Virtual Private Computing Environment (VPCE) is then proposed and defined. The design and current implementation of the HEAVEN upperware are discussed in detail. A use case of the Ag2D application justifies the HEAVEN virtual computing methodology and the design and implementation of the HEAVEN upperware.
988.
Segmenting faces in coal-mine tunnel surveillance images is quite difficult. Surveillance images of miners in a tunnel were captured from a typical monitoring angle, and the statistical characteristics of miners' facial skin color were analyzed in each component of the HSV color model. In the hue component space, upper- and lower-threshold segmentation was used to extract skin-color pixels along with background pixels of similar hue; the binary image of the face region was then smoothed and segmented, greatly reducing the influence of background pixels and narrowing the face region. Experimental results show that this method is computationally simple and fast, and successfully segments face regions in tunnel surveillance images.
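The hue-thresholding step described above amounts to a band-pass test on the hue channel. The bounds and the tiny "image" below are invented; in the paper the thresholds come from statistics of miners' skin color in real tunnel footage:

```python
import numpy as np

# Hue band-pass sketch: keep pixels whose hue lies inside [lo, hi].
# Bounds and pixel values are illustrative assumptions, not the paper's.
def hue_mask(hue, lo, hi):
    """Binary mask of pixels with lo <= hue <= hi (hue normalized to [0, 1))."""
    return (hue >= lo) & (hue <= hi)

hue = np.array([[0.02, 0.50],
                [0.08, 0.95]])        # a 2x2 hue channel, values in [0, 1)
mask = hue_mask(hue, lo=0.0, hi=0.1)  # rough skin-tone hue band (assumed)
assert mask.tolist() == [[True, False], [True, False]]
```

The resulting binary mask still contains background pixels whose hue falls inside the band, which is why the abstract follows this step with smoothing and region segmentation.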
989.
A new multiple-watermark embedding algorithm for flexible image authentication is proposed. Unlike traditional block-wise independent watermarking, in which each image block carries a single watermark, the algorithm embeds multiple watermarks in each block. Two general block-level hierarchy models are proposed to form a hierarchical structure within each image block, and watermarks are generated and embedded independently for each block and for each sub-block at every level of the hierarchy. Image feature values are mapped to the initial value of a chaotic system and block indices to its iteration count; chaotic iteration then generates the block watermark, which replaces the least significant bits of selected pixels in the block to complete the embedding. Experimental results show that the algorithm supports multiple levels of authentication, accurately detects and localizes tampered regions, and allows different localization precisions to be selected.
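Two of the mechanisms the abstract describes, a chaotic map driven by a seed and an iteration count, and LSB replacement, can be sketched as follows. The logistic map, its parameters, and the toy pixel values are illustrative assumptions; the paper's chaotic system and pixel selection are not specified in the abstract:

```python
# Hedged sketch: chaotic watermark bits (logistic map, assumed) + LSB embedding.
def logistic_bits(seed, iterations, n_bits):
    """Iterate x -> 4x(1-x), skip transients, then threshold to bits."""
    x = seed
    for _ in range(iterations):      # block index -> iteration count
        x = 4.0 * x * (1.0 - x)
    bits = []
    for _ in range(n_bits):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

def embed_lsb(pixels, bits):
    """Replace each pixel's least significant bit with a watermark bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

pixels = [200, 131, 54, 77]          # toy 8-bit pixel values
wm = logistic_bits(seed=0.37, iterations=10, n_bits=4)
marked = embed_lsb(pixels, wm)
assert [m & 1 for m in marked] == wm                          # extractable
assert all(abs(m - p) <= 1 for m, p in zip(marked, pixels))   # near-invisible
```

Because the verifier can regenerate the same bit sequence from the image features and block index, any tampered block produces mismatched LSBs, which is what makes tamper localization possible.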
990.
Recompression detection is a key preprocessing step in multi-level steganalysis, and high-accuracy recompression detection is an important precondition for better steganalysis performance. The effects of recompression on various features of JPEG images are studied in depth, and on this basis a recompression detection algorithm is proposed that fuses histogram-distribution features, Benford features, and DFT features. Simulation experiments show that the algorithm achieves a higher detection rate and is well suited to recompression detection in multi-level JPEG steganalysis.
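One of the feature families named above, the Benford (first-digit) feature, is easy to sketch: tabulate the leading digits of nonzero values. In the paper these values would be quantized JPEG DCT coefficients; the list below is made-up data for illustration:

```python
from collections import Counter

# First-digit (Benford) histogram over nonzero values; the coefficient
# list is invented, not real JPEG DCT data.
def first_digit_histogram(values):
    """Normalized frequency of leading digits 1-9 among nonzero values."""
    digits = [int(str(abs(v))[0]) for v in values if v != 0]
    counts = Counter(digits)
    total = len(digits)
    return [counts.get(d, 0) / total for d in range(1, 10)]

coeffs = [12, -3, 1, 0, 25, -1, 7, 104, 0, -19]
hist = first_digit_histogram(coeffs)
assert abs(sum(hist) - 1.0) < 1e-9
assert hist[0] == 0.625  # digit 1 leads in 5 of the 8 nonzero values
```

Singly compressed JPEG coefficients tend to follow a generalized Benford law, and recompression perturbs this distribution, which is why the first-digit histogram is a useful detection feature to fuse with the histogram and DFT features.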