1.
Efficient electricity price forecasting plays a significant role in our society. In this paper, a novel influencer-defaulter mutation (IDM) operator is proposed. The IDM operator is combined with six well-known optimization algorithms to create mutated optimization algorithms, whose performance is tested on twenty-four standard benchmark functions. Further, an artificial neural network is integrated with the mutated optimization algorithms to solve the electricity price prediction problem. Policymakers can identify appropriate variables based on the predicted prices to support future market planning. The statistical results demonstrate the efficacy of the IDM operator on recent optimization algorithms.
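The abstract does not describe the internals of the IDM operator, so the sketch below only illustrates, under stated assumptions, how a mutation operator can be plugged into a population-based optimizer that tunes a small neural network for price forecasting. The `idm_mutation` shown (pulling the worst "defaulter" solutions toward the best "influencer") is a hypothetical reading of the name, not the authors' operator, and the data are synthetic.

```python
# Hypothetical sketch: a mutation operator inside a population-based optimizer
# that tunes a tiny feed-forward network for price forecasting.  The mutation
# rule below is an illustrative guess, NOT the paper's IDM operator.
import numpy as np

rng = np.random.default_rng(0)

# Toy hourly price series and 24-lag input features.
prices = np.sin(np.arange(200) / 8.0) + 0.1 * rng.standard_normal(200)
X = np.stack([prices[i:i + 24] for i in range(len(prices) - 25)])
y = prices[24:-1]

N_HIDDEN = 8
DIM = 24 * N_HIDDEN + N_HIDDEN            # weights of a 24-8-1 network (no biases)

def predict(w, X):
    W1 = w[:24 * N_HIDDEN].reshape(24, N_HIDDEN)
    W2 = w[24 * N_HIDDEN:]
    return np.tanh(X @ W1) @ W2

def fitness(w):                           # mean squared forecasting error
    return np.mean((predict(w, X) - y) ** 2)

def idm_mutation(pop, scores, rate=0.3):
    """Pull the worst ('defaulter') members toward the best ('influencer')."""
    best = pop[np.argmin(scores)]
    worst_idx = np.argsort(scores)[-int(rate * len(pop)):]
    pop[worst_idx] += (0.5 * (best - pop[worst_idx])
                       + 0.05 * rng.standard_normal(pop[worst_idx].shape))
    return pop

pop = 0.1 * rng.standard_normal((30, DIM))
for _ in range(200):                      # mutation-only loop, kept minimal
    scores = np.array([fitness(w) for w in pop])
    pop = idm_mutation(pop, scores)

print("best MSE:", min(fitness(w) for w in pop))
```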
2.
Accurate shear-wave (S-wave) velocity from well logs is an important parameter affecting the quality of prestack seismic attribute analysis and inversion. When dipole shear-wave logging is unavailable, mathematical models such as Krief and Pride are used to predict the S-wave velocity; in recent years, the critical-porosity model has performed well in computing the elastic moduli of the rock frame. Building on earlier critical-porosity models, a Nur model with variable critical porosity is developed, and a genetic algorithm is applied to compute the variable critical porosity, from which the S-wave velocity of a shale gas formation is finally estimated. A field application shows that the genetic algorithm can compute the critical porosity at different depths along the borehole, and the predicted S-wave velocity agrees well with the dipole shear-wave log (DSI), demonstrating that the method is feasible for S-wave estimation in shale gas reservoirs.
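As a rough illustration of the workflow described above, the sketch below fits a per-depth critical porosity with a tiny genetic algorithm using the standard Nur (modified Voigt) critical-porosity relation, then predicts Vs from the calibrated dry-rock shear modulus. The mineral moduli, densities, GA settings, and the omission of fluid substitution are simplifying assumptions, not values from the paper.

```python
# Minimal sketch: GA search for a variable critical porosity phi_c at each
# depth so the modelled Vp matches the measured one; the calibrated phi_c is
# then used to predict Vs.  All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
K_MIN, MU_MIN, RHO = 37.0e9, 44.0e9, 2500.0    # assumed quartz-like matrix

def moduli(phi, phi_c):
    """Nur critical-porosity (modified Voigt) dry-rock moduli."""
    f = np.clip(1.0 - phi / phi_c, 0.0, 1.0)
    return K_MIN * f, MU_MIN * f

def vp_model(phi, phi_c):
    k, mu = moduli(phi, phi_c)
    return np.sqrt((k + 4.0 * mu / 3.0) / RHO)

def ga_fit_phic(phi, vp_obs, gens=40, pop_size=30):
    pop = rng.uniform(phi + 0.01, 0.6, pop_size)           # candidate phi_c values
    for _ in range(gens):
        err = np.abs(vp_model(phi, pop) - vp_obs)
        parents = pop[np.argsort(err)[:pop_size // 2]]      # selection
        children = parents + 0.01 * rng.standard_normal(parents.size)  # mutation
        pop = np.concatenate([parents, np.clip(children, phi + 0.01, 0.6)])
    return pop[np.argmin(np.abs(vp_model(phi, pop) - vp_obs))]

# Synthetic "log": porosity and measured Vp at a few depth samples.
phi_log = np.array([0.08, 0.10, 0.12])
vp_log = np.array([4300.0, 4100.0, 3900.0])

for phi, vp in zip(phi_log, vp_log):
    phi_c = ga_fit_phic(phi, vp)
    vs = np.sqrt(moduli(phi, phi_c)[1] / RHO)               # predicted Vs
    print(f"phi={phi:.2f}  phi_c={phi_c:.3f}  Vs={vs:.0f} m/s")
```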
3.
Portfolio selection is a key issue in the business world and in finance. This article presents a new decision-making method for portfolio optimization (PO) problems under different risk measures, based on hybrid meta-heuristic algorithms with cardinality constraints. Starting from Markowitz's mean–variance (MV) method, three risk measures are considered: mean absolute deviation (MAD), semi-variance (SV), and variance with skewness (VWS). The developed algorithms are the electromagnetism-like algorithm (EM), particle swarm optimization (PSO), the genetic algorithm (GA), genetic network programming (GNP), and simulated annealing (SA). A diversification mechanism is also implemented and hybridized with these algorithms to increase diversity and escape local optima. The proposed model is validated on 50 firms listed on the Iranian stock exchange. Finally, the experimental results of the proposed algorithms with cardinality constraints are compared using four performance metrics, and the algorithms' ability to reach the optimal solution is discussed. In addition, an analysis of variance is conducted to confirm the validity of the results and the effectiveness of the method.
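To make the setting concrete, the following sketch applies simulated annealing (one of the algorithms listed above) to a cardinality-constrained mean-variance portfolio. The synthetic data, the swap move, and the cooling schedule are illustrative choices, not the paper's configuration or its other risk measures.

```python
# Simulated-annealing search for a cardinality-constrained mean-variance
# portfolio: at most K assets, long-only, weights sum to 1.  Data and
# parameters are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
N, K, LAMB = 20, 5, 3.0                        # assets, max cardinality, risk aversion
mu = rng.normal(0.08, 0.04, N)                 # expected returns (synthetic)
A = rng.normal(0.0, 0.02, (N, N))
cov = A @ A.T + np.diag(rng.uniform(0.01, 0.05, N))   # positive-definite covariance

def objective(w):
    return LAMB * w @ cov @ w - mu @ w         # minimise risk term minus return

def random_portfolio():
    w = np.zeros(N)
    w[rng.choice(N, K, replace=False)] = rng.dirichlet(np.ones(K))
    return w

def neighbour(w):
    """Swap one held asset for an unheld one and re-draw the K weights."""
    held = np.flatnonzero(w)
    keep = np.delete(held, rng.integers(len(held)))
    new = rng.choice(np.setdiff1d(np.arange(N), held))
    nw = np.zeros(N)
    nw[np.append(keep, new)] = rng.dirichlet(np.ones(len(keep) + 1))
    return nw

w = random_portfolio()
cur = objective(w)
best_w, best_f, temp = w, cur, 1.0
for _ in range(5000):
    cand = neighbour(w)
    f = objective(cand)
    if f < cur or rng.random() < np.exp((cur - f) / temp):   # Metropolis acceptance
        w, cur = cand, f
        if cur < best_f:
            best_w, best_f = w, cur
    temp *= 0.999                                            # geometric cooling

print("selected assets:", np.flatnonzero(best_w), "objective:", round(best_f, 4))
```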
4.
To implement on-line, real-time monitoring of the surface morphology of Plasma-Facing Materials (PFMs) in a tokamak, we developed a laser speckle interferometry measurement approach. A laser ablation method was used to simulate the erosion process during plasma-wall interactions in a tokamak. In the present investigation, we evaluated laser-ablation-induced morphology changes on the surface of a Mo sample reconstructed by four different approaches (Flood-fill, Quality-guided, Discrete Cosine Transform (DCT), and weighted DCT). The morphology measured by the weighted-DCT approach is very close to that measured by confocal microscopy, with an average error within 7%. This verifies that the weighted-DCT algorithm has high accuracy and can efficiently reduce the influence of noise from laser ablation, which is used as a proxy for erosion from plasma-wall interaction. In addition, the required CPU time is reduced. This is of great significance for future real-time monitoring of PFM morphology in the Experimental Advanced Superconducting Tokamak (EAST).
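For background, the sketch below implements the standard unweighted DCT least-squares phase-unwrapping step (Ghiglia-Romero style), which is the building block that weighted-DCT variants iterate on. It is not the authors' weighted algorithm or their interferometry pipeline; the synthetic ramp is only a self-check.

```python
# Unweighted DCT least-squares phase unwrapping: solve the discrete Poisson
# equation driven by wrapped phase differences via a 2-D DCT.
import numpy as np
from scipy.fft import dctn, idctn

def wrap(p):
    return (p + np.pi) % (2 * np.pi) - np.pi

def unwrap_dct(psi):
    M, N = psi.shape
    # Wrapped first differences, zero-padded at the far borders (Neumann BC).
    dx = np.zeros_like(psi)
    dy = np.zeros_like(psi)
    dx[:-1, :] = wrap(np.diff(psi, axis=0))
    dy[:, :-1] = wrap(np.diff(psi, axis=1))
    rho = (dx - np.vstack([np.zeros((1, N)), dx[:-1, :]])
           + dy - np.hstack([np.zeros((M, 1)), dy[:, :-1]]))
    # Poisson solve in the DCT domain.
    i, j = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
    denom = 2 * np.cos(np.pi * i / M) + 2 * np.cos(np.pi * j / N) - 4
    denom[0, 0] = 1.0                     # avoid division by zero at the DC term
    phi_hat = dctn(rho, norm="ortho") / denom
    phi_hat[0, 0] = 0.0                   # unwrapped phase is defined up to a constant
    return idctn(phi_hat, norm="ortho")

# Self-check on a synthetic ramp: the result should match the true phase
# up to an additive constant.
true = np.fromfunction(lambda i, j: 0.2 * i + 0.1 * j, (64, 64))
rec = unwrap_dct(wrap(true))
print("max deviation:", np.max(np.abs((rec - rec.mean()) - (true - true.mean()))))
```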
5.
As an important component of precision agriculture, automatic steering systems for agricultural tractors are being studied more and more widely. Through a review and analysis of the relevant literature, this paper summarizes domestic research results on agricultural tractor automatic steering systems in terms of their two components, the steering actuator and the steering control system, and offers suggestions and references for designing a well-conceived steering actuator and a stable, accurate steering control system.
6.
Evaluating XML queries such as XQuery or XPath expressions is a challenging task due to their complexity. Many algorithms have been introduced to cope with this problem. Some of them, called binary joins, evaluate separate parts of a query and subsequently merge the intermediate results, while others, called holistic twig joins, evaluate a query as a whole. These algorithms also differ in which index data structure they use to handle XML data. There are cost-based approaches utilizing binary joins and various index data structures; however, they share a limitation: they cannot perform a join between query nodes that do not have a direct XPath relationship. Such a join can be advantageous, especially if the joint selectivity of the nodes is high. Since holistic joins work with all query nodes, they overcome this limitation. In this article, we introduce such a holistic twig join, called CostTwigJoin. To the best of our knowledge, CostTwigJoin is the first holistic join capable of combining various index data structures during the evaluation of an XML query. Using a holistic join has yet another advantage for cost-based approaches: the optimizer does not have to resolve the order of binary joins, so the search space is reduced. In this article, we perform thorough experiments on hundreds of queries to evaluate our approach and demonstrate its advantages.
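For readers unfamiliar with the machinery, the sketch below shows the (start, end, level) region encoding and a merge-style structural join for a single ancestor//descendant step. This is the classic building block underlying both binary and holistic twig joins; it is not the CostTwigJoin algorithm or its cost model.

```python
# Region encoding and a simplified merge-style structural join for an
# ancestor//descendant step.  Both input lists must be sorted by start position.
from collections import namedtuple

Node = namedtuple("Node", "label start end level")

def contains(a, d):
    """True if node a is an ancestor of node d under region encoding."""
    return a.start < d.start and d.end < a.end

def structural_join(ancestors, descendants):
    out, stack, di = [], [], 0
    for a in ancestors:
        # Join descendants that start before the next ancestor against the current stack.
        while di < len(descendants) and descendants[di].start < a.start:
            d = descendants[di]
            out.extend((s, d) for s in stack if contains(s, d))
            di += 1
        # Drop stack entries that cannot contain anything starting at or after a.start.
        stack = [s for s in stack if s.end > a.start] + [a]
    for d in descendants[di:]:
        out.extend((s, d) for s in stack if contains(s, d))
    return out

# Tiny document:  <book><title/><author><name/></author></book>
book = Node("book", 1, 10, 1)
title = Node("title", 2, 3, 2)
author = Node("author", 4, 9, 2)
name = Node("name", 5, 6, 3)

print([(a.label, d.label) for a, d in structural_join([book, author], [title, name])])
# -> [('book', 'title'), ('book', 'name'), ('author', 'name')]
```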
7.
Combinatorial auctions are a useful trading mechanism for transportation service procurement in e-marketplaces. To enhance competition in combinatorial auctions, a novel auction mechanism based on two-round bidding with bundling optimization is proposed. Under the proposed mechanism, the shipper/auctioneer groups the objects into several bundles based on the bidding results of the first round, and the carriers/bidders then bid for the object bundles in the second round. The bundling optimization is formulated as a multi-objective model with two criteria, price complementation and combination consistency. A quantum evolutionary algorithm (QEA) with a β-based rotation gate and an encoding scheme based on the non-zero elements of the complementary coefficient matrix is developed to solve the model. Compared with a contrast genetic algorithm, the QEA achieves better computational performance for small and medium-sized problems.
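The abstract does not give the bundling model or the β-based gate in detail, so the sketch below only shows a generic quantum-inspired evolutionary algorithm skeleton: qubit amplitudes are "observed" into binary candidates and rotated toward the best solution found so far. The OneMax toy objective and the fixed rotation angle are stand-ins for the paper's bundling objective and gate design.

```python
# Generic QEA skeleton: each solution bit is a qubit whose amplitude is nudged
# toward the best observed solution by a rotation gate.  Toy objective only.
import numpy as np

rng = np.random.default_rng(3)
N_BITS, POP, GENS, DTHETA = 20, 10, 100, 0.05 * np.pi

def fitness(x):
    return x.sum()                         # toy objective: maximise number of ones

# Qubit angles: P(bit = 1) = sin(theta)^2, initialised to 0.5.
theta = np.full((POP, N_BITS), np.pi / 4)
best_x, best_f = None, -1

for _ in range(GENS):
    # "Observe" each qubit to obtain binary candidate solutions.
    x = (rng.random((POP, N_BITS)) < np.sin(theta) ** 2).astype(int)
    f = np.array([fitness(xi) for xi in x])
    if f.max() > best_f:
        best_f, best_x = int(f.max()), x[f.argmax()].copy()
    # Rotation gate: rotate each qubit toward the corresponding bit of the
    # best solution found so far.
    direction = np.where(best_x == 1, 1.0, -1.0)
    theta += DTHETA * direction
    theta = np.clip(theta, 0.01, np.pi / 2 - 0.01)

print("best fitness:", best_f, "solution:", best_x)
```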
8.
Accelerated life testing (ALT) of a field-programmable gate array (FPGA) requires it to be configured with a circuit that satisfies multiple criteria. Hand-crafting such a circuit is a herculean task, as many components of the criteria are orthogonal to each other, demanding a complex multivariate optimization. This paper presents an evolutionary algorithm aided by particle swarm optimization to generate synthetic benchmark circuits (SBCs) that can be used for ALT of FPGAs. The proposed algorithm was used to generate an SBC for ALT of a commercial FPGA. When compared with a hand-crafted circuit, the generated SBC proved more suitable for ALT, measured in terms of meeting the multiple criteria: it utilizes 8.37% more resources, operates at a 40% higher maximum frequency, and has 7.75% higher switching activity than the hand-crafted circuit reported in the literature. Moreover, the hand-crafted circuit is specific to a particular device of that FPGA family, whereas the proposed algorithm is device-independent. In addition, hand-crafting the SBC took several man-months, whereas the proposed algorithm took less than half a day.
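To hint at how several orthogonal criteria can be folded into one search, the sketch below runs a standard PSO loop over a continuous parameter vector with a multi-criteria distance-to-target fitness. The toy "estimator" functions and the vector encoding are stand-ins; the paper evolves actual circuit descriptions, which the abstract does not specify.

```python
# Multi-criteria fitness inside a plain PSO loop: minimise the distance of
# three normalised metrics (stand-ins for utilisation, Fmax, switching
# activity) to their targets.  Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(4)
DIM, POP, ITERS = 6, 20, 200
TARGETS = np.array([0.9, 0.8, 0.7])        # desired normalised metric values

def criteria(p):
    """Toy estimators mapping a parameter vector to three normalised metrics."""
    return np.array([np.tanh(np.abs(p[:2]).sum()),
                     1.0 / (1.0 + np.abs(p[2:4]).sum()),
                     np.sin(np.abs(p[4:]).sum()) ** 2])

def fitness(p):                            # squared distance to the target profile
    return np.sum((criteria(p) - TARGETS) ** 2)

x = rng.uniform(-1, 1, (POP, DIM))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((2, POP, DIM))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best multi-criteria error:", round(pbest_f.min(), 4))
```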
9.
With the development of personalized recommendation technology, recommender systems face more and more challenges. Traditional recommendation algorithms usually suffer from data sparsity and low recommendation accuracy. To address these problems, a recommendation algorithm called K-TLFM (Time Based Latent Factor Model Integrated with k-means), which combines time-aware latent factor filling with subgroup partitioning, is proposed. The algorithm uses a latent factor model incorporating a time factor to fill the missing entries of the original user-item rating matrix, which avoids the error introduced by completing the matrix with the global average or user/item averages, effectively alleviates the data sparsity problem, and captures the change of user preferences over time. After the missing ratings are filled in, a bisecting k-means clustering algorithm partitions objects with similar preference and interest features into the same subgroup, and a selected collaborative filtering algorithm generates the recommendation list within the target user's subgroup, improving recommendation efficiency and accuracy. Comparative experiments on the MovieLens and Netflix datasets show that the proposed algorithm achieves higher recommendation accuracy.
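A compact sketch of the two stages described above follows: (1) fill the sparse rating matrix with a latent factor model that includes a simple time-drift term, then (2) cluster users on the filled ratings and recommend within the target user's cluster. The SGD update, the linear time bias, and plain KMeans (standing in for bisecting k-means) are simplifying assumptions, and the data are synthetic.

```python
# Two-stage sketch: time-aware latent factor filling, then clustered
# collaborative filtering.  All hyper-parameters are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
N_USERS, N_ITEMS, F = 50, 40, 8

# Synthetic sparse ratings: (user, item, rating, timestamp in days).
obs = [(rng.integers(N_USERS), rng.integers(N_ITEMS),
        float(rng.integers(1, 6)), float(rng.integers(0, 365)))
       for _ in range(800)]

P = 0.1 * rng.standard_normal((N_USERS, F))     # user factors
Q = 0.1 * rng.standard_normal((N_ITEMS, F))     # item factors
bu_t = np.zeros(N_USERS)                        # per-user time-drift slope
mu = np.mean([r for _, _, r, _ in obs])

for _ in range(30):                             # SGD over the observed ratings
    for u, i, r, t in obs:
        pred = mu + bu_t[u] * (t / 365.0) + P[u] @ Q[i]
        e = r - pred
        pu = P[u].copy()
        P[u] += 0.01 * (e * Q[i] - 0.05 * P[u])
        Q[i] += 0.01 * (e * pu - 0.05 * Q[i])
        bu_t[u] += 0.01 * (e * (t / 365.0) - 0.05 * bu_t[u])

# Fill the complete matrix at a reference time (end of the observed period).
filled = mu + bu_t[:, None] * 1.0 + P @ Q.T

# Subgroup partition, then a simple in-group score for the target user.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(filled)
target = 0
group = np.flatnonzero(labels == labels[target])
group_mean = filled[group].mean(axis=0)
print("top-5 items for user 0:", np.argsort(group_mean)[::-1][:5])
```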
10.