Articles by access type:
  Full text (subscription): 174963
  Free: 16292
  Free (domestic): 9386
Articles by subject area:
  Electrical engineering: 13836
  Theory of technology: 8
  General (comprehensive): 17870
  Chemical industry: 17315
  Metalworking: 7160
  Machinery and instrumentation: 15771
  Architecture and building science: 20295
  Mining engineering: 6728
  Energy and power engineering: 7796
  Light industry: 13447
  Hydraulic (water conservancy) engineering: 7490
  Petroleum and natural gas: 8279
  Weapons industry: 2002
  Radio and electronics: 9997
  General industrial technology: 17043
  Metallurgical industry: 7019
  Atomic energy technology: 2515
  Automation technology: 26070
Articles by publication year:
  2024: 721
  2023: 2141
  2022: 4361
  2021: 5109
  2020: 5377
  2019: 4489
  2018: 4426
  2017: 5414
  2016: 6550
  2015: 6887
  2014: 11232
  2013: 11272
  2012: 12997
  2011: 14333
  2010: 10276
  2009: 10476
  2008: 9826
  2007: 11790
  2006: 10229
  2005: 8597
  2004: 7269
  2003: 6199
  2002: 5008
  2001: 4115
  2000: 3520
  1999: 2935
  1998: 2524
  1997: 2124
  1996: 1745
  1995: 1438
  1994: 1293
  1993: 965
  1992: 891
  1991: 654
  1990: 550
  1989: 468
  1988: 395
  1987: 268
  1986: 233
  1985: 214
  1984: 252
  1983: 239
  1982: 196
  1981: 95
  1980: 94
  1979: 65
  1978: 53
  1977: 46
  1976: 43
  1959: 30
Sort order:   10000 query results in total, search time 0 ms
951.
Analysis and Design of a Document Processing and File Management System for Universities   (Total citations: 1; self-citations: 0; citations by others: 1)
Based on an analysis of the business and data flows of official document processing in universities, and of the data relationships between incoming and outgoing documents, this paper proposes the functional requirements of a university document processing and file management system, providing a basis for developing economical, practical, integrated document management software suited to university needs.
952.
This paper describes fault diagnosis for the high-voltage power supply (HVPS) of the EAST lower hybrid wave (LHW) system; reliable operation of the HVPS is essential to the LHW system as a whole. By specializing a generic fault-analysis expert system, we build an HVPS fault-analysis expert system and encode it as a machine-readable mathematical model stored in the computer. In the LHW HVPS diagnostic system, the parameters acquired from the power supply are fed to this stored expert system to infer which part of the power supply has failed and what the consequences of the fault are, so that a remedy for the specific fault can be found.
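The diagnostic logic described above is essentially rule-based inference over measured power-supply parameters. The following minimal Python sketch illustrates only that pattern; the parameter names, thresholds, and fault rules are hypothetical and are not taken from the actual EAST HVPS expert system.

```python
# Hypothetical rule base: each rule maps a condition on measured parameters
# to an inferred fault and its consequence.  Values are illustrative only.
RULES = [
    (lambda p: p["output_voltage_kv"] < 0.8 * p["setpoint_kv"],
     "rectifier/regulator fault", "insufficient cathode voltage"),
    (lambda p: p["ripple_percent"] > 5.0,
     "filter capacitor degradation", "excessive output ripple"),
    (lambda p: p["oil_temp_c"] > 85.0,
     "transformer cooling fault", "risk of insulation damage"),
]

def diagnose(params):
    """Return (fault, consequence) pairs for every rule whose condition fires."""
    return [(fault, effect) for cond, fault, effect in RULES if cond(params)]

sample = {"output_voltage_kv": 28.0, "setpoint_kv": 40.0,
          "ripple_percent": 2.1, "oil_temp_c": 62.0}
print(diagnose(sample))   # -> [('rectifier/regulator fault', 'insufficient cathode voltage')]
```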
953.
As part of the effort to understand the intricacies of the k-colorability problem, different distributions over k-colorable graphs have been analyzed. While the problem is notoriously hard (not even reasonably approximable) in the worst case, the average case (with respect to such distributions) often turns out to be “easy”. Semi-random models mediate between these two extremes and are better suited to imitating “real-life” instances than purely random models. In this work we consider semi-random variants of the planted k-colorability distribution, continuing a line of research pursued by Coja-Oghlan and by Krivelevich and Vilenchik. Our aim is to study a more general semi-random framework than those suggested so far. On the one hand, we show that previous algorithmic techniques extend to our more general semi-random setting; on the other hand, we give a hardness result, proving that a closely related semi-random model is intractable. Thus we provide some indication of which properties of the input distribution make the k-colorability problem hard.
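The planted model mentioned above can be illustrated with a short sketch: a random graph whose edges respect a hidden (planted) colouring, followed by a semi-random step in which an adversary adds extra edges between colour classes. The edge probability and adversary behaviour below are illustrative assumptions, not the exact distribution studied in the paper.

```python
import random
from itertools import combinations

def planted_k_colorable(n, k, p, seed=0):
    """Random graph with a planted proper k-colouring: edges only join distinct colour classes."""
    rng = random.Random(seed)
    colour = {v: rng.randrange(k) for v in range(n)}
    edges = {(u, v) for u, v in combinations(range(n), 2)
             if colour[u] != colour[v] and rng.random() < p}
    return colour, edges

def adversary_add(colour, edges, extra):
    """Semi-random step: an adversary adds edges, but only between colour classes,
    so the planted colouring remains proper."""
    n = len(colour)
    candidates = [(u, v) for u, v in combinations(range(n), 2)
                  if colour[u] != colour[v] and (u, v) not in edges]
    return edges | set(random.sample(candidates, min(extra, len(candidates))))

colour, edges = planted_k_colorable(n=200, k=3, p=0.05)
edges = adversary_add(colour, edges, extra=100)
```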
954.
Share price trends can be recognized using data clustering methods; however, the accuracy of these methods may be rather low. This paper presents a novel supervised classification scheme for the recognition and prediction of share price trends. We first produce a smooth time series from the original share price data using zero-phase filtering and singular spectrum analysis. We then train pattern classifiers on the classification results of both the original and the filtered time series and use these classifiers to predict future share price trends. Experimental results obtained from both synthetic data and real share prices show that the proposed method is effective and outperforms the well-known K-means clustering algorithm.
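A minimal sketch of the smoothing stage described above, assuming a Butterworth low-pass filter for the zero-phase step (applied forward and backward with scipy's filtfilt) and a basic singular spectrum analysis reconstruction. The cutoff, window length, and number of components are illustrative choices, and the classifier-training stage is not shown.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def zero_phase_smooth(prices, cutoff=0.05, order=4):
    """Low-pass filter applied forward and backward, so no phase lag is introduced."""
    b, a = butter(order, cutoff)          # normalized cutoff (Nyquist = 1.0)
    return filtfilt(b, a, prices)

def ssa_reconstruct(series, window=20, n_components=3):
    """Reconstruct the series from its leading singular spectrum analysis components."""
    n = len(series)
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of the series.
    X = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_low = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components, :]
    # Diagonal averaging turns the low-rank matrix back into a time series.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += X_low[:, j]
        counts[j:j + window] += 1
    return recon / counts

prices = np.cumsum(np.random.randn(500)) + 100     # synthetic share prices
smooth = ssa_reconstruct(zero_phase_smooth(prices))
```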
955.
The aim of this paper is to develop an optimal technique for dealing with the fuzziness aspect of demand uncertainty. Triangular fuzzy numbers are used to model external demand, and decision models are constructed for both the non-coordinated and the coordinated situations. It is shown that each decision model has a unique solution that can be expressed analytically. Based on the closed-form solutions for both models, the behaviors and relationships of the manufacturer and the retailer are quantitatively analyzed, and a cooperative policy for optimizing the whole supply chain is put forward.
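The triangular-fuzzy-demand ingredient can be sketched as follows. The demand figures are made up, and the defuzzified value shown is the standard credibility-based expected value (a + 2b + c)/4 of a triangular fuzzy number; the paper's own decision models are not reproduced.

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    a: float   # smallest possible value
    b: float   # most plausible value
    c: float   # largest possible value

    def membership(self, x: float) -> float:
        """Degree to which x belongs to the fuzzy number (triangular membership)."""
        if self.a <= x <= self.b:
            return (x - self.a) / (self.b - self.a)
        if self.b < x <= self.c:
            return (self.c - x) / (self.c - self.b)
        return 0.0

    def expected_value(self) -> float:
        """Credibility-based expected value of a triangular fuzzy number."""
        return (self.a + 2 * self.b + self.c) / 4

demand = TriangularFuzzyNumber(a=800, b=1000, c=1500)   # hypothetical external demand
print(demand.expected_value())                            # 1075.0
```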
956.
Empirical studies of human systems often involve recording multidimensional signals, because the system components may require physical measurements (e.g., temperature, pressure, body movements and/or movements in the environment) and physiological measurements (e.g., electromyography or electrocardiography). Analysis of such data becomes complex if both the multifactor and the multivariate aspects are retained. Three examples are used to illustrate the role of fuzzy space windowing and the large number of possible data analysis paths. The first example is a classic simulated data set from the literature, which we use to compare several data analysis paths generated with principal component analysis and multiple correspondence analysis under crisp and fuzzy windowing. The second example involves eye-tracking data from advertising, focusing on the case of one categorical variable but allowing several space windowing models and time entities. The third example concerns car and head movement data from a driving vigilance study, focusing on the case of several quantitative variables. The notions of analysis path multiplicity and information are discussed both from a general perspective and in terms of our two real examples.
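Fuzzy space windowing can be illustrated by fuzzily recoding one quantitative variable into a few overlapping windows, so that each observation receives graded memberships instead of a single crisp category. The window centres below are illustrative assumptions, not the windowing models used in the paper's examples.

```python
import numpy as np

def fuzzy_windows(x, centres=(0.0, 0.5, 1.0)):
    """Fuzzy coding of a variable into three triangular windows; each row sums to 1."""
    c_lo, c_mid, c_hi = centres
    x = np.asarray(x, dtype=float)
    m = np.zeros((len(x), 3))
    m[:, 0] = np.clip((c_mid - x) / (c_mid - c_lo), 0, 1)   # "low" window
    m[:, 2] = np.clip((x - c_mid) / (c_hi - c_mid), 0, 1)   # "high" window
    m[:, 1] = 1.0 - m[:, 0] - m[:, 2]                       # "medium" window
    return m

signal = np.random.rand(100)          # e.g. one normalized body-movement channel
memberships = fuzzy_windows(signal)   # crisp windowing would give 0/1 rows instead
```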
957.
This paper studies a special game with incomplete information, in which the payoffs of the players are both random and fuzzy. Such a game is considered in the context of a Bayesian game whose uncertain types are characterized as fuzzy variables. A static fuzzy Bayesian game is then introduced, and decision rules for the players are given based on credibility theory. We further prove the existence of an equilibrium of the game. Finally, a Cournot competition model with fuzzy efficiency under asymmetric information is investigated as an application, and some results are presented.
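The credibility-theoretic ingredient mentioned above can be illustrated with the standard credibility measure of a triangular fuzzy variable, defined as the average of the possibility and necessity measures. This sketch shows only the measure itself, not the paper's game-theoretic analysis.

```python
def credibility_leq(a, b, c, x):
    """Cr{xi <= x} for a triangular fuzzy variable xi = (a, b, c)."""
    if x <= a:
        pos = 0.0
    elif x < b:
        pos = (x - a) / (b - a)
    else:
        pos = 1.0                      # possibility of {xi <= x}
    if x <= b:
        nec = 0.0
    elif x < c:
        nec = (x - b) / (c - b)
    else:
        nec = 1.0                      # necessity of {xi <= x}
    return 0.5 * (pos + nec)           # credibility = (Pos + Nec) / 2

print(credibility_leq(0, 1, 2, 1.0))   # 0.5 at the most plausible value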
958.
Many problems in paleontology reduce to finding the features that best discriminate among a set of classes; a clear example is the classification of new specimens. However, such classifications are generally challenging because the number of discriminant features and the number of samples are limited. This has been the fate of LB1, a new specimen found in the Liang Bua Cave of Flores. Several authors have attributed LB1 to a new species of Homo, H. floresiensis. According to this hypothesis, LB1 is either a member of the early Homo group or a descendant of an ancestor of the Asian H. erectus. Detractors have put forward an alternative hypothesis, which stipulates that LB1 is in fact a microcephalic modern human. In this paper, we show how a new Bayes optimal discriminant feature extraction technique can help resolve this type of issue. We present three types of experiments. First, we use this Bayes optimal discriminant technique to develop a model of morphological (shape) evolution from Australopiths to H. sapiens; LB1 fits perfectly into this model as a member of the early Homo group. Second, we build a classifier based on the available cranial and mandibular data, appropriately normalized for size and volume; again, LB1 is most similar to early Homo. Third, we build a brain endocast classifier to show that LB1 is not within the normal range of variation of H. sapiens. Combined, these results support the hypothesis of a very early shared ancestor for LB1 and H. erectus, and they illustrate how discriminant analysis approaches can be used to help classify newly discovered specimens.
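As a generic illustration of the discriminant-classification workflow (not the paper's Bayes optimal technique, which the abstract does not specify in detail), the following sketch fits a standard linear discriminant analysis model to synthetic three-group data and classifies a new specimen; the groups and measurements are invented.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Three hypothetical groups (e.g. Australopiths, early Homo, H. sapiens),
# with 12 size-normalized shape measurements per specimen.
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(30, 12)) for m in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 30)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
new_specimen = rng.normal(loc=1.4, scale=1.0, size=(1, 12))
print(lda.predict(new_specimen))          # group the new specimen is most similar to
print(lda.predict_proba(new_specimen))    # posterior class probabilities
```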
959.
In this paper, a novel one-dimensional correlation filter based class-dependence feature analysis (1D-CFA) method is presented for robust face recognition. Compared with the original CFA, which works in the two-dimensional (2D) image space, 1D-CFA encodes the image data as vectors. In 1D-CFA, a new correlation filter called the optimal extra-class origin output tradeoff filter (OEOTF), designed in a low-dimensional principal component analysis (PCA) subspace, is proposed for effective feature extraction. Experimental results on benchmark face databases such as FERET, AR, and FRGC show that OEOTF-based 1D-CFA consistently outperforms other state-of-the-art face recognition methods, demonstrating the effectiveness and robustness of the proposed method.
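A much simplified stand-in for the pipeline described above: vectorized samples are projected into a PCA subspace and each class is scored with a correlation template, here simply the class mean of the PCA features rather than the OEOTF filter. The data is synthetic and the sketch is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_classes, per_class, dim, n_pc = 5, 20, 400, 30
X = np.vstack([rng.normal(loc=i, size=(per_class, dim)) for i in range(n_classes)])
y = np.repeat(np.arange(n_classes), per_class)

# PCA subspace estimated from the training data.
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
P = Vt[:n_pc].T                                   # dim x n_pc projection matrix
feats = (X - mean) @ P

# One correlation template per class: the class mean in the PCA subspace.
templates = np.stack([feats[y == c].mean(axis=0) for c in range(n_classes)])

def classify(face_vector):
    f = (face_vector - mean) @ P
    scores = templates @ f / (np.linalg.norm(templates, axis=1) * np.linalg.norm(f))
    return int(np.argmax(scores))                 # class with highest normalized correlation

print(classify(X[37]))                            # sample from class 1, expected output 1
```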
960.
Problems from plastic analysis are based on the convex, linear or linearised yield/strength condition and the linear equilibrium equation for the stress (state) vector. In practice one has to take into account stochastic variations of several model parameters. Hence, in order to obtain robust maximum load factors, the structural analysis problem with random parameters must be replaced by an appropriate deterministic substitute problem. A direct approach is proposed based on the primary costs for missing carrying capacity and the recourse costs (e.g. costs for repair, compensation for weakness within the structure, damage, failure, etc.). Based on the mechanical survival conditions of plasticity theory, a quadratic error/loss criterion is developed. The minimum recourse costs can then be determined by solving an optimisation problem with a quadratic objective function and linear constraints. For each vector a(·) of model parameters and each design vector x, one then obtains an explicit representation of the “best” internal load distribution F. Moreover, the expected recourse costs can also be determined explicitly. Consequently, an explicit stochastic nonlinear program results for finding a robust maximal load factor μ. The analytical properties and possible solution procedures are discussed.
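The optimisation structure described above (quadratic recourse cost, linear equilibrium constraint, linearised yield bounds) can be sketched numerically as a small quadratic program. The equilibrium matrix, load pattern, resistances, and load factor below are made-up illustrative values, not a real structure.

```python
import numpy as np
from scipy.optimize import minimize

C = np.array([[1.0, 1.0, 0.0],        # equilibrium matrix (2 equations, 3 members)
              [0.0, 1.0, 1.0]])
P = np.array([1.0, 1.0])              # reference load pattern
R = np.array([1.2, 0.8, 1.2])         # yield resistances of the members
mu = 2.5                              # candidate load factor

# Decision vector z = (F, y): internal forces F and yield-violation slacks y.
def cost(z):
    y = z[3:]
    return float(y @ y)               # quadratic recourse (error/loss) cost

constraints = [
    {"type": "eq",   "fun": lambda z: C @ z[:3] - mu * P},   # equilibrium: C F = mu * P
    {"type": "ineq", "fun": lambda z: z[3:] - (z[:3] - R)},  # y >= F - R
    {"type": "ineq", "fun": lambda z: z[3:] - (-z[:3] - R)}, # y >= -F - R
    {"type": "ineq", "fun": lambda z: z[3:]},                # y >= 0
]
res = minimize(cost, x0=np.zeros(6), method="SLSQP", constraints=constraints)
F_best, min_cost = res.x[:3], cost(res.x)
print(F_best, min_cost)               # "best" internal load distribution and its recourse cost
```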