991.
Attribute selection with fuzzy decision reducts (total citations: 2; self-citations: 0; citations by others: 2)
Rough set theory provides a methodology for data analysis based on the approximation of concepts in information systems. It revolves around the notion of discernibility: the ability to distinguish between objects based on their attribute values. It allows one to infer data dependencies that are useful in the fields of feature selection and decision model construction. In many cases, however, it is more natural, and more effective, to consider a gradual notion of discernibility. Therefore, within the context of fuzzy rough set theory, we present a generalization of the classical rough set framework for data-based attribute selection and reduction using fuzzy tolerance relations. The paper unifies existing work in this direction and introduces the concept of fuzzy decision reducts, dependent on an increasing attribute subset measure. Experimental results demonstrate the potential of fuzzy decision reducts to discover shorter attribute subsets, leading to decision models with better coverage and comparable, or even higher, accuracy.
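The fuzzy-rough dependency degree underlying such reducts can be sketched in a few lines. The operator choices below (a 1 − |difference| tolerance on attributes normalized to [0, 1], the minimum t-norm, and the Łukasiewicz implicator) are illustrative assumptions, not necessarily the paper's exact operators.

```python
def dependency(data, labels, attrs):
    """Fuzzy-rough dependency of the decision on an attribute subset.

    data:   list of dicts, attribute -> value normalized to [0, 1]
    labels: crisp decision class of each object
    attrs:  attribute subset under evaluation (non-empty)
    """
    n = len(data)

    def rel(x, y):
        # fuzzy tolerance: min t-norm over per-attribute similarities
        return min(1.0 - abs(data[x][a] - data[y][a]) for a in attrs)

    pos = []
    for x in range(n):
        # membership of x in the lower approximation of its own decision
        # class, using the Lukasiewicz implicator I(a, b) = min(1, 1 - a + b)
        pos.append(min(
            min(1.0, 1.0 - rel(x, y) + (1.0 if labels[y] == labels[x] else 0.0))
            for y in range(n)))
    return sum(pos) / n  # degree to which attrs determine the decision
```

An attribute subset whose dependency reaches (a required fraction of) the full attribute set's dependency would then qualify as a fuzzy decision reduct.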
992.
We consider the system of intuitionistic fuzzy sets (IF-sets) in a universe X and study the cuts of an IF-set. Suppose a left-continuous triangular norm is given. The t-norm-based cut (level set) of an IF-set is defined in a way that binds the membership and nonmembership functions via the triangular norm. This is an extension of the usual cuts of IF-sets. We show that the system of these cuts satisfies properties analogous to those of the usual systems of cuts. However, it is not possible to reconstruct an IF-set from the system of t-norm-based cuts.
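The construction can be illustrated with a small sketch. The particular way the cut binds the two functions below, applying the t-norm to the membership degree and the complement of the nonmembership degree, is an assumed illustrative form, not necessarily the paper's definition.

```python
def lukasiewicz(a, b):
    # a left-continuous triangular norm
    return max(0.0, a + b - 1.0)

def tnorm_cut(mu, nu, alpha, T=lukasiewicz):
    """alpha-cut of an IF-set (mu, nu) that binds membership and
    nonmembership through the t-norm T (illustrative definition)."""
    return {x for x in mu if T(mu[x], 1.0 - nu[x]) >= alpha}
```

The family of these cuts is nested, as for usual cuts: if alpha <= beta, the beta-cut is contained in the alpha-cut; yet different IF-sets may induce the same cut system, which is why reconstruction fails.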
993.
This paper presents the use of place/transition Petri nets (PNs) for the recognition and evaluation of complex multi-agent activities. The PNs were built automatically from the activity templates that are routinely used by experts to encode domain-specific knowledge. The PNs were built in such a way that they encoded the complex temporal relations between the individual activity actions. We extended the original PN formalism to handle the propagation of evidence using net tokens. The evaluation of the spatial and temporal properties of the actions was carried out using trajectory-based action detectors and probabilistic models of the action durations. The presented approach was evaluated using several examples of real basketball activities. The obtained experimental results suggest that this approach can be used to determine the type of activity that a team has performed as well as the stage at which the activity ended.
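A minimal place/transition net with token firing conveys the mechanics. The basketball-flavoured place and transition names are invented for illustration, and the evidence-propagating net tokens of the extended formalism are not modelled here.

```python
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> number of tokens
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (tuple(inputs), tuple(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1       # consume one token per input place
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
```

In a two-step activity such as "screen, then shoot", the shoot transition only becomes enabled after the screen transition has fired, which is how the net encodes the temporal ordering of actions.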
994.
In this paper we propose a new circularity measure which defines the degree to which a shape differs from a perfect circle. The new measure is easy to compute and, being area based, is robust—e.g., with respect to noise or narrow intrusions. Also, it satisfies the following desirable properties:
it ranges over (0,1] and gives the measured circularity equal to 1 if and only if the measured shape is a circle;
it is invariant with respect to translations, rotations and scaling.
Compared with the most standard circularity measure, which considers the relation between the shape area and the shape perimeter, the new measure performs better in the case of shapes with boundary defects (which lead to a large increase in perimeter) and in the case of compound shapes. In contrast to the standard circularity measure, the new measure depends on the mutual position of the components inside a compound shape. Also, the new measure performs consistently in the case of shapes with very small (i.e., close to zero) measured circularity. It turns out that such a property enables the new measure to measure the linearity of shapes. In addition, we propose a generalisation of the new measure so that shape circularity can be computed while controlling the impact of the relative position of points inside the shape. An additional advantage of the generalised measure is that it can be used for detecting small irregularities in nearly circular shapes damaged by noise or during an extraction process in a particular image processing task.
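An area/moment-based circularity with exactly the listed properties can be computed from the zeroth and second central moments. Whether this formula coincides with the paper's definition is an assumption, and the pixel-grid discretization below is only approximate.

```python
import math

def circularity(points):
    """Moment-based circularity C = m00^2 / (2*pi*(mu20 + mu02)),
    for a shape given as a collection of unit-area pixels (x, y)."""
    n = len(points)                                  # m00: area in pixels
    cx = sum(x for x, _ in points) / n               # centroid
    cy = sum(y for _, y in points) / n
    # mu20 + mu02: sum of squared distances to the centroid
    mu = sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points)
    return n * n / (2.0 * math.pi * mu)
```

A disc scores approximately 1, a square 3/pi (about 0.955), and a 4:1 rectangle roughly 0.45; elongating a shape drives the value toward zero, which is what lets the measure double as a linearity indicator.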
995.
In order to address the rapidly increasing load of air traffic operations, innovative algorithms and software systems must be developed for next-generation air traffic control. Extensive verification of such novel algorithms is key to their adoption by industry. Separation assurance algorithms aim at predicting whether two aircraft will get closer to each other than a minimum safe distance; if loss of separation is predicted, they also propose a change of course for the aircraft to resolve this potential conflict. In this paper, we report on our work towards developing an advanced testing framework for separation assurance. Our framework supports automated test-case generation and testing, and defines test oracles that capture algorithm requirements. We discuss three different approaches to test-case generation, their application to a separation assurance prototype, and their respective strengths and weaknesses. We also present an approach for statistical analysis of the large numbers of test results obtained from our framework.
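The geometric check behind loss-of-separation prediction can be sketched as a closest-point-of-approach computation for straight-line trajectories. This is a generic illustration, not the prototype algorithm evaluated in the paper.

```python
import math

def min_separation(p1, v1, p2, v2, horizon):
    """Minimum distance between two aircraft flying straight at constant
    velocity within [0, horizon], and the time at which it occurs."""
    dpx, dpy = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    # time of closest approach, clamped to the prediction horizon
    t = 0.0 if dv2 == 0.0 else max(0.0, min(horizon,
                                            -(dpx * dvx + dpy * dvy) / dv2))
    return math.hypot(dpx + dvx * t, dpy + dvy * t), t

def loses_separation(p1, v1, p2, v2, horizon, min_sep):
    dist, _ = min_separation(p1, v1, p2, v2, horizon)
    return dist < min_sep
```

A test oracle in such a framework can then assert that whenever a loss of separation is predicted, the resolution proposed by the algorithm under test actually removes the predicted conflict.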
996.
This paper proposes a novel signal transformation and interpolation approach based on a modification of the DCT (Discrete Cosine Transform). The proposed algorithm can be applied to any periodic or quasi-periodic waveform for time-scale and/or pitch modification, as well as for signal reconstruction, compression, coding and packet loss concealment. The proposed algorithm has two advantages:
(i) Since the DCT does not carry explicit phase information, the cubic spline interpolation of the phase component required by the sinusoidal model is unnecessary.
(ii) The number of parameters to be interpolated can be reduced because of the energy-packing efficiency of the DCT. This is particularly important if signal synthesis is carried out at a location remote from the transmitted parameters.
Results are presented on periodic waveforms and on a speech signal to demonstrate the fidelity of the proposed algorithm. In addition, the proposed method is compared with the TD-PSOLA, sinusoidal model and phase vocoder algorithms. Results are reported as objective PESQ scores for time-scale modification, and output files are provided as supplementary material for subjective evaluation of packet loss concealment. The results show that the proposed modification of the DCT synthesis provides a favorable algorithm for specialists working in the signal processing area.
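The transform pair at the heart of the method is the DCT-II and its inverse. The naive O(N^2) implementation below is a sketch for clarity (real systems use a fast transform), and the interpolation and modification stages of the paper are not reproduced.

```python
import math

def dct(x):
    """Orthonormal DCT-II of a real frame x."""
    N = len(x)
    coeffs = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
        coeffs.append((math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)) * s)
    return coeffs

def idct(X):
    """Inverse transform (orthonormal DCT-III) reconstructing the frame."""
    N = len(X)
    return [sum((math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)) * X[k]
                * math.cos(math.pi * (n + 0.5) * k / N) for k in range(N))
            for n in range(N)]
```

Because the transform is real, no phase track needs to be interpolated, and for quasi-periodic frames most of the energy sits in a few coefficients, which is the energy-packing property the interpolation stage exploits.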
997.
This paper proposes a novel computer vision approach that processes video sequences of people walking and then recognises those people by their gait. Human motion carries different kinds of information that can be analysed in various ways: the skeleton carries motion information about human joints, while the silhouette carries information about the boundary motion of the human body. Moreover, binary and grey-level images contain different information about human movements. This work proposes to recover these different kinds of information to interpret the global motion of the human body based on four different segmented image models, using a fusion model to improve classification. Our method considers the set of segmented frames of each individual as a distinct class and each frame as an object of this class. The methodology applies background extraction using the Gaussian Mixture Model (GMM), scale reduction based on the Wavelet Transform (WT) and feature extraction by Principal Component Analysis (PCA). We propose four new schemas for motion information capture: the Silhouette-Gray-Wavelet model (SGW) captures motion based on grey-level variations; the Silhouette-Binary-Wavelet model (SBW) captures motion based on binary information; the Silhouette-Edge-Binary model (SEW) captures motion based on edge information; and the Silhouette-Skeleton-Wavelet model (SSW) captures motion based on skeleton movement. The classification rates obtained separately from these four different models are then merged using a new proposed fusion technique. The results suggest excellent performance in terms of recognising people by their gait.
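The paper's specific fusion technique is not detailed in the abstract, so a generic weighted-sum score fusion serves as an illustration; the weights would in practice reflect each model's validation accuracy.

```python
def fuse(model_scores, weights):
    """Combine per-class scores from several models (e.g. SGW, SBW, SEW,
    SSW) by a weighted sum and return the winning class."""
    classes = set().union(*model_scores)
    combined = {c: sum(w * scores.get(c, 0.0)
                       for w, scores in zip(weights, model_scores))
                for c in classes}
    return max(combined, key=combined.get)
```

When the models disagree, the decision goes to the class favoured by the more reliable (more heavily weighted) models, which is how fusion can outperform any single segmented-image model.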
998.
One of the main goals of an applied research field such as software engineering is the transfer and widespread use of research results in industry. To impact industry, researchers developing technologies in academia need to provide tangible evidence of the advantages of using them. This can be done through step-wise validation, enabling researchers to gradually test and evaluate technologies before finally trying them in real settings with real users and applications. The evidence obtained, together with detailed information on how the validation was conducted, offers rich decision-support material for industry practitioners seeking to adopt new technologies and for researchers looking for an empirical basis on which to build new or refined technologies. This paper presents a model for evaluating the rigor and industrial relevance of technology evaluations in software engineering. The model is applied and validated in a comprehensive systematic literature review of evaluations of requirements engineering technologies published in software engineering journals. The aim is to show the applicability of the model and to characterize how evaluations are carried out and reported, in order to evaluate the state of research. The review shows that the model can be applied to characterize evaluations in requirements engineering. The findings from applying the model also show that the majority of technology evaluations in requirements engineering lack both industrial relevance and rigor, and that the research field does not show any improvement in industrial relevance over time.
999.
Although the deterministic flow shop model is one of the most widely studied problems in scheduling theory, its stochastic analog has remained a challenge. No computationally efficient optimization procedure exists even for the general two-machine version. In this paper, we describe three heuristic procedures for the stochastic, two-machine flow shop problem and report on computational experiments that compare their effectiveness. We focus on heuristic procedures that can be adapted for dispatching without the need for computer simulation or computer-based search. We find that all three procedures are capable of quickly generating solutions close to the best known sequences, which were obtained by extensive search.
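One dispatching-compatible heuristic of this flavour is to apply Johnson's rule to the expected processing times; whether this is one of the paper's three procedures is an assumption, but it illustrates a rule that needs no simulation or search.

```python
def johnson_on_means(jobs):
    """Sequence a two-machine flow shop by Johnson's rule applied to
    expected processing times. jobs: list of (name, E[t1], E[t2])."""
    front = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    back = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])
    return [name for name, _, _ in front + back]

def makespan(order, times):
    """Two-machine flow-shop makespan for a given job order."""
    t1 = t2 = 0.0
    for name in order:
        a, b = times[name]
        t1 += a               # machine 1 processes jobs back to back
        t2 = max(t2, t1) + b  # machine 2 waits for machine 1's output
    return t2
```

In the deterministic case Johnson's rule is optimal; under stochastic processing times, ordering by the means is only a heuristic, which is consistent with the paper's finding that simple rules come close to the best known sequences.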
1000.
A method of reducing the system matrices of a planar flexible beam described by an absolute nodal coordinate formulation (ANCF) is presented. The method exploits the fact that the bending stiffness matrix obtained by adopting a continuum mechanics approach to the ANCF beam element is constant when the axial strain is not very large. This feature makes it possible to apply the Craig–Bampton method to the equation of motion expressed in the independent coordinates once the constraint forces are eliminated. Four numerical examples comparing the proposed method with the conventional ANCF are presented to verify the performance and accuracy of the proposed method. These examples verify that the proposed method can describe large-deformation effects, such as dynamic stiffening due to the centrifugal force, as well as the conventional ANCF does. The method also reduces the computing time while maintaining an acceptable degree of accuracy relative to the conventional ANCF when the modal truncation number is chosen adequately. The reduction in CPU time is particularly pronounced for a large element number and a small modal truncation number, and it can be verified not only for small deformations but also for fairly large deformations.