Results by access type: fee-based full text, 4,361; free, 333; free (domestic), 7.
Results by subject area: electrical engineering, 31; general, 2; chemical industry, 1,189; metalworking, 44; machinery and instruments, 79; building science, 167; mining engineering, 8; energy and power, 165; light industry, 1,003; hydraulic engineering, 45; petroleum and natural gas, 11; radio and electronics, 217; general industrial technology, 648; metallurgical industry, 99; atomic energy technology, 18; automation technology, 975.
Results by year: 2024, 5; 2023, 51; 2022, 61; 2021, 255; 2020, 151; 2019, 192; 2018, 196; 2017, 178; 2016, 199; 2015, 154; 2014, 231; 2013, 379; 2012, 327; 2011, 379; 2010, 263; 2009, 240; 2008, 221; 2007, 218; 2006, 140; 2005, 130; 2004, 124; 2003, 97; 2002, 82; 2001, 46; 2000, 45; 1999, 42; 1998, 54; 1997, 41; 1996, 25; 1995, 24; 1994, 18; 1993, 15; 1992, 17; 1991, 20; 1990, 11; 1989, 13; 1988, 5; 1987, 3; 1985, 7; 1984, 5; 1983, 6; 1982, 5; 1981, 2; 1980, 2; 1979, 2; 1978, 4; 1977, 7; 1976, 2; 1975, 2; 1973, 2.
A total of 4,701 results were found (search time: 140 ms).
111.
Ontological reasoning for improving the treatment of emotions in text   Total citations: 2 (self-citations: 2, citations by others: 0)
With the advent of affective computing, the task of adequately identifying, representing and processing the emotional connotations of text has acquired importance. This paper addresses two problems facing this task: the composition of sentence emotion from word emotion, and a representation of emotion that allows easy conversion between existing computational representations. The emotion of a sentence should be derived by composing the emotions of its words, but no method has been proposed so far to model this compositionality. Likewise, of the various existing approaches for representing emotions, some are better suited to certain problems than others, yet there is no easy way of converting from one to another. This paper presents a system that addresses these two problems by reasoning with two ontologies implemented with Semantic Web technologies: one designed to represent word dependency relations within a sentence, and one designed to represent emotions. The ontology of word dependency relies on roles to represent the way emotional contributions project over word dependencies. By applying automated classification of mark-up results in terms of the emotion ontology, the system can interpret unrestricted input in terms of a restricted set of concepts for which particular rules are provided. The rules applied at the end of the process provide configuration parameters for an emotional voice synthesis system.
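The paper's actual mechanism relies on OWL ontologies and Semantic Web reasoners, which are not reproduced here. The following is only a minimal illustrative sketch of the underlying idea: compose a phrase-level emotion from word-level emotions, with each contribution projected over its dependency relation. The `Emotion` representation, the toy lexicon and the relation weights are all hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Emotion:
    valence: float   # pleasantness in [-1, 1]
    arousal: float   # activation in [0, 1]

# Toy word-emotion lexicon and per-relation projection weights (illustrative only).
LEXICON = {"happy": Emotion(0.8, 0.6), "very": Emotion(0.0, 0.3), "not": Emotion(0.0, 0.1)}
RELATION_WEIGHT = {"amod": 0.9, "advmod": 0.6}

def compose(head, dependents):
    """Derive a phrase-level emotion from the head word's emotion and the
    contributions its dependents project over their dependency relations."""
    base = LEXICON.get(head, Emotion(0.0, 0.0))
    v, a = base.valence, base.arousal
    for word, rel in dependents:
        dep = LEXICON.get(word, Emotion(0.0, 0.0))
        if rel == "neg":                      # negation flips the head's valence
            v = -v
        else:                                 # other relations add a weighted contribution
            w = RELATION_WEIGHT.get(rel, 0.5)
            v += w * dep.valence
            a += w * dep.arousal
    return Emotion(max(-1.0, min(1.0, v)), max(0.0, min(1.0, a)))

# "not very happy": 'happy' is the head, with an intensifier and a negation dependent.
print(compose("happy", [("very", "advmod"), ("not", "neg")]))
```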
112.
Context: Diagnosing processes in a small company requires process assessment practices that give qualitative and quantitative results and offer an overall view of the process capability. The purpose is to obtain relevant information about the running of processes, for use in their control and improvement. However, small organizations have some problems in running process assessment, due to their specific characteristics and limitations. Objective: This paper presents a methodology for assessing software processes which assists the activity of software process diagnosis in small organizations. It attempts to address issues such as the fact that: (i) process assessment is expensive and typically requires major company resources, and (ii) many light assessment methods do not provide information that is detailed enough for diagnosing and improving processes. Method: To achieve all this, the METvalCOMPETISOFT assessment methodology was developed. This methodology: (i) incorporates the strategy of internal assessments known as rapid assessment, meaning that these assessments do not take up too much time or use an excessive quantity of resources, nor are they too rigorous, and (ii) meets all the requirements described in the literature for an assessment proposal that is customized to the typical features of small companies. Results: This paper also describes the experience of applying this methodology in eight small software organizations that took part in the COMPETISOFT project. The results obtained show that this approach yields reliable information about the strengths and weaknesses of software processes, along with information for companies on opportunities for improvement. Conclusion: The assessment methodology proposed sets out the elements needed to assist with diagnosing the process in small organizations step by step, while seeking to make its application economically feasible in terms of resources and time. The initial application suggests that this assessment methodology can be useful, practical and suitable for diagnosing processes in this type of organization.
113.
Ensemble learning has gained considerable attention in different tasks, including regression, classification and clustering. Adaboost and Bagging are two popular approaches used to train these models. The former provides accurate estimations in regression settings but is computationally expensive because of its inherently sequential structure, while the latter is less accurate but highly efficient. One of the drawbacks of ensemble algorithms is the high computational cost of the training stage. To address this issue, we propose a parallel implementation of the Resampling Local Negative Correlation (RLNC) algorithm for training a neural network ensemble, in order to obtain accuracy competitive with Adaboost and efficiency comparable to that of Bagging. We test our approach on both synthetic and real datasets from the UCI and Statlib repositories for the regression task. In particular, our fine-grained parallel approach allows us to achieve a satisfactory balance between accuracy and parallel efficiency.
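The RLNC algorithm and its fine-grained parallelization are defined in the paper itself; the sketch below only illustrates the general idea of resampling plus negative-correlation training for a regression ensemble: each member is trained on its own bootstrap resample with a penalty that discourages its predictions from correlating with the ensemble mean, and the per-member updates within an epoch are independent (hence parallelizable) once the ensemble mean is known. The simple feature map, hyperparameters and gradient form are assumptions for the toy, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data that the fixed feature map [x, x^2, 1] can represent.
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.3 * X[:, 0] ** 2 - X[:, 0] + 0.1 * rng.normal(size=200)
Phi = np.hstack([X, X ** 2, np.ones((200, 1))])

M, lam, lr, epochs = 5, 0.5, 0.01, 500                  # assumed hyperparameters
samples = [rng.integers(0, len(y), len(y)) for _ in range(M)]   # one bootstrap per member
W = rng.normal(scale=0.1, size=(M, Phi.shape[1]))       # linear "members" for brevity

for _ in range(epochs):
    preds = W @ Phi.T                    # (M, n) member predictions
    f_bar = preds.mean(axis=0)           # ensemble prediction
    # Given f_bar, each member's update below is independent -> parallelizable.
    for i in range(M):
        idx = samples[i]
        err = preds[i, idx] - y[idx]                 # squared-error term
        nc = -(preds[i, idx] - f_bar[idx])           # negative-correlation penalty term
        grad = ((2 * err + lam * nc)[:, None] * Phi[idx]).mean(axis=0)
        W[i] -= lr * grad

print("ensemble RMSE:", np.sqrt(np.mean(((W @ Phi.T).mean(axis=0) - y) ** 2)))
```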
114.
Simulation of complex mechatronic systems such as an automobile, involving mechanical components as well as actuators and active electronic control devices, can be accomplished by combining tools that deal with the simulation of the different subsystems. In this sense, it is often desirable to couple multibody simulation software (for the mechanical simulation) with external numerical computing environments and block-diagram simulators (for the modeling and simulation of non-mechanical components).
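The abstract does not name the tools or the coupling interface, so the sketch below is only a generic fixed-step co-simulation loop in the spirit of such tool coupling: a toy "mechanical" subsystem and a toy "controller" block exchange data once per communication step. Both subsystems and their parameters are hypothetical.

```python
def mechanical_step(state, force, dt):
    """Toy 1-DOF mass-spring 'multibody' subsystem: x'' = (-k*x + force)/m, explicit Euler."""
    x, v = state
    k, m = 4.0, 1.0
    a = (-k * x + force) / m
    return (x + dt * v, v + dt * a)

def controller(x, v):
    """Toy 'block-diagram' subsystem: a PD controller driving x to zero."""
    kp, kd = 2.0, 1.0
    return -kp * x - kd * v

# Fixed-step co-simulation loop: the subsystems only exchange data at
# communication points, as in a typical tool-coupling scheme.
state, force, dt = (1.0, 0.0), 0.0, 0.01
for step in range(1000):
    force = controller(*state)                 # controller reads the mechanical state
    state = mechanical_step(state, force, dt)  # mechanical tool advances one step

print("final position:", round(state[0], 4))   # settles near 0 for this damped toy system
```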
115.
In this paper we present adaptive algorithms for solving the uniform continuous piecewise affine approximation (UCPA) problem in the case of Lipschitz or convex functions. The algorithms are based on the tree approximation (adaptive splitting) procedure. Uniform convergence is achieved by means of global optimization techniques for obtaining tight upper bounds on the local error estimate (the splitting criterion). We give numerical results for the distance function to 2D and 3D geometric bodies. The resulting trees allow the values of the target function to be retrieved quickly.
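The paper bounds the local error with global optimization; the sketch below is a much simpler 1D analogue of the adaptive-splitting idea: intervals whose sampled chord error exceeds a tolerance are split recursively, yielding a piecewise linear approximant whose leaves can be queried afterwards. The sample-based error estimate and the tolerance are simplifications assumed for the toy, not the paper's guaranteed bounds.

```python
def adaptive_pwl(f, a, b, tol, samples=17, max_depth=20):
    """Adaptive splitting: approximate f on [a, b] by piecewise linear pieces,
    refining any interval whose sampled chord error exceeds tol."""
    pieces = []
    def split(lo, hi, depth):
        fl, fh = f(lo), f(hi)
        # Crude local error estimate: max deviation of f from the chord at sample points.
        err = max(abs(f(lo + t * (hi - lo)) - (fl + t * (fh - fl)))
                  for t in (i / (samples - 1) for i in range(samples)))
        if err <= tol or depth >= max_depth:
            pieces.append((lo, hi, fl, fh))        # leaf of the splitting tree
        else:
            mid = 0.5 * (lo + hi)
            split(lo, mid, depth + 1)
            split(mid, hi, depth + 1)
    split(a, b, 0)
    return pieces

def evaluate(pieces, x):
    """Retrieve the approximant's value (linear scan here; bisection/tree search in practice)."""
    for lo, hi, fl, fh in pieces:
        if lo <= x <= hi:
            t = (x - lo) / (hi - lo)
            return fl + t * (fh - fl)
    raise ValueError("x outside the approximation domain")

# Example: a distance-like Lipschitz target, |x| - 1 on [-2, 2].
pieces = adaptive_pwl(lambda x: abs(x) - 1.0, -2.0, 2.0, tol=1e-3)
print(len(pieces), "pieces; value at 0.3:", round(evaluate(pieces, 0.3), 4))
```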
116.
A new fast prototype selection method based on clustering   Total citations: 2 (self-citations: 1, citations by others: 1)
In supervised classification, a training set T is given to a classifier for classifying new prototypes. In practice, not all the information in T is useful for classifiers; therefore, it is convenient to discard irrelevant prototypes from T. This process is known as prototype selection, an important task for classifiers since it can reduce classification or training time. In this work, we propose a new fast prototype selection method for large datasets, based on clustering, which selects border prototypes and some interior prototypes. Experimental results showing the performance of our method and comparing accuracy and runtimes against other prototype selection methods are reported.
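The paper's exact selection rules are not reproduced here; the sketch below only illustrates the general idea stated in the abstract, under assumed rules: cluster the training set, keep one interior prototype (the point nearest the centroid) for class-homogeneous clusters, and keep border prototypes (points whose nearest in-cluster neighbour belongs to another class) for mixed clusters. The number of clusters and the border criterion are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

def select_prototypes(X, y, n_clusters=10, random_state=0):
    """Cluster-based prototype selection (illustrative sketch)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state).fit(X)
    keep = []
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        if len(idx) == 0:
            continue
        if len(set(y[idx])) == 1:
            # Homogeneous cluster -> one interior prototype: the member closest to the centre.
            d = np.linalg.norm(X[idx] - km.cluster_centers_[c], axis=1)
            keep.append(int(idx[np.argmin(d)]))
        else:
            # Mixed cluster -> border prototypes: points whose nearest in-cluster
            # neighbour belongs to a different class.
            D = np.linalg.norm(X[idx][:, None] - X[idx][None, :], axis=2)
            np.fill_diagonal(D, np.inf)
            nn = idx[np.argmin(D, axis=1)]
            keep.extend(int(i) for i, j in zip(idx, nn) if y[i] != y[j])
    return np.array(sorted(set(keep)))

X, y = make_blobs(n_samples=500, centers=3, cluster_std=2.0, random_state=1)
sel = select_prototypes(X, y)
print(f"kept {len(sel)} of {len(X)} prototypes")
```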
117.
This paper addresses smooth trajectory planning for industrial robots in environments with obstacles using a direct method, creating the trajectory gradually as the robot moves. The presented method deals with the uncertainties associated with the lack of knowledge of the kinematic properties of intermediate via-points, since they are generated as the algorithm evolves while looking for the solution. Several cost functions are also proposed, which use the time that has been calculated to guide the robot motion. The method has been applied successfully to a PUMA 560 robot, and four operational parameters (execution time, computational time, distance travelled and number of configurations) have been computed to study the properties and influence of each cost function on the trajectory obtained.
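The paper works in the joint space of a PUMA 560 with several candidate cost functions; the sketch below is only a 2D toy of the "build the trajectory gradually" idea: at each configuration, sample candidate moves, discard colliding ones, and keep the one minimizing a simple cost (here, the remaining distance to the goal). The obstacle model, step size and cost function are assumptions, and this greedy toy can get trapped where the paper's method would not.

```python
import math

OBSTACLES = [((2.0, 0.0), 0.8), ((3.0, 3.0), 0.7)]   # (centre, radius), hypothetical

def collision_free(p):
    return all(math.dist(p, c) > r for c, r in OBSTACLES)

def greedy_plan(start, goal, step=0.2, max_steps=200, n_candidates=16):
    """Build the path gradually: at each via-point, try candidate moves on a circle
    of radius `step` and keep the collision-free one with the lowest cost
    (here simply the remaining distance to the goal)."""
    path, p = [start], start
    for _ in range(max_steps):
        if math.dist(p, goal) <= step:
            path.append(goal)
            break
        candidates = [(p[0] + step * math.cos(t), p[1] + step * math.sin(t))
                      for t in (2 * math.pi * k / n_candidates for k in range(n_candidates))]
        feasible = [q for q in candidates if collision_free(q)]
        if not feasible:
            break                      # trapped; a real planner would back-track
        p = min(feasible, key=lambda q: math.dist(q, goal))
        path.append(p)
    return path

path = greedy_plan((0.0, 0.0), (5.0, 3.0))
print(len(path), "via-points, final point:", tuple(round(c, 2) for c in path[-1]))
```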
118.
Credit scoring modelling comprises one of the leading formal tools for supporting the granting of credit. Its core objective is the generation of a score by means of which potential clients can be ranked in order of their probability of default. A critical factor is whether a credit scoring model is accurate enough to provide a correct classification of the client as a good or bad payer. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining the predicted values from models fitted to several replicated datasets and then combining them into a single predictive classification in order to improve classification accuracy. In this paper we propose a new bagging-type variant, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is set in the context of credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit-granting dataset, using up to three successions of resamplings. We observed better classification accuracy for the two-bagged and the three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach may improve the modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement.
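As a rough illustration of the bagging-over-successive-resamplings idea (a "two-bagged" model in the paper's terminology), the sketch below bags an ordinary bagged ensemble once more: an outer resampling produces several bags whose majority votes are combined again. The use of decision trees, the synthetic data, the ensemble sizes and the exact voting rule are assumptions, not the paper's procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def bagged_fit(X, y, n_estimators=11):
    """Ordinary bagging: fit each tree on a bootstrap resample."""
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(y), len(y))
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def vote(models, X):
    """Majority vote of a list of fitted binary classifiers."""
    preds = np.stack([m.predict(X) for m in models])
    return (preds.mean(axis=0) >= 0.5).astype(int)

X, y = make_classification(n_samples=600, n_features=10, random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=1)

# Second level of resampling: each outer replicate gets its own full bag.
level2 = []
for _ in range(5):
    idx = rng.integers(0, len(ytr), len(ytr))
    level2.append(bagged_fit(Xtr[idx], ytr[idx]))

outer_votes = np.stack([vote(bag, Xte) for bag in level2])   # combine the bags' votes again
y_hat = (outer_votes.mean(axis=0) >= 0.5).astype(int)
print("two-bagged accuracy:", round(float((y_hat == yte).mean()), 3))
```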
119.
This paper presents a parameterized shared-memory scheme for parameterized metaheuristics. The use of a parameterized metaheuristic facilitates experimentation with different metaheuristics and hybridizations/combinations to adapt them to the particular problem at hand. Because of the large number of experiments needed for metaheuristic selection and tuning, parallelism should be used to reduce the execution time. To obtain parallel versions of the metaheuristics and to adapt them to the characteristics of the parallel system, a unified parameterized shared-memory scheme is developed. Given a particular computational system and fixed parameters for the sequential metaheuristic, the appropriate selection of parameters in the unified parallel scheme eases the development of efficient parallel metaheuristics.
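The paper's scheme targets an OpenMP-style shared-memory skeleton; as a rough Python analogue (an assumption, not the paper's scheme), the sketch below shows a metaheuristic skeleton whose behaviour is controlled both by algorithmic parameters (population size, mutation rate, iterations) and by a parallelism parameter (the number of worker threads used to evaluate candidates).

```python
import random
from concurrent.futures import ThreadPoolExecutor

def cost(x):
    """Toy objective: sphere function (to be minimised)."""
    return sum(v * v for v in x)

def parameterized_metaheuristic(dim=10, pop_size=40, iters=200,
                                mutation=0.3, n_threads=4, seed=0):
    """Skeleton of a simple evolutionary metaheuristic where both the algorithmic
    parameters and the degree of parallelism are explicit parameters.
    NOTE: in CPython, threads only speed this up if `cost` releases the GIL
    (e.g. native code); this is a structural sketch, not a performance claim."""
    rnd = random.Random(seed)
    pop = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        for _ in range(iters):
            scores = list(pool.map(cost, pop))               # parallel evaluation
            ranked = [p for _, p in sorted(zip(scores, pop))]
            parents = ranked[: pop_size // 2]                # selection
            children = [[v + rnd.gauss(0, mutation) for v in rnd.choice(parents)]
                        for _ in range(pop_size - len(parents))]
            pop = parents + children                         # combination step
    return min(pop, key=cost)

best = parameterized_metaheuristic(n_threads=8, mutation=0.2)
print("best cost:", round(cost(best), 4))
```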
120.
In this paper, we propose a methodology for training a new model of artificial neural network called the generalized radial basis function (GRBF) neural network. This model is based on the generalized Gaussian distribution, which parametrizes the Gaussian distribution by adding a new parameter τ. The generalized radial basis function allows different radial basis functions to be represented by updating the new parameter τ; for example, when τ=2 the GRBF represents the standard Gaussian radial basis function. The model parameters are optimized through a modified version of the extreme learning machine (ELM) algorithm. In the proposed methodology (MELM-GRBF), the centers of each GRBF are taken randomly from the patterns of the training set, and the radius and τ values are determined analytically, taking into account that the model must fulfil two constraints: locality and coverage. A thorough experimental study is presented to test its overall performance. Fifteen datasets were considered, including binary and multi-class problems, all of them taken from the UCI repository. The MELM-GRBF was compared to ELM with sigmoidal, hard-limit, triangular basis and radial basis functions in the hidden layer, and to the ELM-RBF methodology proposed by Huang et al. (2004) [1]. The MELM-GRBF obtained better accuracy than the corresponding sigmoidal, hard-limit, triangular basis and radial basis functions for almost all datasets, producing the highest mean accuracy rank across all datasets.
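The sketch below follows the abstract's description only at a high level: centers are drawn randomly from the training patterns, hidden activations use a generalized Gaussian exp(-(||x - c||/σ)^τ), and output weights come from a single least-squares (ELM-style) solve. Fixing σ and τ as constants here is an assumption; the paper determines them analytically from the locality and coverage constraints.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def grbf_hidden(X, centers, sigma, tau):
    """Generalized RBF activations exp(-(||x - c|| / sigma) ** tau); tau = 2 is the standard Gaussian RBF."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-((d / sigma) ** tau))

def elm_grbf_fit(X, y, n_hidden=20, sigma=2.0, tau=2.0):
    """ELM-style training: random centers drawn from the training patterns,
    output weights obtained by one closed-form least-squares solve."""
    centers = X[rng.choice(len(X), n_hidden, replace=False)]
    H = grbf_hidden(X, centers, sigma, tau)
    T = np.eye(y.max() + 1)[y]                       # one-hot targets
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)     # output weights
    return centers, beta

def elm_grbf_predict(X, centers, beta, sigma=2.0, tau=2.0):
    return grbf_hidden(X, centers, sigma, tau).dot(beta).argmax(axis=1)

X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=1, stratify=y)
centers, beta = elm_grbf_fit(Xtr, ytr, n_hidden=20, sigma=2.0, tau=2.0)
acc = (elm_grbf_predict(Xte, centers, beta) == yte).mean()
print("test accuracy:", round(float(acc), 3))
```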