By access:
  Paid full text: 700 articles
  Free: 69 articles
  Free (domestic): 8 articles
By subject (number of articles):
  Electrical engineering: 9
  General: 22
  Chemical industry: 10
  Metalworking: 17
  Machinery and instrumentation: 68
  Building science: 10
  Mining engineering: 6
  Energy and power: 11
  Light industry: 2
  Petroleum and natural gas: 1
  Radio electronics: 102
  General industrial technology: 38
  Nuclear technology: 2
  Automation technology: 479
By publication year (number of articles):
  2025: 4
  2024: 5
  2023: 10
  2022: 9
  2021: 14
  2020: 9
  2019: 20
  2018: 14
  2017: 22
  2016: 26
  2015: 27
  2014: 40
  2013: 39
  2012: 64
  2011: 67
  2010: 25
  2009: 44
  2008: 37
  2007: 47
  2006: 26
  2005: 31
  2004: 30
  2003: 17
  2002: 18
  2001: 10
  2000: 12
  1999: 7
  1998: 12
  1997: 13
  1996: 5
  1995: 11
  1994: 11
  1993: 4
  1992: 8
  1991: 2
  1988: 3
  1987: 3
  1986: 3
  1985: 6
  1984: 4
  1983: 5
  1982: 2
  1981: 2
  1980: 2
  1979: 5
  1978: 2
Sort order: 777 results in total (search time: 15 ms)
41.
Yi Peng, Gang Kou, Yong Shi, Zhengxin Chen. Decision Support Systems, 2008, 44(4): 1016–1030
Speed and scalability are two essential issues in data mining and knowledge discovery. This paper proposes a mathematical programming model that addresses both issues and applies it to credit classification problems. The proposed Multi-criteria Convex Quadratic Programming (MCQP) model is highly efficient (computing time complexity between O(n^1.5) and O(n^2)) and scales to massive problems (on the order of 10^9 records) because it only needs to solve a system of linear equations to find the global optimum. Kernel functions are introduced into the model to handle nonlinear problems, and the theoretical relationship between the MCQP model and SVM is discussed.
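The key claim, that training reduces to solving one linear system whose solution is the global optimum, can be illustrated with a least-squares-SVM-style sketch that shares this property. This is an analogy, not the authors' exact MCQP formulation; the RBF kernel and the `gamma`/`sigma` values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_kernel_classifier(X, y, gamma=10.0, sigma=1.0):
    """Solve (K + I/gamma) alpha = y: one linear system, so the global
    optimum is found without any iterative QP solver."""
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + np.eye(len(X)) / gamma, y)

def predict(X_train, alpha, X_test, sigma=1.0):
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ alpha)

# Toy credit-style binary labels in {-1, +1}
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=200))
alpha = fit_kernel_classifier(X, y)
print((predict(X, alpha, X) == y).mean())  # training accuracy
```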
42.
43.
The importance of generalizability for anomaly detection   (total citations: 1; self-citations: 1; citations by others: 0)
In security-related areas there is concern over novel “zero-day” attacks that penetrate system defenses and wreak havoc. The best methods for countering these threats are recognizing “nonself,” as in an Artificial Immune System, or recognizing “self” through clustering. In either case, the concern remains that something that appears similar to self could be missed. Given this situation, one could incorrectly assume that preferring a tighter fit to self over generalizability is important for false-positive reduction in this type of learning problem. This article confirms that in anomaly detection, as in other forms of classification, a tight fit, although important, does not supersede model generality. This is shown using three systems, each with a different geometric bias in the decision space. The first two use spherical and ellipsoid clusters with a k-means algorithm modified to work on the one-class (blind) classification problem. The third is based on wrapping the self points with a multidimensional convex hull (polytope) algorithm capable of learning disjunctive concepts via a thresholding constant. All three algorithms are tested using the Voting dataset from the UCI Machine Learning Repository, the MIT Lincoln Labs intrusion detection dataset, and the lossy-compressed steganalysis domain.

Gilbert “Bert” Peterson is an Assistant Professor of Computer Engineering at the Air Force Institute of Technology. Dr. Peterson received a B.S. in Architecture and an M.S. and Ph.D. in Computer Science from the University of Texas at Arlington. He teaches and conducts research in digital forensics and artificial intelligence. Brent McBride is a Communications and Information Systems officer in the United States Air Force. He received a B.S. in Computer Science from Brigham Young University and an M.S. in Computer Science from the Air Force Institute of Technology. He currently serves as Senior Software Engineer at the Air Force Wargaming Institute.
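A minimal sketch of the “recognizing self through clustering” baseline: cluster the self data with k-means and flag points far from every centroid. This shows only the generic spherical-cluster idea, not the authors' modified one-class k-means or the convex-hull variant; the number of clusters and the threshold quantile are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_self_model(X_self, k=5, quantile=0.99):
    """Cluster 'self' data; the anomaly threshold is a high quantile of
    training distances, leaving some slack for generalization rather
    than enforcing the tightest possible fit (the article's point)."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_self)
    d = km.transform(X_self).min(axis=1)   # distance to nearest centroid
    return km, np.quantile(d, quantile)

def is_anomaly(km, threshold, X):
    return km.transform(X).min(axis=1) > threshold

rng = np.random.default_rng(1)
X_self = rng.normal(size=(500, 4))
km, thr = fit_self_model(X_self)
print(is_anomaly(km, thr, rng.normal(5.0, 1.0, size=(3, 4))))  # far points -> True
```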
44.
We consider the problem of fitting a convex piecewise-linear function, with some specified form, to given multi-dimensional data. Except for a few special cases, this problem is hard to solve exactly, so we focus on heuristic methods that find locally optimal fits. The method we describe, which is a variation on the K-means algorithm for clustering, seems to work well in practice, at least on data that can be fit well by a convex function. We focus on the simplest function form, a maximum of a fixed number of affine functions, and then show how the methods extend to a more general form.
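A sketch of this K-means-like heuristic for the simplest form, a maximum of k affine functions: alternate between assigning each point to the affine piece that currently attains the maximum and least-squares refitting each piece on its assigned points. The initialization, refit guard, and iteration count are assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_max_affine(X, y, k=4, iters=50, seed=0):
    """Alternate (1) assignment of points to the currently active affine
    piece with (2) least-squares refit of each piece on its points."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])        # append 1 for the offset term
    theta = rng.normal(size=(k, d + 1)) * 0.1   # rows are (a_j, b_j)
    for _ in range(iters):
        assign = (Xa @ theta.T).argmax(axis=1)  # active piece per point
        for j in range(k):
            m = assign == j
            if m.sum() > d:                     # enough points to refit piece j
                theta[j], *_ = np.linalg.lstsq(Xa[m], y[m], rcond=None)
    return theta

def eval_max_affine(theta, X):
    Xa = np.hstack([X, np.ones((len(X), 1))])
    return (Xa @ theta.T).max(axis=1)

# Fit a convex target y = ||x||^2, well approximated by a max of affines
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(400, 2))
y = (X ** 2).sum(axis=1)
theta = fit_max_affine(X, y, k=6)
print(np.abs(eval_max_affine(theta, X) - y).mean())  # mean absolute fit error
```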
45.
46.
The convex hull algorithm for simple polygons due to Sklansky fails in some cases, but its extreme simplicity compared to later algorithms has revived interest in it. A sufficient condition for its success was given recently by Toussaint and Avis, who proved that the algorithm works for polygons known as weakly externally visible polygons.

In this paper a new notion called external left visibility is introduced, and it is shown to be a necessary and sufficient condition for the success of Sklansky's algorithm. Moreover, algorithms testing simple polygons for external left visibility and weak external visibility are given.
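For concreteness, here is a single-pass, stack-based scan in the style of Sklansky's algorithm, started at a vertex guaranteed to lie on the hull. As the paper establishes, such a scan is correct only for externally left visible polygons, so this is an illustrative sketch, not a general convex hull routine.

```python
def cross(o, a, b):
    """Twice the signed area of triangle (o, a, b); > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def sklansky_scan(poly):
    """Single stack scan over a CCW simple polygon, started at the
    lowest-leftmost vertex. Reflex turns are popped. NOT correct for
    all simple polygons, per the paper's counterexamples."""
    start = min(range(len(poly)), key=lambda i: (poly[i][1], poly[i][0]))
    pts = poly[start:] + poly[:start]        # rotate so a hull vertex is first
    stack = pts[:2]
    for p in pts[2:] + [pts[0]]:             # close the scan at the start vertex
        while len(stack) >= 2 and cross(stack[-2], stack[-1], p) <= 0:
            stack.pop()
        stack.append(p)
    stack.pop()                              # drop the repeated start vertex
    return stack

square_with_notch = [(0, 0), (4, 0), (4, 4), (2, 3), (0, 4)]  # CCW, one reflex vertex
print(sklansky_scan(square_with_notch))  # hull without the notch vertex (2, 3)
```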

47.
A method is presented for finding all vertices and all hyperplanes containing the faces of a convex polyhedron spanned by a given finite set X in Euclidean space E^n. The paper indicates how this method can be applied to the investigation of linear separability of two given finite sets X1 and X2 in E^n. When these sets are linearly separable, the proposed method makes it possible to find the separating hyperplane.
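The linear-separability question can also be decided as a plain LP feasibility problem; the sketch below uses this standard alternative route, not the paper's vertex- and hyperplane-enumeration method, and returns a separating hyperplane when one exists.

```python
import numpy as np
from scipy.optimize import linprog

def separating_hyperplane(X1, X2):
    """Find (w, b) with w.x + b >= 1 on X1 and w.x + b <= -1 on X2
    via an LP feasibility problem; returns (w, b) or None."""
    d = X1.shape[1]
    # Rows encode -(w.x + b) <= -1 for X1 and (w.x + b) <= -1 for X2.
    A = np.vstack([np.hstack([-X1, -np.ones((len(X1), 1))]),
                   np.hstack([ X2,  np.ones((len(X2), 1))])])
    b = -np.ones(len(X1) + len(X2))
    res = linprog(np.zeros(d + 1), A_ub=A, b_ub=b,
                  bounds=[(None, None)] * (d + 1))
    return (res.x[:d], res.x[d]) if res.success else None

X1 = np.array([[0.0, 0.0], [1.0, 0.0]])
X2 = np.array([[0.0, 3.0], [1.0, 3.0]])
print(separating_hyperplane(X1, X2))  # a (w, b) exists: the sets are separable
```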
48.
A two-stage algorithm was recently proposed by Sklansky (1982) for computing the convex hull of a simple polygon P. The first step is intended to compute a simple polygon P1 that is monotonic in both the x and y directions and contains the convex hull vertices of P. The second step applies a very simple convex hull algorithm to P1. In this note we show that the first step does not always work correctly and can even yield non-simple polygons, invalidating the second step. It is also shown that the first step can discard convex hull vertices, thus invalidating the use of any convex hull algorithm in the second step.
49.
Classic linear dimensionality reduction (LDR) methods, such as principal component analysis (PCA) and linear discriminant analysis (LDA), are known not to be robust against outliers. Following a systematic analysis of the multi-class LDR problem in a unified framework, we propose a new algorithm, called minimal distance maximization (MDM), to address the non-robustness issue. The principle behind MDM is to maximize the minimal between-class distance in the output space. MDM is formulated as a semi-definite program (SDP), and its dual problem reveals a close connection to “weighted” LDR methods. A soft version of MDM, in which LDA is subsumed as a special case, is also developed to deal with overlapping centroids. Finally, we drop the homoscedastic Gaussian assumption made in MDM by extending it in a non-parametric way, along with a gradient-based convex approximation algorithm that significantly reduces the complexity of the original SDP. The effectiveness of the proposed methods is validated on two UCI datasets and two face datasets.
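The core “maximize the minimal between-class distance” step can be sketched as a small SDP: the metric M is a PSD variable, each pairwise-centroid constraint is linear in M, and a trace bound normalizes the scale. The trace normalization and the omission of the final rank-reduction step are simplifications, not the paper's exact formulation; cvxpy is assumed as the modeling layer.

```python
import numpy as np
import cvxpy as cp

def mdm_style_sdp(means, trace_bound=None):
    """Maximize the minimal between-class squared distance under a PSD
    metric M with bounded trace. Each constraint
        (mu_i - mu_j)^T M (mu_i - mu_j) >= t
    is linear in M, so the problem is a standard SDP. A low-rank factor
    of M would give the actual linear map; that step is omitted here."""
    d = means.shape[1]
    M = cp.Variable((d, d), PSD=True)
    t = cp.Variable()
    cons = [cp.trace(M) <= (trace_bound if trace_bound else d)]
    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            diff = means[i] - means[j]
            cons.append(cp.quad_form(diff, M) >= t)  # affine in M (diff is constant)
    cp.Problem(cp.Maximize(t), cons).solve()
    return M.value, t.value

mu = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])  # three class centroids
M, t = mdm_style_sdp(mu)
print(t)  # optimal minimal between-class squared distance
```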
50.
We consider a class of finite-time-horizon optimal control problems for continuous-time linear systems with a convex cost, convex state constraints, and non-convex control constraints. We propose a convex relaxation of the non-convex control constraints and prove that the optimal solution of the relaxed problem is also an optimal solution of the original problem; this is referred to as lossless convexification of the optimal control problem. Lossless convexification enables the use of interior-point methods of convex optimization to obtain globally optimal solutions of the original non-convex optimal control problem. The solution approach is demonstrated on a number of planetary soft-landing optimal control problems.
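A sketch of the relaxation on a toy discretized soft-landing problem: the non-convex thrust bound rho1 <= ||u|| <= rho2 is replaced by a slack sigma with ||u|| <= sigma and rho1 <= sigma <= rho2, giving a convex second-order cone program. The double-integrator dynamics, horizon, and all numbers are illustrative assumptions, not the paper's model.

```python
import numpy as np
import cvxpy as cp

# 2-D double integrator falling under gravity, Euler-discretized.
# All parameter values below are illustrative assumptions.
N, dt, g = 30, 0.5, np.array([0.0, -1.0])
rho1, rho2 = 0.5, 3.0                       # non-convex thrust bounds
x0, v0 = np.array([10.0, 20.0]), np.array([-1.0, 0.0])

x = cp.Variable((N + 1, 2))   # position
v = cp.Variable((N + 1, 2))   # velocity
u = cp.Variable((N, 2))       # thrust acceleration
s = cp.Variable(N)            # slack sigma for the relaxed thrust bound

cons = [x[0] == x0, v[0] == v0, x[N] == 0, v[N] == 0,
        x[:, 1] >= 0]                        # stay above the ground
for k in range(N):
    cons += [x[k + 1] == x[k] + dt * v[k],
             v[k + 1] == v[k] + dt * (u[k] + g),
             cp.norm(u[k]) <= s[k],          # convex relaxation of ||u|| bounds
             s[k] >= rho1, s[k] <= rho2]

# Fuel use ~ integral of sigma; lossless convexification says the
# relaxation is tight, i.e. ||u_k|| = sigma_k at the optimum.
prob = cp.Problem(cp.Minimize(dt * cp.sum(s)), cons)
prob.solve()
print(prob.status, prob.value)
```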