61.
A fast non-iterative algorithm for the solution of large 3-D acoustic scattering problems is presented. The proposed approach can be used in conjunction with the conventional boundary element discretization of the integral equations of acoustic scattering. The algorithm involves domain decomposition and uses the nonuniform grid (NG) approach for the initial compression of the interactions between each subdomain and the rest of the scatterer. These interactions, represented by the off-diagonal blocks of the boundary element method matrix, are then further compressed while constructing sets of interacting and local basis and testing functions. The compressed matrix is obtained by eliminating the local degrees of freedom through a Schur complement-based procedure applied to the diagonal blocks. In the solution process, the interacting unknowns are first determined by solving the compressed system of equations. Subsequently, the local degrees of freedom are determined for each subdomain. The proposed technique effectively reduces the oversampling typically needed when using low-order discretization techniques and provides significant computational savings.
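The solution stage described above follows the classical Schur-complement elimination pattern: solve a compressed system for the interacting unknowns, then back-substitute for the local ones. The sketch below illustrates that pattern on a small dense block system; the block names and sizes are assumptions for illustration only and do not reproduce the paper's NG-compressed matrices.

```python
import numpy as np

# Toy block system partitioned into "local" (l) and "interacting" (i) unknowns:
#   [A_ll  A_li] [x_l]   [b_l]
#   [A_il  A_ii] [x_i] = [b_i]
# Block names and sizes are assumptions for illustration only.
rng = np.random.default_rng(0)
n_loc, n_int = 6, 4
n = n_loc + n_int
A = rng.standard_normal((n, n)) + 10.0 * np.eye(n)   # well-conditioned toy matrix
b = rng.standard_normal(n)
A_ll, A_li = A[:n_loc, :n_loc], A[:n_loc, n_loc:]
A_il, A_ii = A[n_loc:, :n_loc], A[n_loc:, n_loc:]
b_l, b_i = b[:n_loc], b[n_loc:]

# Eliminate the local degrees of freedom: the Schur complement of A_ll gives the
# compressed system that couples only the interacting unknowns.
A_ll_inv_A_li = np.linalg.solve(A_ll, A_li)
A_ll_inv_b_l = np.linalg.solve(A_ll, b_l)
S = A_ii - A_il @ A_ll_inv_A_li
rhs = b_i - A_il @ A_ll_inv_b_l

# Step 1: solve the compressed system for the interacting unknowns.
x_i = np.linalg.solve(S, rhs)
# Step 2: back-substitute to recover the local unknowns of the subdomain.
x_l = A_ll_inv_b_l - A_ll_inv_A_li @ x_i

# Sanity check against a direct solve of the full system.
print(np.allclose(np.concatenate([x_l, x_i]), np.linalg.solve(A, b)))
```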
62.
63.
Objective: We conducted a citation analysis to explore the impact of articles published in Health Psychology and determine whether the journal is fulfilling its stated mission. Design: Six years of articles (N = 408) representing three editorial tenures from 1993–2003 were selected for analysis. Main Outcome Measures: Articles were coded on several dimensions, enabling examination of the relationship of article features to subsequent citation rates. Journals citing articles published in Health Psychology were classified into four categories: (1) psychology, (2) medicine, (3) public health and health policy, and (4) other journals. Results: The majority of citations of Health Psychology articles were in psychology journals, followed closely by medical journals. Studies reporting data collected from college students, and discussing the theoretical implications of findings, were more likely to be cited in psychology journals, whereas studies reporting data from clinical populations, and discussing the practice implications of findings, were more likely to be cited in medical journals. Time since publication and page length were both associated with increased citation counts, and review articles were cited more frequently than observational studies. Conclusion: Articles published in Health Psychology have a wide reach, informing psychology, medicine, public health, and health policy. Certain characteristics of articles affect their subsequent pattern of citation.
64.
65.
This paper presents a method for the design of PID-type controllers, including those augmented by a filter on the D element, satisfying a required gain margin and an upper bound on the (complementary) sensitivity for a finite set of plants. Important properties of the method are: (i) it can be applied to plants of any order including non-minimum phase plants, plants with delay, plants characterized by quasi-polynomials, unstable plants and plants described by measured data, (ii) the sensors associated with the PI terms and the D term can be different (i.e., they can have different transfer function models), (iii) the algorithm relies on explicit equations that can be solved efficiently, (iv) the algorithm can be used in near real-time to determine a controller for on-line modification of a plant accounting for its uncertainty and closed-loop specifications, (v) a single plot can be generated that graphically highlights tradeoffs among the gain margin, (complementary) sensitivity bound, low-frequency sensitivity and high-frequency sensor noise amplification, and (vi) the optimal controller for a practical definition of optimality can readily be identified.
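As a rough illustration of the kind of specification the method targets, and not the authors' explicit design equations, the sketch below evaluates a PID controller with a filtered D term against a gain-margin requirement and a complementary-sensitivity bound for a small, assumed set of plants on a frequency grid. The controller gains, filter constant, and plant models are placeholders.

```python
import numpy as np

def pid_with_filter(kp, ki, kd, tau):
    """Frequency response of a PID controller whose D term is augmented by a
    first-order filter 1/(tau*s + 1)."""
    def C(w):
        s = 1j * w
        return kp + ki / s + kd * s / (tau * s + 1.0)
    return C

def fopdt_plant(gain, T, delay):
    """First-order-plus-delay plant gain*e^(-delay*s)/(T*s + 1); one member of the
    finite plant set (placeholder model, not taken from the paper)."""
    def P(w):
        s = 1j * w
        return gain * np.exp(-delay * s) / (T * s + 1.0)
    return P

def meets_specs(C, P, w, gm_required=2.0, t_bound=1.3):
    """Numerically check a gain-margin requirement and an upper bound on the
    complementary sensitivity |T| over a frequency grid."""
    L = C(w) * P(w)                      # open-loop frequency response
    T = L / (1.0 + L)                    # complementary sensitivity
    phase = np.unwrap(np.angle(L))
    # Gain margin: 1/|L| at the first crossing of -180 degrees.
    cross = np.where(np.diff(np.sign(phase + np.pi)) != 0)[0]
    gm = 1.0 / np.abs(L[cross[0]]) if cross.size else np.inf
    return gm >= gm_required and np.max(np.abs(T)) <= t_bound

w = np.logspace(-3, 2, 4000)
C = pid_with_filter(kp=1.2, ki=0.6, kd=0.4, tau=0.05)              # assumed tuning
plants = [fopdt_plant(1.0, 1.0, 0.2), fopdt_plant(1.3, 0.8, 0.3)]  # assumed plant set
print(all(meets_specs(C, P, w) for P in plants))
```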
66.
The present study provides evidence on the occurrence of DNA rearrangement between the redundant 5' and 3' R domains of equine infectious anemia virus (EIAV) Tat cDNA. This was correlated with a gradual loss of cDNA copy number, concomitant with a decrease in gene expression. Removal of the 5' RU5 abolished rearrangement and stabilized Tat expression in EIAV tat cDNA transfectants. Our data suggest that prior removal of the 5' R from cloned retroviral cDNAs can impede DNA rearrangement, thus preventing the cDNA excision that frequently occurs and hinders permanent expression of retroviral cDNAs in stable transfectants.
67.
Cluster analysis is a primary tool for detecting anomalous behavior in real-world data such as web documents, medical records of patients, or other personal data. Most existing methods for document clustering are based on the classical vector-space model, which represents each document by a fixed-size vector of weighted key terms, often referred to as key phrases. Since vector representations of documents are frequently very sparse, inverted files are used to prevent the tremendous computational overload that may arise in large and diverse document collections such as pages downloaded from the World Wide Web. In order to reduce computation costs and space complexity, many popular methods for clustering web documents, including those using inverted files, usually assume a relatively small, prefixed number of clusters.

We propose several new crisp and fuzzy approaches based on the cosine similarity principle for clustering documents that are represented by variable-size vectors of key phrases, without limiting the final number of clusters. Each entry in a vector consists of two fields: the first refers to a key phrase in the document, and the second denotes an importance weight associated with this key phrase within the particular document. Removing the restriction on the total number of clusters may moderately increase computing costs, but on the other hand it improves the method's performance in classifying incoming vectors as normal or abnormal, based on their similarity to the existing clusters. All the procedures presented in this work are characterized by two features: (a) the number of clusters is not restricted to some relatively small, prefixed number, i.e., an arbitrary new incoming vector that is not similar to any of the existing cluster centers necessarily starts a new cluster, and (b) a vector that appears n times in the training set is counted as n distinct vectors rather than a single vector. These features are the main reasons for the high-quality performance of the proposed algorithms. We later describe them in detail and show their implementation in a real-world application from the area of web activity monitoring, in particular, detecting anomalous documents downloaded from the internet by users with abnormal information interests.
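A minimal sketch of the kind of crisp, threshold-based clustering described above, with variable-size key-phrase vectors as sparse dicts, cosine similarity, and no prefixed number of clusters, is given below. It is illustrative only; the similarity threshold and the centroid update rule are assumptions, not the authors' exact procedure.

```python
import math

def cosine(u, v):
    """Cosine similarity of two sparse key-phrase vectors given as {phrase: weight} dicts."""
    common = set(u) & set(v)
    dot = sum(u[k] * v[k] for k in common)
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def crisp_cluster(docs, threshold=0.3):
    """Leader-style crisp clustering: a document not similar enough to any existing
    centroid starts a new cluster, so the number of clusters is not fixed in advance."""
    centroids, members = [], []
    for d in docs:  # a document appearing n times is simply processed n times
        sims = [cosine(d, c) for c in centroids]
        best = max(range(len(sims)), key=lambda i: sims[i]) if sims else None
        if best is not None and sims[best] >= threshold:
            members[best].append(d)
            # Update the centroid by accumulating the key-phrase weights of its members.
            for k, w in d.items():
                centroids[best][k] = centroids[best].get(k, 0.0) + w
        else:
            centroids.append(dict(d))
            members.append([d])
    return centroids, members

docs = [
    {"boundary element": 0.9, "acoustic scattering": 0.7},
    {"acoustic scattering": 0.8, "schur complement": 0.4},
    {"pid controller": 1.0, "gain margin": 0.6},
]
centroids, members = crisp_cluster(docs)
print(len(centroids), [len(m) for m in members])
```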
68.
We define the class of single-parent heap systems, which rely on a singly-linked heap in order to model destructive updates on tree structures. This encoding has the advantage of relying on a relatively simple theory of linked lists in order to support abstraction computation. To facilitate the application of this encoding, we provide a program transformation that, given a program operating on a multi-linked heap without sharing, transforms it into one over a single-parent heap. It is then possible to apply shape analysis by predicate and ranking abstraction. The technique has been successfully applied on examples with lists (reversal and bubble sort) and trees of fixed arity (balancing of, and insertion into, a binary sort tree).
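The sketch below is a simplified, assumed illustration of the single-parent idea for a binary (fixed-arity) tree: each node stores only a pointer to its parent and the index of the edge it descends from, and a small transformation converts a multi-linked tree without sharing into that encoding. Field names and the traversal are placeholders, not the paper's formal heap model or its program transformation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# A multi-linked node for a binary (fixed-arity) tree: each node points to its children.
@dataclass
class MultiLinkedNode:
    key: int
    children: List[Optional["MultiLinkedNode"]] = field(default_factory=lambda: [None, None])

# The single-parent encoding: every node keeps only a pointer to its parent plus the
# index of the edge it came from, so the heap is singly linked "upwards".
@dataclass
class SingleParentNode:
    key: int
    parent: Optional["SingleParentNode"] = None
    which_child: Optional[int] = None   # 0 = left, 1 = right for a binary tree

def to_single_parent(root: MultiLinkedNode) -> List[SingleParentNode]:
    """Transform a multi-linked tree (no sharing) into its single-parent encoding."""
    out: List[SingleParentNode] = []

    def visit(node: MultiLinkedNode, parent: Optional[SingleParentNode], idx: Optional[int]):
        sp = SingleParentNode(node.key, parent, idx)
        out.append(sp)
        for i, child in enumerate(node.children):
            if child is not None:
                visit(child, sp, i)

    visit(root, None, None)
    return out

# Example: a three-node binary tree.
tree = MultiLinkedNode(2, [MultiLinkedNode(1), MultiLinkedNode(3)])
print([(n.key, n.parent.key if n.parent else None, n.which_child) for n in to_single_parent(tree)])
```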
69.
70.
We present a gradient-based method for rigid registration of a patient's preoperative computed tomography (CT) to its intraoperative situation with a few fluoroscopic X-ray images obtained with a tracked C-arm. The method is noninvasive, anatomy-based, requires simple user interaction, and includes validation. It is generic and easily customizable for a variety of routine clinical uses in orthopaedic surgery. Gradient-based registration consists of three steps: 1) initial pose estimation; 2) coarse geometry-based registration on bone contours; and 3) fine gradient projection registration (GPR) on edge pixels. It optimizes speed, accuracy, and robustness. Its novelty resides in using volume gradients to eliminate outliers and foreign objects in the fluoroscopic X-ray images, in speeding up computation, and in achieving higher accuracy. It overcomes the drawbacks of intensity-based methods, which are slow and have a limited convergence range, and of geometry-based methods, which depend on the image segmentation quality. Our simulated, in vitro, and cadaver experiments on a human pelvis CT, dry vertebra, dry femur, fresh lamb hip, and human pelvis under realistic conditions show a mean 0.5-1.7 mm (0.5-2.6 mm maximum) target registration accuracy.
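For intuition only, the sketch below mimics the coarse geometry-based registration step in a deliberately simplified 2-D setting: rigidly transformed model contour points are matched to image edge pixels by nearest-point distance and the pose is optimized numerically. The actual method registers a 3-D CT to tracked fluoroscopic X-ray images and adds gradient projection refinement; every shape, pose, and parameter here is an assumption.

```python
import numpy as np
from scipy.optimize import minimize

def rigid_2d(points, theta, tx, ty):
    """Apply an in-plane rigid transform (rotation by theta, then translation)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return points @ R.T + np.array([tx, ty])

def contour_cost(pose, model_contour, edge_pixels):
    """Mean distance from each transformed model point to its nearest image edge pixel."""
    moved = rigid_2d(model_contour, *pose)
    d = np.linalg.norm(moved[:, None, :] - edge_pixels[None, :, :], axis=2)
    return d.min(axis=1).mean()

# Toy data: an open elliptical arc stands in for the projected bone contour from the CT,
# and a transformed, noisy copy of it stands in for the edge pixels of the X-ray image.
t = np.linspace(0.0, 1.5 * np.pi, 80)
model = np.c_[40.0 * np.cos(t), 25.0 * np.sin(t)]
true_pose = (0.15, 5.0, -3.0)                        # (theta [rad], tx, ty), assumed
rng = np.random.default_rng(1)
edges = rigid_2d(model, *true_pose) + rng.normal(0.0, 0.3, model.shape)

# Coarse registration: minimize the contour-to-edge distance over the rigid pose.
res = minimize(contour_cost, x0=np.zeros(3), args=(model, edges), method="Nelder-Mead")
print(res.x)   # recovered pose; should be close to true_pose
```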