1.
Principal components analysis (PCA) is a multivariate statistical technique that transforms a data set having a large number of inter-related variables to a new set of uncorrelated variables called the principal components, determined to allow the dimensionality of the data set to be reduced while retaining as much of the variation present as possible. PCA can be applied to dynamic structural response data to identify the predominant modes of vibration of the structure. Because PCA is a statistical technique, there are errors in the computed modes due to the use of a sample of finite size. The aim of this paper is to study the effect of sample size on the accuracy with which the modes of vibration can be computed. The paper focuses predominantly on elastic response data and examines the potential influence of various parameters such as the period of the structure, the input excitation, and the spatial distribution of mass over the structure. Issues relating to errors in the modes of nonlinear structures are also discussed.
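As a hedged illustration of the idea (a two-degree-of-freedom toy data set of our own, not the paper's structures or its error analysis), PCA applied to a centered response matrix recovers the dominant mode shape as the first principal direction:

```python
import numpy as np

# Toy demo: PCA (via SVD of the centered response matrix) recovers the
# dominant vibration mode as the first principal component direction.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)

# Assumed orthogonal mode shapes and modal responses at two frequencies.
phi1 = np.array([1.0, 1.0]) / np.sqrt(2)
phi2 = np.array([1.0, -1.0]) / np.sqrt(2)
q1 = 3.0 * np.sin(2 * np.pi * 1.0 * t)   # dominant mode
q2 = 0.5 * np.sin(2 * np.pi * 3.5 * t)   # weaker mode
X = np.outer(q1, phi1) + np.outer(q2, phi2)   # (samples, dofs)
X += 0.05 * rng.standard_normal(X.shape)      # measurement noise

Xc = X - X.mean(axis=0)                       # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
modes = Vt                                    # rows = principal directions

# With a large enough sample the first principal direction aligns
# closely with the dominant mode shape (sign ambiguity handled by abs).
alignment = abs(modes[0] @ phi1)
print(round(alignment, 3))
```

Shrinking the number of samples (the length of `t`) degrades the alignment, which is the finite-sample effect the paper studies.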
2.
Parekh, S., N. Gandhi, J. Hellerstein, D. Tilbury, T. Jayram, and J. Bigus. Real-Time Systems (2002) 23(1-2): 127-141
A widely used approach to achieving service level objectives for a software system (e.g., an email server) is to add a controller that manipulates the target system's tuning parameters. We describe a methodology for designing such controllers for software systems that builds on classical control theory. The classical approach proceeds in two steps: system identification and controller design. In system identification, we construct mathematical models of the target system. Traditionally, this has been based on a first-principles approach, using detailed knowledge of the target system. Such models can be complex and difficult to build, validate, use, and maintain. In our methodology, a statistical (ARMA) model is fit to historical measurements of the target being controlled. These models are easier to obtain and use, and allow us to apply control-theoretic design techniques to a larger class of systems. When applied to a Lotus Notes groupware server, we obtain model fits with R^2 no lower than 75% and as high as 98%. In controller design, an analysis of the models leads to a controller that will achieve the service level objectives. We report on an analysis of a closed-loop system using an integral control law with Lotus Notes as the target. The objective is to maintain a reference queue length. Using root-locus analysis from control theory, we are able to predict the occurrence (or absence) of controller-induced oscillations in the system's response. Such oscillations are undesirable since they increase variability, thereby resulting in a failure to meet the service level objective. We implement this controller for a real Lotus Notes system, and observe a remarkable correspondence between the behavior of the real system and the predictions of the analysis. This indicates that the control-theoretic analysis is sufficient to select controller parameters that meet the desired goals, and the need for simulations is reduced.
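A minimal sketch of the closed loop described above, with made-up plant parameters rather than the paper's identified Lotus Notes model: an integral control law accumulates the tracking error and drives a first-order plant to the reference queue length.

```python
# Hedged sketch (assumed numbers, not the paper's model): a first-order
# ARMA-style plant y[k+1] = a*y[k] + b*u[k] under an integral control law
# u[k] = u[k-1] + Ki*(ref - y[k]).
a, b = 0.8, 0.5          # hypothetical identified plant parameters
Ki = 0.2                 # integral gain (closed-loop poles inside unit circle)
ref = 10.0               # reference queue length
y, u = 0.0, 0.0
history = []
for _ in range(200):
    u += Ki * (ref - y)  # integral action accumulates the error
    y = a * y + b * u    # plant response
    history.append(y)

# Integral control removes steady-state error: y settles at ref.
print(round(history[-1], 2))  # 10.0
```

For these values the closed-loop poles are complex with magnitude sqrt(0.8) &lt; 1, so the response oscillates but converges; a larger `Ki` pushes the poles outward, producing exactly the controller-induced oscillations the root-locus analysis predicts.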
3.
The evolution of sulphur as H2S from three US bituminous coals, L-cystine, and thianthrene has been studied in the reflected shock region of a chemical shock tube. With heating rates of approximately 3 × 10^6 K s^-1, to temperatures in the range 1000–2000 K, ultimate yields of sulphur from 7 to 70% as H2S are observed in as little as 1.5 ms. The most important influence on ultimate yields may be the H/S molar ratio in the fuel, which corresponds in the hydrocarbons studied with the H2S yields. Inherent mineral matter may also influence the evolution, as H2S formation precedes light hydrocarbon formation from the two model compounds, but occurs nearly simultaneously from the coals. The overall rate of H2S formation for the three coals is adequately described by a reaction, first order with respect to remaining sulphur, with a rate constant k = 8.1 × 10^7 exp(-15 960/T) s^-1. In the pyrolysis of all five solids, the H2S yields are decreased to below detectable limits at temperatures >1500 K. The path for destruction of the H2S appears to be by reaction with hydrocarbons such as C2H2, the concentration of which becomes similar to the H2S concentration in the temperature range 1500–1600 K.
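The reported first-order rate law can be evaluated directly. The sketch below uses the abstract's Arrhenius constants, but the constant-temperature assumption is our simplification of the shock-tube heating history:

```python
import math

# First-order evolution of sulphur as H2S: dS/dt = -k(T) * S,
# with k(T) = 8.1e7 * exp(-15960 / T)  [1/s]  (constants from the abstract;
# the isothermal assumption below is ours, not the paper's).
def rate_constant(T_kelvin: float) -> float:
    return 8.1e7 * math.exp(-15960.0 / T_kelvin)

def remaining_sulphur(T_kelvin: float, t_seconds: float) -> float:
    """Fraction of sulphur remaining after t seconds at constant T."""
    return math.exp(-rate_constant(T_kelvin) * t_seconds)

# At 1500 K, well over 90% of the sulphur evolves within 1.5 ms,
# consistent with the millisecond time scales reported.
evolved = 1.0 - remaining_sulphur(1500.0, 1.5e-3)
print(f"{evolved:.2f}")
```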
4.
BACKGROUND: Papillary serous carcinoma of the peritoneum (PSCP) is a rare primary peritoneal tumor, described exclusively in women. It is believed to arise from the secondary müllerian system, which comprises the pelvic and lower abdominal mesothelial lining and subjacent (subcoelomic) mesenchyme in women. Both mesotheliomas and PSCP arise from the coelomic epithelium, but are clinicopathologically and biologically distinct entities. METHODS: The authors report clinicopathologic findings in a man, age 74 years, who died 3 months after the diagnosis of an extensive malignant abdominal disease. RESULTS: The routine histologic and immunocytochemical studies of tumor tissue, obtained during the patient's lifetime and at autopsy, validated the unique occurrence of PSCP in a man. CONCLUSIONS: This case illustrates that PSCP can occur in a man and that this diagnosis may be considered in the differential diagnosis of papillary serous tumors of the peritoneum in male patients. Although rare, PSCP is a diagnostically distinct entity, the treatment of which is similar to that of ovarian serous tumors rather than mesotheliomas.
5.
We consider the preemptive job shop scheduling problem with two machines, with the objective to minimize the makespan. We present an algorithm that finds a schedule of length at most P_max/2 greater than the optimal schedule length, where P_max is the length of the longest job. Received June 13, 2000
6.
OLAP over uncertain and imprecise data
We extend the OLAP data model to represent data ambiguity, specifically imprecision and uncertainty, and introduce an allocation-based approach to the semantics of aggregation queries over such data. We identify three natural query properties and use them to shed light on alternative query semantics. While there is much work on representing and querying ambiguous data, to our knowledge this is the first paper to handle both imprecision and uncertainty in an OLAP setting.
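As a hedged toy illustration of allocation-based aggregation (the schema, weights, and numbers are ours, not the paper's): an imprecise fact whose dimension value is only known at a coarse level is allocated fractionally across the candidate cells, and aggregates weight each fact by its allocation.

```python
# Hypothetical sales cube: one fact records only "East region", so it is
# allocated 50/50 across the two candidate cities; SUM then weights each
# fact's measure by its allocation to the cell.
from collections import defaultdict

facts = [
    # (possible cities, allocation weights summing to 1, sales amount)
    (["NYC"], [1.0], 100.0),                  # precise fact
    (["NYC", "Boston"], [0.5, 0.5], 80.0),    # imprecise fact: "East"
]

sales = defaultdict(float)
for cities, weights, amount in facts:
    for city, w in zip(cities, weights):
        sales[city] += w * amount             # allocation-weighted SUM

print(dict(sales))  # {'NYC': 140.0, 'Boston': 40.0}
```

Under this semantics the per-cell totals still sum to the total of the raw measures, one of the natural consistency properties one would want an aggregation semantics to satisfy.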
7.
Silica nanospheres have been widely explored for drug delivery, photocatalysis, sensor, and energy storage applications. They also serve as templates for Surface-Enhanced Raman Spectroscopy (SERS) substrates. Producing uniform nanostructures at low cost with high reproducibility is the major challenge in SERS substrate fabrication. In the present work, silica nanospheres were synthesized using the Stöber method and deposited onto glass slides using vertical deposition techniques. Silver (Ag) nanoparticles of different sizes/thicknesses were deposited onto the silica thin films using the sputter deposition technique. The monodispersity of the silica nanospheres and the sizes of the silver nanoparticles (10 nm, 20 nm, and 30 nm) were confirmed by FESEM analysis. The structural properties were confirmed through XRD. UV–Vis analysis revealed that the plasmonic properties of Ag@SiO2 give the strongest surface plasmon response for the 30 nm silver thickness. The binding energy of Ag@SiO2 was confirmed through the XPS spectrum. The fabricated SERS substrates were used to detect Rhodamine 6G (R6G), Methylene blue (MB), Methylene violet (MV), and Methyl orange dyes as analyte molecules, with a limit of detection of about 10^-11 mol/L. The addition of SiO2 nanospheres decreases the Ag oxidation rate and increases their stability. The maximum enhancement factor (1.5 × 10^7) was achieved for the 30 nm thickness of Ag@SiO2. The results establish the potential of this technique for producing reproducible SERS substrates.
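A hedged sketch of the standard SERS enhancement-factor estimate, EF = (I_SERS/N_SERS)/(I_ref/N_ref). The intensities and molecule counts below are illustrative values chosen to land at the reported order of magnitude, not the paper's measurements:

```python
# Standard SERS analytical enhancement-factor estimate; all inputs here
# are made-up illustration values, not data from the abstract.
def enhancement_factor(i_sers: float, n_sers: float,
                       i_ref: float, n_ref: float) -> float:
    """EF = (I_SERS / N_SERS) / (I_ref / N_ref)."""
    return (i_sers / n_sers) / (i_ref / n_ref)

# A strong SERS signal from far fewer probed molecules than the
# normal-Raman reference gives an EF on the order of 10^7.
ef = enhancement_factor(i_sers=1.5e4, n_sers=1.0e3,
                        i_ref=1.0e3, n_ref=1.0e9)
print(f"{ef:.1e}")  # 1.5e+07
```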
8.
We present a new method for proving strong lower bounds in communication complexity. This method is based on the notion of the conditional information complexity of a function, which is the minimum amount of information about the inputs that has to be revealed by a communication protocol for the function. While conditional information complexity is a lower bound on communication complexity, we show that it also admits a direct sum theorem. Direct sum decomposition reduces our task to that of proving conditional information complexity lower bounds for simple problems (such as the AND of two bits). For the latter, we develop novel techniques based on Hellinger distance and its generalizations. Our paradigm leads to two main results:
(1) An improved lower bound for the multi-party set-disjointness problem in the general communication complexity model, and a nearly optimal lower bound in the one-way communication model. As a consequence, we show that for any real k>2, approximating the kth frequency moment in the data stream model requires essentially Ω(n^(1-2/k)) space; this resolves a conjecture of Alon et al. (J. Comput. System Sci. 58(1) (1999) 137).
(2) A lower bound for the Lp approximation problem in the general communication model; this solves an open problem of Saks and Sun (in: Proceedings of the 34th Annual ACM Symposium on Theory of Computing (STOC), 2002, pp. 360-369). As a consequence, we show that for p>2, approximating the Lp norm to within a factor of n^ε in the data stream model with a constant number of passes requires Ω(n^(1-4ε-2/p)) space.
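As a hedged illustration of the statistical tool the lower-bound techniques build on (the example distributions are ours), the squared Hellinger distance between two discrete distributions is straightforward to compute:

```python
import math

# Squared Hellinger distance between discrete distributions p and q
# over the same support: h^2(p, q) = (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))^2.
# It is 0 for identical distributions and at most 1 for disjoint supports.
def hellinger_squared(p, q):
    return 0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

uniform = [0.25] * 4
point = [1.0, 0.0, 0.0, 0.0]
d2 = hellinger_squared(uniform, point)
print(round(d2, 3))  # 0.5
```

In the information-complexity arguments, such distances between transcript distributions on different inputs are what certify that a low-communication protocol cannot distinguish the cases it must distinguish.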
9.
Even though computing systems have increased the number of transistors, the switching speed, and the number of processors, most programs exhibit limited speedup due to the serial dependencies of existing algorithms. Analysis of intrinsically parallel systems such as brain circuitry has led to the identification of novel architecture designs, and also new algorithms that can exploit the features of modern multiprocessor systems. In this article we describe the details of a brain-derived vision (BDV) algorithm that is derived from the anatomical structure and physiological operating principles of thalamo-cortical brain circuits. We show that many characteristics of the BDV algorithm lend themselves to implementation on the IBM CELL architecture, and yield impressive speedups that equal or exceed the performance of specialized solutions such as FPGAs. Mapping this algorithm to the IBM CELL is non-trivial, and we suggest various approaches to deal with parallelism, task granularity, communication, and memory locality. We also show that a cluster of three PS3s (or more) containing IBM CELL processors provides a promising platform for brain-derived algorithms, exhibiting a speedup of more than 140× over a desktop PC implementation, and thus enabling real-time object recognition for robotic systems.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号