Similar documents: 20 results found (search time: 406 ms)
1.
This paper concerns the construction of a quadrilateral finite element whose interpolation space admits rational fractions of "Wachspress type" [1, 2] as basis functions. The construction of this finite element, which is in a sense the "rational" equivalent of the ADINI finite element [3, 4], is based on a method analogous to the one used for the Serendip degree-two finite element construction in [2]. The interpolation error is studied in a companion paper by Apprato, Arcangeli and Gout in this journal, "Rational interpolation of Wachspress error estimates".

2.
By manipulating the imaginary part of the complex-analytic quadratic equation, we obtain, by iteration, a set that we call the "burning ship" because of its appearance. It is an analogue of the Mandelbrot set (M-set). Its nonanalytic "quasi"-Julia sets yield surprising graphical shapes.

3.
The prevailing wisdom, and a common "best practice," of knowledge management (KM) is that a primary determinant of success in getting people to submit their most valuable personal knowledge to a repository is the existence of a "knowledge culture" in the organization.

4.
It is shown that in first-order linear-time temporal logic, validity questions can be translated into validity questions of formulas not containing "next" or "until" operators. The translation can be performed in linear time.

5.
In this paper we analyze a fundamental issue which directly impacts the scalability of current theoretical neural network models to applicative embodiments, in both software and hardware. This pertains to the inherent and unavoidable concurrent asynchronicity of emerging fine-grained computational ensembles and the consequent chaotic manifestations in the absence of proper conditioning. The latter concern is particularly significant since the computational inertia of neural networks in general, and of our dynamical learning formalisms in particular, manifests itself substantially only in massively parallel hardware, whether optical, VLSI or opto-electronic. We introduce a mathematical framework for systematically reconditioning additive-type models and derive a neuro-operator, based on the chaotic relaxation paradigm, whose resulting dynamics are neither "concurrently" synchronous nor "sequentially" asynchronous. Necessary and sufficient conditions guaranteeing concurrent asynchronous convergence are established in terms of contracting operators. Lyapunov exponents are also computed to characterize the network dynamics and to verify that throughput-limiting "emergent computational chaos" behavior is eliminated in models reconditioned with concurrently asynchronous algorithms.

6.
This paper presents a finite element analysis of the in-plane bending behavior of elastic elbows. Rules and guidelines are presented for the systematic selection of the dimensions of the finite element meshes and for the interpretation of the numerical results. A simple asymptotic formula is presented that gives the dimensions of a so-called "optimum" (or "upper bound") mesh as a function of the geometry of the elbow. This mesh, along with a companion "lower bound" mesh, serves to establish the basis for selecting the range of mesh dimensions used in the convergence studies of the MARC finite element computations for stresses, displacements, and stress intensification and flexibility factors appropriate to a typical FFTF elbow. Reasonably good agreement is found in a comparison of the MARC results with those obtained from the ELBOW computer program, as well as with results predicted by Clark and Reissner's asymptotic formulas.

7.
The goal of this paper is to offer a framework for classification of images and video according to their "type" or "style", a problem which is hard to define but easy to illustrate; for example, identifying an artist by the style of his or her painting, or determining the activity in a video sequence. The paper offers a simple classification paradigm based on local properties of spatial or spatio-temporal blocks. The learning and classification are based on the naive Bayes classifier. A few experimental results are presented.
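The block-based naive Bayes paradigm described in this abstract can be sketched roughly as follows. This is a minimal illustration under assumed details: the feature choice (per-block mean and standard deviation), the Gaussian class model, and all names are illustrative, not taken from the paper.

```python
import numpy as np

def block_features(img, k=8):
    """Split a grayscale image into k x k blocks and return a simple
    local-property descriptor (mean, std) for each block."""
    h, w = img.shape
    feats = []
    for i in range(0, h - k + 1, k):
        for j in range(0, w - k + 1, k):
            b = img[i:i + k, j:j + k]
            feats.append((b.mean(), b.std()))
    return np.array(feats)

class NaiveBayesStyle:
    """Gaussian naive Bayes over block descriptors; an image is assigned
    the class maximizing the summed log-likelihood of all its blocks."""
    def fit(self, feats_by_class):
        # Per-class mean/std of each feature dimension (small floor on std).
        self.params = {c: (f.mean(0), f.std(0) + 1e-6)
                       for c, f in feats_by_class.items()}
        return self

    def log_lik(self, feats, c):
        mu, sd = self.params[c]
        z = (feats - mu) / sd
        return -0.5 * (z ** 2 + np.log(2 * np.pi * sd ** 2)).sum()

    def predict(self, feats):
        return max(self.params, key=lambda c: self.log_lik(feats, c))
```

For instance, two synthetic "styles" (a smooth texture and a noisy one) can be told apart because their block descriptors concentrate in different regions of feature space.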

8.
This paper presents a case of introducing new technology to a single stage of a maintenance operation composed of a sequence of stages. The process, thermal tile replacement, is a low-volume, high-value operation. A method for data collection at each stage, to estimate the variability in process quality, cost and duration, is presented. The method involves identifying key product features, an accuracy measure for each, the rate of product rejection by feature, and the associated probability density functions at each stage. The method relates accuracy variability by feature (the "effect") to the contributing stage in the process (the "cause"). Simulation is used to justify the introduction of a new technology and to predict the percentage of product conformity in "before" and "after" scenarios for the implementation of the new technology. The simulation model enables the quantification of the technology's impact on product quality, overall productivity and the associated cost savings.

9.
A method for graphic stress representation (cited by: 5)
Stress representation by coloured patterns and contours through the use of the finite element method is presented in this paper. Stresses are computed in the center of each subregion defined in the element surface being displayed. These stresses define a matrix of codes related to the "stress bands" which must be represented. This matrix is used to "paint" every subregion and to generate contours defined by points located between two adjacent subregions of different codes. Some examples are presented to show the effectiveness of the method.
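The band-coding step the abstract describes, mapping each subregion's stress to an integer "stress band" code, might be sketched as below. The linear normalization and the function name are assumptions for illustration; the paper does not specify how bands are delimited.

```python
import numpy as np

def stress_band_codes(stresses, n_bands, smin=None, smax=None):
    """Map stresses (computed at subregion centers) to integer band
    codes 0..n_bands-1. Contours are then traced between adjacent
    subregions whose codes differ."""
    smin = stresses.min() if smin is None else smin
    smax = stresses.max() if smax is None else smax
    codes = np.floor((stresses - smin) / (smax - smin) * n_bands).astype(int)
    return np.clip(codes, 0, n_bands - 1)  # top of range stays in last band
```

Each code then indexes a colour in a palette, so "painting" a subregion is a single table lookup.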

10.
The class of interpolatory Newton iterations is defined and analyzed for the computation of a simple zero of a non-linear operator in a Banach space of finite or infinite dimension. Convergence of the class is established.

The concepts of an "informationally optimal class of algorithms" and an "optimal algorithm" are formalized. For the multivariate case, the optimality of Newton iteration is established within the class of one-point iterations under an "equal cost assumption".
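The basic one-point Newton iteration that this class generalizes can be sketched in the finite-dimensional case as follows (a minimal sketch; the stopping rule and function names are illustrative, not from the paper):

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """One-point Newton iteration x_{k+1} = x_k - J(x_k)^{-1} F(x_k)
    for a simple zero of F : R^n -> R^n; converges quadratically
    near a root where J is nonsingular."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(np.atleast_2d(J(x)), np.atleast_1d(F(x)))
        x = x - dx
        if np.linalg.norm(dx) < tol:
            break
    return x
```

For example, solving x0² + x1² = 1 with x0 = x1 converges in a handful of steps to (√2/2, √2/2) from a nearby starting point.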


11.
The Inner Graphic Formula Method (IGF), which was originally conceived by Professor Ishiketa and further developed by him and his associates, was used to investigate the motivation of new company employees.

Japanese companies traditionally recruit new employees from senior classes and notify successful candidates of their intention to employ them around the first of January. Since graduation is in March, April first is, then, the first day of work for almost all of these graduates in their new companies.

The investigation period for this study covers the eleven months from January until the middle of November, and therefore includes the three-month period after notification but prior to actual work, from January first until March thirty-first, and the first eight months of actual work, from April first to the middle of November. The subjects fell naturally into two groups: a "Blue Collar" group and a "White Collar" group.

This paper deals with the motivation of these newly employed workers in general and, specifically, with the difference in motivational tendencies between "Blue Collar" and "White Collar" workers. As expected, the analysis showed clear motivational differences.

Motivation in the white collar workers tended to rise after an initial downturn, while a general downward trend was detected for the blue collar workers. White collar workers' attitudes toward themselves and toward their work seemed to change for the better as a result of having the chance to become introspective while plotting the graph and writing the anecdotal responses needed to complete the investigative sheet for this study.


12.
Information Systems, 1989, 14(6): 443-453
Using a fuzzy-logic-based calculus of linguistically quantified propositions, we present FQUERY III+, a new, more "human-friendly" and easier-to-use implementation of a querying scheme proposed originally by Kacprzyk and Ziółkowski to handle imprecise queries including a linguistic quantifier, e.g. find all records in which most (almost all, much more than 75%, … or any other linguistic quantifier) of the important attributes (out of a specified set) are as desired (e.g. equal to five, more than 10, large, more or less equal to 15, etc.). FQUERY III+ is an "add-on" to Ashton-Tate's dBase III Plus.
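The calculus of linguistically quantified propositions that such queries rest on can be sketched as follows. This is a simplified illustration, not FQUERY III+'s actual implementation: the piecewise-linear membership for "most" and the weighted-mean aggregation are one common choice in this calculus, and the function names are assumptions.

```python
def mu_most(r):
    """Membership of the linguistic quantifier 'most': fully true above
    80% agreement, fully false below 30%, linear in between (an
    illustrative choice of breakpoints)."""
    return min(1.0, max(0.0, (r - 0.3) / 0.5))

def query_truth(matches, importances):
    """Degree to which 'most of the important attributes are as desired'
    holds for one record: the importance-weighted mean of per-attribute
    match degrees, passed through the quantifier."""
    r = sum(m * w for m, w in zip(matches, importances)) / sum(importances)
    return mu_most(r)
```

A record matching 3 of 4 equally important attributes then satisfies the query to degree 0.9, while one matching almost nothing scores 0, so records can be ranked rather than merely accepted or rejected.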

13.
Solutions of problems by the finite element method, when curved boundaries are present in the model, may not be accurate. Such a difficulty arises when straight-line elements are used to approximate the curved boundary. This behavior is known in the literature as the "Babuška paradox". Despite the fact that the problem has been recognized since the mid-1960s, and methods to overcome it have been used quite successfully, many textbooks still ignore it. Here, this "paradox" is demonstrated on plane-stress problems for which analytical results exist. One known method (the isoparametric element) is used to show how to overcome these difficulties.

14.
The threat of cyberattacks motivates the need to monitor Internet traffic data for potentially abnormal behavior. Due to the enormous volumes of such data, statistical process monitoring tools, such as those traditionally used on data in the product manufacturing arena, are inadequate. "Exotic" data may indicate a potential attack; detecting such data requires a characterization of "typical" data. We devise some new graphical displays, including a "skyline plot," that permit ready visual identification of unusual Internet traffic patterns in "streaming" data, and use appropriate statistical measures to help identify potential cyberattacks. These methods are illustrated on a moderate-sized data set (135,605 records) collected at George Mason University.

15.
This paper presents a new class of algorithms based on Youden designs to detect and restore edges present in an image corrupted by mixture or "salt and pepper" noise. The mixture noise consists of an uncorrelated or correlated noisy background plus uncorrelated impulsive noise. The objective is to restore pixels affected by the impulsive part of the mixture noise. The approach is to consider that these pixels have lost their true value; their estimate is obtained via the normal equation that yields the least sum of squared errors (LSSE). This procedure is known in the literature as the "missing value approach". The estimates are introduced into the image data and an ANOVA technique based on the Youden design is carried out. We introduce Youden designs, which are special Symmetric Balanced Incomplete Block (SBIB) designs, along with the pertinent statistical tests and estimates of the factor effects. We derive the estimate of the missing value for the uncorrelated noise environment as well as for the correlated one. The high level of performance of these algorithms can be evaluated visually via the input/output images and objectively via the input/output signal-to-noise ratio (SNR).

16.
This article examines how healthcare professionals experience an Electronic Patient Record (EPR) adoption process. Based on a case study from two surgical wards in Danish hospitals, we analyze the healthcare professionals' conceptions of the technology, how it relates to their professional roles, and aspects of the implementation process from a "sensemaking" perspective.

17.
This article provides a brief tutorial of Wiki technology as a collaborative tool. A case example from a university administration context suggests that, like many other end-user technologies, training and support needs should be carefully considered before the potential value of using this "free" technology to support knowledge management efforts can be satisfactorily realized.

18.
The history of schema languages for XML is (roughly) one of increasing expressiveness. While early schema languages mainly focused on the element structure, Clark was the first to pay equal attention to attributes by allowing both element and attribute constraints in a single constraint expression (we call his mechanism "attribute–element constraints"). In this paper, we investigate intersection and difference operations and the inclusion test for attribute–element constraints, in view of their importance in static typechecking for XML processing programs. The contributions here are (1) proofs of closure under intersection and difference as well as decidability of the inclusion test and (2) algorithm formulations incorporating a "divide-and-conquer" strategy for avoiding an exponential blow-up on typical inputs.

19.
This paper introduces the research work of the Performance Based Studies Research Group (PBSRG), the Alliance of Construction Excellence (ACE), and the Del E. Webb School of Construction (DEWSC) to apply a new evaluation/delivery mechanism for construction systems in order to increase their performance. The research is based on "fuzzy thinking" and the management of information. Concepts utilized include a "backward chaining" solution methodology and a relative distancing model to procure the best available facility systems. Participants in the program include general contractors, Job Order Contractors, mechanical contractors, roofing contractors, material suppliers and manufacturers, and facility owners and managers from Motorola, Honeywell, Morrison Knudsen, IBM, and other major facilities in the Phoenix metropolitan area.

20.
The compound strip method is illustrated for the analysis of slab-girder bridges modeled as a linear elastic plate continuous over deflecting supports. This approach incorporates the effects of support elements in a direct stiffness methodology by creating a substructure composed of plate, beam, and column elements, which is termed a "compound strip." The theory and application of the compound strip method are presented. The finite element and compound strip methods are compared in an illustrative analysis of a slab-girder bridge, and the results of the compound strip analysis compare well with the finite element method. The methodology presented herein can be used to efficiently model any slab-girder bridge configuration. Typically, the compound strip method requires significantly less computational resources than the finite element method and is well suited for use on today's microcomputers.
