相似文献 (Similar Literature)
 20 similar records found (search time: 31 ms)
1.
This paper studies the interpolation error for the Hermite rational “Wachspress type” third-degree finite element constructed in [1]. We obtain results analogous to those for the “corresponding” ADINI (polynomial) finite element.

2.
This note corrects two errors that occurred during the typesetting of our paper “Axiomatisations of functional dependencies in the presence of records, lists, sets and multisets”, which appeared in Hartmann et al. [Axiomatisations of functional dependencies in the presence of records, lists, sets and multisets, Theoret. Comput. Sci. 353(2) (2006) 167–196].

3.
The first half is a tutorial on orderings, lattices, Boolean algebras, operators on Boolean algebras, Tarski's fixed point theorem, and relation algebras.

In the second half, elements of a complete relation algebra are used as “meanings” for program statements. The use of relation algebras for this purpose was pioneered by de Bakker and de Roever in [10–12]. For a class of programming languages with program schemes, single μ-recursion, while-statements, if-then-else, sequential composition, and nondeterministic choice, a definition of “correct interpretation” is given which properly reflects the intuitive (or operational) meanings of the program constructs. A correct interpretation includes for each program statement an element serving as “input/output relation” and a domain element specifying that statement's “domain of nontermination”. The derivative of Hitchcock and Park [17] is defined and a relation-algebraic version of the extension by de Bakker [8, 9] of the Hitchcock-Park theorem is proved. The predicate transformers wps(-) and wlps(-) are defined and shown to obey all the standard laws in [15]. The “law of the excluded miracle” is shown to hold for an entire language if it holds for that language's basic statements (assignment statements and so on). Determinism is defined and characterized for all the program constructs. A relation-algebraic version of the invariance theorem for while-statements is given. An alternative definition of interpretation, called “demonic”, is obtained by using “demonic union” in place of ordinary union, and “demonic composition” in place of ordinary relational composition. Such interpretations are shown to arise naturally from a special class of correct interpretations, and to obey the laws of wps(-).
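The “demonic composition” mentioned above can be illustrated concretely for finite relations represented as sets of pairs. This is a minimal sketch of the general idea, not the paper's relation-algebraic formalism: demonic composition keeps a pair (a, c) only when no choice available to the first relation can lead outside the domain of the second.

```python
def compose(R, S):
    """Ordinary (angelic) relational composition."""
    return {(a, c) for (a, b1) in R for (b2, c) in S if b1 == b2}

def demonic_compose(R, S):
    """Demonic composition: keep (a, c) only if every R-successor of a
    lies in the domain of S, so the 'demon' cannot steer into failure."""
    dom_S = {b for (b, _) in S}
    unsafe = {a for (a, b) in R if b not in dom_S}
    return {(a, c) for (a, c) in compose(R, S) if a not in unsafe}

# R maps 0 to either 1 or 2; S is only defined on 1.
R = {(0, 1), (0, 2)}
S = {(1, 9)}
print(compose(R, S))          # {(0, 9)}
print(demonic_compose(R, S))  # set() -- 0 may reach 2, where S fails
```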


4.
The goal of this paper is to offer a framework for classification of images and video according to their “type” or “style”, a problem which is hard to define but easy to illustrate: for example, identifying an artist by the style of his or her painting, or determining the activity in a video sequence. The paper offers a simple classification paradigm based on local properties of spatial or spatio-temporal blocks. The learning and classification are based on the naive Bayes classifier. A few experimental results are presented.
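As a rough illustration of the paradigm described (local block properties fed to a naive Bayes classifier), here is a toy sketch. The choice of feature (quantised mean intensity per block) and the add-one smoothing scheme are our own assumptions, not the paper's:

```python
import math
from collections import defaultdict

def block_features(img, k=2):
    """Quantised mean intensity of each k x k block (a crude local property)."""
    h, w = len(img), len(img[0])
    feats = []
    for i in range(0, h, k):
        for j in range(0, w, k):
            block = [img[a][b] for a in range(i, min(i + k, h))
                               for b in range(j, min(j + k, w))]
            feats.append(round(sum(block) / len(block)))
    return feats

def train(samples):
    """samples: list of (img, label). Returns per-class feature counts."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for img, label in samples:
        for f in block_features(img):
            counts[label][f] += 1
            totals[label] += 1
    return counts, totals

def classify(img, counts, totals):
    """Naive Bayes over block features, with add-one smoothing."""
    vocab = {f for c in counts.values() for f in c}
    best, best_lp = None, -math.inf
    for label in counts:
        lp = sum(math.log((counts[label][f] + 1) /
                          (totals[label] + len(vocab) + 1))
                 for f in block_features(img))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

dark, light = [[0, 0], [0, 0]], [[9, 9], [9, 9]]
counts, totals = train([(dark, "dark"), (light, "light")])
print(classify([[1, 0], [0, 1]], counts, totals))  # dark
```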

5.
In this paper we analyze a fundamental issue which directly impacts the scalability of current theoretical neural network models to applicative embodiments, in software as well as hardware. This pertains to the inherent and unavoidable concurrent asynchronicity of emerging fine-grained computational ensembles and the consequent chaotic manifestations in the absence of proper conditioning. The latter concern is particularly significant since the computational inertia of neural networks in general, and of our dynamical learning formalisms in particular, manifests itself substantially only in massively parallel hardware: optical, VLSI or opto-electronic. We introduce a mathematical framework for systematically reconditioning additive-type models and derive a neuro-operator, based on the chaotic relaxation paradigm, whose resulting dynamics are neither “concurrently” synchronous nor “sequentially” asynchronous. Necessary and sufficient conditions guaranteeing concurrent asynchronous convergence are established in terms of contracting operators. Lyapunov exponents are also computed to characterize the network dynamics and to ensure that throughput-limiting “emergent computational chaos” behavior in models reconditioned with concurrently asynchronous algorithms is eliminated.
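The core idea, that chaotic (concurrently asynchronous) relaxation still converges when the update operator is contracting, can be sketched on a linear fixed-point problem. The system, schedule, and sweep count below are illustrative assumptions, not the paper's neuro-operator:

```python
import random

def async_relax(A, b, sweeps=200, seed=0):
    """Chaotic relaxation for x = Ax + b: components are updated one at a
    time in a random order.  Convergence is guaranteed here because the
    update is contracting (max absolute row sum of A is below 1)."""
    rng = random.Random(seed)
    n = len(b)
    x = [0.0] * n
    for _ in range(sweeps):
        i = rng.randrange(n)                      # pick one component
        x[i] = sum(A[i][j] * x[j] for j in range(n)) + b[i]
    return x

A = [[0.0, 0.4], [0.3, 0.0]]   # contraction: row sums 0.4 and 0.3
b = [1.0, 1.0]
x = async_relax(A, b)
# unique fixed point: x0 = 0.4*x1 + 1 and x1 = 0.3*x0 + 1, i.e. (35/22, 65/44)
print(abs(x[0] - 35 / 22) < 1e-6, abs(x[1] - 65 / 44) < 1e-6)
```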

6.
This paper introduces the research work of the Performance Based Studies Research Group (PBSRG), the Alliance of Construction Excellence (ACE), and the Del E. Webb School of Construction (DEWSC) to apply a new evaluation/delivery mechanism for construction systems and increase their performance. The research is based on “fuzzy thinking” and the management of information. Concepts utilized include a “backward chaining” solution methodology and a relative distancing model to procure the best available facility systems. Participants in the program include general contractors, Job Order Contractors, mechanical contractors, roofing contractors, material suppliers and manufacturers, and facility owners and managers from Motorola, Honeywell, Morrison Knudson, IBM, and other major facilities in the Phoenix metropolitan area.

7.
This paper presents a finite element analysis of the in-plane bending behavior of elastic elbows. Rules and guidelines are presented for the systematic selection of the dimensions of the finite element meshes and for the interpretation of the numerical results. A simple asymptotic formula is presented that gives the dimensions of a so-called “optimum” (or “upper bound”) mesh as a function of the geometry of the elbow. This mesh, along with a companion one, the “lower bound” mesh, serves to establish the basis for selecting the range of mesh dimensions used in the convergence studies of the MARC finite element computations for stresses, displacements, and stress intensification and flexibility factors appropriate to a typical FFTF elbow. Reasonably good agreement is found in a comparison of the MARC results with those obtained from the ELBOW computer program, as well as with results predicted by Clark and Reissner's asymptotic formulas.

8.
This paper presents a case of introducing new technology to a single stage in a maintenance operation composed of a sequence of stages. The process, thermal tile replacement, is a low-volume, high-value operation. A method for data collection at each stage is presented, to estimate the variability in process quality, cost and duration. The method involves identifying key product features, an accuracy measure for each, the rate of product rejection by feature, and the associated probability density functions at each stage. The method relates accuracy variability by feature (the “effect”) to the contributing stage in the process (the “cause”). Simulation is used to justify the introduction of a new technology and to predict the percentage of product conformity in “before” and “after” scenarios for the implementation of the new technology. The simulation model enables the quantification of technology impact on product quality, overall productivity and the associated cost savings.

9.
This paper is an overview of recent developments in the construction of finite element interpolants which are C0-conforming on polygonal domains. In 1975, Wachspress proposed a general method for constructing finite element shape functions on convex polygons. Only recently has renewed interest in such interpolants surfaced in various disciplines including geometric modeling, computer graphics, and finite element computations. This survey focuses specifically on polygonal shape functions that satisfy the properties of barycentric coordinates: (a) they form a partition of unity and are non-negative; (b) they interpolate nodal data (Kronecker-delta property); (c) they are linearly complete or satisfy linear precision; and (d) they are smooth within the domain. We compare and contrast the construction and properties of various polygonal interpolants: Wachspress basis functions, mean value coordinates, the metric coordinate method, natural neighbor-based coordinates, and maximum entropy shape functions. Numerical integration of the Galerkin weak form on polygonal domains is discussed, and the performance of these polygonal interpolants on the patch test is studied.
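Of the interpolants surveyed, the Wachspress construction is simple enough to sketch directly. The code below is an illustrative implementation using the standard signed triangle-area formula for a point strictly inside a convex polygon, and checks two of the listed properties, partition of unity and linear precision:

```python
def tri_area(a, b, c):
    """Signed area of triangle (a, b, c)."""
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1]) -
                  (c[0] - a[0]) * (b[1] - a[1]))

def wachspress(poly, x):
    """Wachspress coordinates of x strictly inside convex polygon poly
    (vertices in counter-clockwise order)."""
    n = len(poly)
    w = []
    for i in range(n):
        prev, cur, nxt = poly[i - 1], poly[i], poly[(i + 1) % n]
        w.append(tri_area(prev, cur, nxt) /
                 (tri_area(prev, cur, x) * tri_area(cur, nxt, x)))
    s = sum(w)
    return [wi / s for wi in w]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
lam = wachspress(square, (0.25, 0.5))
print(round(sum(lam), 12))  # 1.0  (partition of unity)
# linear precision: the coordinates reproduce the point itself
print(round(sum(l * v[0] for l, v in zip(lam, square)), 12),
      round(sum(l * v[1] for l, v in zip(lam, square)), 12))  # 0.25 0.5
```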

10.
This paper presents a new class of algorithms based on Youden designs to detect and restore edges present in an image corrupted by mixture or “salt-and-pepper” noise. The mixture noise consists of an uncorrelated or correlated noisy background plus uncorrelated impulsive noise. The objective is to restore pixels affected by the impulsive part of the mixture noise. The approach is to consider that these pixels have lost their true value; their estimate is obtained via the normal equation that yields the least sum of square error (LSSE). This procedure is known in the literature as the “missing value approach” problem. The estimates are introduced into the image data and an ANOVA technique based on the Youden design is carried out. We introduce Youden designs, which are special Symmetric Balanced Incomplete Block (SBIB) designs, along with the pertinent statistical tests and estimates of the factor effects. We derive the estimate of the missing value for the uncorrelated noise environment as well as for the correlated one. The high level of performance of these algorithms can be evaluated visually via the input/output images and objectively via the input/output signal-to-noise ratio (SNR).
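The missing-value idea (treat impulse-corrupted pixels as lost and refit them by least squares) can be caricatured with a locally constant model, under which the LSSE estimate reduces to the mean of the valid neighbours. This toy stand-in omits the Youden-design ANOVA entirely; the extreme-value impulse test is also an assumption:

```python
def restore_impulses(img, low=0, high=255):
    """Treat extreme-valued pixels as 'missing' and refit each with the
    least-squares estimate under a locally constant model, i.e. the mean
    of its valid 8-neighbours.  (Toy stand-in for the Youden-design ANOVA.)"""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] in (low, high):
                nb = [img[a][b]
                      for a in range(max(0, i - 1), min(h, i + 2))
                      for b in range(max(0, j - 1), min(w, j + 2))
                      if (a, b) != (i, j) and img[a][b] not in (low, high)]
                if nb:                      # leave pixel if no valid neighbour
                    out[i][j] = sum(nb) / len(nb)
    return out

noisy = [[100, 100, 100],
         [100, 255, 100],      # impulsive 'salt' pixel in the centre
         [100, 100, 100]]
print(restore_impulses(noisy)[1][1])  # 100.0
```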

11.
The prevailing wisdom, and a common “best practice,” of knowledge management (KM) is that a primary determinant of success in getting people to submit their most valuable personal knowledge to a repository is the existence of a “knowledge culture” in the organization.

12.
By manipulating the imaginary part of the complex-analytic quadratic equation, we obtain, by iteration, a set that we call the “burning ship” because of its appearance. It is an analogue of the Mandelbrot set (M-set). Its nonanalytic “quasi”-Julia sets yield surprising graphical shapes.
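The iteration behind the “burning ship” set is the quadratic map applied after taking absolute values of the real and imaginary parts. A minimal escape-time sketch, with an arbitrary iteration cap and the usual escape radius of 2:

```python
def burning_ship(c, max_iter=100):
    """Escape time of the map z -> (|Re z| + i|Im z|)^2 + c, started at 0."""
    z = 0j
    for n in range(max_iter):
        z = complex(abs(z.real), abs(z.imag)) ** 2 + c
        if abs(z) > 2:
            return n          # escaped: c is outside the set
    return max_iter           # never escaped: c assumed inside

print(burning_ship(0 + 0j))      # 100  (the origin stays bounded)
print(burning_ship(2 + 2j) < 5)  # True (far points escape almost at once)
```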

13.
A method for graphic stress representation (cited by 5: 0 self-citations, 5 by others)
Stress representation by coloured patterns and contours through the use of the finite element method is presented in this paper. Stresses are computed in the center of each subregion defined in the element surface being displayed. These stresses define a matrix of codes related to the “stress bands” which must be represented. This matrix is used to “paint” every subregion and to generate contours defined by points located between two adjacent subregions of different codes. Some examples are presented to show the effectiveness of the method.
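The matrix of “stress band” codes described above amounts to a thresholding step over the per-subregion stresses; contour points then lie between adjacent subregions whose codes differ. The band limits below are arbitrary example values:

```python
def band_codes(stresses, bands):
    """Map each subregion's centre stress to the index of its 'stress band':
    the code counts how many band limits the stress has passed."""
    return [sum(1 for b in bands if s >= b) for s in stresses]

# band limits at 10 and 20 split these stresses into codes 0, 1 and 2;
# a contour would be drawn between each pair of neighbours with unequal codes
print(band_codes([5.0, 12.0, 25.0], [10.0, 20.0]))  # [0, 1, 2]
```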

14.
The Inner Graphic Formula (IGF) Method, originally conceived by Professor Ishiketa and further developed by him and his associates, was used to investigate the motivation of new company employees.

Japanese companies traditionally recruit new employees from senior classes and notify successful candidates of their intention to employ them around the first of January. Since graduation is in March, April first is, then, the first day of work for almost all of these graduates in their new companies.

The investigation period for this study covers the eleven months from January until the middle of November, and therefore includes the three-month period after notification but prior to actual work, from January first until March thirty-first, and the first eight months of actual work, from April first to the middle of November. The subjects fell naturally into two groups: a “Blue Collar” group and a “White Collar” group.

This paper deals with the motivation of these newly employed workers in general and, specifically, with the difference in motivational tendencies between “Blue Collar” and “White Collar” workers. As expected, the analysis showed clear motivational differences.

Motivation in the white collar workers tended to rise after an initial downturn, while a general downward trend was detected for the blue collar workers. White collar workers' attitudes toward themselves and toward their work seemed to change for the better as a result of having the chance to become introspective while plotting the graph and writing the anecdotal responses needed to complete the investigative sheet for this study.


15.
It is shown that in first-order linear-time temporal logic, validity questions can be translated into validity questions of formulas not containing “next” or “until” operators. The translation can be performed in linear time.

16.
In a companion paper, we presented an interval logic, and showed that it is elementarily decidable. In this paper we extend the logic to allow reasoning about real-time properties of concurrent systems; we call this logic real-time future interval logic (RTFIL). We model time by the real numbers, and allow our syntax to state the bounds on the duration of an interval. RTFIL possesses the “real-time interpolation property,” which appears to be the natural quantitative counterpart of invariance under finite stuttering. As the main result of this paper, we show that RTFIL is decidable; the decision algorithm is slightly more expensive than for the untimed logic. Our decidability proof is based on the reduction of the satisfiability problem for the logic to the emptiness problem for timed Büchi automata. The latter problem was shown decidable by Alur and Dill in a landmark paper, in which this real-time extension of ω-automata was introduced. Finally, we consider an extension of the logic that allows intervals to be constructed by means of “real-time offsets”, and show that even this simple extension renders the logic highly undecidable.

17.
The history of schema languages for XML is (roughly) an increase of expressiveness. While early schema languages mainly focused on the element structure, Clark first paid equal attention to attributes by allowing both element and attribute constraints in a single constraint expression (we call his mechanism “attribute–element constraints”). In this paper, we investigate intersection and difference operations and inclusion test for attribute–element constraints, in view of their importance in static typechecking for XML processing programs. The contributions here are (1) proofs of closure under intersection and difference as well as decidability of inclusion test and (2) algorithm formulations incorporating a “divide-and-conquer” strategy for avoiding an exponential blow-up for typical inputs.

18.
The finite element method has become a powerful tool for computation of stress intensity factors in fracture mechanics. The simulation of singular behavior in the stress field is accomplished using “quarter points”, following the methods of Barsoum [1] and Henshell-Shaw [2]. The analysis has also been extended to cubic elements [3] and transition elements [4]. However, these concepts cannot be easily extended to three-dimensional cases without additional conditions. Progress has been hampered, first, by the variety of possible shapes the element may possess near the singular edge of the crack, and second, by the complexity of the algebraic expressions that have to be manipulated.

In the present investigation we extensively used MACSYMA [5], a large symbolic manipulation program at MIT, thereby alleviating some of these difficulties. A simple condition between mid-side nodes has been derived which simulates the proper singular behavior along the crack.

In the investigation we first study a simple collapsed brick element. This is then generalized to a curved crack front. A few results are derived which can be used to compute the stress intensity factors. The concept of the transitional element has also been outlined. The stability of the singular element has been discussed. Some of these ideas have been applied to a specific problem with unusual crack geometry. The analysis was carried out using ADINA on a VAX machine. ADINA was implemented on VAX by W. E. Lorensen.


19.
The class of interpolatory-Newton iterations is defined and analyzed for the computation of a simple zero of a non-linear operator in a Banach space of finite or infinite dimension. Convergence of the class is established.

The concepts of “informationally optimal class of algorithms” and “optimal algorithm” are formalized. For the multivariate case, the optimality of Newton iteration is established in the class of one-point iterations under an “equal cost assumption”.
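For the scalar case, the Newton iteration whose optimality is discussed can be sketched directly. The tolerance, starting point, and example function below are arbitrary choices for illustration:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method for a simple zero: x_{k+1} = x_k - f(x_k)/f'(x_k).
    Stops when the step size drops below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# the square root of 2 as the positive zero of f(x) = x^2 - 2
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(abs(root - 2 ** 0.5) < 1e-9)  # True
```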


20.
In this work we develop an a posteriori error estimation for the finite element simulation of an advection-diffusion problem with anisotropic meshes. The error analysis is carried out by exploiting some recent interpolation error estimates for anisotropic meshes [7, 8] within the a posteriori analysis framework proposed in [2, 3, 4, 16]. The target application is the transport of a solute (such as oxygen or lipids) by the blood stream in a large artery. Received: 30 January 2001 / Accepted: 30 May 2001
