Similar Documents
20 similar documents found (search time: 991 ms).
1.
In this paper we analyze a fundamental issue that directly impacts the scalability of current theoretical neural network models to practical embodiments, in both software and hardware. It pertains to the inherent and unavoidable concurrent asynchronicity of emerging fine-grained computational ensembles, and to the consequent chaotic manifestations in the absence of proper conditioning. The latter concern is particularly significant, since the computational inertia of neural networks in general, and of our dynamical learning formalisms in particular, manifests itself substantially only in massively parallel hardware: optical, VLSI or opto-electronic. We introduce a mathematical framework for systematically reconditioning additive-type models and derive a neuro-operator, based on the chaotic relaxation paradigm, whose resulting dynamics are neither “concurrently” synchronous nor “sequentially” asynchronous. Necessary and sufficient conditions guaranteeing concurrent asynchronous convergence are established in terms of contracting operators. Lyapunov exponents are also computed to characterize the network dynamics and to verify that throughput-limiting “emergent computational chaos” is eliminated in models reconditioned with concurrently asynchronous algorithms.
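The convergence guarantee in the abstract rests on contraction: under asynchronous (“chaotic”) updates, a contracting map still reaches its unique fixed point regardless of update order. The sketch below illustrates that idea only; the affine map, the weights, and the random update schedule are illustrative assumptions, not the paper's neuro-operator.

```python
import random

def async_relax(W, b, x0, sweeps=200, seed=0):
    """Asynchronous ("chaotic") relaxation for the affine map x <- W x + b.

    Components are updated one at a time, in a random order each sweep,
    and every update uses the most recent values of the other components.
    If the map is a contraction (max absolute row sum of W below 1), the
    iteration converges to the unique fixed point for ANY update order.
    """
    rng = random.Random(seed)
    n = len(x0)
    x = list(x0)
    for _ in range(sweeps):
        for i in rng.sample(range(n), n):   # random component order
            x[i] = sum(W[i][j] * x[j] for j in range(n)) + b[i]
    return x

# Illustrative 2-unit additive network with contracting weights.
W = [[0.2, 0.3],
     [0.1, 0.4]]
b = [1.0, 2.0]
x = async_relax(W, b, [0.0, 0.0])   # fixed point of x = W x + b is (8/3, 34/9)
```

Rerunning with a different seed changes the update order but not the limit, which is the point of the contraction condition.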

2.
3.
This paper concerns the construction of a quadrilateral finite element whose interpolation space admits rational fractions of “Wachspress type” [1, 2] as basis functions. The construction of this finite element, which is in a way the “rational” equivalent of the ADINI finite element [3, 4], is founded on a method analogous to the one used for the Serendip degree-two finite element construction in [2]. The study of the interpolation error is dealt with in a paper by Apprato, Arcangeli and Gout in this journal, “Rational interpolation of Wachspress error estimates”.

4.
This paper presents a new class of algorithms, based on Youden designs, to detect and restore edges in an image corrupted by mixture or “salt and pepper” noise. The mixture noise consists of an uncorrelated or correlated noisy background plus uncorrelated impulsive noise. The objective is to restore pixels affected by the impulsive part of the mixture noise. The approach is to consider that these pixels have lost their true value; their estimate is obtained via the normal equation that yields the least sum of squared errors (LSSE). This procedure is known in the literature as “The Missing Value Approach Problem”. The estimates are introduced into the image data and an ANOVA technique based on the Youden design is carried out. We introduce Youden designs, which are special symmetric balanced incomplete block (SBIB) designs, along with the pertinent statistical tests and estimates of the factor effects. We derive the estimate of the missing value for the uncorrelated noise environment as well as for the correlated one. The high level of performance of these algorithms can be evaluated visually via the input/output images and objectively via the input/output signal-to-noise ratio (SNR).
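As a minimal illustration of the “missing value” idea (not the paper's Youden-design ANOVA estimator), the sketch below flags extreme-valued pixels as impulses and replaces each with the least-squares estimate given its valid 8-neighbours; under a locally constant image model that estimate is simply their mean. The function name, the threshold rule, and the neighbourhood model are all assumptions for illustration.

```python
def restore_impulses(img, lo=0, hi=255):
    """Replace pixels hit by impulsive ("salt and pepper") noise.

    A pixel at an extreme value (lo or hi) is treated as having lost its
    true value.  Under a locally constant image model, the value that
    minimizes the sum of squared errors against the valid 8-neighbours
    is their mean, which is used as the restored estimate.
    """
    h, w = len(img), len(img[0])
    bad = {(r, c) for r in range(h) for c in range(w) if img[r][c] in (lo, hi)}
    out = [row[:] for row in img]
    for r, c in bad:
        vals = [img[rr][cc]
                for rr in range(max(0, r - 1), min(h, r + 2))
                for cc in range(max(0, c - 1), min(w, c + 2))
                if (rr, cc) != (r, c) and (rr, cc) not in bad]
        if vals:                      # leave the pixel if no valid neighbour
            out[r][c] = sum(vals) / len(vals)
    return out

restored = restore_impulses([[10, 20, 30],
                             [40, 255, 60],
                             [70, 80, 90]])   # centre pixel is an impulse
```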

5.
A new semantic-based video scene retrieval method is proposed in this paper. Twelve low-level features extracted from a video clip are represented in a genetic chromosome, and the target videos that the user has in mind are retrieved by an interactive genetic algorithm through feedback iterations. In this procedure, high-level semantic relevance between retrieved videos is accumulated in a semantic relevance matrix and a semantic frequency matrix at each iteration, and these are combined with an automatic feature-weight update scheme to retrieve more target videos at the next iteration. Experiments over 300 movie scene clips extracted from recent well-known movies showed a user satisfaction of 0.71 at the fourth iteration for eight queries: “gloominess”, “happiness”, “quietness”, “action”, “conversation”, “explosion”, “war”, and “car chase”.

6.
This paper presents a case of introducing new technology to a single stage in a maintenance operation composed of a sequence of stages. The process, thermal tile replacement, is a low-volume, high-value operation. A method for data collection at each stage, to estimate the variability in process quality, cost and duration, is presented. The method involves identifying key product features, an accuracy measure for each, the rate of product rejection by feature, and the associated probability density functions at each stage. The method relates accuracy variability by feature (the “effect”) to the contributing stage in the process (the “cause”). Simulation is used to justify the introduction of a new technology and to predict the percentage of product conformity in “before” and “after” scenarios for its implementation. The simulation model enables quantification of the technology's impact on product quality, overall productivity and the associated cost savings.
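The “before”/“after” comparison can be sketched with a simple Monte Carlo model of a serial process. The stage structure and the rejection rates below are illustrative assumptions, not the paper's tile-replacement data; the sketch only shows how improving one stage propagates to end-to-end conformity.

```python
import random

def conformity(stage_reject_probs, n=100_000, seed=1):
    """Monte Carlo estimate of end-to-end conformity for a serial process:
    a unit conforms only if it independently passes every stage."""
    rng = random.Random(seed)
    ok = sum(1 for _ in range(n)
             if all(rng.random() >= p for p in stage_reject_probs))
    return ok / n

# Hypothetical rejection rates per stage; the middle stage is the one
# receiving the new technology.
before = conformity([0.05, 0.20, 0.02])   # old technology: 20% rejects
after  = conformity([0.05, 0.05, 0.02])   # new technology:  5% rejects
```

The analytical value for the “before” case is 0.95 × 0.80 × 0.98, so the simulated estimate can be checked directly against it.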

7.
This paper studies the interpolation error for the Hermite rational “Wachspress type” third-degree finite element constructed in [1]. We obtain results analogous to those for the “corresponding” ADINI (polynomial) finite element.

8.
It is shown that in first-order linear-time temporal logic, validity questions can be translated into validity questions of formulas not containing “next” or “until” operators. The translation can be performed in linear time.

9.
By manipulating the imaginary part of the complex-analytic quadratic equation we obtain, by iteration, a set that we call the “burning ship” because of its appearance. It is an analogue of the Mandelbrot set (M-set). Its nonanalytic “quasi”-Julia sets yield surprising graphical shapes.
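The “manipulation of the imaginary part” referred to here is the well-known burning-ship iteration, which takes absolute values of both parts of z before squaring. A minimal escape-time sketch (the parameter names, iteration cap and bailout radius are conventional choices, not from the paper):

```python
def burning_ship(cx, cy, max_iter=100, bailout=2.0):
    """Escape-time count for the "burning ship" map.

    Unlike the Mandelbrot iteration z <- z**2 + c, the real and imaginary
    parts of z are replaced by their absolute values before squaring:
    z <- (|Re z| + i |Im z|)**2 + c.
    """
    zx = zy = 0.0
    for n in range(max_iter):
        zx, zy = abs(zx), abs(zy)               # the non-analytic step
        zx, zy = zx * zx - zy * zy + cx, 2.0 * zx * zy + cy
        if zx * zx + zy * zy > bailout * bailout:
            return n                            # escaped: outside the set
    return max_iter                             # assumed inside
```

Mapping the returned count to a colour per pixel of the c-plane produces the familiar ship-shaped image.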

10.
This note corrects two errors that occurred during the typesetting of our paper “Axiomatisations of functional dependencies in the presence of records, lists, sets and multisets”, which appeared in Hartmann et al. [Axiomatisations of functional dependencies in the presence of records, lists, sets and multisets, Theoret. Comput. Sci. 353(2) (2006) 167–196].

11.
The prevailing wisdom, and a common “best practice,” of knowledge management (KM) is that a primary determinant of success in getting people to submit their most valuable personal knowledge to a repository is the existence of a “knowledge culture” in the organization.

12.
This paper introduces the research work of the Performance Based Studies Research Group (PBSRG), the Alliance of Construction Excellence (ACE), and the Del E. Webb School of Construction (DEWSC) to apply a new evaluation/delivery mechanism that increases the performance of construction systems. The research is based on “fuzzy thinking” and the management of information. Concepts utilized include a “backward chaining” solution methodology and a relative distancing model to procure the best available facility systems. Participants in the program include general contractors, job order contractors, mechanical contractors, roofing contractors, material suppliers and manufacturers, and facility owners and managers from Motorola, Honeywell, Morrison Knudson, IBM, and other major facilities in the Phoenix metropolitan area.

13.
The Inner Graphic Formula Method (IGF), originally conceived by Professor Ishiketa and further developed by him and his associates, was used to investigate the motivation of new company employees.

Japanese companies traditionally recruit new employees from senior classes and notify successful candidates of their intention to employ them around the first of January. Since graduation is in March, April first is then the first day of work for almost all of these graduates at their new companies.

The investigation period for this study covers the eleven months from January until the middle of November. It therefore includes the three-month period after notification but prior to actual work (January first until March thirty-first) and the first eight months of actual work (April first to the middle of November). The subjects fell naturally into two groups: a “Blue Collar” group and a “White Collar” group.

This paper deals with the motivation of these newly employed workers in general and, specifically, with the difference in motivational tendencies between “Blue Collar” and “White Collar” workers. As expected, the analysis showed clear motivational differences.

Motivation among the white collar workers tended to rise after an initial downturn, while a general downward trend was detected for the blue collar workers. White collar workers' attitudes toward themselves and toward their work seemed to change for the better as a result of having the chance to become introspective while plotting the graph and writing the anecdotal responses needed to complete the investigative sheet for this study.


14.
Existing search engines, with Google at the top, have many remarkable capabilities; but what is not among them is deduction capability: the capability to synthesize an answer to a query from bodies of information residing in various parts of the knowledge base.

In recent years, impressive progress has been made in enhancing the performance of search engines through methods based on bivalent logic and bivalent-logic-based probability theory. But can such methods be used to add nontrivial deduction capability to search engines, that is, to upgrade them to question-answering systems? The view articulated in this note is that the answer is “No.” The problem is rooted in the nature of world knowledge, the kind of knowledge that humans acquire through experience and education.

It is widely recognized that world knowledge plays an essential role in assessment of relevance, summarization, search and deduction. But a basic issue which is not addressed is that much of world knowledge is perception-based, e.g., “it is hard to find parking in Paris,” “most professors are not rich,” and “it is unlikely to rain in midsummer in San Francisco.” The problem is that (a) perception-based information is intrinsically fuzzy; and (b) bivalent logic is intrinsically unsuited to deal with fuzziness and partial truth.

To come to grips with the fuzziness of world knowledge, new tools are needed. The principal new tool, briefly described in this note, is Precisiated Natural Language (PNL). PNL is based on fuzzy logic and has the capability to deal with partiality of certainty, partiality of possibility and partiality of truth. These are the capabilities needed to draw on world knowledge for assessment of relevance, and for summarization, search and deduction.


15.
This paper describes an information system used to search for a potential matrimonial partner. The search is based on comparing the subject's record, which consists of his/her answers to about 400 items of a specially designed questionnaire, with the records of potential partners. The basic principle of the system is to present the client with a set of candidates together with psychological warnings about potential “conflict zones” in the relationship between client and candidate, rather than a ranking of candidates based on hypothetical “psychological compatibility” indices.

16.
The paper presents an approach to characterizing a “stop–flow” mode of sensor array operation. The considered mode involves three successive phases of sensor exposure: flow (in a stream of the measured gas), stop (in zero-flow conditions) and recovery (in a stream of pure air). The mode was characterized by describing the distribution of information relevant for the classification of measured gases across the response of the sensor array. The input data for the classifier were sets of sensor output values acquired at discrete time moments of the measurement. Discriminant Function Analysis was used for data analysis. Organic vapours of ethanol, acetic acid and ethyl acetate in air were measured and classified. Attention was focused on data sets which allowed 100% efficient recognition of the analytes. The number, size and composition of those data sets were examined versus the time of the sensor array response. This methodology made it possible to observe the distribution of classification-relevant information in the response of the sensor array obtained in “stop–flow” mode, and hence to characterize this mode.

17.
Myung-Gon Yoon, Automatica, 2000, 36(12), 1923–1925
The paper “L∞ optimal control of SISO continuous-time systems” by Wang and Sznaier (Wang & Sznaier (1997). Automatica, 33(1), 85–90) studies the problem of designing a controller that optimally minimizes the peak absolute value of the system output due to a fixed input signal. With a newly defined function space A, it was claimed that the set of all L∞-bounded outputs could be parameterized and that the problem could be transformed into a minimal distance problem on L∞ space. We believe, however, that their formulation has essential flaws.

18.
“Walkthrough” and “Jogthrough” techniques are well-known expert-based methodologies for the evaluation of user interface design. In this paper we describe the use of the “Graphical” Jogthrough method for evaluating the interface design of the Network Simulator, an educational simulation program that enables users to virtually build a computer network, install hardware and software components, make the necessary settings and test the functionality of the network. Graphical Jogthrough is a further modification of a typical Jogthrough method, where evaluators' ratings produce evidence in the form of a graph presenting the estimated proportion of users who use the interface effectively versus the time they had to work with it in order to do so. We address the question: “What are the possible benefits and limitations of the Graphical Jogthrough method when applied to educational software interface design?” We present the results of the evaluation session and, concluding from our experience, argue that the method can offer designers quantitative and qualitative data for formulating a useful (though rough in some aspects) estimate of the novice-becoming-expert pace that end users might follow when working with the evaluated interface.

19.
Information Systems, 1989, 14(6), 443–453
Using a fuzzy-logic-based calculus of linguistically quantified propositions, we present FQUERY III+, a new, more “human-friendly” and easier-to-use implementation of a querying scheme proposed originally by Kacprzyk and Ziółkowski to handle imprecise queries including a linguistic quantifier, e.g. find all records in which most (almost all, much more than 75%, … or any other linguistic quantifier) of the important attributes (out of a specified set) are as desired (e.g. equal to five, more than 10, large, more or less equal to 15, etc.). FQUERY III+ is an “add-on” to Ashton-Tate's dBase III Plus.
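Calculi of linguistically quantified propositions in this style evaluate “most of the important attributes are as desired” by applying the quantifier's membership function to a weighted mean of per-attribute satisfaction degrees. The sketch below illustrates that mechanism; the piecewise-linear shape of “most” and the aggregation rule are common textbook choices, not FQUERY III+'s exact definitions.

```python
def mu_most(r):
    """Piecewise-linear membership of the quantifier "most" on [0, 1]
    (the breakpoints 0.3 and 0.8 are illustrative design choices)."""
    if r <= 0.3:
        return 0.0
    if r >= 0.8:
        return 1.0
    return (r - 0.3) / 0.5

def truth_most_satisfied(degrees, weights=None):
    """Truth of "most of the (important) attributes are as desired":
    apply the quantifier to the (importance-)weighted mean of the
    per-attribute satisfaction degrees, each in [0, 1]."""
    if weights is None:
        weights = [1.0] * len(degrees)
    r = sum(w * d for w, d in zip(weights, degrees)) / sum(weights)
    return mu_most(r)
```

For example, a record satisfying three of four equally important attributes gives a ratio of 0.75, which “most” maps to a truth degree of 0.9; ranking records by this degree realizes the imprecise query.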

20.
Computer integrated manufacturing (CIM) is creating unexpected problems for a growing number of manufacturing companies. Manufacturers are finding it especially difficult to attract programmers who are both willing and able to develop the highly complex software that integrates existing accounting, sales, production, engineering, and quality control information subsystems. Consequently, many companies abdicate their responsibility for manufacturing information systems and seek third party support ranging from consulting assistance to a total takeover of the company's information resources and operations. Companies that “give away” their internal information system capabilities to third parties will ultimately lose control of their enterprise information, a danger to be avoided. Off-the-shelf software for desktop computers has become sufficiently powerful to help solve a major portion of this serious problem. We hypothesize that manufacturing engineers (and others) can be trained to use packaged software to leverage their company's systems programming capabilities. In effect they would become “paraprogrammers” who would help design, develop, and maintain manufacturing information systems. This new type of professional would not require a computer science or similar educational background, but could be trained to satisfy many specialized programming needs in a manner similar to how paramedics and paralegals are trained and used in the medical and legal professions, respectively. This paper reports on the early stages of research to determine whether or not product design engineers can use a desktop relational database management system and its various command languages to develop a master bill of material information system (BOMIS). The purpose of the research is to evaluate the amount of programming complexity reduction and increased operational effectiveness that can be achieved through paraprogramming by manufacturing engineers.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号