Similar Documents
20 similar documents found (search time: 15 ms)
1.
Time series data mining (TSDM) techniques permit exploring large amounts of time series data in search of consistent patterns and/or interesting relationships between variables. TSDM is becoming increasingly important as a knowledge management tool, where it is expected to reveal knowledge structures that can guide decision making under limited certainty. Human decision making in problems involving the analysis of time series databases is usually based on perceptions such as “end of the day”, “high temperature”, “quickly increasing”, and “possible”. Though many effective TSDM algorithms have been developed, the integration of TSDM algorithms with human decision-making procedures is still an open problem. In this paper, we consider the architecture of a perception-based decision-making system for time series database domains, integrating perception-based TSDM, computing with words and perceptions, and expert knowledge. We discuss the new tasks that perception-based TSDM methods must solve to enable their integration in such systems: precisiation of perceptions, shape pattern identification, and pattern retranslation. We show how different methods developed so far in TSDM for manipulating perception-based information can be used to develop a fuzzy perception-based TSDM approach. This approach is grounded in computing with words and perceptions, which permits formalizing human perception-based inference mechanisms. The discussion is illustrated by examples from economics, finance, meteorology, and medicine.
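One of the tasks the abstract names, “precisiation of perceptions”, amounts to replacing a linguistic label with a formal fuzzy set. As a hedged illustration (the trapezoid parameters and the “high temperature” example are assumptions, not values from the paper):

```python
# "Precisiation of perceptions": a perception such as "high temperature"
# can be formalized as a fuzzy set. The trapezoidal shape and the
# temperature thresholds below are illustrative assumptions.
def trapezoid(x, a, b, c, d):
    """Membership in a trapezoidal fuzzy set: rises on [a,b], flat on [b,c], falls on [c,d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

high_temp = lambda t: trapezoid(t, 25.0, 30.0, 45.0, 50.0)   # degrees Celsius
print([high_temp(t) for t in (20.0, 27.5, 35.0, 48.0)])
```

Membership degrees between 0 and 1 then feed the perception-based inference machinery that computing with words provides.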

2.
“Fuzzy Functions” determined by the least squares estimation (LSE) technique are proposed for the development of fuzzy system models. These “Fuzzy Functions with LSE” are proposed as alternative representation and reasoning schemas to fuzzy rule base approaches. These “Fuzzy Functions” can be obtained and implemented more easily by those who lack in-depth knowledge of fuzzy theory: working knowledge of a fuzzy clustering algorithm such as FCM or its variations is sufficient to obtain membership values of input vectors. The membership values, together with scalar input variables, are then used by the LSE technique to determine “Fuzzy Functions” for each cluster identified by FCM. These functions differ from both “Fuzzy Rule Base” and “Fuzzy Regression” approaches. Various transformations of the membership values are included as new variables in addition to the original selected scalar input variables; at times, a logistic transformation of non-scalar original selected input variables may also be included as a new variable. A comparison of the “Fuzzy Functions-LSE” approach with the Ordinary Least Squares Estimation (OLSE) approach shows that “Fuzzy Functions-LSE” provides results that are better by about 10% or more with respect to the RMSE measure for both training and test cases of the data sets.
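A minimal sketch of the recipe described above, on a toy data set: fuzzy c-means (FCM) supplies membership values, which are appended (with a simple transformation) to the scalar input as regressors for one least-squares fit per cluster. The particular transformations, data, and weighting are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fcm(X, c=2, m=2.0, iters=100):
    """Minimal fuzzy c-means: returns memberships U (n x c) and cluster centers."""
    n = len(X)
    U = rng.dirichlet(np.ones(c), size=n)                    # random initial memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
    return U, centers

# toy 1-D regression data with two local linear regimes
x = np.linspace(0, 4, 200)[:, None]
y = np.where(x[:, 0] < 2, 1.0 * x[:, 0], 8 - 3.0 * x[:, 0]) + rng.normal(0, 0.05, 200)

U, _ = fcm(x, c=2)

# one least-squares "fuzzy function" per cluster, with membership values
# (and a transformation of them) appended as extra regressors
preds = np.zeros_like(y)
for j in range(2):
    u = U[:, j]
    A = np.column_stack([np.ones_like(u), x[:, 0], u, u ** 2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    preds += u * (A @ beta)                                  # membership-weighted combination

rmse = np.sqrt(np.mean((preds - y) ** 2))
print(rmse)
```

The membership-weighted combination lets each cluster's linear model dominate in its own region, which is what gives the approach an edge over a single global OLSE fit on data like this.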

3.
Extraction of geometric characteristics for manufacturability assessment
One of the advantages of feature-based design is that it provides data, defined as feature parameters, in readily available forms for tasks from design through manufacturing. It can thus facilitate the integration of CAD and CAM. However, not all design features are features required in downstream applications, and not all parameters or data can be predefined in the features. One significant example is a property formed by feature interactions: the interaction of a positive feature and a negative feature induces a wall-thickness change that might cause defects in a part. Therefore, identifying the wall-thickness change by detecting the feature interaction is required in moldability assessment.

The work presented in this paper deals with the extraction of geometric characteristics in feature-based design for manufacturability assessment. We focus on the manufacturability assessment of discrete parts, with emphasis on a net shape process—injection molding. The definition, derivation and representation of the spatial relationships between features are described. The geometric characteristics formed by feature interactions are generalized as significant items, such as “depth”, “thickness”, and “height”, based on the generalization of feature shapes. Reasoning on feature interactions and extraction of geometric characteristics is treated as a refinement procedure. High-level spatial relationships—“is_in”, “adjacent_to” and “coplanar”—as well as their geometric details are first derived. The significant items formed from feature interactions are then computed based on the detailed spatial relationships. This work was implemented in a computer-aided concurrent product and process development environment to support molding product design assessment.

4.
An integrated multi-unit chemical plant presents a challenging control design problem due to the existence of recycle streams. In this paper, we develop a framework for analyzing the effects of recycle dynamics on closed-loop performance, from which a systematic design of a decentralized control system for a recycled, multi-unit plant is established. In the proposed approach, the recycle streams are treated as unmodelled dynamics of the “unit” model, so that their effects on closed-loop stability and performance can be analyzed using robust control theory. As a result, two measures are produced: (1) the ν-gap metric, which quantifies the strength of the recycle effects, and (2) the maximum stability margin of the “unit” controller, which represents the ability of the “unit” controller to compensate for such effects. A simple rule for “unit” control design is then established using the two measures combined, in order to guarantee good overall closed-loop performance. As illustrated by several design examples, the controllability of a recycled, multi-unit process under a decentralized “unit” controller can be determined without any detailed design of the “unit” controller, because the simple rule is calculated from open-loop information only.
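For SISO systems, the ν-gap between the nominal “unit” model and the model seen through the recycle path can be estimated from the frequency-wise chordal distance; its supremum equals the ν-gap when Vinnicombe's winding-number condition holds, which is assumed here rather than checked. Both transfer functions below are hypothetical examples, not models from the paper:

```python
import numpy as np

# Frequency-wise chordal distance between two SISO plants; the nu-gap
# metric equals its supremum over frequency under the winding-number
# condition (assumed to hold for this example).
def chordal(P1, P2, w):
    p1, p2 = P1(1j * w), P2(1j * w)
    return np.abs(p1 - p2) / (np.sqrt(1 + np.abs(p1) ** 2) * np.sqrt(1 + np.abs(p2) ** 2))

# hypothetical "unit" model, and the same unit seen through a weak recycle path
P_unit    = lambda s: 1.0 / (s + 1.0)
P_recycle = lambda s: 1.0 / (s + 1.0) / (1.0 - 0.1 / ((s + 1.0) * (s + 2.0)))

w = np.logspace(-3, 3, 2000)
nu_gap_estimate = chordal(P_unit, P_recycle, w).max()
print(nu_gap_estimate)
```

A small value indicates weak recycle effects, i.e. a “unit” controller with a modest stability margin already suffices, which is the spirit of the paper's design rule.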

5.
The threat of cyberattacks motivates the need to monitor Internet traffic data for potentially abnormal behavior. Due to the enormous volumes of such data, statistical process monitoring tools, such as those traditionally used on data in the product manufacturing arena, are inadequate. “Exotic” data may indicate a potential attack; detecting such data requires a characterization of “typical” data. We devise some new graphical displays, including a “skyline plot,” that permit ready visual identification of unusual Internet traffic patterns in “streaming” data, and we use appropriate statistical measures to help identify potential cyberattacks. These methods are illustrated on a moderate-sized data set (135,605 records) collected at George Mason University.
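The statistical side of such monitoring can be sketched with a robust z-score flag on a stream of per-minute counts. This is a generic stand-in for the paper's measures (the skyline plot itself is a display, not a statistic), and all names, rates, and thresholds are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Robust z-scores (median / MAD) as a simple stand-in for characterizing
# "typical" traffic and flagging "exotic" observations.
def flag_exotic(counts, z_thresh=5.0):
    med = np.median(counts)
    mad = np.median(np.abs(counts - med)) or 1.0   # guard against zero MAD
    z = 0.6745 * (counts - med) / mad              # approximately standard z under normality
    return np.flatnonzero(np.abs(z) > z_thresh)

# synthetic per-minute connection counts for one day, with one injected burst
traffic = rng.poisson(50, size=1440).astype(float)
traffic[600] = 400.0                               # the "exotic" observation
print(flag_exotic(traffic))
```

Median and MAD are used instead of mean and standard deviation so that the attack itself does not inflate the baseline it is judged against.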

6.
The effect of “impatient” behaviour is studied primarily in the context of “double-ended” queues where each side demands service from the other, typically taxis and passengers. Related models, single-queue and double-ended, with a variety of mechanisms are considered. “Impatience” is to be understood in a wider context than simply becoming tired of waiting: it can arise because the customer, for some reason, runs out of time (inventory and organ transplantation), or because an alternative service becomes available (communication applications). The emphasis in this paper is theoretical, but a brief numerical assessment of operational consequences is given.

Scope and purpose

The “double-ended (or synchronization) queue” is a model for a variety of service demanding/providing systems. In an orderly taxi rank at a railway station or airport, a queue is formed on one side by a stream of arriving passengers who wait for taxis to their destinations, while on the other side a queue of taxis waits for passengers. Obviously, the two queues can never coexist. The concept of “impatience” enters when a taxi or passenger leaves the queue before receiving service. This concept of “reneging” is widely applicable. In health care, for example, organs are stored for transplantation into patients who need them; both the organs and the demands for them have limited lifetimes. A similar scenario applies to perishable inventory systems. Real-time communication networks likewise admit impatient behaviour; a typical example is a processor-shared queue in data networks with random time-out periods or deadlines. The paper sets out the basics in a variety of theoretical model settings with the common feature of exponential arrival, service and impatience mechanisms. A brief discussion, based on numerical calculation, is given of some operational features of the models, but the thrust is on the theoretical techniques needed to make meaningful operational assessments.
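The simplest such model, with exponential arrivals on both sides and exponential impatience, can be simulated directly; the rates below are illustrative assumptions, not parameters from the paper:

```python
import random

random.seed(42)

# Markov simulation of a double-ended (taxi/passenger) queue with
# exponential impatience; all rates are illustrative assumptions.
LAM_P, LAM_T, THETA = 1.0, 1.0, 0.2      # arrival rates; reneging rate per waiting customer
T_END = 50_000.0

q, t, area = 0, 0.0, 0.0                 # q > 0: passengers waiting, q < 0: taxis waiting
while t < T_END:
    renege = THETA * abs(q)
    total = LAM_P + LAM_T + renege
    dt = random.expovariate(total)
    area += abs(q) * dt                  # accumulate the time-integral of the queue size
    t += dt
    u = random.random() * total
    if u < LAM_P:
        q += 1                           # passenger arrives (instantly matched if a taxi waits)
    elif u < LAM_P + LAM_T:
        q -= 1                           # taxi arrives
    else:
        q += -1 if q > 0 else 1          # an impatient customer abandons

mean_waiting = area / t
print(mean_waiting)                      # long-run mean number waiting (on either side)
```

Note that without impatience this symmetric random walk has no stationary distribution; the reneging term is precisely what stabilizes the system, which is one of the operational points the paper makes.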

7.
This paper gives a fresh look at my previous work on “epistemic actions” and information updates in distributed systems, from a coalgebraic perspective. I show that the “relational” semantics of epistemic programs, given in [BMS2] in terms of epistemic updates, can be understood in terms of functors on the category of coalgebras and natural transformations associated to them. Then, I introduce a new, alternative, more refined semantics for epistemic programs: programs as “epistemic coalgebras”. I argue for the advantages of this second semantics, from a semantic, heuristic, syntactical and proof-theoretic point of view. Finally, as a step towards a generalization, I show these concepts make sense for other functors, and that apparently unrelated concepts, such as Bayesian belief updates and process transformations, can be seen to arise in the same way as our “epistemic actions”.

8.
A novel technique for maximum “a posteriori” (MAP) adaptation of maximum entropy (MaxEnt) and maximum entropy Markov models (MEMM) is presented. The technique is applied to the problem of automatically capitalizing uniformly cased text. Automatic capitalization is a practically relevant problem: speech recognition output needs to be capitalized; also, modern word processors perform capitalization among other text proofing algorithms such as spelling correction and grammar checking. Capitalization can also be used as a preprocessing step in named entity extraction or machine translation. A “background” capitalizer trained on 20 M words of Wall Street Journal (WSJ) text from 1987 is adapted to two Broadcast News (BN) test sets – one containing ABC Primetime Live text and the other NPR Morning News/CNN Morning Edition text – from 1996. The “in-domain” performance of the WSJ capitalizer is 45% better relative to the 1-gram baseline, when evaluated on a test set drawn from WSJ 1994. When evaluating on the mismatched “out-of-domain” test data, the 1-gram baseline is outperformed by 60% relative; the improvement brought by the adaptation technique using a very small amount of matched BN data – 25–70k words – is about 20–25% relative. Overall, an automatic capitalization error rate of 1.4% is achieved on BN data. The performance gain obtained by employing our adaptation technique using a tiny amount of out-of-domain training data on top of the background data is striking: as little as 0.14 M words of in-domain data brings more improvement than using 10 times more background training data (from 2 M words to 20 M words).
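The 1-gram baseline against which both capitalizers are measured can be sketched in a few lines: for each word, emit its most frequent surface form from training text. The toy corpus here is a stand-in for the WSJ/BN data:

```python
from collections import Counter, defaultdict

# A unigram ("1-gram") capitalization baseline: restore each word's most
# frequent surface form seen in training text. The corpus is a toy stand-in.
train = ("Mr. Smith went to Washington . " * 3 + "the meeting went well . " * 5).split()

forms = defaultdict(Counter)
for w in train:
    forms[w.lower()][w] += 1

def capitalize(tokens):
    return [forms[t.lower()].most_common(1)[0][0] if t.lower() in forms else t
            for t in tokens]

print(" ".join(capitalize("smith went to washington".split())))
```

The MaxEnt/MEMM capitalizers in the paper improve on this by conditioning on context rather than treating each word in isolation.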

9.
The process of optimizing the precision of robotic liquid-handling instruments can be improved using the Design of Experiments methodology. Design of Experiments (DOE) is “a collection of statistical and mathematical techniques useful for developing, improving, and optimizing processes.” Using DOE, one can design and analyze experiments with the goal of optimizing the precision of liquid deliveries. Tecan has developed a software application, “Neptune”, to automate this process for the Tecan Genesis series of instruments. This application has been used to perform experiments on the liquid-handling properties of a variety of liquids. As an example of this process, we examine a set of experiments performed on a 50% concentration of polyethylene glycol 8000. These experiments improved the pipetting precision from an average CV of 22.2% to an average of 2.9%.
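The precision figures quoted are coefficients of variation; a quick sketch of the computation, with made-up replicate volumes rather than the study's measurements:

```python
import statistics

# Precision of replicate liquid deliveries is summarized by the coefficient
# of variation, CV = 100 * stdev / mean. The volumes below are illustrative.
def cv_percent(volumes):
    return 100.0 * statistics.stdev(volumes) / statistics.mean(volumes)

before = [10.0, 14.0, 7.5, 12.0, 6.5]    # microliters, poorly tuned settings
after  = [10.0, 10.3, 9.8, 10.1, 9.9]    # after DOE-optimized settings
print(cv_percent(before), cv_percent(after))
```

DOE searches the instrument's parameter space (aspiration speed, delays, and so on) for the settings that minimize this quantity.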

10.
The literature suggests the existence of critical success factors (CSFs) for the development of information systems that support senior executives. Our study of six organizations gives evidence for this notion of CSFs. The study further shows an interesting pattern, namely that companies either “get it right”, and essentially succeed on all CSFs, or “get it completely wrong”, that is, fall short on each of the CSFs. Among the six cases for which data were collected through in-depth interviews with company executives, three organizations seemed to manage all the CSFs properly, while two others managed all CSFs poorly. Only one organization showed a mixed scorecard, managing some factors well and some not so well. At the completion of the study, this organization could neither be judged as a success, nor as a failure. This dichotomy between success and failure cases suggests the existence of an even smaller set of “meta-success” factors. Based on our findings, we speculate that these “meta-success” factors are “championship”, “availability of resources”, and “link to organization objectives”.

11.
Defining operational semantics for a process algebra is often based either on labeled transition systems that account for interaction with a context or on the so-called reduction semantics: we assume to have a representation of the whole system and we compute unlabeled reduction transitions (leading to a distribution over states in the probabilistic case). In this paper we consider mixed models with states where the system is still open (towards interaction with a context) and states where the system is already closed. The idea is that (open) parts of a system “P” can be closed via an operator “PG” that turns already synchronized actions whose “handle” is specified inside “G” into prioritized reduction transitions (and, therefore, states performing them into closed states). We show that we can use the operator “PG” to express multi-level priorities and external probabilistic choices (by assigning weights to handles inside G), and that, by considering reduction transitions as the only unobservable τ transitions, the proposed technique is compatible, for process algebra with general recursion, with both standard (probabilistic) observational congruence and a notion of equivalence which aggregates reduction transitions in a (much more aggregating) trace based manner. We also observe that the trace-based aggregated transition system can be obtained directly in operational semantics and we present the “aggregating” semantics. Finally, we discuss how the open/closed approach can be used to also express discrete and continuous (exponential probabilistic) time and we show that, in such timed contexts, the trace-based equivalence can aggregate more with respect to traditional lumping based equivalences over Markov Chains.

12.
This paper describes a comparative study of a multidimensional visualisation technique and multivariate statistical process control (MSPC) for the analysis of process historical data. The visualisation technique uses parallel coordinates, which present multidimensional data in two dimensions and allow identification of clusters and outliers; it can therefore be used to detect abnormal events. The study is based on a database covering 527 days of operation of an industrial wastewater treatment plant. It was found that both the visualisation technique and MSPC based on the T² chart captured the same 17 days as “clearly abnormal” and another eight days as “likely abnormal”. Pattern recognition using K-means clustering has also been applied to the same data in the literature and was found to have identified 14 of the 17 “clearly abnormal” days.
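The MSPC side of the comparison rests on the Hotelling T² statistic; a compact sketch on synthetic data (the real study used the plant's multivariate daily records, not this toy set):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hotelling T^2 monitoring of multivariate daily measurements; the data
# here are synthetic, with one injected "clearly abnormal" day.
X = rng.normal(0, 1, size=(527, 3))
X[100] += 6.0                      # the injected abnormal day

mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))
t2 = np.einsum('ij,jk,ik->i', X - mu, S_inv, X - mu)

CHI2_99_DF3 = 11.345               # 99% chi-square critical value for 3 variables
abnormal_days = np.flatnonzero(t2 > CHI2_99_DF3)
print(abnormal_days)
```

Each day's measurement vector is collapsed to a single distance-like score, so one control chart covers all variables at once, which is what makes the T² chart comparable to a parallel-coordinates view of the same data.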

13.
Newer approaches for modelling travel behaviour require a new approach to integrated spatial economic modelling. Travel behaviour modelling is increasingly disaggregate, econometric, dynamic, and behavioural. A fully dynamic approach to urban system modelling is described, where interactions are characterized as two agents interacting through discrete events labelled as “offer” or “accept”. This leads to a natural partition of an integrated urban model into submodels based on the category of what is being exchanged, the type of agent, and the time and place of interaction. Where prices (or price-like signals such as congested travel times) exist to stimulate supply and/or to suppress demand, the dynamic change in prices can be represented either behaviourally, as individual agents adjust their expectations in response to their personal history and the history of the modelled region, or with an “auctioneer” from micro-economic theory, who adjusts average prices. When no auctioneers are used, the modelling system can use completely continuous representations of both time and space. Two examples are shown. The first is a demonstration of a continuous-time continuous-space transaction simulation with simple agents representing businesses and households. The second shows how an existing model—the Oregon TLUMIP project for statewide land-use and transport modelling—can be adapted into the paradigm.

14.
We show that the negative feedback interconnection of two causal, stable, linear time-invariant systems, with a “mixed” small gain and passivity property, is guaranteed to be finite-gain stable. This “mixed” small gain and passivity property refers to the characteristic that, at a particular frequency, systems in the feedback interconnection are either both “input and output strictly passive”; or both have “gain less than one”; or are both “input and output strictly passive” and simultaneously both have “gain less than one”. The “mixed” small gain and passivity property is described mathematically using the notion of dissipativity of systems, and finite-gain stability of the interconnection is proven via a stability result for dissipative interconnected systems.
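For SISO systems, the “mixed” property can be checked pointwise on a frequency grid: at each frequency, each system must be strictly passive, or have gain below one, or both. The plants below are illustrative, and this scalar test is a simplification of the paper's dissipativity-based formulation:

```python
import numpy as np

# Frequency-wise check of the "mixed" small gain / passivity property for
# two SISO systems: at every frequency each must satisfy Re G(jw) > 0
# (strict passivity) or |G(jw)| < 1 (small gain), or both.
def mixed_property_holds(G1, G2, w, eps=1e-9):
    ok = np.ones_like(w, dtype=bool)
    for G in (G1, G2):
        g = G(1j * w)
        ok &= (g.real > eps) | (np.abs(g) < 1 - eps)
    return bool(ok.all())

G1 = lambda s: 1.0 / (s + 1.0)           # passive everywhere, gain <= 1
G2 = lambda s: 2.0 / (s + 1.0) ** 2      # gain > 1 at low frequency, but passive there
w = np.logspace(-4, 4, 4000)
print(mixed_property_holds(G1, G2, w))
```

G2 illustrates the point of the mixed condition: it violates the small-gain condition at low frequency and passivity at high frequency, yet at every individual frequency at least one of the two properties holds.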

15.
“Walkthrough” and “Jogthrough” techniques are well-known expert-based methodologies for the evaluation of user interface design. In this paper we describe the use of the “Graphical” Jogthrough method for evaluating the interface design of the Network Simulator, an educational simulation program that enables users to virtually build a computer network, install hardware and software components, make the necessary settings, and test the functionality of the network. Graphical Jogthrough is a further modification of the typical Jogthrough method, in which evaluators' ratings produce evidence in the form of a graph presenting the estimated proportion of users who use the interface effectively versus the time they had to work with it in order to do so. We address the question: “What are the possible benefits and limitations of the Graphical Jogthrough method when applied to educational software interface design?” We present the results of the evaluation session and, concluding from our experience, argue that the method can offer designers quantitative and qualitative data for forming a useful (though rough in some respects) estimate of the novice-becoming-expert pace that end users might follow when working with the evaluated interface.

16.
Gestalt psychology has shown the importance in human thinking and problem solving of the behavior that it labels “intuition,” “insight,” and “understanding.” This paper discusses computer programs, already described in the published literature, that simulate exactly these kinds of behaviors. It is shown that much of what has been discussed under the heading of “insight” can be explained in terms of recognition processes that are readily simulated. Computer simulation has shown itself to be a powerful tool for interpreting and explaining a wide range of phenomena associated with the kinds of thinking and understanding that have been so usefully emphasized in the Gestalt literature.

17.
Many number theoretic problems such as integer factorization and the discrete logarithm problem have defied all attempts to classify their complexities. Thirteen such problems are considered, none of which is known either to have a deterministic polynomial time solution, or to be complete for any natural complexity class. Failing this, the next best goal is to determine which among these are the “easiest” and which are the “hardest” problems. Toward this end, this paper gives an overview of reductions among the problems. Two reductions are new: a deterministic polynomial time reduction from squarefreeness to Euler's function φ(n), and a probabilistic polynomial time reduction from order modulo a prime power to discrete logarithm modulo a prime power.
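To make the two endpoints of the new deterministic reduction concrete, here are brute-force reference implementations of squarefreeness and Euler's φ(n). Both factor by trial division, so they are exponential in the bit length of n; the paper's point is relating the problems without having to factor at all:

```python
def phi(n):
    """Euler's totient via trial-division factorization
    (a reference implementation, exponential in the bit length of n)."""
    result, m, p = n, n, 2
    while p * p <= m:
        if m % p == 0:
            result -= result // p      # multiply running result by (1 - 1/p)
            while m % p == 0:
                m //= p
        p += 1
    if m > 1:                          # leftover prime factor
        result -= result // m
    return result

def squarefree(n):
    """True iff no prime square divides n (again by trial division)."""
    p = 2
    while p * p <= n:
        if n % (p * p) == 0:
            return False
        if n % p == 0:
            n //= p
        p += 1
    return True

print(phi(30), squarefree(30), squarefree(12))
```

The reduction in the paper shows that an oracle computing φ(n) would let squarefreeness be decided in deterministic polynomial time, without any factoring step like the ones above.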

18.
Conflict situations arise not only from misunderstandings, erroneous perceptions, partial knowledge, and false beliefs, but also from differences in “opinions” and in the agents' value systems. It is not always possible, and maybe not even desirable, to “solve” this kind of conflict, as its sources are subjective. The communicating agents can, however, use knowledge of the opponent's preferences to try to convince the partner of a point of view they wish to promote. Dealing with these situations requires an argumentative capacity able to handle not only “demonstrative” arguments but also “dialectic” ones, which may not necessarily be based on rationality and valid premises. This paper presents a formalization of a theory of informal argumentation, focused on techniques for changing the attitudes of the interlocutor, in the domain of health promotion.

19.
On the controllability of linear juggling mechanical systems
This paper deals with the controllability of a class of nonsmooth complementarity mechanical systems. Due to their particular structure, they can be decomposed into an “object” and a “robot”, and are consequently named juggling systems. It is shown that the accessibility of the “object” can be characterized by nonlinear constrained equations, or generalized equations. Examples are presented, including a simple model of backlash. The main focus of the work is on linear jugglers.

20.
Intelligence analysts construct hypotheses from large volumes of data but are often limited by social and organizational norms and by their own preconceptions and biases. Exploratory data mining technology can mitigate these limitations by requiring fewer assumptions. We present the design of the ATHENS system, which discovers novel information, relative to a specified set of existing knowledge, in large information repositories such as the World Wide Web. We illustrate the use of the system by starting from the terms “al Qaeda” and “bin Laden” and running the ATHENS system as if on September 12th, 2001. This provides a picture of what novel information could have been known at the time. This is of some intrinsic interest, but it also serves to validate the performance of the system, since much of this novel information has been discovered by conventional means in the intervening years.
