Similar Articles
 20 similar articles found (search time: 15 ms)
1.
This article discusses the potential for legal commercial intelligence in the ocean shipping industry through an analysis of the information available in the industry; it also describes an information system for processing such industry data. The system collects and verifies data from many sources and supplies information to the decision-maker in a comprehensible format, based on facts rather than guesses. Typical applications of the system are discussed and their impact is presented for several cases.

2.
A data file on the geochemistry of ferromanganese deposits has been compiled under the joint effort of a group of individuals belonging to several institutions: Ecole des Mines, Paris; Woods Hole Oceanographic Institution, U.S.A.; and CNEXO, France. We present in this paper the file management and coding schemes which have been used for this world-wide compilation, as well as some results of a multidimensional statistical technique which is suitable for revealing the global trends existing in the geochemistry of manganese nodules. Various computer programs have been developed to display the recorded data using criteria such as geographical location, type of metal analyzed, and name of the author. Some crude statistical schemes can also be selected to summarize the information related to samples which have the same location or which are close to one another. The available statistics are of three types: average, maximum, and minimum. A multidimensional statistical analysis termed correspondence analysis has been used to account for the similarities between the sampled nodules in view of the entire set of chemical elements analyzed. The results are displayed as two-dimensional diagrams where the samples and the chemical elements are represented as points. The distance between two points is a measure of the correlation existing between the associated samples or elements; the shorter the distance, the higher the correlation. Correspondence analysis is a powerful instrument in the study of the various aspects of the geochemistry of manganese nodules. It can help in pinpointing the main factors which influence the affinities between sampled nodules across the entire set of variables.
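A minimal sketch of the summary statistics the file supports (the records and station names are invented for illustration, not data from the actual compilation): analyses sharing a sampling location are grouped and each metal is summarized with the three available statistics, average, maximum, and minimum.

```python
from collections import defaultdict

# Hypothetical records: (location_id, metal, concentration in percent).
records = [
    ("ST-01", "Mn", 24.1), ("ST-01", "Mn", 26.3), ("ST-01", "Ni", 1.12),
    ("ST-02", "Mn", 18.7), ("ST-02", "Ni", 0.94), ("ST-02", "Ni", 1.05),
]

def summarize(records):
    """Group analyses by (location, metal) and report average/max/min."""
    groups = defaultdict(list)
    for loc, metal, value in records:
        groups[(loc, metal)].append(value)
    return {key: {"average": sum(v) / len(v),
                  "maximum": max(v),
                  "minimum": min(v)}
            for key, v in groups.items()}

stats = summarize(records)
```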

3.
The analysis of biomedical signals has mostly been restricted to traditional signal-processing methods. This article proposes a different approach, applied to the evaluation of brainstem auditory evoked potentials. An automatic peak identification method is described which uses the criteria applied by human evaluators in visual analysis. These criteria are defined as fuzzy sets and are combined using fuzzy operations, thus reflecting the weighting of different facts by humans. Membership values are interpreted as degrees of satisfaction which indicate the degree to which a sample satisfies a given criterion. The system judges its own performance in terms of degrees of reliability. Tests on a large set of clinical data showed high performance on good- and average-quality curves. A substantial drawback was the assignment of too many peaks in poor-quality potentials. The approach presented here can easily be applied to similar one-dimensional (and higher-dimensional) signal-analysis tasks.
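An illustrative sketch of the idea, not the paper's actual criteria: each visual-evaluation criterion for a candidate peak becomes a fuzzy membership function, and memberships are combined with a fuzzy operation (here the min operator as fuzzy AND) into a single degree of satisfaction. All thresholds and criterion names below are invented.

```python
def latency_ok(latency_ms, expected=5.6, tolerance=1.5):
    """Triangular membership: 1 at the expected latency, 0 beyond tolerance."""
    return max(0.0, 1.0 - abs(latency_ms - expected) / tolerance)

def amplitude_ok(amplitude_uv, threshold=0.1, full=0.4):
    """Ramp membership: 0 at/below threshold, 1 at/above 'full'."""
    if amplitude_uv <= threshold:
        return 0.0
    return min(1.0, (amplitude_uv - threshold) / (full - threshold))

def peak_satisfaction(latency_ms, amplitude_uv):
    # Fuzzy AND via min: a candidate peak must satisfy all criteria.
    return min(latency_ok(latency_ms), amplitude_ok(amplitude_uv))

good = peak_satisfaction(5.6, 0.4)   # matches both criteria fully
poor = peak_satisfaction(7.0, 0.15)  # late and small: low satisfaction
```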

4.
During the last decade a number of approaches to information modelling, or conceptual modelling, have been presented. In most approaches it is assumed that the correctness of an information model is decided from intuitive considerations, though formal methods may be applied. In this report we discuss three correctness criteria for information models, namely consistency, satisfiability, and completeness. The satisfiability of an information model in a universe of discourse can be formally checked by complementing the information model with a representation of concrete knowledge which is assumed to be complete. A formal basis for such correctness checking is presented. Further, the relationship between an information model and a conceptual schema is discussed, and an observed transformation problem is presented.

5.
Information systems increasingly provide options for visually inspecting data during the process of information discovery and exploration. Little research has so far dealt with user interactions with these systems, and specifically with the effects of characteristics of the displayed data and the user on performance with such systems. The study reports an experiment on users' performance with a visual exploration system. Participants had to identify target graphs within a large set of candidate graphs by using visual filtering criteria that differed in their efficiency in reducing the number of candidate graphs. A pay-off matrix and a time limit served to motivate users to select filter criteria efficiently. Performance was measured as the number of correct identifications of target graphs within the time limit, and the number, type and position of filter criteria selected for the search. Efficiency was somewhat reduced by users' preference to select filter criteria sequentially, from left to right. Rational and experiential cognitive styles affected performance, and they interacted with learning and the types of filter criteria chosen. The study shows not only that visual search tools can be used effectively but also that data and user characteristics affect task performance with such systems.

6.
The classical model selection criteria, such as the Bayesian Information Criterion (BIC) or the Akaike Information Criterion (AIC), have a strong tendency to overestimate the number of regressors when the search is performed over a large number of potential explanatory variables. To handle the problem of the overestimation, several modifications of the BIC have been proposed. These versions rely on supplementing the original BIC with some prior distributions on the class of possible models. Three such modifications are presented and compared in the context of sparse Generalized Linear Models (GLMs). The related choices of priors are discussed and the conditions for the asymptotic equivalence of these criteria are provided. The performance of the modified versions of the BIC is illustrated with an extensive simulation study and a real data analysis. Also, simplified versions of the modified BIC, based on least squares regression, are investigated.
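A hedged numerical sketch of the effect, not the paper's own criteria: for a Gaussian linear model, the classical BIC is n·log(RSS/n) + k·log(n), and one commonly cited modification (the mBIC of Bogdan and co-authors; the form and the constant c = 4 below are quoted from memory and used only for illustration) adds an extra penalty 2k·log(p/c) that grows with the number p of candidate regressors. With p = 1000 candidates, the extra penalty rejects a regressor that the plain BIC would accept.

```python
import math

def bic(n, rss, k):
    """Classical BIC for a Gaussian linear model with k regressors."""
    return n * math.log(rss / n) + k * math.log(n)

def mbic(n, rss, k, p, c=4.0):
    """Modified BIC: BIC plus a prior-driven penalty on model size that
    depends on the number p of candidate regressors (illustrative form)."""
    return bic(n, rss, k) + 2 * k * math.log(p / c)

n, p = 100, 1000
# A 4th regressor reduces RSS from 50 to 47: plain BIC accepts it...
bic_small, bic_large = bic(n, 50.0, 3), bic(n, 47.0, 4)
# ...but with 1000 candidates, mBIC's extra penalty rejects it.
mbic_small, mbic_large = mbic(n, 50.0, 3, p), mbic(n, 47.0, 4, p)
```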

7.
The most popular realizations of adaptive systems are based on neural network algorithms, in particular feedforward multilayered perceptrons trained by backpropagation-of-error procedures. In this paper an alternative approach based on multidimensional separable localized functions centered at the data clusters is proposed. In comparison with neural networks that use delocalized transfer functions, this approach allows full control of the basins of attraction of all stationary points. Slow learning procedures are replaced by the explicit construction of the landscape function followed by the optimization of adjustable parameters using gradient techniques or genetic algorithms. Retrieving information does not require searches in multidimensional subspaces but is factorized into a series of one-dimensional searches. Feature Space Mapping is applicable to learning not only from facts but also from general laws, and may be treated as a fuzzy expert system (neurofuzzy system). The number of nodes (fuzzy rules) grows as the network creates new nodes for novel data, but the search time is sublinear in the number of rules or data clusters stored. Such a system may work as a universal classifier, approximator and reasoning system. Examples of applications for the identification of spectra (classification), intelligent databases (association) and the analysis of simple electrical circuits (expert system type) are given.
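A minimal sketch of the core ingredient, assuming Gaussian factors (the cluster centers and widths below are invented, and this is not the Feature Space Mapping implementation itself): a node is a multidimensional separable localized function, a product of one-dimensional Gaussians centered at a data cluster. Separability is what lets retrieval factorize into one-dimensional searches, and locality is what gives control over each node's basin of attraction.

```python
import math

def node_activation(x, center, widths):
    """Separable localized function: a product of 1-D Gaussian factors."""
    act = 1.0
    for xi, ci, si in zip(x, center, widths):
        act *= math.exp(-((xi - ci) ** 2) / (2 * si ** 2))
    return act

# Two stored clusters as (center, widths); a novel sample activating
# neither strongly would trigger the creation of a new node.
nodes = [((0.0, 0.0), (1.0, 1.0)), ((5.0, 5.0), (0.5, 0.5))]
sample = (0.1, -0.2)
activations = [node_activation(sample, c, w) for c, w in nodes]
best = max(range(len(nodes)), key=lambda i: activations[i])
```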

8.
A new approach is used to examine the stability of feedback systems comprising a time-varying linear subsystem and a time-varying non-linearity. The paper examines three stability aspects of the system's dynamic response: boundedness, unboundedness, and asymptotic decay. In addition to qualitative criteria, the results provide explicit quantitative bounds on the system response. The distinct time-domain method of analysis presented stipulates rather weak a priori restrictions on the nature of the linear and non-linear parts of the system, thus admitting a relatively broad class of systems. In the general time-varying non-linear feedback system considered, the non-linearity may exhibit hysteresis, and the linear subsystem may be non-causal and may comprise any number of feed-forward differentiators of any order. The criteria obtained are given a simple graphical interpretation. A transformation which trades time-varying gains between plant and non-linearity is introduced. A number of examples demonstrate that the present results can predict stability information not obtainable from other existing criteria. In one example of a system with a stationary plant, the present criteria prove superior to the Circle Theorem.

9.
This paper presents a multiagent system for studying in vitro cell motion. A typical application, the wound-closure process, is presented to illustrate the possibilities of the system on different image sequences. The motion issue involves three aspects: image segmentation, object tracking and motion analysis. The current system version focuses mainly on the image segmentation aspect. A general agent model has been designed, which will be further expanded to include tracking and motion-analysis behaviors as well. The agents integrate three basic behaviors: perception, interaction and reproduction. Perception evaluates pixels against static and motion-based criteria. The interaction behavior allows two agents to merge or to negotiate parts of regions; the negotiation can be seen as a segmentation-refinement process performed by the agents. Finally, the reproduction behavior defines an exploration strategy over the images: agents can start other agents around them, or they can duplicate themselves into the next frame. The frames are processed in a pipeline, where information from previous frames is used to treat the current frame. A single agent model exists; agents are specialized at execution time according to their goals. The results, obtained from an existing prototype, show different types of cell behavior during cell migration, based on cell-nuclei analysis.

10.
The design of pharmacokinetic and pharmacodynamic experiments concerns a number of issues, among which are the number of observations and the times when they are taken. Often a model is used to describe these data and the pharmacokinetic-pharmacodynamic behavior of a drug. Knowledge of the data analysis model at the design stage is beneficial for collecting patient data for parameter estimation. A number of criteria for model-oriented experiments, which maximize the information content of the data, are available. In this paper we present a program, Popdes, to investigate the D-optimal design of individual and population multivariate response models, such as pharmacokinetic-pharmacodynamic, physiologically based pharmacokinetic, and parent drug and metabolites models. A pre-clinical and clinical pharmacokinetic-pharmacodynamic model describing the concentration-time profile and effect of an oncology compound in development is used for illustration.
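A hedged, minimal illustration of D-optimality (not the Popdes program; the model and sampling times below are invented): for the straight-line model y = a + b·t the information matrix is XᵀX, and a D-optimal design maximizes its determinant. Spreading the samples to the ends of the observation interval beats clustering them in the middle, consistent with the classical result that the D-optimal design for a line splits observations between the endpoints.

```python
def det_information(times):
    """det(X'X) for the model y = a + b*t with design points 'times'.

    X has rows (1, t), so X'X = [[n, sum t], [sum t, sum t^2]].
    """
    n = len(times)
    s1 = sum(times)
    s2 = sum(t * t for t in times)
    return n * s2 - s1 * s1

spread    = [0.0, 0.0, 8.0, 8.0]   # endpoints of the interval [0, 8]
clustered = [3.0, 4.0, 4.0, 5.0]   # same number of samples, mid-interval
```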

11.
A method, APEX, for query evaluation in deductive databases is presented; it is based on discovering the axioms and facts relevant to a given query. The notions of relevancy and migration of facts are derived from an analysis of data flow in the system. APEX is complete and incorporates efficient query-evaluation heuristics. The operation of APEX is illustrated on sample databases involving non-linear recursive axioms and cyclic relations. The main virtues of the method are its generality and adaptivity: it imposes no restrictions on the structure of axioms or the contents of relations, and it employs knowledge of the actual data acquired at each step of a query evaluation.
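Not the APEX algorithm itself, but a sketch of the kind of evaluation it must support: a recursive axiom (here transitive closure) evaluated bottom-up to a fixpoint over a cyclic base relation, so that cycles terminate instead of looping forever. The facts are invented.

```python
def transitive_closure(edges):
    """Bottom-up fixpoint evaluation of ancestor(X,Z) :- edge(X,Y), ancestor(Y,Z)."""
    closure = set(edges)
    while True:
        new = {(a, d) for (a, b) in closure for (c, d) in edges
               if b == c and (a, d) not in closure}
        if not new:          # fixpoint reached: no derivable fact is missing
            return closure
        closure |= new

# A cyclic base relation: 1 -> 2 -> 3 -> 1.
facts = {(1, 2), (2, 3), (3, 1)}
reachable = transitive_closure(facts)
```

Despite the cycle, the evaluation halts once the closure stops growing, which is exactly why bottom-up methods handle cyclic relations that naive top-down expansion cannot.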

12.
Principles are abstract rules intended to guide decision-makers in making normative judgments in domains like the law, politics, and ethics. It is difficult, however, if not impossible to define principles in an intensional manner so that they may be applied deductively. The problem is the gap between the abstract, open-textured principles and concrete facts. On the other hand, when expert decision-makers rationalize their conclusions in specific cases, they often link principles to the specific facts of the cases. In effect, these expert-defined associations between principles and facts provide extensional definitions of the principles. The experts operationalize the abstract principles by linking them to the facts.

This paper discusses research in which the following hypothesis was empirically tested: extensionally defined principles, as well as cited past cases, can help in predicting the principles and cases that might be relevant in the analysis of new cases. To investigate this phenomenon computationally, a large set of professional ethics cases was analyzed and a computational model called SIROCCO, a system for retrieving principles and past cases, was constructed. Empirical evidence is presented that the operationalization information contained in extensionally defined principles can be leveraged to predict the principles and past cases that are relevant to new problem situations. This is shown through an ablation experiment, comparing SIROCCO to a version of itself that does not employ operationalization information. Further, it is shown that SIROCCO's extensionally defined principles and case citations help it to outperform a full-text retrieval program that does not employ such information.


13.
On Using a Warehouse to Analyze Web Logs
Analyzing web logs for usage and access trends can not only provide important information to web site developers and administrators, but also help in creating adaptive web sites. While there are many existing tools that generate fixed reports from web logs, they typically do not allow ad-hoc analysis queries. Moreover, such tools cannot discover hidden patterns of access embedded in the access logs. We describe a relational OLAP (ROLAP) approach for creating a web-log warehouse, populated both from the web logs themselves and from the results of mining them. We discuss the design criteria that influenced our choice of dimensions, facts and data granularity. A web-based ad-hoc tool for analytic queries on the warehouse was developed. We present some of the performance-specific experiments that we performed on our warehouse.
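A toy flavor of the warehouse idea (the schema and records are invented, and a real ROLAP system would run SQL over fact and dimension tables): each log record carries dimensions such as page, hour, and status, and an ad-hoc rollup aggregates hits over any chosen dimension, the kind of query a fixed-report tool cannot express.

```python
from collections import Counter

# Hypothetical log records with three dimensions.
logs = [
    {"page": "/index", "hour": 9,  "status": 200},
    {"page": "/index", "hour": 9,  "status": 200},
    {"page": "/docs",  "hour": 10, "status": 404},
    {"page": "/index", "hour": 10, "status": 200},
]

def rollup(logs, dimension):
    """Ad-hoc group-by on any single dimension: hits per distinct value."""
    return Counter(rec[dimension] for rec in logs)

by_page = rollup(logs, "page")   # which pages are hot?
by_hour = rollup(logs, "hour")   # when is the traffic?
```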

14.
TUNEX, an expert system developed for performance tuning of the UNIX operating system, is described. TUNEX was developed on UNIX System V and uses the properties, commands and utilities of this version. The tuning activities it is concerned with include: (1) adjusting operating-system tunable parameters, such as the number of disk buffers; (2) running maintenance routines, e.g. reorganizing file systems; (3) developing operation rules, such as off-peak-hour runs of backups; and (4) modifying hardware, e.g. buying an additional disk drive. The structure of TUNEX is presented, and the performance-analysis modules which provide quantitative information to this tool are briefly described. The overhead in resource usage introduced by the performance monitoring and tuning tool itself is discussed; the author points to the areas in which additional resources are required by TUNEX.
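A hedged sketch of the expert-system flavor only: the rules, metric names and thresholds below are invented, not TUNEX's actual knowledge base. Measured performance indicators are mapped to recommendations in the four activity classes listed above.

```python
# Each rule: (condition over measured metrics, tuning recommendation).
RULES = [
    (lambda m: m["buffer_cache_hit"] < 0.90,
     "tunable parameter: increase the number of disk buffers"),
    (lambda m: m["fs_fragmentation"] > 0.30,
     "maintenance routine: reorganize the file system"),
    (lambda m: m["peak_cpu_load"] > 0.95,
     "operation rule: move backups to off-peak hours"),
    (lambda m: m["disk_busy"] > 0.85,
     "hardware: consider an additional disk drive"),
]

def recommend(metrics):
    """Fire every rule whose condition holds for the measured metrics."""
    return [advice for condition, advice in RULES if condition(metrics)]

advice = recommend({"buffer_cache_hit": 0.84, "fs_fragmentation": 0.10,
                    "peak_cpu_load": 0.97, "disk_busy": 0.50})
```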

15.
A highly interactive visual analysis system is presented that is based on an enhanced variant of parallel coordinates, a multivariate information-visualization technique. The system combines many variations of previously described visual interaction techniques such as dynamic axis scaling, conjunctive visual queries, statistical indicators, and aerial-perspective shading. The system's capabilities are demonstrated on a hurricane climate data set. This climate study corroborates the notion that enhanced visual analysis with parallel coordinates provides a deeper understanding when used in conjunction with traditional multiple regression analysis.

16.
Redundant or distributed systems are increasingly used in system design so that the required reliability and availability can be easily achieved. However, such an approach requires additional resources that can be very costly. Hence, how to design and test such a system in the most cost-effective way is of concern to the developers. A general cost model and a solution algorithm are presented for determining the optimal number of hosts and the optimal system debugging time that minimize the total cost while achieving a certain performance objective. During testing, software faults are corrected, so software reliability shows an increasing trend and system reliability improves accordingly. A general system model is constructed based on a Markov process, with software reliability and availability obtained from software reliability growth models. The optimization problem is formulated based on the cost criteria and the solution procedure is described. An application example is presented.
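An illustrative cost trade-off in the spirit of such models (the constants are invented; the growth model is the standard exponential Goel-Okumoto SRGM, not necessarily the one used in the paper): the expected number of faults found by debugging time t is m(t) = a(1 − e^(−bt)); testing costs accrue per unit time, and a fault surviving into the field costs far more to fix than one removed during debugging. The optimal debugging time balances the two.

```python
import math

def total_cost(t, a=100.0, b=0.05, c_test=1.0, c_debug=1.0, c_field=20.0):
    """Total expected cost of releasing after t units of debugging time."""
    found = a * (1.0 - math.exp(-b * t))   # Goel-Okumoto mean value function
    remaining = a - found                  # faults escaping to the field
    return c_test * t + c_debug * found + c_field * remaining

# Grid search for the debugging time that minimizes total cost
# (the analytic optimum here is t* = ln(95)/0.05, about 91).
grid = [t / 10.0 for t in range(0, 2001)]
t_opt = min(grid, key=total_cost)
```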

17.
Classification using adaptive wavelets for feature extraction
A major concern arising from the classification of spectral data is that the number of variables, or dimensionality, often exceeds the number of available spectra. This leads to a substantial deterioration in the performance of traditionally favoured classifiers. It becomes necessary to decrease the number of variables to a manageable size whilst, at the same time, retaining as much discriminatory information as possible. A new technique based on adaptive wavelets, which aims to reduce the dimensionality and optimize the discriminatory information, is presented. The discrete wavelet transform is utilized to produce wavelet coefficients which are used for classification. Rather than using one of the standard wavelet bases, we generate the wavelet which optimizes specified discriminant criteria.
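A sketch of the underlying ingredient only, not the adaptive-basis optimization itself: one level of the discrete wavelet transform with the standard Haar filter, splitting a (hypothetical) spectrum into coarse approximation coefficients and detail coefficients while preserving its energy, and halving the number of values in each band.

```python
import math

def haar_step(signal):
    """One DWT level with the Haar filter: scaled pairwise sums give the
    approximation band, scaled pairwise differences give the detail band."""
    s = 1.0 / math.sqrt(2.0)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

spectrum = [4.0, 4.0, 2.0, 0.0]      # toy 4-point "spectrum"
approx, detail = haar_step(spectrum)
```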

18.
A nonparametric, gradient-less shape optimization approach for finite element stress minimization problems is presented. The shape optimization algorithm is based on optimality criteria, which leads to robust and fast convergence independent of the number of design variables. Sensitivity information for the objective function and constraints is not required, which results in superior performance and offers the possibility of solving the structural analysis task using fast and reliable industry-standard finite element solvers such as ABAQUS, ANSYS, I-DEAS, MARC, NASTRAN or PERMAS. The approach has been successfully extended to complex nonlinear problems including material, boundary and geometric nonlinear behavior. The nonparametric geometry representation creates a complete design space for the optimization problem, which includes all possible solutions for the finite element discretization. The approach is available within the optimization system TOSCA and has been used successfully for real-world optimization problems in industry for several years. The approach is compared to other approaches and the benefits and restrictions are highlighted. Several academic and real-world examples are presented.

19.
20.
Automatic verification for a class of distributed systems
The paper presents a new analysis method for a class of concurrent systems which are formed of several interacting components with the same structure. The model for these systems is composed of a control process and a set of homogeneous user processes. The control and user processes are modeled by finite labeled state transition systems which interact by means of enabling functions and triggering mechanisms. Based on this structure, an analysis method is presented which allows system properties, derived by reachability analysis for a finite number of user processes, to be generalized to an arbitrary number of user processes. A procedure for the automatic verification of properties such as mutual exclusion and absence of deadlocks is presented and is then used to provide, for the first time, a fully automated verification of Lamport's fast mutual exclusion algorithm. Received: October 1998 / Accepted: January 2000
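A tiny flavor of the reachability-analysis step only (the protocol is a deliberately naive atomic-lock toy, not Lamport's algorithm, and the state encoding is invented): enumerate every reachable state of two identical user processes sharing a one-bit lock, and check that mutual exclusion holds in all of them.

```python
from collections import deque

# Each process cycles idle -> trying -> critical -> idle; 'acquire' is an
# atomic step enabled only when the lock is free.
def successors(state):
    lock, procs = state
    for i, p in enumerate(procs):
        if p == "idle":
            yield (lock, procs[:i] + ("trying",) + procs[i + 1:])
        elif p == "trying" and lock == 0:
            yield (1, procs[:i] + ("critical",) + procs[i + 1:])
        elif p == "critical":
            yield (0, procs[:i] + ("idle",) + procs[i + 1:])

def check_mutual_exclusion(initial):
    """BFS over the reachable state space; fail if two processes are
    ever simultaneously in their critical sections."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if sum(p == "critical" for p in state[1]) > 1:
            return False, len(seen)
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, len(seen)

ok, n_states = check_mutual_exclusion((0, ("idle", "idle")))
```

For two processes this explores just 8 states; the paper's contribution is precisely the generalization step that lifts such a finite check to an arbitrary number of user processes.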
