Similar Documents
20 similar documents found (search time: 31 ms)
1.
Incident management systems have the potential to improve security dramatically but often experience problems stemming from organizational, interpersonal and social constraints that limit their effectiveness. These limits may cause underreporting of incidents, leading to erroneous perceptions of the actual safety and security situation of the organization. The true security situation may be better understood and underreporting may be reduced if underlying systemic issues surrounding security incident management are taken into account. A dynamic simulation, based on the parallel experience of industrial incident management systems, illustrates the cumulative effects of rewards, learning, and retributions on the fate of a hypothetical knowledge management system designed to collect information about events and incidents. Simulation studies are part of an ongoing research project to develop sustainable knowledge and knowledge transfer tools that support the development of a security culture.
Matthew Jager

2.
In man-machine communication, there is a relationship between what may be described as tacit (human) and explicit (machine) knowledge. The tacit lies in practice and the explicit in the formulation of the processes and content of this practice. However, when a human communicates with another human face to face, we may describe them as communicating aspects of the tacit and explicit dimensions of their knowledge, i.e. the expression and its background of meaning for the particular situation. When this communication fails, some mediator (not necessarily a third person) is needed to provide a bridge for the particular discrepant aspects of the tacit and explicit dimensions to meet. This is achieved by making the tacit nature of the discrepancy in the communication explicit to both participants, such that they both understand the background to their discrepancy. Once they become aware of it, they can begin to resolve it.

In considering what has been termed here the cultural interface, i.e. communication across cultures, the paper explores the nature of discrepancies in communication and the means by which we can accommodate each other's differences, either via third-party help or between ourselves. The interaction of each person's tacit knowledge (background of practices) and their interpretation, set against this background, of the explicit (the utterance, silence, gesture) needs to find some mutual ground, involving their cultural self. The operation of mediation and negotiation is considered in this context.

3.
We are interested in the problem of associating messages with multimedia content for the purpose of identifying it. This problem can be addressed by a watermarking system that embeds the associated messages into the multimedia content (also called a Work). A drawback of watermarking is that the content is distorted during embedding. On the other hand, if we assume that the database is available, the problem can be addressed by a retrieval system. Although no undesirable distortion is introduced when a retrieval system is used, the overhead of searching large databases is fundamentally difficult (also known as the dimensionality curse). In this paper we present a novel framework that strikes a balance between watermarking and retrieval systems. Our framework avoids the dimensionality curse by introducing small distortions (a watermark) into the multimedia content. From another perspective, the framework improves watermarking performance, marked by a significant reduction in distortion, by introducing a search capability into the message detection stage. To prove the concept, we give an algorithm based on the proposed notion of active clustering.

4.
This exploratory study aims to achieve a better understanding of the user-related factors that affect the choice of routes in public transport (PT). We also look at what can motivate route and mode changes towards alternatives in a real situation. We investigated the experience of 19 PT users, using the critical incident technique (Flanagan in Psychol Bull 51(4):327, 1954). We asked participants to report incidents (i.e. situations) in which they were very satisfied or dissatisfied with their choice. For both situations, the case of their usual route and the case of an alternative were considered. A total of 91 incidents were collected and analysed using a multiple correspondence analysis. Additionally, users' profiles were characterized and superimposed on the analysis of incident content. The main results are as follows. First, the user's choice of PT route depends on the context (i.e. aim of the travel, time of day). Second, taking an alternative to the usual PT route or using a route combining different transport modes is determined by the context and by factors related to the pleasantness of the travel (e.g. accompanying a friend along the way). Finally, depending on the user's profile (i.e. a combination of attitude towards PT and demographic variables), the factors taken into account in the choice of a PT route relate to the efficiency or the pleasantness of the trip. These results show the importance of contextual factors and users' profiles in route choice. They suggest that these factors should be taken into account in new tools and services for mobility.

5.
This paper shows how to analytically calculate the statistical properties of the errors in estimated parameters. The basic tools to achieve this aim include first-order approximation/perturbation techniques, such as matrix perturbation theory and Taylor series. The analysis applies to a general class of parameter estimation problems that can be abstracted as a linear (or linearized) homogeneous equation. Of course, there may be many reasons why one might wish to have such estimates. Here, we concentrate on the situation where one might use the estimated parameters to carry out some further statistical fitting or (optimal) refinement. To make the problem concrete, we take homography estimation as a specific problem. In particular, we show how the derived statistical errors in the homography coefficients allow improved approaches to refining these coefficients through subspace-constrained homography estimation (Chen and Suter in Int. J. Comput. Vis. 2008). Indeed, having derived the statistical properties of the errors in the homography coefficients before subspace-constrained refinement, we do two things: we verify their correctness through statistical simulations, and we show how to use knowledge of the errors to improve the subspace-based refinement stage. Comparison with the straightforward subspace refinement approach (which does not take the statistical properties of the homography coefficients into account) shows that our statistical characterization of these errors is both correct and useful.
Pei Chen
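The first-order (Taylor-series) error propagation that underlies item 5 can be sketched generically. A minimal illustration, not the paper's homography-specific derivation: the input covariance is pushed through a differentiable estimator via Cov_out ≈ J · Cov_in · Jᵀ, with the Jacobian J estimated by finite differences. The function, evaluation point, and noise levels below are made up for the example.

```python
# First-order propagation of input noise covariance through a differentiable
# estimator: Cov_out ~= J @ Cov_in @ J.T, with J from finite differences.
# Generic illustration only; parameters are hypothetical.

def numerical_jacobian(f, x, h=1e-6):
    """Finite-difference Jacobian of a vector-valued f at point x."""
    fx = f(x)
    return [
        [(f([xk + (h if k == j else 0.0) for k, xk in enumerate(x)])[i] - fx[i]) / h
         for j in range(len(x))]
        for i in range(len(fx))
    ]

def propagate_cov(f, x, cov_in):
    """Approximate output covariance of f(x) under input covariance cov_in."""
    J = numerical_jacobian(f, x)
    n, m = len(J), len(x)
    # J @ cov_in @ J.T, written out without external libraries
    JC = [[sum(J[i][k] * cov_in[k][j] for k in range(m)) for j in range(m)] for i in range(n)]
    return [[sum(JC[i][k] * J[j][k] for k in range(m)) for j in range(n)] for i in range(n)]

# scalar example: f(x, y) = x*y at (2, 3) with independent input noise;
# analytically Var_out = 3^2 * 0.01 + 2^2 * 0.04 = 0.25
cov = propagate_cov(lambda v: [v[0] * v[1]], [2.0, 3.0], [[0.01, 0.0], [0.0, 0.04]])
```

The same J · Cov · Jᵀ pattern is what a matrix-perturbation analysis makes exact to first order.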

6.
The problem of detecting an anomaly (or abnormal event) is one in which the distribution of observations differs before and after an unknown onset time, and the objective is to detect the change by statistically matching the observed pattern with that predicted by a model. In the context of asymmetric threats (tactics employed by countries, terrorist groups, or individuals to attack a superior opponent while avoiding direct confrontation), the detection of an abnormal situation refers to the discovery of suspicious activities of a hostile nation or group from noisy, scattered, and partial intelligence data. The problem becomes complex in a low signal-to-noise-ratio environment, such as asymmetric threats, because the "signal" observations are far fewer than the "noise" observations; furthermore, the signal observations are "hidden" in the noise. In this paper, we illustrate the capabilities of hidden Markov models (HMMs), combined with feature-aided tracking, for the detection of asymmetric threats. A transaction-based probabilistic model is proposed to combine HMMs and feature-aided tracking. A procedure analogous to Page's test is used for the quickest detection of abnormal events. The simulation results show that our method detects the modeled pattern of an asymmetric threat with high performance compared to a maximum-likelihood-based data mining technique. Performance analysis shows that detection improves as the complexity of the HMMs (i.e., the number of states in an HMM) increases.
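Page's test, mentioned in item 6, admits a compact sketch. A minimal illustration assuming Gaussian pre- and post-change observations with known means; the data, parameters, and threshold are hypothetical, and the paper's transaction-based HMM likelihoods are not modeled here.

```python
# Page's test (CUSUM) for quickest detection of a change in mean.
# All parameters are hypothetical; the paper combines a test like this
# with HMM-based, transaction-level likelihoods.

def pages_test(observations, mu0, mu1, sigma, threshold):
    """Return the first index at which the CUSUM statistic crosses threshold, else None."""
    s = 0.0
    for t, x in enumerate(observations):
        # log-likelihood ratio of the post-change vs. pre-change Gaussian
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        s = max(0.0, s + llr)  # Page's recursion: the statistic resets at zero
        if s >= threshold:
            return t
    return None

# noiseless toy data: mean 0 before index 50, mean 2 afterwards
data = [0.0] * 50 + [2.0] * 20
alarm = pages_test(data, mu0=0.0, mu1=2.0, sigma=1.0, threshold=5.0)
```

The reset-at-zero recursion is what makes the test "quickest": evidence against the pre-change model accumulates, while evidence for it is discarded.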

7.
This paper deals with metamorphic viruses. More precisely, it examines the use of advanced code obfuscation techniques with respect to metamorphic viruses. Our objective is to evaluate the difficulty of reliable static detection of viruses that use such obfuscation techniques. Here we extend Spinellis' result (IEEE Trans. Inform. Theory, 49(1), 280–284, 2003) on the detection complexity of bounded-length polymorphic viruses to metamorphic viruses. In particular, we prove that reliable static detection of a particular category of metamorphic viruses is an NP-complete problem. We then empirically illustrate our result by constructing a practical obfuscator which could be used by metamorphic viruses in the future to evade detection.

8.
The problem of inconsistency between constraints often arises in practice as a result of, among other things, the complexity of real models or unrealistic requirements and preferences. To overcome such inconsistency, two major actions may be taken: removal of constraints or changes in the coefficients of the model. This latter approach, which can be generically described as model correction, is the problem we address in this paper in the context of linear constraints over the reals. Correction of the right-hand side alone, which is very close to a fuzzy-constraints approach, was one of the first proposals to deal with inconsistency, as it may be mapped into a linear problem. Correction of both the matrix of coefficients and the right-hand side introduces nonlinearity into the constraints. The degree of difficulty in solving the optimal correction problem depends on the objective function, whose purpose is to measure the closeness between the original and corrected models. Unlike other norms, which yield corrections with quite rigid patterns, optimization of the important Frobenius norm was still an open problem. We have analyzed the problem using the KKT conditions and derived necessary and sufficient conditions which enabled us to unequivocally characterize local optima in terms of the solution of the Total Least Squares problem and the set of active constraints. These conditions justify a set of pruning rules, which proved, in preliminary experimental results, quite successful in a tree-search procedure for determining the global minimizer.

9.
In recent years, Total Variation minimization has become a popular and valuable technique for the restoration of noisy and blurred images. In this paper, we present a new technique for image restoration based on Total Variation minimization and the discrepancy principle. The new approach replaces the original image restoration problem with an equality-constrained minimization problem. An inexact Newton method is applied to the first-order conditions of the constrained problem. The stopping criterion is derived from the discrepancy principle. Numerical results on image denoising and image deblurring test problems are presented to illustrate the effectiveness of the new approach.
G. Landi
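The Total Variation idea behind item 9 can be illustrated in one dimension. A toy sketch using plain gradient descent on a smoothed TV term; the paper itself applies an inexact Newton method to the first-order conditions of an equality-constrained problem, and every parameter below is made up.

```python
# 1-D total-variation denoising: minimize 0.5*||u - f||^2 + lam * TV_eps(u)
# by gradient descent, where TV_eps is a smoothed total variation.
# Illustrative toy only; not the paper's inexact-Newton scheme.
import math

def tv_denoise(f, lam=0.05, eps=1e-3, step=0.1, iters=500):
    u = list(f)
    n = len(u)
    for _ in range(iters):
        grad = [u[i] - f[i] for i in range(n)]        # data-fidelity gradient
        for i in range(n - 1):
            d = u[i + 1] - u[i]
            g = d / math.sqrt(d * d + eps)            # derivative of smoothed |u[i+1]-u[i]|
            grad[i] -= lam * g
            grad[i + 1] += lam * g
        u = [u[i] - step * grad[i] for i in range(n)]
    return u

# a noisy step signal: small oscillations around 0, then around 1
noisy = [0.0, 0.1, -0.1, 0.05, 1.0, 0.9, 1.1, 0.95]
smooth = tv_denoise(noisy)
```

The result trades a small departure from the data for a strictly smaller total variation, which is exactly the mechanism that removes noise while preserving the jump.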

10.
In the classical synthesis problem, we are given a specification ψ over sets of input and output signals, and we synthesize a finite-state transducer that realizes ψ: with every sequence of input signals, the transducer associates a sequence of output signals so that the generated computation satisfies ψ. In recent years, researchers have considered extensions of the classical Boolean setting to a multi-valued one. We study a multi-valued setting in which the truth values of the input and output signals are taken from a finite lattice, and so is the satisfaction value of specifications. We consider specifications in latticed linear temporal logic (LLTL). In LLTL, conjunctions and disjunctions correspond to the meet and join operators of the lattice, respectively, and the satisfaction values of formulas are taken from the lattice too. The lattice setting arises in practice, for example in specifications involving priorities or in systems with inconsistent viewpoints. We solve the LLTL synthesis problem, where the goal is to synthesize a transducer that realizes the given specification with a desired satisfaction value. For the classical synthesis problem, researchers have studied a setting with incomplete information, where the truth values of some of the input signals are hidden and the transducer should nevertheless realize ψ. For the multi-valued setting, we introduce and study a new type of incomplete information, where the truth values of some of the input signals may be noisy, and the transducer should still realize ψ with the desired satisfaction value. We study the problem of noisy LLTL synthesis, as well as theoretical aspects of the setting, such as the amount of noise a transducer may tolerate and the effect of perturbing input signals on the satisfaction value of a specification. We prove that the noisy-synthesis problem for LLTL is 2EXPTIME-complete, as is traditional LTL synthesis.

11.
Robust error management within the cockpit is crucial to aviation safety. Crew resource management (CRM) focuses on non-technical skills for error management but the training of technical skills for error detection and error recovery is also a potentially valuable strategy. We propose a theoretical basis for training technical skills in error management as well as a cognitively oriented technique for analysing accidents and incidents to identify specific training requirements. To evaluate the strengths and limitations of this new approach, we present a case study of its application to the F-111, a strike aircraft in the Royal Australian Air Force. This case study demonstrates that the new training approach is both feasible and useful, although an empirical validation of the approach is still necessary. In addition, the case study highlights the limitations of the current F-111 simulator for training technical skills for error detection and error recovery.
Neelam Naikar

12.
It has been shown that global predicate detection in a distributed computation is an NP-complete problem in general. However, efficient predicate detection algorithms exist for some subclasses of predicates, such as stable predicates, observer-independent predicates, conjunctions of local predicates, channel predicates, etc. We show here that the problem of deciding whether a given predicate is a member of any of these tractable subclasses is NP-hard in general. We also explore the tractability of linear and regular predicates. In particular, we show that, unless RP = NP, there is no polynomial-time algorithm for detecting linear and regular predicates.

13.
A minimally unsatisfiable subformula (MUS) is a subset of clauses of a given CNF formula which is unsatisfiable but becomes satisfiable as soon as any of its clauses is removed. The selection of a MUS is of great relevance in many practical applications. This especially holds when the propositional formula encoding the application is required to have a well-defined satisfiability property (either to be satisfiable or to be unsatisfiable). While selection of a MUS is a hard problem in general, we show classes of formulae where this problem can be solved efficiently. This is done by using a variant of Farkas' lemma and solving a linear programming problem. Successful results on real-world contradiction detection problems are presented.
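The MUS definition in item 13 maps directly onto a deletion-based extraction loop. A minimal sketch in which a brute-force satisfiability check stands in for the paper's Farkas-lemma/linear-programming machinery; the clause set at the end is a made-up example.

```python
# Deletion-based extraction of a minimally unsatisfiable subformula (MUS).
# A brute-force SAT check replaces the paper's LP-based machinery.
from itertools import product

def satisfiable(clauses, n_vars):
    """Clauses are lists of nonzero ints; literal v means var v is true, -v false."""
    for assignment in product([False, True], repeat=n_vars):
        if all(any((lit > 0) == assignment[abs(lit) - 1] for lit in c) for c in clauses):
            return True
    return False

def mus(clauses, n_vars):
    """Try removing each clause; keep it only if the rest becomes satisfiable without it."""
    core = list(clauses)
    for c in list(core):
        trial = [d for d in core if d is not c]
        if not satisfiable(trial, n_vars):
            core = trial  # c was not needed to keep the formula unsatisfiable
    return core

# (x) AND (not x) AND (x OR y): the first two clauses form the MUS
core = mus([[1], [-1], [1, 2]], n_vars=2)
```

Every clause that survives is necessary for unsatisfiability, which is exactly the minimality condition in the definition above; real tools replace the exponential `satisfiable` with a SAT solver or, as in the paper, with LP reasoning.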

14.
Lane, Terran; Brodley, Carla E. Machine Learning (2003) 51(1):73–107
This paper introduces the computer security domain of anomaly detection and formulates it as a machine learning task on temporal sequence data. In this domain, the goal is to develop a model or profile of the normal working state of a system user and to detect anomalous conditions as long-term deviations from the expected behavior patterns. We introduce two approaches to this problem: one employing instance-based learning (IBL) and the other using hidden Markov models (HMMs). Though not suitable for a comprehensive security solution, both approaches achieve anomaly identification performance sufficient for a low-level focus-of-attention detector in a multitier security system. Further, we evaluate model scaling techniques for the two approaches: two clustering techniques for the IBL approach and variation of the number of hidden states for the HMM approach. We find that over both model classes and a wide range of model scales, there is no significant difference in performance at recognizing the profiled user. We take this invariance as evidence that, in this security domain, limited-memory models (e.g., fixed-length instances or low-order Markov models) can learn only part of the user identity information in which we're interested, and that substantially different models will be necessary if dramatic improvements in user-based anomaly detection are to be achieved.

15.
In this paper, we address and solve the problem of anti-windup augmentation for linear systems with input and output delay. In particular, we give a formal definition of an optimal gain-based anti-windup design problem in the global, local, robust and nominal cases. For each of these cases we show that a specific anti-windup compensation structure (which is a generalization of the approach in the Proceedings of the Fourth ECC, Brussels, Belgium, July 1997) is capable of solving the anti-windup problem whenever it is solvable. The effectiveness of the proposed scheme is shown on a simple example taken from the literature, in which the plant is a marginally stable linear system.

16.
In this paper we describe an approach in which Experimental Design Theory (EDT) (see Montgomery and Wiley 1984; Kiefer and Wolfowitz 1959; Fedorov 1972) is used as a tool for building approximate analysis models to be applied in structural optimization problems. This theory was developed for the planning and analysis of comprehensive physical experiments, in order to reduce the number of required experiments while preserving the amount of information that can be extracted from them. This situation is very similar to that of structural optimization, where the number of expensive finite element (FEM) analyses has to be minimized (Schoofs 1987). FEM computations can be regarded as numerical experiments, where the design variables are treated as input quantities. All computable properties of the structure, such as weight, displacements, stresses, etc., can be regarded as response quantities of the numerical experiment. Approximating models are derived for these responses by using regression techniques, and they can be substituted into the optimization problem for the definition of the objective and constraint functions. The application of the proposed method is illustrated with two case studies.

Presented at NATO ASI Optimization of Large Structural Systems, Berchtesgaden, Sept. 23 – Oct. 4, 1991
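The regression step in item 16 — fitting an approximate response model to a handful of "expensive" analyses — can be sketched as ordinary least squares. A toy illustration with a one-variable quadratic response surface; the sample points and true response are made up, and no FEM code is involved.

```python
# Fit a quadratic response-surface surrogate y ~= a + b*x + c*x^2 by ordinary
# least squares (normal equations + Gaussian elimination). Toy stand-in for
# the EDT/regression workflow; the sampled "response" is hypothetical.

def fit_quadratic(xs, ys):
    """Least-squares coefficients (a, b, c) for the basis [1, x, x^2]."""
    rows = [[1.0, x, x * x] for x in xs]
    # normal equations: (A^T A) coeffs = A^T y
    ATA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    ATy = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the augmented matrix
    M = [ATA[i] + [ATy[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coeffs[i] = (M[i][3] - sum(M[i][j] * coeffs[j] for j in range(i + 1, 3))) / M[i][i]
    return coeffs

# five "expensive analyses" sampled from a known quadratic response 1 + 2x + 3x^2
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [1 + 2 * x + 3 * x * x for x in xs]
a, b, c = fit_quadratic(xs, ys)
```

Once fitted, the cheap polynomial replaces the expensive analysis inside the optimizer's objective and constraint evaluations, which is the substitution the abstract describes.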

17.
Clinical incidents, which occur during the provision of health care, can be costly and deadly. Over three-quarters of these incidents are preventable, according to studies of general practice in Australia (Bhasale, A., Miller, G., Reid, S., & Britt, H. (1998). Analysing potential harm in Australian general practice: an incident-monitoring study. MJA, 169, 73–76). It is important that we learn as much as possible from these incidents to prevent them in the future and improve the quality of care. This paper introduces a holistic system, which amalgamates case-based reasoning, rule-based reasoning, causal-based reasoning and an ontological knowledge base, for managing clinical incidents in general practice. Clinical incident management includes incident analysis, incident case browsing, statistics and explanation. The system enables health professionals to share information about medical incidents which have caused harm or could cause potential harm. The re-use of such information may prevent or mitigate human or medical errors. Such a hybrid approach provides effective management of adverse clinical incidents for quality improvement in general practice.

18.
In this paper we discuss a view of the Machine Learning technique called Explanation-Based Learning (EBL), or Explanation-Based Generalization (EBG), as a process for the interpretation of vague concepts in logic-based models of law.

The open-textured nature of legal terms is a well-known open problem in the building of knowledge-based legal systems. EBG is a technique which creates generalizations of given examples on the basis of background domain knowledge. We relate these two topics by considering EBG's domain knowledge as corresponding to statute law rules, and EBG's training example as corresponding to a precedent case.

By letting precedent cases guide the interpretation of vague predicates, we use EBG as an effective process capable of creating a link between predicates appearing as open-textured concepts in law rules, and predicates appearing as ordinary-language wording for stating the facts of a case.

Standard EBG algorithms do not change the deductive closure of the domain theory. In the legal context, this is adequate only when concepts vaguely defined in some law rules can be reformulated in terms of other concepts more precisely defined in other rules. We call theory reformulation the process adopted in this situation of complete knowledge.

In many cases, however, statutory law leaves some concepts completely undefined. We then propose extensions to the EBG standard that deal with this situation of incomplete knowledge, and call theory revision the extended process. In order to fill in knowledge gaps we consider precedent cases supplemented by additional heuristic information. The extensions proposed treat heuristics represented by abstraction hierarchies with constraints and exceptions.

In the paper we also precisely characterize the distinction between theory reformulation and theory revision by stating formal definitions and results, in the context of Logic Programming theory. We offer this proposal as a possible contribution to cross-fertilization between machine learning and legal reasoning methods.

19.
Comments on "Data Mining Static Code Attributes to Learn Defect Predictors"
In this correspondence, we point out a discrepancy in a recent paper, "Data Mining Static Code Attributes to Learn Defect Predictors," that was published in this journal. Because of the small percentage of defective modules, using probability of detection (pd) and probability of false alarm (pf) as accuracy measures may lead to impractical prediction models.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号