Similar Documents
A total of 20 similar documents were found (search time: 62 ms).
1.
Automating software testing activities can increase the quality and drastically decrease the cost of software development. To this end, various automated test data generation tools have been developed. The majority of existing tools aim at structural testing, while quite a limited number aim at a higher level of testing thoroughness, such as mutation. In this paper, an approach to automating the generation of mutation-based test cases by utilizing existing automated tools is proposed. This is achieved by reducing the mutant-killing problem to a branch-coverage one. Accordingly, this paper is motivated by the use of state-of-the-art techniques and tools suitable for covering program branches when performing mutation. Tools and techniques such as symbolic execution, concolic execution, and evolutionary testing can easily be adapted to automate the test input generation activity for the weak mutation testing criterion by simply utilizing a special form of the mutant schemata technique. The propositions made in this paper integrate three automated tools in order to illustrate and examine the method’s feasibility and effectiveness. The obtained results, based on a set of Java program units, indicate the applicability and effectiveness of the suggested technique. The results suggest that the proposed approach is able to guide existing automated tools in producing test cases according to the weak mutation testing criterion. Additionally, experimental results with the proposed mutation testing regime show that weak mutation is able to speed up mutant execution time by a factor of at least 4.79 when compared with strong mutation.
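To make the reduction concrete, here is a minimal Java sketch (class and method names are illustrative, not taken from the paper's tooling): a mutant schema embeds one arithmetic-operator-replacement mutant as an ordinary guard, so any input that drives execution through the guard's true branch weakly kills the mutant, and an off-the-shelf branch-coverage generator can target it directly.

```java
// Hypothetical sketch of the mutant-schemata reduction: weakly killing the
// mutant "a - b" of the original expression "a + b" is equivalent to
// covering the true branch of the guard below, which symbolic, concolic,
// or evolutionary branch-coverage tools can aim at.
public class SchemataSketch {

    // Original unit under test: returns a + b.
    static int add(int a, int b) {
        return a + b;
    }

    // Schema embedding the mutant. Reaching the guard's true side with some
    // input weakly kills the mutant (original and mutant states differ).
    static int addSchema(int a, int b) {
        if (a + b != a - b) {
            System.out.println("mutant weakly killed for a=" + a + ", b=" + b);
        }
        return a + b;
    }

    public static void main(String[] args) {
        addSchema(1, 0);  // a+b == a-b: branch not covered, mutant survives
        addSchema(1, 2);  // a+b != a-b: branch covered, mutant weakly killed
    }
}
```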

2.
3.
This paper addresses the problem of planning the movement of highly redundant humanoid robots based on non-linear attractor dynamics, where the attractor landscape is obtained by combining multiple force fields in different reference systems. The computational process of relaxation in the attractor landscape is similar to coordinating the movements of a puppet by means of attached strings, the strings in our case being the virtual force fields generated by the intended/attended goal and the other task-dependent combinations of constraints involved in the execution of the task. Hence the name PMP (Passive Motion Paradigm) was given to the computational model. The method does not require explicit kinematic inversion, and the computational mechanism does not crash near kinematic singularities or when the robot is asked to achieve a final pose that is outside its intrinsic workspace: what happens in this case is the gentle degradation of performance that characterizes humans in the same situations. Further, the measure of inconsistency in the relaxation in such cases can be directly used to trigger higher-level reasoning, breaking the goal into a sequence of subgoals directed towards searching for and perhaps using tools to realize the otherwise unrealizable goal. The basic PMP model is further expanded in the present paper by means of (1) a non-linear dynamical timing mechanism that provides terminal attractor properties to the relaxation process and (2) branching units that allow one to ‘compose’ complex PMP-networks to coordinate multiple kinematic chains in a complex structure, including manipulated tools. A preliminary evaluation of the approach has been carried out with the 53-degree-of-freedom humanoid robot iCub, with particular reference to trajectory formation and bimanual/whole upper body coordination in the presence of different structural and task-specific constraints.
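As a rough illustration of relaxation in a force field without kinematic inversion (a toy planar 2-link arm with made-up link lengths and gains, not the iCub model), the sketch below lets a virtual attractive force generated by the goal pull the end effector, mapping the force to joint space through the Jacobian transpose:

```java
// Minimal force-field relaxation sketch in the spirit of PMP: the goal
// exerts a virtual force on the end effector; the Jacobian transpose maps
// it to joint space, so no kinematic inversion is ever computed.
public class PmpSketch {
    static final double L1 = 0.3, L2 = 0.25;  // link lengths [m] (assumed)
    static final double K = 1.0;              // field stiffness (assumed)
    static final double DT = 0.01;            // integration step [s]

    public static void main(String[] args) {
        double q1 = 0.1, q2 = 0.2;            // joint angles [rad]
        double tx = 0.3, ty = 0.3;            // goal in task space

        for (int step = 0; step < 2000; step++) {
            // Forward kinematics of the end effector.
            double x = L1 * Math.cos(q1) + L2 * Math.cos(q1 + q2);
            double y = L1 * Math.sin(q1) + L2 * Math.sin(q1 + q2);

            // Virtual attractive force field generated by the goal.
            double fx = K * (tx - x), fy = K * (ty - y);

            // Jacobian transpose maps the task-space force to joint torques.
            double j11 = -L1 * Math.sin(q1) - L2 * Math.sin(q1 + q2);
            double j12 = -L2 * Math.sin(q1 + q2);
            double j21 =  L1 * Math.cos(q1) + L2 * Math.cos(q1 + q2);
            double j22 =  L2 * Math.cos(q1 + q2);
            double tau1 = j11 * fx + j21 * fy;
            double tau2 = j12 * fx + j22 * fy;

            // Passive relaxation: joints drift along the torque direction.
            q1 += DT * tau1;
            q2 += DT * tau2;
        }
        System.out.printf("final joint angles: q1=%.3f, q2=%.3f%n", q1, q2);
    }
}
```

If the goal lies outside the reachable workspace, this relaxation does not crash: the arm simply settles at the boundary pose closest to the goal, with a residual force that measures the inconsistency.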

4.
It is becoming more and more widely recognized that children should be involved in a product’s design and evaluation process. Many findings report on the methodology of usability research with children; however, there has been relatively little analysis of likeability research with children. In this paper, we propose the laddering method—traditionally a marketing method used among adults—for likeability research in the domain of child–computer interaction. Three exploratory cases are described, reporting on the use of the laddering method with children aged between 7 and 16 to evaluate the likeability of two games. The lessons learnt about the use of the laddering method are discussed in detail. In order to adapt the laddering method to work with children, we recommend a variation of this method, which we call the ‘contextual laddering method’.

5.
A general-purpose object-oriented fatigue tool set has been designed and implemented that can serve not only as a stand-alone code for preliminary design studies, but also as a foundation for highly complex industrial ‘in-house’ fatigue codes. Due to their programming structure, these tools may easily be modified to include additional fatigue prediction methods. Three component libraries have been created to address three topics in fatigue analysis: (1) fatigue material property definition; (2) basic fatigue calculations; and (3) cumulative damage calculations. The initial programming framework has been supplemented, demonstrating the expandability of the libraries. The component libraries have been incorporated into three programs to verify their capabilities and demonstrate their use.
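As one concrete example of a cumulative damage calculation such a library could host, the sketch below implements the classical Palmgren–Miner linear damage rule (the class and method names are illustrative, not the paper's actual API):

```java
// Palmgren-Miner linear damage rule: damage D = sum(n_i / N_i) over loading
// blocks, where n_i is the number of applied cycles at stress level i and
// N_i is the number of cycles to failure at that level; D >= 1 predicts
// failure. A library structured this way can swap in other damage models.
public class MinerRule {

    static double cumulativeDamage(double[] appliedCycles, double[] cyclesToFailure) {
        double damage = 0.0;
        for (int i = 0; i < appliedCycles.length; i++) {
            damage += appliedCycles[i] / cyclesToFailure[i];
        }
        return damage;
    }

    public static void main(String[] args) {
        double[] n = {1e4, 5e4};   // applied cycles per stress level (assumed)
        double[] N = {1e5, 2e5};   // cycles to failure per stress level (assumed)
        double d = cumulativeDamage(n, N);  // 0.1 + 0.25 = 0.35
        System.out.printf("damage = %.2f -> %s%n", d,
                d >= 1.0 ? "failure predicted" : "survives");
    }
}
```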

6.
Discrete event simulators are important scientific tools, and their efficient design and execution is the subject of much research. In this paper, we propose a new approach for constructing simulators that leverages virtual machines and combines advantages from the traditional systems-based and language-based simulator designs. We introduce JiST, a Java-based simulation system that executes discrete event simulations both efficiently and transparently by embedding simulation semantics directly into the Java execution model. The system provides the standard benefits that the modern Java runtime affords. In addition, JiST is efficient, out-performing existing highly optimized simulation runtimes. As a case study, we illustrate the practicality of the JiST framework by applying it to the construction of SWANS, a scalable wireless ad hoc network simulator. We simulate million-node wireless networks, which represents an increase of two orders of magnitude in scale over what existing simulators can achieve on equivalent hardware at the same level of detail. Copyright © 2005 John Wiley & Sons, Ltd.
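For readers unfamiliar with the underlying execution model, the following is a generic, minimal discrete event simulator in Java: a virtual clock driven by an event queue ordered by timestamp. It illustrates the general concept only; it is not JiST's actual API.

```java
import java.util.PriorityQueue;

// Minimal discrete event simulation kernel: events carry a timestamp and an
// action; the loop repeatedly pops the earliest event, advances the clock
// to its time, and runs it. Systems like the one described above embed this
// semantics into the language runtime instead of a library loop.
public class MiniSim {
    record Event(double time, Runnable action) {}

    private final PriorityQueue<Event> queue =
            new PriorityQueue<>((a, b) -> Double.compare(a.time(), b.time()));
    private double now = 0.0;

    void schedule(double delay, Runnable action) {
        queue.add(new Event(now + delay, action));
    }

    void run() {
        while (!queue.isEmpty()) {
            Event e = queue.poll();
            now = e.time();     // advance the simulation clock
            e.action().run();   // execute the event handler
        }
    }

    public static void main(String[] args) {
        MiniSim sim = new MiniSim();
        sim.schedule(1.0, () -> System.out.println("node A sends at t=1.0"));
        sim.schedule(0.5, () -> System.out.println("node B wakes at t=0.5"));
        sim.run();  // prints the events in timestamp order
    }
}
```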

7.
A new technique is presented for searching digital audio at the word/phrase level. Unlike previous methods based upon Large Vocabulary Continuous Speech Recognition (LVCSR, with its inherent problems of closed vocabulary and high word error rate), phonetic searching combines high speed and accuracy, supports open vocabulary, imposes a low penalty for new words, permits phonetic and inexact spelling, enables a user-determined depth of search, and is amenable to parallel execution for highly scalable deployment. A detailed comparison of accuracy between phonetic searching and one popular embodiment of LVCSR is presented, along with other operating characteristics of the new technique. The current implementation for Digital Media Asset Management (DMAM) is described, along with suggested applications in other domains.
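A toy illustration of the word-spotting idea follows: the query is rendered as a phoneme sequence and matched approximately against the decoded phoneme stream. The phoneme symbols, the fixed-length sliding window, and the tolerance threshold are all assumptions made for the example; this is not the paper's algorithm.

```java
import java.util.Arrays;

// Approximate word spotting over a phoneme stream: slide the query across
// the stream and report windows within an edit-distance tolerance, which is
// what lets misspellings and new words still be found.
public class PhoneticSearchSketch {

    // Levenshtein distance between two phoneme sequences.
    static int editDistance(String[] a, String[] b) {
        int[][] d = new int[a.length + 1][b.length + 1];
        for (int i = 0; i <= a.length; i++) d[i][0] = i;
        for (int j = 0; j <= b.length; j++) d[0][j] = j;
        for (int i = 1; i <= a.length; i++)
            for (int j = 1; j <= b.length; j++)
                d[i][j] = Math.min(
                        Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                        d[i - 1][j - 1] + (a[i - 1].equals(b[j - 1]) ? 0 : 1));
        return d[a.length][b.length];
    }

    static void spot(String[] stream, String[] query, int tolerance) {
        for (int i = 0; i + query.length <= stream.length; i++) {
            String[] window = Arrays.copyOfRange(stream, i, i + query.length);
            if (editDistance(window, query) <= tolerance)
                System.out.println("hit at phoneme offset " + i);
        }
    }

    public static void main(String[] args) {
        String[] stream = {"DH", "AH", "K", "AE", "T", "S", "AE", "T"}; // "the cat sat"
        String[] query  = {"K", "AE", "T"};                             // "cat"
        spot(stream, query, 1);  // tolerate one phoneme substitution
    }
}
```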

8.
Retrieval of relevant unstructured information from the ever-increasing textual communications of individuals and businesses has become a major barrier to effective litigation/defense, mergers/acquisitions, and regulatory compliance. Such e-discovery requires simultaneously high precision and high recall (high-P/R) and is therefore a prototype for many legal reasoning tasks. The requisite exhaustive information retrieval (IR) system must employ very different techniques than those applicable in the hyper-precise consumer search task, where insignificant recall is the accepted norm. We apply Russell et al.’s cognitive task analysis of sensemaking by intelligence analysts to develop a semi-autonomous system that achieves high IR accuracy of F1 ≥ 0.8, compared to the F1 < 0.4 typical of computer-assisted human-assessment (CAHA) or alternative approaches such as Roitblat et al.’s. By understanding the ‘Learning Loop Complexes’ of lawyers engaged in successful small-scale document review, we have used socio-technical design principles to create roles, processes, and technologies for scalable human-assisted computer-assessment (HACA). Results from the NIST-TREC Legal Track’s interactive task from both 2008 and 2009 validate the efficacy of this sensemaking approach to the high-P/R IR task.
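For reference, the F1 figures quoted above combine precision $P$ and recall $R$ as their harmonic mean, so a high score requires both to be high simultaneously:

```latex
F_1 = \frac{2PR}{P + R}
```

For instance, $P = R = 0.8$ yields $F_1 = 0.8$, whereas a hyper-precise system with $P = 0.9$ but $R = 0.25$ yields only $F_1 \approx 0.39$.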

9.
This work presents a novel interleaving crescent broadcasting protocol for near video-on-demand service. The interleaving crescent broadcasting protocol is a trade-off among the subscriber’s access latency, the maximum buffer requirement, the required subscriber bandwidth, and the maximum disk I/O transfer rate. A long access latency may cause a subscriber to leave, while a lower maximum buffer requirement, a lower required subscriber bandwidth, and a lower maximum disk I/O transfer rate reduce subscribers’ costs. The interleaving crescent broadcasting protocol not only shortens access latency but also lowers the overall system cost. We prove the correctness of the interleaving crescent protocol and provide mathematical analyses to demonstrate its efficiency.

10.
Classic task models for real-time systems focus on execution windows expressing the earliest start times and deadlines of tasks for feasibility. Only within these windows is the execution of tasks feasible, and within them it is considered to be of uniform utility. Some tasks, however, have target demands in addition: a task should preferably execute at a specific target point within its execution window, but can execute around this point, albeit at lower utility. Examples of such applications include control and media processing. In this paper, we present a task model based on a gravitational analogy to address these issues. Tasks are considered as massive bobs hanging on a pendulum: a single task, left to itself, will execute at the bottom, the target point. If a force, such as the weight of other tasks, is applied, it can be shifted around this point. Thus, tasks’ importance and their utility around target points can be expressed. Since the execution of a task cannot be mapped to a single point in time, the model allows each task to designate an arbitrary point within its execution to represent the whole execution; this point is called the anchor point. Moreover, we show an example of a scheduling algorithm for this model which finds an approximation to the best compromise of tasks’ interests based on the equilibrium state of a pendulum. Nonetheless, the task model is not restricted to a particular scheduling algorithm. Results from a simulation study show the effectiveness of the approach.
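One purely illustrative reading of the pendulum equilibrium (not necessarily the paper's exact formulation): if each task $i$ pulls the shared execution point toward its target $t_i$ with a spring-like force weighted by its importance $w_i$, the equilibrium minimizes the weighted quadratic displacement and lands at the weighted mean of the targets:

```latex
t^{*} \;=\; \arg\min_{t}\, \sum_i w_i\,(t - t_i)^2 \;=\; \frac{\sum_i w_i\, t_i}{\sum_i w_i}
```

Heavier (more important) tasks thus shift the compromise toward their own target points, exactly as a heavier bob pulls the pendulum.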

11.
While terrorism informatics research has examined the technical composition of extremist media, there is less work examining the content and intent behind such media. We propose that the arguments and issues presented in extremist media provide insights into authors’ intent, which in turn may provide an evidence base for detecting and assessing risk. We explore this possibility by applying two quantitative text-analysis methods to 50 online texts that incite violence in response to the 2008/2009 Israeli military action in Gaza and the West Bank territories. The first method—a content coding system that identifies the occurrence of persuasive devices—revealed a predominance of moral proof arguments within the texts, and evidence for distinguishable ‘profiles’ of persuasion use across different authors and different group affiliations. The second method—a corpus-linguistic technique that identifies the core concepts and narratives that authors use—confirmed the use of moral proof to create an in-group/out-group divide, while also demonstrating a movement from general expressions of discontent to more direct, audience-orientated expressions of violence as the conflict heightened. We conclude that multi-method analyses are a valuable approach to building both an evidence-based understanding of terrorist media use and a valid set of applications within terrorism informatics.

12.
The growing need for high-performance embedded processors on the reconfigurable computing platform increases the pressure to develop design methods and tools. One important issue in mapping algorithms into hardware is configuring the algorithm to fit the particular hardware structure and the available area and configuration, together with the time parameters. This paper presents an overview of a new synthesis method—the Iso-plane method—on the polytope model of algorithms, which increases parallelism and facilitates configurability in regular array design via algebraic transformations such as associativity and commutativity. The paper presents a variety of new regular and scalable array solutions with improved performance and a better layout, including motherboards with daughter boards.

13.
We present a rich and highly dynamic technique for analyzing, visualizing, and exploring the execution traces of reactive systems. The two inputs are a designer’s inter-object scenario-based behavioral model, visually described using a UML2-compliant dialect of live sequence charts (LSC), and an execution trace of the system. Our method allows one to visualize, navigate through, and explore the activation and progress of the scenarios as they “come to life” during execution. Thus, a concrete system’s runtime is recorded and viewed through abstractions provided by the behavioral models used for its design, tying the visualization and exploration of system execution traces to model-driven engineering. We support both event-based and real-time-based tracing, and use details-on-demand mechanisms, multi-scaling grids, and gradient coloring methods. Novel model exploration techniques include semantics-based navigation, filtering, and trace comparison. The ideas are implemented and tested in a prototype tool called the Tracer.

14.
Automated verification tools vary widely in the types of properties they are able to analyze, the complexity of their algorithms, and the amount of necessary user involvement. In this paper, we propose a framework for step-wise automatic verification and describe a lightweight, scalable program analysis tool that combines abstraction and model checking. The tool guarantees that its True and False answers are sound with respect to the original system. We also assess the effectiveness of the tool on an implementation of the Safety-Injection System.

15.
This study takes a step further in the computation of the transition path of a continuous-time endogenous growth model discussed by Privileggi (Nonlinear dynamics in economics, finance and social sciences: essays in honour of John Barkley Rosser Jr., Springer, Berlin, Heidelberg, pp. 251–278, 2010)—based on the setting first introduced by Tsur and Zemel (J Econ Dyn Control 31:3459–3477, 2007)—in which knowledge evolves according to the Weitzman (Q J Econ 113:331–360, 1998) recombinant process. A projection method, based on the least squares of the residual function corresponding to the ODE defining the optimal policy of the ‘detrended’ model, allows for the numerical approximation of this policy for a positive-Lebesgue-measure range of values of the efficiency parameter characterizing the probability function of the recombinant process. Although the projection method’s performance rapidly degenerates as one departs from a benchmark value of the efficiency parameter, we are able to numerically compute time-path trajectories that are sufficiently regular to allow for sensitivity analysis under changes in parameter values.
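Schematically (with an illustrative basis and notation, not necessarily those of the paper), a least-squares projection method approximates the unknown policy by a finite expansion $\hat{g}(x;\theta)=\sum_{j} \theta_j \phi_j(x)$ over basis functions $\phi_j$ and chooses the coefficients $\theta$ to minimize the squared residual of the defining ODE, $\hat{g}'(x) = f\big(x, \hat{g}(x)\big)$, evaluated at a set of nodes $x_1,\dots,x_n$:

```latex
\min_{\theta}\; \sum_{i=1}^{n} R(x_i;\theta)^2,
\qquad
R(x;\theta) \;=\; \hat{g}'(x;\theta) - f\big(x,\hat{g}(x;\theta)\big)
```

A small residual at all nodes indicates that the approximated policy nearly satisfies the ODE throughout the domain, which is what justifies using it for sensitivity analysis.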

16.
The existence of good probabilistic models for the job arrival process and the delay components introduced at different stages of job processing in a Grid environment is important for an improved understanding of the Grid computing concept. In this study, we present a thorough analysis of the job arrival process in the EGEE infrastructure and of the time durations a job spends in different states in the EGEE environment. We define four delay components of the total job delay and model each component separately. We observe that the job inter-arrival times at the Grid level can be adequately modelled by a rounded exponential distribution, while the total job delay (from the time a job is generated until the time it completes execution) is dominated by the computing element’s (CE’s) register and queuing times and the worker node’s (WN’s) execution times. Further, we evaluate the efficiency of the EGEE environment by comparing its total job delay performance with that of a hypothetical ideal super-cluster, and conclude that we would obtain similar performance if we submitted the same workload to a super-cluster of size equal to 34% of the total average number of CPUs participating in the EGEE infrastructure. We also analyze the job inter-arrival times, the CE’s queuing times, the WN’s execution times, and the data sizes exchanged at the kallisto.hellasgrid.gr cluster, which is a node in the EGEE infrastructure. In contrast to the Grid level, we find that at the cluster level the job arrival process exhibits self-similarity/long-range dependence. Finally, we propose simple and intuitive models for the job arrival process and the execution times at the cluster level.
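For reference, the standard (memoryless) exponential model for inter-arrival times: if jobs arrive at rate $\lambda$, the time $T$ between consecutive arrivals has density and mean

```latex
f_T(t) \;=\; \lambda e^{-\lambda t}, \quad t \ge 0,
\qquad
\mathbb{E}[T] \;=\; \frac{1}{\lambda}
```

The 'rounded' variant presumably reflects timestamps recorded at finite resolution; self-similarity at the cluster level is precisely the kind of long-range correlation that such a memoryless model cannot capture.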

17.
In this article, the design of an intelligent robust controller for a micro-actuator is presented. The μ-actuator is composed of a micro-capacitor, one plate of which is clamped while the motion of the other, flexible plate is constrained by hinges acting as a combination of springs and dashpots. The distance between the plates is varied by the voltage applied across them. The dynamics of the plate’s rigid-body motion results in an unstable, nonlinear system. The control structure is composed of: (a) a feedforward controller, which stabilizes the micro-actuator around its nominal operating point; (b) a robust PID controller, with its gains tuned via Linear Matrix Inequalities (LMIs); and (c) an intelligent prefilter, which appropriately shapes the reference signal. The resulting overall control scheme is applied to the nonlinear model of the μ-actuator, and simulation results are presented to prove the efficacy of the suggested scheme.
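For context, the standard lumped-parameter model of a parallel-plate electrostatic micro-actuator (a textbook form, not necessarily the exact model used in the article) is nonlinear in both the gap and the voltage:

```latex
m\ddot{x} + b\dot{x} + kx \;=\; \frac{\varepsilon A V^{2}}{2\,(d - x)^{2}}
```

where $m$, $b$, and $k$ arise from the moving plate's mass and the hinges' spring/dashpot combination, $A$ is the plate area, $d$ the nominal gap, $V$ the applied voltage, and $\varepsilon$ the permittivity. The electrostatic force grows without bound as $x \to d$, which produces the classical pull-in instability that the stabilizing feedforward controller must counteract.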

18.
Patients are most at risk during transitions in care across settings and providers. The communication and reconciliation of an accurate medication list throughout the care continuum are essential to reducing transition-related adverse drug events. Most current research focuses on the outcomes of reconciliation interventions, not on the clinician’s perspective. We aimed to explore clinicians’ cognitive processes and heuristics for making sense of patients’ disease histories. We used the affinity diagram method to simulate real-life medication reconciliation with 24 clinicians. The participants were given paper cards with diseases and medications representing a real case from an anesthesiology department. The task was to sort the cards into a set that made sense to the clinician. The experiment was video-recorded, and the data were analyzed using a quantitative spatial analysis technique. Levene’s test for equality of variance showed that 79% of the 24 participants arranged the diseases along a straight line (p < 0.001). With only a few exceptions, the diseases were arranged along the line in a fixed order, from cardiac conditions to depression (Friedman’s χ2(44) = 291.9, p < 0.001). We learn from this study that although clinicians employ a variety of coping strategies while reconciling patients’ medical histories, there are common reconciliation strategies. Understanding the heuristics and mental models clinicians apply to the reconciliation process may help in developing and implementing methods and tools that promote safety research and practice.

19.
This article proposes a new mathematical definition of the execution of pure Prolog, in the form of axioms in a structural operational semantics. The main advantage of the model is the ease with which it represents backtracking, owing to the functionality of the transition relation and its converse; thus, both forward and backward derivation steps are possible. A novel concept of stages is introduced as a refinement of final states, which captures the evolution of a backtracking computation. An advantage over the traditional stack-of-stacks approaches is a modularity property. Finally, the model combines the intuition of the traditional ‘Byrd box’ metaphor with a compact representation of execution state, making it feasible to formulate and prove theorems about the model. In this paper, we introduce the model and state some useful properties.

20.
In this paper, we consider the definition of a three-valued semantics for a μ-calculus on abstractions of hybrid automata. To this end, we first develop a framework that is general in the sense that it provides a preservation result for several possible semantics of the modal operators. In a second step, we instantiate our framework for two particular abstractions. A key issue here is the consideration of both over- and underapproximated reachability, since classic simulation-based abstractions rely only on overapproximations and therefore limit preservation to the universal fragment of the μ-calculus. To specialize our general result, we consider (1) modal abstractions, where the notions of ‘may’ and ‘must’ transitions are extended from the purely discrete to the hybrid-time framework, and (2) so-called discrete bounded bisimulation abstractions.
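Schematically (these are standard modal-abstraction facts rather than the paper's exact definitions): must-transitions underapproximate and may-transitions overapproximate the concrete transition relation, which is what lets a three-valued semantics preserve both verification and refutation:

```latex
\longrightarrow_{\mathrm{must}} \;\subseteq\; \longrightarrow \;\subseteq\; \longrightarrow_{\mathrm{may}},
\qquad
[\![\varphi]\!]^{3}(\alpha(s)) = \mathit{tt} \;\Rightarrow\; s \models \varphi,
\qquad
[\![\varphi]\!]^{3}(\alpha(s)) = \mathit{ff} \;\Rightarrow\; s \not\models \varphi
```

where $\alpha$ maps concrete states to abstract ones and the third value ('unknown') carries no guarantee in either direction; abstractions built from overapproximations alone can only support the verification half, hence the restriction to the universal fragment noted above.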
