Similar Documents
20 similar documents found (search time: 31 ms)
1.
A critical problem in software development is the monitoring, control and improvement of the processes of software developers. Software processes are often not explicitly modeled, and manuals to support the development work contain abstract guidelines and procedures. Consequently, there are huge differences between ‘actual’ and ‘official’ processes: “the actual process is what you do, with all its omissions, mistakes, and oversights. The official process is what the book, i.e., a quality manual, says you are supposed to do” (Humphrey in A discipline for software engineering. Addison-Wesley, New York, 1995). Software developers lack support to identify, analyze and better understand their processes. Consequently, process improvements are often not based on an in-depth understanding of the ‘actual’ processes, but on organization-wide improvement programs or ad hoc initiatives of individual developers. In this paper, we show that, based on data from software development projects, the underlying software development processes can be extracted and that more realistic process models can be constructed automatically. This is called software process mining (Rubin et al. in Process mining framework for software processes. Software process dynamics and agility. Springer, Berlin, Heidelberg, 2007). The goal of process mining is to better understand the development processes, to compare constructed process models with the ‘official’ guidelines and procedures in quality manuals and, subsequently, to improve development processes. This paper reports on process mining case studies in a large industrial company in The Netherlands. The subject of the process mining is a particular process: the change control board (CCB) process. The results of process mining are fed back to practice in order to subsequently improve the CCB process.
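As an illustration of the core idea only (not the authors' tooling), the following minimal Python sketch mines a directly-follows relation from an event log and flags transitions that deviate from the 'official' process; the CCB activity names and the log itself are invented for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical event log: (case_id, activity) pairs, already ordered by
# timestamp within each case. Activity names are invented stand-ins for a
# real CCB change-request log.
event_log = [
    ("cr-1", "submit"), ("cr-1", "analyze"), ("cr-1", "approve"), ("cr-1", "implement"),
    ("cr-2", "submit"), ("cr-2", "approve"), ("cr-2", "implement"),  # skips 'analyze'
    ("cr-3", "submit"), ("cr-3", "analyze"), ("cr-3", "reject"),
]

# Group events into one trace per case.
traces = defaultdict(list)
for case, activity in event_log:
    traces[case].append(activity)

# Mine the directly-follows relation: how often activity a is immediately
# followed by activity b across all traces.
directly_follows = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        directly_follows[(a, b)] += 1

# Compare against the 'official' process from the quality manual.
official = {("submit", "analyze"), ("analyze", "approve"),
            ("approve", "implement"), ("analyze", "reject")}
for (a, b), freq in sorted(directly_follows.items()):
    status = "official" if (a, b) in official else "DEVIATION"
    print(f"{a:>9} -> {b:<9} x{freq}  [{status}]")
```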

2.
This paper concerns the development and use of ontologies for electronically supporting and structuring the highest-level function of government: the design, implementation and evaluation of public policies for the big and complex problems that modern societies face. This critical government function usually necessitates extensive interaction and collaboration among many heterogeneous government organizations (G2G collaboration) with different backgrounds, mentalities, values, interests and expectations, so it can greatly benefit from the use of ontologies. In this direction, an ontology of public policy making, implementation and evaluation is first described, which has been developed as part of the project ICTE-PAN of the Information Society Technologies (IST) Programme of the European Commission, based on sound theoretical foundations mainly from the public policy analysis domain and on contributions of experts from the public administrations of four European Union countries (Denmark, Germany, Greece and Italy). It is a ‘horizontal’ ontology that can be used for electronically supporting and structuring the whole lifecycle of a public policy in any vertical (thematic) area of government activity; it can also be combined with ‘vertical’ ontologies of the specific thematic area of government activity being dealt with. The paper also describes the use of this ontology for electronically supporting and structuring collaborative public policy making, implementation and evaluation through ‘structured electronic forums’, ‘extended workflows’, ‘public policy stages with specific sub-ontologies’, etc., and for the semantic annotation, organization, indexing and integration of the contributions of the participants in these forums, which enables the development of advanced semantic web capabilities in this area.
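As a minimal sketch of the annotation-and-indexing idea (the concept names and hierarchy below are invented; the ICTE-PAN ontology is far richer), forum contributions tagged with ontology concepts can be retrieved through the subclass relation, linking 'vertical' terms into the 'horizontal' layer:

```python
from collections import defaultdict

# Invented toy concepts: a 'horizontal' policy-lifecycle layer plus one
# 'vertical' (thematic) term; the real ICTE-PAN ontology is far richer.
subclass_of = {
    "Objective": "PolicyElement",
    "Instrument": "PolicyElement",
    "Indicator": "PolicyElement",
    "TrafficMeasure": "Instrument",   # vertical term hooked into the horizontal layer
}

def ancestors(concept):
    while concept in subclass_of:
        concept = subclass_of[concept]
        yield concept

# Forum contributions semantically annotated with ontology concepts.
contributions = [("post-1", "TrafficMeasure"), ("post-2", "Objective"),
                 ("post-3", "Indicator")]

index = defaultdict(set)              # concept -> contributions
for post, concept in contributions:
    index[concept].add(post)
    for anc in ancestors(concept):    # index under all ancestor concepts too
        index[anc].add(post)

print(sorted(index["Instrument"]))     # ['post-1'] via the subclass relation
print(sorted(index["PolicyElement"]))  # all three contributions
```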

3.
The most cursory examination of the history of artificial intelligence highlights numerous egregious claims of its researchers, especially in relation to a populist form of ‘strong’ computationalism which holds that any suitably programmed computer instantiates genuine conscious mental states purely in virtue of carrying out a specific series of computations. The argument presented herein is a simple development of that originally presented in Putnam’s monograph Representation & Reality (Bradford Books, Cambridge, 1988), which, if correct, has important implications for Turing machine functionalism and the prospect of ‘conscious’ machines. In the paper, instead of seeking to develop Putnam’s claim that “everything implements every finite state automaton”, I will try to establish the weaker result that “everything implements the specific machine Q on a particular input set (x)”. Then, equating Q(x) to any putative AI program, I will show that conceding the ‘strong AI’ thesis for Q (crediting it with mental states and consciousness) opens the door to a vicious form of panpsychism whereby all open systems (e.g. grass, rocks, etc.) must instantiate conscious experience and hence that disembodied minds lurk everywhere.
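The gerrymandering move behind the weaker result can be made concrete in a few lines. In this hedged sketch, Q, its input x, and the 'open system' are all invented toy stand-ins; the point is only that a post hoc state mapping always exists once a system passes through enough distinct states:

```python
# Machine Q: a toy finite-state automaton (a mod-3 counter) run on the
# input x = "aaaa". Q and x are invented; any putative AI program would
# play the same role in the argument.
def q_transition(state: int, symbol: str) -> int:
    return (state + 1) % 3 if symbol == "a" else state

x = "aaaa"
q_run = [0]
for symbol in x:
    q_run.append(q_transition(q_run[-1], symbol))
print("canonical run of Q on x:", q_run)  # [0, 1, 2, 0, 1]

# An arbitrary 'open system': any process passing through distinct states
# over the same number of time steps (here, labelled states of a rock).
rock_run = [f"rock-state-{t}" for t in range(len(q_run))]

# The Putnam-style move: define the 'implementation' mapping post hoc,
# sending the system's state at time t to Q's state at time t.
implementation = dict(zip(rock_run, q_run))
assert [implementation[s] for s in rock_run] == q_run
print("under this mapping, the rock 'implements' Q on input x")
```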

4.
This study provides a step further in the computation of the transition path of a continuous time endogenous growth model discussed by Privileggi (Nonlinear dynamics in economics, finance and social sciences: essays in honour of John Barkley Rosser Jr., Springer, Berlin, Heidelberg, pp. 251–278, 2010)—based on the setting first introduced by Tsur and Zemel (J Econ Dyn Control 31:3459–3477, 2007)—in which knowledge evolves according to the Weitzman (Q J Econ 113:331–360, 1998) recombinant process. A projection method, based on the least squares of the residual function corresponding to the ODE defining the optimal policy of the ‘detrended’ model, allows for the numerical approximation of such a policy for a positive Lebesgue measure range of values of the efficiency parameter characterizing the probability function of the recombinant process. Although the projection method’s performance rapidly degenerates as one departs from a benchmark value for the efficiency parameter, we are able to numerically compute time-path trajectories which are sufficiently regular to allow for sensitivity analysis under changes in parameter values.
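A generic sketch of such a projection method, assuming a toy ODE and initial condition in place of the model's actual 'detrended' policy equation: a polynomial's coefficients are chosen by least squares over the ODE residual at collocation nodes.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy right-hand side standing in for the model's policy ODE; the domain,
# initial condition and degree below are invented for illustration.
def g(k, c):
    return 0.5 * c * (1.0 - c / (1.0 + k))

k0, k1, c0 = 0.0, 5.0, 0.2
nodes = np.linspace(k0, k1, 40)   # collocation nodes
deg = 6                           # degree of the approximating polynomial

def residual(a):
    p = np.polynomial.Polynomial(a)
    dp = p.deriv()
    ode_res = dp(nodes) - g(nodes, p(nodes))   # ODE residual at each node
    ic_res = np.array([p(k0) - c0])            # initial-condition residual
    return np.concatenate([ode_res, ic_res])

sol = least_squares(residual, x0=np.zeros(deg + 1))
print("max |ODE residual| at the nodes:", np.abs(sol.fun[:-1]).max())
```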

5.
We study properties of non-uniform reductions and related completeness notions. We strengthen several results of Hitchcock and Pavan (ICALP (1), Lecture Notes in Computer Science, vol. 4051, pp. 465–476, Springer, 2006) and give a trade-off between the amount of advice needed for a reduction and its honesty on NEXP. We construct an oracle relative to which this trade-off is optimal. In a more systematic study of non-uniform reductions, we show, among other things, that non-uniformity can be removed at the cost of more queries. In line with Post’s program for complexity theory (Buhrman and Torenvliet in Bulletin of the EATCS 85, pp. 41–51, 2005), we connect such ‘uniformization’ properties to the separation of complexity classes.

6.
The notion of P-simple points was introduced by Bertrand as a tool for designing parallel thinning algorithms. In ‘A 3D fully parallel thinning algorithm for generating medial faces’ (Pattern Recogn. Lett. 16:83–87, 1995), Ma proposed an algorithm that fails to preserve the topology of certain objects. In this paper, we propose a new application of P-simple points: automatically correcting Ma’s algorithm.
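For readers unfamiliar with simple points: the paper's setting is 3D, but the flavor of the test can be conveyed by the classical 2D simple-point characterization for (8, 4) connectivity, sketched below; this is an analogy, not Bertrand's P-simple criterion.

```python
from itertools import product

def components(cells, adjacent):
    """Partition `cells` into connected components under `adjacent`."""
    cells, comps = set(cells), []
    while cells:
        comp, stack = set(), [cells.pop()]
        while stack:
            u = stack.pop()
            comp.add(u)
            nbrs = {v for v in cells if adjacent(u, v)}
            cells -= nbrs
            stack.extend(nbrs)
        comps.append(comp)
    return comps

def adj8(u, v):  # 8-adjacency in the grid
    return max(abs(u[0] - v[0]), abs(u[1] - v[1])) == 1

def adj4(u, v):  # 4-adjacency in the grid
    return abs(u[0] - v[0]) + abs(u[1] - v[1]) == 1

def is_simple_2d(patch):
    """patch: 3x3 nested list of 0/1 with the candidate point at patch[1][1].

    The point is simple iff its punctured neighbourhood has exactly one
    8-connected foreground component and exactly one 4-connected background
    component touching a 4-neighbour of the point.
    """
    ring = [(r, c) for r, c in product(range(3), range(3)) if (r, c) != (1, 1)]
    fg = [rc for rc in ring if patch[rc[0]][rc[1]]]
    bg = [rc for rc in ring if not patch[rc[0]][rc[1]]]
    four_nbrs = {(0, 1), (1, 0), (1, 2), (2, 1)}
    one_fg = len(components(fg, adj8)) == 1
    one_bg = len([c for c in components(bg, adj4) if c & four_nbrs]) == 1
    return one_fg and one_bg

corner = [[0, 1, 0], [0, 1, 1], [0, 0, 0]]   # deletable: object stays connected
bridge = [[1, 0, 1], [0, 1, 0], [0, 0, 0]]   # not deletable: p links two parts
print(is_simple_2d(corner), is_simple_2d(bridge))  # True False
```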

7.
We study the mathematical modeling and numerical simulation of the motion of red blood cells (RBCs) and vesicles subject to an external incompressible flow in a microchannel. RBCs and vesicles are viscoelastic bodies consisting of a deformable elastic membrane enclosing an incompressible fluid. We provide an extension of the finite element immersed boundary method by Boffi and Gastaldi (Comput Struct 81:491–501, 2003), Boffi et al. (Math Mod Meth Appl Sci 17:1479–1505, 2007) and Boffi et al. (Comput Struct 85:775–783, 2007), based on a model for the membrane that additionally accounts for bending energy, and also consider inflow/outflow conditions for the external fluid flow. The stability analysis requires both the approximation of the membrane by cubic splines (instead of the linear splines used without bending energy) and an upper bound on the inflow velocity. In the fully discrete case, the resulting CFL-type condition on the time step size is also more restrictive. We perform numerical simulations for various scenarios, including the tank-treading motion of vesicles in microchannels, the behavior of ‘healthy’ and ‘sick’ RBCs, which differ in their stiffness, and the motion of RBCs through thin capillaries. The simulation results are in very good agreement with experimentally available data.
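A hedged sketch of one ingredient, the cubic-spline membrane representation: given a closed 2D curve sampled at spline knots, the bending energy E_b = (kappa_b/2) times the integral of curvature squared along the curve can be evaluated from the spline's derivatives. The ellipse test shape and the bending modulus value are invented for illustration.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.interpolate import CubicSpline

# A closed 2D membrane curve represented by periodic cubic splines.
# Test shape: an ellipse sampled at 33 knots (invented toy geometry).
s = np.linspace(0.0, 2 * np.pi, 33)
pts = np.stack([1.5 * np.cos(s), 0.8 * np.sin(s)], axis=1)
pts[-1] = pts[0]                                  # close the curve exactly
membrane = CubicSpline(s, pts, bc_type="periodic")

sq = np.linspace(0.0, 2 * np.pi, 2000)            # quadrature points
d1, d2 = membrane(sq, 1), membrane(sq, 2)         # first/second derivatives
speed = np.hypot(d1[:, 0], d1[:, 1])              # |x'(q)|
curvature = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]) / speed**3

kappa_b = 1.0                                     # bending modulus (toy value)
bending_energy = 0.5 * kappa_b * trapezoid(curvature**2 * speed, sq)
print(f"bending energy of the test shape: {bending_energy:.4f}")
```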

8.
Li P, Banerjee S, McBean AM. GeoInformatica, 2011, 15(3): 435–454
Statistical models for areal data are primarily used for smoothing maps to reveal spatial trends. Subsequent interest often resides in the formal identification of ‘boundaries’ on the map. Here boundaries refer to ‘difference boundaries’, representing significant differences between adjacent regions. Recently, Lu and Carlin (Geogr Anal 37:265–285, 2005) discussed a Bayesian framework to carry out edge detection employing a spatial hierarchical model that is estimated using Markov chain Monte Carlo (MCMC) methods. Here we offer an alternative that avoids MCMC and is easier to implement. Our approach resembles a model comparison problem, where the models correspond to different underlying edge configurations across which we wish to smooth (or not). We incorporate these edge configurations in spatially autoregressive models and demonstrate how the Bayesian Information Criterion (BIC) can be used to detect difference boundaries in the map. We illustrate our methods with a Minnesota pneumonia and influenza hospitalization dataset to elicit boundaries detected by the different models.
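A stripped-down sketch of the model-comparison idea (not the authors' full spatially autoregressive setup): for each pair of adjacent regions, a 'smooth' model with a common mean is compared via BIC against an 'edge' model with separate means; the regional rates below are simulated.

```python
import numpy as np

# Simulated rates for three regions; A and B are similar, C differs.
rng = np.random.default_rng(0)
region_rates = {"A": rng.normal(10.0, 1, 50), "B": rng.normal(10.2, 1, 50),
                "C": rng.normal(14.0, 1, 50)}
adjacent_pairs = [("A", "B"), ("B", "C")]

def gaussian_bic(samples_by_group):
    """BIC of a Gaussian model with one mean per group and shared variance."""
    all_x = np.concatenate(samples_by_group)
    resid = np.concatenate([x - x.mean() for x in samples_by_group])
    sigma2 = resid.var()                               # MLE of the variance
    n, k = len(all_x), len(samples_by_group) + 1       # group means + variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return k * np.log(n) - 2 * loglik

for a, b in adjacent_pairs:
    xa, xb = region_rates[a], region_rates[b]
    bic_smooth = gaussian_bic([np.concatenate([xa, xb])])  # one shared mean
    bic_edge = gaussian_bic([xa, xb])                      # separate means
    verdict = "difference boundary" if bic_edge < bic_smooth else "smooth"
    print(f"{a}-{b}: BIC(smooth)={bic_smooth:.1f}, BIC(edge)={bic_edge:.1f} -> {verdict}")
```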

9.
Kripke (Wittgenstein on rules and private language: an elementary exposition. Harvard University Press, Cambridge, Mass., 1982) rejected a naturalistic dispositional account of meaning (hereafter semantic dispositionalism) in a skeptical argument about rule-following that he attributes to Wittgenstein (Philosophical investigations. Basil Blackwell, Oxford, 1958). Most philosophers who oppose Kripke’s criticisms of semantic dispositionalism take the stance that the argument proves too much: semantic dispositionalism is similar to much of our respected science in some important respects, and hence to discard the former would mean giving up the latter, which is obviously wrong. In this paper, I discuss and reject a recent defense of Kripke by Kusch (Analysis 65(2):156–163, 2005; A sceptical guide to meaning and rules: defending Kripke’s Wittgenstein. McGill-Queen’s, London, 2006). Kusch attempts to show that semantic dispositionalism differs from the sciences, and that consequently Kripke’s attack can only target semantic dispositionalism, not the sciences. Specifically, Kusch identifies some important features of the sciences with regard to how they employ idealization and ceteris paribus clauses, and argues that the ways in which semantic dispositionalism uses them are dramatically different. I argue that, upon close examination, the two are more similar than different in each of those features.

10.
Computing LTS Regression for Large Data Sets
Data mining aims to extract previously unknown patterns or substructures from large databases. In statistics, this is what methods of robust estimation and outlier detection were constructed for; see, e.g., Rousseeuw and Leroy (1987). Here we focus on least trimmed squares (LTS) regression, which is based on the subset of h cases (out of n) whose least squares fit possesses the smallest sum of squared residuals. The coverage h may be set between n/2 and n. The computation time of existing LTS algorithms grows too quickly with the size of the data set, precluding their use for data mining. In this paper we develop a new algorithm called FAST-LTS. The basic ideas are an inequality involving order statistics and sums of squared residuals, and techniques which we call ‘selective iteration’ and ‘nested extensions’. We also use an intercept adjustment technique to improve the precision. For small data sets FAST-LTS typically finds the exact LTS, whereas for larger data sets it gives more accurate results than existing algorithms for LTS and is faster by orders of magnitude. This allows us to apply FAST-LTS to large databases.
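The heart of FAST-LTS-style computation can be sketched with plain concentration steps (C-steps): fit least squares on a size-h subset, then re-select the h cases with the smallest squared residuals, which provably never increases the trimmed objective. The sketch below omits the paper's speedups ('selective iteration', 'nested extensions', intercept adjustment) and runs on synthetic data with 30% gross outliers.

```python
import numpy as np

rng = np.random.default_rng(1)
n, h = 200, 120                       # coverage h between n/2 and n
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
y = 2.0 + 0.5 * X[:, 1] + rng.normal(0, 0.3, n)
y[:60] += 15.0                        # contaminate 30% of the responses

best = (np.inf, None)
for _ in range(50):                   # random restarts
    subset = rng.choice(n, h, replace=False)
    for _ in range(20):               # C-steps
        beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
        res2 = (y - X @ beta) ** 2
        subset = np.argsort(res2)[:h] # keep the h best-fitting cases
    score = np.sort(res2)[:h].sum()   # trimmed sum of squared residuals
    if score < best[0]:
        best = (score, beta)

print("LTS fit (intercept, slope):", np.round(best[1], 3))  # near (2.0, 0.5)
```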

11.
Plato divided science (episteme) into ‘science of action’ (praktike) and ‘science of mere knowing’ (gnostike). His argument is the first known attempt to distinguish what is now recognised as technology, as distinct from more purely rational science. Aristotle coined the compound term technologia and thereby established this new department of science within the general system of knowledge. Plato did not develop his novel characterisation of the architect any further, for the ancient Greeks did not consider architecture a fine or estimable art. The best available source of Greek architectural pedagogy is the Roman Vitruvius. Graham Pont discusses Vitruvius’s distinction between the ‘practical’ side of architecture (fabrica) and the ‘theoretical’ (ratiocinatio), and examines the mathematical preparation of ancient Greek and Roman architects.



12.
The development of autonomous mobile machines to perform useful tasks in real work environments is currently being impeded by concerns over effectiveness, commercial viability and, above all, safety. This paper introduces a case study of a robotic excavator to explore a series of issues around system development, navigation in unstructured environments, autonomous decision making, and changing the behaviour of autonomous machines to suit the prevailing demands of users. The adoption of the Real-Time Control Systems (RCS) architecture (Albus, 1991) is proposed as a universal framework for the development of intelligent systems. In addition, it is explained how Partially Observable Markov Decision Processes (POMDPs) (Kaelbling et al., 1998) can form the basis of decision making in the face of uncertainty, and how the technique can be effectively incorporated into the RCS architecture. Particular emphasis is placed on ensuring that the resulting behaviour is both task effective and adequately safe; it is recognised that these two objectives may be in opposition and that the desired balance between them may change. The concept of an autonomous system having “values” is introduced through the use of utility theory. Limited simulation results are reported which demonstrate that these techniques can create intelligent systems capable of modifying their behaviour to exhibit either ‘safety conscious’ or ‘task achieving’ personalities.
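A toy sketch of the 'values' idea: a two-state belief (terrain clear vs. obstructed) is updated by Bayes' rule, and actions are ranked by an expected utility that blends a task-progress term with a safety term; the weight on safety plays the role of the machine's 'personality'. All states, sensor probabilities and utilities below are invented.

```python
STATES = ("clear", "obstructed")
ACTIONS = ("dig", "pause")

# P(observation 'looks clear' | state): an imperfect sensor model.
P_OBS_CLEAR = {"clear": 0.85, "obstructed": 0.25}

def update_belief(b_clear: float, obs_looks_clear: bool) -> float:
    """Bayes update of P(state = clear) after one observation."""
    like_c = P_OBS_CLEAR["clear"] if obs_looks_clear else 1 - P_OBS_CLEAR["clear"]
    like_o = P_OBS_CLEAR["obstructed"] if obs_looks_clear else 1 - P_OBS_CLEAR["obstructed"]
    num = like_c * b_clear
    return num / (num + like_o * (1 - b_clear))

# Utility components of each (action, state): task progress and safety.
TASK = {("dig", "clear"): 1.0, ("dig", "obstructed"): 0.2,
        ("pause", "clear"): 0.0, ("pause", "obstructed"): 0.0}
SAFE = {("dig", "clear"): 0.0, ("dig", "obstructed"): -5.0,
        ("pause", "clear"): 0.0, ("pause", "obstructed"): 0.0}

def choose(b_clear: float, w_safety: float) -> str:
    def eu(a):  # expected utility of action a under the current belief
        return sum(p * (TASK[a, s] + w_safety * SAFE[a, s])
                   for p, s in ((b_clear, "clear"), (1 - b_clear, "obstructed")))
    return max(ACTIONS, key=eu)

b = update_belief(0.5, obs_looks_clear=True)
print(f"belief(clear) = {b:.2f}")
print("task-achieving (w=0.2):", choose(b, 0.2))   # digs despite residual risk
print("safety-conscious (w=2):", choose(b, 2.0))   # pauses instead
```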

13.
Transaction-level modeling is used in hardware design for describing designs at a higher level of abstraction than the register-transfer level (RTL) (e.g. Cai and Gajski in CODES+ISSS ’03: proceedings of the 1st IEEE/ACM/IFIP international conference on hardware/software codesign and system synthesis, pp. 19–24, 2003; Chen et al. in FMCAD ’07: proceedings of the formal methods in computer aided design, pp. 53–61, 2007; Mahajan et al. in MEMOCODE ’07: proceedings of the 5th IEEE/ACM international conference on formal methods and models for codesign, pp. 123–132, 2007; Swan in DAC ’06: proceedings of the 43rd annual conference on design automation, pp. 90–92, 2006). Each transaction represents a unit of work, which is also a useful unit for design verification. In such models, many properties of interest involve interactions between multiple transactions; examples are ordering relationships in sequential processing and hazard checking in pipelined circuits. Writing such properties on the RTL design requires significant expertise in understanding the higher-level computation performed by a given RTL design, and possibly instrumentation of the RTL to express the property of interest. This is a barrier to the easy use of such properties in RTL designs.
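As a sketch of what a multi-transaction property looks like at the transaction level rather than on RTL signals, the following checks a simple ordering/hazard property over a trace: transactions to the same address must complete in issue order. The trace format and field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Txn:
    tid: int
    addr: int
    issue_time: int
    complete_time: int

trace = [
    Txn(0, addr=0x10, issue_time=1, complete_time=5),
    Txn(1, addr=0x10, issue_time=2, complete_time=4),   # overtakes txn 0
    Txn(2, addr=0x20, issue_time=3, complete_time=6),
]

def check_per_address_ordering(trace):
    """Report pairs where a later-issued transaction to the same address
    completed before an earlier-issued one."""
    violations = []
    for a in trace:
        for b in trace:
            same_addr_ordered = a.addr == b.addr and a.issue_time < b.issue_time
            if same_addr_ordered and a.complete_time > b.complete_time:
                violations.append((a.tid, b.tid))
    return violations

for older, newer in check_per_address_ordering(trace):
    print(f"ordering violation: txn {newer} overtook txn {older} on the same address")
```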

14.
This paper discusses the domestication of Information and Communication Technologies (ICTs), particularly their use, in UK households, reporting on research undertaken between 1998 and 2004. Issues raised are linked to the dominant discourse of the ‘digital divide’, which in the UK means engaging with ICTs in a ‘meaningful’ way to ensure the economic and social well-being of UK plc (public limited company; in the UK this refers to companies whose shares can be sold to the public, and the acronym is used here ironically to indicate the government’s motivation to brand and promote the UK as a whole). Utilising a framework for understanding digital inequality and the ‘deepening divide’, domestication theory is applied to discuss motivational, material and physical, skills, and usage access in the gendered household, critically contrasting this approach with ‘smart house’ research. This qualitative enquiry contributes to the neglected area of domestication studies in Information Systems research.

15.
Apostrophe is best known as a punctuation mark (’) or as a key poetic figure (with a speaker addressing an imaginary or absent person or entity). In origin, however, it is a pivotal rhetorical figure that indicates a ‘breaking away’ or turning away of the speaker from one addressee to another, in a different mode. In this respect, apostrophe is essentially theatrical. To be sure, the turn away implies two different modes of address that may follow upon one another, as is hinted at by the two meanings of the verb ‘to witness’: being a witness and bearing witness. One cannot do both at the same time. My argument will be, however, that in order to make witnessing work ethically and responsibly, the two modes of address must take place simultaneously, in the coincidence of two modalities of presence: one actual and one virtual. Accordingly, I will distinguish between an address of attention and an address of expression. Whereas the witness is actually paying attention to that which she witnesses, she is virtually (and in the sense Deleuze intended, no less really) turning away in terms of expression. The two come together in what Kelly Oliver called the ‘inner witness’. The simultaneous operation of two modes of address suggests that Caroline Nevejan’s so-called YUTPA model would have to include two modalities of ‘you’. Such a dual modality has become all the more important in the context of the society of the spectacle. One text will help me first to explore the two modes of address through apostrophe: a story by Dutch author Maria Dermoût, written in the 1950s, reflecting on an 1817 uprising in the Dutch East Indies and the subsequent execution of its leader. Secondly, I will move to American artist Kara Walker’s response, in the shape of an installation and a visual essay, to the flooding of New Orleans in 2005. The latter will serve to illustrate a historic shift in the theatrical nature and status of ‘presence’ in the two modes of address. Instead of thinking of the convergence of media, of which Jenkins speaks, we might think of media swallowing up one another. For instance, the theatrical structure of apostrophe is swallowed up, and in a sense perverted, by the model of the spectacle in modern media. This endangers the very possibility of witnessing in any ethical sense of the word.

16.
“There will always (I hope) be print books, but just as the advent of photography changed the role of painting or film changed the role of theater in our culture, electronic publishing is changing the world of print media. To look for a one-to-one transposition to the new medium is to miss the future until it has passed you by.”—Tim O’Reilly (2002). It is not hard to envisage that publishers will leverage subscribers’ information, interest groups’ shared knowledge and other sources to enhance their publications. While this enhances the value of the publication through more accurate and personalized content, it also brings a new set of challenges to the publisher. Content is now web-driven and flows through a truly automated system; that is, no designer “re-touch” intervention is envisaged. This paper introduces an exploratory mapping strategy for allocating web-driven content in a highly graphical publication such as a traditional magazine. Two major aspects of the mapping are covered, which enable different levels of flexibility and address different content-flowing strategies. The last contribution is an evaluation of existing standards that could potentially leverage this work to incorporate flexible mapping and, subsequently, composition capabilities. The work published here is an extended version of the article presented at the Eighth ACM Symposium on Document Engineering in fall 2008 (Giannetti 2008).
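One flavor of the mapping problem can be sketched as a small allocation routine: web-driven content items, weighted by personalization priority, are placed greedily into the fixed slots of a magazine-style template. The slot and item attributes and the best-fit policy are invented; the paper's actual strategies are richer.

```python
slots = [  # (slot name, capacity in text units, image allowed)
    ("cover-feature", 120, True),
    ("sidebar", 40, False),
    ("footer-teaser", 20, False),
]
items = [  # (item name, length, needs image, personalization priority)
    ("local-news", 100, True, 0.9),
    ("stock-ticker", 15, False, 0.7),
    ("weather", 35, False, 0.6),
    ("long-read", 150, True, 0.4),
]

# Greedy policy: place the highest-priority items first, each into the
# smallest free slot that still fits it (best fit, decreasing priority).
layout = {}
free = {name: cap for name, cap, _ in slots}
img_ok = {name: img for name, _, img in slots}
for name, length, needs_img, _prio in sorted(items, key=lambda i: -i[3]):
    fitting = [s for s in free if free[s] >= length and (img_ok[s] or not needs_img)]
    if fitting:
        slot = min(fitting, key=lambda s: free[s])   # tightest fit
        layout[slot] = name
        del free[slot]                               # one item per slot here
print(layout)   # items that fit nowhere would flow to overflow pages
```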

17.
With the rise of ubiquitous computing in recent years, concepts of spatiality have become a significant topic of discussion in the design and development of multimedia systems. This article investigates spatial practices at the intersection of youth, technology, and urban space in Seoul, and examines what the author calls ‘transyouth’: in the South Korean context, people between the ages of 18 and 24, situated on the delicate border between digital natives and immigrants in Prensky’s [46] terms. In the first section, the article sets out the technosocial environment of contemporary Seoul. This is followed by a discussion of social networking processes derived from semi-structured interviews conducted in 2007–2008 with Seoul transyouth about their ‘lived experiences of the city.’ Interviewees reported how they interact to play, work, and live with and within the city’s unique environment. The article develops a theme of how technosocial convergence (re)creates urban environments and argues for the need to consider such user-driven spatial recreation in designing cities as (ubiquitous) urban networks, in recognition of their changing technosocial contours of connections. This is explored in three spaces of different scales: Cyworld, as an online social networking space; cocoon housing—a form of individual residential space which is growing rapidly in many Korean cities—as a private living space; and ubiquitous City, as the future macro-space of Seoul.

18.
This paper provides a corrigendum resolving a couple of minor issues in the algorithm presented in Sarkar et al. (Real-Time Syst., 2011). The first issue relates to the postponement of execution of a task when its own ‘deadline of postponement’ has not been crossed. The second issue concerns the updating of certain scheduler data structures.

19.
William Rapaport, in “How Helen Keller used syntactic semantics to escape from a Chinese Room” (Rapaport 2006), argues that Helen Keller was in a sort of Chinese Room, and that her subsequent development of natural language fluency illustrates the flaws in Searle’s famous Chinese Room Argument (CRA) and provides a method for developing computers that have genuine semantics (and intentionality). I contend that his argument fails. In setting up the problem, Rapaport uses his own preferred definitions of semantics and syntax, but he does not translate Searle’s Chinese Room argument into that idiom before attacking it. Once the Chinese Room is translated into Rapaport’s idiom (in a manner that preserves the distinction between meaningful representations and uninterpreted symbols), I demonstrate how Rapaport’s argument fails to defeat the CRA. This failure brings a crucial element of the Chinese Room Argument to the fore: the person in the Chinese Room is prevented from connecting the Chinese symbols to his or her own meaningful experiences and memories. This issue must be addressed before any victory over the CRA is announced.

20.
A 199-line Matlab code for Pareto-optimal tracing in topology optimization
The paper ‘A 99-line topology optimization code written in Matlab’ by Sigmund (Struct Multidisc Optim 21(2):120–127, 2001) demonstrated that SIMP-based topology optimization can be easily implemented in less than a hundred lines of Matlab code. The published method and code have been used ever since by numerous researchers to advance the field of topology optimization. Inspired by the above paper, we demonstrate here that, by exploiting the notion of topological sensitivity (an alternative to SIMP), one can generate Pareto-optimal topologies in about twice that number of lines of Matlab code. In other words, optimal topologies for various volume fractions can be generated in a highly efficient manner by directly tracing the Pareto-optimal curve.
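The Pareto-tracing control loop itself (divorced from the finite-element analysis that makes it meaningful) can be sketched as follows: starting from the full domain, the elements with the lowest topological sensitivity are removed step by step, recording one topology per target volume fraction. The synthetic sensitivity field below stands in for what a real run would recompute from an FE solve at each step.

```python
import numpy as np

# Synthetic 'topological sensitivity' field over a 60x20 element grid:
# high near an assumed supported left edge and a loaded point at the right
# mid-height. A real implementation recomputes this from an FE solve.
nx, ny = 60, 20
yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
sens = 1.0 / (1 + 0.05 * xx) + np.exp(-((xx - nx + 1) ** 2 + (yy - ny / 2) ** 2) / 40.0)

density = np.ones((ny, nx), dtype=bool)   # start from the full domain
pareto = {}                               # volume fraction -> topology
for target_vf in np.arange(0.95, 0.45, -0.05):
    # In a real run: re-solve the FE problem and update `sens` here.
    n_remove = max(0, int(round(density.sum() - target_vf * nx * ny)))
    solid = np.flatnonzero(density.ravel())                  # solid elements
    weakest = solid[np.argsort(sens.ravel()[solid])[:n_remove]]
    density.ravel()[weakest] = False                         # remove material
    pareto[round(target_vf, 2)] = density.copy()
    print(f"vf = {target_vf:.2f}: kept {density.sum()} of {nx * ny} elements")
```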
