Similar Articles
20 similar articles found
1.
Manipulatives—physical learning materials such as cubes or tiles—are prevalent in educational settings across cultures and have generated substantial research into how actions with physical objects may support children’s learning. The ability to integrate digital technology into physical objects—so-called ‘digital manipulatives’—has generated excitement over the potential to create new educational materials. However, without a clear understanding of how actions with physical materials lead to learning, it is difficult to evaluate or inform designs in this area. This paper is intended to contribute to the development of effective tangible technologies for children’s learning by summarising key debates about the representational advantages of manipulatives under two key headings: offloading cognition—where manipulatives may help children by freeing up valuable cognitive resources during problem solving, and conceptual metaphors—where perceptual information or actions with objects have a structural correspondence with more symbolic concepts. The review also indicates possible limitations of physical objects—most importantly that their symbolic significance is only granted by the context in which they are used. These arguments are then discussed in light of tangible designs drawing upon the authors’ current research into tangibles and young children’s understanding of number.

2.
Based on the literature dealing with the diffusion of innovation and with information systems, and building on emerging concepts in electronic commerce (e-commerce), this paper aims to assess the influence of various factors on firms’ future level of use of electronic marketplaces (e-marketplaces). This theoretical model is tested on data collected from 1,200 senior managers in Canadian firms. Findings indicate that a firm’s past experience in e-commerce, as well as factors relating to its business relationships, ultimately affects its future use of e-marketplaces. Results of Tobit regressions also show that the complexity of sophisticated e-commerce implementations is negatively correlated with the future level of use of e-marketplaces, and that consultants and other experts play an essential role in encouraging and facilitating the use of this new type of electronic platform. Finally, our survey data demonstrate that the relative influence of some determinants differs according to firm size. Pierre Hadaya is an assistant professor of MIS at the Faculté d’administration of the Université de Sherbrooke (Canada). He holds a Ph.D. in Management of Technology from the École Polytechnique de Montréal. His research interests lie at the intersection of information technology management, business strategy and interorganizational design.
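A Tobit regression of the kind reported above models an outcome that is only observed beyond a censoring threshold (here, a level of future use bounded at zero). The sketch below fits a left-censored Tobit model by maximum likelihood; the data, variable names, and setup are invented for illustration and are not the authors' estimation code.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y):
    """Negative log-likelihood of a Tobit model, left-censored at 0."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = y <= 0
    ll = np.where(censored,
                  norm.logcdf(-xb / sigma),                   # P(y* <= 0)
                  norm.logpdf((y - xb) / sigma) - log_sigma)  # density of observed y
    return -ll.sum()

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])  # intercept + one predictor
y = np.maximum(X @ np.array([0.5, 1.0]) + rng.normal(size=200), 0.0)

fit = minimize(tobit_negloglik, x0=np.zeros(3), args=(X, y))
print(fit.x[:2])  # coefficient estimates, close to the true [0.5, 1.0]
```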

3.
The success of information system development involving multi-organizational collaboration can depend heavily on effective knowledge sharing across boundaries. This paper reports on a comparative examination of knowledge sharing in two separate networks of public sector organizations participating in information technology innovation projects in New York State. As is typical of innovations resulting from recent government reforms, the knowledge sharing in these cases is a critical component of the information system development, involving a mix of tacit, explicit, and interactional forms of sharing across organizational boundaries. In one case the sharing is among state agencies and in the other across state and local government agencies. Using interviews, observations and document analysis, the longitudinal case studies follow knowledge sharing and other interactions in the interorganizational networks of these two distinct settings. Results confirm the difficulty of sharing knowledge across agencies, and further reveal the influences of several relevant factors—incentives, risks and barriers for sharing, and trust—on the effectiveness of knowledge sharing. The results contribute to theory on knowledge sharing processes in multi-organizational public sector settings and provide practice guidance for developing effective sharing relationships in collaborative cross-boundary information system initiatives. The research reported here is supported by the National Science Foundation grant #SES-9979839. The views and conclusions expressed in this report are those of the authors alone and do not reflect the views or policies of the National Science Foundation.

4.
5.
While terrorism informatics research has examined the technical composition of extremist media, there is less work examining the content and intent behind such media. We propose that the arguments and issues presented in extremist media provide insights into authors’ intent, which in turn may provide an evidence base for detecting and assessing risk. We explore this possibility by applying two quantitative text-analysis methods to 50 online texts inciting violence in response to the 2008/2009 Israeli military action in Gaza and the West Bank territories. The first method—a content coding system that identifies the occurrence of persuasive devices—revealed a predominance of moral proof arguments within the texts, and evidence for distinguishable ‘profiles’ of persuasion use across different authors and different group affiliations. The second method—a corpus-linguistic technique that identifies the core concepts and narratives that authors use—confirmed the use of moral proof to create an in-group/out-group divide, while also demonstrating a movement from general expressions of discontent to more direct audience-orientated expressions of violence as conflict heightened. We conclude that multi-method analyses are a valuable approach to building both an evidence-based understanding of terrorist media use and a valid set of applications within terrorism informatics.
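As a rough illustration of the first method, a content coding pass can be approximated by tallying cue words for each persuasive device. The device labels and cue lists below are invented placeholders, not the authors' actual coding scheme.

```python
import re
from collections import Counter

CUES = {  # invented cue lists, one set per persuasive device
    "moral_proof": {"duty", "justice", "honour", "righteous", "sacred"},
    "threat":      {"punish", "destroy", "consequences", "revenge"},
    "ingroup":     {"we", "us", "our", "brothers"},
}

def code_text(text):
    """Tally how often each device's cue words occur in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    tally = Counter()
    for w in words:
        for device, cues in CUES.items():
            if w in cues:
                tally[device] += 1
    return tally

print(code_text("It is our sacred duty to seek justice; "
                "they will face the consequences."))
# Counter({'moral_proof': 3, 'ingroup': 1, 'threat': 1})
```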

6.
Implementation and maintenance of interorganizational systems (IOS) require investments by all the participating firms. Compared with intraorganizational systems, however, there are additional uncertainties and risks. This is because the benefits of IOS investment depend not only on a firm’s own decisions, but also on those of its business partners. Without appropriate levels of investment by all the firms participating in an IOS, they cannot reap the full benefits. Drawing upon the literature in institutional economics, we examine IOS ownership as a means to induce value-maximizing noncontractible investments. We model the impact of two factors derived from the theory of incomplete contracts and transaction cost economics: relative importance of investments and specificity of investments. We apply the model to a vendor-managed inventory (VMI) system in a supply chain setting. We show that when the specificity of investments is high, this is a more critical determinant of optimal ownership structure than the relative importance of investments. As technologies used in IOS become increasingly redeployable and reusable, and less specific, the relative importance of investments becomes a dominant factor. We also show that the bargaining mechanism—or the agreed-upon approach to splitting the incremental payoffs—that is used affects the relationship between these factors in determining the optimal ownership structure of an IOS.

7.
This paper explores how wikis may be used to support primary education students’ collaborative interaction and how such an interaction process can be characterised. The overall aim of this study is to analyse the collaborative processes of students working together in a wiki environment, in order to see how primary students can actively create a shared context for learning in the wiki. Educational literature has already reported that wikis may support collaborative knowledge-construction processes, but in our study we claim that a dialogic perspective is needed to accomplish this. Students must develop an intersubjective orientation towards each other’s perspectives, to co-construct knowledge about a topic. For this purpose, our project utilised a ‘Thinking Together’ approach to help students develop an intersubjective orientation towards one another and to support the creation of a ‘dialogic space’ to co-construct new understanding in a wiki science project. The students’ asynchronous interaction process in a primary classroom—which led to the creation of a science text in the wiki—was analysed and characterised, using a dialogic approach to the study of CSCL practices. Our results illustrate how the Thinking Together approach became embedded within the wiki environment and in the students’ collaborative processes. We argue that a dialogic approach for examining interaction can be used to help design more effective pedagogic approaches related to the use of wikis in education and to equip learners with the competences they need to participate in the global knowledge-construction era.

8.
Reachability analysis asks whether a system can evolve from legitimate initial states to unsafe states. It is thus a fundamental tool in the validation of computational systems—be they software, hardware, or a combination thereof. We recall a standard approach for reachability analysis, which captures the system in a transition system, forms another transition system as an over-approximation, and performs an incremental fixed-point computation on that over-approximation to determine whether unsafe states can be reached. We show this method to be sound for proving the absence of errors, and discuss its limitations for proving the presence of errors, as well as some means of addressing this limitation. We then sketch how program annotations for data integrity constraints and interface specifications—as in Bertrand Meyer’s paradigm of Design by Contract—can facilitate the validation of modular programs, e.g., by obtaining more precise verification conditions for software verification supported by automated theorem proving. Then we recap how the decision problem of satisfiability for formulae of logics with theories—e.g., bit-vector arithmetic—can be used to construct an over-approximating transition system for a program. Programs with data types comprised of bit-vectors of finite width require bespoke decision procedures for satisfiability. Finite-width data types challenge the reduction of that decision problem to one that off-the-shelf tools can solve effectively, e.g., SAT solvers for propositional logic. In that context, we recall the Tseitin encoding which converts formulae from that logic into conjunctive normal form—the standard format for most SAT solvers—with only linear blow-up in the size of the formula, but linear increase in the number of variables. Finally, we discuss the contributions that the three papers in this special section make in the areas that we sketched above.
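To make the Tseitin step concrete, the sketch below shows the textbook encoding: each subformula gets a fresh variable, and clauses force that variable to agree with the subformula, so the resulting CNF grows only linearly. This is a generic illustration, not code from the papers in the special section.

```python
# A formula is an int literal or a tuple ('not'|'and'|'or', *children).
def tseitin(formula, next_var):
    """Return (root_literal, clauses, next_var); clauses are lists of int literals."""
    clauses = []

    def encode(node):
        nonlocal next_var
        if isinstance(node, int):
            return node                      # input variable / literal
        op, *kids = node
        lits = [encode(k) for k in kids]
        v, next_var = next_var, next_var + 1  # fresh variable for this subformula
        if op == 'not':
            (a,) = lits
            clauses.extend([[-v, -a], [v, a]])           # v <-> not a
        elif op == 'and':
            clauses.extend([-v, a] for a in lits)        # v -> each conjunct
            clauses.append([v] + [-a for a in lits])     # all conjuncts -> v
        elif op == 'or':
            clauses.append([-v] + lits)                  # v -> some disjunct
            clauses.extend([v, -a] for a in lits)        # each disjunct -> v
        return v

    root = encode(formula)
    clauses.append([root])                   # assert the whole formula
    return root, clauses, next_var

# (x1 or x2) and not x1, with fresh variables starting at 3
root, cnf, _ = tseitin(('and', ('or', 1, 2), ('not', 1)), 3)
print(cnf)  # CNF ready for a standard SAT solver
```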

9.
This paper presents the main features of an extension to Prolog toward modularity and concurrency—called Communicating Prolog Units (CPU)—whose main aim is to allow logic programming to be used as an effective tool for system programming and prototyping. While Prolog supports only a single set of clauses and sequential computations, CPU allows programmers to define different theories (P-units) and parallel processes interacting via P-units, according to a model very similar to Linda’s generative communication. The possibility of expressing meta-rules to specify where and how object-level (sub)goals have to be proved not only enhances modularity, but also increases the expressive power and flexibility of CPU systems.
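CPU's process interaction is modeled on Linda's generative communication, in which processes coordinate by depositing and withdrawing tuples from a shared space. The sketch below is a minimal Python rendering of that coordination model (with `None` as a wildcard), not the CPU system itself.

```python
import threading

class TupleSpace:
    """A minimal Linda-style tuple space: out() deposits, in_() withdraws."""
    def __init__(self):
        self._cv = threading.Condition()
        self._tuples = []

    def out(self, t):
        with self._cv:
            self._tuples.append(t)
            self._cv.notify_all()

    @staticmethod
    def _match(pattern, t):
        return len(pattern) == len(t) and all(
            p is None or p == v for p, v in zip(pattern, t))  # None = wildcard

    def in_(self, pattern):          # 'in' is reserved in Python, hence in_
        with self._cv:
            while True:
                for t in self._tuples:
                    if self._match(pattern, t):
                        self._tuples.remove(t)
                        return t
                self._cv.wait()      # block until a matching tuple appears

space = TupleSpace()
threading.Thread(target=lambda: space.out(('goal', 'solved', 42))).start()
print(space.in_(('goal', 'solved', None)))  # -> ('goal', 'solved', 42)
```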

10.
Linux malware can pose a significant threat—its penetration of Linux systems is increasing rapidly—because little is known or understood about Linux OS vulnerabilities. We believe that now is the right time to devise non-signature-based zero-day (previously unknown) malware detection strategies, before Linux intruders take us by surprise. Therefore, in this paper, we first perform a forensic analysis of Linux executable and linkable format (ELF) files. Our forensic analysis provides insight into the features that have the potential to discriminate malicious executables from benign ones. As a result, we select a set of 383 features extracted from ELF headers. We quantify the classification potential of these features using information gain and then remove redundant features by employing preprocessing filters. Finally, we carry out an extensive evaluation of classical rule-based machine learning classifiers—RIPPER, PART, C4.5 Rules, and the J48 decision tree—and bio-inspired classifiers—cAnt Miner, UCS, XCS, and GAssist—to select the best classifier for our system. We have evaluated our approach on an available collection of 709 Linux malware samples from VX Heavens and Offensive Computing. Our experiments show that ELF-Miner provides more than 99% detection accuracy with less than 0.1% false alarm rate.
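The information gain step quantifies how much a feature reduces uncertainty about the malicious/benign label. The sketch below is a generic implementation with toy data, not the ELF-Miner code.

```python
import numpy as np

def entropy(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(x, y):
    """Reduction in class entropy from conditioning on discrete feature x."""
    gain = entropy(y)
    for v in np.unique(x):
        mask = x == v
        gain -= mask.mean() * entropy(y[mask])
    return gain

# Toy data: one binary header feature against malicious/benign labels
x = np.array([1, 1, 0, 0, 1, 0])
y = np.array(['mal', 'mal', 'ben', 'ben', 'mal', 'ben'])
print(information_gain(x, y))  # 1.0 bit: the feature separates the classes perfectly
```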

11.
Approximations play an important role in optimization to reduce the number of expensive analysis runs. In particular, metamodels are one way to reduce CPU time in multidisciplinary optimization and are sometimes the enabler for huge optimization projects. One type of approximation is interpolation, and radial basis functions (RBFs) are an example of it. In general, a large class of radial basis functions has the attractive property that—independently of the arrangement of sampling points—the existence of a unique solution can be guaranteed for the linear equation system that is solved to determine the coefficients. However, this does not mean that their handling is uncritical, as can be seen for radial basis functions of Gaussian type (GRBFs). In this case, ill-conditioning and Runge-type oscillations spoil the tuning of the interpolant’s shape parameter and make its general application as a metamodel impossible. We introduce a heuristic approach to modify the GRBFs in a way that allows the shape parameter to be optimized within a much larger range before ill-conditioning appears. This approach also appears to resolve the Runge-type oscillations.
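The ill-conditioning described above is easy to reproduce: as the Gaussian shape parameter shrinks (flatter basis functions), the interpolation matrix becomes nearly singular. The sketch below uses the standard GRBF formulation, not the authors' modified heuristic.

```python
import numpy as np

def grbf_matrix(x, eps):
    """Interpolation matrix A[i, j] = exp(-(eps * |x_i - x_j|)**2)."""
    r = np.abs(x[:, None] - x[None, :])
    return np.exp(-(eps * r) ** 2)

x = np.linspace(0.0, 1.0, 15)      # sampling points
f = np.sin(2 * np.pi * x)          # values to interpolate

for eps in (10.0, 3.0, 1.0, 0.3):  # flatter basis -> worse conditioning
    cond = np.linalg.cond(grbf_matrix(x, eps))
    print(f"eps = {eps:4}: cond(A) = {cond:.2e}")

coeffs = np.linalg.solve(grbf_matrix(x, 10.0), f)  # solve the well-conditioned case
```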

12.
Information technology (IT) innovation research examines the organizational and technological factors that determine IT adoption and diffusion, including firm size and scope, technological competency and expected benefits. We extend the literature by focusing on information requirements as a driver of IT innovation adoption and diffusion. Our framework of IT innovation diffusion incorporates three industry-level sources of information requirements: process complexity, clock speed and supply chain complexity. We apply the framework to US manufacturing industries using aggregate data of internet-based innovations and qualitative analysis of two industries: wood products and beverage manufacturing. Results show systematic patterns supporting the basic thesis of the information processing paradigm: higher IT innovation diffusion in industries with higher information processing requirements; the salience of downstream industry structure in the adoption of interorganizational systems; and the role of the location of information intensity in the supply chain in determining IT adoption and diffusion. Our study provides a new explanation for why certain industries were early and deep adopters of internet-based innovations while others were not: variation in information processing requirements.

13.
Hypercomputation—the hypothesis that Turing-incomputable objects can be computed through infinitary means—is ineffective, as the unsolvability of the halting problem for Turing machines depends only on the absence of a definite value for a paradoxical construction; the nature and quantity of computing resources are immaterial. The assumption that the halting problem is solved by oracles of higher Turing degree amounts to mere postulation; infinite-time oracles are not actually solving paradoxes, but simply assigning them conventional values. Special values for non-terminating processes are likewise irrelevant, since diagonalization can cover any number of value assignments. This should not be construed as a restriction of computing power: Turing’s uncomputability is not a ‘barrier’ to be broken, but simply an effect of the expressive power of consistent programming systems.
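The paradoxical construction referred to above is the usual diagonal argument, sketched below in Python. Here `halts` is an assumed halting oracle (no such total function exists); the point is that whatever value it assigns to the self-application is inverted by construction.

```python
def diagonal(halts):
    """Diagonal program defeating an assumed halting oracle `halts(prog, arg)`."""
    def d(p):
        if halts(p, p):   # the oracle predicts d(p) halts ...
            while True:   # ... so loop forever instead,
                pass
        return 0          # ... otherwise halt immediately
    return d(d)           # whatever value halts(d, d) takes, it is wrong
```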

14.
In today’s dynamic business environments, organizations are under pressure to modernize their existing software systems in order to respond to changing business demands. Service-oriented architectures provide a composition framework for creating new business functionality from autonomous building blocks called services, enabling organizations to adapt quickly to changing conditions and requirements. The characteristics of services offer the promise of leveraging the value of enterprise systems through source code reuse. In this respect, existing system components can be used as the foundation of newly created services. However, one problem to overcome is the lack of business semantics to support the reuse of existing source code. Without sufficient semantic knowledge about the code in the context of business functionality, it is impossible to utilize source code components in services development. In this paper, we present an automated approach to enriching source code components with business semantics. Our approach is based on the idea that the gap between the two ends of an enterprise system—(1) services as processes and (2) source code—can be bridged via the similarity of data definitions used at both ends. We evaluate our approach in the framework of a commercial enterprise systems application. Initial results indicate that the proposed approach is useful for annotating source code components with business-specific knowledge.
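A hedged sketch of the core idea: match source-code data definitions to business-process terms by name similarity. The field and term names below are invented, and the similarity measure is a simple stand-in rather than the paper's actual technique.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized similarity between two identifiers (0..1)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical names from the two ends of an enterprise system
code_fields = ["cust_order_id", "invoice_total", "ship_addr"]
process_terms = ["Customer Order", "Invoice Amount", "Shipping Address"]

for field in code_fields:
    best = max(process_terms,
               key=lambda term: similarity(field, term.replace(" ", "_")))
    print(f"{field:15s} -> {best}")   # annotate the field with the closest term
```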

15.
This study examines an Emergency Medical Service (EMS) in order to analyze the composite set of activities and instruments directed at locating the patient. Good management of information about the location of the emergency is critical for a reliable rescue service, but this information depends on knowledge of the territory that is socially distributed between EMS operators and callers. Accordingly, the decision-making process often has to go beyond the emergency service protocols, engaging the operator in an open negotiation that transforms the caller’s role from layman to “co-worker”. The patient’s location turns out to be an emergent phenomenon: collaborative work, based on knowledge management, involving two partially overlapping communities—the callers and the EMS operators. Drawing examples from emergency calls, the study analyzes the practice of locating a patient as a complex and multi-layered process, highlighting the role played by new and old technologies (the information system and the paper maps) in this activity. We argue that CSCW technologies enable the blended use of different kinds of instruments and support an original interconnection between professional localization systems and the public’s way of defining a position.

16.
Emergent behaviour—system behaviour not determined by the behaviours of system components when considered in isolation—is commonplace in multi-agent systems, particularly when agents adapt to environmental change. This article considers the manner in which formal methods may be used to authenticate the trustworthiness of such systems. Techniques are considered for capturing emergent behaviour in the system specification, and the incremental refinement method is then applied to justify design decisions embodied in an implementation. To demonstrate the approach, one- and two-dimensional cellular automata are studied. In particular, an incremental refinement of the ‘glider’ in Conway’s Game of Life is given from its specification.
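For reference, the glider's behaviour can be checked directly with a plain Game of Life simulation: after four generations the pattern reappears translated one cell diagonally. The sketch below is an ordinary simulation, not the formal refinement the article develops.

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):      # the glider has period 4 ...
    state = step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # ... shifted by (1, 1): True
```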

17.
Significant enhancement of the thermal stability of holograms recorded on photochromic materials has been achieved by covalently bonding the photochromic dye to the polymer matrix, as compared with host-guest systems. A one-time partial reduction of the hologram’s initial diffraction efficiency (DE) on thermal exposure was observed for both polymers with attached dye and host-guest materials. This one-time reduction is interpreted as thermal relaxation of the polymer network induced by the photochromic transition in the dye molecule. Gradual hologram erasure at elevated temperatures was observed for the host-guest system, which is assumed to be due to dye diffusion between highly lit and dark areas. For photochromic materials with covalent bonding of the dye to the polymer matrix, the level of thermal stability is such that at temperatures of ca. 100 °C there was no detectable diffusion-type degradation of the hologram after 6 hours of exposure to the elevated temperature. The same heating of a hologram in the photochromic host-guest polymer led to full hologram erasure within 40 minutes. In both cases, the one-time initial DE reduction due to polymer matrix relaxation at elevated temperatures was comparable, at about 0.5 of the hologram’s initial DE.

18.
To date, the most popular and sophisticated types of virtual worlds can be found in the area of video gaming, especially in the genre of Massively Multiplayer Online Role-Playing Games (MMORPGs). Game developers have made great strides in achieving game worlds that look and feel increasingly realistic. However, despite these achievements in the visual realism of virtual game worlds, such worlds are much less sophisticated when it comes to modeling face-to-face interaction. In face-to-face interaction, ordinary social activities are “accountable”; that is, people use a variety of kinds of observational information about what others are doing in order to make sense of others’ actions and to tightly coordinate their own actions with those of others. Such information includes: (1) the real-time unfolding of turns-at-talk; (2) the observability of embodied activities; and (3) the direction of eye gaze for the purpose of gesturing. But despite the fact that today’s games provide virtual bodies, or “avatars”, for players to control, these avatars display much less information about players’ current state than real bodies do. In this paper, we discuss the impact of the lack of each type of information on players’ ability to tightly coordinate their activities and offer guidelines for improving coordination and, ultimately, the players’ social experience. “They come here to talk turkey with suits from around the world, and they consider it just as good as a face-to-face. They more or less ignore what is being said—a lot gets lost in translation, after all. They pay attention to the facial expressions and body language of the people they are talking to. And that’s how they know what’s going on inside a person’s head—by condensing fact from the vapor of nuance.” —Neal Stephenson, Snow Crash, 1992

19.
In D’Ariano (Philosophy of Quantum Information and Entanglement, Cambridge University Press, Cambridge, UK, 2010), one of the authors proposed a set of operational postulates to be considered for axiomatizing Quantum Theory. The underlying idea is to derive Quantum Theory as the mathematical representation of a fair operational framework, i.e. a set of rules which allows the experimenter to make predictions about future events on the basis of suitable tests, e.g. without interference from uncontrollable sources and with local control and low experimental complexity. In addition to causality, two main postulates have been considered: PFAITH (existence of a pure preparationally faithful state) and FAITHE (existence of a faithful effect). These postulates have exhibited an unexpected theoretical power, excluding all known nonquantum probabilistic theories. The same paper also introduced postulate PURIFY-1 (purifiability of all states), which was later reconsidered in the stronger version PURIFY-2 (purifiability of all states, unique up to reversible channels on the purifying system) in Chiribella et al. (Reversible realization of physical processes in probabilistic theories, arXiv:0908.1583). There, it was shown that postulate PURIFY-2, along with causality and local discriminability, narrows the probabilistic theory down to something very close to the quantum one. In the present paper we test the above postulates on some nonquantum probabilistic models. The first model—the two-box world—is an extension of the Popescu–Rohrlich model (Found. Phys. 24:379, 1994), which achieves the greatest violation of the CHSH inequality compatible with the no-signaling principle. The second model—the two-clock world—is actually a full class of models, all having a disk as the convex set of states of the local system. One of them—the two-rebit world—corresponds to qubits with a real Hilbert space. The third model—the spin factor—is a sort of n-dimensional generalization of the clock. Finally, the last model is classical probabilistic theory. We see how each model violates some of the proposed postulates, when and how teleportation can be achieved, and we analyze other interesting connections between these postulate violations, along with deep relations between the local and non-local structures of the probabilistic theory.
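The CHSH violation attributed to the two-box world can be verified with simple arithmetic: the Popescu–Rohrlich correlators give S = 4, beyond both the classical bound of 2 and Tsirelson's quantum bound of 2√2. A minimal check:

```python
import math

# Popescu–Rohrlich box correlators: E(x, y) = +1 except E(1, 1) = -1
E = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): -1}
S = E[0, 0] + E[0, 1] + E[1, 0] - E[1, 1]

print(S)                 # 4: maximal violation allowed by no-signaling
print(2 * math.sqrt(2))  # ~2.83: Tsirelson's quantum bound (classical bound is 2)
```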

20.