20 similar records found (search time: 31 ms)
1.
While terrorism informatics research has examined the technical composition of extremist media, there is less work examining
the content and intent behind such media. We propose that the arguments and issues presented in extremist media provide insights
into authors’ intent, which in turn may provide an evidence-base for detecting and assessing risk. We explore this possibility
by applying two quantitative text-analysis methods to 50 online texts that incite violence as a result of the 2008/2009 Israeli
military action in Gaza and the West Bank territories. The first method—a content coding system that identifies the occurrence
of persuasive devices—revealed a predominance of moral proof arguments within the texts, and evidence for distinguishable
‘profiles’ of persuasion use across different authors and different group affiliations. The second method—a corpus-linguistic
technique that identifies the core concepts and narratives that authors use—confirmed the use of moral proof to create an
in-group/out-group divide, while also demonstrating a movement from general expressions of discontent to more direct audience-orientated
expressions of violence as conflict heightened. We conclude that multi-method analyses are a valuable approach to building
both an evidence-based understanding of terrorist media use and a valid set of applications within terrorist informatics.
2.
Stefan Gruner 《Minds and Machines》2011,21(2):275-299
On the basis of an earlier contribution to the philosophy of computer science by Amnon Eden, this essay discusses to what
extent Eden’s ‘paradigms’ of computer science can be transferred or applied to software engineering. This discussion implies
an analysis of how software engineering and computer science are related to each other. The essay concludes that software
engineering can neither be fully subsumed by computer science, nor vice versa. Consequently, the philosophies of computer
science and software engineering—though related to each other—are also not identical branches of a general philosophy of science.
This also implies that not all of Eden’s earlier arguments can be directly mapped from the domain of computer science into
the domain of software science. After the discussion of this main topic, the essay also points to some further problems and
open issues for future studies in the philosophy of software science and engineering.
3.
Manipulatives—physical learning materials such as cubes or tiles—are prevalent in educational settings across cultures and
have generated substantial research into how actions with physical objects may support children’s learning. The ability to
integrate digital technology into physical objects—so-called ‘digital manipulatives’—has generated excitement over the potential
to create new educational materials. However, without a clear understanding of how actions with physical materials lead to
learning, it is difficult to evaluate or inform designs in this area. This paper is intended to contribute to the development
of effective tangible technologies for children’s learning by summarising key debates about the representational advantages
of manipulatives under two key headings: offloading cognition—where manipulatives may help children by freeing up valuable cognitive resources during problem solving, and conceptual metaphors—where perceptual information or actions with objects have a structural correspondence with more symbolic concepts. The review
also indicates possible limitations of physical objects—most importantly that their symbolic significance is only granted
by the context in which they are used. These arguments are then discussed in light of tangible designs drawing upon the authors’
current research into tangibles and young children’s understanding of number.
4.
Bradley Rives 《Minds and Machines》2009,19(2):199-227
It is commonplace in cognitive science that concepts are individuated in terms of the roles they play in the cognitive lives
of thinkers, a view that Jerry Fodor has recently dubbed ‘Concept Pragmatism’. Quinean critics of Pragmatism have long
argued that it founders on its commitment to the analytic/synthetic distinction, since without such a distinction there is
plausibly no way to distinguish constitutive from non-constitutive roles in cognition. This paper considers Fodor’s empirical
arguments against analyticity, and in particular his arguments against lexical decomposition and definitions, and argues that
Concept Pragmatists have two viable options with respect to them. First, Concept Pragmatists can confront them head-on, and
argue that they do not show that lexical items are semantically primitive or that lexical concepts are internally unstructured.
Second, Pragmatists may accept that these arguments show that lexical concepts are atomic, but insist that this need not entail
that Pragmatism is false. For there is a viable version of Concept Pragmatism that does not take lexical items to be semantically
structured or lexical concepts to be internally structured. Adopting a version of Pragmatism that takes meaning relations
to be specified by inference rules, or meaning postulates, allows one to accept the empirical arguments in favor of Concept
Atomism while at the same time denying that such arguments show that there are no analyticities. The paper concludes by responding
to Fodor’s recent objection that such a version of Concept Pragmatism has unhappy consequences concerning the relation between
concept constitution and concept possession.
5.
Daniel Fallman 《AI & Society》2010,25(1):43-52
Traditional human–computer interaction (HCI) allowed researchers and practitioners to share and rely on the ‘five E’s’ of
usability, the principle that interactive systems should be designed to be effective, efficient, engaging, error tolerant,
and easy to learn. A recent trend in HCI, however, is that academic researchers as well as practitioners are becoming increasingly
interested in user experiences, i.e., understanding and designing for relationships between users and artifacts that are for
instance affective, engaging, fun, playable, sociable, creative, involving, meaningful, exciting, ambiguous, and curious.
In this paper, it is argued that built into this shift in perspective there is a concurrent shift in accountability that is
drawing attention to a number of ethical, moral, social, cultural, and political issues that have been traditionally de-emphasized
in a field of research guided by usability concerns. Not surprisingly, this shift in accountability has also received scant
attention in HCI. To be able to find any answers to the question of what makes a good user experience, the field of HCI needs
to develop a philosophy of technology. One building block for such a philosophy of technology in HCI is presented. Albert Borgmann argues that we need to be cautious
and rethink the relationship as well as the often-assumed correspondence between what we consider useful and what we think
of as good in technology. This junction—that some technologies may be both useful and good, while some technologies that are
useful for some purposes might also be harmful, less good, in a broader context—is at the heart of Borgmann’s understanding
of technology. Borgmann’s notion of the device paradigm is a valuable contribution to HCI as it points out that we are increasingly experiencing the world with, through, and by
information technologies and that most of these technologies tend to be designed to provide commodities that effortlessly grant our wishes without demanding anything in return, such as patience, skills, or effort. This paper
argues that Borgmann’s work is relevant and makes a valuable contribution to HCI in at least two ways: first, as a different
way of seeing that raises important social, cultural, ethical, and moral issues from which contemporary HCI cannot escape;
and second, as providing guidance as to how specific values might be incorporated into the design of interactive systems that
foster engagement with reality.
6.
Mark Hogarth 《Natural computing》2009,8(3):493-498
Wittgenstein saw a problem with the idea that ‘rule following’ is a transparent process. Here I present an additional problem,
based on recent ideas about non-Turing computing. I show that even the simplest algorithm—Frege’s successor function, i.e.
counting—cannot by itself determine the ‘output’. Specification of a computing machine is also required.
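The point can be made concrete with a deliberately simple sketch (illustrative only, not taken from the paper): the same successor rule, run under two different machine specifications, yields different outputs, so the rule alone does not determine the result.

```python
# Illustrative sketch: one 'add one' rule, two machine specifications.
# The rule is identical; the outputs differ.

def successor_unbounded(n: int) -> int:
    """Successor on the (idealised) unbounded integers."""
    return n + 1

def successor_8bit(n: int) -> int:
    """Successor on an 8-bit register machine: arithmetic wraps at 256."""
    return (n + 1) % 256

print(successor_unbounded(255))  # 256
print(successor_8bit(255))       # 0
```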
7.
Ina Wagner 《Computer Supported Cooperative Work (CSCW)》2012,21(1):1-42
The focus of this paper is on studying mixed teams of urban planners, citizens and other stakeholders co-constructing their
vision for the future of a site. The MR Tent provides a very specific collaborative setting: an assembly of technologies brought outdoors onto the site of an urban project,
which offers vistas onto the site as well as a multiplicity of representations of the site to work with, in different media
and taken from different perspectives. The prime focus of this paper is on the complex narratives participants co-constructed
in three participatory workshops, with the aim of understanding how the core aspects of the MR Tent—spatiality, representation
and haptic engagement—shape these narratives. Main findings of this research concern: how the design of the multi-layered
space of the MR-Tent supports spatial story-telling; how the different representations of the site of an urban project offer
the opportunity to choreograph a ‘site-seeing’ that helps participants understand the site and plan interventions; how the
‘tangibles’ in the MR-Tent encourage a different way of contributing to a shared project and ‘building a vision’.
8.
Ed Blakey 《Natural computing》2011,10(4):1245-1259
Unconventional computers—which may, for example, exploit chemical/analogue/quantum phenomena in order to compute, rather than
electronically implementing discrete logic gates—are widely studied in both theoretical and practical contexts. One particular
motivation behind unconventional computation is the desire efficiently to solve classically difficult problems—we recall chemical-computer
attempts at solving NP-complete problems such as the Travelling Salesperson Problem—with computational complexity theory offering the criteria
for judging this efficiency. However, care must be taken here; conventional (Turing-machine-style) complexity analysis is
not always appropriate for unconventional computers: new, non-standard computational resources, with correspondingly new complexity
measures, are often required. Accordingly, we discuss in the present paper various resources beyond merely time and space
(and, indeed, discuss various interpretations of the term ‘resource’ itself), advocating such resources’ consideration when
analysing the complexity of unconventional computers. We hope that this acts as a useful starting point for practitioners
of unconventional computing and computational complexity.
9.
Maurice van Keulen, Ander de Keijzer 《The VLDB Journal》2009,18(5):1191-1217
In data integration efforts, portal development in particular, much development time is devoted to entity resolution. Often
advanced similarity measurement techniques are used to remove semantic duplicates or solve other semantic conflicts. It proves
impossible, however, to automatically get rid of all semantic problems. An often-used rule of thumb states that about 90%
of the development effort is devoted to semi-automatically resolving the remaining 10% of hard cases. In an attempt to significantly
decrease human effort at data integration time, we have proposed an approach that strives for a ‘good enough’ initial integration
which stores any remaining semantic uncertainty and conflicts in a probabilistic database. The remaining cases are to be resolved
with user feedback during query time. The main contribution of this paper is an experimental investigation of the effects
and sensitivity of rule definition, threshold tuning, and user feedback on the integration quality. We claim that our approach
indeed reduces development effort—and not merely shifts the effort—by showing that setting rough safe thresholds and defining
only a few rules suffices to produce a ‘good enough’ initial integration that can be meaningfully used, and that user feedback
is effective in gradually improving the integration quality.
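The two-threshold idea can be sketched as follows (the names and the similarity measure are illustrative, not the authors' system): pairs scoring above an upper threshold are merged automatically, pairs below a lower threshold are kept apart, and the uncertain band in between is stored as possible duplicates to be resolved by user feedback at query time.

```python
# Illustrative sketch of two-threshold entity resolution (not the authors'
# implementation): only the middle similarity band is kept as uncertain.
import difflib

def classify_pair(a: str, b: str, lo: float = 0.6, hi: float = 0.9) -> str:
    """Classify a candidate duplicate pair by string similarity."""
    sim = difflib.SequenceMatcher(None, a, b).ratio()
    if sim >= hi:
        return "merge"        # confident duplicate: merge automatically
    if sim < lo:
        return "distinct"     # confident non-duplicate: keep apart
    return "uncertain"        # store probabilistically; ask the user later

print(classify_pair("John Smith", "John Smith"))   # merge
print(classify_pair("John Smith", "J. Smith"))     # uncertain
print(classify_pair("John Smith", "Alice Wong"))   # distinct
```

Rough, safe thresholds in this style err on the side of the uncertain band, which is exactly the portion deferred to query-time feedback.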
10.
This paper explores how research teams in Intel’s Digital Health Group are using ethnography to identify ‘designable moments’—spaces,
times, objects, issues and practices which suggest opportunities for appropriate interventions. It argues that technology
innovation should aim to incorporate the views, experiences and practices of users from the start of the design process to
support independent living and develop culturally sensitive enhancements that contribute towards wellbeing and a life of quality
for local older populations.
11.
Dis/integrating animals: ethical dimensions of the genetic engineering of animals for human consumption
Traci Warkentin 《AI & Society》2006,20(1):82-102
Research at the intersections of feminism, biology and philosophy provides dynamic starting grounds for this discussion of
genetic technologies and animals. With a focus on animal bodies, I will examine moral implications of the genetic engineering
of “domesticated” animals—primarily pigs and chickens—for the purposes of human consumption. Concepts of natural and artificial,
contamination and purity, integrity and fragmentation and mind and body will feature in the discussion. In this respect, Margaret
Atwood’s novel, Oryx and Crake, serves as a cogent medium for exploring these highly contentious practices and ideas as it provides hypothetical narratives
of possibility. Moreover, it is used to highlight contemporary hegemonic assumptions and values in ways that make them visible.
Particular attention is paid to issues of growing human organs in pigs for xenotransplantation (resulting, for Atwood, in
“pigoons”) and the ultimate end of the intensive factory farming of chickens through the genetic engineering of ‘mindless’
chicken tumours (or, as Atwood calls them, “ChickieNobs”). Integral to these philosophical considerations is the provocative
question of the genetic modification of animal bodies as a means to end the suffering of domestic food animals. The ultimate
implications of this question include an ongoing sensory and moral deprivation of human experience, potentially resulting
in a future mechanomorphosis, the extreme manifestation of an existing mechanomorphism.
12.
Helen J. Richardson 《Information Systems Frontiers》2009,11(5):599-608
This paper discusses the domestication of Information and Communication Technologies (ICTs), particularly their use, in UK
households reporting on research undertaken between 1998 and 2004. Issues raised are linked to the dominant discourse of the
‘digital divide’, which in the UK means engaging with ICTs in a ‘meaningful’ way to ensure the economic and social well-being
of UK plc (public limited company—in the UK this refers to companies whose shares can be sold to the public. The acronym is
used here ironically to indicate the motivation of the government to brand and promote the UK as a whole.). Utilising a framework
of understanding digital inequality and the ‘deepening divide’, domestication theory is applied to discuss motivational, material
and physical, skills and usage access in the gendered household, critically contrasting this approach to ‘smart house’ research.
This qualitative enquiry contributes to the neglected area of domestication studies in Information Systems research.
13.
Emergent behaviour—system behaviour not determined by the behaviours of system components when considered in isolation—is
commonplace in multi-agent systems, particularly when agents adapt to environmental change. This article considers the manner
in which Formal Methods may be used to authenticate the trustworthiness of such systems. Techniques are considered for capturing
emergent behaviour in the system specification and then the incremental refinement method is applied to justify design decisions
embodied in an implementation. To demonstrate the approach, one- and two-dimensional cellular automata are studied. In particular,
an incremental refinement of the ‘glider’ in Conway’s Game of Life is given from its specification.
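For orientation, the emergent behaviour in question can be reproduced with a generic sketch of Conway's Game of Life (this is only the standard automaton, not the article's formal specification or refinement): the five-cell glider re-emerges translated one cell diagonally every four generations.

```python
# Generic sketch of Conway's Game of Life: the glider re-emerges
# translated by (1, 1) every 4 generations.
from collections import Counter

def step(cells: set) -> set:
    """One synchronous generation on an unbounded grid of (x, y) cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {c for c, n in neighbour_counts.items()
            if n == 3 or (n == 2 and c in cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
print(cells == {(x + 1, y + 1) for x, y in glider})  # True
```

Nothing in the two local rules mentions translation; the glider's motion is exactly the kind of behaviour "not determined by the behaviours of system components when considered in isolation".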
14.
As part of the Bristol Wearable Computing Initiative, we are exploring location-sensing systems suitable for use with wearable
computing. In this paper we present our findings, and in particular a wearable application — the ‘Shopping Jacket’ — which
relies on a minimal infrastructure to be effective. We use two positioning devices, ‘pingers’ and GPS. The pinger is used
to signal the presence of a shop, and to indicate the type of shop and its website. The GPS is used to disambiguate which
branch of a high street chain is being passed. The wearable uses this information to determine whether the wearer needs to
be alerted that they are passing an interesting shop, or to direct the wearer around a shopping mall. The Shopping Jacket
integrates a wearable CardPC, GPS and pinger receivers, a near-field radio link, hand-held display, GSM data telephone and
a speech interface into a conventional sports blazer.
15.
Zippora Arzi-Gonczarowski 《Annals of Mathematics and Artificial Intelligence》1999,26(1-4):215-252
This paper formalizes and analyzes cognitive transitions between artificial perceptions that consist of an analogical or metaphorical
transference of perception. The formalization is performed within a mathematical framework that has been used before to formalize
other aspects of artificial perception and cognition. The mathematical infrastructure consists of a basic category of ‘artificial
perceptions’. Each ‘perception’ consists of a set of ‘world elements’, a set of ‘connotations’, and a three valued (true,
false, undefined) predicative connection between the two sets. ‘Perception morphisms’ describe structure preserving paths
between perceptions. Quite a few artificial cognitive processes can be viewed and formalized as perception morphisms or as
other categorical constructs. We show here how analogical transitions can be formalized in a similar way. A factorization
of every analogical transition is shown to formalize metaphorical perceptions that are inspired by the analogy. It is further
shown how structural aspects of ‘better’ analogies and metaphors can be captured and evaluated by the same categorical setting,
as well as generalizations that emerge from analogies. The results of this study are then embedded in the existing mathematical
formalization of other artificial cognitive processes within the same premises. A fallout of the rigorous unified mathematical
theory is that structured analogies and metaphors share common formal aspects with other perceptually acute cognitive processes.
This revised version was published online in June 2006 with corrections to the Cover Date.
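One plausible concrete reading of the basic structure (an illustrative sketch under assumed definitions, not the paper's categorical formalism) represents a perception as a three-valued predicate over world elements and connotations, with a morphism given by a pair of maps that preserves every defined truth value:

```python
# Assumed reading, for illustration only: a 'perception' is a 3-valued
# predicate p(w, c) in {True, False, None (undefined)}; a candidate
# morphism is a pair (f on world elements, g on connotations) that must
# preserve every defined (True/False) value.

def is_morphism(p1, p2, f, g, worlds, connotations):
    """Check that defined values of p1 are preserved under (f, g) in p2."""
    for w in worlds:
        for c in connotations:
            v = p1.get((w, c))
            if v is not None and p2.get((f[w], g[c])) != v:
                return False
    return True

# Two toy perceptions: a robin-perception mapped into an animal-perception.
p1 = {("robin", "flies"): True, ("robin", "barks"): False}
p2 = {("bird", "flies"): True, ("bird", "barks"): False,
      ("dog", "flies"): False}
f = {"robin": "bird"}                      # world-element map
g = {"flies": "flies", "barks": "barks"}   # connotation map
print(is_morphism(p1, p2, f, g, ["robin"], ["flies", "barks"]))  # True
```

Analogical transitions in the paper's sense would then be particular composites of such structure-preserving maps; the sketch only fixes the underlying data.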
16.
Prices are macro-observables of a financial market that result from the trading actions of a huge number of individual investors.
Major stylized facts of empirical asset returns concern (i) non-Gaussian distribution of empirical asset returns and (ii)
volatility clustering, i.e., the slow decay of autocorrelations of absolute returns. We propose a model for the aggregate
dynamics of the market which is generated by the coupling of a ‘slow’ and a ‘fast’ dynamical component, where the ‘fast’ component
can be seen as a perturbation of the ‘slow’ one. Statistical properties of price changes in this model are estimated by simulation;
sample size is 4 × 10⁶. It is shown that increasing the decoupling of these two dynamical levels generates a crossover in the distribution of log
returns from a concave Gaussian-like distribution to a convex, truncated Lévy-like one. For a sufficiently large degree of
dynamic decoupling, the return trails exhibit pronounced volatility clustering.
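The volatility-clustering diagnostic can be sketched generically (a hedged toy example, unrelated to the authors' specific slow/fast model): in a series whose volatility switches slowly between regimes, absolute returns are strongly autocorrelated even though the returns themselves are not.

```python
# Generic diagnostic for volatility clustering: slow regime-switching
# volatility makes |returns| autocorrelated while returns stay ~uncorrelated.
import random

def autocorr(xs, lag=1):
    """Sample lag-k autocorrelation of a sequence."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    cov = sum((xs[t] - mean) * (xs[t + lag] - mean)
              for t in range(n - lag)) / (n - lag)
    return cov / var

rng = random.Random(42)
returns = []
for t in range(4000):
    sigma = 2.0 if (t // 100) % 2 else 0.5     # 'slow' volatility component
    returns.append(sigma * rng.gauss(0.0, 1.0))  # 'fast' shock

print(autocorr([abs(r) for r in returns]) > 0.15)  # True: clustering
print(abs(autocorr(returns)) < 0.15)               # True: no linear memory
```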
17.
Peter Jones 《AI & Society》2010,25(4):455-464
The paper offers a critical reflection, inspired by the insights of integrational linguistics, on the conception of thinking
and action within the distributed cognition approach of Edwin Hutchins. Counterposing a fictional account of a mutiny at sea
to Hutchins’ observational study of navigation on board the Palau, the paper argues that the ethical fabric of communication and action with its ‘first person’ perspective must not be overlooked
in our haste to appeal to ‘culture’ as an alternative to the internalist, computer metaphor of thinking. The paper accepts
Hutchins’ own critique of the ‘meaning in the message’ illusion but goes beyond this critique to argue for a view of communication,
thinking and action as creative, ethically charged and morally accountable acts of engagement.
18.
In this paper we reflect on a body of work to develop a simpler form of digital photography. We give three examples of ‘Less
is More’ thinking in this area which are directed and inspired by naturalistic user behaviours and reactions to design ideas.
Each example happens to review the place of an old technology in the new scheme of things, and challenges a technological
trend in the industry. Hence, we consider the role of sound in photography to recommend audiophotographs rather than short
video clips as a new media form. We look again at the role of paper in photo sharing and recommend its support and augmentation
against the trend towards screen-based viewing. Finally, we consider the role of physical souvenirs and memorabilia alongside
photographs, to recommend their use as story triggers and containers, in contrast to explicit multimedia presentations. The
implications for simple computing are discussed.
This paper originated from the International Forum ‘Less is More—Simple computing in an age of complexity’, 27–28 April 2005,
Cambridge, UK.
19.
‘Correlations without correlata’ is an influential way of thinking of quantum entanglement as a form of primitive correlation
which nonetheless maintains locality of quantum theory. A number of arguments have sought to suggest that such a view leads
either to internal inconsistency or to conflict with the empirical predictions of quantum mechanics. Here we explicate and
provide a partial defence of the notion, arguing that these objections import unwarranted conceptions of correlation properties
as hidden variables. A more plausible account sees the properties in terms of Everettian relative states. The ontological
robustness of entanglement is also defended from recent objections.
20.
Experimental research with humans and animals suggests that sleep — particularly REM sleep — is, in some way, associated with
learning. However, the nature of the association and the underlying mechanism remain unclear. A number of theoretical models
have drawn inspiration from research into Artificial Neural Networks. Crick and Mitchison's ‘unlearning’ and Robins and McCallum's
‘pseudo-rehearsal’ models suggest alternative mechanisms through which sleep could contribute to learning. In this paper we
present simulations, suggesting a possible synthesis. Our simulations use a modified version of a Hopfield network to model
the possible contribution of sleep to memory consolidation. Sleep is simulated by removing all sensory input to the network
and by exposing it to ‘noise’, intended as a highly abstract model of the signals generated by the ponto-geniculo-occipital
system during sleep. The results show that simulated sleep does indeed contribute to learning and that the relationship between
the observed effect and the length of simulated sleep can be represented by a U-shaped curve. It is shown that while high-amplitude,
low-frequency noise (reminiscent of NREM sleep) leads to a general reinforcement of memory, low-amplitude, high-frequency
noise (as observed in REM sleep) leads to ‘forgetting’ of all but the strongest memory traces. This suggests that a combination
of the two kinds of sleep might produce a stronger effect than either kind of sleep on its own and that effective consolidation
of memory during sleep may depend not just on REM or NREM sleep but on the overall dynamics of the sleep cycle.
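For orientation, the underlying memory model can be sketched as a standard Hopfield network with Hebbian storage and asynchronous recall; this is only the generic textbook network, not the authors' modified version or their noise-driven sleep phase.

```python
# Minimal standard Hopfield network (generic illustration, not the paper's
# modified model): Hebbian storage plus asynchronous recall dynamics.
import random

def train(patterns):
    """Hebbian weights for a list of ±1 patterns (zero diagonal)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, sweeps=5, rng=None):
    """Asynchronous updates; converges to a stored attractor."""
    rng = rng or random.Random(0)
    state = list(state)
    n = len(state)
    for _ in range(sweeps):
        for i in rng.sample(range(n), n):   # random update order
            h = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

pattern = [1, -1, 1, 1, -1, -1, 1, -1]
w = train([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]                 # corrupt one unit
print(recall(w, noisy) == pattern)   # True: the stored memory is restored
```

Simulated sleep in the paper's sense would then replace the sensory input with structured noise and let these same dynamics run; the sketch shows only the consolidation substrate being perturbed.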