Similar Literature
20 similar records found (search time: 21 ms)
1.
The exponential growth of end-user computing can mean unexpected costs and haphazard applications development unless organizations plan ahead. Executive management must turn to MIS for guidance—but MIS must be willing to adopt new ideas about its responsibility toward corporate computing.

2.
Affordance as context
The concept of affordance is relatively easy to define, but has proved to be remarkably difficult to engineer. This paradox has sparked numerous debates as to its true nature. The discussion presented here begins with a review of the use of the term, from which emerges evidence for a two-fold classification—simple affordance and complex affordance. Simple affordance corresponds to Gibson's original formulation, while complex affordances embody such things as history and practice. In trying to account for complex affordance, two contrasting but complementary philosophical treatments are considered. The first of these is Ilyenkov's account of significances, which he claims are ‘ideal’ phenomena. Ideal phenomena are objective characteristics of things and are the product of human purposive activity. This makes them objective, but not independent (of any particular mind or perception); hence their similarity to affordances.

The second perspective is Heidegger's phenomenological treatment of ‘familiarity’ and ‘equipment’. Heidegger argued that familiarity underpins our ability to cope in the world—a world which itself comprises the totality of equipment. We cope by making use of equipment. Despite coming from different philosophical traditions, both Ilyenkov and Heidegger independently concluded that a thing is identified by its use, and that use, in turn, is revealed by way of its affordances/significances. Finally, both authors—Heidegger directly and Ilyenkov indirectly—equate context and use, leading to the conclusion that affordance and context are one and the same.


3.
We derive the exact distributions of R = X + Y, P = XY and W = X/(X + Y), and the corresponding moment properties, when X and Y follow Lawrance and Lewis's bivariate exponential distribution. The expressions turn out to involve special functions. We also provide extensive tabulations of the percentage points associated with the distributions. These tables—obtained using intensive computing power—will be of use to practitioners of the bivariate exponential distribution.
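As a rough illustration of how such percentage points can be checked numerically, the sketch below estimates quantiles of W = X/(X + Y) by Monte Carlo. It does not reproduce the Lawrance–Lewis dependence structure; independent unit exponentials stand in for (X, Y), in which case W is exactly uniform on (0, 1).

```python
import random

def simulate_w(n=200_000, seed=42):
    """Monte Carlo sample of W = X / (X + Y).

    Illustrative stand-in only: X and Y are independent unit
    exponentials, not the bivariate exponential of the abstract,
    so here W is Uniform(0, 1) by construction.
    """
    rng = random.Random(seed)
    ws = []
    for _ in range(n):
        x = rng.expovariate(1.0)
        y = rng.expovariate(1.0)
        ws.append(x / (x + y))
    ws.sort()
    return ws

def percentage_point(ws, p):
    """Empirical p-th quantile of a sorted sample."""
    return ws[int(p * (len(ws) - 1))]
```

With a dependent (X, Y) sampler, the same two functions apply unchanged; only the draw of (x, y) would differ.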

4.
As the products of new technologies proliferate, they are accompanied by a corresponding rise in problems concerning their selection, use, and maintenance. Any hardware and software products may reveal quirks or outright failures even after the most rigorous testing by both seller and buyer. Solving these problems is often beyond the capability of in-house personnel. Although an obvious solution is to turn to the vendor for help, an alternative does exist—other users of the same product.

5.
We discuss solution schemes for the incremental elastic-plastic structural problem, discretized by means of the Finite Element method. Attention is focused on their formulation and implementation in a parallel computing environment defined by a cluster of workstations connected by a network. The availability of parallel computers makes it possible to consider formulations and solution strategies so far not regarded as competitive with the classical Newton-like schemes, which require the definition of an elastic-plastic tangent stiffness matrix. The solution strategies considered here are based on the explicit integration of the actual elastic-plastic rate problem. This, in turn, is phrased in terms of two different formulations, whose relative advantages—particularly with respect to their integration in parallel—are discussed. A displacement–plastic-multiplier formulation of the structural rate theory of plasticity [1], integrated by means of an explicit, element-by-element scheme, appears to be the most promising one.
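A minimal 1-D sketch of the tangent-free idea: the stress is updated step by step from a strain history with linear isotropic hardening, so no elastic-plastic tangent stiffness matrix is ever assembled. All material parameters below are illustrative, not taken from the paper.

```python
def integrate_explicit(strain_path, E=200e3, H=10e3, sigma_y=250.0):
    """Step-by-step integration of a 1-D elastic-plastic rate problem
    with linear isotropic hardening: each strain increment gives a trial
    elastic stress, and any yield-surface violation is resolved by a
    plastic multiplier increment (units: MPa for stresses, E and H)."""
    sigma, kappa = 0.0, 0.0   # stress and accumulated plastic strain
    history = []
    prev = 0.0
    for eps in strain_path:
        deps = eps - prev
        prev = eps
        sigma_trial = sigma + E * deps
        f = abs(sigma_trial) - (sigma_y + H * kappa)   # yield function
        if f <= 0.0:
            sigma = sigma_trial                        # purely elastic step
        else:
            dlam = f / (E + H)                         # plastic multiplier increment
            sign = 1.0 if sigma_trial > 0 else -1.0
            kappa += dlam
            sigma = sigma_trial - E * dlam * sign      # return to the yield surface
        history.append(sigma)
    return history
```

For monotonic loading with linear hardening this recursion reproduces the closed-form result sigma = sigma_y + H (E eps - sigma_y)/(E + H) once yielding starts, which makes it easy to sanity-check.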

6.
Based on attitude–behavior theory, it was hypothesized that computer use would enhance beliefs about self-perceived computer confidence, which would in turn affect attitudes towards computers. Primary-level students (N = 723) completed self-report surveys that measured these three constructs. Covariance structural analyses revealed that (a) computer use positively affected computer confidence, and (b) computer confidence positively affected computer attitudes. Unexpectedly, direct computer use had a negative effect on computer attitudes when confidence was held constant. Results suggest how computer educational environments might be improved.

7.
This article looks at software quality from a managerial point of view. Despite the wealth of literature about software quality and how it is measured, high quality software systems are still alarmingly rare. It is suggested that DP management should look to modern production management for a lesson in how to promote quality in their products. This in turn means observing Japanese methods of management, in particular the innovation of the early eighties — the quality circle. Also investigated is the suspicion that today's DP staff would be unwilling to accept changes that management may be forced to adopt to improve the quality of software systems.

8.
The resolution limit of visual sensors due to finite pixel spacing can be overcome by applying continuous low-amplitude vibrations to the image—or taking advantage of existing vibrations in the environment. Thereby, spatial intensity gradients turn into temporal intensity fluctuations which can be detected and processed by every pixel independently of the others. This approach enhances resolution and virtually eliminates fixed-pattern noise. A visual sensing microsystem taking advantage of this principle is described. It incorporates a custom analog integrated circuit implementing an array of 32 by 32 pixels with local temporal signal processing. Another key component is a resonant mechanical device producing low-amplitude image scanning movements powered by environmental vibrations.
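The gradient-to-fluctuation principle can be sketched in one dimension: dithering the sampling position by a*sin(theta) makes each "pixel" see a temporal fluctuation whose peak-to-peak amplitude approaches 2a|I'(x)| for small a. This is a toy model of the idea, not the sensor's actual circuit.

```python
import math

def temporal_amplitude(I, x, a=0.01, steps=64):
    """Peak-to-peak temporal fluctuation seen at position x when the
    image I is shifted by a*sin(theta) over one vibration period;
    for small a this approaches 2 * a * |I'(x)| (toy 1-D model)."""
    samples = [I(x + a * math.sin(2 * math.pi * k / steps)) for k in range(steps)]
    return max(samples) - min(samples)

# With I = sin, the gradient at x = 0 is 1, so the fluctuation is ~2a;
# at x = pi/2 the gradient vanishes and the fluctuation nearly does too.
steep = temporal_amplitude(math.sin, 0.0)
flat = temporal_amplitude(math.sin, math.pi / 2)
```

The contrast between the two readings is what each pixel of the array exploits: local intensity gradient becomes a per-pixel temporal signal amplitude.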

9.
In this paper, we attack the figure-ground discrimination problem from a combinatorial optimization perspective. In general, the solutions proposed in the past solved this problem only partially: either the mathematical model encoding the figure-ground problem was too simple, or the optimization methods that were used were not efficient enough, or they could not guarantee finding the global minimum of the cost function describing the figure-ground model. The method that we devised, described in this paper, rests on the following contributions. First, we suggest a mathematical model encoding the figure-ground discrimination problem that makes explicit a definition of shape (or figure) based on cocircularity, smoothness, proximity, and contrast. This model consists of building a cost function on the basis of image element interactions. Moreover, this cost function fits the constraints of an interacting spin system, which in turn is a well-suited physical model for solving hard combinatorial optimization problems. Second, we suggest a combinatorial optimization method for solving the figure-ground problem, namely mean field annealing, which combines the mean field approximation and annealing. Mean field annealing may well be viewed as a deterministic approximation of stochastic methods such as simulated annealing. We describe in detail the theoretical bases of this method, derive a computational model, and provide a practical algorithm. Finally, some experimental results are shown for both synthetic and real images. This research has been sponsored in part by Commissariat à l'Energie Atomique, and in part by the ORASIS project (PRC Communications Homme/Machine).
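The optimization step can be sketched generically. The code below runs mean field annealing on a small Ising-type energy: each spin's mean value is repeatedly set to tanh(beta * local field) while beta, the inverse temperature, is raised. The couplings, bias and schedule are invented for illustration; they are not the paper's figure-ground cost function.

```python
import math

def mean_field_anneal(J, h, betas, iters=50):
    """Deterministic mean field annealing on an Ising-type energy
    E(s) = -0.5 * sum_ij J[i][j] s_i s_j - sum_i h[i] s_i.
    Each mean spin m_i is updated to tanh(beta * local_field) while
    beta is gradually increased (the annealing schedule)."""
    n = len(h)
    m = [0.0] * n
    for beta in betas:
        for _ in range(iters):
            for i in range(n):
                field = h[i] + sum(J[i][j] * m[j] for j in range(n) if j != i)
                m[i] = math.tanh(beta * field)
    return m

# Toy instance: a 4-spin ferromagnetic chain with a weak positive bias;
# as beta grows, all mean spins should settle near +1.
n = 4
J = [[1.0 if abs(i - j) == 1 else 0.0 for j in range(n)] for i in range(n)]
h = [0.5] * n
m = mean_field_anneal(J, h, betas=[0.1, 0.5, 1.0, 2.0, 5.0])
```

The deterministic update is what distinguishes the method from simulated annealing: no random moves are made, so each annealing run is reproducible.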

10.
Objects can exhibit different dynamics at different spatio-temporal scales, a property that is often exploited by visual tracking algorithms. A local dynamic model is typically used to extract image features that are then used as inputs to a system for tracking the object using a global dynamic model. Approximate local dynamics may be brittle—point trackers drift due to image noise and adaptive background models adapt to foreground objects that become stationary—and constraints from the global model can make them more robust. We propose a probabilistic framework for incorporating knowledge about global dynamics into the local feature extraction processes. A global tracking algorithm can be formulated as a generative model and used to predict feature values thereby influencing the observation process of the feature extractor, which in turn produces feature values that are used in high-level inference. We combine such models utilizing a multichain graphical model framework. We show the utility of our framework for improving feature tracking as well as shape and motion estimates in a batch factorization algorithm. We also propose an approximate filtering algorithm appropriate for online applications and demonstrate its application to tasks in background subtraction, structure from motion and articulated body tracking.

11.
Scale Space Hierarchy
We investigate the deep structure of a scale space image. We concentrate on scale space critical points—points with vanishing gradient with respect to both the spatial and the scale direction. We show that these points are always saddle points. They turn out to be extremely useful, since the iso-intensity manifolds through these points provide a scale space hierarchy tree and induce a pre-segmentation: a segmentation without a priori knowledge. Furthermore, both these scale space saddles and the so-called catastrophe points form the critical points of the parameterised critical curves—the curves along which the spatial critical points move in scale space. This enables one to localise these two types of special points relatively easily and automatically. Experimental results concerning the hierarchical representation and pre-segmentation are given, and agree to a fair degree with both the mathematical and the intuitive predictions.
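The behaviour of spatial critical points in scale space can be illustrated with a toy 1-D example: blurring a two-bump signal at increasing scale makes the two maxima approach and eventually annihilate at a catastrophe point, leaving a single maximum. The signal and scales below are made up, and the paper works with full 2-D scale space images; this only demonstrates the extremum-annihilation phenomenon.

```python
import math

def gauss_blur(signal, sigma):
    """Discrete Gaussian smoothing (truncated kernel, replicated edges);
    a toy stand-in for computing one slice of a scale space."""
    r = max(1, int(3 * sigma))
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-r, r + 1)]
    s = sum(k)
    k = [v / s for v in k]
    n = len(signal)
    return [sum(k[j + r] * signal[min(max(i + j, 0), n - 1)]
                for j in range(-r, r + 1))
            for i in range(n)]

def local_maxima(signal):
    """Indices of strict interior local maxima (spatial critical points)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]

# Toy signal: two bumps at x = 30 and x = 50 (width ~5) on [0, 80].
sig = [math.exp(-((x - 30) ** 2) / 50.0) + math.exp(-((x - 50) ** 2) / 50.0)
       for x in range(81)]
fine_maxima = local_maxima(gauss_blur(sig, 1.0))     # both maxima survive
coarse_maxima = local_maxima(gauss_blur(sig, 20.0))  # merged into one
```

Tracking the maxima across many sigma values would trace the critical curves along which the two spatial maxima and the intervening minimum meet at the catastrophe.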

12.
The fundamental idea of Perceptual Control Theory (PCT) has been known since at least the time of Aristotle, and was well expounded by William James. It is that people act so as to bring about the conditions they desire—to perceive their world as they wish it to be. They control their perceptions. However, the technical understanding required to turn this idea into a theory was largely developed only in this century. This editorial illustrates the nature of hierarchic control, and shows how control tasks can be partitioned between a human and a machine. It then considers some common but incorrect objections to PCT as a basis for psychology, and finally describes the eight papers that constitute this Special Issue.
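The core PCT loop, acting so that perception matches a reference, can be sketched in a few lines. This is a generic proportional controller with made-up numbers, not the hierarchy or any of the papers of the Special Issue.

```python
def control_step(perception, reference, gain=0.2):
    """Act on the error between the desired and the current perception."""
    return gain * (reference - perception)

def run_loop(reference=10.0, disturbance=-2.0, steps=200):
    """Iterate the loop: the agent's output changes the world, the world
    plus a constant disturbance is what the agent perceives, and the
    perception is driven toward the reference despite the disturbance."""
    world = 0.0
    for _ in range(steps):
        perception = world + disturbance
        world += control_step(perception, reference)
    return world + disturbance   # final perception
```

The point of the demonstration is that the controller controls its perception: the final perceived value sits at the reference even though the world variable itself settles elsewhere (at the reference minus the disturbance).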

13.
Low female participation rates in computing are a current concern of the education sector. To address this problem, an intervention was developed: computing skills were introduced to girls in their English classes using three different teaching styles—peer tutoring, cross-age tutoring, and teacher instruction (control). The sample comprised 136 girls from Years 8 and 10 at a single-sex government school. A pre-test/post-test quantitative design was used. To describe the students' perspectives, qualitative data were collected from six focus groups conducted with 8–10 students each—one from each of the six classes. It was predicted that cross-age tutoring would yield more positive effects than peer tutoring which, in turn, would yield more positive effects than traditional teacher instruction, as assessed by achievement on class tasks and attitudes towards computing. The hypothesis was not supported by the quantitative analysis; however, in the qualitative data cross-age tutoring was appraised more favourably than peer tutoring or teacher instruction. The latter was the least preferred condition due to: (1) inefficiency; (2) difficulty understanding teachers' explanations; and (3) lack of teacher knowledge. Problems with the implementation of the intervention identified in the focus groups were teacher differences, system failures, missed classes, lack of communication, and selection of computing activities. Practical suggestions are provided relevant to the introduction of cross-age tutoring and the use of computers within secondary-level English classes.

14.
Currently, no approved standards exist to aid authors in designing usable multimedia documents. Sites on the Internet offer style guides which are often very detailed; in most cases, only the most determined of authors will wade through such sites for advice. Icon bars contribute a great deal to resolving usability problems. Standard Internet browsers have an icon bar, which this paper refers to as a ‘site-level icon bar’ because of the functions of its buttons—they control functions of the browser which belong not to the viewed document but to the Internet sites visited. This paper argues that major usability problems (i.e. in relation to authoring and navigation of Web documents) can be overcome by introducing another type of icon bar, a ‘document-level icon bar’, whose functions are directly relevant to the document being viewed rather than to the site visited. A programmable document-level icon bar would enable the authoring of a range of standard navigational buttons, which in turn would increase the speed of document transmission. Furthermore, widespread adoption of this icon bar would reduce the total burden on the bandwidth of the Internet as a whole. A prototype tool, called WebCheck, has been developed with a programmable document-level icon bar to demonstrate this idea.

15.
Discussed in this paper are the issues underlying the mechanical design of a seven-axis isotropic manipulator. The kinematic design of this manipulator was based on one main criterion, namely accuracy. Thus, the main issue determining the underlying architecture, defined by its Hartenberg–Denavit (HD) parameters, was the optimization of its kinematic conditioning. This main criterion led not to one set of HD parameters, but rather to a manifold of these sets, which allowed the incorporation of further requirements such as structural behavior, workspace considerations and functionality properties. These requirements in turn allowed the determination of the link shapes and the selection of actuators. The detailed mechanical design led to heuristic rules that helped in the decision-making process for issues such as link sub-assemblies and motor location along the joint axes.

16.
In signs one sees an advantage for discovery that is greatest when they express the exact nature of a thing briefly and, as it were, picture it; then, indeed, the labour of thought is wonderfully diminished. (Leibniz)

Icons are today used in many programming systems to simplify man-machine communication. Does this represent something new and important, or is it just a sales argument similar, for example, to the 'removal' of formulas in Cobol by replacing the mathematical symbols with English words? In order to answer this question we will examine the use of icons in man-machine interfaces, comparing this usage with the qualities of icons. In doing this we will rely on the fundamental work on signs of Charles Sanders Peirce (1839–1914) around the turn of the century (Hartshorne and Weiss 1965).

17.
18.
In knowledge management (KM) research, effective knowledge sharing is considered one of the most critical components of KM success. For the present research, the authors conducted a longitudinal, two-phased study to evaluate whether the Theory of Reasoned Action (TRA) and three variations of the Theory of Planned Behavior—namely TPB, decomposed TPB (DTPB), and revised TPB (RTPB)—can adequately predict knowledge sharing behaviors. The first, TRA-based study shows a severe limitation in the ability of intention to predict actual knowledge sharing behaviors collected from a knowledge management platform. In a subsequent study, the three variations of TPB-based models were employed to show that, although the independent variables (i.e., attitude, subjective norm, and perceived behavioral control, decomposed into controllability and self-efficacy) give satisfactory explanations of variance in intention (R2 > 42%), the intention–behavior gap still exists in each of the three models. Only perceived self-efficacy in the revised TPB can directly predict knowledge sharing behaviors. This gap highlights the importance of knowledge sharing as a fundamentally social activity, for which the actualization of intention into action may be interrupted by barriers such as a mistake-free culture or others' deliberate misinterpretations that may in turn cause unanticipated negative consequences for the sharer. The theoretical implication of this study is that, in applying TPB to study knowledge sharing practices, researchers must focus on control beliefs that reflect people's capacity to overcome possible environmental challenges encountered in carrying out their knowledge sharing intentions.

19.
A pervasive problem in freight railroad operations is to determine a feasible flow of cars that meets the required demands within a certain period of time. In this work we present a method to determine an optimal flow of loaded and empty cars in order to maximize profit, revenue or tonnage transported, given the schedule of the trains together with their traction capacities. We propose an integer multicommodity flow model for the problem whose linear relaxation leads to very good upper bounds — at the cost of using a very large number of variables and constraints. In order to turn this model into a practical tool, we apply a preprocessing phase that may reduce its size by two or three orders of magnitude. The reduced model can then be solved by standard integer programming packages with little, if any, branching effort. Computational results on real instances of the largest Latin American railroad freight company are reported. The product that resulted from this research is already in use at that company.
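A toy version of the car-flow optimization, with brute-force enumeration standing in for the integer programming solver: several car types (commodities) compete for one train's traction capacity, and the integer plan maximizing profit is selected. All capacities, demand caps and profits are invented for illustration.

```python
from itertools import product

def best_flow(capacity, profits, max_cars):
    """Brute-force solution of a toy integer multicommodity problem:
    choose how many cars of each type to attach to a single train so
    that total cars respect the traction capacity and total profit is
    maximal. Real instances use an integer programming solver instead
    of enumeration."""
    best_profit, best_plan = 0, None
    for plan in product(*(range(m + 1) for m in max_cars)):
        if sum(plan) <= capacity:
            p = sum(f * prof for f, prof in zip(plan, profits))
            if p > best_profit:
                best_profit, best_plan = p, plan
    return best_profit, best_plan

# One train with capacity 10; three car types with per-car profits
# 5, 3, 2 and demand caps 4, 6, 8 respectively.
profit, plan = best_flow(capacity=10, profits=[5, 3, 2], max_cars=[4, 6, 8])
```

The shared capacity constraint is what couples the commodities; with many trains and yards this coupling is exactly what makes the full model large and motivates the preprocessing described above.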

20.
Polyethylene glycol-modified choline oxidase (PEG-ChOD) has been prepared in order to construct choline-sensing carbon-paste electrodes: the PEG-ChOD and a mediator, 1,1-dimethylferrocene, are incorporated into a carbon-paste matrix. The PEG-ChOD-based electrode exhibits higher enzyme activity than an electrode using native ChOD, since the PEG is effective in protecting ChOD from the oil in the paste. The current response of the PEG-ChOD-based electrode is larger than that of the native ChOD-based one for choline concentrations higher than 5 mM, and the PEG-ChOD-based electrode is more stable than the native ChOD-based one. Coating the surface of the PEG-ChOD-based electrode with a cationic polymer is effective in enhancing the current response to choline: the current response of the PEG-ChOD-based electrode coated with a layer of AQ-29D (Kodak) is about four times larger than that of the AQ-29D-free electrode.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号