Similar Articles
20 similar articles found (search time: 0 ms)
1.
Journal publication is an important indicator of research productivity for individual researchers as well as academic institutions. It is also a contentious issue, as various stakeholders have different and often conflicting interests and perspectives. This study explores how IS researchers view the journal review and publication process based on their professional status, institutional mission, and role orientation. It also investigates the publication practices of IS researchers, such as frequency of article submissions, acceptance/rejection rates, number of revisions required before publication, and publication outlets.

2.
This paper presents a new design approach for solving the facility layout problem. The layout problem is viewed, from a general perspective, as a problem of arranging elements within a system. The main attributes of, and relationships among, the elements of the system are analyzed.

3.

An experiment was conducted to examine skill differences in the control strategy for computer program comprehension. A computer program, along with its hierarchy of program plans, was provided to 10 intermediate and 10 novice computer programmers. Each program plan was presented to the subjects as a program segment. A random list of plan goals was also provided to the subjects. The subjects were asked to match each program segment with its goal while comprehending the program. Several measures of the subjects' performance and control strategy were collected and analysed. The results indicated the use of an overall top-down strategy by both intermediates and novices for program comprehension. Novices' control strategies involved more opportunistic elements than intermediates' in the overall top-down process of program comprehension. These differences in control strategy resulted in better performance by intermediates than by novices.

4.
An experiment was conducted to examine skill differences in the control strategy for computer program comprehension. A computer program, along with its hierarchy of program plans, was provided to 10 intermediate and 10 novice computer programmers. Each program plan was presented to the subjects as a program segment. A random list of plan goals was also provided to the subjects. The subjects were asked to match each program segment with its goal while comprehending the program. Several measures of the subjects' performance and control strategy were collected and analysed. The results indicated the use of an overall top-down strategy by both intermediates and novices for program comprehension. Novices' control strategies involved more opportunistic elements than intermediates' in the overall top-down process of program comprehension. These differences in control strategy resulted in better performance by intermediates than by novices.

5.
While being monitored, instrumented long-running parallel applications generate a huge amount of instrumentation data. Processing and storing this data incurs overhead and perturbs the execution. A technique that eliminates unnecessary instrumentation data and lowers the intrusion without losing any performance information is valuable for tool developers. This paper presents a new algorithm for software instrumentation that measures the information content of the instrumentation data to be collected. The algorithm is based on the entropy concept from information theory, and it makes selective data collection possible for a time-driven software monitoring system.
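As an illustration of the entropy idea (the abstract does not detail the paper's actual algorithm, so the window representation and threshold below are hypothetical), a monitor might compute the Shannon entropy of each window of instrumentation samples and skip near-constant windows that carry no new information:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of a sequence of observed metric values."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def should_collect(window, threshold=0.5):
    """Collect a window of instrumentation data only if it carries enough
    information content; near-constant windows are filtered out."""
    return entropy(window) >= threshold

# A constant window carries no information; a varied one does.
steady = [10] * 8
varied = [10, 12, 9, 15, 11, 10, 14, 13]
print(should_collect(steady), should_collect(varied))  # False True
```

The threshold would in practice be tuned against the acceptable monitoring overhead.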

6.
7.
Within the competitive global environment, information has become a key resource for increasing a corporation's competitiveness by changing the nature or conduct of business. Accordingly, corporations are now seeking a method of information systems planning that maximizes their strategic effectiveness.

Strategic Information Systems Planning (SISP) refers to the process of creating a portfolio for the implementation and use of IS to maximize the effectiveness and efficiency of a corporation, so that it can achieve its objectives. An investigation of SISP, however, showed that only 24% of planned applications were actually developed (Int. J. Comput. Appl. Technol., 8 (1995), 61; MIS Quarterly, September (1988), 445). This figure clearly shows that enhancements are required for current SISP processes. In particular, this paper focuses on SISP methodologies, which support overall SISP processes.

The paper first identifies four general problems of SISP methodologies: lack of support for an information technology architecture, under-emphasis on information technology opportunities, the long duration of SISP, and lack of support for business process reengineering. Next, it proposes an integrated SISP methodology that solves these problems while retaining the advantageous qualities of current SISP methodologies. Finally, a case study shows how the methodology works in practice.

8.
The correct functioning of interactive computer systems depends on both the faultless operation of the device and correct human actions. In this paper, we focus on system malfunctions due to human actions. We present abstract principles that generate cognitively plausible human behaviour. These principles are then formalised in a higher-order logic as a generic, and so retargetable, cognitive architecture, based on results from cognitive psychology. We instantiate the generic cognitive architecture to obtain specific user models. These are then used in a series of case studies on the formal verification of simple interactive systems. By doing this, we demonstrate that our verification methodology can detect a variety of realistic, potentially erroneous actions, which emerge from the combination of a poorly designed device and cognitively plausible human behaviour.

9.
We live in an increasingly mobile world, which leads to the duplication of information across domains. Though organizations attempt to obscure the identities of their constituents when sharing information for worthwhile purposes, such as basic research, the uncoordinated nature of such environments can lead to privacy vulnerabilities. For instance, disparate healthcare providers can collect information on the same patient. Federal policy requires that such providers share “de-identified” sensitive data, such as biomedical (e.g., clinical and genomic) records. At the same time, such providers can share identified information, devoid of sensitive biomedical data, for administrative functions. On a provider-by-provider basis, the biomedical and identified records appear unrelated; however, links can be established when multiple providers' databases are studied jointly. This problem, known as trail disclosure, is a generalized phenomenon and occurs because an individual's location-access pattern can be matched across the shared databases. Due to technical and legal constraints, it is often difficult to coordinate between providers, and thus it is critical to assess the disclosure risk in distributed environments so that techniques to mitigate such risks can be developed. Research on privacy protection has so far focused on developing technologies to suppress or encrypt identifiers associated with sensitive information. There is a growing body of work on the formal assessment of the disclosure risk of database entries in publicly shared databases, but less attention has been paid to the distributed setting. In this research, we review the trail disclosure problem in several domains with known vulnerabilities and show that disclosure risk is influenced by the distribution of how people visit service providers. Based on empirical evidence, we propose an entropy metric for assessing such risk in shared databases prior to their release. This metric assesses risk by leveraging the statistical characteristics of a visit distribution, as opposed to person-level data. It is computationally efficient and superior to existing risk assessment methods, which rely on ad hoc assessments that are often computationally expensive and unreliable. We evaluate our approach on a range of location-access patterns in simulated environments. Our results demonstrate that the approach is effective at estimating trail disclosure risks and that the amount of self-information contained in a distributed system is one of the main driving factors.
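A minimal sketch of such a distribution-level metric, assuming it is plain Shannon entropy over provider visit counts (the abstract does not give the paper's exact formulation): skewed visiting patterns yield low entropy, which would flag higher trail-disclosure risk.

```python
import math

def visit_entropy(visit_counts):
    """Shannon entropy (in bits) of the distribution of visits across
    providers. Lower entropy means a more skewed access pattern, so
    individual trails are easier to match across shared databases."""
    total = sum(visit_counts)
    probs = [c / total for c in visit_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# Uniform visiting across 4 providers maximizes entropy (2 bits);
# a skewed pattern lowers it, signalling higher disclosure risk.
print(visit_entropy([25, 25, 25, 25]))  # 2.0
print(visit_entropy([85, 5, 5, 5]))
```

Note that this operates on aggregate counts only, matching the abstract's point that the metric avoids person-level data.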

10.
Decision theory is a formal basis for considering human decision making. It has typically focused on humans, and even when the decision-maker is assisted in the process, it is assumed that the assistance is provided by another human. However, in the computer age, the decision-maker is more often than not assisted by a computer. Hence, in this paper we explore the rationale for an integrated human-computer information processor and consider the information-processing capabilities of the human and the computer within a formal model of decision making. The analysis for the computer assumes it has at least the capabilities of a decision support system.

11.
In ambiguous decision domains, interval numbers are often employed to model multiple attribute decision making (MADM) problems, in which the attribute values of alternatives are expressed as intervals. In this study, a new approach is proposed for interval MADM problems: the interval decision matrix is transformed into a definite one by comparing the interval attribute values with the base intervals of the attributes; the attribute weights are determined from the definite decision matrix; and the alternatives are assessed using TOPSIS. An example of assessing faculty candidates (alternatives) illustrates the proposed approach.
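The final TOPSIS step can be sketched for an already-definite decision matrix; the data and weights below are illustrative, not the paper's faculty-candidate example, and all attributes are assumed to be benefit attributes:

```python
import math

def topsis(matrix, weights):
    """Rank alternatives by relative closeness to the ideal solution.
    `matrix` rows are alternatives; columns are benefit attributes."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply the attribute weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # Positive and negative ideal solutions (benefit attributes only).
    ideal = [max(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical candidates scored on two weighted attributes.
scores = topsis([[7, 9], [8, 7], [6, 8]], [0.6, 0.4])
print(scores.index(max(scores)))  # index of the best-ranked candidate
```

Cost attributes would additionally require flipping the ideal/anti-ideal selection per column, which this sketch omits.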

12.
Ethical issues related to information systems are important to information technology (IT) professionals. These issues are also significant for organizations and societies. Although considerable literature on IT and related ethical issues exists, a review of this literature found little empirical research on ethical practices within government and private-sector organizations. Therefore, the objective of this paper is to draw inferences about such current practices in these sectors. The results indicate a significant correlation between the code of ethics and professionals' attitudes towards the unethical use of software in government and private-sector organizations. They also indicate significant differences between the government and private sectors.

13.
Two prototype identifiable structures are presented which make possible the identification, via an equation-error model-reference adaptive system, of linear plants with rational transfer-function matrices. The structures include as special cases many of the particular structures presented hitherto in the literature. Convergence properties are also discussed, and several modes of convergence are distinguished: model output to plant output, model transfer-function matrix to plant transfer-function matrix, and model parameters to plant parameters. Conditions are presented for exponentially fast convergence in the absence of noise.

14.
A growing number of software developers use standards as a basis for their quality systems. Some go one step further and have their quality systems certified. In this paper, two well-known quality standards, ISO 9000 and the CMM, are compared. It is concluded that both standards are useful, but there is a growing need for more specific standards.

15.
In this paper, the performability analysis of fault-tolerant computer systems using a hierarchical decomposition technique is presented. A special class of queueing network (QN) models, the so-called BCMP networks [4], and generalized stochastic Petri nets (GSPN) [1], which are often used to model performance and reliability separately, are combined in order to preserve the best modelling features of both.

A conceptual model is decomposed into GSPN and BCMP submodels, which are solved in isolation. Then the remaining GSPN portion of the model is aggregated with flow-equivalents of the BCMP submodels in order to compute performability measures. The BCMP submodels are replaced by simple GSPN constructs that preserve the first and second moments of the throughput. A simple example of a data communication system in which failed transmissions are corrected is presented.


16.
This paper aims to contribute to a theory of integration within the field of information systems (IS) project management. Integration is a key IS project management issue when new systems are developed and implemented into an increasingly integrated information infrastructure in corporate and governmental organizations. Expanding the perspective of traditional project management research, we draw extensively on central insights from IS research. Building on socio-technical IS research and software engineering research, we suggest four generic patterns of integration: big bang, stakeholder integration, technical integration and socio-technical integration. We analyse and describe the advantages and disadvantages of each pattern. The four patterns are ideal types. To explore the forces and challenges in these patterns, three longitudinal case studies were conducted. In particular, we investigate the management challenges for each pattern. We find that the patterns are context-sensitive and describe the different contexts in which the patterns are applicable. For IS project management, the four integration patterns contribute to the management of integration risks, extending the vocabulary for assessing and mitigating these risks in IS development. For practitioners, the four integration patterns represent an analytical framework for planning modern IS development projects.

17.
Process mining aims at deriving order relations between the tasks recorded in event logs in order to construct the corresponding process models. The quality of the results is determined not only by the mining algorithm being used but also by the quality of the provided event logs. As a criterion of log quality, completeness measures how much of the information needed for process mining is covered by an event log. In this paper, we focus on evaluating the local completeness of an event log. In particular, we consider the direct-succession (DS) relations between the tasks of a business process. Based on our previous work, an improved approach called CPL+ is proposed. Experiments show that CPL+ works better than other approaches on event logs that contain a small number of traces. Finally, by further investigating CPL+, we also found that the more distinct DSs are observed in an event log, the lower the local completeness of the log is.
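The DS relations the abstract refers to can be extracted from a log straightforwardly; the coverage function below is a naive stand-in for the local-completeness idea, not the CPL+ estimator itself, and the example log and ground-truth relations are hypothetical:

```python
def direct_successions(traces):
    """Collect the distinct direct-succession (DS) relations a > b
    observed in an event log, where each trace is a task sequence."""
    ds = set()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            ds.add((a, b))
    return ds

def observed_coverage(traces, true_ds):
    """Fraction of the process's true DS relations covered by the log.
    In practice the true set is unknown; estimators such as CPL+
    infer completeness from the log alone."""
    return len(direct_successions(traces) & true_ds) / len(true_ds)

log = [["a", "b", "c"], ["a", "c", "b"]]
true_ds = {("a", "b"), ("b", "c"), ("a", "c"), ("c", "b"), ("b", "d")}
print(observed_coverage(log, true_ds))  # 0.8
```

A small log typically misses rare DS pairs (here ("b", "d")), which is exactly the regime where the abstract claims CPL+ performs best.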

18.
Prototyping in information systems (IS) development has recently shown increasing benefits. In principle, the prototyping process gives users more opportunities to improve their work, to verify that their needs are provided for, and to check that the terms used in the interface of the designed system are consistent with those used in their work. As a result, they should be highly motivated to participate in an IS development process.

However, certain drawbacks inherited from traditional prototyping in industrial production could limit the use of this approach in IS development. Several problems are identified in this paper: (1) product-oriented thinking; (2) feedback delay; (3) designers' preoccupation with the experimental approach; (4) problems arising from users' participation being indirect; and (5) negative attitudes towards contradiction. This paper proposes an organic approach, the ‘Embryonic Approach’ (EmA), to explore the full potential of prototyping in IS development. The approach is based on two fundamental elements: an adaptive and expandable kernel structure, and a built-in communication mechanism.

19.
This paper presents a new approach, the model theory approach, to small- and medium-scale transaction processing system (TPS) development. A TPS, in this paper, is an information system designed to process day-to-day business event data at the operational level of an organization. The paper is concerned not with database construction but with transaction processing.

The model theory approach is not a software engineering approach but a systems theory approach. In this approach, a model of the target system, called a user model, is constructed in set theory using a formal system structure of a TPS. The user model is then compiled into an extended Prolog (extProlog) model. extProlog is an extension of Prolog that meets the requirements of management information system development. On compilation, a standardized user interface (UI) called the internal UI is attached. The extProlog model with the internal UI is then executed under the control of another standardized UI called the external UI. Implementation is an integral part of the approach. Because the UIs are designed for the formalized (abstract) structure of a TPS, they can be standardized and provided as black-box components for system development. Because a systems developer is only required to build a user model in set theory based on a model-theoretic structure, the approach is called a model theory approach. Its advantages are that it provides a theoretical structure for information systems development, so that systems development can become an engineering discipline, and that it facilitates rapid systems development.

20.
Notwithstanding the importance of metaphors in organizational research, this paper recognizes that few empirical studies have examined the multiple metaphors elicited by stakeholders (for instance, teachers, administrators and managers) in the context of IS/IT implementation within an educational setting. Using a total of 30 in-depth interviews carried out within a further and higher education college in the United Kingdom, the broad aim of this paper is to examine the metaphors produced by organizational members in their accounts of IS/IT implementation. The analysis reveals a number of dominant metaphors that feature in the IS literature and on which participants drew (journey, military, machine, bodily/illness, sports and religion), as well as a number of additional, novel metaphors not widely acknowledged within the IS literature, namely nautical, horticultural and child-like metaphors. The paper proposes that practitioners and managers should be aware of the multiple metaphorical expressions members of an organization use, as this may help to clarify the tensions arising during IS/IT implementation. Furthermore, an exploration of these metaphors generates fresh insights into implementation practices that may otherwise go unnoticed.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号