20 similar documents found (search time: 15 ms)
1.
Anna S. Palaiologk Anastasios A. Economides Heiko D. Tjalsma Laurents B. Sesink 《International Journal on Digital Libraries》2012,12(4):195-214
Financial sustainability is an important attribute of a trusted, reliable digital repository. The authors of this paper use the case study approach to develop an activity-based costing (ABC) model, which is used for estimating the costs of preserving digital research data and identifying options for improving and sustaining the relevant activities. The model is designed in the environment of the Data Archiving and Networked Services (DANS) institute, a well-known trusted repository. The DANS–ABC model has been tested on empirical cost data from activities performed by 51 employees within the framework of over 40 different national and international projects. Costs of resources are assigned to cost objects through activities and cost drivers. The ‘euros per dataset’ unit of cost measurement is introduced to analyse the outputs of the model. Funders, managers and other decision-making stakeholders are provided with understandable information connected to the strategic goals of the organisation. This is achieved by linking the DANS–ABC model to another widely used managerial tool, the Balanced Scorecard (BSC). The DANS–ABC model supports costing of the services provided by a data archive, while the combination of the DANS–ABC model with a BSC identifies areas in the digital preservation process where efficiency improvements are possible.
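As a rough illustration of the ABC idea described above (not taken from the paper; the activity names, cost figures, and driver volumes below are invented), resource costs can be assigned to cost objects through activities and their cost drivers, yielding a "euros per dataset" figure:

```python
# Hypothetical activity-based costing (ABC) sketch: resource costs flow to
# activities, then to cost objects (datasets) via cost drivers.
# Activity names, costs, and driver volumes are invented for illustration.

activities = {
    # activity: (annual cost in euros assigned from resources, driver volume)
    "ingest":       (120_000, 4_000),   # driver: datasets ingested
    "curation":     (200_000, 4_000),   # driver: datasets curated
    "storage":      ( 80_000, 4_000),   # driver: datasets stored
    "user_support": ( 50_000, 4_000),   # driver: datasets accessed
}

def cost_per_dataset(activities):
    """Sum of (activity cost / driver volume) gives a 'euros per dataset' figure."""
    return sum(cost / volume for cost, volume in activities.values())

print(f"euros per dataset: {cost_per_dataset(activities):.2f}")
```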
2.
André Årnes Paul Haas Giovanni Vigna Richard A. Kemmerer 《Journal in Computer Virology》2007,2(4):275-289
This paper presents ViSe, a virtual security testbed, and demonstrates how it can be used to efficiently study computer attacks and suspect tools as part of a computer crime reconstruction. Based on a hypothesis of the security incident in question, ViSe is configured with the appropriate operating systems, services, and exploits. Attacks are formulated as event chains and replayed on the testbed. The effects of each event are analyzed in order to support or refute the hypothesis. The purpose of the approach is to facilitate reconstruction experiments in digital forensics. Two examples are given to demonstrate the approach: one overview example based on the Trojan defense and one detailed example of a multi-step attack. Although a reconstruction can neither prove a hypothesis with absolute certainty nor exclude the correctness of other hypotheses, a standardized environment, such as ViSe, combined with event reconstruction and testing, can lend credibility to an investigation and can be a great asset in court.
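A toy sketch of the event-chain idea, under the assumption that each replayed step reports the artifacts it produces; this is my simplification, not ViSe's actual interface:

```python
# Toy sketch (not ViSe's API): an attack hypothesis expressed as an ordered
# event chain, replayed step by step; the chain supports the hypothesis only if
# every artifact produced on the testbed was also found as evidence.

from dataclasses import dataclass, field

@dataclass
class Event:
    name: str
    action: callable          # performs one step on the testbed, returns artifacts it produced

@dataclass
class Replay:
    events: list
    observed: set = field(default_factory=set)

    def run(self, evidence: set) -> bool:
        """Replay the chain and compare its effects with the collected evidence."""
        for ev in self.events:
            self.observed |= ev.action()
        return self.observed <= evidence

# Example with invented artifacts, loosely mimicking a Trojan-defense test:
chain = Replay([
    Event("download_trojan", lambda: {"C:/temp/troj.exe"}),
    Event("run_trojan",      lambda: {"registry_run_key"}),
])
print(chain.run(evidence={"C:/temp/troj.exe", "registry_run_key", "other_artifact"}))  # True
```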
3.
The paper presents an in-depth investigation into whether larger organisations embracing Lean as a philosophy were indeed more successful. Success was measured by the impact an organisation's Lean journey had on its financial and operational efficiency levels.
4.
Traditional cost-accounting methods in refining and petrochemical enterprises suffer from delayed information when evaluating and guiding production and operations. To remedy this, achieve real-time evaluation and fine-grained control, and put into practice the two-way management of material flow and value flow, activity-based cost management theory is introduced into a daily cost-accounting system. Based on the actual production process, material flows, and process parameters, a cost management and control system architecture for the integrated refining-petrochemical production process is constructed. Addressing the characteristics of energy transfer and conversion in the utilities department's boiler steam generation, turbine steam extraction, and power generation, a steam-and-electricity cost allocation model is built to solve the problem of allocating costs to the various grades of steam and to the electricity produced by the turbines. Addressing the consumption characteristics of raw materials, energy and utilities, and fixed-cost items in the refining process, three models are built: a material-consumption coefficient model for allocating raw material costs, an energy-consumption coefficient model for allocating energy and utility costs, and, on the basis of these two, a comprehensive coefficient model for allocating fixed costs. Together they form a complete framework of technical coefficient models for refinery units and improve the accuracy and scientific soundness of the enterprise's daily cost-accounting information. Application of the daily cost-accounting system enables day-by-day settlement of costs across the integrated refining-petrochemical production process, provides management with reliable production and operations information for decision making, and further raises the enterprise's level of fine-grained production and operations management.
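A minimal sketch of coefficient-based allocation in the spirit of the models above; the unit names, weights, and cost figures are hypothetical, not the paper's actual coefficients or data:

```python
# Hypothetical daily cost allocation by technical coefficients: raw-material
# cost is spread by material-consumption coefficients, utility cost by
# energy-consumption coefficients, and fixed cost by a composite coefficient
# blended from the first two. All figures below are invented.

units = ["CDU", "FCC", "reformer"]                              # refinery units (illustrative)
material_coeff = {"CDU": 0.5, "FCC": 0.3, "reformer": 0.2}      # sums to 1
energy_coeff   = {"CDU": 0.4, "FCC": 0.4, "reformer": 0.2}      # sums to 1

def composite_coeff(w_material=0.6, w_energy=0.4):
    """Composite coefficient: weighted blend of the two consumption coefficients."""
    return {u: w_material * material_coeff[u] + w_energy * energy_coeff[u] for u in units}

def allocate(total_cost, coeff):
    return {u: total_cost * coeff[u] for u in units}

daily_material_cost = allocate(1_000_000, material_coeff)   # invented daily totals
daily_energy_cost   = allocate(  300_000, energy_coeff)
daily_fixed_cost    = allocate(  200_000, composite_coeff())
print(daily_material_cost, daily_energy_cost, daily_fixed_cost)
```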
5.
Using metrics to manage software projects (cited by: 1; self-citations: 0; citations by others: 1)
Five years ago, Bull's Enterprise Servers Operation in Phoenix, Arizona, used a software process that, although understandable, was unpredictable in terms of product quality and delivery schedule. The process generated products with unsatisfactory quality levels and required significant extra effort to avoid major schedule slips. All but the smallest software projects require metrics for effective project management. Hence, as part of a program designed to improve the quality, productivity, and predictability of software development projects, the Phoenix operation launched a series of improvements in 1989. One improvement grounded software project management in additional software measures. Another introduced an inspection program, since inspection data was essential to the project management improvements. Project sizes varied from several thousand lines of code (KLOC) to more than 300 KLOC. The improvement projects enhanced quality and productivity. In essence, Bull now has a process that is repeatable and manageable, and that delivers higher-quality products at lower cost. We describe the metrics we selected and implemented, illustrating them with examples drawn from several development projects.
6.
Visualizing digital evidence in an easy and constructive manner is a major problem because of the advanced techniques for hiding, wiping, encrypting and deleting digital data developed during the last few years. To tackle this problem, a system for visualizing digital data in 3-dimensional (3D) mode has been developed. XML was used as a common language to allow fine-grained management of digital data with flexibility and ease. The extensibility of the implementation makes it particularly suitable as a research and development platform for future open source computer forensics tools. This article examines real-life problems that benefit from using this tool in a congenial and constructive manner to validate its key underlying concept. The design decisions taken in producing the system architecture, and the features it supports, are elaborated upon. To determine the effectiveness of the tool, an actual case study is presented which examines the results of the tool and why it is necessary to adopt an open source model as a standard. The paper concludes with performance measurements of the tool and suggests possible extensions to make the tool even smarter.
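For illustration only, a tiny XML layout of the kind such a visualization front end could consume; the element and attribute names are invented, not the system's actual schema:

```python
# Invented, minimal XML record for one piece of digital evidence, including a
# placement hint for a 3D scene. Not the schema of the system described above.

import xml.etree.ElementTree as ET

evidence = ET.Element("evidence", id="item-001", type="file")
ET.SubElement(evidence, "path").text = "/home/user/.cache/hidden.bin"
ET.SubElement(evidence, "hash", algorithm="sha256").text = "0" * 64   # placeholder digest
ET.SubElement(evidence, "timestamps", created="2024-01-02T10:15:00Z",
              modified="2024-01-02T10:17:42Z")
ET.SubElement(evidence, "position3d", x="1.0", y="2.5", z="0.0")      # placement in the 3D scene

print(ET.tostring(evidence, encoding="unicode"))
```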
7.
Multimedia Tools and Applications - Traditional or binary steganalysis brands a digital object such as an image as stego or innocent only, but modern-day information security requires deeper insight...
8.
Chun-Che Huang 《IEEE transactions on systems, man, and cybernetics. Part A, Systems and humans : a publication of the IEEE Systems, Man, and Cybernetics Society》2001,31(6):508-523
Businesses are undergoing a major paradigm shift, moving from traditional management into a world of agile organizations and processes. An agile corporation should be able to respond rapidly to market changes. For this reason, corporations have been seeking to develop numerous information technology (IT) systems to assist with the management of their business processes. Many of the coming new business processes may contain embedded intelligent agent-based systems. Agent technology looks set to radically alter not only the way in which we interact with computers, but also the way complex processes, e.g., product development, are conceptualized and built. The paper presents a fuzzy approach based on an intelligent agent framework to develop modular products. This approach addresses the research issue: "How can modular design be carried out through intelligent agents to meet a customer's fuzzy requirements using modules that come from suppliers that are geographically separated and operate on differing computer platforms?" The proposed methodology is applied to a real-world case that involves module-based synthesis at one of the largest distribution centers in the world.
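A minimal sketch of fuzzy requirement matching for module selection, assuming a triangular membership function and invented supplier data; it is not the paper's algorithm:

```python
# Hypothetical fuzzy matching: a vague customer requirement is modeled as a
# triangular fuzzy number, and each supplier module is scored by its membership
# degree. All figures are invented for illustration.

def triangular(x, a, b, c):
    """Membership of x in the triangular fuzzy number (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Customer wants a motor of "around 50 W" (invented requirement).
requirement = (40, 50, 60)

modules = {"supplier_A": 45, "supplier_B": 58, "supplier_C": 70}   # rated wattage per module
scores = {m: triangular(w, *requirement) for m, w in modules.items()}
best = max(scores, key=scores.get)
print(best, scores[best])   # supplier_A 0.5
```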
9.
10.
With the development of the World Wide Web (WWW), storage and utilization of web data have become a big challenge for the data management research community. Web data are essentially heterogeneous and may change schema frequently, so the traditional relational data model is inappropriate for web data management. A new data model, called Wide Table (WT for simplicity), was introduced for this task. The WT model has several characteristics. First, a WT is usually highly sparsely populated, so that most of the data can fit into a single row or record. Second, queries are composed on only a small subset of the attributes. Thus, existing query processing and optimization techniques for relational databases with normalized tables no longer work efficiently. Furthermore, a WT is usually of extremely large volume, and it is thought that only large-scale distributed storage can accommodate the massive data set. In this paper, requirements and challenges of web data management are discussed. Existing techniques for WT, including logical presentation, physical storage, and query processing, are introduced and analyzed in detail.
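A minimal sketch of the sparse wide-table idea, storing only the non-null attributes of each record and projecting just the attributes a query touches; this is my illustration, not the storage scheme analyzed in the paper:

```python
# Sparse wide-table toy: each record keeps only its non-null attributes, so
# thousands of possible columns cost nothing for rows that do not use them.
# Attribute names and values are invented for illustration.

rows = [
    {"_id": 1, "title": "Page A", "author": "alice", "lang": "en"},
    {"_id": 2, "title": "Page B", "price": 9.99, "currency": "EUR"},
    {"_id": 3, "title": "Page C", "video_codec": "h264"},
]

def query(rows, wanted, predicate):
    """Project only the small subset of attributes a query actually touches."""
    for r in rows:
        if predicate(r):
            yield {k: r.get(k) for k in wanted}

for rec in query(rows, ["title", "price"], lambda r: "price" in r):
    print(rec)   # {'title': 'Page B', 'price': 9.99}
```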
11.
In this paper, we provide a logic for the digital investigation of security incidents and its high-level specification language. The logic is used to prove the existence or non-existence of potential attack scenarios which, if executed on the investigated system, would produce the different forms of specified evidence. To generate executable attack scenarios showing in detail how the attack was conducted and how the system behaved accordingly, we develop a model checker tool which provides tolerance to unknown attacks and integrates a technique for generating hypothetical actions.
12.
The aim of this paper is to present a model for the Computer Centre of an important Italian banking group. The model groups data and transactions to deal with the large dimension of the Centre. Transaction arrivals are considered as Poisson stochastic variables and the probability values are estimated. Some computational results are given.
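For a concrete sense of the Poisson assumption (with an invented arrival rate, not the paper's data): if arrivals in an interval have mean rate λ, the probability of observing exactly k arrivals is P(K = k) = e^(-λ) λ^k / k!.

```python
# Poisson probability of k transaction arrivals per interval; lam is a
# hypothetical mean rate, not a figure from the paper.

from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    return exp(-lam) * lam**k / factorial(k)

lam = 12.0   # hypothetical mean arrivals per minute
print(poisson_pmf(10, lam))                          # P(exactly 10 arrivals)
print(sum(poisson_pmf(k, lam) for k in range(21)))   # P(at most 20 arrivals)
```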
13.
Policy actions against various cyberspace crimes should respond both systematically and specifically to the nature of each crime and its accompanying evidence. Voice phishing, an emerging internet fraud practice, has already victimized many unsuspecting consumers and frequently hits the headlines; cyber prostitution, pornography, illegal duplication of software, and online gambling have also become routine problems. Anonymity and facelessness in cyberspace distort one's sense of guilt and allow for rapid collection, modification, and distribution of illegal and harmful information that disrupts social order and public safety. The purpose of this research is to examine the concept and types of cyber crime and to trace their trends in the Republic of Korea in recent years. Because of its informational, national, global, and networking properties, cyber crime often puts serious restraints on conventional criminal justice procedures, which cannot compromise legal issues such as individual safety and the protection of privacy. If made accessible, information stored in private computers has the potential to serve as decisive evidence in many criminal litigations. The Korean criminal procedure law promulgated on July 18, 2011 and enforced since January 1, 2012 contains an express provision permitting the search and seizure of digital evidence. But this partial code does not explicitly define the admissibility of digital evidence or acceptable methods of examining it. Hence, one of the goals of this research is to identify problems pertaining to this law and suggest ways to improve it.
14.
Stephen Mahar Peter A. Salzarulo P. Daniel Wright 《Computers & Operations Research》2012,39(5):991-999
A major development in online retailing is the significant increase in the number of traditional “offline” retailers extending their brands online. Many of these retail/e-tail firms are attempting to leverage channel synergies by allowing customers to purchase products over the internet and then pick their orders up at one of the firm's local stores. This paper proposes that the firm present only a subset of its stores to online customers as available pickup locations, rather than simply listing all local stores with inventory. By doing so, the firm can protect stores with critically low inventory levels and thereby reduce backorder costs. Specifically, we develop and evaluate a dynamic pickup-site inclusion policy that incorporates real-time information to specify which of the firm's e-fulfillment locations should be presented at online checkout. Computational results indicate that managing in-store demand via such policies can decrease total cost (holding, backorder, and lost or redirected pickup sale costs) by as much as 18% compared with allowing customers to pick up online orders from any site with available inventory. The percentage of pickup sales and customers' sensitivity to travel are critical in determining the magnitude of the benefit.
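A hedged sketch of what a pickup-site inclusion rule might look like, assuming a simple per-store protection threshold; the policy in the paper is a dynamic, cost-based one, and the figures below are invented:

```python
# Hypothetical pickup-site inclusion rule: at online checkout, show a store as
# a pickup option only if its real-time inventory position stays above a
# protection threshold after the prospective sale. All data are invented.

from dataclasses import dataclass

@dataclass
class Store:
    name: str
    on_hand: int          # current inventory
    pipeline: int         # units already promised to earlier orders
    protect: int = 3      # units reserved for walk-in demand (invented threshold)

def pickup_sites(stores, qty: int):
    """Return the subset of stores presented to the online customer."""
    return [s.name for s in stores
            if s.on_hand - s.pipeline - qty >= s.protect]

stores = [Store("Downtown", 10, 4), Store("Mall", 5, 4), Store("Airport", 8, 1)]
print(pickup_sites(stores, qty=1))   # "Mall" is withheld to avoid a likely backorder
```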
15.
The Year 2000 software conversion effort dramatically illustrates how time-consuming and costly maintaining large computer systems can become, especially when each system consists of millions of lines of source code. Understanding how a system's components interact is a key factor in implementing portfolio-wide changes, adding new features, and providing ongoing system maintenance. Any help that software developers can get in making existing software systems easier to understand improves developer productivity, enhances software quality, and reduces development cycles, all leading to faster time to market. The paper discusses the Visual Insights code viewer, a visualization application developed specifically to address the problem of working with large amounts of source code. Used within the Lucent Technologies 5ESS Switch development environment, the code viewer has resulted in increased software developer productivity. In addition, a systems integrator currently uses this tool to help understand and correct Year 2000 date references in customer software.
16.
Network administrators use several methods to protect their network. Installing a honeynet within large enterprise networks provides an additional security tool. Honeynets complement the use of firewalls and IDS and help overcome some of the shortcomings inherent in those systems. In addition, honeynets can also serve as platforms for conducting computer security research and education.
17.
Immersion in a digital virtual environment (DVE) increases the likelihood that individuals will feel present in the DVE and hence respond as they would in a similar physically grounded environment. Previous research utilizing high-fidelity technology has demonstrated that starting a virtual experience in a virtual replica of the immediate physical environment increases presence. The purpose of this study was to determine whether using such a transitional environment to increase presence could be replicated on a significantly less immersive system: a 2D desktop monitor with mouse and keyboard for navigation. Participants began their DVE experience either in a “preamble” DVE made to look like the surrounding physical laboratory space, or in a novel DVE (i.e., a house). They were then given verbal instructions to leave their respective environments and go up a set of stairs to explore a museum. Afterward, they reported levels of immersion and presence in the latter DVE. Results demonstrated that entering a target DVE via a familiar “preamble” environment increased perceptions of reality judgment of the virtual experience, perceptions of possibility to act, and levels of presence. These results suggest that incorporating a familiar digital preamble environment as a prelude to the target DVE enables DVE designers and enthusiasts to increase presence without having to invest in more expensive hardware, and that it could also augment existing immersive technology. The efficacy of such preambles may stem from offering a gradual transition into the virtual world, such that familiarity eases users into the novel experience.
18.
This article explores how serious games improve knowledge and competencies management in the context of human resources management. The exploratory research, based on the conceptual framework of Nonaka's SECI model, analyzes the performance of three serious games developed in three different financial companies in France, the USA, and India. These three case studies help define a seven-step development process for a knowledge and competencies management serious game. The banking sector has interesting characteristics for this study, as some of the associated knowledge is both very standardized and highly heterogeneous. It is shown that serious games contribute significantly to improving the “socialization”, “externalization”, “combination”, and “internalization” of knowledge, and that they promote benchmarking throughout the company.
19.
Internet users share a massive number of digital images daily. The accessibility of powerful image manipulation tools has made the integrity of image contents questionable. The most popular image tampering is to duplicate a region elsewhere in the same image to replicate or conceal some other region. The duplicated regions have identical color and texture attributes, which makes this artifact invisible to the human eye. Therefore, efficient techniques are required to verify the credibility of image contents by detecting regions duplicated within digital images. This paper proposes an efficient technique for exposing region duplication forgery in digital images. The proposed technique divides the approximation (LL) sub-band of the shift-invariant stationary wavelet transform into overlapping blocks of size w × w (w = 4, 8). Distinctive features extracted from the overlapping blocks are utilized to expose region duplication forgeries. The experimental results of the proposed technique are compared with state-of-the-art techniques, revealing the prominence and effectiveness of the proposed technique in terms of precision, recall and F1 score for different block sizes. The proposed technique can therefore be reliably applied to identify counterfeited regions, with benefits in fields such as crime investigation, news reporting, and the judiciary.
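A simplified sketch of block-based duplicate detection; the paper operates on the LL sub-band of a stationary wavelet transform with its own feature set, whereas this toy version works directly on a grayscale array with crude quadrant-mean features:

```python
# Toy copy-move detection (my illustration, not the authors' pipeline):
# overlapping w x w blocks are described by quadrant means, sorted so identical
# blocks become neighbours, and matching non-adjacent pairs are reported.

import numpy as np

def duplicated_blocks(img: np.ndarray, w: int = 8):
    """Return pairs of top-left corners of overlapping w x w blocks with identical features."""
    feats = []
    for i in range(img.shape[0] - w + 1):
        for j in range(img.shape[1] - w + 1):
            block = img[i:i + w, j:j + w]
            # crude feature: four quadrant means (a stand-in for the paper's descriptors)
            f = tuple(block.reshape(2, w // 2, 2, w // 2).mean(axis=(1, 3)).ravel().round(6))
            feats.append((f, (i, j)))
    feats.sort(key=lambda t: t[0])                    # sorting brings duplicates together
    pairs = []
    for (f1, p1), (f2, p2) in zip(feats, feats[1:]):
        if f1 == f2 and abs(p1[0] - p2[0]) + abs(p1[1] - p2[1]) > w:
            pairs.append((p1, p2))                    # ignore trivially adjacent matches
    return pairs

# Synthetic example: copy one patch onto another location and detect it.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
img[40:48, 40:48] = img[8:16, 8:16]                   # simulated copy-move
print(duplicated_blocks(img))                         # reports the (8, 8) / (40, 40) pair
```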
20.
Weili Han Ye Cao Elisa Bertino Jianming Yong 《Expert systems with applications》2012,39(15):11861-11869
Theft attacks on web digital identities, e.g., phishing and pharming, can result in severe losses to users and vendors, and can even hold users back from using online services, especially e-business services. In this paper, we propose an approach, referred to as automated individual white-list (AIWL), to protect users' web digital identities. AIWL leverages a Naïve Bayesian classifier to automatically maintain an individual white-list for each user. If the user tries to submit his or her account information to a web site that does not match the white-list, AIWL alerts the user to the possible attack. Furthermore, AIWL keeps track of the features of login pages (e.g., IP addresses, document object model (DOM) paths of input widgets) in the individual white-list. By checking the legitimacy of these features, AIWL can efficiently defend users against hard attacks, especially pharming, and even dynamic pharming. Our experimental results and user studies show that AIWL is an efficient tool for protecting web digital identities.
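A toy white-list check in the spirit of AIWL, with invented URLs, IP addresses, and DOM paths; it omits the Naïve Bayesian maintenance of the list:

```python
# Toy individual white-list (not the AIWL code): each trusted login page is
# remembered with a few features, and a submission to a page whose features do
# not match triggers a warning. All names and values below are invented.

whitelist = {
    "bank.example.com/login": {
        "ip": "203.0.113.7",
        "dom_path": "html/body/form[1]/input[name=passwd]",
    },
}

def check_submission(url: str, ip: str, dom_path: str) -> bool:
    """Return True only if the login page matches its stored white-list entry."""
    entry = whitelist.get(url)
    if entry is None:
        return False                                  # unfamiliar site: warn the user
    return entry["ip"] == ip and entry["dom_path"] == dom_path

# A pharming attack resolves the familiar URL to a different server:
ok = check_submission("bank.example.com/login", "198.51.100.9",
                      "html/body/form[1]/input[name=passwd]")
print("warn user" if not ok else "allow")             # prints "warn user"
```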