Similar Articles
Found 20 similar articles (search time: 31 ms)
1.
Research in information security, risk management and investment has grown in importance over the last few years. However, without reliable estimates of attack probabilities, risk management is difficult in practice. Using a novel data set, we provide estimates of attack propensity and how it changes with disclosure and patching of vulnerabilities. Disclosure of software vulnerabilities has been controversial: on one side are those who propose full and instant disclosure, whether or not a patch is available, and on the other are those who argue for limited or no disclosure. Which of the two policies is socially optimal depends critically on how attack frequency changes with disclosure and patching. In this paper, we empirically explore the impact of vulnerability disclosure and patch availability on attacks targeting the vulnerability. Our results suggest that, on average, both secret (non-published) and published (published and not patched) vulnerabilities attract fewer attacks than patched (published and patched) vulnerabilities. When we control for time since publication and patching, we find that patching an already known vulnerability decreases the number of attacks, although attacks gradually increase with time after patch release. Patching an unknown vulnerability, however, causes a spike in attacks, which then gradually declines after patch release. Attacks on secret vulnerabilities slowly increase with time until the vulnerability is published, and then rapidly decrease with time after publication.
Rahul Telang

Ashish Arora is Professor of Economics and Public Policy in the Heinz School. Arora's research focuses on the economics of technology and technical change. His research interests include the study of technology-intensive industries such as software, biotechnology, and chemicals; the role of patents and licensing in promoting technology startups; and the economics of information technology. Anand Nandkumar is a graduate student pursuing a Ph.D. at Carnegie Mellon University. His research interests include the economics of information security and the economics of entrepreneurship and strategy in the software industry. Rahul Telang is an Assistant Professor of Information Systems at Carnegie Mellon University. Telang's key research field is the economics of information security. He has done extensive empirical and analytical work on disclosure issues surrounding software vulnerabilities, software vendors' incentives to provide quality, and mechanism design for optimal security investments in multi-unit firms.

2.
Modern business process management extends to cover partner organisations' business processes across organisational boundaries, thereby supporting organisations in coordinating the flow of information among organisations and linking their business processes. With collaborative business processes, organisations can create dynamic and flexible collaborations to adapt synergistically to changing conditions and stay competitive in the global market. Owing to their significant potential and value, collaborative business processes are becoming an important issue in contemporary business process management, and attract considerable attention and effort from both academia and industry. In this paper, we review the development of B2B collaboration and collaborative business processes, provide an overview of related issues in managing collaborative business processes, and discuss some emerging technologies and their relationships to collaborative business processes. Finally, we introduce the papers that are published in this special issue.
Xiaohui Zhao (Corresponding author)

3.
Increasingly, business processes are being controlled and/or monitored by information systems. As a result, many business processes leave their "footprints" in transactional information systems, i.e., business events are recorded in so-called event logs. Process mining exploits these footprints by providing techniques and tools for discovering process, control, data, organizational, and social structures from event logs; the basic idea of process mining is to diagnose business processes by mining event logs for knowledge. In this paper we focus on the potential use of process mining for measuring business alignment, i.e., comparing the real behavior of an information system or its users with the intended or expected behavior. We identify two ways to create and/or maintain the fit between business processes and supporting information systems: Delta analysis and conformance testing. Delta analysis compares the discovered model (i.e., an abstraction derived from the actual process) with some predefined process model (e.g., the workflow model or reference model used to configure the system). Conformance testing attempts to quantify the "fit" between the event log and some predefined process model. In this paper, we show that Delta analysis and conformance testing can be used to analyze business alignment as long as the actual events are logged and users have some control over the process.
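A minimal sketch of the conformance-testing idea, reduced to a toy fitness measure: the fraction of logged traces that a predefined process model can replay. Representing the model as a set of allowed traces, and the event and function names, are simplifying assumptions for illustration, not the authors' actual metric.

```python
# Toy conformance test: how many logged traces does the model accept?
def trace_fitness(event_log, allowed_traces):
    """event_log: list of traces (tuples of activity names);
    allowed_traces: set of traces the process model permits."""
    if not event_log:
        return 1.0
    replayable = sum(1 for trace in event_log if tuple(trace) in allowed_traces)
    return replayable / len(event_log)

model = {("register", "check", "approve"), ("register", "check", "reject")}
log = [("register", "check", "approve"),
       ("register", "approve"),              # skips "check": not replayable
       ("register", "check", "reject")]
print(trace_fitness(log, model))             # 0.666...
```

A real conformance technique replays traces against a process model (e.g., a Petri net) and rewards partial fits rather than exact set membership; this sketch only conveys the quantification idea.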
W. M. P. van der Aalst

4.
Statistical process control (SPC) is a conventional means of monitoring software processes and detecting related problems, where the causes of detected problems can be identified using causal analysis. Determining the actual causes of reported problems requires significant effort due to the large number of possible causes. This study presents an approach to detecting problems and identifying their causes using multivariate SPC. The proposed method can monitor multiple measures of a software process simultaneously. The measures detected as the major contributors to out-of-control signals can be used to identify the causes; in this study, partial least squares (PLS) and statistical hypothesis testing are used to validate the identified causes. The main advantage of the proposed approach is that correlated indices can be monitored simultaneously to facilitate the causal analysis of a software process.
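To illustrate what monitoring multiple correlated measures simultaneously can look like, the sketch below flags out-of-control observations with Hotelling's T² statistic, a standard multivariate SPC tool. The synthetic data, the fixed control limit, and the omission of the paper's PLS-based cause validation are all simplifications.

```python
import numpy as np

def hotelling_t2(baseline, new_points):
    """T^2 = d^T S^{-1} d for each new observation d = x - mean, with the
    mean and covariance S estimated from in-control baseline data."""
    mean = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
    diff = new_points - mean
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

rng = np.random.default_rng(0)
baseline = rng.normal(size=(50, 3))        # 50 in-control samples, 3 process measures
new = np.vstack([rng.normal(size=(3, 3)),  # behaves like the baseline
                 [[4.0, 4.0, 4.0]]])       # deliberately anomalous point
print(hotelling_t2(baseline, new) > 12.0)  # illustrative limit; real limits come from the F distribution
```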
Chih-Ping Chu

Ching-Pao Chang is a PhD candidate in Computer Science & Information Engineering at National Cheng-Kung University, Taiwan. He received his MA in Computer Science from the University of Southern California in 1998. His current work deals with software process improvement and defect prevention using machine learning techniques. Chih-Ping Chu is Professor of Software Engineering in the Department of Computer Science & Information Engineering at National Cheng-Kung University (NCKU) in Taiwan. He received his MA in Computer Science from the University of California, Riverside in 1987, and his Doctorate in Computer Science from Louisiana State University in 1991. He is especially interested in parallel computing and software engineering.

5.
Scenario-based methods for evaluating software architecture require a large number of stakeholders to be collocated for evaluation meetings. Collocating stakeholders is often an expensive exercise. To reduce this expense, we have proposed a framework for supporting the software architecture evaluation process using groupware systems. This paper presents a controlled experiment that we conducted to assess the effectiveness of one of the key activities, developing scenario profiles, of the proposed groupware-supported process of evaluating software architecture. We used a cross-over experiment involving 32 three-person teams of third- and fourth-year undergraduate students. We found that the quality of scenario profiles developed by distributed teams using a groupware tool was significantly better than the quality of scenario profiles developed by face-to-face teams (p < 0.001). However, questionnaires indicated that most participants preferred the face-to-face arrangement (82%), and 60% thought the distributed meetings were less efficient. We conclude that distributed meetings for developing scenario profiles are highly effective, but that tool support must be of a high standard or participants will not find distributed meetings acceptable.
Ross Jeffery

Dr. Muhammad Ali Babar is a Senior Researcher with Lero, the Irish Software Engineering Research Centre. Previously, he worked as a researcher with National ICT Australia (NICTA). Prior to joining NICTA, he worked as a software engineer and an IT consultant. He has authored/co-authored more than 50 publications in peer-reviewed journals, conferences, and workshops. He has presented tutorials in the area of software architecture knowledge management at various international conferences including ICSE 2007, SATURN 2007 and WICSA 2007. His current research interests include software product lines, software architecture design and evaluation, architecture knowledge management, tool support, and empirical methods of technology evaluation. He is a member of the IEEE Computer Society. Barbara Kitchenham is Professor of Quantitative Software Engineering at Keele University in the UK. From 2004 to 2007, she was a Senior Principal Researcher at National ICT Australia. She has worked in software engineering for nearly 30 years, both in industry and academia. Her main research interest is software measurement and its application to project management, quality control, risk management and evaluation of software technologies. Her most recent research has focused on the application of evidence-based practice to software engineering. She is a Chartered Mathematician and Fellow of the Institute of Mathematics and its Applications, a Fellow of the Royal Statistical Society and a member of the IEEE Computer Society. Dr. Ross Jeffery is Research Program Leader for Empirical Software Engineering in NICTA and Professor of Software Engineering in the School of Computer Science and Engineering at UNSW. His research interests are in software engineering process and product modeling and improvement, electronic process guides and software knowledge management, software quality, software metrics, software technical and management reviews, and software resource modeling and estimation. His research has involved over fifty government and industry organizations over a period of 20 years and has been funded by industry, government and universities. He has co-authored four books and over one hundred and forty research papers. He was elected Fellow of the Australian Computer Society for his contribution to software engineering research.

6.
In this paper, a statistical model called statistical local spatial relations (SLSR) is presented as a novel learning model that combines spatial and statistical information for semantic image classification. The model is inspired by probabilistic latent semantic analysis (PLSA) for text mining. In text analysis, PLSA is used to discover topics in a corpus using the bag-of-words document representation. In SLSR, we treat image categories as topics, so an image containing instances of multiple categories can be modeled as a mixture of topics. More significantly, SLSR introduces spatial relation information as a factor, which is not present in PLSA. SLSR has rotation, scale, translation and affine invariant properties and can handle partial occlusion. Using the Dirichlet process and a variational Expectation-Maximization learning algorithm, SLSR is implemented as an image classification algorithm. SLSR uses an unsupervised process which can capture both spatial relations and statistical information simultaneously. Experiments on standard data sets show that the SLSR model is a promising model for semantic image classification problems.
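For reference, the standard PLSA aspect model that SLSR builds on writes the probability of a visual word w in an image (document) d as a mixture over latent topics z, which SLSR identifies with image categories:

```latex
% PLSA aspect model: an image is a mixture of category topics.
P(w \mid d) = \sum_{z} P(w \mid z)\, P(z \mid d)
```

SLSR additionally factors in local spatial relations, which this plain PLSA decomposition does not capture.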
Wenhui Li (Corresponding author)

Dongfeng Han received the B.Sc. in 2002 and M.S. in 2005 in computer science and technology from Jilin University, Changchun, P. R. China. Since 2005, he has been pursuing the PhD degree in computer science and technology at Jilin University. His research interests include computer vision, image processing, machine learning and pattern recognition. Wenhui Li received the PhD degree in computer science from Jilin University in 1996. He is now a professor at Jilin University. His research interests include computer vision, computer graphics and virtual reality. Zongcheng Li is an undergraduate student at Shandong University of Technology, P. R. China. His research interests include computer vision and image processing.

7.
This research develops a framework for organizational value creation from agile IT applications. Based on four themes in the business value research—business process perspective, complementarities, application level of analysis, and extent of use—three antecedents (organizational fit, process assimilation, and network adoption) are identified as prerequisites for realizing the value of agile supply chain applications. Advanced planning and scheduling (APS) systems are used as examples, and two case studies of their implementation in the electronics and consumer goods industries are reported to support the propositions. The theories of diffusion of innovation, complementarities, network externalities, and technology structuration are applied to develop the propositions for fit, assimilation, and network effects. Information sharing and industry clockspeed are identified as the moderating factors in the proposed model. The framework has both managerial and research relevance. The research guides managers on ways to more fully realize the value of agile applications and forms a basis for future research on the business value of IT applications.
David J. Closs

8.
Various types of applications access objects distributed in peer-to-peer (P2P) overlay networks. Even if the locations of target objects are detected by algorithms such as flooding and distributed hash tables (DHTs), applications cannot manipulate the target objects without issuing access requests. It is critical to discover which peer can manipulate an object with which method, i.e. only a peer with an access right is allowed to manipulate an object. Hence, the application has to find target peers that can manipulate a target object, rather than merely detect the object's location. Due to the scalability and variety of peers, it is difficult, possibly impossible, to maintain a centralized directory showing in which peer each object is stored. An acquaintance peer of a peer p is a peer whose service p knows and with which p can directly communicate. We discuss types of acquaintance relations of peers with respect to what objects each peer holds, is allowed to manipulate, and can grant access rights on. Acquaintance peers of a peer may notify the peer of different information on target peers due to communication and propagation delay. Here, it is critical to discuss how much a peer trusts each acquaintance peer. We first define satisfiability, i.e. how much a peer is satisfied by issuing an access request to another peer. For example, if a peer locally manipulates a target object o and sends a response, the requesting peer p_i is mostly satisfied. On the other hand, if the peer has to ask another peer to manipulate the object o, p_i is less satisfied. We define the trustworthiness of an acquaintance peer from the satisfiability and a ranking factor.
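A loose illustration of how satisfiability could feed a trust score: trust in an acquaintance is updated from observed outcomes, with local manipulation scoring higher than forwarding. The outcome scores, the moving-average update rule, and all names here are assumptions for illustration, not the paper's definitions.

```python
# Observed satisfiability per outcome: locally manipulated requests
# satisfy the requester more than forwarded ones (per the abstract).
SATISFIABILITY = {"local": 1.0, "forwarded": 0.5, "failed": 0.0}

def update_trust(current_trust, outcome, ranking_factor=1.0, alpha=0.2):
    """Blend the observed satisfiability (scaled by a ranking factor)
    into the running trust score of an acquaintance peer."""
    observed = min(1.0, SATISFIABILITY[outcome] * ranking_factor)
    return (1 - alpha) * current_trust + alpha * observed

trust = 0.6
for outcome in ["local", "forwarded", "local"]:
    trust = update_trust(trust, outcome)
print(round(trust, 3))
```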
Makoto Takizawa

9.
Businesses need to continuously focus on change and innovation in order to survive in dynamic environments. The ability of an organization to deploy appropriate business processes requires that the fit between business processes and the systems that support the management of these processes is continuously maintained and evolved. Acquiring and using knowledge about the context in which business processes are defined, modified, and implemented can help maintain this fit. We identify requirements for a business process management system (BPMS) capable of managing contextual knowledge. Based on these requirements, we have enhanced KOPeR, a knowledge-based system for business process improvement, with an explanation facility that can acquire and maintain knowledge about the context behind process definitions and design choices. A case study illustrates the functionalities of this system, which is designed to improve the fit between business processes and the BPMS.
Peng Xu

10.
We present a comprehensive unified modeling language (UML) statechart diagram analysis framework. The framework allows one to progressively perform different analysis operations on UML statechart diagrams at different levels of model complexity. The analysis operations supported by the framework are based on analyzing Petri net models converted from UML statechart diagrams using a previously proposed transformation approach. After introducing the general framework, the paper emphasizes two simulation-based analysis operations from the framework: direct MSC (message sequence chart) inspection, which provides a visual representation of system behavior described by statechart diagrams; and a pattern-based trace query technique, which can be used to define and query system properties. Two case-study examples are presented with different emphases. The gas station example is a simple multi-object system used to demonstrate both the visual and query-based analysis operations. The early warning system example uses only one object, but features composite states and includes analysis specifically aimed at one composite-state feature, history states.
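The pattern-based trace query idea can be conveyed with a toy encoding: simulated events become tokens, and a property query becomes a pattern over the token stream. The gas-station-style event names and the regular-expression encoding are illustrative assumptions; the framework's actual query patterns are richer.

```python
import re

def violates(trace, violation_pattern):
    """Check a simulation trace (list of event names) against a
    regular-expression encoding of a property violation."""
    return re.search(violation_pattern, " ".join(trace)) is not None

trace = ["arrive", "pump_on", "fuel", "pump_off", "pay", "leave"]
# Violation: a pump_on that is never followed by a pump_off.
violation = r"pump_on(?:(?! pump_off) \S+)*$"
print("violation" if violates(trace, violation) else "property holds")
```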
Sol M. Shatz

Jiexin Lian is a Ph.D. candidate in computer science at the University of Illinois at Chicago. His research interests include software engineering and Petri net theory and applications. He received his B.S. in computer science from Tongji University, China. Zhaoxia Hu received her B.S. degree in Physics from Beijing University, Beijing, China in 1990. She received the M.S. and Ph.D. degrees, in computer science, from the University of Illinois at Chicago, Chicago, IL, in 2001 and 2005, respectively. She currently works for an investment research company (Morningstar, Inc.) as an application developer. Sol M. Shatz received the B.S. degree in computer science from Washington University, St. Louis, Missouri, and the M.S. and Ph.D. degrees, also in computer science, from Northwestern University, Evanston, IL, in 1981 and 1983, respectively. He is currently a Professor of Computer Science and Associate Dean for Research and Graduate Studies in the College of Engineering at the University of Illinois at Chicago. He also serves as co-director of the Concurrent Software Systems Laboratory. His research is in the field of software engineering, with particular interest in formal methods for specification and analysis of concurrent and distributed software. He has served on the program and organizing committees of many conferences, including as co-organizer of the Workshop on Software Engineering and Petri Nets held in Denmark, June 2000; program co-chair for the International Conference on Distributed Computing Systems (ICDCS), 2003; and General Chair for ICDCS 2007. He has given invited talks in the US, Japan, and China, and presented tutorials (both live and video) for the IEEE Computer Society. Dr. Shatz is a member of the Editorial Board of various technical journals, having served on the Editorial Board of IEEE Transactions on Software Engineering from 2001 to 2005. His research has been supported by grants from NSF and ARO, among other agencies and companies. He has received various teaching awards from the University of Illinois at Chicago as well as the College of Engineering's Faculty Research Award in 2003.

11.
A novel approach for process mining based on event types (total citations: 2; self-citations: 0; citations by others: 2)
Despite the omnipresence of event logs in transactional information systems (cf. WFM, ERP, CRM, SCM, and B2B systems), historic information is rarely used to analyze the underlying processes. Process mining aims to improve this by providing techniques and tools for discovering process, control, data, organizational, and social structures from event logs; the basic idea of process mining is to diagnose business processes by mining event logs for knowledge. Given its potential and challenges, it is no surprise that process mining has recently become a lively research area. In this paper, a novel approach for process mining based on two event types, i.e., START and COMPLETE, is proposed. Information about the start and completion of tasks can be used to detect parallelism explicitly. The algorithm presented in this paper overcomes some of the limitations of existing algorithms such as the α-algorithm (e.g., short loops) and therefore enhances the applicability of process mining.
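The core observation, that START/COMPLETE pairs turn each task execution into a time interval whose overlaps reveal parallelism, can be sketched as follows. The event-tuple format is an assumption, and this shows only the interval-overlap step, not the full mining algorithm.

```python
from collections import defaultdict

def overlapping_pairs(events):
    """events: (case_id, task, event_type, timestamp) tuples with
    event_type in {"START", "COMPLETE"}. Returns pairs of tasks whose
    execution intervals overlap within the same case."""
    intervals = defaultdict(dict)  # case -> task -> [start, complete]
    for case, task, etype, ts in events:
        slot = 0 if etype == "START" else 1
        intervals[case].setdefault(task, [None, None])[slot] = ts
    parallel = set()
    for tasks in intervals.values():
        done = [(t, s, c) for t, (s, c) in tasks.items() if s is not None and c is not None]
        for i, (t1, s1, c1) in enumerate(done):
            for t2, s2, c2 in done[i + 1:]:
                if s1 < c2 and s2 < c1:          # the two intervals overlap
                    parallel.add(tuple(sorted((t1, t2))))
    return parallel

log = [(1, "A", "START", 0), (1, "B", "START", 1),
       (1, "A", "COMPLETE", 3), (1, "B", "COMPLETE", 4)]
print(overlapping_pairs(log))                    # {('A', 'B')}
```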
Jiaguang Sun

12.
A key consideration during investment decision making is the overall business value potential of an information technology (IT) solution. The complexity of the contemporary IT landscape is increasing. As information systems and technologies become more advanced and interconnected, they often impact multiple business processes in the organization. This in turn increases the complexity of IT investment decisions. This paper describes a decision framework for considering investments in information technologies that impact multiple business processes in the organization. The decision framework is illustrated via a case study of a small business that invested in mobile and wireless computing. The microcosm of the small business serves to illustrate aspects of the business value derived from information technology investments that are often challenging to isolate in more complex organizational environments. The decision framework can help managers analyze the overall business value returns arising from the 'ripple effect' of an IT investment on core and ancillary business processes. In the decision framework, the business value ripple effect is analyzed via a vertical dimension that emanates from core business processes to ancillary processes, and a horizontal dimension that extends over time.
Rens Scheepers

13.
The functions of Australia's railways are divided between the delivery of suburban passenger railway services, long-distance general and intermodal freight services, and regional bulk commodity haulage. There is a national interest in the efficient carriage of export freight flows because Australia is a major trading nation, and Australia's railways are expected to play their part in hauling export commodities. Unsurprisingly, there are conflicting demands on these railways. Hence, universities, and the industry itself, are directing research and investigation effort into policy, planning, engineering, operational and human factors matters. A question which then arises is what means could be used to analyse the problems identified during this research and investigation. This paper is thus concerned with the inter-relationships between the planning, engineering and operations of railways in Australia. It identifies four areas of analysis associated with the planning and development of railway infrastructure and operations. It then discusses a range of analytical tools which could be applied to different components of these analytical areas and critiques their appropriateness from an Australian perspective. Having made this assessment, the paper uses a recent Australian case study to show how the analytical tools could be used and what lessons might be learnt from the process. [Figure: Use of railway analysis tools from an Australian perspective]
Alex W. Wardrop

14.
Because requirements engineering (RE) problems are widely acknowledged as having a major impact on the effectiveness of the software development process, Sommerville et al. developed a requirements maturity model. However, research has shown that the measurement process within Sommerville's model is ambiguous, and implementation of his requirements maturity model leads to confusion. Hence, the objective of our research is to propose a new RE maturity measurement framework (REMMF) based on Sommerville's model and to provide initial validation of REMMF. The main purpose of proposing REMMF is to allow us to more effectively measure the maturity of the RE processes being used within organisations and to assist practitioners in measuring the maturity of their RE processes. In order to evaluate REMMF, two organisations implemented the measurement framework within their IT divisions, provided us with an assessment of their requirements process, and gave feedback on the REMMF measurement process. The results show that our measurement framework is clear, easy to use and provides an entry point through which practitioners can effectively judge the strengths and weaknesses of their RE processes. When an organisation knows where it is, it can more effectively plan for improvement.
June Verner

15.
We show how to create a music video automatically, using computable characteristics of the video and music to promote coherent matching. We analyze the flow of both music and video, and then segment them into sequences of near-uniform flow. We extract features from both the video and music segments, and then find matching pairs. The granularity of the matching process can be adapted by extending the segmentation process to several levels. Our approach drastically reduces the skill required to make simple music videos.
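A toy version of the matching step: after segmentation and feature extraction, each music segment is greedily paired with the closest unused video segment in feature space. The two-dimensional features and the greedy strategy are illustrative assumptions; the paper's features and matching are more elaborate.

```python
import math

def match_segments(music_feats, video_feats):
    """Greedy nearest-neighbour pairing; both inputs are lists of
    equal-length feature tuples, with at least as many video segments
    as music segments. Returns (music_index, video_index) pairs."""
    unused = set(range(len(video_feats)))
    pairs = []
    for mi, mf in enumerate(music_feats):
        best = min(unused, key=lambda vi: math.dist(mf, video_feats[vi]))
        pairs.append((mi, best))
        unused.remove(best)
    return pairs

music = [(0.9, 0.2), (0.1, 0.8)]     # e.g., (tempo-like, intensity-like) per segment
video = [(0.2, 0.7), (0.8, 0.3)]
print(match_segments(music, video))  # [(0, 1), (1, 0)]
```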
Siwoo Byun

Jong-Chul Yoon received his B.S. and M.S. degrees in Media from Ajou University in 2003 and 2005, respectively. He is currently a Ph.D. candidate in Computer Science at Yonsei University. His research interests include computer animation, multimedia control, and geometric modeling. In-Kwon Lee received his B.S. degree in Computer Science from Yonsei University in 1989 and earned his M.S. and Ph.D. in Computer Science from POSTECH in 1992 and 1997, respectively. Currently, he is teaching and researching in the areas of computer animation, geometric modeling, and computational music at Yonsei University. Siwoo Byun received his B.S. degree in Computer Science from Yonsei University in 1989 and earned his M.S. and Ph.D. in Computer Science from the Korea Advanced Institute of Science and Technology (KAIST) in 1991 and 1999, respectively. Currently, he is teaching and researching in the areas of distributed database systems, mobile computing, and fault-tolerant systems at Anyang University.

16.
Emphysema is a common chronic respiratory disorder characterised by the destruction of lung tissue. It is a progressive disease where the early stages are characterised by a diffuse appearance of small air spaces, and later stages exhibit large air spaces called bullae. A bullous region is a sharply demarcated region of emphysema. In this paper, it is shown that an automated texture-based system based on co-training is capable of achieving multiple levels of emphysema extraction in high-resolution computed tomography (HRCT) images. Co-training is a semi-supervised technique used to improve classifiers that are trained with very few labelled examples using a large pool of unseen examples over two disjoint feature sets called views. It is also shown that examples labelled by experts can be incorporated within the system in an incremental manner. The results are also compared against “density mask”, currently a standard approach used for emphysema detection in medical image analysis and other computerized techniques used for classification of emphysema in the literature. The new system can classify diffuse regions of emphysema starting from a bullous setting. The classifiers built at different iterations also appear to show an interesting correlation with different levels of emphysema, which deserves more exploration.
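A minimal co-training loop in the spirit the abstract describes: two classifiers, one per disjoint feature view, alternately add their most confident predictions on the unlabelled pool to the shared label set. The classifier choice, toy data, and labelling schedule are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def co_train(X1, X2, y, labelled, rounds=3, per_round=1):
    """X1, X2: the two feature views; y: labels with -1 for unlabelled;
    labelled: indices whose labels are trusted initially."""
    y, labelled = y.copy(), set(labelled)
    for _ in range(rounds):
        for X_view in (X1, X2):
            idx = sorted(labelled)
            clf = LogisticRegression().fit(X_view[idx], y[idx])
            pool = [i for i in range(len(y)) if i not in labelled]
            if not pool:
                return y
            proba = clf.predict_proba(X_view[pool])
            # Promote this view's most confident pool example(s).
            for j in np.argsort(proba.max(axis=1))[::-1][:per_round]:
                y[pool[j]] = clf.classes_[proba[j].argmax()]
                labelled.add(pool[j])
    return y

X1 = np.array([[0.0], [1.0], [0.1], [0.9], [0.2], [0.8]])
X2 = np.array([[0.1], [0.9], [0.0], [1.0], [0.2], [0.8]])
y = np.array([0, 1, -1, -1, -1, -1])        # only examples 0 and 1 start labelled
print(co_train(X1, X2, y, labelled=[0, 1]))
```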
Mithun Prasad (Corresponding author)
Arcot Sowmya
Peter Wilson

Mithun Prasad received his PhD from the University of New South Wales, Sydney, Australia in 2006. He was a postdoctoral scholar at the University of California, Los Angeles and is now a research associate at Rensselaer Polytechnic Institute, NY. His research interests are computer-aided diagnosis and cell and tissue image analysis. Arcot Sowmya is a Professor in the School of Computer Science and Engineering, UNSW, Sydney. She holds a PhD degree in Computer Science from the Indian Institute of Technology, Bombay, besides other degrees in Mathematics and Computer Science. Her areas of research include learning in vision as well as embedded system design. Her research has been applied to the extraction of linear features in remotely sensed images as well as feature extraction, recognition and computer-aided diagnosis in medical images. Peter Wilson is a clinical Radiologist at Pittwater Radiology in Sydney. He was trained at Royal North Shore Hospital and taught Body Imaging at the University of Rochester, NY, prior to taking up his current position.

17.
Current workflow management technology offers rich support for process-oriented coordination of distributed teamwork. In this paper, we evaluate the performance of an industrial workflow process in which similar tasks can be performed by various actors at many different locations. We analyzed a large workflow process log with state-of-the-art mining tools associated with the ProM framework. Our analysis leads to the conclusion that there is a positive effect on process performance when workflow actors are geographically close to each other. Our case study shows that the use of workflow technology in itself is not sufficient to overcome geographical barriers between team members and that additional measures are required to achieve the desired performance.
Byungduk Jeong

18.
Service-oriented architecture (SOA) and Software as a Service (SaaS) are the latest hot topics in software manufacturing and delivery, and attempt to provide a dynamic cross-organisational business integration solution. In a dynamic cross-organisational collaboration environment, services involved in a business process are generally provided by different organisations and lack the support of common security mechanisms and centralized management middleware. On such occasions, services may have to provide middleware functionalities themselves and achieve business objectives in a pure peer-to-peer fashion. As the participating services involved in a business process may be selected and combined at run time, a participating service may have to collaborate with multiple participating services of which it has no prior knowledge. This introduces new challenges for traditional trust management mechanisms. Automated Trust Negotiation (ATN) is a practical approach which helps to establish mutual trust between collaborating principals that may have no pre-existing knowledge of each other, in a peer-to-peer way. Because credentials often contain sensitive attributes, ATN defines an iterative and bilateral negotiation process for credential exchange and specifies security policies that regulate the disclosure of sensitive credentials. Credential disclosure in this iterative process may follow different orders and combinations, each of which forms a credential chain. It is practically desirable to identify the optimal credential chain that satisfies certain objectives such as minimum release of sensitive information and minimum performance penalty. In this paper we present a heuristic and context-aware algorithm for identifying the optimal chain that uses context-related knowledge to minimize (1) the release of sensitive information, including both credentials and policies, and (2) the cost of credential retrieval. Moreover, our solution offers a hierarchical method for protecting sensitive policies and provides a risk-based strategy for handling credential circular dependency. We have implemented the ATN mechanisms based on our algorithm and incorporated them into the CROWN Grid middleware. Experimental results demonstrate their performance-related advantages over other existing solutions.
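The optimal-chain search can be pictured as a cheapest-path problem over disclosure states, as in the hedged sketch below: each credential carries a sensitivity cost and policy prerequisites, and a uniform-cost search returns the cheapest disclosure order that unlocks the goal. The costs, policies and names are invented for illustration; the paper's algorithm is heuristic and context-aware, and additionally handles policy sensitivity and circular dependencies, which this sketch does not.

```python
import heapq

def optimal_chain(costs, prereqs, goal):
    """costs: {credential: sensitivity cost}; prereqs: {credential: set of
    credentials that must already be disclosed}; goal: the credential the
    negotiation must unlock. Returns (total cost, disclosure order)."""
    heap = [(0, [], frozenset())]
    seen = set()
    while heap:
        cost, order, disclosed = heapq.heappop(heap)
        if goal in disclosed:
            return cost, order
        if disclosed in seen:
            continue
        seen.add(disclosed)
        for cred, c in costs.items():
            if cred not in disclosed and prereqs.get(cred, set()) <= disclosed:
                heapq.heappush(heap, (cost + c, order + [cred], disclosed | {cred}))
    return None

costs = {"id": 1, "employer": 2, "clearance": 5, "project_role": 1}
prereqs = {"clearance": {"id"}, "project_role": {"id", "employer"}}
print(optimal_chain(costs, prereqs, goal="project_role"))  # total cost 4
```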
Jie Xu

Jianxin Li is a research staff member and assistant professor in the School of Computer Science and Engineering, Beihang University, Beijing, China. He received the Ph.D. degree in January 2008. He has authored over 10 papers in venues such as SRDS, HASE and eScience. His research interests include trust management, information security and distributed systems.
Dacheng Zhang   received his BSc. in Computer Science at Northern Jiaotong University. Dacheng then worked at the Beijing Rail Mansion and Beijing Zhan Hua Dong He Ltd. as a software engineer. In 2004, Dacheng received his MSc. degree in Computer Science at the University of Durham. The topic of his thesis was “Multi-Party Authentication for Web Services”. Dacheng is now a PhD student in the School of Computing, University of Leeds, UK. His research area covers Multi-Party Authentication systems for Web services, Long Transactions, and Identity based authentication systems. Currently, he is exploring Coordinated Automatic Actions to manage Web Service Multi-Party Sessions.
Jinpeng Huai is a Professor and Vice President of Beihang University. He serves as Chief Scientist on the Steering Committee for the Advanced Computing Technology Subject of the National High-Tech Program (863). He is a member of the Consulting Committee of the Central Government Information Office, and Chairman of the Expert Committee in both the National e-Government Engineering Taskforce and the National e-Government Standard office. Dr. Huai and his colleagues are leading key e-Science projects of the National Science Foundation of China (NSFC) and Sino-UK collaborations. He has authored over 100 papers. His research interests include middleware, peer-to-peer (P2P), grid computing, trustworthiness and security.
Professor Jie Xu is Chair of Computing at the University of Leeds (UK) and Director of the EPSRC WRG e-Science Centre involving the three White Rose Universities of Leeds, York and Sheffield. He is also a visiting professor at the School of Computing Science, the University of Newcastle upon Tyne (UK) and a Changjiang Scholar visiting professor at Chongqing University (China). He has worked in the field of Distributed Computer Systems for over twenty years and has industrial experience in building large-scale networked systems. Professor Xu now leads a collaborative research team at Leeds studying Grid and Internet technologies with a focus on complex system engineering, system security and dependability, and evolving system architectures. He is the recipient of the BCS/IEE Brendan Murphy Prize 2001 for the best work in the area of distributed systems and networks. He has led or co-led many key research projects and served as Program Chair or PC member of many international computer conferences. Professor Xu has published more than 150 edited books, book chapters and academic papers, and has been Editor of IEEE Distributed Systems since 2000.

19.
Efficient collaboration allows organizations and individuals to improve the efficiency and quality of their business activities. Delegation, as a significant approach, may occur in workflow collaborations, supply chain collaborations, or collaborative commerce. Role-based delegation models have been used as flexible and efficient access management for collaborative business environments. Delegation revocation can provide significant functionality for these models in business environments when the delegated roles or permissions need to be taken back. However, problems may arise in the revocation process when one user delegates user U a role and another user delegates U a negative authorization of the role. This paper aims to analyse various role-based delegation revocation features through examples. Revocations are categorized along four dimensions: Dependency, Resilience, Propagation and Dominance. According to these dimensions, sixteen types of revocation exist for specific requests in collaborative business environments: DependentWeakLocalDelete, DependentWeakLocalNegative, DependentWeakGlobalDelete, DependentWeakGlobalNegative, IndependentWeakLocalDelete, IndependentWeakLocalNegative, IndependentWeakGlobalDelete, IndependentWeakGlobalNegative, and so on. We present delegation revocation models, then discuss user delegation authorization and the impact of revocation operations. Finally, comparisons with other related work are discussed.
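The sixteen types follow mechanically as the Cartesian product of the four binary dimensions, which the sketch below enumerates. The value pairs are read off the type names listed in the abstract; pairing Weak with Strong is an inference, since only the Weak variants are spelled out, and the exact assignment of value pairs to the named dimensions is not stated in the abstract.

```python
from itertools import product

# One (value, opposite) pair per binary dimension, in the order the
# values appear within each type name from the abstract.
dimensions = [("Dependent", "Independent"),
              ("Weak", "Strong"),        # "Strong" inferred; only Weak variants are listed
              ("Local", "Global"),
              ("Delete", "Negative")]

revocations = ["".join(combo) for combo in product(*dimensions)]
print(len(revocations))   # 16
print(revocations[:2])    # ['DependentWeakLocalDelete', 'DependentWeakLocalNegative']
```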
Yanchun Zhang

20.
Facilitation of collaborative business processes across organizational and infrastructural boundaries continues to present challenges to enterprise software developers. One of the greatest difficulties in this respect is achieving a streamlined pipeline from business modeling to execution infrastructures. In this paper we present Evie - an approach for rapid design and deployment of event-driven collaborative processes based on significant language extensions to Java that are characterized by abstract and succinct constructs. The focus of this paper is to provide proof of concept of Evie's expressiveness using a recent benchmark known as the service interaction patterns. While the patterns encapsulate the breadth of required business process semantics, the Evie language delivers a rapid means of encoding them at an abstract level, and subsequently compiling and executing them to create a fully fledged Java-based execution environment.
Wasim Sadiq

Tony O'Hagan is a Senior Research Fellow in the School of Information Technology and Electrical Engineering at The University of Queensland, Brisbane, Australia. He is currently working in the eResearch group of the School of Information Technology and Electrical Engineering developing software tools to assist scientists in research data publication. His interests include Business Process Execution, Collaborative Business Processes, Scientific Processes, Service Oriented Architectures and Language Design, Messaging Middleware and Application Security. Tony has over 20 years of software development experience and has been awarded a Postgraduate Diploma of Information Technology and a B.Sc. degree majoring in Computing from the University of Queensland. Shazia Sadiq is a Senior Lecturer in the School of Information Technology and Electrical Engineering at The University of Queensland, Brisbane, Australia. She is part of the Data and Knowledge Engineering (DKE) research group and is involved in teaching and research in databases and information systems. Shazia holds a PhD from The University of Queensland in Information Systems and a Masters degree in Computer Science from the Asian Institute of Technology, Bangkok, Thailand. Her main research interests are innovative solutions for Business Information Systems that span several areas including business process management, governance, risk and compliance, data quality management, workflow systems, and service oriented computing. Wasim Sadiq is a Research Architect at SAP Research. He has over 22 years of research and development experience in the areas of enterprise applications, business process management, workflow technology, service-oriented architectures, database management systems, distributed systems, and e-learning. Wasim has a PhD in Computer Science from the University of Queensland, Australia, in the area of conceptual modeling and verification of workflows. He has led several research projects collaborating with academic and industry partners in Australia, Europe and the USA.

