Similar Documents
Found 20 similar documents (search time: 10 ms)
1.
Cryptography is one of the most active areas of research in computer science. It survives only where efforts to reduce computational complexity have failed, because the intractability of various problems is precisely what keeps unwanted intruders at bay. Predicting its future, however, is difficult. Researchers are constantly devising new cryptosystems that are often based on new, untested intractability assumptions, and for every cipher a cryptanalyst breaks, two more seem to sprout up in its place. Although revolutionary discoveries in algorithmics might render entire classes of cryptosystems obsolete overnight, the field will likely continue to survive on the strength of its breadth and diversity alone.

2.
3.
The article draws on a decade of work in the UK by the UK Work Organisation Network (UKWON) and recommends a systematic approach. Taking cases from the National Health Service, the focus is on employee involvement, partnership and the development of social capital. High-road and low-road approaches are compared in an evaluation of the Improving Working Lives programme.
Rosemary Exton (corresponding author) and Peter Totterdill.

4.
Neural Computing and Applications - This paper introduces the concept of softarison. Softarisons merge soft set theory with the theory of binary relations. Their purpose is the comparison of...
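The abstract is truncated, but its ingredients are standard: a soft set over a universe assigns to each parameter the subset of alternatives that satisfy it (Molodtsov's definition). Purely as an illustrative guess at how a comparison relation might be read off a soft set, here is a minimal Python sketch; the dominance rule, data and names are hypothetical and are not the paper's definition of a softarison.

```python
# Hypothetical sketch: deriving a binary comparison relation from a soft set.
# A soft set maps each parameter to the set of alternatives satisfying it.
soft_set = {
    "affordable": {"a", "b"},
    "reliable":   {"a", "c"},
    "portable":   {"a", "b", "c"},
}

def params_of(x, F):
    """Parameters of the soft set F that alternative x satisfies."""
    return {e for e, approved in F.items() if x in approved}

def dominates(x, y, F):
    """Illustrative rule (not the paper's): x covers every parameter y satisfies."""
    return params_of(y, F) <= params_of(x, F)

universe = {"a", "b", "c"}
relation = {(x, y) for x in universe for y in universe if dominates(x, y, F=soft_set)}
print(sorted(relation))  # 'a' dominates 'b' and 'c'; 'b' and 'c' are incomparable
```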

5.
The enormous popularity of the Internet has made an instant star of the Java programming language. Java's portability, reusability, security and clean design have made it the language of choice for Web-based applications and a popular alternative to C++ for object-oriented programming. Unfortunately, the performance of the standard Java implementation, even with just-in-time compilation technology, lags far behind that of the most popular languages today. The need for an aggressive optimizing compiler for Java is clear. Building on preliminary experience with the JavaSoft bytecode optimizer, this paper explores some of the issues that arise in building efficient implementations of Java. The Java language design presents a number of interesting problems that make classical optimization strategies much harder to implement. On the other hand, Java offers the opportunity for some new optimizations unique to this language. © 1997 John Wiley & Sons, Ltd.

6.
As software engineering (and other) standards are developed over a period of years or decades, the suite of standards thus developed often begins to lose whatever cohesion it originally possessed. This has led to discussions in the standards communities of possible collaborative development, interoperability and harmonization of their existing standards. Here, I assess how such harmonization efforts may be aided by recent research results to create better-quality standards to replace the status quo.

7.
Methods, means, and tools of compositional programming are considered, along with the fundamentals of composing multilanguage objects in fourth-generation languages in the OS/ES environment. New approaches to the formal declaration and standardization of data types in modern languages are shown, together with practical aspects of systematizing ready-made objects with a view to reusing them when composing large systems in modern environments. New ideas and approaches to supporting interaction between multilanguage objects in the environment of a family of application systems are described.

8.
9.
Many hot objects that may be touched or handled every day can cause discomfort, pain or burning of the skin. The precise effect depends on the Contact Temperature tC, an intermediate value between the hot object's temperature and the skin temperature. The value of tC varies with the material and is governed by the Contact Coefficient b, a material property whose value ranges widely from metals to plastics. In experiments with 48 female subjects, surface and contact temperatures for three materials were measured over a wide range, and subject reactions were recorded on a five-point comfort scale. From the heat conduction theory outlined, and using the calculated values of b for the three materials, predicted safe surface temperatures were determined. These predicted values were then compared with the observed temperatures and with those recommended in British Standards.
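The conduction theory the abstract refers to is classical: when two semi-infinite bodies at temperatures t1 and t2 are brought into contact, the interface settles at tC = (b1·t1 + b2·t2)/(b1 + b2), where b = √(kρc) is the Contact Coefficient (thermal effusivity). A minimal sketch using rough textbook property values, not the study's measured ones:

```python
import math

def contact_coefficient(k, rho, c):
    """b = sqrt(conductivity * density * specific heat), in W*s^0.5/(m^2*K)."""
    return math.sqrt(k * rho * c)

def contact_temperature(t_skin, t_surface, b_skin, b_surface):
    """Interface temperature on contact: effusivity-weighted mean."""
    return (b_skin * t_skin + b_surface * t_surface) / (b_skin + b_surface)

# Rough illustrative properties (k in W/m*K, rho in kg/m^3, c in J/kg*K).
b_skin  = contact_coefficient(k=0.37, rho=1000.0, c=3680.0)
b_steel = contact_coefficient(k=50.0, rho=7800.0, c=460.0)
b_pine  = contact_coefficient(k=0.14, rho=500.0,  c=1200.0)

for name, b in [("steel", b_steel), ("pine", b_pine)]:
    tC = contact_temperature(t_skin=33.0, t_surface=80.0, b_skin=b_skin, b_surface=b)
    print(f"{name}: tC ~ {tC:.1f} C")
```

The wide spread between the two answers (roughly 76 C for steel against 42 C for pine at the same 80 C surface temperature) is exactly why safe surface temperatures must be specified per material.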

10.
The theory and practice of Bayesian image labeling
Image analysis that produces an image-like array of symbolic or numerical elements (such as edge finding or depth map reconstruction) can be formulated as a labeling problem in which each element is to be assigned a label from a discrete or continuous label set. This formulation lends itself to algorithms, based on Bayesian probability theory, that support the combination of disparate sources of information, including prior knowledge.

In the approach described here, local visual observations for each entity to be labeled (e.g., edge sites, pixels, elements in a depth array) yield label likelihoods. Likelihoods from several sources are combined consistently in abstraction-hierarchical label structures using a new, computationally simple procedure. The resulting label likelihoods are combined with a priori spatial knowledge encoded in a Markov random field (MRF). From the prior probabilities and the evidence-dependent combined likelihoods, the a posteriori distribution of the labelings is derived using Bayes' theorem.

A new inference method, Highest Confidence First (HCF) estimation, is used to infer a unique labeling from the a posteriori distribution that is consistent with both prior knowledge and evidence. HCF compares favorably to previous techniques, all equivalent to some form of energy minimization or optimization, in finding a good MRF labeling. HCF is computationally efficient and predictable and produces better answers (lower energies) while exhibiting graceful degradation under noise and least commitment under inaccurate models. The technique generalizes to higher-level vision problems and other domains, and is demonstrated on problems of intensity edge detection and surface depth reconstruction.
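To make the HCF idea concrete, here is a heavily simplified sketch on a one-dimensional chain of sites with a Potts smoothness prior: sites start uncommitted, and at each step the uncommitted site whose best label is most confidently preferred (largest margin between its two cheapest labels, given already-committed neighbours) is committed. This condenses the published algorithm; the original's confidence measure and null-label bookkeeping differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_labels, beta = 12, 3, 1.0

# Negative log-likelihood of each label at each site (stand-in for the
# label likelihoods derived from local visual observations).
unary = rng.random((n_sites, n_labels))

def local_energy(site, label, labels):
    """Unary cost plus a Potts penalty against committed neighbours."""
    e = unary[site, label]
    for nb in (site - 1, site + 1):
        if 0 <= nb < n_sites and labels[nb] is not None:
            e += beta * (label != labels[nb])
    return e

labels = [None] * n_sites          # None marks an uncommitted site
while any(l is None for l in labels):
    best = None                    # (confidence, site, label)
    for s in (i for i, l in enumerate(labels) if l is None):
        energies = [local_energy(s, k, labels) for k in range(n_labels)]
        order = np.argsort(energies)
        conf = energies[order[1]] - energies[order[0]]   # margin = confidence
        if best is None or conf > best[0]:
            best = (conf, s, int(order[0]))
    labels[best[1]] = best[2]      # commit the highest-confidence site

print(labels)
```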

11.
Direct standardization of rates is a classical method in epidemiology by which the relationship between confounders and the variables of analysis is balanced between the samples or populations to be compared. While the method is appropriate when single variables of aggregated data sets are to be compared, it is limited in its ability to handle many variables of different measurement scales, especially in primary data situations. The method of direct standardization is therefore adapted here to the analysis of non-aggregated data. Computing not only rates but also other statistics of central tendency and dispersion, as well as confidence intervals, thus becomes feasible. In addition, it is possible to re-adjust the primary data weights in order to perform stratified analyses. A procedure to do so, based solely on the available original weights without the need to analyze the standard population, is proposed along with an extensive SAS macro. The utility allows for the rapid and standardized analysis and tabulation of large multi-variable data sets. Both the statistical and technical properties of the macro are discussed, and its usage is explained and exemplified.
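For reference, the classical aggregated-data computation that the paper generalizes is short: stratum-specific rates from the study population are weighted by the standard population's stratum shares. A minimal sketch with invented figures (the SAS macro's interface is not reproduced here):

```python
# (stratum, events, person-years) observed in the study population.
study = {
    "age<40":   (10, 5000.0),
    "age40-64": (40, 4000.0),
    "age65+":   (90, 1000.0),
}
# Standard population sizes per stratum; these define the weights.
standard = {"age<40": 60000, "age40-64": 30000, "age65+": 10000}

total_std = sum(standard.values())
dsr = sum(
    (events / pyears) * (standard[stratum] / total_std)
    for stratum, (events, pyears) in study.items()
)
print(f"Directly standardized rate: {dsr * 1000:.2f} per 1000 person-years")
```

The adaptation to non-aggregated data described in the abstract can be pictured as pushing these stratum weights down onto individual records, so that any statistic of central tendency or dispersion, not just a rate, can be computed from the re-weighted observations.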

12.
13.
14.
That the influence of the PRAM model is ubiquitous in parallel algorithm design is as clear as the fact that it is technologically infeasible for the foreseeable future. The current generation of parallel hardware prominently features distributed memory and high-performance interconnection networks – very much the antithesis of the shared memory required for the PRAM model. It has been shown that, in spite of communication costs, very fast parallel algorithms are available on distributed-memory machines for some problems – from embarrassingly parallel problems to sorting and numerical analysis. In contrast, it is known that for other classes of problem, PRAM-style shared-memory simulation on a distributed-memory machine can, in theory, produce solutions of comparable performance to the best possible for such architectures. The Bulk Synchronous Parallel (BSP) model accurately represents most parallel machines – theoretical and actual – in an execution and cost model. We introduce a scalable, portable PRAM realization appropriate for BSP computers and a methodology for its usage. Our system is fast and built upon the familiar sequential C++ coupled with the new standard BSP library of parallel computation and communication primitives. It is portable to and predictable on a vast number of parallel computers, including workstation clusters, a 256-processor Cray T3D, an 8-node IBM SP/2 and a 4-node shared-memory SGI Power Challenge machine. Our approach achieves simplicity of programming over direct-mode BSP programming for reasonable overhead cost. We objectively compare optimized BSP and PRAM algorithms implemented with our C++ PRAM library and provide encouraging experimental results for our new style of programming. Copyright © 2000 John Wiley & Sons, Ltd.
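The BSP cost model the abstract leans on is compact enough to state directly: a superstep costs w + g·h + l, where w is the maximum local work on any processor, h the largest number of words any processor sends or receives, g the per-word communication cost, and l the barrier synchronisation latency; a program costs the sum over its supersteps. A minimal sketch with placeholder machine parameters (not measured values for the T3D or SP/2):

```python
def superstep_cost(work_per_proc, h_relation, g, l):
    """BSP cost of one superstep: max local work + g*h + barrier latency l."""
    return max(work_per_proc) + g * h_relation + l

def program_cost(supersteps, g, l):
    """Total BSP cost: sum of superstep costs."""
    return sum(superstep_cost(w, h, g, l) for w, h in supersteps)

# Two supersteps on 4 processors: (local work per processor, h-relation).
steps = [([1000, 1200, 900, 1100], 50), ([800, 800, 850, 790], 20)]
print(program_cost(steps, g=4.0, l=100.0))  # (1200+200+100) + (850+80+100) = 2530.0
```

Predictability on real machines, as claimed in the abstract, comes from benchmarking g and l once per machine and then plugging them into this formula.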

15.
Drawing graphs by eigenvectors: theory and practice
The spectral approach for graph visualization computes the layout of a graph using certain eigenvectors of related matrices. Two important advantages of this approach are an ability to compute optimal layouts (according to specific requirements) and a very rapid computation time. In this paper, we explore spectral visualization techniques and study their properties from different points of view. We also suggest a novel algorithm for calculating spectral layouts resulting in an extremely fast computation by optimizing the layout within a small vector space.
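The baseline computation is easy to reproduce in its textbook form: take the eigenvectors of the graph Laplacian associated with the smallest non-zero eigenvalues as vertex coordinates. The sketch below shows that baseline only; the fast subspace-restricted optimization proposed in the paper is not reproduced.

```python
import numpy as np

def spectral_layout(adj):
    """2-D layout from the two lowest non-trivial Laplacian eigenvectors."""
    laplacian = np.diag(adj.sum(axis=1)) - adj
    eigvals, eigvecs = np.linalg.eigh(laplacian)  # eigenvalues in ascending order
    # Column 0 is the constant eigenvector (eigenvalue 0); skip it.
    return eigvecs[:, 1:3]

# A 6-cycle: the spectral layout recovers its circular structure.
n = 6
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0

for v, (x, y) in enumerate(spectral_layout(adj)):
    print(f"vertex {v}: ({x:+.3f}, {y:+.3f})")
```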

16.
Most accounts of computer-based innovation in organizational settings assume a naive picture of organizational change, overlooking events, features, and behaviors that, though unexpected and puzzling, may be the sources of inventions, new knowledge, new organizational routines and arrangements. The ambivalent, untidy, and often unpredictable character of IT-based innovation and change is hardly captured, even by more recent theoretical approaches that have nevertheless provided a deeper understanding of the complex interaction between technology and organizations. Based on field observations of the failures and successes during a major systems development effort in a large European computer manufacturer, we tell a different story: We submit that failures at innovation, surprises, and a whole range of related phenomena can be accounted for by introducing the notion of formative context, that is, the set of institutional arrangements and cognitive imageries that inform the actors' practical and reasoning routines in organizations. Limited capability to inquire into formative contexts is responsible for the actors' limited learning, irrespective of their strategies, interests, espoused theories, and methods. Still, we suggest, plenty of opportunities for innovation lie in the open, pasted-up nature of formative contexts and a new vision of design based on "context-making" interventions can bring them to light.

17.
18.
Abstract. Electronic government, or e-government, increases the convenience and accessibility of government services and information to citizens. Despite the benefits of e-government – increased government accountability to citizens, greater public access to information and a more efficient, cost-effective government – the success and acceptance of e-government initiatives, such as online voting and licence renewal, are contingent upon citizens' willingness to adopt this innovation. In order to develop 'citizen-centred' e-government services that provide participants with accessible, relevant information and quality services that are more expedient than traditional 'brick and mortar' transactions, government agencies must first understand the factors that influence citizen adoption of this innovation. This study integrates constructs from the Technology Acceptance Model, Diffusion of Innovations theory and web trust models to form a parsimonious yet comprehensive model of factors that influence citizen adoption of e-government initiatives. The study was conducted by surveying a broad diversity of citizens at a community event. The findings indicate that perceived ease of use, compatibility and trustworthiness are significant predictors of citizens' intention to use an e-government service. Implications of this study for research and practice are presented.

19.
This article argues that a relational view of innovation opens up new perspectives for examining and explaining how novelty develops in creative industries. Although many researchers have devoted attention to this topic, a theoretically grounded concept of relational innovation remains undeveloped in the literature. To address this issue, I offer a framework informed by Gabriel Tarde's relational sociology, re-interpreting that sociology in light of practice theory. Applying this framework in an empirical study of haute cuisine, I identify three processes of innovating at varying degrees of novelty (repeating, adapting, and differentiating). By relating those processes in the form of practices-nets, I show that innovating is not a linear development process; rather, a culinary innovation emerges in between relations of everyday practices that define and transform its value. I hope, in this way, to contribute to a more complex and subtle understanding of culinary innovation as relational.

20.