Similar Literature
20 similar documents retrieved.
1.
Universal Access in the Information Society - The present exploratory study describes senior citizens' attitudes toward biotechnologies, compared with those of a younger sample. Using...

2.
Outsourced computing has been gaining popularity in recent years. However, because malicious workers exist in the open outsourced environment, offering high-accuracy computing services is critical and challenging. A practical solution for this class of problems is to replicate outsourced tasks and compare the replicated task results, or to have the outsourcer verify task results herself. However, since most outsourced computing services are not free, the portion of tasks that can be replicated or verified is restricted by the outsourcer's budget. In this paper, we propose the Integrity Assurance Outsourced Computing (IAOC) system, which employs probabilistic task replication, probabilistic task verification, and credit management techniques to offer a high accuracy guarantee for generalized outsourced computing jobs. Building on the IAOC system, we perform a theoretical analysis and model the behavior of the system and the attacker as a two-player zero-sum game. We propose two algorithms, Interactive Gradient Descent (IGD) and Tiered Interactive Gradient Descent (TIGD), which find the optimal parameter settings under the user's accuracy requirement, without and with the user's budget requirement, respectively. We prove that the parameter settings generated by the IGD/TIGD algorithms form a Nash equilibrium and also imply an accuracy lower bound. Our experiments show that even in the most severe situation, where malicious workers dominate the outsourced computing environment, our algorithms find parameter settings satisfying the user's budget and accuracy requirements.
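As a rough, self-contained illustration of the replication/verification trade-off described above (a sketch under assumptions of our own, not the paper's IGD/TIGD algorithms), suppose each unverified result is wrong independently with probability q, verification by the outsourcer is always correct, and a replicated task yields a wrong result only if both copies are wrong; a grid search can then pick the cheapest replication/verification probabilities that meet an accuracy target within a budget. All parameter names and unit costs below are hypothetical.

def accuracy(p_ver, p_rep, q):
    # A task is verified with probability p_ver; otherwise it is replicated
    # with probability p_rep; otherwise the single result is accepted as-is.
    return p_ver + (1 - p_ver) * (p_rep * (1 - q * q) + (1 - p_rep) * (1 - q))

def expected_cost(p_ver, p_rep, c_ver=2.0, c_rep=1.0):
    # Extra cost per task on top of the base outsourcing fee (hypothetical unit costs).
    return p_ver * c_ver + (1 - p_ver) * p_rep * c_rep

def cheapest_setting(q, target_accuracy, budget, steps=101):
    # Exhaustive grid search; the paper uses gradient-descent-style algorithms instead.
    grid = [i / (steps - 1) for i in range(steps)]
    best = None
    for p_ver in grid:
        for p_rep in grid:
            if accuracy(p_ver, p_rep, q) < target_accuracy:
                continue
            cost = expected_cost(p_ver, p_rep)
            if cost <= budget and (best is None or cost < best[0]):
                best = (cost, p_ver, p_rep)
    return best  # None if the accuracy target is unreachable within the budget

print(cheapest_setting(q=0.1, target_accuracy=0.98, budget=1.0))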

3.
A two-stage model is described in which firms decide where to locate their facilities and how much to supply to which market. In such models in the literature, the market price typically reacts linearly to supply. Often two competing suppliers are assumed, or several homogeneous ones, i.e., suppliers with identical cost structures. The focus of this paper is on developing methods to compute equilibria of the model when more than two suppliers compete, each with its own cost structure, i.e., the suppliers are heterogeneous. Analytical results are presented with respect to optimality conditions for the Nash equilibria in the two stages. Based on these analytical results, an enumeration algorithm and a local search algorithm are developed to find equilibria. Numerical cases illustrate the results and the viability of the algorithms. The methods improve on a result reported in the literature.
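For the second-stage supply decisions, a minimal single-market sketch (ignoring the paper's location stage and market-specific costs) is the Cournot equilibrium under linear inverse demand P = a - bQ with firm-specific constant marginal costs c_i, where the closed-form quantities are q_i = (a + sum_j c_j - (n + 1) c_i) / ((n + 1) b) whenever all of them are nonnegative.

def cournot_equilibrium(a, b, costs):
    # Equilibrium quantities for n heterogeneous firms facing P = a - b * Q.
    # Valid only when every quantity comes out nonnegative; otherwise the
    # highest-cost firms exit and the formula must be reapplied without them.
    n = len(costs)
    total_cost = sum(costs)
    q = [(a + total_cost - (n + 1) * c_i) / ((n + 1) * b) for c_i in costs]
    assert all(q_i >= 0 for q_i in q), "corner solution: drop the highest-cost firm(s)"
    return q

# Three heterogeneous suppliers facing P = 100 - Q
quantities = cournot_equilibrium(a=100.0, b=1.0, costs=[10.0, 20.0, 30.0])
price = 100.0 - sum(quantities)
print(quantities, price)  # [30.0, 20.0, 10.0] and a market price of 40.0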

4.
5.
Computing power is increasingly becoming a basic utility that one can buy from a provider, much like electricity or water. This is the result of a long-running trend of connecting computing resources together to set up what can broadly be referred to as a remote computing platform, the most recent incarnation of which is the grid (Foster and Kesselman, 2003). These resources can then be shared among users, which means circulating code and the results of its execution over a network, which is highly insecure. At the other end of the spectrum of computing devices, smart cards ([Mayes and Markantonakis, 2008] and [Hendry, 2001]) offer extremely secure but extremely limited computing capabilities. The question is thus how to bridge the gap between computational power and high security. The aim of this paper is to show how large, high-capacity remote computing architectures can interact with smart cards, which are certainly the most widely deployed, yet the smallest, computing systems of the information technology era, so as to improve the overall security of a global infrastructure.

6.
7.
8.
Biology has rapidly become a data-rich, information-hungry science because of recent massive data generation technologies. Our biological colleagues are designing more clever and informative experiments because of recent advances in molecular science. These experiments and data hold the key to the deepest secrets of biology and medicine, but we cannot fully analyze this data due to the wealth and complexity of the information available. The result is a great need for intelligent systems in biology. There are many opportunities for intelligent systems to help produce knowledge in biology and medicine. Intelligent systems probably helped design the last drug your doctor prescribed, and they were probably involved in some aspect of the last medical care you received. Intelligent computational analysis of the human genome will drive medicine for at least the next half-century. Intelligent systems are working on gene expression data to help understand genetic regulation and ultimately the regulated control of all life processes including cancer, regeneration, and aging. Knowledge bases of metabolic pathways and other biological networks make inferences in systems biology that, for example, let a pharmaceutical program target a pathogen pathway that does not exist in humans, resulting in fewer side effects to patients. Modern intelligent analysis of biological sequences produces the most accurate picture of evolution ever achieved. Knowledge-based empirical approaches currently are the most successful method known for general protein structure prediction. Intelligent literature-access systems exploit a knowledge flow exceeding half a million biomedical articles per year. Machine learning systems exploit heterogeneous online databases whose exponential growth mimics Moore's law.

9.
Using computational biology, we have depicted the phylogenetics of insulin. We have analyzed the sequence alignments and sequence logos for both insulin chains A and B across three groups, namely mammals, other vertebrates, and fish. We have also analyzed cladograms of insulin for the mammalian group. Accordingly, path lengths, distance matrices, matching representations of cladogram nodes, and dissimilarities between nodes have been computed for both the A and B chains of the mammalian group. Our results show that 12 amino acid residues (GlyA1, IleA2, ValA3, TyrA19, CysA20, AsnA21, LeuB6, GlyB8, LeuB11, ValB12, GlyB23 and PheB24) are highly conserved across all groups, and some of them (GlyA1, IleA2, ValA3; TyrA19, CysA20, AsnA21) are contiguous. This study demonstrates a rapid method for analyzing amino acid sequences in terms of evolutionary conservation rates as well as molecular phylogenetics.
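The conservation analysis can be illustrated with a toy computation: given an already aligned set of sequences, the per-position conservation rate is the fraction of sequences sharing the consensus residue at that column. The fragments below are short placeholders, not the actual insulin chains analyzed in the study.

from collections import Counter

def conservation_rates(aligned_seqs):
    # Fraction of sequences sharing the most common residue at each column.
    # Expects equal-length sequences (a pre-computed alignment); gaps '-' are
    # counted like any other symbol.
    length = len(aligned_seqs[0])
    assert all(len(s) == length for s in aligned_seqs)
    rates = []
    for col in range(length):
        counts = Counter(s[col] for s in aligned_seqs)
        residue, n = counts.most_common(1)[0]
        rates.append((residue, n / len(aligned_seqs)))
    return rates

# Hypothetical mini-alignment for illustration only
seqs = ["GIVEQCCTSI", "GIVEQCCASV", "GIVEQCCHSI"]
for pos, (res, rate) in enumerate(conservation_rates(seqs), start=1):
    print(f"position {pos}: {res} conserved in {rate:.0%} of sequences")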

10.
Neural Computing and Applications - The “boost-diffusion” low-pressure nitriding process used to manufacture low-friction coatings for aircraft engines’ piston rings is a...

11.

Context

Agile information systems development (ISD) has received much attention from both the practitioner and researcher community over the last 10-15 years. However, it is still unclear what precisely constitutes agile ISD.

Objective

Based on four empirical studies conducted over a 10-year period from 1999 to 2008, the objective of this paper is to show how the meaning and practice of agile ISD have evolved over time and, on this basis, to speculate about what comes next.

Method

Four phases of research have been conducted, using a grounded theory approach. For each research phase, qualitative interviews were held in American and/or Danish companies, and a grounded theory was inductively discovered through careful data analysis. Subsequently, the four resulting theories were analyzed for common themes, and a global theory was identified across the empirical data.

Results

In 1999 companies were developing software at high speed in a desperate rush to be first to market. In 2001 a new high-speed/quick-results development process had become established practice. In 2003 changes in the market created the need for a more balanced view on speed and quality, and in 2008 companies were successfully combining agile and plan-driven approaches to achieve the benefits of both. The studies reveal a two-stage pattern in which dramatic changes in the market cause disruption of established practices and process adaptations, followed by consolidation of lessons learnt into a once again stable software development process.

Conclusion

The cyclical history of punctuated process evolution makes it possible to distinguish pre-agility from current practices (agility), and on this basis, to speculate about post-agility: a possible next cycle of software process evolution concerned with proactively pursuing the dual goal of agility and alignment through a diversity of means.

12.
Cloud computing has become an increasingly common computing infrastructure for contemporary firms. An important decision for firms to make in adopting a cloud computing model is whether to build it in-house (a private cloud) or outsource it (a public cloud). Prior literature has focused on the impact of firms’ characteristics but generated inconsistent results regarding the selection of cloud computing models. To add to this line of inquiry, we consider the relative resource structure, which reflects the importance of physical and knowledge resources for individual firms, and examine their respective effects on the selection of cloud computing deployment models (CCDMs). Using data from 520 companies deploying cloud computing in mainland China, we find that firms with higher physical capital intensity (PCI) tend to outsource cloud computing, whereas those with higher knowledge capital intensity (KCI) tend to use private clouds. Firms with higher codified knowledge capital intensity (CKCI) are found to be more susceptible to the negative relationship between KCI and public cloud selection than those with higher tacit knowledge capital intensity (TKCI). The direct positive influence of regional legal protection on a firm’s preferences for a public cloud is also confirmed, as well as its indirect moderating effect on alleviating the negative relationships between CKCI and deploying a public cloud.

13.
International Journal of Computer Mathematics, 2012, 89(9): 1915-1937
In this work, we introduce a numerical method to approximate the solutions of a multidimensional parabolic partial differential equation with nonlinear diffusion and reaction, subject to nonnegative initial data and homogeneous boundary conditions of the Neumann type. The equation considered is a model for both the growth of biological films and the propagation of mutant genes that are advantageous to a population. The initial-boundary-value problem under investigation is fully discretized in time and space following a finite-difference methodology, which results in a simple, linear, implicit scheme that is consistent with respect to the continuous problem. The method is a two-step technique that preserves the positivity and boundedness of initial profiles. We provide some simulations of the growth of microbial colonies, and comparisons against a standard approach.
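As a one-dimensional, one-step simplification of such a method (the paper's scheme is multidimensional and two-step), the sketch below advances the Fisher-type equation u_t = D u_xx + r u (1 - u) with homogeneous Neumann boundary conditions using a linearly implicit discretization: the reaction term is split as r u^n - r u^n u^{n+1}, so each step only requires solving a tridiagonal M-matrix system, and initial data lying in [0, 1] stays in [0, 1]. Step sizes and parameters are illustrative.

import numpy as np

def fisher_implicit_step(u, dt, dx, D=1.0, r=1.0):
    # One linearly implicit step for u_t = D u_xx + r u (1 - u) with
    # homogeneous Neumann boundary conditions (ghost nodes by reflection).
    # The system matrix is strictly diagonally dominant with nonpositive
    # off-diagonal entries, which preserves positivity and boundedness by 1.
    n = u.size
    lam = D * dt / dx**2
    A = np.zeros((n, n))
    rhs = u + dt * r * u
    for i in range(n):
        A[i, i] = 1.0 + 2.0 * lam + dt * r * u[i]
        if i > 0:
            A[i, i - 1] = -lam
        if i < n - 1:
            A[i, i + 1] = -lam
    A[0, 1] = -2.0 * lam      # reflected ghost node at the left boundary
    A[-1, -2] = -2.0 * lam    # reflected ghost node at the right boundary
    return np.linalg.solve(A, rhs)

# Example: a small initial colony spreading as a travelling front
x = np.linspace(0.0, 50.0, 501)
u = np.where(x < 5.0, 1.0, 0.0)
for _ in range(200):
    u = fisher_implicit_step(u, dt=0.1, dx=x[1] - x[0])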

14.
Abstract

Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, it is controversial what the pay-offs of pursuing this project are. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates this claim and argues that the main challenge this era faces is not a lack of biological realism. The challenge lies in identifying general neurocomputational principles for the design of artificial systems that could display the robust flexibility characteristic of biological intelligence.

15.
It is now just over 50 years since the deployment of LEO, the first business computer and application, in 1951. This paper attempts to look 50 years beyond the birth of LEO in order to discern the nature and effects of business computing in 2051. Scenarios are offered of some possible business applications fifty years hence, including business information systems in space and the nature of manufacturing. The scenarios serve as a basis for addressing a number of issues, including the availability of technology to support the scenarios presented, the nature of organizations shaped by future information systems, the nature of employment in the new organizational structure, consumer-vendor relations in the new economy, the effects of the new information technology on national governments, and the effect of information technologies on the structure of the global economy.

16.
17.
We present a detailed case study of the design, launch, and evaluation of a handheld mobile computing guide for visitors to the Smithsonian American Art Museum's Renwick Gallery. Of particular emphasis is the integration of evaluation methods and tools into the process of designing for new visitor experiences. Using a method of reflective design and evaluation incorporating interviews, surveys, observation, and clickstream analysis, we uncover assumptions and hypotheses for further testing. Finally, we discuss the cross-over between physical navigation of museum spaces and information navigation of online museum data.

18.
19.
A distributed implementation of the Spatially-Explicit Individual-Based Simulation Model of Florida Panther and White-Tailed Deer in the Everglades and Big Cypress Landscapes (SIMPDEL) is presented. SIMPDEL models the impact of different water management strategies in the South Florida region on the white-tailed deer and Florida panther populations. It models the interaction of four interrelated components – vegetation, hydrology, white-tailed deer, and Florida panther – over a time span of up to several decades. Very similar bioenergetic and survival statistics were obtained from the serial and distributed models. A performance evaluation of the two models revealed moderate speed improvements for the distributed model (referred to as DSIMPDEL). The 4-processor configuration attained a speedup of 3.83 with small deer populations on an ATM-based network of SUN Ultra 2 workstations, compared with the serial model executing on a single SUN Ultra 2 workstation.
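A quick back-of-the-envelope check on these figures (assuming Amdahl's law applies, which the abstract does not claim): a speedup of 3.83 on 4 processors corresponds to a parallel efficiency of about 96% and an implied serial fraction of roughly 1.5%.

def implied_serial_fraction(speedup, n_procs):
    # Solve Amdahl's law, speedup = 1 / (s + (1 - s) / n), for the serial fraction s.
    return (n_procs / speedup - 1) / (n_procs - 1)

print(implied_serial_fraction(3.83, 4))  # ~0.015
print(3.83 / 4)                          # parallel efficiency ~0.96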

20.
Transfer of learning in virtual environments: a new challenge?
The aim of all education is to apply what we learn in different contexts and to recognise and extend this learning to new situations. Virtual learning environments can be used to build skills. Recent research in cognitive psychology and education has shown that acquisitions are linked to the initial context. This provides a challenge for virtual reality in education or training. A brief overview of transfer issues highlights five main ideas: (1) the type of transfer enables the virtual environment (VE) to be classified according to what is learned; (2) the transfer process can create conditions within the VE to facilitate transfer of learning; (3) specific features of VR must match and comply with transfer of learning; (4) transfer can be used to assess a VE’s effectiveness; and (5) future research on transfer of learning must examine the singular context of learning. This paper discusses how new perspectives in cognitive psychology influence and promote transfer of learning through the use of VEs.
