Similar Documents
20 similar documents found (search time: 15 ms)
1.
Multihop wireless ad hoc and sensor networks open the door to great networking opportunities, especially in scenarios where deploying significant networking infrastructure is infeasible or expensive. However, the open communication medium and the lack of networking infrastructure make these networks vulnerable to a wide range of security attacks. A particularly devastating attack is the control traffic tunneling attack, in which a malicious node records control traffic at one location and tunnels it to a colluding node, possibly far away, which replays it locally. One incarnation of this attack is the wormhole attack, which can prevent route establishment by keeping nodes from discovering legitimate routes that are more than two hops away. Many researchers have addressed these attacks; however, most of the presented work is limited to static scenarios, requires expensive hardware, or suffers from high overhead and performance degradation. In this paper, we present a scalable countermeasure for the control traffic tunneling attack, called CTAC, which alleviates these drawbacks and efficiently mitigates the attack in both static and mobile networks. CTAC uses trusted nodes called cluster heads (CHs) for global tracking of node locations and profile keeping. Local monitoring is used to detect and isolate malicious nodes locally. Additionally, when sufficient suspicion builds up at a CH, it enforces a global isolation of the malicious node from the whole network. The performance gain, the relatively low overhead, and the positive impact of CTAC on data traffic fidelity are brought out through analysis and extensive simulation using ns-2. The results show that CTAC achieves a higher detection ratio and a faster isolation time, while considerably decreasing the energy overhead and the end-to-end delay, compared to state-of-the-art schemes.
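The escalation from local suspicion to network-wide isolation that the abstract describes can be sketched in a few lines. The threshold, class and method names below are illustrative assumptions for the demo, not details of the published CTAC protocol:

```python
# Hypothetical sketch of suspicion accumulation at a cluster head (CH).
# GLOBAL_ISOLATION_THRESHOLD and the report format are assumptions; the
# paper's actual protocol is not specified in the abstract.

GLOBAL_ISOLATION_THRESHOLD = 3  # assumed number of distinct local reports

class ClusterHead:
    def __init__(self):
        self.suspicion = {}    # node id -> set of distinct reporters
        self.isolated = set()  # nodes globally isolated from the network

    def report_suspicion(self, node_id, reporter_id):
        """A neighbour's local monitor reports suspected tunneling by node_id."""
        if node_id in self.isolated:
            return
        reporters = self.suspicion.setdefault(node_id, set())
        reporters.add(reporter_id)      # count distinct reporters, not raw reports
        if len(reporters) >= GLOBAL_ISOLATION_THRESHOLD:
            self.isolated.add(node_id)  # enforce network-wide isolation

ch = ClusterHead()
for reporter in ("n1", "n2", "n3"):
    ch.report_suspicion("mallory", reporter)
print("mallory" in ch.isolated)  # True once enough distinct reports arrive
```

Counting distinct reporters rather than raw reports is one simple way to keep a single malicious monitor from triggering global isolation on its own.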

2.
In mergers and acquisitions (M&A), a primary objective of the acquirer is to integrate the IT resources of the target with its own. IT M&A integration is assumed to create synergies, which in turn increase shareholder wealth by making the value of the merged firm greater than the sum of the standalone values of the two firms. In this study, we challenge this assumption and argue that IT M&A integration does not always lead to greater value creation. Prior research on IT M&A integrations indicates that IT resources are often not scale-free in M&A: that is, they do not transfer easily and costlessly from an acquirer to its target or vice versa. In fact, IT M&A integration can destroy value rather than create it when IT resources are not scale-free. We theorize about the contingencies under which IT M&A integration can create value for shareholders of acquirers. We test our hypotheses in a sample of 549 M&A transactions between 1998 and 2007. We find that, on average, capital markets react negatively to M&A announcements of acquirers whose IT capabilities are superior relative to those of the targets. The superiority of the acquirer's IT capabilities signals that the acquirer is likely to rip and replace the IT resources of the target. This IT M&A integration approach increases risks of disruption to the target's operations and revenue growth. Capital markets take such risks into account and reduce the stock price of the acquirer. One contingency that reduces the negative reactions of capital markets is the industry relatedness of the target. In a same-industry acquisition, an acquirer and its target have similar operating models, competitive dynamics, and regulatory context. Thus, ripping and replacing weaker IT resources of the target with superior IT resources of the acquirer creates expectations of more efficient operation, engenders positive stock price reactions, and increases shareholder wealth.
Another contingency that reduces the negative reactions of capital markets is the acquirer's track record in profitable growth. A profitably growing acquirer that has superior IT capabilities increases the confidence of capital markets that it can minimize the potential disruption risks of IT integration, continue its profitable growth pattern with a newly acquired target, engender positive stock price reactions, and create shareholder wealth. These findings indicate that IT M&A integration does not always lead to greater value creation in M&A. The study makes a contribution by identifying the contingencies under which IT M&A integration creates wealth for the acquirer's shareholders.

3.
Lee KS, Ergonomics, 2005, 48(5): 547-558
The objective of this paper is to describe how and why ergonomics should be promoted in total quality management (TQM). Ergonomics and TQM activities are compared. An approach is proposed to apply ergonomics in TQM using ergonomics circles. An eight-step approach is introduced for applying ergonomics using ergonomics circles, and a study that employed this approach in Korea is discussed. In applying this approach, all processes were first evaluated by workers. Processes that were identified as problematic were analysed by a company-wide committee to set priorities for improvement. An ergonomics improvement team consisting of safety and health personnel, process engineers and management innovation personnel then worked on the processes using a low-cost approach. It was found that applying ergonomics using ergonomics circles as quality circles in TQM was effective in improving workplaces and resulted in increased productivity, cost savings and improved safety.

4.
The paper first summarizes a general approach to the training of recurrent neural networks by gradient-based algorithms, which leads to the introduction of four families of training algorithms. Because of the variety of possibilities thus available to the "neural network designer," the choice of the appropriate algorithm to solve a given problem becomes critical. We show that, in the case of process modeling, this choice depends on how noise interferes with the process to be modeled; this is evidenced by three examples of modeling of dynamical processes, where the detrimental effect of inappropriate training algorithms on the prediction error made by the network is clearly demonstrated.
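The dependence on noise can be seen even in the simplest recurrent model. The toy sketch below (my own example, not one of the paper's case studies) fits a first-order linear model y_t = a*y_{t-1} + u_t to data corrupted by output noise: in system-identification terms, a teacher-forced ("equation-error") fit is biased toward zero, while a fit that feeds back the model's own predictions ("output-error") recovers the true coefficient. All constants are arbitrary choices for the demo:

```python
# Toy illustration of how the noise structure dictates the training scheme.
import random

random.seed(0)
A_TRUE, NOISE_SD, N = 0.8, 0.5, 2000

u = [random.gauss(0, 1) for _ in range(N)]
x = [0.0]
for t in range(1, N):
    x.append(A_TRUE * x[t - 1] + u[t])            # noise-free process state
y = [xt + random.gauss(0, NOISE_SD) for xt in x]  # noisy measurements

# Equation error: regress (y_t - u_t) on the *measured* y_{t-1}.
num = sum((y[t] - u[t]) * y[t - 1] for t in range(1, N))
den = sum(y[t - 1] ** 2 for t in range(1, N))
a_ee = num / den          # biased low, because y_{t-1} carries noise

# Output error: simulate the model with its own past output, grid-search a.
def sse(a):
    xh, total = 0.0, 0.0
    for t in range(1, N):
        xh = a * xh + u[t]                # feed back the model's own output
        total += (y[t] - xh) ** 2
    return total

a_oe = min((a / 100 for a in range(100)), key=sse)

print(f"equation-error a={a_ee:.3f}, output-error a={a_oe:.2f}, true a=0.8")
```

Conversely, when the noise enters the state equation itself, teacher forcing with measured outputs is the appropriate choice, which is the kind of trade-off the paper's examples demonstrate.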

5.
Siyeon Kim, Ergonomics, 2016, 59(4): 496-503
The aim of this study was to identify stable and valid measurement sites for skin temperature as a non-invasive variable for predicting deep-body temperature while wearing firefighters' personal protective equipment (PPE) during air temperature changes. Eight male firefighters participated in an experiment consisting of 60 min of exercise and 10 min of recovery while wearing PPE without a self-contained breathing apparatus (7.75 kg total PPE mass). Air temperature was fluctuated periodically between 29.5 and 35.5 °C (an amplitude of 6 °C). Rectal temperature was chosen as the deep-body temperature, and 12 skin temperatures were recorded. The results showed that the forehead and chest were the most valid sites for predicting rectal temperature (R2 = 0.826 and 0.824, respectively) in an environment with periodically fluctuating air temperatures. This study suggests that particular skin temperatures are valid as a non-invasive variable for predicting the rectal temperature of an individual wearing PPE in changing ambient temperatures.

Practitioner Summary: This study should assist in developing a more reliable indirect real-time indicator of individual heat strain for firefighters, which can be used practically as a precaution against firefighters' heat-related illness and utilised alongside physiological monitoring.
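The site ranking above rests on the coefficient of determination between each candidate skin temperature and rectal temperature. A minimal sketch of that computation, on fabricated numbers rather than the study's measurements:

```python
# R^2 of a one-predictor least-squares fit, as used to rank candidate
# skin sites. The temperature values below are fabricated for
# illustration only; they are not the study's data.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)   # R^2 of the least-squares line

forehead = [34.1, 34.6, 35.0, 35.4, 35.9, 36.3]  # hypothetical skin temps (°C)
rectal   = [37.0, 37.2, 37.3, 37.6, 37.8, 38.0]  # hypothetical core temps (°C)
print(f"R^2 = {r_squared(forehead, rectal):.3f}")
```

Computing this per site and per participant, then ranking sites by R2, reproduces the shape of the analysis the abstract reports.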


6.
Neural Computing and Applications - Artificial metaplasticity is a machine learning algorithm inspired by the biological metaplasticity of neural synapses. Metaplasticity stands for plasticity of...

7.
Chadwick D., Computer, 2000, 33(8): 107-109
Microsoft conceived Windows 2000 as the operating system for the Internet. This gave many people pause for thought, given Microsoft's less-than-sterling reputation regarding the cohabitation of competitors' software on its operating systems. The Internet is based on open standards and on interworking between different systems from different suppliers. If Windows 2000 compromises the Internet's integrity and ubiquity, two of its primary hallmarks, will it really be the best operating system on which to base your Internet services? Some of the new additions to Windows 2000 show that, although Microsoft pays lip service to the Internet's sacred tenets of openness and support for standards, it has actually (and sometimes only subtly) removed or subverted these tenets. The Windows 2000 changes appear subtly to exclude technologies from other vendors and to make interworking more difficult.

8.
At present, air traffic controllers (ATCOs) exercise strict control over routing authority for aircraft movement in airspace. The onset of a free flight environment, however, may well result in a dramatic change to airspace jurisdictions, with aircraft movements for the large part being governed by aircrew, not ATCOs. The present study examined the impact of such changes on spatial memory for recent and non-recent locations of aircraft represented on a visual display. The experiment contrasted present conditions, in which permission for manoeuvres is granted by ATCOs, with potential free flight conditions, in which aircrew undertake deviations without explicit approval from ATCOs. Results indicated that the ATCO role adopted by participants impacted differently on short-term and long-term spatial representations of aircraft manoeuvres. Although informing participants of impending deviations has beneficial effects on spatial representations in the short term, long-term representations of spatial events are affected deleteriously by the presentation of subsequent information pertaining to other aircraft. This study suggests strongly that recognition of the perceptual and cognitive consequences of changing to a free flight environment is crucial if air safety is not to be jeopardized.

9.
In the literature, students are sometimes assumed to feel empowered with respect to learning because of their familiarity with and access to ICT. However, interviews with 25 students from post-elementary schools found that the majority, although they use the Internet and other ICT for school purposes, believed that their generation is not as good at learning as the pre-ICT generation. Several students explained the situation in terms of the school's failure to build on their abilities. Nonetheless, the majority believed that the Internet over-simplifies schoolwork (perceived primarily as the traditional processing of textual sources), which in turn diminishes learning abilities. These results carry important implications for schools, given that low self-efficacy might make students less likely to apply themselves to learning.

10.
Using five medical datasets, we investigated the influence of missing values on true positive rates and classification accuracy. We randomly marked more and more values as missing and tested their effects on classification accuracy. The classifications were performed with nearest neighbour searching when none, 10, 20, 30% or more of the values were missing. We also used discriminant analysis and the naïve Bayes method for classification. We discovered that for a two-class dataset, despite as much as 20-30% missing values, results almost as good as with no missing values could still be produced. If there are more than two classes, over 10-20% missing values is probably too many, at least for small classes with relatively few cases. The more classes there are, and the more those classes differ in size, the more sensitive a classification task is to missing values. On the other hand, when values are missing not fully at random but according to actual distributions shaped by some selection or other non-random cause, classification can tolerate even high numbers of missing values for some datasets.
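A rough re-enactment of this setup, on synthetic rather than medical data, shows how such an experiment can be run. The masked-distance convention (comparing only features present in both vectors) is a common choice that the abstract does not spell out, so treat it as an assumption:

```python
# Randomly mask a growing share of feature values and watch 1-nearest-
# neighbour accuracy on a well-separated synthetic two-class problem.
import random

random.seed(1)

def make_data(n, mean):
    return [[random.gauss(m, 1.0) for m in mean] for _ in range(n)]

train = [(v, 0) for v in make_data(40, [0] * 4)] + \
        [(v, 1) for v in make_data(40, [5] * 4)]
test  = [(v, 0) for v in make_data(10, [0] * 4)] + \
        [(v, 1) for v in make_data(10, [5] * 4)]

def mask(vec, rate):
    return [None if random.random() < rate else v for v in vec]

def dist(a, b):
    pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    if not pairs:
        return float("inf")      # no shared features: treat as far away
    return sum((x - y) ** 2 for x, y in pairs) / len(pairs)  # normalized

def accuracy(missing_rate):
    tr = [(mask(v, missing_rate), c) for v, c in train]
    hits = 0
    for v, c in test:
        vm = mask(v, missing_rate)
        _, pred = min(tr, key=lambda t: dist(vm, t[0]))
        hits += (pred == c)
    return hits / len(test)

for rate in (0.0, 0.1, 0.2, 0.3):
    print(f"{int(rate * 100):>2}% missing -> accuracy {accuracy(rate):.2f}")
```

On well-separated two-class data the accuracy degrades slowly with the missing rate, matching the abstract's observation; with more and smaller classes the degradation sets in much earlier.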

11.
Meeting multiple Quality of Service (QoS) requirements is an important factor in the success of complex software systems. This paper presents an automated, model-based scheduler synthesis approach for scheduling application software tasks to meet multiple QoS requirements. As a first step, it shows how designers can meet deadlock-freedom and timeliness requirements, in a manner that (i) does not over-provision resources, (ii) does not require architectural changes to the system, and that (iii) leaves enough degrees of freedom to pursue further properties. A major benefit of our synthesis methodology is that it increases traceability, by linking each scheduling constraint with a specific pair of QoS property and underlying platform execution model, so as to facilitate the validation of the scheduling constraints and the understanding of the overall system behaviour, required to meet further QoS properties.

12.
Opportunistic networks, in which nodes opportunistically exploit any pair-wise contact to identify next hops towards the destination, are one of the most interesting technologies to support the pervasive networking vision. Opportunistic networks allow content sharing between mobile users without requiring any pre-existing Internet infrastructure, and tolerate partitions, long disconnections, and topology instability in general. In this paper we propose a context-aware framework for routing and forwarding in opportunistic networks. The framework is general and able to host various flavours of context-aware routing. We also present a particular protocol, HiBOp, which, by exploiting the framework, learns and represents the users' behaviour and their social relations through context information, and uses this knowledge to drive the forwarding process. Comparison of HiBOp with alternative solutions shows that a context-aware approach based on users' social relations turns out to be a very efficient solution for forwarding in opportunistic networks. We show performance improvements over the reference solutions both in terms of resource utilization and in terms of user-perceived QoS.
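The idea of driving forwarding with context can be sketched in a few lines. The attribute names and the overlap score below are illustrative assumptions, not HiBOp's actual delivery-predictability metric, which also exploits learned history:

```python
# Illustrative sketch: hand the message to the encountered node whose
# context attributes best match what is known about the destination.

def context_match(carrier_ctx, dest_ctx):
    """Fraction of the destination's known attributes the node shares."""
    shared = sum(1 for k, v in dest_ctx.items() if carrier_ctx.get(k) == v)
    return shared / len(dest_ctx)

dest = {"workplace": "CNR", "city": "Pisa", "team": "netlab"}
encounters = {
    "alice": {"workplace": "CNR", "city": "Pisa", "team": "seclab"},
    "bob":   {"workplace": "Univ", "city": "Rome", "team": "netlab"},
}
best = max(encounters, key=lambda n: context_match(encounters[n], dest))
print(f"forward the message to {best}")  # alice shares 2 of 3 attributes
```

The intuition is the one the abstract describes: nodes that share workplace, city or social ties with the destination are more likely to meet it, so they are better next-hop candidates than nodes chosen at random.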

13.
Ergonomics, 2012, 55(1-3): 239-246
Four methods of transforming research into practice are reviewed: (1) summaries by academics for governments; (2) reviews in professional journals for shiftworking populations; (3) guidelines for shiftworkers by shiftwork researchers; and (4) international laws and recommendations. Examples are cited from a selection of these that appear to be in part defective, unbalanced, misleading, biased, idiosyncratic, or ignorant. It is concluded that until more careful controlled evaluations are carried out, experts should be cautious in providing advice to shiftworkers that is not demonstrably better than the advice they get from experienced ‘grandmother’ shiftworkers.

14.
The emergence of the Internet as a global communication infrastructure has dramatically reduced interaction costs within and across organizations, with significant impact on inter-organizational relationships, vertical industry structures, and markets. More recently, service-oriented architectures (SOA) and Web services have introduced the next paradigm shift and fostered the idea of dynamic business networks with quick connect and disconnect relationships. However, little research has systematically analyzed how companies leverage SOA to improve their inter-organizational relationships and reshape their business networks. In addition, the mature research stream on inter-organizational information systems (IOS) has not yet sufficiently considered SOA. In order to close this gap, our research seeks to improve the fundamental understanding of how SOA is applied in business networks and how it differs from prior forms of IOS. Using an exploratory research approach, we investigate 33 SOA cases to identify focus areas and patterns of SOA adoption in business networks. Our case analysis builds on a multi-dimensional classification scheme which we derived from prior literature. While our empirical findings do not confirm all the promising propositions related to SOA, they underline the specific contribution of SOA compared to prior forms of IOS. We conclude by suggesting five clusters of SOA adoption in the inter-organizational domain, each of which introduces new aspects in the coordination of distributed business networks.

15.
16.
17.
18.
This research, funded by the British National Bibliography Research Fund, examined how publishers' websites are changing relations in the book industry, in order to better understand the Internet's impact on the publishing chain. The paper is set within the context of the development of electronic commerce, and of how business-to-consumer commerce is now being overtaken by activity in business-to-business trading. Publishers have followed the main business trends in using their websites to develop relationships directly with the consumer, but have been rather slower in developing their business-to-business activity through the Internet. This study investigated what changes were taking place as a result of current publisher activity on the Web and how these changes were affecting the traditional lines of communication in the book industry. A range of consumer publishers' websites was analysed to see what facilities they offered both to the general public and to business partners within the industry. Questionnaires and interviews were then conducted to establish how publishers, booksellers and wholesalers were using publishers' websites and whether these sites were beneficial to the industry as a whole. The facilities found on the websites were examined and analysed to determine their usefulness and how they might be developed to aid business-to-business commerce. Additionally, several issues relating to online sales, changes in sales patterns and changes in working methods were discussed. Although some quantitative data are included in the report, many of the issues raised relied on the perceptions and opinions of practitioners in the book trade.

19.
New technologies are transforming medicine, and this revolution starts with data: health records, clinical images, genome sequences, and data on prescribed therapies and their outcomes, data that each of us has helped to create. Although the first uses of artificial intelligence (AI) in medicine date back to the 1980s, only with the beginning of the new millennium has there been an explosion of interest in this sector worldwide. We are therefore witnessing exponential growth in health-related information, with the result that traditional analysis techniques are not suitable for the satisfactory management of this vast amount of data. AI applications, especially Deep Learning, are naturally suited to cope with this explosion of data, as they tend to work better as the amount of training data increases; training is precisely the phase needed to build the optimal neural network for a given clinical problem. This paper proposes a comprehensive and in-depth study of Deep Learning methodologies and applications in medicine. An in-depth analysis of the literature is presented; how, where and why Deep Learning models are applied in medicine is discussed and reviewed. Finally, current challenges and future research directions are outlined and analysed.

20.
An approximate approach to solving the nested analysis equations in topology optimization is proposed. The procedure consists of only one matrix factorization for the whole design process and a small number of iterative corrections for each design cycle. The approach is tested on 3D topology optimization problems. It is shown that the computational cost can be reduced by one order of magnitude without affecting the outcome of the optimization process.
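The scheme amounts to reusing one factorization of the initial stiffness matrix K0 as a preconditioner for systems with the modified matrix K. A tiny numerical sketch follows; the 2x2 matrices and their values are arbitrary, and a real implementation would keep a Cholesky or LU factorization of K0 rather than re-solving from scratch each time:

```python
# Solve K x = f by iterative corrections that reuse the "factorization"
# of the initial matrix K0 (here a direct 2x2 solve stands in for it).

K0 = [[4.0, 1.0], [1.0, 3.0]]
K  = [[4.2, 1.0], [1.0, 3.1]]   # K0 plus a design modification
f  = [1.0, 2.0]

def solve2(A, b):               # stands in for reusing the stored factorization
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def matvec(A, x):
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

x = solve2(K0, f)               # initial guess from the old factorization
for _ in range(5):              # a small number of iterative corrections
    residual = [fi - ki for fi, ki in zip(f, matvec(K, x))]
    dx = solve2(K0, residual)   # each correction reuses K0's factorization
    x = [xi + di for xi, di in zip(x, dx)]

residual = [fi - ki for fi, ki in zip(f, matvec(K, x))]
print(x, max(abs(r) for r in residual))
```

The corrections converge because the design modification keeps K close to K0; when the modification grows too large, more corrections (or a fresh factorization) are needed, which is the trade-off the paper exploits.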


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号