Most image watermarking (IW) schemes exhibit low robustness because they hide information in the least significant bits (LSBs) of the cover image, while schemes that embed in the most significant bits (MSBs) suffer from low imperceptibility, since the resulting distortion of the cover image is revealed to the attacker. In this paper, a hybrid image watermarking scheme is proposed that integrates Robust Principal Component Analysis (R-PCA), the Discrete Tchebichef Transform (DTT), and Singular Value Decomposition (SVD). A grayscale watermark image is scrambled using a 2D Discrete Hyper-chaotic Encryption System (2D-DHCES) to strengthen robustness and security. The cover image is decomposed into low-rank and sparse components using R-PCA, the low-rank component is further decomposed using the DTT, and the watermark is embedded through SVD processing. In the DTT, a small number of coefficients carry most of the energy and provide an optimally sparse representation of the salient image edges and features, which supports efficient retrieval of the watermark image even after severe distortion-based channel attacks. The imperceptibility and robustness of the proposed method are verified against a variety of signal processing channel attacks (salt-and-pepper noise, multi-directional shearing, cropping, frequency filtering, etc.). The visual and quantitative results show that the proposed scheme is highly effective and exhibits strong tolerance to several image processing and geometric attacks.
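As a rough illustration of the SVD embedding stage described above, a minimal sketch follows; the R-PCA and DTT decompositions of the full pipeline are omitted, and the embedding strength `alpha` is an illustrative choice, not a value from the paper:

```python
import numpy as np

def embed(cover, wm, alpha=0.05):
    """Embed watermark `wm` into the singular values of a square `cover` block."""
    U, S, Vt = np.linalg.svd(cover)
    D = np.diag(S) + alpha * wm          # perturb the singular-value matrix
    Uw, Sw, Vwt = np.linalg.svd(D)
    marked = U @ np.diag(Sw) @ Vt        # watermarked image block
    return marked, (Uw, Vwt, S)          # side information needed for extraction

def extract(marked, key, alpha=0.05):
    """Recover the watermark from the marked block using the side information."""
    Uw, Vwt, S = key
    Sm = np.linalg.svd(marked, compute_uv=False)
    D = Uw @ np.diag(Sm) @ Vwt           # rebuild the perturbed matrix
    return (D - np.diag(S)) / alpha      # invert the embedding
```

Because the singular values of the marked block equal `Sw` exactly, extraction is lossless in the absence of attacks; robustness comes from the fact that channel distortions perturb singular values only mildly.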
Deposition of diamond films onto various substrates can result in significant technological advantages in terms of functionality
and improved life and performance of components. Diamond is hard, wear resistant, chemically inert, and biocompatible. It
is considered to be the ideal material for surfaces of cutting tools and biomedical components. However, it is well known
that diamond deposition onto technologically important substrates, such as co-cemented carbides and steels, is problematic
due to carbon interaction with the substrate, low nucleation densities, and poor adhesion. Several papers previously published
in the relevant literature have reported the application of interlayer materials such as metal nitrides and carbides to provide
bonding between diamond and hostile substrates. In this study, the chemical vapor deposition (CVD) of polycrystalline diamond
on TiN/SiNx nanocomposite (nc) interlayers deposited at relatively low temperatures has been investigated for the first time. The nc layers were
deposited at 70 or 400 °C on Si substrates using a dual ion beam deposition system. The results showed that a preliminary
seeding pretreatment with diamond suspension was necessary to achieve high diamond nucleation densities and that the diamond
nucleation density was higher on nc films than on bare single-crystal Si (sc-Si) subjected to the same pretreatment and CVD process parameters. TiN/SiNx layers synthesized at 70 or 400 °C underwent different nanostructure modifications during diamond CVD. The data also showed
that TiN/SiNx films obtained at 400 °C are preferable for use as interlayers between hostile substrates and CVD diamond.
This paper was presented at the fourth International Surface Engineering Congress and Exposition held August 1–3, 2005 in
St. Paul, MN.
Many recent software engineering papers have examined duplicate issue reports. Thus far, duplicate reports have been considered a hindrance to developers and a drain on their resources. As a result, prior research in this area focuses on proposing automated approaches to accurately identify duplicate reports. However, no studies have attempted to quantify the actual effort that is spent on identifying duplicate issue reports. In this paper, we empirically examine the effort needed to manually identify duplicate reports in four open source projects, i.e., Firefox, SeaMonkey, Bugzilla, and Eclipse-Platform. Our results show that: (i) more than 50% of the duplicate reports are identified within half a day, and most duplicate reports are identified without any discussion and with the involvement of very few people; (ii) a classification model built using a set of factors extracted from duplicate issue reports classifies duplicates according to the effort needed to identify them with a precision of 0.60 to 0.77, a recall of 0.23 to 0.96, and an ROC area of 0.68 to 0.80; and (iii) factors that capture the developer awareness of the duplicate issue's peers (i.e., other duplicates of that issue) and the textual similarity of a new report to prior reports are the most influential factors in our models. Our findings highlight the need for effort-aware evaluation of approaches that identify duplicate issue reports, since identifying a considerable share of duplicate reports (over 50%) appears to be a relatively trivial task for developers. To better assist developers, research on identifying duplicate issue reports should put greater emphasis on assisting developers in identifying effort-consuming duplicate issues.
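The precision and recall figures quoted above are the standard binary-classification metrics; as a reminder of how they are computed from predictions, a minimal sketch (the paper's factor extraction and classifier are not reproduced, and the 0/1 encoding of "effort-consuming" vs. "trivial" duplicates is illustrative):

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels (1 = effort-consuming duplicate,
    0 = trivial duplicate; this encoding is an illustrative assumption)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```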
Reuse of software components, either closed or open source, is considered one of the most important best practices in software engineering, since it reduces development cost and improves software quality. However, since reused components are (by definition) generic, they need to be customized and integrated into a specific system before they can be useful. Because this integration is system-specific, the integration effort is non-negligible and increases maintenance costs, especially if more than one component needs to be integrated. This paper presents an empirical study of multi-component integration in the context of three successful open source distributions (Debian, Ubuntu, and FreeBSD). Such distributions integrate thousands of open source components with an operating system kernel to deliver a coherent software product to millions of users worldwide. We empirically identified seven major integration activities performed by the maintainers of these distributions, documented how these activities are performed, and then evaluated and refined the identified activities with input from six maintainers of the three studied distributions. The documented activities provide a common vocabulary for component integration in open source distributions and outline a roadmap for future research on software integration.
In this study, we aimed to discriminate between two groups of people. The database used contains 20 patients with Parkinson's disease (PD) and 20 healthy people. Three types of sustained vowels (/a/, /o/, and /u/) were recorded from each participant, and the analyses were performed on these voice samples. First, an initial feature vector was extracted from the time, frequency, and cepstral domains. We then applied linear and nonlinear feature extraction techniques, namely principal component analysis (PCA) and nonlinear PCA. These techniques reduce the number of parameters and select the most effective acoustic features for classification. A support vector machine (SVM) with different kernels was used for classification. We obtained an accuracy of up to 87.50% for discrimination between PD patients and healthy people.
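The linear PCA reduction step can be sketched as follows; this is a minimal NumPy sketch of linear PCA only, with the nonlinear PCA variant and the SVM classifier omitted, and the input is synthetic rather than the study's acoustic features:

```python
import numpy as np

def pca_reduce(X, k):
    """Project rows of X (one feature vector per voice sample) onto the
    top-k principal components via SVD of the centered data matrix."""
    Xc = X - X.mean(axis=0)                          # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                             # scores in the reduced space
```

The reduced scores would then be fed to the classifier; by construction, the first returned column carries the largest variance.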
Standard genetic algorithms (SGAs) are investigated to optimise discrete-time proportional-integral-derivative (PID) controller parameters, by three tuning approaches, for a multivariable glass furnace process with loop interaction. Initially, SGAs are used to identify control-oriented models of the plant, which are subsequently used for controller optimisation. An individual tuning approach without loop interaction is considered first, to categorise the genetic operators and cost functions and to improve the search boundaries so as to attain the desired performance criteria. The second tuning approach considers controller parameter optimisation with loop interaction and individual cost functions. The third tuning approach utilises a modified cost function that includes the total effect of both controlled variables, glass temperature and excess oxygen. This modified cost function is shown to exhibit improved control robustness and disturbance rejection under loop interaction.
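A minimal real-coded genetic algorithm of the kind used for such PID tuning can be sketched as follows; the first-order plant, the ISE-style cost, and all GA hyperparameters here are illustrative stand-ins, not the paper's glass furnace model or settings:

```python
import random

def ise(gains, steps=100):
    """Sum-of-squared-error cost for a discrete PID tracking a unit step on a
    simple first-order plant (an illustrative stand-in for the furnace model)."""
    kp, ki, kd = gains
    y = e_prev = integ = cost = 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e
        u = kp * e + ki * integ + kd * (e - e_prev)
        e_prev = e
        y = 0.9 * y + 0.1 * u            # plant: y[k+1] = 0.9 y[k] + 0.1 u[k]
        cost += e * e
    return cost

def ga_minimize(cost, bounds, pop=30, gens=60, pm=0.2, seed=1):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and two-member elitism; hyperparameters are illustrative."""
    rng = random.Random(seed)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        nxt = sorted(P, key=cost)[:2]                 # carry the two best over
        while len(nxt) < pop:
            a = min(rng.sample(P, 3), key=cost)       # tournament of three
            b = min(rng.sample(P, 3), key=cost)
            child = [x + rng.random() * (y - x) for x, y in zip(a, b)]
            if rng.random() < pm:                     # mutate one gene, clipped
                i = rng.randrange(len(bounds))
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
            nxt.append(child)
        P = nxt
    return min(P, key=cost)
```

For example, `ga_minimize(ise, [(0.0, 10.0), (0.0, 1.0), (0.0, 1.0)])` searches the boxed gain space for `(kp, ki, kd)`, and the returned gains should beat a naive proportional-only controller on the same cost.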
The Internet of Things (IoT) connects billions of devices in an Internet-like structure. Each device is encapsulated as a real-world service that provides functionality and exchanges information with other devices. This large-scale information exchange results in new interactions between things and people. Unlike traditional web services, the internet of services is highly dynamic and continuously changing, since devices constantly degrade, vanish, and possibly reappear; this poses a new challenge for the process of resource discovery and selection. As the number of services in the discovery and selection process grows, there is a corresponding increase in the number of service consumers and, consequently, in the diversity of the quality of service (QoS) available. Growth on both sides leads to diversity in the demand and supply of services, which results in partial matches between requirements and offers. This paper proposes an IoT service ranking and selection algorithm that considers multiple QoS requirements and allows partially matched services to be counted as candidates for the selection process. One application of IoT sensory data that attracts many researchers is transportation, especially emergency and accident services, which is used as a case study in this paper. Experimental results on real-world services show that the proposed method achieves a significant improvement in accuracy and performance in the selection process.
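The idea of keeping partially matched services in the candidate list can be sketched as follows; the scoring rule (fraction of satisfied requirements) and the `("max", bound)` / `("min", bound)` requirement encoding are hypothetical stand-ins for the paper's actual algorithm:

```python
def rank_services(services, reqs):
    """Rank candidate IoT services by the fraction of QoS requirements they
    satisfy, so partially matched services remain candidates rather than
    being discarded outright."""
    def score(svc):
        qos = svc["qos"]
        met = sum(1 for name, (kind, bound) in reqs.items()
                  if name in qos and
                  (qos[name] <= bound if kind == "max" else qos[name] >= bound))
        return met / len(reqs)              # 1.0 = full match, 0.0 = no match
    return sorted(services, key=score, reverse=True)
```

A fully matched service ranks first, but a service meeting only some requirements (e.g. availability but not latency) still appears ahead of one meeting none.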
This article draws on three case studies of drip irrigation adoption in Morocco to consider the water–energy–food (WEF) nexus concept from a bottom-up perspective. Findings indicate that small farmers' adoption of drip irrigation is conditional, that water and energy efficiency does not necessarily reduce overall consumption, and that adoption of drip irrigation (and policies supporting it) can create winners and losers. The article concludes that, although the WEF nexus concept may offer useful insights, its use in policy formulation should be tempered with caution. Technical options that appear beneficial at the conceptual level can have unintended consequences in practice, and policies focused on issues of scarcity and efficiency may exacerbate other dimensions of poverty and inequality.