The Journal of Supercomputing - Power consumption is likely to remain a significant concern for exascale performance in the foreseeable future. In addition, graphics processing units (GPUs) have...
The Key Expansion Function is a vital component of any block cipher. Many Key Expansion Functions generate subkeys through algorithms based on Feistel or Substitution-Permutation Network (SPN) structures, against which cryptanalytic methods have been well researched. In this paper, an efficient method for generating subkeys based on chaotic maps is proposed. The principle behind the proposed Key Expansion Function is the mixing property of the Tent Map. Using chaotic binary sequences, the proposed Key Expansion Function satisfies the statistical and cryptographic properties expected of chaotic generators. A new Bit Extraction Technique based on the IEEE-754 Floating-point Standard (binary32) is used to extract the bits of subkeys from the chaotic binary sequences. The generated subkeys are then analyzed. The results show that the proposed chaos-based Key Expansion Function is secure and cryptographically strong in all respects.
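As a rough illustration of the idea, the sketch below iterates a skewed tent map and harvests subkey bits from the IEEE-754 binary32 encoding of each iterate. The map parameter, the number of bits taken per iterate, and the subkey sizes are illustrative assumptions; the paper's exact Bit Extraction Technique is not reproduced here.

```python
import struct

def tent_map(x, mu=0.49999):
    # Skewed tent map on (0, 1); mu near 0.5 gives strong mixing
    # (the parameter value is an illustrative choice)
    return x / mu if x < mu else (1.0 - x) / (1.0 - mu)

def extract_bits(x, n_bits=8):
    # View x as an IEEE-754 binary32 word and take its low mantissa
    # bits, which change most erratically between iterations
    word = struct.unpack('>I', struct.pack('>f', x))[0]
    return [(word >> i) & 1 for i in range(n_bits)]

def expand_key(seed, n_subkeys=4, subkey_bits=32):
    # Derive n_subkeys words of subkey_bits bits each from one seed
    x = seed
    subkeys = []
    for _ in range(n_subkeys):
        bits = []
        while len(bits) < subkey_bits:
            x = tent_map(x)
            bits.extend(extract_bits(x))
        word = 0
        for b in bits[:subkey_bits]:
            word = (word << 1) | b
        subkeys.append(word)
    return subkeys
```

The same seed always yields the same subkey schedule, while nearby seeds diverge quickly because of the map's sensitivity to initial conditions.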
An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creating, removing, and moving files and directories. We present abstract definitions (axioms) for these operations. This specification is refined towards a pointer implementation. The challenge is to find a natural abstraction function from the implementation to the specification, to define operations on the concrete store that behave in exactly the same way as the corresponding functions on the abstract store, and to prove these facts. To mitigate the problems associated with partial functions, we proceed in two steps: first a refinement towards a pointer implementation with total functions, followed by one that allows partial functions. These two refinements are proved correct by means of a number of invariants. Indeed, the insights gained consist, on the one hand, of the invariants of the pointer implementation that are needed for the refinement functions, and, on the other hand, of the precise enabling conditions of the operations at the different levels of abstraction. Each of the three specification levels is enriched with a permission system for reading, writing, or executing, and the refinement relations between these permission systems are explored. Files and directories are distinguished from the outset, but this rarely affects our part of the specifications. All results have been verified with the proof assistant PVS; in particular, the invariants are preserved by the operations and, where the invariants hold, the operations commute with the refinement functions.
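A minimal model of the abstract level can be sketched in Python, with paths as tuples of names and a dict playing the role of the partial function. The encoding of directories as None and the specific enabling conditions shown are illustrative assumptions, not the paper's PVS formalization.

```python
class AbstractFS:
    # Partial function from absolute paths (tuples of names) to data.
    # None marks a directory; bytes mark a file's contents.
    def __init__(self):
        self.store = {(): None}  # the root directory

    def valid(self, path):
        return path in self.store

    def create(self, path, data=None):
        # Enabling condition: fresh path whose parent is a directory
        parent = path[:-1]
        assert not self.valid(path)
        assert self.valid(parent) and self.store[parent] is None
        self.store[path] = data

    def remove(self, path):
        # Enabling condition: a directory may only be removed when empty
        assert self.valid(path) and path != ()
        assert all(p[:len(path)] != path for p in self.store if p != path)
        del self.store[path]

    def move(self, src, dst):
        # Relocate src and everything below it to dst
        assert self.valid(src) and not self.valid(dst)
        assert self.valid(dst[:-1]) and self.store[dst[:-1]] is None
        moved = {dst + p[len(src):]: d for p, d in self.store.items()
                 if p[:len(src)] == src}
        for p in list(self.store):
            if p[:len(src)] == src:
                del self.store[p]
        self.store.update(moved)
```

The asserts make the enabling conditions explicit: each operation is defined only where its precondition holds, mirroring the partiality of the specification.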
Melanoma is the deadliest type of skin cancer. However, eradication at an early stage implies a high survival rate; it therefore demands early diagnosis. Conventional diagnostic methods are costly and cumbersome due to the involvement of experienced experts as well as the requirement for a highly equipped environment. Recent advancements in computerized solutions for these diagnoses are highly promising, with improved accuracy and efficiency. In this article, we propose a method for the classification of melanoma and benign skin lesions. Our approach integrates preprocessing, lesion segmentation, feature extraction, feature selection, and classification. Preprocessing is executed in the context of hair removal by DullRazor, whereas lesion texture and color information are utilized to enhance the lesion contrast. In lesion segmentation, a hybrid technique is implemented and the results are fused using the additive law of probability. A serial-based method is applied subsequently that extracts and fuses traits such as color, texture, and HOG (shape). The fused features are then selected by a novel Boltzmann Entropy method. Finally, the selected features are classified by a Support Vector Machine. The proposed method is evaluated on the publicly available data set PH2. Our approach provides promising results of sensitivity 97.7%, specificity 96.7%, accuracy 97.5%, and F‐score 97.5%, which are significantly better than the results of existing methods on the same data set.
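The serial feature fusion and entropy-driven selection steps can be sketched as follows. The criterion used here is a plain histogram entropy standing in for the paper's Boltzmann Entropy method, whose exact form the abstract does not give; the feature block names and sizes are invented for illustration.

```python
import numpy as np

def serial_fuse(*feature_blocks):
    # Serial fusion: concatenate per-lesion feature vectors end to end
    return np.concatenate(feature_blocks, axis=1)

def entropy_select(X, k):
    # Rank features by the entropy of their value distribution and keep
    # the k most informative ones (a stand-in for Boltzmann Entropy)
    ent = []
    for j in range(X.shape[1]):
        hist, _ = np.histogram(X[:, j], bins=10)
        p = hist / hist.sum()
        p = p[p > 0]
        ent.append(-(p * np.log2(p)).sum())
    keep = np.argsort(ent)[::-1][:k]
    return X[:, keep], keep
```

A constant feature carries zero entropy and is dropped first, which matches the intuition that it cannot separate melanoma from benign lesions; the selected matrix would then feed an SVM classifier.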
The objective of the present study was to analyze and compare the phenolic compounds and antioxidant capacities of new lines of Daucus carota. The selected cultivars showed high variation in the contents of total phenolics (30.26–65.39 mg/100 g FW) and total ascorbic acid (41.12–58.36 mg/100 g FW). Analysis by RP-HPLC revealed that hydroxycinnamic acids and their derivatives were the major phenolic compounds present in D. carota extracts, and 5-caffeoylquinic acid was the major hydroxycinnamic acid (ranging from 30.26 to 65.39 mg/100 g FW). The DCP cultivar showed high total antioxidant capacity (77.69 mg/100 g), 2,2-diphenyl-1-picrylhydrazyl (DPPH) scavenging capacity (52.36 mg/100 g), superoxide radical scavenging capacity (53.69 mg/100 g), and hydroxyl radical scavenging capacity (51.91 mg/100 g). A linear relationship was found between total phenolic acid contents and antioxidant capacity. Both phenolic compounds and antioxidant capacities varied significantly (p < 0.05) among cultivars. The DCP cultivar was found to be a rich source of phenolics and ascorbic acid with high antioxidant activity.
Earthquakes are natural disasters that cause extensive damage as well as the death of thousands of people. Earthquake professionals have for many decades recognized the benefits to society of reliable earthquake predictions. Techniques such as mathematical modelling, hydrology analysis, ionosphere analysis, and even animal responses have been used to forecast a quake. Most of these techniques rely on certain precursors, such as stress or seismic activity. Data mining techniques can also be used for prediction of this natural hazard. Data mining consists of an evolving set of techniques, such as association rule mining, that can be used to extract valuable information and knowledge from massive volumes of data. The aim of this study is to predict a subsequent earthquake from the data of the previous earthquake. This is achieved by applying association rule mining to earthquake data from 1979 to 2012. These associations are polished using predicate-logic techniques to derive production rules to be used with a rule-based expert system. The prediction process is done by an expert system, which takes only the current earthquake's attributes to predict a subsequent earthquake. The rules generated for predicting earthquakes are mathematically validated as well as tested on real-life earthquake data. Results from our study show that the proposed rule-based expert system is able to detect 100% of the earthquakes that actually occurred within at most 15 hours, within a defined range, depth, and location. This work relies solely on previous earthquake data for predicting the next.
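A minimal sketch of the association rule mining step, assuming each transaction is a list of discretized earthquake attributes (the attribute labels below are invented): it counts 1- and 2-itemsets and emits the rules that clear the support and confidence thresholds.

```python
from itertools import combinations
from collections import Counter

def mine_rules(transactions, min_support=0.3, min_confidence=0.8):
    # Count single items and sorted pairs across all transactions
    n = len(transactions)
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for item in items:
            counts[(item,)] += 1
        for pair in combinations(items, 2):
            counts[pair] += 1
    # Emit rules lhs -> rhs as (lhs, rhs, support, confidence)
    rules = []
    for itemset, c in counts.items():
        if len(itemset) != 2 or c / n < min_support:
            continue
        a, b = itemset
        for lhs, rhs in ((a, b), (b, a)):
            conf = c / counts[(lhs,)]
            if conf >= min_confidence:
                rules.append((lhs, rhs, c / n, conf))
    return rules
```

Rules that survive both thresholds could then be refined by predicate-logic techniques into production rules for the expert system, as the abstract describes.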
In the past few years, much effort has been invested into developing a new blue economy based on harvesting, cultivating and processing marine macroalgae in Norway. Macroalgae have high potential for a wide range of applications, e.g. as a source of pharmaceuticals, for the production of biofuels, or as food and feed. However, data on the chemical composition of macroalgae from Norwegian waters are scant. This study was designed to characterize the chemical composition of 21 algal species. Both macro‐ and micronutrients were analysed. Concentrations of heavy metals and the metalloid arsenic in the algae were also quantified.
RESULTS
The results confirm that marine macroalgae contain nutrients relevant to both human and animal nutrition, whose concentrations are highly species-dependent. Although heavy metals and arsenic were detected in the algae studied, concentrations were mostly below the maximum allowed levels set by food and feed legislation in the EU.
The Internet of Things (IoT) is a paradigm that has made everyday objects intelligent by offering them the ability to connect to the Internet and communicate. Integrating the social component into IoT gave rise to the Social Internet of Things (SIoT), which has helped overcome various issues such as heterogeneity and navigability. In this kind of environment, participants compete to offer a variety of attractive services. Nevertheless, some of them resort to malicious behaviour to spread poor-quality services. They perform so-called Trust-Attacks and break the basic functionality of the system. Trust management mechanisms aim to counter these attacks and provide the user with an estimate of the trust degree they can place in other users, thus ensuring reliable and qualified exchanges and interactions. Several works in the literature have addressed this problem and proposed different Trust-Models. The majority tried to adapt and reapply Trust-Models designed for common social networks or peer-to-peer ones. However, despite the similarities between these types of networks, SIoT networks present specific peculiarities. In SIoT, users, devices and services collaborate. Device entities can have constrained computing and storage capabilities, and their number can reach several million. The resulting network is complex, constrained and highly dynamic, and the implications of attacks can be more significant. In this paper, we propose DSL-STM, a new dynamic and scalable multi-level Trust-Model specifically designed for SIoT environments. We propose multidimensional metrics to describe the behaviour of SIoT entities. These metrics are aggregated via a Machine Learning-based method that classifies users, detects attack types and counters them. Finally, a hybrid propagation method is suggested to spread trust values in the network while minimizing resource consumption and preserving scalability and dynamism. Experiments on various simulated scenarios demonstrate the resilience and performance of DSL-STM.
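To make the aggregation idea concrete, the sketch below combines three hypothetical trust metrics with fixed weights and a threshold rule. DSL-STM's actual aggregation is learned by a Machine Learning classifier over multidimensional metrics; the metric names, weights, and threshold here are assumptions.

```python
def trust_score(direct, reputation, social_similarity,
                weights=(0.5, 0.3, 0.2)):
    # Weighted aggregation of three trust metrics, each in [0, 1]
    # (fixed weights stand in for the learned aggregation)
    assert abs(sum(weights) - 1.0) < 1e-9
    metrics = (direct, reputation, social_similarity)
    return sum(w * m for w, m in zip(weights, metrics))

def classify(score, threshold=0.6):
    # Simple threshold rule standing in for the learned classifier
    return 'trustworthy' if score >= threshold else 'suspicious'
```

A node whose direct experience, reputation, and social similarity are all high is flagged trustworthy; a sudden drop in the aggregated score across interactions would hint at an on-off Trust-Attack.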