By combining the modified Stokes-Einstein formula with the authors’ model for the melting-point viscosity, the authors present
a model for accurate predictions of self-diffusivity of liquid metallic elements. The model is expressed in terms of well-known
physical quantities and has been applied to various liquid metallic elements for which experimental data are available. The
results of calculations show that agreement with experimental data is excellent; the uncertainties in the calculations of
the self-diffusivities in various liquid metallic elements are equal to the uncertainties associated with experimental measurements.
Also, the authors propose an expression for the temperature dependence of self-diffusivity in liquid metallic elements in
terms of melting-point temperature. Using the model, self-diffusivity data are predicted for liquid iron, cobalt, nickel,
titanium, aluminum, magnesium, silicon, and so forth.
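The abstract does not reproduce the authors' modified formula, but the classical Stokes-Einstein relation it builds on can be sketched as follows. The prefactor c and the liquid-iron inputs (melting temperature, melting-point viscosity, atomic radius) are illustrative assumptions, not the paper's values.

```python
from math import pi

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_diffusivity(temperature_k, viscosity_pa_s, radius_m, c=4.0):
    """Self-diffusivity D = kB*T / (c * pi * eta * r).

    c = 6 corresponds to stick boundary conditions, c = 4 to slip; the
    paper's modified form may use a different prefactor and a model for
    the melting-point viscosity eta rather than a measured value.
    """
    return K_B * temperature_k / (c * pi * viscosity_pa_s * radius_m)

# Illustrative (assumed) numbers for liquid iron near its melting point:
d_fe = stokes_einstein_diffusivity(temperature_k=1811.0,
                                   viscosity_pa_s=5.5e-3,
                                   radius_m=1.27e-10)
```

With these inputs the result lands in the 10⁻⁹ m² s⁻¹ range typical of liquid metals near melting.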
Increasing the parallelism in transaction processing and maintaining data consistency appear to be two conflicting goals in designing distributed database systems (DDBSs). This problem is especially difficult if the DDBS is serving long-lived transactions (LLTs). A special case of LLTs, called sagas, has been introduced to address this problem. A DDBS with sagas provides high parallelism to transactions by allowing sagas to release their locks as early as possible. However, it is also subject to an overhead, due to the effort needed to restore data consistency in the case of failure. We conduct a series of simulation studies to compare the performance of LLT systems with and without saga implementation in a faulty environment. The studies show that saga systems outperform their nonsaga counterparts under most conditions, including heavy-failure cases. We thus propose an analytical queuing model to investigate the performance behavior of saga systems. The development of this analytical model allows us to quantitatively study the performance penalty of a saga implementation due to the failure-recovery overhead. Furthermore, the analytical solution can be used by system administrators to fine-tune the performance of a saga system. This analytical model captures the primary aspects of a saga system, namely data locking, resource contention, and failure recovery. Due to the complicated nature of the analytical modeling, we solve the model approximately for various performance metrics using decomposition methods, and validate the accuracy of the analytical results via simulations.
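For readers unfamiliar with sagas, a minimal sketch of the mechanism the study simulates: each sub-transaction commits and releases its locks immediately (the source of the extra parallelism), and compensating transactions restore consistency on failure (the recovery overhead the model captures). This is a generic illustration, not the paper's simulation or queuing model, and the failing step is hypothetical.

```python
def run_saga(steps):
    """Run a saga: a sequence of (action, compensation) pairs.

    Each action commits and releases its locks as soon as it finishes;
    if a later action fails, previously committed steps are undone in
    reverse order by their compensating transactions.
    """
    committed = []
    for action, compensate in steps:
        try:
            action()
            committed.append(compensate)
        except Exception:
            for comp in reversed(committed):
                comp()  # failure recovery: restore data consistency
            return False
    return True

# Hypothetical usage: the third sub-transaction fails, triggering recovery.
log = []
def fail():
    raise RuntimeError("node failure")
ok = run_saga([
    (lambda: log.append("t1"), lambda: log.append("undo t1")),
    (lambda: log.append("t2"), lambda: log.append("undo t2")),
    (fail, lambda: log.append("undo t3")),
])
```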
Amorphous films of germanium were grown on glass substrates kept at room temperature, using a vacuum evaporation technique. As-grown films were irradiated with Q-switched Nd:YAG laser pulses (λ = 1.06 μm, 20 ns, 10 to 50 J cm−2). The d.c. conductivity measurements were made in the temperature range 77 to 300 K. It was observed that the effect of laser irradiation was similar to the effect caused by thermal annealing of the films. The d.c. conductivity data were analysed in the light of Mott's theory of the variable-range hopping conduction process.
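The Mott analysis mentioned above amounts to fitting ln σ against T^(−1/4) (3D variable-range hopping, σ = σ₀ exp[−(T₀/T)^(1/4)]). A minimal sketch of that fit; the σ₀ and T₀ values below are synthetic, not the measured film data.

```python
from math import exp, log

def mott_vrh_conductivity(t_kelvin, sigma0, t0):
    """Mott 3D variable-range hopping: sigma = sigma0 * exp(-(T0/T)**(1/4))."""
    return sigma0 * exp(-(t0 / t_kelvin) ** 0.25)

def fit_mott_parameters(temps, sigmas):
    """Least-squares line through ln(sigma) vs T**(-1/4); returns (sigma0, T0)."""
    xs = [t ** -0.25 for t in temps]
    ys = [log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return exp(intercept), slope ** 4  # slope = -(T0)**(1/4), so T0 = slope**4
```

A straight line on these axes over 77 to 300 K is the usual evidence for the hopping mechanism.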
In the present investigation, the effect of three process variables, viz. air rate, temperature, and time of oxidation, has been studied to determine the possibility of making paving bitumens from North Rumaila crude oil. In order to have flexibility in the choice of a proper feedstock for bitumen production, the effect of these process parameters was studied on three different vacuum residues. Air rate and temperature of oxidation were optimised to produce a bitumen product having lower temperature susceptibility. The composition studies of typical feeds and oxidised products indicate that mainly the “saturates” components get converted to “asphaltics” on air blowing.
A number of software applications demand various levels of security at the time of scheduling in a computational Grid. The Grid may offer these security levels, but doing so can degrade performance because of the overhead incurred in providing the desired security. Scheduling performance in a Grid is affected by the heterogeneity of the security and computational power of resources. Customized genetic algorithms have been used effectively for solving complex (NP-hard) optimization problems, and various heuristics have been suggested for solving multi-objective optimization problems. In this paper, a security-driven, elitist non-dominated sorting genetic algorithm, Optimal Security with Optimal Overhead Scheduling (OSO2S), based on NSGA-II, is proposed. The model considers the dual objectives of minimizing the security overhead and maximizing the total security achieved. Simulation results show that the proposed algorithm delivers improved makespan and lower security overhead in comparison to other such algorithms, viz. MinMin, MaxMin, SPMinMin, SPMaxMin, and SDSG.
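The elitist non-dominated sorting at the core of NSGA-II can be sketched for the two objectives above, both cast as minimization (security overhead, and the negative of total security achieved). The schedule scores are hypothetical; this illustrates only the first-front extraction, not the full OSO2S algorithm.

```python
def dominates(a, b):
    """True if a Pareto-dominates b; objectives are tuples, all minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_front(points):
    """First front of a non-dominated sort: points no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Each candidate schedule scored as (security_overhead, -total_security);
# the numbers are made up for illustration:
schedules = [(5.0, -8.0), (3.0, -6.0), (4.0, -9.0), (6.0, -5.0)]
front = nondominated_front(schedules)
```

NSGA-II repeatedly peels off such fronts and fills the next generation elitistically from the best fronts first.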
We study the complexity issues for Walrasian equilibrium in a special case of combinatorial auction, called single-minded
auction, in which every participant is interested in only one subset of commodities. Chen et al. (J. Comput. Syst. Sci. 69(4):
675–687, 2004) showed that it is NP-hard to decide the existence of a Walrasian equilibrium for a single-minded auction and proposed a
notion of approximate Walrasian equilibrium called relaxed Walrasian equilibrium. We show that every single-minded auction
has a relaxed Walrasian equilibrium that satisfies at least two-thirds of the participants, proving a conjecture posed in
Chen et al. (J. Comput. Syst. Sci. 69(4): 675–687, 2004). Motivated by practical considerations, we introduce another concept of approximate Walrasian equilibrium called weak Walrasian
equilibrium. We show NP-completeness and hardness of approximation results for weak Walrasian equilibria.
In search of positive results, we restrict our attention to the tollbooth problem (Guruswami et al. in Proceedings of the
Symposium on Discrete Algorithms (SODA), pp. 1164–1173, 2005), where every participant is interested in a single path in some underlying graph. We give a polynomial time algorithm to
determine the existence of a Walrasian equilibrium and compute one (if it exists), when the graph is a tree. However, the
problem is still NP-hard for general graphs.
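The equilibrium conditions for a single-minded auction can be verified directly from prices and an allocation: winners' bundles must be disjoint and affordable, losers must be priced out of their bundles, and unallocated items must carry zero price (market clearance). The sketch below is a generic feasibility check with a toy instance, not the paper's tree algorithm.

```python
def is_walrasian_equilibrium(bidders, prices, winners):
    """Check Walrasian-equilibrium conditions for a single-minded auction.

    bidders: name -> (bundle frozenset, value); prices: item -> price;
    winners: set of bidder names who receive their desired bundle.
    """
    allocated = set()
    for name in winners:
        bundle, _ = bidders[name]
        if bundle & allocated:  # winners' bundles must be disjoint
            return False
        allocated |= bundle
    for name, (bundle, value) in bidders.items():
        cost = sum(prices[i] for i in bundle)
        if name in winners and cost > value:      # winner must afford bundle
            return False
        if name not in winners and cost < value:  # loser must be priced out
            return False
    # market clearance: unallocated items have zero price
    return all(prices[i] == 0 for i in prices if i not in allocated)

# Toy instance (assumed): two bidders wanting disjoint single items.
bidders = {"b1": (frozenset({"a"}), 5.0), "b2": (frozenset({"b"}), 3.0)}
prices = {"a": 4.0, "b": 2.0}
```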
Data quality became significant with the emergence of data warehouse systems. While accuracy is an intrinsic aspect of data quality, validity presents a wider perspective that is more representational and contextual in nature. In this article we present a different perspective on data collection and collation. We focus on faults experienced in data sets and present validity as a function of allied parameters such as completeness, usability, availability, and timeliness for determining the data quality. We also analyze the applicability of these metrics and modify them to suit IoT applications. Another major focus of this article is to verify these metrics on an aggregated data set instead of separate data values. This work focuses on using the different validation parameters for determining the quality of data generated in a pervasive environment. The analysis approach presented is simple and can be employed to test the validity of collected data, isolate faults in the data set, and measure the suitability of data before applying analysis algorithms. On analyzing the data quality of the two data sets on the basis of the above-mentioned parameters, we found that the validity of data set 1 was 75%, while that of data set 2 was only 67%. The availability and data-freshness metrics were analyzed graphically: data freshness was better for data set 1, while availability was better for data set 2. The usability obtained for data set 2 was 86%, higher than the 69% obtained for data set 1. Thus, this work presents methods for estimating data quality that can be beneficial in various IoT-based industries, which are essentially data-centric and whose decisions depend upon the validity of data.
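Validity as a function of the allied parameters can be sketched as a weighted score over the aggregated data set. The equal weighting and the sample fractions are assumptions; the article does not fix a specific weighting formula.

```python
def data_validity(completeness, usability, availability, timeliness, weights=None):
    """Validity as a weighted combination of allied quality parameters.

    Each parameter is a fraction in [0, 1] measured over the aggregated
    data set, not over individual data values. Equal weights by default
    (an assumption for illustration).
    """
    scores = [completeness, usability, availability, timeliness]
    if weights is None:
        weights = [0.25] * 4
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical fractions for one aggregated data set:
validity = data_validity(0.8, 0.6, 1.0, 0.6)
```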
Wireless Personal Communications - This work aims to implement a clustering scheme to separate vehicles into a cluster that is based on various parameters, such as the total number of relay nodes,...
The present study reports a novel, facile biosynthesis route for the synthesis of carbon nanodots (CDs) with an approximate quantum yield of 38.5%, using Musk melon extract as a naturally derived precursor material. The synthesized CDs were characterized using ultraviolet–visible (UV–vis) spectroscopy, dynamic light scattering, photoluminescence spectroscopy, X-ray diffraction, transmission electron microscopy, and Fourier transform infrared (FTIR) spectroscopy. The as-prepared CDs exhibit bright fluorescence under UV light (λex = 365 nm), and their size was found to be in the range of 5–10 nm. The authors further explored the use of such biosynthesised CDs as a photocatalyst material for the removal of industrial dye. Degradation of methylene blue dye was performed in a photocatalytic reactor and monitored using UV–vis spectroscopy. The CDs show a dye degradation capability of 37.08% in 60 min with a reaction rate constant of 0.0032 min−1. This study shows that the synthesised CDs are highly stable and possess potential application in wastewater treatment.
Motivated by reliability considerations in data deduplication for storage systems, we introduce the problem of flexible coloring. Given a hypergraph H and the number of allowable colors k, a flexible coloring of H is an assignment of one or more colors to each vertex such that, for each hyperedge, it is possible to choose a color from each vertex's color list so that the hyperedge is strongly colored (i.e., each vertex has a different color). Different colors for the same vertex can be chosen for different incident hyperedges (hence the term flexible). The goal is to minimize color consumption, namely the total number of colors assigned, counting multiplicities. Flexible coloring is NP-hard and trivially approximable, where s is the size of the largest hyperedge and n is the number of vertices. Using a recent result by Bansal and Khot, we show that if k is constant, then it is UGC-hard to approximate to within a factor of s−ε, for an arbitrarily small constant ε>0. Lastly, we present an algorithm with an approximation ratio, where k′ is the number of colors used by a strong coloring algorithm for H.
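The per-hyperedge feasibility condition in the definition (each vertex can pick a distinct color from its list) is a bipartite matching between vertices and colors. A sketch of that check via Kuhn's augmenting-path algorithm, together with the color-consumption objective; this illustrates the definitions only, not the paper's approximation algorithm.

```python
def edge_strongly_colorable(color_lists):
    """Can each vertex of one hyperedge pick a distinct color from its list?

    Bipartite matching (vertices vs colors) solved with augmenting paths.
    """
    match = {}  # color -> index of vertex currently holding it

    def try_assign(v, seen):
        for c in color_lists[v]:
            if c in seen:
                continue
            seen.add(c)
            # take a free color, or evict the holder if it can re-match
            if c not in match or try_assign(match[c], seen):
                match[c] = v
                return True
        return False

    return all(try_assign(v, set()) for v in range(len(color_lists)))

def color_consumption(assignment):
    """Total colors assigned over all vertices, counting multiplicities."""
    return sum(len(colors) for colors in assignment.values())
```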