Catalysis Letters - An environmentally benign process for synthesizing 4-methoxyphenol through methylation of hydroquinone using a polystyrene-immobilized Brønsted acidic ionic liquid is presented.
Summary A new general procedure for the preparation of functionalized oligopolysiloxanes of predetermined molecular weight is described. It uses heterogeneously catalyzed siloxane equilibration polymerization reactions, which do not require the troublesome and sometimes difficult post-preparative work-up procedures usually encountered with the well-known homogeneously catalyzed counterparts. The method is illustrated by the preparation of α,ω-telechelic vinyldimethylsiloxy-oligopolydimethylsiloxanes from octamethylcyclotetrasiloxane and 1,3-divinyltetramethyldisiloxane, but reference is also made to the preparation of trimethylsiloxy-, dimethylsiloxy- and carboxypropyldimethylsiloxy-oligopolydimethylsiloxanes, oligopolymethylhydridosiloxanes, and their copolymers.
Mass transfer studies were carried out in a bubble column using the chemical method. Catalytic oxidation of sodium sulfite was chosen for the studies, and the corresponding specific rates of oxidation were obtained using a stirred cell. A laser Doppler anemometer (LDA) was used to measure the instantaneous velocities in the same stirred cell as well as in bubble columns (100 and i.d.). An efficient algorithm based on multiresolution analysis of the velocity-time data using wavelets was used to isolate the data belonging to the gas and liquid phases. An eddy isolation model was used to characterize the eddy motion, including estimation of the energy dissipation rate. Using this knowledge of eddy motion, a methodology was developed for predicting the true mass transfer coefficient (kL) in a stirred cell as well as in bubble columns. The predicted values of kL were compared with the experimental values obtained by the chemical method.
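One classical route from a measured energy dissipation rate to kL is the eddy cell model of Lamont and Scott. The sketch below shows the form of that relation with illustrative air-water property values; it is not the paper's own methodology, and the constant c is an assumed empirical value.

```python
import math

def k_l_eddy_cell(D, epsilon, nu, c=0.4):
    """Liquid-side mass transfer coefficient from the eddy cell model
    (Lamont-Scott form): kL = c * sqrt(D) * (epsilon/nu)**0.25.

    D       -- molecular diffusivity of the gas in the liquid [m^2/s]
    epsilon -- turbulent energy dissipation rate [W/kg]
    nu      -- kinematic viscosity of the liquid [m^2/s]
    c       -- empirical constant (commonly taken around 0.4)
    """
    return c * math.sqrt(D) * (epsilon / nu) ** 0.25

# Illustrative air-water values: D ~ 2e-9 m^2/s, nu ~ 1e-6 m^2/s
kl = k_l_eddy_cell(D=2e-9, epsilon=0.5, nu=1e-6)
```

Note the weak quarter-power dependence on the dissipation rate, which is why an accurate estimate of eddy motion (as obtained here from the LDA data) matters more than the model constant itself.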
A novel image encryption framework is proposed in this article. Its three major contributions, which together form a complete cryptosystem, are a new chaotic map, a pseudorandom bit generator, and an image encryption system built on both. The new chaotic map, named the 'RCM map', is shown to be chaotic according to Devaney's theory, and the pseudorandom bit generator is validated with the NIST test suite. The proposed method is simple to implement and involves no highly complex operations. Moreover, it is completely lossless, so 100% of the data can be recovered from the encrypted image, and decryption is simply the reverse of the encryption procedure. A scrambling algorithm is also proposed to further enhance the security of the overall system. Simulation, detailed analysis, and comparative studies of the overall image encryption framework reveal its strengths and weaknesses. The experimental results are very promising and demonstrate the potential of chaos theory in the field of data security.
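The abstract does not define the RCM map itself, but the overall architecture it describes (chaotic orbit → pseudorandom bits → lossless encryption whose decryption is the exact reverse) can be sketched with a stand-in chaotic map such as the logistic map. All names and parameter values below are illustrative assumptions, not the paper's design.

```python
def logistic_prbg(seed, r=3.99, n_bits=64, burn_in=100):
    """Pseudorandom bit generator driven by a chaotic map.
    Illustrative only: uses the logistic map x -> r*x*(1-x) as a
    stand-in for the paper's (unspecified) RCM map.  Bits are taken
    by thresholding the orbit at 0.5 after discarding a transient."""
    x = seed
    for _ in range(burn_in):          # discard transient iterates
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

def xor_encrypt(pixels, seed):
    """Lossless stream cipher: XOR each pixel byte with a keystream
    byte.  Decryption is the identical operation with the same seed,
    since XOR is its own inverse."""
    bits = logistic_prbg(seed, n_bits=8 * len(pixels))
    keystream = [int("".join(map(str, bits[8 * i:8 * i + 8])), 2)
                 for i in range(len(pixels))]
    return [p ^ k for p, k in zip(pixels, keystream)]

img = [12, 200, 37, 255, 0]               # toy 5-pixel "image"
enc = xor_encrypt(img, seed=0.3141592)
dec = xor_encrypt(enc, seed=0.3141592)    # exact reverse of encryption
```

Because XOR with a fixed keystream is an involution, the recovery is bit-exact, which is the sense in which such a scheme is "completely lossless".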
Cloud computing is becoming a key market factor, so many companies are investing, or planning to invest, in large data centers. These data centers not only consume large amounts of energy but also produce greenhouse gases. Because of this heavy power consumption, data center providers turn to various types of power generation to increase their profit margins, which indirectly harms the environment. Several studies have sought to reduce the power consumption of data centers; virtualization is one such technique. These studies also show that hardware plays a very important role: as the load increases, so does the power consumption of the CPU. Extending the study of virtualization, a hardware-based algorithm for virtual machine provisioning in a private cloud can therefore significantly improve performance by treating hardware as one of the important factors.
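A hardware-aware provisioning policy of the kind described might, for example, place each virtual machine on the host whose estimated CPU power increase is smallest. The sketch below assumes a simple linear utilisation-to-power model and hypothetical host parameters; it is an illustration of the idea, not the paper's algorithm.

```python
def provision(vms, hosts):
    """Hardware-aware provisioning sketch: place each VM on the host
    whose estimated power increase is smallest, assuming a linear
    CPU-utilisation power model P(u) = P_idle + (P_max - P_idle) * u
    (a common simplification).  Hosts are dicts with 'p_idle' and
    'p_max' [W], 'cap' (CPU capacity) and 'used'; each VM is a CPU
    demand.  Returns the chosen host index per VM."""
    def power(h, used):
        u = used / h['cap']
        return h['p_idle'] + (h['p_max'] - h['p_idle']) * u

    placement = []
    for vm in vms:
        candidates = [h for h in hosts if h['used'] + vm <= h['cap']]
        best = min(candidates,
                   key=lambda h: power(h, h['used'] + vm) - power(h, h['used']))
        best['used'] += vm                 # commit the placement
        placement.append(hosts.index(best))
    return placement

# Hypothetical hosts: host 0 has a flatter power curve (1 W per CPU
# unit) than host 1 (2 W per unit), so small VMs prefer host 0.
hosts = [{'p_idle': 100, 'p_max': 200, 'cap': 100, 'used': 0},
         {'p_idle': 50,  'p_max': 250, 'cap': 100, 'used': 0}]
placement = provision([10, 95], hosts)
```

Here the first VM (demand 10) lands on host 0 (power increase 10 W vs 20 W), and the second (demand 95) no longer fits on host 0 and falls through to host 1.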
Query optimizers rely on statistical models that succinctly describe the underlying data. Models are used to derive cardinality estimates for intermediate relations, which in turn guide the optimizer to choose the best query execution plan. The quality of the resulting plan is highly dependent on the accuracy of the statistical model that represents the data. It is well known that small errors in the model estimates propagate exponentially through joins, and may result in the choice of a highly sub-optimal query execution plan. Most commercial query optimizers make the attribute value independence assumption: all attributes are assumed to be statistically independent. This reduces the statistical model of the data to a collection of one-dimensional synopses (typically in the form of histograms), and it permits the optimizer to estimate the selectivity of a predicate conjunction as the product of the selectivities of the constituent predicates. However, this independence assumption is more often than not wrong, and is considered to be the most common cause of sub-optimal query execution plans chosen by modern query optimizers. We take a step towards a principled and practical approach to performing cardinality estimation without making the independence assumption. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution over all the attributes in the database into small, usually two-dimensional distributions, without a significant loss in estimation accuracy. We show how to efficiently construct such a graphical model from the database using only two-way join queries, and we show how to perform selectivity estimation in a highly efficient manner. We integrate our algorithms into the PostgreSQL DBMS. Experimental results indicate that estimation errors can be greatly reduced, leading to orders of magnitude more efficient query execution plans in many cases. 
Optimization time is kept in the range of tens of milliseconds, making this a practical approach for industrial-strength query optimizers.
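The independence assumption criticised above is easy to state concretely: the optimizer multiplies per-predicate selectivities, which goes badly wrong for correlated attributes. A minimal illustration on toy data (this shows the failure mode, not the paper's graphical-model estimator):

```python
def conjunct_selectivity(selectivities):
    """Under the attribute value independence assumption, a query
    optimizer estimates the selectivity of a predicate conjunction
    as the product of the per-predicate selectivities."""
    out = 1.0
    for s in selectivities:
        out *= s
    return out

# Correlated attributes break the assumption.  In this toy relation,
# make == 'Honda' always co-occurs with model == 'Civic':
rows = [('Honda', 'Civic')] * 10 + [('Ford', 'F150')] * 90
s_make  = sum(r[0] == 'Honda' for r in rows) / len(rows)       # 0.1
s_model = sum(r[1] == 'Civic' for r in rows) / len(rows)       # 0.1
est  = conjunct_selectivity([s_make, s_model])                 # 0.01
true = sum(r == ('Honda', 'Civic') for r in rows) / len(rows)  # 0.1
```

The independence-based estimate (0.01) is off by a factor of 10 from the true joint selectivity (0.1) on this single conjunction; across several joins such errors compound multiplicatively, which is the exponential propagation the abstract refers to.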
This paper investigates the stabilization of uncertain networked control systems (NCSs) with actuator saturation and cyberattacks. The cyberattacks are governed by a set of independent random variables following a Bernoulli distribution. To relieve the network bandwidth load effectively, an event-triggered communication strategy is proposed. By employing Lyapunov stability theory and stochastic analysis techniques, a stability criterion is obtained for the system with actuator saturation and cyberattacks. Moreover, the desired controller gain is derived by solving a set of matrix inequalities. Finally, the validity and applicability of the criterion are verified through numerical examples.
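The two ingredients described (event-triggered transmission plus Bernoulli-distributed packet attacks) can be sketched for a scalar toy plant. The triggering rule and all numerical values below are illustrative assumptions, not the paper's design.

```python
import random

def simulate(x0, a, k, sigma, attack_prob, steps, rng):
    """Event-triggered state feedback for a scalar plant
    x+ = a*x - k*x_hat, where x_hat is the last transmitted state.
    A new transmission is triggered only when the event condition
    (x - x_hat)^2 > sigma * x^2 holds, saving network bandwidth.
    Each transmitted packet is corrupted with probability
    attack_prob (Bernoulli cyberattack model).  Returns the state
    trajectory and the number of transmissions."""
    x, x_hat, sent = x0, x0, 1
    traj = [x]
    for _ in range(steps):
        if (x - x_hat) ** 2 > sigma * x ** 2:   # event condition
            sent += 1
            if rng.random() < attack_prob:       # attack corrupts packet
                x_hat = rng.uniform(-1.0, 1.0)
            else:
                x_hat = x
        x = a * x - k * x_hat
        traj.append(x)
    return traj, sent

rng = random.Random(0)
# Attack-free run: open-loop unstable (a = 1.2), stabilised with
# k = 0.5 under an event threshold sigma = 0.25.
traj, sent = simulate(1.0, 1.2, 0.5, 0.25, 0.0, 50, rng)
```

In this attack-free run the state decays to zero while transmitting on only a fraction of the sampling instants, which is exactly the bandwidth relief an event-triggered strategy aims for; the paper's contribution is proving stability when the Bernoulli attacks are switched on.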
The purpose of this work was to explore a new feature extraction method for classifying paddy seeds, using a feature extraction algorithm to obtain the area ratio and the horizontal-slant and front-rear angles, and to determine whether the proposed features have high discriminating power. Another objective was to find the smallest feature set that can ensure highly accurate recognition of seeds. A total of 100 image features were extracted, and features with significant discriminating power were identified using analysis of variance (ANOVA). Of the 100 features, 14 were found to have high discriminating power, and from these, six were selected as the proposed features. Experimental results show that the proposed features, together with the removal of redundant features, enhanced the discriminating power of the feature set, and that the proposed features have excellent discriminating properties for seeds. The presented features achieved the highest classification accuracy (98.8%) when compared to other methods.
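The ANOVA-based ranking of features by discriminating power reduces to the one-way F statistic: a feature is informative when its between-class variance dominates its within-class variance. A self-contained sketch with made-up feature values (not the paper's data):

```python
def anova_f(groups):
    """One-way ANOVA F statistic for one feature measured over
    several classes (e.g. seed varieties).  A large F means the
    feature's between-class variance dominates its within-class
    variance, i.e. the feature discriminates well."""
    k = len(groups)                                   # number of classes
    n = sum(len(g) for g in groups)                   # total samples
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# A feature whose values separate two seed classes cleanly gets a
# much larger F than one whose values are mixed across classes:
f_good = anova_f([[1.0, 1.1, 0.9], [3.0, 3.1, 2.9]])
f_bad  = anova_f([[1.0, 3.1, 0.9], [3.0, 1.1, 2.9]])
```

Ranking all 100 extracted features by F and keeping only the top scorers is one standard way to arrive at a small, highly discriminating subset such as the 14 (and ultimately six) features reported here.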