Calcination is a thermo-chemical process, widely used in the cement industry, in which limestone (CaCO3) is thermally decomposed into lime (CaO) and carbon dioxide (CO2). The focus of this paper is the implementation and validation of the endothermic calcination reaction mechanism of limestone in a commercial finite-volume-based CFD code. The code is used to simulate the turbulent flow field, the temperature field, the concentrations of reactants and products, and the interaction of particles with the gas phase by solving the mathematical equations that govern these processes. For calcination, the effects of temperature, decomposition pressure, diffusion and pore efficiency were taken into account. A simple three-dimensional geometry of a pipe reactor was used for the numerical simulations. To verify the accuracy of the modelling approach, the numerical predictions were compared with experimental data, yielding satisfactory agreement and the correct trends of the physical parameters influencing the process.
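As a minimal sketch of how such a temperature dependence is typically modelled (the pre-exponential factor and activation energy below are illustrative placeholders, not the values calibrated in the paper), the decomposition rate constant can be written in Arrhenius form:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius_rate(T, A=1.0e12, E=1.9e5):
    """Rate constant k = A * exp(-E / (R * T)) for the endothermic
    decomposition CaCO3 -> CaO + CO2. A (1/s) and E (J/mol) are
    illustrative placeholders, not the paper's calibrated values."""
    return A * math.exp(-E / (R * T))
```

The steep increase of k with temperature is what couples the calcination model to the simulated temperature field in the reactor.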
This paper presents a decentralized observer with a consensus filter for state observation of discrete-time linear distributed systems. Each agent in the distributed system runs an observer with a model of the plant that uses the set of locally available measurements, which may not make the full plant state detectable. This lack of detectability is overcome by a consensus filter that blends the state estimate of each agent with its neighbors' estimates. It is proven that the state estimates of the proposed observer converge exponentially to the actual plant states under arbitrarily changing, but connected, communication and pseudo-connected sensing graph topologies. Apart from these connectivity properties, full knowledge of the sensing and communication graphs is not needed at design time. As a byproduct, we obtain a result on the location of the eigenvalues, i.e., the spectrum, of the Laplacian for a family of graphs with self-loops.
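The mechanism can be illustrated with a deliberately simplified sketch (not the paper's algorithm or notation): three agents estimate a scalar plant x[k+1] = a·x[k], only agent 0 has a measurement, and a consensus step propagates the information to the others; all gains and the graph are invented for illustration.

```python
# Illustrative sketch: only agent 0 measures the plant (c = (1, 0, 0));
# agents 1 and 2 recover the state purely through the consensus step
# that blends their predictions with their neighbors' predictions.

def simulate(steps=100, a=1.0, c=(1.0, 0.0, 0.0), L=(0.5, 0.0, 0.0),
             neighbors=((1,), (0, 2), (1,)), w=0.5):
    x = 1.0                      # true plant state
    xhat = [0.0, 0.0, 0.0]       # each agent's local estimate
    for _ in range(steps):
        y = [ci * x for ci in c]                      # local measurements
        # local Luenberger-style prediction for each agent
        pred = [a * xh + Li * (yi - ci * xh)
                for xh, Li, yi, ci in zip(xhat, L, y, c)]
        # consensus step: blend own prediction with neighbors' average
        xhat = [(1.0 - w) * pred[i]
                + w * sum(pred[j] for j in nbrs) / len(nbrs)
                for i, nbrs in enumerate(neighbors)]
        x = a * x
    return x, xhat
```

Even though the plant state is undetectable from the local measurements of agents 1 and 2 alone, the consensus blending drives all three estimates to the true state.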
The main aim of data analysis in biochemical metrology is the extraction of relevant information from biochemical measurements. A system of extended exploratory data analysis (EDA) based on graphical tools for sample data summarization and exploration is proposed, and the original EDA algorithm in S-Plus is available on the Internet at http://www.trilobyte.cz/EDA. Checking the basic assumptions about biochemical and medical data involves examining the independence of sample elements, sample normality and homogeneity. An exact assessment of the mean value and variance of steroid levels in controls is necessary for the correct assessment of samples from patients. The data examination procedures are illustrated by a determination of the mean value of 17-hydroxypregnenolone in the umbilical blood of newborns. For an asymmetric, strongly skewed sample distribution corrupted by outliers, the best estimate of location appears to be the median. The Box–Cox transformation improves sample symmetry. The proposed procedure gives reliable estimates of the mean value for an asymmetric distribution of 17-hydroxypregnenolone when the arithmetic mean cannot be used.
The paper deals with the problem of automatic verification of programs working with extended linear linked dynamic data structures; in particular, pattern-based verification is considered. In this approach, one can abstract memory configurations by abstracting away the exact number of adjacent occurrences of certain memory patterns. With respect to previous work on the subject, the method presented in the paper has been extended to handle multiple patterns, which allows for verification of programs working with more types of structures and/or with structures of irregular shape. Experimental results obtained from a prototype implementation show that the method is very competitive and offers significant potential for future extensions.
This paper demonstrates and systematically characterizes the enrichment of biomolecular compounds using aptamer-functionalized surfaces within a microfluidic device. The device consists of a microchamber packed with aptamer-functionalized microbeads and integrated with a microheater and temperature sensor to enable thermally controlled binding and release of biomolecules by the aptamer. We first present an equilibrium-binding-based analytical model to understand the enrichment process. The characteristics of aptamer-analyte binding and enrichment are then studied experimentally, using adenosine monophosphate (AMP) and a specific RNA aptamer as a model system. The temporal process of AMP binding to the aptamer is found to be determined primarily by the aptamer-AMP binding kinetics. The temporal process of aptamer-AMP dissociation at varying temperatures is also obtained and observed to occur relatively rapidly (<2 s). The specificity of the enrichment is next confirmed by performing selective enrichment of AMP from a sample containing biomolecular impurities. Finally, we investigate the enrichment of AMP by either discrete or continuous introduction of a dilute sample into the microchamber, demonstrating enrichment factors ranging from 566× to 686×, which agree with the predictions of the analytical model.
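The equilibrium-binding view can be sketched with a simple Langmuir isotherm; this is a crude stand-in for the paper's analytical model, and the dissociation constant, site concentration and volume count below are illustrative placeholders, not measured values.

```python
def bound_fraction(c, kd):
    """Equilibrium fraction of aptamer sites occupied at free analyte
    concentration c, per the Langmuir isotherm: theta = c / (kd + c)."""
    return c / (kd + c)

def enrichment_factor(c_in, kd, site_conc, n_volumes):
    """Crude enrichment estimate: analyte captured at equilibrium from
    n_volumes chamber volumes of dilute sample, then thermally released
    into a single chamber volume. Capture cannot exceed the analyte
    actually introduced (n_volumes * c_in)."""
    captured = min(site_conc * bound_fraction(c_in, kd), n_volumes * c_in)
    return captured / c_in
```

At c = kd exactly half the sites are occupied, and the enrichment factor grows with the number of introduced sample volumes until the bead binding capacity, rather than the introduced analyte, becomes limiting.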
As the urgent need for efficient and sustainable energy usage becomes ever more apparent, interest in Smart Homes is on the rise. The SESAME-S project (SEmantic SmArt Metering – Services for Energy Efficient Houses) uses semantically linked data to actively assist end-consumers in making well-informed decisions and controlling their energy consumption. By integrating smart metering and home automation functionality, SESAME-S addresses the potential mass market of end-consumers with an easily customizable solution that can be widely deployed in domestic or business environments, with expected savings of over 20 % of the total energy bill. The developed system is a basis for conceptualizing, demonstrating, and evaluating a variety of innovative end-consumer services and their user interface paradigms. In this paper, we present the SESAME-S system as a whole and discuss the semantically enabled services, arguing that such systems may find broad acceptance in the future. The data obtained through such systems will be invaluable for future global energy-efficiency strategies and businesses.
Recent technological developments have made various many-core hardware platforms widely accessible. These massively parallel architectures have been used to significantly accelerate many computationally demanding tasks. In this paper, we show how the algorithms for LTL model checking can be redesigned to run efficiently on many-core GPU platforms. Our detailed experimental evaluation demonstrates that using the NVIDIA CUDA technology results in a significant speedup of the verification process. Together with state space generation based on a shared hash table and DFS exploration, our CUDA-accelerated model checker is the fastest among state-of-the-art shared-memory model checking tools.
Transform coding is commonly used in image processing to achieve high compression ratios, often at the expense of processing time and system simplicity. We have recently proposed a pixel value prediction scheme that exploits adjacent-pixel correlation, providing a low-complexity model for image coding. However, that model could not reach high compression ratios while retaining high quality of the reconstructed image. In this paper we propose a new segmentation algorithm which further exploits adjacent-pixel correlation, provides higher compression ratios, and is based on Hadamard transform coding. Additional compression is achieved by applying vector quantization with a low number of quantization levels and by simplifying the generalized Lloyd algorithm, where special attention is paid to the determination of optimal partitions for vector quantization, yielding a fixed quantizer. The proposed method is quite simple, and experimental results show that for very low bit rates it ensures a better or similar rate-distortion ratio compared with similar methods based on wavelet or curvelet transform coding and on support or core vector machines. Furthermore, since the proposed quantizers are fixed, the method requires very little processing time: much less than the aforementioned methods, and also much less than fractal image coding. Finally, a discussion is provided comparing the results with a scheme based on linear prediction and dual-mode quantization.
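As a hedged sketch of the transform step (not the authors' implementation), an unnormalized fast Walsh–Hadamard transform concentrates the energy of a highly correlated pixel block into a few coefficients, chiefly the DC term, which is what makes coarse quantization of the remainder cheap:

```python
def fwht(block):
    """Unnormalized fast Walsh-Hadamard transform of a sequence whose
    length is a power of two. Applying it twice returns n times the
    original values, since H * H = n * I for the Hadamard matrix H."""
    v = list(block)
    n, h = len(v), 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = v[j], v[j + h]
                v[j], v[j + h] = a + b, a - b
        h *= 2
    return v

# For a block of strongly correlated (nearly constant) pixel values,
# almost all energy lands in the DC coefficient coeffs[0] = sum(block);
# the remaining coefficients are small and cheap to quantize.
pixels = [10, 10, 11, 10, 10, 9, 10, 10]
coeffs = fwht(pixels)
```

The transform uses only additions and subtractions, consistent with the low-complexity goal stated above.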
This paper examines the success of an e-learning system in a company from the perspective of employees, using a multimethod approach. For this purpose the Moodle learning management system was used. The success of e-learning as an information system was evaluated using four constructs of the updated DeLone and McLean IS success model (system quality, use, user satisfaction and net benefits) plus one additional construct, user performance. The research combined observation and survey as two different data-collection methods, which allowed the new measure to be incorporated into the model. Empirical assessment was carried out by exploratory factor analysis, confirmatory factor analysis and structural equation modeling. The research model was found to be valid and reliable. The results provide an expanded understanding of the constructs that measure the success of an e-learning system, helping to understand more deeply the key success dimensions and their interrelationships. The implications of our work are discussed. The DeLone and McLean IS success model applied equally well in this setting; however, the use of observation as a data-collection method revealed weaknesses of the original model.