The suitability of computational simulation of the Belousov–Zhabotinskii oscillating chemical reaction by differential kinetic methodology for resolving nonlinear multi-component systems is demonstrated in this work. Based on the Field–Körös–Noyes mechanism and the Oregonator model, the changes in the concentrations of HBrO2, bromide ion, and cerium ion are simulated. The simulation results agree well with experimental results. The effects of variables and parameters on the oscillation curve, especially those of the rate constants, are investigated in depth. A simple method for estimating rate constants is obtained by simulating the concentrations of key components of the system and comparing the simulation results with the experimental ones. Reasonable rate constants are also proposed.
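The kind of simulation this abstract describes can be sketched with the standard scaled three-variable Oregonator and a stiff ODE solver. This is a minimal illustration, not the authors' fitted model: the parameter values `EPS`, `DELTA`, `Q`, `F` are common textbook choices, not the rate constants estimated in the work.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Scaled (dimensionless) Oregonator: x ~ [HBrO2], y ~ [Br-], z ~ [Ce(IV)].
# Parameter values are illustrative textbook choices (assumption), not the
# constants fitted to the experiments described in the abstract.
EPS, DELTA, Q, F = 9.90e-3, 1.99e-5, 7.62e-5, 1.0

def oregonator(t, u):
    x, y, z = u
    dx = (Q * y - x * y + x * (1.0 - x)) / EPS
    dy = (-Q * y - x * y + F * z) / DELTA
    dz = x - z
    return [dx, dy, dz]

# The system is stiff, so an implicit solver is required.
sol = solve_ivp(oregonator, (0.0, 50.0), [1e-4, 1e-4, 1e-4],
                method="Radau", rtol=1e-8, atol=1e-10)
print(sol.success)
```

Varying a rate-constant-like parameter (e.g. `F` or `Q`) and comparing the resulting oscillation curves against experiment is the essence of the estimation procedure the abstract outlines.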
In single particle analysis, two-dimensional (2-D) alignment is a fundamental step intended to put into register various particle projections of biological macromolecules collected at the electron microscope. The efficiency and quality of three-dimensional (3-D) structure reconstruction largely depend on the computational speed and alignment accuracy of this crucial step. To improve the performance of alignment, we introduce a new method that takes advantage of the highly accurate interpolation scheme based on the gridding method, a version of the nonuniform fast Fourier transform, and utilizes a multi-dimensional optimization algorithm for the refinement of the orientation parameters. Using simulated data, we demonstrate that by using less than half of the sample points and taking twice the runtime, our new 2-D alignment method achieves dramatically better alignment accuracy than that based on quadratic interpolation. We also apply our method to image-to-volume registration, the key step in the single particle EM structure refinement protocol. We find that in this case the accuracy of the method not only surpasses the accuracy of the commonly used real-space implementation, but results are achieved in much shorter time, making gridding-based alignment a perfect candidate for efficient structure determination in single particle analysis.
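The refinement stage mentioned here, treating alignment as a continuous multi-dimensional optimization over rotation and shift, can be illustrated with a toy example. This sketch uses generic cubic interpolation and a standard optimizer; it is a stand-in for the idea only, and does not show the gridding (NUFFT) interpolation that the paper's accuracy gains come from. All names and parameter choices below are hypothetical.

```python
import numpy as np
from scipy.ndimage import rotate, shift
from scipy.optimize import minimize

# Toy 2-D alignment: recover the rotation angle and shift mapping a
# reference image onto a transformed copy by minimizing the squared
# difference with a derivative-free multi-dimensional optimizer.
rng = np.random.default_rng(1)
ref = np.zeros((64, 64))
ref[20:44, 28:36] = 1.0                      # bar-shaped toy "particle"
ref += 0.01 * rng.standard_normal(ref.shape)

true_angle, true_shift = 12.0, (2.5, -1.5)
img = shift(rotate(ref, true_angle, reshape=False, order=3),
            true_shift, order=3)

def cost(p):
    # Apply trial orientation parameters (angle, dy, dx) to the reference.
    a, sy, sx = p
    trial = shift(rotate(ref, a, reshape=False, order=3), (sy, sx), order=3)
    return np.sum((trial - img) ** 2)

res = minimize(cost, x0=[10.0, 0.0, 0.0], method="Powell")
print(np.round(res.x, 1))
```

In real single-particle pipelines the objective is typically a cross-correlation evaluated in Fourier space rather than a direct pixel-wise difference, which is where the interpolation accuracy of gridding matters.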
A consistent scheme for the design of diffractive phase elements (DPE's) that integrate several optical functions is presented, based on the general theory of amplitude-phase retrieval and the Yang–Gu algorithm [Appl. Opt. 33, 209 (1994)]. We extend the original Yang–Gu algorithm to treat a system illuminated by a beam of incident light whose components are at different wavelengths, and a set of equations for determining the phase distribution of the DPE is derived. The profile of a surface-relief DPE can be designed with an iterative algorithm. Numerical simulations are carried out for the design of one-dimensional DPE's capable of both demultiplexing different wavelength components and focusing each partial wave at predetermined positions. The influence on design results of extending the sampling points in the DPE's from ideal geometric points to physical spots is also investigated. The numerical simulation results show that the new algorithm can be used successfully to design the desired DPE's. It is therefore expected to be useful in the design of DPE's for micro-optical systems.
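The iterative design loop referred to here can be illustrated in its simplest monochromatic form: a plain Gerchberg–Saxton iteration, which is the basic special case that the Yang–Gu amplitude-phase retrieval theory generalizes. The sketch below designs a 1-D phase profile that focuses a uniform beam into two off-axis spots; it is an assumption-laden toy, not the multi-wavelength Yang–Gu algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256

# Target far field: two off-axis spots (a 1-D analogue of the focusing
# task in the abstract). Uniform-amplitude illumination is assumed.
input_amp = np.ones(N)
target = np.zeros(N)
target[N // 4] = 1.0
target[3 * N // 4] = 1.0

phase = rng.uniform(0.0, 2.0 * np.pi, N)       # random starting phase
for _ in range(200):
    far = np.fft.fft(input_amp * np.exp(1j * phase))
    far = target * np.exp(1j * np.angle(far))  # impose target amplitude
    near = np.fft.ifft(far)
    phase = np.angle(near)                     # keep phase, restore amplitude

# Fraction of output energy landing in the two target spots.
out = np.abs(np.fft.fft(input_amp * np.exp(1j * phase))) ** 2
eff = (out[N // 4] + out[3 * N // 4]) / out.sum()
print(round(eff, 3))
```

Handling several wavelengths simultaneously, as in the paper, requires propagating each spectral component with its own kernel and coupling the constraints, which is exactly what the derived set of equations provides.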
Under isothermal and linear heating conditions, the thermal stability of the three-dimensional metallic glass Ni68B21Si11, produced by rapid quenching of the denucleated melt (RQDM), has been systematically studied using PE DSC7 differential scanning calorimetry in relation to denucleation of the liquid alloy prior to rapid quenching, pre-annealing treatment of amorphous specimens, and cooling rate. The following results were observed. First, the thermal stability of metallic glass prepared by RQDM is markedly enhanced because pre-existing nuclei are removed in advance. This is substantiated by experimental data showing that the projected life of the three-dimensional metallic glass Ni68B21Si11 is increased by an order of magnitude at 400 K. Secondly, pre-annealing treatment of the amorphous alloy leads to a reduction of the onset crystallization temperature, Tx, and the heat of crystallization, ΔH. Finally, quenching rates have little effect on the thermal stability of the amorphous alloys.
Wireless Personal Communications - With the rapid development of information technology, issues such as network security and privacy protection have attracted more and more attention. The...
Engineering with Computers - Aerated flow characterized by complex mass transfer processes with multiple hydraulic properties is a common enviro-hydraulics phenomenon, which has a variety of...
The Journal of Supercomputing - In edge computing, service placement refers to the process of installing service platforms, databases, and configuration files corresponding to computing tasks...
The heavy reliance on data is one of the major factors currently limiting the development of deep learning. Data quality directly determines the effectiveness of deep learning models, and long-tailed distribution is one of the factors degrading data quality. The long-tailed phenomenon is prevalent because power laws are ubiquitous in nature. In this case, the performance of deep learning models is often dominated by the head classes, while learning of the tail classes is severely underdeveloped. To enable adequate learning for all classes, many researchers have studied and preliminarily addressed the long-tailed problem. In this survey, we focus on the problems caused by long-tailed data distributions, catalogue the representative long-tailed visual recognition datasets, and summarize the mainstream long-tailed studies. Specifically, we group these studies into ten categories from the perspective of representation learning and outline the highlights and limitations of each category. In addition, we study four quantitative metrics for evaluating imbalance and suggest using the Gini coefficient to evaluate the long-tailedness of a dataset. Based on the Gini coefficient, we quantitatively study 20 widely used, large-scale visual datasets proposed in the last decade and find that the long-tailed phenomenon is widespread and has not been fully studied. Finally, we suggest several future directions for the development of long-tailed learning to provide readers with more ideas.
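The Gini coefficient proposed here as a long-tailedness metric is straightforward to compute from per-class sample counts. A minimal sketch using the standard mean-absolute-difference formula (the example class counts are invented for illustration):

```python
import numpy as np

def gini(counts):
    """Gini coefficient of per-class sample counts:
    0 for a perfectly balanced dataset, approaching 1 as the
    distribution becomes extremely long-tailed."""
    c = np.sort(np.asarray(counts, dtype=float))   # ascending
    n = c.size
    # G = sum_i (2i - n - 1) * c_i / (n * sum(c)), i = 1..n
    index = np.arange(1, n + 1)
    return float(np.sum((2 * index - n - 1) * c) / (n * c.sum()))

balanced = [1000] * 10                              # uniform classes
powerlaw = [int(1000 * 0.5 ** k) for k in range(10)]  # head-heavy counts
print(round(gini(balanced), 3))  # 0.0
print(round(gini(powerlaw), 3))  # 0.704
```

Because the metric depends only on the sorted count vector, it can be compared directly across datasets with different numbers of classes and total sizes.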
Neural Computing and Applications - Existing data race detection approaches based on deep learning suffer from the problems of unique feature extraction and low accuracy. To this end, this...