Combining accurate neural networks (NNs) whose errors are negatively correlated in an ensemble greatly improves generalization ability. Mixture of experts (ME) is a popular combining method that employs a special error function to train NN experts simultaneously so that they become negatively correlated. Although ME can produce negatively correlated experts, it lacks a control parameter, such as the one in the negative correlation learning (NCL) method, for adjusting this correlation explicitly. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, yielding the mixture of negatively correlated experts (MNCE). In the proposed method, the capability of the NCL control parameter is incorporated into the error function of ME, which enables its training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improve generalization. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed ensemble method significantly outperforms the original ensemble methods.
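The NCL control parameter mentioned above is the penalty coefficient (commonly written λ) that weights a correlation penalty inside each expert's error function. A minimal sketch of the standard NCL per-expert error, assuming a regression setting with a simple-average ensemble (the function name and variable names are illustrative, not from the paper):

```python
import numpy as np

def ncl_errors(preds, target, lam):
    """Per-expert NCL error: squared error plus a lambda-weighted
    negative-correlation penalty against the ensemble mean.
    For a simple average, p_i = (f_i - f_bar) * sum_{j != i}(f_j - f_bar)
    reduces to -(f_i - f_bar)^2."""
    preds = np.asarray(preds, dtype=float)
    f_bar = preds.mean()  # simple-average ensemble output
    errors = []
    for f_i in preds:
        # sum over the other experts' deviations from the ensemble mean
        others = preds.sum() - f_i - (len(preds) - 1) * f_bar
        p_i = (f_i - f_bar) * others
        errors.append((f_i - target) ** 2 + lam * p_i)
    return errors
```

With `lam = 0` each expert is trained independently; raising `lam` increasingly rewards experts for deviating from the ensemble mean in opposite directions, which is exactly the degree of freedom MNCE grafts onto the ME error function.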
In this article, we consider the project critical path problem in an environment with hybrid uncertainty. In this environment, the durations of activities are modeled as random fuzzy variables, which have probabilistic and fuzzy natures simultaneously. To obtain a robust critical path under this kind of uncertainty, a chance-constrained programming model is used. This model is converted to a deterministic model in two stages. In the first stage, the uncertain model is converted to a model with interval parameters using the alpha-cut method and distribution-function concepts. In the second stage, the interval model is converted to a deterministic model via robust optimization and the min-max regret criterion, and ultimately a genetic algorithm combined with a proposed exact algorithm is applied to solve the final model. Finally, some numerical examples are given to show the efficiency of the solution procedure.
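Underlying the model above is the classical critical-path computation: the longest path through the project's precedence DAG. A minimal sketch, assuming an activity-on-node network given as a predecessor map (names and data structures are illustrative, not from the article); evaluating it at the lower and upper bounds of interval durations gives the two extreme scenarios the min-max regret criterion compares:

```python
def critical_path_length(activities, durations):
    """Longest-path (critical-path) length in an activity-on-node DAG.
    `activities` maps activity name -> list of predecessor names;
    `durations` maps activity name -> duration."""
    finish = {}

    def finish_time(a):
        if a not in finish:
            preds = activities[a]
            earliest_start = max((finish_time(p) for p in preds), default=0)
            finish[a] = earliest_start + durations[a]
        return finish[a]

    return max(finish_time(a) for a in activities)
```

For interval durations, calling this once with all lower bounds and once with all upper bounds brackets the project makespan; the robust model then seeks the path whose worst-case regret across such scenarios is minimal.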
In this paper, a novel algorithm for image encryption based on a hash function is proposed. In our algorithm, a 512-bit external secret key is used as the input value of the Salsa20 hash function. First, the hash function is modified to generate a key stream that is more suitable for image encryption. Then the final encryption key stream is produced by correlating the key stream with the plaintext, resulting in both key sensitivity and plaintext sensitivity. This scheme can achieve high sensitivity, high complexity, and high security through only two rounds of the diffusion process. In the first round of diffusion, an original image is partitioned horizontally into an array consisting of 1,024 sections of size 8 × 8. In the second round, the same operation is applied vertically to the transpose of the obtained array. The main idea of the algorithm is to use the average of image data for encryption: to encrypt each section, the average of the other sections is employed. The algorithm therefore uses different averages when encrypting different input images (even with the same hash-based key sequence). This, in turn, significantly increases the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficient (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE) and decryption quality satisfy security and performance requirements (CC < 0.002177, PSNR < 8.4642, EQ > 204.8, entropy > 7.9974 and MAE > 79.35). The number of pixels change rate (NPCR) analysis reveals that when only one pixel of the plain-image is modified, almost all of the cipher pixels change (NPCR > 99.6125 %) and the unified average changing intensity is high (UACI > 33.458 %). Moreover, our proposed algorithm is very sensitive to small changes (e.g., modification of only one bit) in the external secret key (NPCR > 99.65 %, UACI > 33.55 %). It is shown that this algorithm yields better security performance than other algorithms reported in the literature.
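NPCR and UACI, the two differential-attack metrics quoted above, are standard and easy to compute from a pair of cipher-images that differ by one plain-image pixel. A minimal sketch for 8-bit images (the function name is illustrative, not from the paper):

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR: percentage of cipher pixels that differ between the two
    cipher-images; UACI: mean absolute pixel difference normalised by
    the 8-bit maximum. Both returned in percent."""
    c1 = np.asarray(c1, dtype=float)
    c2 = np.asarray(c2, dtype=float)
    npcr = 100.0 * np.mean(c1 != c2)
    uaci = 100.0 * np.mean(np.abs(c1 - c2)) / 255.0
    return npcr, uaci
```

Ideal values for a secure cipher are NPCR near 99.61 % and UACI near 33.46 %, which is why the reported NPCR > 99.6125 % and UACI > 33.458 % indicate strong diffusion.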
The ability to handle very large amounts of image data is important for image analysis, indexing and retrieval applications. Unfortunately, in the literature, scalability aspects are often ignored or glossed over, especially with respect to the intricacies of actual implementation details. In this paper we present a case study showing how a standard bag-of-visual-words image indexing pipeline can be scaled across a distributed cluster of machines. To achieve scalability, we investigate the optimal combination of hybridisations of the MapReduce distributed computation framework, which allows the components of the analysis and indexing pipeline to be effectively mapped to and run on modern server hardware. We then demonstrate the scalability of the approach in practice with a set of image analysis and indexing tools built on top of the Apache Hadoop MapReduce framework. The tools used for our experiments are freely available as open-source software, and the paper fully describes the nuances of their implementation.
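The final stage of such a pipeline is naturally expressed in the MapReduce idiom: mappers emit (visual word, image ID) postings for each image's quantised features, and reducers group postings into an inverted index. A minimal single-process sketch of that map/reduce structure, assuming features have already been quantised to integer visual-word IDs (function names are illustrative, not from the paper or the Hadoop API):

```python
from collections import defaultdict

def map_phase(image_id, visual_words):
    """Emit one (visual_word, image_id) posting per distinct
    quantised feature in the image."""
    return [(w, image_id) for w in set(visual_words)]

def reduce_phase(pairs):
    """Group postings by visual word into an inverted index:
    visual_word -> set of image IDs containing it."""
    index = defaultdict(set)
    for word, image_id in pairs:
        index[word].add(image_id)
    return dict(index)
```

In a real Hadoop job the shuffle phase performs the grouping between the two functions; the sketch only illustrates the data flow that makes the indexing step embarrassingly parallel.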
Adsorption isotherms for activated carbon made from pecan shells have been obtained at 25 °C and an approximate pH of 3 for a number of metal ion solutes. It was found that the Sips and Freundlich equations described the experimental data satisfactorily. The correlation of metal ion adsorption with the solute parameters of metal ion electronegativity and the first stability constant of the metal hydroxide was investigated. For most of the metal ions studied, higher electronegativities and stability constants corresponded to higher levels of adsorption onto the activated carbon. A correlation was developed that predicts the constants of the Freundlich equation from the selected parameters of the metal ions, and can thus predict the adsorption isotherms at constant pH. The developed correlation gives results with acceptable deviations from the experimental data. A procedure is proposed for obtaining similar correlations for different conditions (temperature, pH, carbon type and dosage). The ratio of equivalent metal ions adsorbed to protons released was calculated for the studied metal ions over a range of concentrations. In most cases, particularly at low concentrations, this ratio is close to one, confirming that ion exchange of one proton with one equivalent metal ion is the dominant reaction mechanism.
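The Freundlich equation referenced above has the form q = K·C^(1/n), so its constants K and 1/n can be recovered from isotherm data by a linear fit of log q against log C. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def fit_freundlich(C, q):
    """Least-squares fit of the linearised Freundlich isotherm
    log10(q) = log10(K) + (1/n) * log10(C); returns (K, n).
    C: equilibrium concentrations; q: equilibrium uptakes."""
    slope, intercept = np.polyfit(np.log10(C), np.log10(q), 1)
    return 10.0 ** intercept, 1.0 / slope
```

The paper's proposed correlation goes one step further and predicts K and n directly from the metal ion's electronegativity and first hydroxide stability constant, bypassing the fit.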
Cadmium, a highly toxic metal, is released into the environment through paper production, metal processing, phosphate fertilizers, insecticides, and wastewater treatment. Cadmium also inhibits bodily functions and is particularly toxic to the kidneys and other organs. In the current study, a zinc-based metal–organic framework, zeolitic imidazolate framework (ZIF)-8, was synthesized and modified with dimethylethylenediamine (ZIF-8-mmen) for the removal of cadmium. To optimize the experiments, response surface methodology was applied using a central composite design with three variables: pH, adsorbent dosage, and contact time. The optimum conditions for pH, dosage, and time were 2, 0.1 g, and 89 min, respectively, with a removal efficiency of 85.38%. The Langmuir isotherm (qm = 1000 mg/g) indicates monolayer adsorption. The kinetic studies reveal that the Lagergren (pseudo-first-order) model was predominant and that cadmium was not chemisorbed. Thermodynamic parameters show that the process is spontaneous, endothermic, and physisorptive.
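The Langmuir isotherm cited above models monolayer adsorption as q = qm·b·C / (1 + b·C), where qm is the monolayer capacity (here 1000 mg/g) and b the affinity constant. A minimal sketch (the value of b below is purely illustrative; the abstract does not report it):

```python
def langmuir_q(C, qm, b):
    """Langmuir monolayer uptake (mg/g) at equilibrium
    concentration C: q = qm * b * C / (1 + b * C)."""
    return qm * b * C / (1.0 + b * C)
```

As C grows, q saturates toward qm, which is the defining signature of monolayer coverage that distinguishes the Langmuir model from the unbounded Freundlich form.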
Mapping vulnerability to Saltwater Intrusion (SWI) in coastal aquifers is studied in this paper using the GALDIT framework, but with the novelty of transforming the concept of vulnerability indexing into risk indexing. GALDIT is an acronym for six data layers, which are put together consensually to capture vulnerability to the intrusion of saltwater into freshwater aquifers. It is a scoring system of prescribed rates to account for local variations and prescribed weights to account for the relative importance of each data layer, but both suffer from subjectivity. Another novelty of the paper is the use of fuzzy logic to learn rate values and catastrophe theory to learn weight values; together these are implemented as the Fuzzy-Catastrophe Scheme (FCS). The GALDIT data layers are divided into two groups, Passive Vulnerability Indices (PVI) and Active Vulnerability Indices (AVI), whose sum is the Total Vulnerability Index (TVI), equivalent to GALDIT. Two additional data layers (pumping and water table decline) are also introduced to serve as the Risk Actuation Index (RAI). The product of TVI and RAI yields the Risk Indices (RI). The paper applies these new concepts to a study area subject to groundwater decline and a possible saltwater intrusion problem. The results provide a proof of concept for PVI, AVI, RAI and RI by studying their correlation with groundwater quality samples using the fraction of saltwater (fsea), Groundwater Quality Indices (GQI) and the Piper diagram. Significant correlations between the appropriate values are found, providing new insight into the study area.
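The conventional GALDIT score that FCS replaces with learned values is a weighted average of per-layer rates. A minimal sketch of that aggregation, assuming one rate per layer at a given map cell (the function name is illustrative; in the standard GALDIT formulation the six weights are fixed in advance, whereas the paper learns them via catastrophe theory):

```python
def galdit_index(rates, weights):
    """Weighted-average vulnerability index over the six GALDIT
    layers: sum(w_i * r_i) / sum(w_i)."""
    if len(rates) != len(weights):
        raise ValueError("one rate and one weight per data layer")
    return sum(w * r for w, r in zip(weights, rates)) / sum(weights)
```

The paper's risk index then multiplies this aggregate (TVI) by the actuation term (RAI) built from the pumping and water-table-decline layers.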
In this work, the transport properties of gaseous penetrants through several dense glassy polymeric membranes are studied. The nonequilibrium lattice fluid (NELF) model, in conjunction with the modified Fick's law and the dual-mode sorption model, was used to simulate gas transport in glassy polymeric membranes. The approach is based on the sorption-diffusion mechanism, in which solubility is calculated from the NELF model and the diffusion coefficient is obtained from the product of the thermodynamic coefficient and the molecular mobility. The governing equation is solved by the finite element method using COMSOL Multiphysics software. The developed model for the gas permeability of glassy polymeric membranes can be applied over a wide range of pressures and temperatures. Comparison of the calculated permeability and solubility of gases with experimental data demonstrates the capability of the developed model. Increasing the feed gas temperature increases the gas permeability, while it lowers the gas solubility in the glassy polymeric membranes. The effect of feed temperature and pressure on permeability and solubility is investigated, and experimental data from the literature are described by the developed model. Good prediction of the experimental data is observed over the conditions considered.
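The dual-mode sorption model referenced above combines Henry's-law dissolution in the polymer matrix with Langmuir-type hole filling in the nonequilibrium excess free volume of the glass: C = kD·p + C'H·b·p / (1 + b·p). A minimal sketch (parameter values in the test are purely illustrative, not from the paper):

```python
def dual_mode_sorption(p, kD, CH, b):
    """Dual-mode sorbed concentration at pressure p:
    Henry's-law term kD*p plus the Langmuir hole-filling term
    CH*b*p / (1 + b*p), with capacity CH and affinity b."""
    return kD * p + CH * b * p / (1.0 + b * p)
```

Dividing the sorbed concentration by pressure gives the apparent solubility coefficient, whose decrease with temperature (as b and CH fall) is consistent with the reported drop in solubility at higher feed temperatures.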