Changes in the operational environment of the process industry, such as decreasing selling prices, increased competition between companies and new legislation, set requirements for the performance and effectiveness of industrial production lines and processes. As the basis of this study, a life cycle profit (LCP) model of a pulp process was constructed using different kinds of process information, including chemical consumption and the production levels of material and energy flows in unit processes. However, not all the information needed to create a relevant LCP model was directly provided by the plant's information systems. In this study, neural networks were used to model the pulp bleaching process, to fill in the missing information and, furthermore, to create estimators for alkaline chemical consumption. A data-based modelling approach was applied to an example in which the factors affecting sodium hydroxide consumption in the bleaching stage were determined. The results showed that raw process data can be refined into new, valuable information using computational methods, thereby improving the accuracy of life cycle profit models.
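As a rough illustration of the data-based modelling approach described above, the sketch below fits a small neural network to synthetic data standing in for bleaching-stage measurements. The feature choices, network size and data are invented for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical process features (e.g. production rate, incoming-pulp quality),
# scaled to [0, 1]; the "true" alkali consumption below is synthetic.
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = 0.7 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.05 * rng.normal(size=200)

# One-hidden-layer network trained by plain full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

lr = 0.1
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                         # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err[:, None] / len(X)
    gb2 = err.mean(keepdims=True)
    dh = err[:, None] @ W2.T * (1 - h ** 2)  # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
print(f"training MSE: {mse:.4f}")
```

A fitted estimator of this kind can then supply the consumption figures that the plant's information systems do not record directly.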
Artificial immune system (AIS) algorithms use antibodies that fully specify the solution of an optimization, learning, or pattern recognition problem. By being restricted to fully specified antibodies, an AIS algorithm cannot exploit schemata or classes of partial solutions, even though partial solutions can considerably speed the emergence of a complete, high-quality solution in many problems. To exploit schemata in artificial immune systems, this paper presents a novel algorithm that combines a traditional artificial immune system with a symbiotic combination operator. The algorithm starts its search with partially specified antibodies and gradually builds increasingly specified solutions until it finds complete answers. The algorithm is compared with the CLONALG algorithm on several multimodal function optimization and combinatorial optimization problems; it is shown to be faster than CLONALG on most problems and able to find solutions to problems on which CLONALG fails entirely.
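A minimal sketch of the schema idea behind symbiotic combination follows; the don't-care encoding, the tiny population and the OneMax objective are toy assumptions for illustration, not the paper's actual operator or benchmarks:

```python
L = 8  # problem length; '*' marks an unspecified (don't-care) position

def compatible(a, b):
    # Two partial antibodies can combine if no position is specified by both.
    return all(x == '*' or y == '*' for x, y in zip(a, b))

def combine(a, b):
    # Symbiotic combination: merge the specified positions of both partners.
    return ''.join(y if x == '*' else x for x, y in zip(a, b))

def fitness(a):
    # OneMax on fully specified antibodies (toy stand-in for a real objective).
    return a.count('1')

# Partially specified antibodies, each fixing only some of the L positions.
pop = ['1*1*0*1*', '*1*0*1*1', '11**00**', '**11**11']
best = None
for a in pop:
    for b in pop:
        if a is not b and compatible(a, b):
            c = combine(a, b)
            if '*' not in c and (best is None or fitness(c) > fitness(best)):
                best = c
print("best combined antibody:", best, "fitness:", fitness(best))
```

The point of the sketch is that good partial antibodies (schemata) can be evaluated and assembled into complete solutions, which is exactly what fully specified representations cannot do.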
High-performance polymers are an important class of materials used in challenging conditions, such as aerospace applications. Until now, 3D printing of such polymers based on stereolithography processes could not be performed owing to a lack of suitable materials. Here, new materials and printing compositions are reported that enable 3D printing of objects with extremely high thermal resistance, a Tg of 283 °C, and excellent mechanical properties. The printing is performed with a low-cost digital light processing (DLP) printer, and the formulation is based on a dual-cure mechanism combining a photo and a thermal process. The main components are a molecule bearing both epoxy and acrylate groups, an alkylated melamine that enables a high degree of crosslinking, and a soluble silica precursor. The resulting objects are made of hybrid materials in which the silicon is present partly in the polymeric backbone and partly as reinforcing silica particles.
In the modern era, big data is characterized by massive datasets with complex and varied structures. These attributes create obstacles to analyzing and storing the data and to generating useful results, and privacy and security are major concerns in the domain of large-scale data analysis. In this paper, our foremost priority is the computing technologies that centre on big data: the Internet of Things (IoT), cloud computing, blockchain, and fog computing. Among these, cloud computing provides on-demand services to customers while optimizing cost; AWS, Azure, and Google Cloud are the major cloud providers today. Fog computing extends cloud computing systems by bringing services to the edges of the network, and the Internet of Things realises this in collaboration with multiple technologies, addressing the complexity of delivering advanced services across varied application domains. Blockchain is a distributed ledger that supports many applications, ranging from cryptocurrency to smart contracts. The purpose of this paper is to present a critical analysis and review of these technologies under the umbrella of existing large-scale data systems. We survey prior critiques and address the existing threats to the security of large-scale data systems. Moreover, we scrutinize security attacks on computing systems based on cloud, blockchain, IoT, and fog technologies. The paper clearly illustrates the behaviour of different threats and their impacts on these complementary computational technologies. The authors also provide a focused analysis of cloud-based technologies, discussing their defence mechanisms and the security issues of mobile healthcare.
The World Wide Web (WWW) comprises a vast range of information and operates mainly on the principle of keyword matching, which often reduces the accuracy of information retrieval. Automatic query expansion is one of the primary methods in information retrieval; it handles the vocabulary mismatch problem often faced by retrieval systems when retrieving appropriate documents from keywords. This paper proposes a novel hybrid COOT-based Cat and Mouse Optimization (CMO) algorithm, named hybrid COOT-CMO, for the selection of optimal candidate terms in the automatic query expansion process. To improve the accuracy of the CMO algorithm, its parameters are tuned with the help of the COOT algorithm. The best expanded query is identified from the available set of expanded queries, also known as the candidate query pool; all feasible combinations in this pool are obtained from the top retrieved documents. Benchmark datasets such as the GOV2 Test Collection, the Cranfield Collection, and the NTCIR Test Collection are used to assess the performance of the proposed hybrid COOT-CMO method for automatic query expansion. The proposed method surpasses existing state-of-the-art techniques on several performance measures, including F-score, precision, and mean average precision (MAP).
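Mean average precision, one of the measures used above, can be illustrated with a short sketch; the document IDs and relevance judgements below are invented for illustration:

```python
def average_precision(ranked, relevant):
    """AP for one query: mean of precision@k taken at each relevant hit."""
    hits, precisions = 0, []
    for k, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / len(relevant) if relevant else 0.0

# Hypothetical rankings returned for two queries, with their relevant sets.
runs = [
    (["d3", "d1", "d7", "d2"], {"d1", "d2"}),
    (["d5", "d4", "d9"], {"d5"}),
]
ap_values = [average_precision(ranked, rel) for ranked, rel in runs]
mean_ap = sum(ap_values) / len(ap_values)
print(f"MAP = {mean_ap:.3f}")  # MAP = 0.750
```

MAP rewards rankings that place relevant documents early, which is why it is a standard yardstick for query-expansion methods.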
An environmentally friendly and rapid procedure was developed to synthesise silver nanoparticles (Ag-NPs) using Chamaemelum nobile extract and to evaluate their in vivo anti-inflammatory and antioxidant activities. The ultraviolet-visible absorption spectrum of the synthesised Ag-NPs showed an absorbance peak at 422 nm. Transmission electron microscopy revealed spherical nanoparticles with an average size of 24 nm. Fourier transform infrared spectroscopy supported the presence of biologically active compounds involved in the reduction of Ag ions, and X-ray diffraction confirmed the crystalline structure of the metallic Ag. The anti-inflammatory and antioxidant activities of the Ag-NPs were investigated against carrageenan-induced paw oedema in mice. The levels of malondialdehyde (MDA), the antioxidant enzymes superoxide dismutase, catalase and glutathione peroxidase, and the inflammatory cytokines tumour necrosis factor (TNF-α), interferon gamma, interleukin (IL)-6 and IL-1β were assessed. The results demonstrated that the anti-inflammatory activity of the Ag-NPs might be due to their ability to reduce IL-1β, IL-6 and TNF-α. Moreover, the reduction of antioxidant enzymes along with an increase in MDA level shows that the anti-inflammatory activity of the Ag-NPs synthesised by C. nobile is attributable to their ameliorating effect on oxidative damage.
Inspec keywords: silver, nanoparticles, nanofabrication, ultraviolet spectra, visible spectra, particle size, transmission electron microscopy, Fourier transform infrared spectra, X-ray diffraction, crystal structure, enzymes, molecular biophysics, tumours, biomedical materials, nanomedicine
Other keywords: Chamaemelum nobile extract, oxidative stress, mice paw, silver nanoparticles, antiinflammatory activity, antioxidant activity, ultraviolet-visible absorption spectrum, spherical nanoparticle size, transmission electron microscopy, Fourier transform infrared spectroscopy, biological active compounds, X-ray diffraction, crystalline structure, carrageenan-induced paw oedema, malondialdehyde, antioxidant enzymes, superoxide dismutase, catalase, glutathione peroxidase, inflammatory cytokines, tumour necrosis factor, interferon gamma, interleukin, IL-1β, IL-6, TNF-α, MDA level, Ag
In this article, a new population-based algorithm for real-parameter global optimization is presented, denoted self-organizing centroids optimization (SOC-opt). The proposed method uses a stochastic approach based on the sequential learning paradigm of self-organizing maps (SOMs). A modified version of the SOM is proposed in which each cell contains an individual that searches for a locally optimal solution while also being influenced by the search for a global optimum. The movement of the individuals in the search space is governed by a discrete-time dynamic filter, and different choices of this filter yield different centroid dynamics. In this way, a general framework is defined in which well-known algorithms appear as particular cases. The proposed algorithm is validated on a set of problems, including non-separable problems, and compared with state-of-the-art algorithms for global optimization.
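A toy sketch of centroid movement driven by a discrete-time dynamic filter follows. The first-order filter, its coefficients, the noise term and the sphere objective are illustrative assumptions, not the paper's exact dynamics; the point is only that the filter choice determines how centroids move:

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):
    """Toy objective: global minimum 0 at the origin."""
    return float(np.sum(x ** 2))

# One centroid per cell. Movement follows a first-order discrete-time filter
# (one possible choice; other filters give other dynamics):
#   v[k+1] = a*v[k] + b*(attractor - x[k]) + noise,   x[k+1] = x[k] + v[k+1]
a, b, sigma = 0.6, 0.4, 0.1
n, dim = 5, 2
X = rng.uniform(-5, 5, size=(n, dim))
V = np.zeros_like(X)

initial_best = min(sphere(x) for x in X)
best_seen = initial_best
for _ in range(200):
    gbest = min(X, key=sphere).copy()   # current best centroid as attractor
    V = a * V + b * (gbest - X) + sigma * rng.normal(size=X.shape)
    X = X + V
    best_seen = min(best_seen, min(sphere(x) for x in X))

print(f"initial best: {initial_best:.3f}, best seen: {best_seen:.3f}")
```

With a = 0 and no noise the filter reduces to a pure attraction step, while momentum-style terms such as a*v[k] recover dynamics reminiscent of particle-swarm updates, which is the sense in which known algorithms become particular cases of one filter framework.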
Electroencephalography (EEG) is widely used in a variety of research and clinical applications, including the localization of active brain sources. Brain source localization provides useful information for understanding the brain's behaviour and for cognitive analysis. Various source localization algorithms have been developed to determine the exact locations of the active brain sources that generate the measured electromagnetic activity. These algorithms are based on digital filtering, 3D imaging, array signal processing and Bayesian approaches, and, according to the spatial resolution they provide, are categorized as either low-resolution or high-resolution methods. In this study, EEG data were collected from healthy subjects presented with a visual stimulus. The finite difference method (FDM) was used for head modelling to solve the forward problem. Low-resolution brain electromagnetic tomography (LORETA) and standardized LORETA (sLORETA) were used as inverse modelling methods to localize the regions of the brain active during the stimulus. The results are presented as MRI images, with tables describing the estimated current levels for the inverse methods used; a higher current value or intensity indicates stronger electromagnetic activity for a particular source at a given time instant. The results demonstrate that the standardized method based on the second-order Laplacian (sLORETA), in conjunction with FDM head modelling, outperforms the other methods in terms of source estimation, as it yields a higher current level and thus a higher current density (J) for an area compared with the others.
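The inverse step can be illustrated with a minimum-norm estimate followed by an sLORETA-style standardization. The matrix sizes, regularization value and simulated data below are assumptions for illustration; in practice the lead-field matrix comes from the FDM head model, not from random numbers:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical lead-field matrix L (sensors x sources).
n_sens, n_src = 16, 40
L = rng.normal(size=(n_sens, n_src))

# Simulate EEG produced by a single active source plus a little noise.
true_src = 7
j_true = np.zeros(n_src)
j_true[true_src] = 1.0
y = L @ j_true + 0.001 * rng.normal(size=n_sens)

# Regularized minimum-norm inverse: j_hat = L' (L L' + alpha I)^{-1} y
alpha = 1.0
Ginv = np.linalg.inv(L @ L.T + alpha * np.eye(n_sens))
j_mne = L.T @ Ginv @ y

# sLORETA-style standardization by the resolution-matrix diagonal.
R = L.T @ Ginv @ L
power = j_mne ** 2 / np.diag(R)

print("estimated source index:", int(np.argmax(power)))
```

The standardization step is what distinguishes sLORETA from a plain minimum-norm solution: dividing by the resolution-matrix diagonal corrects the depth bias of the raw estimate.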
Process capability indices such as Cp are used extensively in manufacturing industries to assess processes when making purchasing decisions. In practice, the parameters needed to calculate Cp are rarely known and are frequently replaced with estimates from an in-control reference sample. This article explores the optimal sample size required to achieve a desired estimation error, measured as the absolute percentage error of different Cp estimators. In addition, some practical tools are provided to allow practitioners to determine the sample size in different situations.
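The index itself is Cp = (USL - LSL) / (6σ). A short simulation sketch (the specification limits, process parameters and replication count are illustrative assumptions, not the article's setup) shows how the absolute percentage error of the estimated Cp shrinks as the reference-sample size grows:

```python
import numpy as np

rng = np.random.default_rng(4)

USL, LSL, sigma = 10.0, 4.0, 1.0
cp_true = (USL - LSL) / (6 * sigma)    # here Cp = 1.0

def mean_ape(n, reps=2000):
    """Mean absolute percentage error of the Cp estimate at sample size n."""
    apes = []
    for _ in range(reps):
        # Replace the unknown sigma with the sample standard deviation.
        s = rng.normal(0.0, sigma, size=n).std(ddof=1)
        cp_hat = (USL - LSL) / (6 * s)
        apes.append(abs(cp_hat - cp_true) / cp_true)
    return float(np.mean(apes))

for n in (10, 30, 100):
    print(f"n = {n:3d}: mean APE = {mean_ape(n):.3f}")
```

Running curves like this one backwards (finding the smallest n whose mean APE falls below a target) is one plausible way to read the sample-size question the article addresses.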