The effects of three salts, NaF, KCl, and NaCl, on the properties of NiFe2O4 nanoparticles prepared by salt-assisted solution combustion synthesis (SSCS) have been investigated. The synthesized powders were characterized by SEM, TEM, FTIR, XRD, and VSM analysis. The specific surface area (SSA), as well as the size distribution and pore volume of the NiFe2O4 powders, was determined by BET analysis. Visual observations showed that the intensity and duration of the combustion synthesis were strongly influenced by the salt type. The highest crystallinity was observed in the powder synthesized with NaCl. The SSA also correlated strongly with the salt type: values of about 91.62, 64.88, and 47.22 m2 g-1 were obtained for the powders synthesized with KCl, NaCl, and NaF, respectively. Although the magnetic hysteresis loops showed soft ferromagnetic behavior of the NiFe2O4 nanoparticles in all conditions, KCl produced the particles with the lowest coercivity and remanent magnetization. Based on the present study, the salt type is a key parameter in the SSCS process for the preparation of spinel ferrites. Thermodynamic evaluation also showed that the melting point and heat capacity of the salt are important criteria for its proper selection.
In this paper, we present a pipeline architecture specifically designed to process and compress DNA microarray images. Many pixelated image generation methods produce one row of the image at a time. The proposed pipeline fully exploits this property: it takes in one row of the produced image at each clock pulse and performs the necessary image processing steps on it. This removes the current need for sluggish software routines, which are considered a major bottleneck in microarray technology. Moreover, two different structures are proposed for compressing DNA microarray images. The proposed architecture is shown to be highly modular, scalable, and well suited to a standard-cell VLSI implementation.
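The row-at-a-time idea above can be sketched in software as a chain of generators, each consuming one row per "clock" and passing it on, so no full image is ever buffered. This is a toy model only; the paper describes a hardware VLSI pipeline, and the stage names and operations below are illustrative assumptions.

```python
def denoise(rows):
    # Toy pre-processing stage: clamp each pixel into the 8-bit range.
    for row in rows:
        yield [min(255, max(0, p)) for p in row]

def threshold(rows, t=128):
    # Toy segmentation stage: binarize each row against threshold t.
    for row in rows:
        yield [1 if p >= t else 0 for p in row]

def compress(rows):
    # Toy compression stage: run-length encode each binary row.
    for row in rows:
        encoded, run, last = [], 0, row[0]
        for p in row:
            if p == last:
                run += 1
            else:
                encoded.append((last, run))
                last, run = p, 1
        encoded.append((last, run))
        yield encoded

image = [[0, 0, 200, 200], [255, 255, 0, 0]]
pipeline = compress(threshold(denoise(iter(image))))
for out_row in pipeline:
    print(out_row)
```

Because each stage yields as soon as a row is ready, the structure mirrors a hardware pipeline: stage latency is one row, and throughput is one row per step.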
This paper addresses a new concept for generating bills of quantities (B.o.Qs) from AutoCAD drawings of a building project, demonstrating an application of the Industry Foundation Classes (IFC) developed by the International Alliance for Interoperability (IAI). The procedure considers the material types and structural shapes in the AutoCAD drawings to compute the cost of the structural skeleton elements through interactive automation. The main concept focuses on layer computation of the AutoCAD drawing after converting it into the Drawing Exchange Format (DXF). Once the coordinates are detected, the area and volume of any structural shape, including circles and polygons, can be determined. The extraction method is a new technique for structural engineers and quantity surveyors to estimate the required material for beams, columns, slabs, and foundations. The algorithm extracts and recognizes the layers and objects from a two-dimensional DXF drawing along with the coordinate information. The results obtained with this technique are more accurate and reliable than manual procedures or other traditional techniques. In this paper, an automated and interactive procedure for B.o.Q computation is demonstrated. The process involves a user-friendly interface, dynamic linking to the structural drawings, and simultaneous tracking of B.o.Q modifications.
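One step of the procedure above, computing an area from vertex coordinates extracted from a DXF file, can be sketched with the shoelace formula. The coordinates below are hypothetical sample data, not values parsed from a real drawing.

```python
def polygon_area(vertices):
    """Area of a simple closed polygon given as a list of (x, y) vertices,
    via the shoelace formula."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the polygon
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Example: a 4 m x 2 m rectangular slab outline (hypothetical data).
slab = [(0, 0), (4, 0), (4, 2), (0, 2)]
print(polygon_area(slab))  # 8.0
```

Multiplying such an area by a slab thickness read from the drawing layer would give the concrete volume for that element, which is the kind of quantity a B.o.Q entry needs.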
Requirements Engineering - To reduce program risks, engineering methods capitalizing on modeling and machine assistance have been extensively investigated within systems engineering (and more...
With the growing use of roadheaders worldwide and their significant role in the successful accomplishment of tunneling projects, it is necessary to accurately predict the performance of these machines in different ground conditions. Moreover, shortcomings in the existing prediction models make further research on new models necessary. This paper attempts to model the roadheader performance rate based on geotechnical and geological site conditions. To this end, an artificial neural network (ANN), a powerful tool for modeling and recognizing sophisticated structures in data, is employed to model the relationship between roadheader performance and the parameters influencing the tunneling operations with a high correlation. The database used in modeling is compiled from laboratory studies conducted at Azad University, Science and Research Branch, Tehran, Iran. A model with a 4-10-1 architecture trained by the back-propagation algorithm is found to be optimum. A multiple variable regression (MVR) analysis is also performed for comparison with the neural network. The results demonstrate that the predictive capability of the ANN model is better than that of the MVR model. It is concluded that roadheader performance can be accurately predicted as a function of unconfined compressive strength, Brazilian tensile strength, rock quality designation, and alpha angle (R2 = 0.987). Sensitivity analysis reveals that the parameter most affecting roadheader performance is the unconfined compressive strength.
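The 4-10-1 architecture mentioned above means four inputs (the UCS, BTS, RQD, and alpha-angle features), ten hidden units, and one output (the performance rate). A minimal forward-pass sketch is shown below; the weights are random placeholders, not the trained values from the paper, and the sigmoid hidden activation is an assumption.

```python
import math
import random

random.seed(0)
# Placeholder weights for a 4-10-1 network (NOT the paper's trained model).
W1 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(10)]
b1 = [random.uniform(-1, 1) for _ in range(10)]
W2 = [random.uniform(-1, 1) for _ in range(10)]
b2 = random.uniform(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x):
    """Forward pass: x = [ucs, bts, rqd, alpha], assumed scaled to [0, 1]."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    # Linear output unit: a single predicted performance-rate value.
    return sum(w * h for w, h in zip(W2, hidden)) + b2

print(predict([0.5, 0.3, 0.8, 0.2]))
```

Training by back-propagation would adjust W1, b1, W2, and b2 to minimize the squared error between this output and the measured performance rates in the database.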
Automatic key concept identification from text is a central challenge in information extraction, information retrieval, digital libraries, ontology learning, and text analysis. The main difficulty lies in the text data itself: noise, diversity, scale, context dependency, and word-sense ambiguity. To cope with this challenge, numerous supervised and unsupervised approaches have been devised. The existing topical-clustering approaches to keyphrase extraction are domain dependent and overlook semantic similarity between candidate features while extracting topical phrases. In this paper, a semantics-based unsupervised approach (KP-Rank) is proposed for keyphrase extraction. The proposed approach exploits Latent Semantic Analysis (LSA) and clustering techniques, and introduces a novel frequency-based algorithm for candidate ranking that considers locality-based sentence, paragraph, and section frequencies. To evaluate the performance of the proposed method, three benchmark datasets from different domains are used (Inspec, 500N-KPCrowd, and SemEval-2010). The experimental results show that, overall, KP-Rank achieves significant improvements over existing approaches on the selected performance measures.
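A locality-based frequency score of the kind described above can be sketched as follows. The exact KP-Rank formula is not given here; the weights and the linear combination rule below are illustrative assumptions, as is the simple substring-matching test for phrase occurrence.

```python
def locality_score(phrase, sentences, paragraphs, sections,
                   w_sent=1.0, w_par=2.0, w_sec=3.0):
    """Combine how many sentences, paragraphs, and sections contain a
    candidate phrase (hypothetical weights, for illustration only)."""
    sf = sum(phrase in s for s in sentences)    # sentence frequency
    pf = sum(phrase in p for p in paragraphs)   # paragraph frequency
    cf = sum(phrase in c for c in sections)     # section frequency
    return w_sent * sf + w_par * pf + w_sec * cf

# Toy document: two sections, one paragraph each.
sections = ["keyphrase extraction is hard. we use clustering.",
            "semantic analysis helps keyphrase extraction."]
paragraphs = sections
sentences = [s for sec in sections for s in sec.split(". ")]

for cand in ["keyphrase extraction", "clustering"]:
    print(cand, locality_score(cand, sentences, paragraphs, sections))
```

Candidates that recur across sections score higher than ones confined to a single sentence, which is the intuition behind using the three locality levels jointly.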
Adaptive steganography methods tend to increase security against attacks. Most adaptive methods use LSB flipping (LSB-F) in the embedding part of their algorithms. LSB-F is highly vulnerable to simple steganalysis methods, but it allows adaptive algorithms to be extractable at the receiver side. Using LSB matching (LSB-M) could increase security, but extraction of the data at the receiver is difficult or, on occasion, impossible, and there are numerous attacks against LSB-M. In this paper we propose an adaptive algorithm which, unlike most adaptive methods, uses LSB-M as its embedding method. The proposed method uses a complexity measure based on a local neighborhood analysis to determine secure locations in an image. Comparable adaptive methods that use LSB-M suffer from possible changes in pixel complexity when embedding is performed. The proposed algorithm ensures that when a pixel is categorized as complex at the transmitter and embedded, the receiver will also identify it as complex, and the data is correctly retrieved. The better performance of the algorithm is shown by higher PSNR values for the embedded images compared with comparable adaptive algorithms. The security of the algorithm against numerous attacks is shown to be higher than that of LSB-M. It is also compared with a recent adaptive method and shown to be advantageous at most embedding rates.
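Plain LSB matching, the embedding primitive the adaptive method above builds on, can be sketched as follows. The complexity-based pixel selection that makes the paper's method adaptive is omitted; every pixel is used here for simplicity.

```python
import random

def lsbm_embed(pixels, bits, rng):
    """Embed bits into 8-bit pixel values via LSB matching (LSB-M)."""
    stego = list(pixels)
    for i, bit in enumerate(bits):
        if stego[i] % 2 != bit:
            # LSB mismatch: randomly add or subtract 1, so the change is not
            # the deterministic flip that simple steganalysis detects.
            if stego[i] == 0:
                stego[i] += 1
            elif stego[i] == 255:
                stego[i] -= 1
            else:
                stego[i] += rng.choice((-1, 1))
    return stego

def lsbm_extract(stego, n):
    """Recover n embedded bits: just read the LSBs."""
    return [p % 2 for p in stego[:n]]

rng = random.Random(42)
cover = [120, 0, 255, 37, 86, 201]
message = [1, 0, 1, 1, 0, 0]
stego = lsbm_embed(cover, message, rng)
print(lsbm_extract(stego, len(message)) == message)  # True
```

Note that both +1 and -1 leave the LSB equal to the message bit, which is why extraction is trivial here; the difficulty the paper addresses is that an adaptive method must also guarantee the receiver selects the same "complex" pixels after the +1/-1 changes.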
This paper proposes a new fuzzy approach to count eosinophils, as a measure of inflammation, in bronchoalveolar lavage fluid images acquired by a digital camera through a microscope. We use fuzzy cluster analysis and a fuzzy classification algorithm to determine the number of objects in an image. For this purpose, a fuzzy image processing procedure consisting of five main stages is presented. The first stage pre-highlights the objects in the images using an image pre-processing method for enhancement, sharpening the image with a Laplacian high-pass filter to achieve acceptable contrast. The second stage is segmentation by clustering with the fuzzy c-means algorithm for partitioning; in this stage the clustered data are rough representations of the objects in the image, containing noise. In the third stage, a Gaussian low-pass filter is first used for noise reduction; then, a contrast adaptation is performed by modifying the membership functions in the image [H.R. Tizhoosh, G. Krell, B. Michaelis, Knowledge-based enhancement of megavoltage images in radiation therapy using a hybrid neuro-fuzzy system, Image and Vision Computing 19(July) (2000) 217–233]. Object recognition, the fourth stage, is done by fuzzy labeling of the objects in the image using a fuzzy classification method. The number of labeled objects gives the number of eosinophils in an image, which is an index for diagnosing inflammation. The last stage is tuning the parameters and verifying the system performance using a feed-forward neural network.
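The fuzzy c-means clustering step in the second stage can be sketched on 1-D toy data rather than image pixels. This is an illustrative implementation with fuzzifier m = 2 and a naive initialization, not the authors' full five-stage procedure.

```python
def fcm_1d(data, c=2, m=2.0, iters=50):
    """Fuzzy c-means on 1-D data: returns cluster centers and the
    membership matrix u, where u[i][k] is point i's degree in cluster k."""
    centers = [min(data), max(data)][:c]  # simple initialization
    for _ in range(iters):
        # Update memberships: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - ck) for ck in centers]
            if any(dk == 0 for dk in d):
                u.append([1.0 if dk == 0 else 0.0 for dk in d])
            else:
                u.append([1.0 / sum((dk / dj) ** (2 / (m - 1)) for dj in d)
                          for dk in d])
        # Update centers: means weighted by u^m.
        centers = [sum(u[i][k] ** m * data[i] for i in range(len(data))) /
                   sum(u[i][k] ** m for i in range(len(data)))
                   for k in range(c)]
    return centers, u

data = [0.1, 0.3, 0.2, 9.8, 10.1, 10.0]
centers, u = fcm_1d(data)
print(sorted(centers))
```

In the paper's setting the data points would be pixel features rather than scalars, and the resulting soft memberships feed the subsequent noise-reduction and labeling stages.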