Similar Documents
A total of 20 similar documents were found (search time: 15 ms).
1.
A computer program that automatically generates the stiffness and mass matrices in finite element analysis is introduced. With programs of this kind, researchers only need to derive the expressions for the energies (strain or kinetic) and for the displacements or stresses and their derivatives in terms of nodal variables; the tedious calculations required to obtain the matrices are avoided. Such programs would facilitate the checking of existing matrices and would encourage researchers to try out new finite element models.
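The kind of symbolic work such a program automates can be sketched with a small example. Assuming a two-node axial bar element with linear shape functions (a hypothetical illustration, not the program described above), the element stiffness matrix follows from the strain energy as K_ij = d²U/(dq_i dq_j), which SymPy can carry out directly:

```python
import sympy as sp

# Nodal displacements, coordinate, element length, Young's modulus, cross-section area
q1, q2, x, L, E, A = sp.symbols('q1 q2 x L E A', positive=True)

# Linear shape functions of a 2-node bar element
u = (1 - x / L) * q1 + (x / L) * q2

# Strain energy U = 1/2 * integral of E*A*(du/dx)^2 over the element
U = sp.Rational(1, 2) * sp.integrate(E * A * sp.diff(u, x)**2, (x, 0, L))

# Stiffness matrix entries K_ij = d^2 U / (dq_i dq_j)
q = [q1, q2]
K = sp.Matrix(2, 2, lambda i, j: sp.diff(U, q[i], q[j]))
print(sp.simplify(K))   # -> E*A/L * [[1, -1], [-1, 1]]
```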

2.
This paper applies the Tree Seeds Algorithm (TSA) to the structural damage identification problem. Damage is modelled through alterations of both stiffness and mass parameters, and the objective function minimizes the differences between measured and calculated acceleration data. To enhance the performance of the standard TSA, two modifications are introduced: a bare-bones Gaussian update mechanism and a withering process. The modified algorithm is named the BGTSA. In the numerical simulations, the BGTSA is first compared with several state-of-the-art algorithms on the CEC05 benchmark functions; it is then used to tackle the structural damage identification problem by optimizing the acceleration-based nonlinear objective function. Numerical experiments involving a simply supported beam and a truss verify the effectiveness of the proposed algorithm. The results show that, with a small amount of acceleration data, the BGTSA achieves better identification results than other evolutionary algorithms. The proposed algorithm can therefore be viewed as a potential tool for structural damage identification.
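For readers unfamiliar with the bare-bones idea, the sketch below shows one plausible form of the Gaussian seed update together with an acceleration-misfit objective. The mean-and-spread choice follows the bare-bones PSO convention and the helper `simulate_acc` is a hypothetical placeholder; the paper's exact BGTSA update and withering rule are not reproduced here.

```python
import numpy as np

def bare_bones_seed(tree, best_tree, rng):
    """Sample one seed around a tree with a bare-bones Gaussian update.

    Assumption: mean halfway between the tree and the best tree, standard
    deviation equal to their componentwise distance (bare-bones PSO style);
    the BGTSA rule in the paper may differ."""
    mean = 0.5 * (tree + best_tree)
    std = np.abs(tree - best_tree) + 1e-12   # avoid a zero standard deviation
    return rng.normal(mean, std)

def acceleration_misfit(alpha, measured_acc, simulate_acc):
    """Objective: normalized difference between measured and model-predicted
    accelerations for a candidate damage vector `alpha` (one stiffness/mass
    reduction factor per element). `simulate_acc` is a user-supplied finite
    element routine (hypothetical placeholder)."""
    predicted = simulate_acc(alpha)
    return np.linalg.norm(predicted - measured_acc) / np.linalg.norm(measured_acc)
```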

3.
In this investigation, Model Order Reduction (MOR) of second-order systems with cubic nonlinearity in stiffness is developed for the first time using Krylov subspace methods and the associated symmetric transfer functions. New second-order Krylov subspaces are defined for the MOR procedure; this avoids the need to transform the second-order system to state-space form, so the main characteristics of the second-order system, such as the symmetry and positive definiteness of the mass and stiffness matrices, are preserved. To show the efficacy of the presented method, three practical case studies are considered: a nonlinear shear-beam building model subjected to a seismic disturbance, the nonlinear longitudinal vibration of a rod, and the vibration of a cantilever beam resting on a nonlinear elastic foundation. In all cases the reduced-order models reproduce the vibrational response of the original models with good accuracy while reducing the computational load.
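A generic one-sided projection illustrates the mechanics of such a reduction: an orthonormal basis V of a second-order Krylov-type subspace is built, and the Galerkin-projected matrices V^T M V and V^T K V stay symmetric and positive definite. The subspace used below, span{K^(-1)B, (K^(-1)M)K^(-1)B, ...}, is a common textbook choice for undamped second-order systems, not necessarily the new subspaces defined in the paper.

```python
import numpy as np

def second_order_reduce(M, K, B, r):
    """One-sided projection MOR sketch for M q'' + K q = B u, with the cubic
    nonlinearity handled separately (e.g. evaluated as V.T @ f_nl(V @ qr)
    during time integration). Generic illustration only."""
    n = M.shape[0]
    v = np.linalg.solve(K, B).reshape(n, -1)       # K^{-1} B
    basis = [np.linalg.qr(v)[0]]
    for _ in range(r - 1):
        v = np.linalg.solve(K, M @ basis[-1])      # apply K^{-1} M repeatedly
        basis.append(np.linalg.qr(v)[0])
    V, _ = np.linalg.qr(np.hstack(basis))
    V = V[:, :r]
    # Galerkin projection preserves symmetry and positive definiteness of M and K
    Mr, Kr = V.T @ M @ V, V.T @ K @ V
    return V, Mr, Kr
```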

4.
In the present work, a methodology based on digraph and matrix methods is developed for the evaluation of alternative industrial robots. A robot selection index is proposed that evaluates and ranks robots for a given industrial application. The index is obtained from a robot selection attributes function, which is in turn derived from the robot selection attributes digraph. The digraph is developed from the robot selection attributes and their relative importance for the application considered. A step-by-step procedure for evaluating the robot selection index is suggested. Coefficients of similarity and dissimilarity and the identification sets, obtained from the robot selection attributes function, are also proposed and are useful for easy storage and retrieval of the data. Two examples are included to illustrate the approach.
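In digraph/matrix methods of this type, the selection function is often evaluated as the permanent of a matrix whose diagonal carries the (normalized) attribute values of a candidate robot and whose off-diagonal entries carry the relative importance of attribute pairs. The sketch below illustrates that idea under this assumption; the paper's exact attributes function may differ.

```python
import numpy as np
from itertools import permutations

def permanent(A):
    """Permanent of a square matrix (brute force; fine for a handful of attributes)."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

def robot_selection_index(attribute_values, relative_importance):
    """Hypothetical illustration: diagonal = normalized attribute values of one
    robot, off-diagonal entries a_ij = relative importance of attribute i over j."""
    A = np.array(relative_importance, dtype=float)
    np.fill_diagonal(A, attribute_values)
    return permanent(A)
```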

5.
Rapid identification of collapsed buildings after an earthquake enables immediate damage assessment (scope and extent), which can support the formulation of emergency response strategies. To date, earthquake damage assessment has mainly been carried out through manual field investigations, which are time-consuming and cannot meet the urgent requirements of quick-response emergency relief allocation. In this study, an intelligent assessment method based on deep learning, super-pixel segmentation, and mathematical morphology was proposed to evaluate the damage degree of earthquake-damaged buildings. The method first used the Deeplab v2 neural network to obtain the initial damaged-building areas. The simple linear iterative clustering (SLIC) method was then employed to segment the test images so as to accurately extract the boundaries of the earthquake-damaged buildings. Next, the regions subdivided by SLIC were merged according to the initial damaged-building areas identified by the Deeplab v2 network. Finally, a mathematical morphology step was introduced to eliminate background noise. Experimental results demonstrated that the proposed algorithm was superior to the alternatives in both convergence speed and accuracy. In addition, its parameter selection is flexible and easy to implement, which is of great significance for earthquake damage assessment and provides valuable guidance for the formulation of future emergency response plans.
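The refinement stage described above can be sketched with standard scikit-image building blocks: SLIC superpixels are kept when they overlap a coarse network mask, and a morphological opening removes isolated noise. The overlap threshold and structuring-element size are illustrative assumptions, and any per-pixel mask can stand in for the Deeplab v2 output.

```python
import numpy as np
from skimage.segmentation import slic
from skimage.morphology import binary_opening, disk

def refine_damage_mask(image, cnn_mask, n_segments=2000):
    """Refine a coarse CNN damage mask with SLIC superpixels and morphology.

    `cnn_mask` is a boolean per-pixel mask from a semantic-segmentation
    network; parameters and the 0.5 overlap threshold are assumptions."""
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    refined = np.zeros_like(cnn_mask, dtype=bool)
    for label in np.unique(segments):
        region = segments == label
        # Keep a superpixel if most of its pixels were flagged as damaged
        if cnn_mask[region].mean() > 0.5:
            refined |= region
    # Morphological opening removes small, isolated false detections
    return binary_opening(refined, disk(3))
```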

6.
Computers & Structures, 2006, 84(29-30): 2065-2080
We present a methodology for the multi-objective optimization of laminated composite materials that is based on an integer-coded genetic algorithm. The fiber orientations and fiber volume fractions of the laminae are chosen as the primary optimization variables. Simplified micromechanics equations are used to estimate the stiffnesses and strength of each lamina from the fiber volume fraction and the material properties of the matrix and fibers. The lamina stresses for thin composite coupons subjected to force and/or moment resultants are determined using classical lamination theory, and the first-ply failure strength is computed using the Tsai–Wu failure criterion. A multi-objective genetic algorithm is used to obtain Pareto-optimal designs for two model problems having multiple, conflicting objectives. The objectives of the first model problem are to maximize the load-carrying capacity and minimize the mass of a graphite/epoxy laminate that is subjected to biaxial moments. In the second model problem, the objectives are to maximize the axial and hoop rigidities and minimize the mass of a graphite/epoxy cylindrical pressure vessel subject to the constraint that the failure pressure be greater than a prescribed value.
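The first-ply-failure check used in such a scheme amounts to evaluating the Tsai–Wu criterion lamina by lamina. The sketch below shows the standard plane-stress form with the usual strength-based coefficients (the common default F12 = -0.5*sqrt(F11*F22) is an assumption); it illustrates the criterion itself, not the paper's full genetic algorithm.

```python
import numpy as np

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index; failure is predicted when the index >= 1.
    Xt/Xc and Yt/Yc are tensile/compressive strengths (positive magnitudes) in the
    fibre and transverse directions; S is the in-plane shear strength."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)      # common default for the interaction term
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)
```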

7.
The authors develop and analyze iterative methods with different rates of convergence (linear, quadratic, or of order p, p ≥ 2) for calculating weighted pseudoinverse matrices with positive definite weights. To find weighted normal pseudosolutions with positive definite weights, iterative methods with a quadratic rate of convergence are developed and analyzed. The iterative methods for the calculation of weighted normal pseudosolutions are applied to least-squares problems with constraints. Translated from Kibernetika i Sistemnyi Analiz, No. 5, pp. 20–44, September–October 2004.
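For positive definite weights M and N, the weighted pseudoinverse can be obtained from the ordinary Moore–Penrose pseudoinverse of a symmetrically scaled matrix, A+_{M,N} = N^(-1/2) (M^(1/2) A N^(-1/2))+ M^(1/2). The sketch below uses this identity as a direct, non-iterative reference computation; it is not one of the iterative schemes developed in the paper.

```python
import numpy as np
from scipy.linalg import sqrtm

def weighted_pinv(A, M, N):
    """Weighted Moore-Penrose pseudoinverse with positive definite weights M, N,
    via the scaling identity  A+_{M,N} = N^{-1/2} (M^{1/2} A N^{-1/2})+ M^{1/2}.
    Direct reference computation; the paper develops iterative methods instead."""
    M_half = np.real(sqrtm(M))
    N_half_inv = np.linalg.inv(np.real(sqrtm(N)))
    core = np.linalg.pinv(M_half @ A @ N_half_inv)
    return N_half_inv @ core @ M_half
```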

8.

What information may be extracted over an urban area by means of joint analysis of two-dimensional (2D) and three-dimensional (3D) remote sensing data? We exploit aerial, Synthetic Aperture Radar (SAR) and Light Detection and Ranging (LIDAR) data to characterize precisely the Presidio area in San Francisco. We discriminate between different objects in the scene using their 2D and 3D characteristics. The final product of the analysis is a set of raster or vector information layers providing land covers, 3D building shapes and Digital Terrain Models (DTMs) of the Presidio. This paper investigates the relative merits of the collected data in retrieving each of these information layers, and examines how automatic algorithms to extract land cover, the Digital Terrain Model (DTM) and 3D building shapes could be integrated in a processing chain.

9.
We discuss the identification and localization of a buried object using the B-scan response of a ground-penetrating radar (GPR). The Finite Difference Time Domain (FDTD) method and an Improved Particle Swarm Optimization (IPSO) algorithm are combined to solve the inverse problem. The A-scan response of the soil without any buried object is first used in this inverse problem to estimate the physical characteristics of the soil. These parameters are then included in the inverse problem to characterize a cylindrical buried object from its GPR B-scan response, which is simulated using the FDTD method. Several simulated cases of cylindrical objects with different radii, electrical conductivities, and depths were tested. The proposed method allowed us to locate and identify buried objects (plastic and metal) at different depths.
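The inversion loop can be sketched as a plain particle swarm minimizing the misfit between a measured B-scan and an FDTD-simulated one. The code below implements standard PSO only; the "improved" modifications of the IPSO and the FDTD forward model itself (passed in as `misfit`) are not reproduced, and all hyperparameters are illustrative.

```python
import numpy as np

def pso_invert(misfit, bounds, n_particles=30, n_iters=100,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard particle swarm minimizer. `misfit(x)` should return e.g. the L2
    difference between the measured B-scan and an FDTD-simulated one for
    parameters x = (radius, conductivity, depth, ...)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                 # keep particles inside bounds
        f = np.array([misfit(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()
```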

10.
An extension of the GRASP2 multi-configuration Dirac-Fock (MCDF) program for calculation of the specific mass shift (SMS) is described. The various modes (average level, optimal level, etc.) for achieving an approximate wavefunction, and their impact on the relativistic SMS values, are explored. Comparisons are made with other theoretical SMS values as well as with experiment for Ar+, Ni, Kr+, and Ce+, with new results reported for each atom.
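For context, the specific mass shift comes from the two-body part of the non-relativistic nuclear-recoil (mass polarization) operator, shown below in its standard textbook form; the relativistic corrections handled by the extended GRASP2 code are not included here.

```latex
% Mass-shift Hamiltonian for a nucleus of mass M: the one-body sum gives the
% normal mass shift, the two-body sum gives the specific mass shift (SMS).
H_{\mathrm{MS}} = \frac{1}{2M}\sum_{i}\mathbf{p}_i^{\,2}
                + \frac{1}{M}\sum_{i<j}\mathbf{p}_i\cdot\mathbf{p}_j
```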

11.
The Al-La, Al-Ce, Al-Pr, Al-Nd and Al-Sm (Al-light rare earth) binary systems have been systematically assessed and optimized based on the available experimental data and ab-initio data using the FactSage thermodynamic software. Optimized model parameters of the Gibbs energies for all phases, which reproduce all the reliable experimental data to satisfaction, have been obtained. The optimization procedure was biased by putting a strong emphasis on the observed trends in the thermodynamic properties of Al-RE phases. The Modified Quasichemical Model, which takes short-range ordering into account, is used for the liquid phase, and the Compound Energy Formalism is used for the solid solutions in the binary systems. It is shown that the Modified Quasichemical Model used for the liquid alloys permits us to obtain entropies of mixing that are more reliable than those based on the Bragg-Williams random mixing model, which does not take short-range ordering into account.
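For reference, the Bragg-Williams (random mixing) configurational entropy that the Modified Quasichemical Model improves upon is the ideal-solution expression below; this is standard solution thermodynamics, not a result of the assessment itself.

```latex
\Delta S_{\mathrm{mix}}^{\mathrm{ideal}}
  = -R\left(x_{\mathrm{Al}}\ln x_{\mathrm{Al}} + x_{\mathrm{RE}}\ln x_{\mathrm{RE}}\right)
```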

12.
In this paper, we present a system using computational linguistic techniques to extract metadata for image access. We discuss the implementation, functionality, and evaluation of an image catalogers' toolkit, developed in the Computational Linguistics for Metadata Building (CLiMB) research project. We have tested components of the system, including phrase finding for the art and architecture domain, functional semantic labeling using machine learning, and disambiguation of terms in domain-specific text vis-à-vis a rich thesaurus of subject terms and geographic and artist names. We present specific results on disambiguation techniques and on the nature of the ambiguity problem given the thesaurus, resources, and domain-specific text resource, with a comparison of domain-general resources and text. Our primary user group for evaluation has been the expert cataloger with specific expertise in the fields of painting, sculpture, and vernacular and landscape architecture.

Judith L. Klavans is a Senior Research Scientist at the University of Maryland Institute for Advanced Computer Studies (UMIACS), and Principal Investigator on the Mellon-funded Computational Linguistics for Metadata Building (CLiMB) and IMLS-supported T3 research projects. Her research includes text-mining from corpora and dictionaries, disambiguation, and multilingual multidocument summarization. Previously, she directed the Center for Research on Information Access at Columbia University.
Carolyn Sheffield holds an M.L.S. from the University of Maryland, and her research interests include access issues surrounding visual and time-based materials. She designs, conducts and analyzes the CLiMB user studies and works closely with image catalogers to ensure that the CLiMB system reflects their needs and workflow.
Eileen Abels is Master's Program Director and Professor in the College of Information Science and Technology at Drexel University. Prior to joining Drexel in January 2007, Dr. Abels spent more than 15 years at the College of Information Studies at the University of Maryland. Her research focuses on user needs and information behaviors. She works with a broad range of information users including translators, business school students and faculty, engineers, scientists, and members of the general public. Dr. Abels holds a PhD from the University of California, Los Angeles.
Jimmy Lin's research interests lie at the intersection of natural language processing and information retrieval. His work integrates knowledge- and data-driven approaches to address users' information needs.
Rebecca J. Passonneau is a Research Scientist at the Center for Computational Learning Systems, Columbia University. Her areas of interest include linking empirical research methods on corpora with computational models of language processing, the intersection of language and context in semantics and pragmatics, corpus design and analysis, and evaluation methods for NLP. Her current projects involve working with machine learning for the Consolidated Edison utility company, and designing an experimental dialog system to take patron book orders by phone for the Andrew Heiskell Braille and Talking Book library.
Tandeep Sidhu is the Software Developer and Research Assistant for the CLiMB project. He is in charge of designing the CLiMB Toolkit as well as the NLP modules behind the Toolkit. He is currently pursuing his MS degree in Computer Science.
Dagobert Soergel has been teaching information organization at the University of Maryland since 1970 and is an internationally known expert in Knowledge Organization Systems and in Digital Libraries. In the CLiMB project he served as general consultant and was especially involved in the design of the study on the relationship between an image and the cataloging terms assigned to it.

13.
Queueing (mass service) networks with multiregime strategies and several types of demands are studied. Single-server nodes can operate in several regimes corresponding to different levels of performance, and the duration of each regime is exponentially distributed. The service discipline is "generalized processor sharing" (GPS) with random channel choice at the node. The amount of work required to serve a demand is a random variable with an arbitrary distribution function. It is established that the stationary probability distribution of the network states is invariant with respect to the functional form of the distribution of the amount of work required to serve a demand.

14.
Nitrogen, phosphorus, and potassium are some of the most important biochemical components of plant organic matter, and hence, estimation of their contents can help monitor the metabolism processes and health of plants. This study, conducted in the Yixing region of China, aimed to compare partial least squares regression (PLSR) and support vector machine regression (SVMR) methods for estimating the nitrogen (C_N), phosphorus (C_P), and potassium (C_K) contents present in leaves of diverse plants using laboratory-based visible and near-infrared (Vis-NIR) reflectance spectroscopy. A total of 95 leaf samples taken from rice, corn, sesame, soybean, tea, grass, shrub, and arbour plants were collected, and their C_N, C_P, C_K, and Vis-NIR reflectance data were measured in a laboratory. The PLSR and SVMR methods were calibrated to estimate the C_N, C_P, and C_K contents of the obtained samples from spectral reflectance. Cross-validation with an independent data set was employed to assess the performance of the calibrated models. The calibration results indicated that the PLSR method accounted for 59.1%, 50.9%, and 50.6% of the variation of C_N, C_P, and C_K, whereas the SVMR method accounted for more than 90% of the variation of C_N, C_P, and C_K. According to cross-validation, the SVMR method achieved better estimation accuracies, which had determination coefficients of 0.706, 0.722, and 0.704 for C_N, C_P, and C_K, respectively, than the PLSR method, which had determination coefficients of 0.663, 0.643, and 0.541. It was concluded that the SVMR method combined with laboratory-based Vis-NIR reflectance data has the potential to estimate the contents of biochemical components.
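A comparison of this kind can be sketched with scikit-learn, fitting PLSR and an RBF-kernel SVR to the same spectra and scoring both by cross-validated R². The hyperparameters below are illustrative defaults, not the calibrated values used in the study.

```python
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def compare_models(X, y, n_components=10):
    """Compare PLSR and SVMR for one leaf property (e.g. C_N) by cross-validated R^2.
    X: (n_samples, n_bands) Vis-NIR reflectance spectra; y: measured contents."""
    pls = PLSRegression(n_components=n_components)
    svr = make_pipeline(StandardScaler(), SVR(kernel='rbf', C=10.0, epsilon=0.01))
    for name, model in [('PLSR', pls), ('SVMR', svr)]:
        r2 = cross_val_score(model, X, y, cv=5, scoring='r2')
        print(f'{name}: mean cross-validated R^2 = {r2.mean():.3f}')
```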

15.
16.

The Covid-19 virus outbreak that emerged in China at the end of 2019 has had a huge and devastating effect worldwide. In patients with severe symptoms of the disease, pneumonia develops due to the Covid-19 virus, causing intense involvement and damage in the lungs. Although the disease emerged only a short time ago, many studies have already revealed these effects on the lungs with the help of lung CT imaging. In this study, a total of 1,396 lung CT images (386 Covid-19 and 1,010 non-Covid-19) were subjected to automatic classification. A Convolutional Neural Network (CNN), one of the deep learning methods, was used to classify the lung CT images automatically for early diagnosis of Covid-19. In addition, k-Nearest Neighbors (k-NN) and Support Vector Machine (SVM) classifiers were used to compare the classification success of deep learning with that of machine learning. Within the scope of the study, a 23-layer CNN architecture was designed and used as a classifier; training and testing were also performed for the Alexnet and Mobilenetv2 CNN architectures. The classification results were further calculated for the case where the number of training images for the 23-layer CNN was increased by 5, 10, and 20 times using data augmentation methods. To reveal the effect of the number of images in the training and test sets on the results, two different evaluation schemes, 2-fold and 10-fold cross-validation, were performed. As a result of these detailed calculations, a comprehensive comparison of the success of the texture analysis method, machine learning, and deep learning methods in Covid-19 classification from CT images was made. The highest mean sensitivity, specificity, accuracy, F1 score, and AUC values obtained were 0.9197, 0.9891, 0.9473, 0.9058, and 0.9888, respectively, for 2-fold cross-validation, and 0.9404, 0.9901, 0.9599, 0.9284, and 0.9903, respectively, for 10-fold cross-validation.
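A much smaller CNN than the 23-layer design can illustrate the classification setup; the sketch below is a generic Keras architecture with assumed layer sizes and input shape, not the network trained in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_ct_classifier(input_shape=(224, 224, 1)):
    """Small CNN for binary Covid / non-Covid CT classification.
    Generic illustrative architecture; all layer sizes are assumptions."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation='relu', padding='same'),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation='relu', padding='same'),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation='relu', padding='same'),
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.3),
        layers.Dense(1, activation='sigmoid'),   # Covid vs non-Covid
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=[tf.keras.metrics.AUC(name='auc'), 'accuracy'])
    return model
```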


17.
Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) reflectance and emissivity data were used to discriminate nonphotosynthetic vegetation (NPV) from exposed soils, to produce a topsoil texture image, and to relate sand fraction estimates to elevation data in an agricultural area of central Brazil. The results show that the combination of the shortwave infrared (SWIR) bands 5 and 6 (hydroxyl absorption band) and the thermal infrared (TIR) bands 10 and 14 (quartz reststrahlen feature) discriminated dark red clayey soils and bright sandy soils from NPV (crop litter), respectively. The ratio of bands 10 and 14 was correlated with the laboratory-measured total sand fraction. When this ratio was applied to the image and related to topography, a predominance of sandy soil surfaces at lower elevations and clayey soil surfaces at higher elevations was observed. Areas presenting the largest sand fraction values, identified from the ASTER band 10/14 emissivity ratio, coincided with land degradation processes.

18.
Hyperspectral remote sensing provides great potential to monitor and study biodiversity of tropical forests through species identification and mapping. In this study, five species were selected to examine crown-level spectral variation within and between species using HYperspectral Digital Collection Experiment (HYDICE) data collected over La Selva, Costa Rica. Spectral angle was used to evaluate the spectral variation in the reflectance, first-derivative and wavelet-transformed spectral domains. Results indicated that intra-crown spectral variation does not always follow a normal distribution and can vary from crown to crown, therefore presenting challenges to statistically define the spectral variation within species using conventional classification approaches that assume normal distributions. Although derivative analysis has been used extensively in hyperspectral remote sensing of vegetation, our results suggest that it might not be optimal for species identification in tropical forestry using airborne hyperspectral data. The wavelet-transformed spectra, however, were useful for the identification of tree species. The wavelet coefficients at coarse spectral scales and the wavelet energy feature are more capable of reducing variation within crowns/species and capturing spectral differences between species. The implications of this examination of intra- and inter-specific variability at crown level were: (1) the wavelet transform is a robust tool for the identification of tree species using hyperspectral data because it can provide a systematic view of the spectra at multiple scales; and (2) it may be impractical to identify every species using only hyperspectral data, given that spectral similarity may exist between species and that within-crown/species variability may be influenced by many factors.
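The spectral angle used to quantify within- and between-crown variation is simply the angle between two spectra treated as vectors; a minimal sketch of that measure (the crown-comparison thresholds used in the study are not reproduced):

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two reflectance spectra a and b."""
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))   # clip guards against round-off
```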

19.
The Al–Gd, Al–Tb, Al–Dy, Al–Ho and Al–Er (Al–heavy rare earth) binary systems have been systematically assessed and optimized based on the available experimental data and ab-initio data using the FactSage thermodynamic software. A systematic technique (the reduced melting temperature proposed by Gschneidner) was used for estimating the Al–Tb phase diagram due to the lack of experimental data. Optimized model parameters of the Gibbs energies for all phases, which reproduce all the reliable experimental data to satisfaction, have been obtained. The optimization procedure was biased by putting a strong emphasis on the observed trends in the thermodynamic properties of Al–RE phases. The Modified Quasichemical Model, which takes short-range ordering into account, is used for the liquid phase, and the Compound Energy Formalism is used for the solid solutions in the binary systems. It is shown that the Modified Quasichemical Model used for the liquid alloys permits one to obtain entropies of mixing that are more reliable than those based on the Bragg–Williams random mixing model, which does not take short-range ordering into account.

20.