Similar Documents
20 similar documents found (search time: 15 ms)
1.
2.
Dispersion is accepted as a fundamental step required for analyzing broadband light. The recognition of color by the human eye, its digital reproduction by a camera, and detailed analysis by a spectrometer all utilize dispersion; it is also an inherent component of color detection and machine vision. Here, we present a device (called artificial eye, or A-Eye) that accurately recognizes and reproduces tested colors, without any spectral dispersion. Instead, A-Eye uses N = 3–12 transmissive windows, each with unique spectral features resulting from the broadband transmittance and excitonic peak-features of 2D transition metal dichalcogenides. Colored light passing through (and modified by) these windows and incident on a single photodetector generated different photocurrents, and these were used to create a reference database (training set) for 1337 “seen” and 0.55 million synthesized “unseen” colors. By “looking” at test colors modified by these windows, A-Eye can accurately recognize and reproduce “seen” colors with zero deviation from their original spectra and “unseen” colors with only ∼1% median deviation, using the k-NN algorithm. A-Eye can continuously improve color estimation by adding any corrected guesses to its training database. A-Eye’s accurate color recognition dispels the notion that dispersion of colors is a prerequisite for color identification and paves the way for ultra-reliable color recognition by machines with reduced engineering complexity.
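As an illustration of the recognition step, here is a minimal k-NN sketch in the spirit of the abstract; the window count, photocurrent values, and neighbor count are hypothetical placeholders, not values from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical training database: each row holds the photocurrents measured
# through N = 12 spectrally distinct windows for one known ("seen") color.
rng = np.random.default_rng(0)
photocurrents = rng.random((1337, 12))   # placeholder for measured signatures
rgb_values = rng.random((1337, 3))       # placeholder for the known RGB colors

# k-NN maps a new photocurrent signature to an RGB estimate by averaging the
# colors of the k most similar signatures in the reference database.
knn = KNeighborsRegressor(n_neighbors=5)
knn.fit(photocurrents, rgb_values)

test_signature = rng.random((1, 12))     # photocurrents from an unknown color
print(knn.predict(test_signature))       # estimated RGB triplet
```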

3.
A key aspect of the developing field of materials informatics is optimally guiding experiments or calculations towards parts of the relatively vast feature space where a material with the desired property may be discovered. We discuss our approach to adaptive experimental design and the methods developed in decision theory and global optimization which can be used in materials science. We show that using uncertainties to trade off exploration versus exploitation when guiding new experiments or calculations generally leads to enhanced performance, highlighting the need to evaluate and incorporate errors in predictive materials design. We illustrate our ideas on a computed data set of M2AX phases generated using ab initio calculations to find the sample with the optimal elastic properties, and discuss how our approach leads to the discovery of new NiTi-based alloys with the smallest thermal dissipation.
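The exploration-exploitation trade-off described here is commonly implemented with an acquisition function such as expected improvement; a minimal sketch follows, with hypothetical surrogate predictions and uncertainties (not data from the study).

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_y):
    """EI for maximization: balances exploitation (high predicted mu)
    against exploration (high predictive uncertainty sigma)."""
    sigma = np.maximum(sigma, 1e-12)
    z = (mu - best_y) / sigma
    return (mu - best_y) * norm.cdf(z) + sigma * norm.pdf(z)

# Hypothetical surrogate-model predictions over three unexplored candidates.
mu = np.array([1.00, 1.20, 0.90])        # predicted property values
sigma = np.array([0.05, 0.30, 0.60])     # predictive uncertainties
best_so_far = 1.10

next_candidate = int(np.argmax(expected_improvement(mu, sigma, best_so_far)))
print(f"Next experiment: candidate {next_candidate}")
```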

4.
A recently developed machine learning technique, multivariate adaptive regression splines (MARS), is introduced in this study to predict vehicle angle crashes. MARS has promising prediction power and does not suffer from interpretation complexity. Negative Binomial (NB) and MARS models were fitted and compared using extensive data collected on unsignalized intersections in Florida. Two models were estimated for angle crash frequency at 3- and 4-legged unsignalized intersections. Treating crash frequency as a continuous response variable for fitting a MARS model was also examined by considering the natural logarithm of the crash frequency. Finally, combining MARS with another machine learning technique (random forests) was explored and discussed. The fitted NB angle crash models showed several significant factors that contribute to angle crash occurrence at unsignalized intersections, such as traffic volume on the major road, the upstream distance to the nearest signalized intersection, the distance between successive unsignalized intersections, median type on the major approach, percentage of trucks on the major approach, size of the intersection, and geographic location within the state. Based on the mean square prediction error (MSPE) assessment criterion, MARS outperformed the corresponding NB models. Also, using MARS to predict continuous response variables yielded more favorable results than predicting discrete response variables. The generated MARS models showed the most promising results after screening the covariates using random forests. Based on the results of this study, MARS is recommended as an efficient technique for predicting crashes at unsignalized intersections (angle crashes in this study).
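A sketch of the MARS-on-log-frequency idea described above, assuming the open-source py-earth package (`pyearth.Earth`) as the MARS implementation; the covariates, data, and model settings are synthetic stand-ins, not the study's.

```python
import numpy as np
from pyearth import Earth        # py-earth, an open-source MARS implementation
from sklearn.metrics import mean_squared_error

# Synthetic stand-ins for the intersection covariates (e.g., traffic volume,
# distances, percentage of trucks) and the angle-crash counts.
rng = np.random.default_rng(1)
X = rng.random((500, 6))
crash_counts = rng.poisson(np.exp(X @ np.ones(6)))

# The study found MARS worked best on a continuous response, i.e., the
# natural logarithm of crash frequency.
y = np.log1p(crash_counts)

mars = Earth(max_degree=2)       # allow pairwise hinge interactions
mars.fit(X, y)
mspe = mean_squared_error(y, mars.predict(X))
print(f"In-sample MSPE (illustrative): {mspe:.4f}")
```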

5.
In recent years, there has been a large effort in the materials science community to employ materials informatics to accelerate materials discovery or to develop new understanding of materials behavior. Materials informatics methods utilize machine learning techniques to extract new knowledge or predictive models from existing materials data. In this review, we discuss major advances at the intersection of data science and atom-scale calculations, with a particular focus on studies of solid-state, inorganic materials. The examples discussed in this review cover methods for accelerating the calculation of computationally expensive properties, identifying promising regions for materials discovery based on existing data, and extracting chemical intuition automatically from datasets. We also identify key issues in this field, such as the limited distribution of software necessary to utilize these techniques, and opportunities for research that would help lead to the wider adoption of materials informatics in the atomistic calculations community.

6.
Fibrillar dry adhesives have shown great potential in many applications thanks to their tunable adhesion, notably for pick-and-place handling of fragile objects. However, controlling and monitoring alignment with the target objects is mandatory to enable reliable handling. In this paper, we present an in-line monitoring system that allows optical analysis of an array of individual fibrils (with a contact radius of 350 µm) in contact with a smooth glass substrate, followed by prediction of their adhesion performance. Images recorded at maximum compressive preload represent characteristic contact signatures that were used to extract visual features. These features, in turn, were used to create a linear model and to train different linear and non-linear regression models for predicting adhesion force as a function of the misalignment angle. Support vector regression and boosted tree models exhibited the highest accuracies and outperformed an analytical model reported in the literature. Overall, this new approach enables prediction of gripping performance from contact observations in near real-time, which is likely to improve the reliability of handling operations.
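A minimal sketch of the regression step, fitting SVR (one of the best-performing model classes in the study) to contact-signature features; the feature set and force values are hypothetical placeholders.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical visual features extracted from each fibril's contact signature
# at maximum preload (e.g., contact area, eccentricity, mean intensity).
rng = np.random.default_rng(2)
features = rng.random((200, 4))
adhesion_force = 5.0 - 3.0 * rng.random(200)   # placeholder pull-off forces (mN)

# SVR with an RBF kernel, trained to predict adhesion from contact images.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(features, adhesion_force)
print(model.predict(features[:3]))             # predicted forces, illustrative
```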

7.
8.
The multi-principal-component concept of high-entropy alloys (HEAs) generates numerous new alloys. Among them, nanoscale precipitated HEAs have achieved superior mechanical properties and shown potential for structural applications. However, it is still a great challenge to find the optimal alloy among the numerous candidates. To date, the reported nanoprecipitated HEAs have mainly been designed by a trial-and-error approach with the aid of phase diagram calculations, limiting the development of structural HEAs. In the current work, a novel method is proposed to accelerate the development of ultra-strong nanoprecipitated HEAs. With the guidance of physical metallurgy, the volume fraction of the required nanoprecipitates is designed through machine learning on big data with a thermodynamic foundation, while the morphology of the precipitates is kinetically tailored by prestrain aging. As a proof-of-principle study, an HEA with superior strength and ductility has been designed and systematically investigated. The newly developed γ'-strengthened HEA exhibits 1.31 GPa yield strength, 1.65 GPa ultimate tensile strength, and 15% tensile elongation. Atom probe tomography and transmission electron microscopy characterizations reveal the well-controlled high γ' volume fraction (52%) and refined precipitate size (19 nm). The refinement of the nanoprecipitates originates from the accelerated nucleation of the γ' phase by prestrain aging. A deeper understanding of the excellent mechanical properties is provided from the aspect of strengthening mechanisms. Finally, the applicability of the current design strategy to other precipitation-hardened alloys is discussed.

9.
Advanced Powder Technology, 2020, 31(7): 2689–2698
Belt conveyor systems are widely utilized in transportation applications. This research aims to achieve fault detection on belt conveyor idlers with an acoustic-signal-based method. The presented novel method uses Mel Frequency Cepstrum Coefficients and a Gradient Boost Decision Tree for feature extraction and classification. Thirteen Mel Frequency Cepstrum Coefficients are extracted from the acquired sound signals as features. A Gradient Boost Decision Tree model is developed and trained. After training, the model is applied to a testing dataset. Results show that the trained model can achieve a diagnostic accuracy of 94.53% and a recall rate of up to 99.7%. This study verifies the proposed method for acoustic-signal-based fault detection of belt conveyor idlers.
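A sketch of the described pipeline: thirteen MFCCs feeding a gradient-boosting classifier. The audio clips and labels are synthetic placeholders, and averaging the MFCCs over time frames is an assumption of this sketch, not necessarily the paper's feature construction.

```python
import numpy as np
import librosa
from sklearn.ensemble import GradientBoostingClassifier

def mfcc_features(signal, sr, n_mfcc=13):
    """Thirteen MFCCs, averaged over time frames, as one clip's feature vector."""
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Hypothetical dataset: 1-second clips recorded near healthy / faulty idlers.
sr = 22050
rng = np.random.default_rng(3)
clips = [rng.standard_normal(sr) for _ in range(40)]   # placeholder audio
labels = rng.integers(0, 2, size=40)                   # 0 = healthy, 1 = faulty

X = np.array([mfcc_features(clip, sr) for clip in clips])
clf = GradientBoostingClassifier().fit(X, labels)
print(f"Training accuracy (illustrative): {clf.score(X, labels):.2f}")
```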

10.
Materials design is the most important and fundamental work against the background of the Materials Genome Initiative for global competitiveness proposed by the National Science and Technology Council of the United States. Regarding the methodologies of materials design, besides thermodynamic and kinetic methods combined with databases, both deductive approaches, such as first-principles methods, and inductive approaches based on data mining are making great progress because of their successful applications in materials design. In this paper, the support vector machine (SVM), including support vector classification (SVC) and support vector regression (SVR) based on the statistical learning theory (SLT) proposed by Vapnik, is introduced as a relatively new data mining method to meet the different tasks of materials design in our lab. The advantages of using SVM for materials design are discussed based on applications to the formability of perovskite or BaNiO3-type structures, the prediction of energy gaps of binary compounds, the prediction of the sintered cold modulus of sialon-corundum castables, the optimization of the electrical resistance of VPTC semiconductors, and thickness control in In2O3 semiconductor film preparation. The results presented indicate that SVM is an effective modeling tool for small sample sets, with great potential for applications in materials design.
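A minimal sketch of SVR on a small sample set, the setting the abstract argues SVM is suited to; the descriptors, target property, and hyperparameters are illustrative, not drawn from the applications listed above.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical small materials dataset: 30 compounds, five atomic descriptors,
# one measured property (e.g., an energy gap) as the regression target.
rng = np.random.default_rng(4)
X = rng.random((30, 5))
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 2.0]) + 0.05 * rng.standard_normal(30)

# The margin-based SVR formulation is one reason SVM is favored for the
# small sample sets typical of materials design problems.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                         scoring="neg_mean_absolute_error")
print(f"Leave-one-out MAE (illustrative): {-scores.mean():.3f}")
```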

11.
Classification of structural brain magnetic resonance (MR) images is a crucial task for many neurological phenotypes, and machine learning tools have increasingly been developed and applied to this problem in recent years. In this study, binary classification of T1‐weighted structural brain MR images is performed using state‐of‐the‐art machine learning algorithms when there is no information about the clinical context or specifics of neuroimaging. Image-derived features and clinical labels provided by the International Conference on Medical Image Computing and Computer‐Assisted Intervention 2014 machine learning challenge are used. These morphological summary features are obtained from four different datasets (each N > 70) with clinically relevant phenotypes and are automatically extracted from the MR imaging scans using FreeSurfer, a freely distributed brain MR image processing software package. Widely used machine learning tools, namely back‐propagation neural networks, self‐organizing maps, support vector machines, and k‐nearest neighbors, are used as classifiers. Clinical prediction accuracy is obtained via cross‐validation on the training data (N = 150) and predictions are made on the test data (N = 100). Classification accuracy (the fraction of cases where the prediction is accurate) and area under the ROC curve are used as the performance metrics, both for tuning the training hyperparameters and for evaluating the classifiers. The experiments revealed that support vector machines perform better than the other methods on clinical predictions using summary morphological features in the absence of any information about the phenotype. Prediction accuracy would increase greatly if contextual information were integrated into the system. © 2017 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 27, 89–97, 2017
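A sketch of cross-validated SVM classification with the two reported metrics, accuracy and ROC AUC; the feature matrix and labels are synthetic stand-ins for the FreeSurfer summary features and clinical labels.

```python
import numpy as np
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for FreeSurfer-style morphological summary features
# (regional volumes, thicknesses) and binary clinical labels, N = 150.
rng = np.random.default_rng(5)
X = rng.random((150, 20))
y = rng.integers(0, 2, size=150)

# 5-fold cross-validation, scoring both accuracy and area under the ROC curve.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
cv = cross_validate(clf, X, y, cv=5, scoring=["accuracy", "roc_auc"])
print(f"Accuracy: {cv['test_accuracy'].mean():.2f}, "
      f"AUC: {cv['test_roc_auc'].mean():.2f}")
```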

12.
With the advent of faster computer processors and especially graphics processing units (GPUs) over the last few decades, the use of data-intensive machine learning (ML) and artificial intelligence (AI) has increased greatly, and the study of crystal nucleation has been one of the beneficiaries. In this review, we outline how ML and AI have been applied to address four outstanding difficulties of crystal nucleation: discovering better reaction coordinates (RCs) for accurately describing non-classical nucleation situations; developing more accurate force fields for describing the nucleation of multiple polymorphs or phases in a single system; devising more robust identification methods for determining crystal phases and structures; and yielding improved coarse-grained models for studying nucleation.

13.
In this review, we discuss current and potential future applications of materials informatics in industry. We include in this discussion not only the traditional materials and chemical industries, but also other manufacturing-intensive sectors, which broadens the relevance of materials informatics to a large proportion of the economy. We describe several high-level use cases, drawing upon our experience at Citrine Informatics working in materials and manufacturing, although we omit any details that could be considered customer-proprietary. We note that a converging set of factors, including executive-level corporate demand for Big Data technologies, the increasing availability of large-scale materials data, the drive for greater competitiveness in manufacturing, and advances in machine learning, will lead to a rapid increase in the industrial application of materials informatics over the next several years.

14.
Microchips have long been studied and have facilitated recent investigations in multiple biomedical and material fields. Advances in functional materials have triggered several leaps in the development of microchip technology. The microarray chip, benefiting from micropatterning and nucleic acid nanotechnology, was first introduced around 1980 and rapidly advanced genomics, proteomics, and biodetection. In the following generation, microfluidic chips, which arose from microelectromechanical systems (MEMS) and soft lithography, are revolutionizing areas such as biology, material fabrication, energy, and environmental science. More recently, advances in materials fabrication keep expanding the frontiers of microchip platforms, such as nanoscale fabrication and flexible device manufacturing. One of the most promising platforms is wettability-patterned materials, inspired by ubiquitous natural wetting structures such as the lotus leaf, spider silk, and Stenocara beetles. Their unique ability to handle liquids without sophisticated equipment can extend the current microchip platform by combining the merits of microarrays and microfluidics, and in turn benefit the materials community and beyond. In this featured article, we briefly introduce state-of-the-art technologies for fabricating wettability-patterned chips and highlight proof-of-concept demonstrations of their emerging applications in material and biomedical science. We also give an outlook on further developments, including machine-learning-driven micropattern manufacturing, and discuss their potential to revolutionize several scientific areas.

15.
Glasses have played a critical role in the development of modern civilization and will continue to bring new solutions to global challenges from energy and the environment to healthcare and information/communication technology. To meet the accelerated pace of modern technology delivery, a more sophisticated approach to the design of advanced glass chemistries must be developed to enable faster, cheaper, and better research and development of new glass compositions for future applications. In the spirit of the U.S. Materials Genome Initiative, here we describe an approach for designing new glasses based on a mathematical optimization of composition-dependent glass property models. The models combine known physical insights regarding glass composition-property relationships together with data-driven approaches including machine learning techniques. Using such a combination of physical and empirical modeling approaches, we seek to decode the “glass genome,” enabling the improved and accelerated design of new glassy materials.
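A minimal sketch of the composition-property optimization framing described above, using toy property models with invented coefficients (the real models blend physical insight with machine learning); the oxide count, model forms, and constraint are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy composition-dependent property models for a three-component glass,
# with x the vector of mole fractions. All coefficients are illustrative.
def youngs_modulus(x):                     # property to maximize (GPa)
    return 70 * x[0] + 90 * x[1] + 60 * x[2] - 15 * x[0] * x[1]

def liquidus_penalty(x):                   # soft constraint on liquidus temp.
    t_liq = 1400 * x[0] + 1200 * x[1] + 1000 * x[2]
    return max(0.0, t_liq - 1250) ** 2

def objective(x):
    return -youngs_modulus(x) + 0.01 * liquidus_penalty(x)

# Mole fractions must be non-negative and sum to one.
result = minimize(objective, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
                  constraints={"type": "eq", "fun": lambda x: x.sum() - 1})
print("Optimal composition (illustrative):", result.x.round(3))
```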

16.
Industrial robots are widely used in various areas owing to their greater degrees of freedom (DOFs) and larger operation space compared with traditional frame movement systems involving sliding and rotational stages. However, the geometrical transfer of joint kinematic errors and the relatively weak rigidity of industrial robots compared with frame movement systems decrease their absolute kinematic accuracy, thereby limiting their further application in ultraprecision manufacturing. This imposes a stringent requirement for improving the absolute kinematic accuracy of industrial robots in terms of the position and orientation of the robot arm end. Current measurement and compensation methods for industrial robots either require expensive measuring systems, producing positioning or orientation errors, or offer low measurement accuracy. Herein, a kinematic calibration method for an industrial robot using an artifact with a hybrid spherical and ellipsoid surface is proposed. A system with submicrometric precision for measuring the position and orientation of the robot arm end is developed using laser displacement sensors. Subsequently, a novel kinematic error compensation method involving both a residual learning algorithm and a neural network is proposed to compensate for nonlinear errors. A six-layer recurrent neural network (RNN) is designed to compensate for the kinematic nonlinear errors of a six-DOF industrial robot. The results validate the feasibility of the proposed method for measuring the kinematic errors of industrial robots, and the compensation method based on the RNN improves the accuracy via parameter fitting. Experimental studies show that the measuring system and compensation method can reduce motion errors by more than 30%. The present study provides a feasible and economical approach for measuring and improving the motion accuracy of an industrial robot at the submicrometric measurement level. The full text can be downloaded at https://link.springer.com/article/10.1007/s40436-022-00400-6
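A sketch of the residual-learning idea with a six-layer RNN, written in PyTorch; the layer sizes, input representation, and pose encoding are assumptions for illustration, not the paper's actual network.

```python
import torch
import torch.nn as nn

# Residual-learning sketch: a six-layer RNN predicts the nonlinear kinematic
# error remaining after analytical calibration, and the predicted error is
# subtracted from the nominal pose of the robot arm end.
class ResidualErrorRNN(nn.Module):
    def __init__(self, n_joints=6, hidden=64):
        super().__init__()
        self.rnn = nn.RNN(n_joints, hidden, num_layers=6, batch_first=True)
        self.head = nn.Linear(hidden, 6)   # x, y, z + three orientation errors

    def forward(self, joint_trajectory):
        out, _ = self.rnn(joint_trajectory)
        return self.head(out[:, -1])       # error at the trajectory endpoint

model = ResidualErrorRNN()
joints = torch.randn(8, 20, 6)             # batch of joint-angle trajectories
nominal_pose = torch.randn(8, 6)           # poses from the nominal kinematics
corrected_pose = nominal_pose - model(joints)
print(corrected_pose.shape)                # torch.Size([8, 6])
```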

17.
Grinding and classification processes constitute a systematic engineering problem that must comprehensively consider the influence of several factors to ensure good grinding fineness. Based on a machine learning method, this study analyzed the full set of process parameters (i.e., ball mill power, fresh ore feed rate, hydrocyclone feed pump power, hydrocyclone pressure, mill feed water flow rate, dilution water flow rate, and sump level) for an industrial grinding circuit. The collected real data (42,101 records) were employed to train and test an extreme gradient-boosting (XGBoost) regression model. The XGBoost model’s prediction ability and accuracy were evaluated and analyzed. The validated model was then employed to evaluate the relative importance and influence mechanisms of the process parameters. It was found that hydrocyclone feed pump power, dilution water flow rate, hydrocyclone pressure, and mill feed water flow rate significantly affected the grinding fineness, consistent with the actual operation of the grinding circuit.
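A sketch of the XGBoost workflow: train a regressor on the seven full-process parameters and rank their relative importance. The records here are synthetic placeholders, and the hyperparameters are illustrative.

```python
import numpy as np
from xgboost import XGBRegressor

# Synthetic stand-in for the 42,101 process records: the seven full-process
# parameters as inputs and grinding fineness as the regression target.
params = ["mill_power", "feed_rate", "pump_power", "cyclone_pressure",
          "mill_water", "dilution_water", "sump_level"]
rng = np.random.default_rng(6)
X = rng.random((42_101, len(params)))
y = X @ rng.random(len(params)) + 0.1 * rng.standard_normal(len(X))

model = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.1)
model.fit(X, y)

# Rank parameters by relative importance, the analysis the study used to
# single out pump power, dilution water, pressure, and mill feed water.
for name, score in sorted(zip(params, model.feature_importances_),
                          key=lambda pair: -pair[1]):
    print(f"{name:18s} {score:.3f}")
```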

18.
This paper aims to understand and model the effect of vibration on particle percolation. The percolation of small particles in a vibrated bed of large particles is studied by DEM. It is found that the percolation velocity (Vp) decreases with increasing vibration amplitude (A) and frequency (f) when the size ratio of small to large particles (d/D) is below the spontaneous percolation threshold of 0.154. Vibration can enable percolation when the size ratio is larger than 0.154, in which case Vp first increases with increasing A and f and then decreases. Vp can be correlated with the vibration velocity amplitude at a given size ratio. The previous radial dispersion model can still be applied, although the dispersion coefficient is affected by the vibration conditions and size ratio. Furthermore, a machine learning model is trained to predict Vp as a function of A, f, and d/D, and is then used to obtain the percolation threshold size ratio as a function of the vibration conditions.
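A minimal sketch of that final step: train a regressor to predict Vp from (A, f, d/D), then scan d/D for the threshold where Vp vanishes. The DEM data are invented, and the random-forest choice is also an assumption, since the paper does not specify the model used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Placeholder DEM results: percolation velocity Vp as a function of vibration
# amplitude A, frequency f, and size ratio d/D. Values are illustrative only.
rng = np.random.default_rng(7)
A = rng.uniform(0.0, 2e-3, 400)            # amplitude (m)
f = rng.uniform(5.0, 60.0, 400)            # frequency (Hz)
ratio = rng.uniform(0.10, 0.30, 400)       # d/D
X = np.column_stack([A, f, ratio])
vp = np.maximum(0.0, 0.20 - ratio + 1e-4 * A * f)   # toy Vp surface

model = RandomForestRegressor(n_estimators=200).fit(X, vp)

# Threshold size ratio for fixed vibration conditions: scan d/D for Vp -> 0.
ratios = np.linspace(0.10, 0.30, 200)
grid = np.column_stack([np.full_like(ratios, 1e-3),
                        np.full_like(ratios, 30.0), ratios])
threshold = ratios[np.argmax(model.predict(grid) < 1e-3)]
print(f"Estimated threshold d/D (illustrative): {threshold:.3f}")
```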

19.
Physical metallurgy (PM) and data-driven approaches can be independently applied to alloy design. Steel technology is a field of physical metallurgy around which some of the most comprehensive understanding has been developed, with vast models on the relationships between composition, processing, microstructure, and properties. They have been applied to the design of new steel alloys in pursuit of grades with improved properties. With the advent of rapid computing and low-cost data storage, a wealth of data has become available to a suite of modelling techniques referred to as machine learning (ML). ML is increasingly being applied to materials discovery, but it requires data mining, and its adoption has been limited by insufficient high-quality datasets, often leading to unrealistic materials design predictions outside the boundaries of the intended properties. It is therefore necessary to appraise the strengths and weaknesses of the PM and ML approaches to assess the real design power of each towards designing novel steel grades. This work incorporates models and datasets from well-established literature on maraging steels. A genetic algorithm (GA) was combined with PM models to optimise the parameters adopted for each dataset and maximise the prediction accuracy of the PM models, and the results were compared with ML models. The results indicate that PM approaches provide a clearer picture of the overall composition-microstructure-properties relationship but are highly sensitive to the alloy system and hence lack the ability to explore new domains. ML, conversely, provides little explicit physical insight while yielding stronger prediction accuracy for large-scale data. Hybrid PM/ML approaches provide solutions that maximise accuracy while leading to a clearer physical picture and the desired properties.
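A sketch of the parameter-optimisation step, using SciPy's differential evolution (an evolutionary method, standing in here for the GA) to fit the free parameters of a placeholder PM strength model; the model form and data are illustrative, not the paper's actual models or datasets.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Placeholder dataset: alloy compositions and measured strengths.
rng = np.random.default_rng(8)
composition = rng.random((100, 3))              # e.g., Ni, Ti, Mo fractions
measured = 1000 + 500 * composition[:, 0] + 50 * rng.standard_normal(100)

def pm_model(theta, x):
    """Toy linear PM strength model with free parameters theta."""
    base, k_ni, k_ti, k_mo = theta
    return base + x @ np.array([k_ni, k_ti, k_mo])

def rmse(theta):
    return np.sqrt(np.mean((pm_model(theta, composition) - measured) ** 2))

# Evolve the model parameters to maximise prediction accuracy on the dataset.
result = differential_evolution(rmse, seed=8,
                                bounds=[(0, 2000)] + [(-1000, 1000)] * 3)
print(f"Fitted parameters: {result.x.round(1)}, RMSE: {result.fun:.1f} MPa")
```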

20.
This paper provides an overview of modern alloy development, from discovery and optimization towards alloy design, based on combinatorial thin film materials science. The combinatorial approach, which combines combinatorial materials synthesis of thin film composition-spreads with high-throughput property characterization, has proven to be a powerful tool for delineating composition-structure-property relationships, and hence for efficiently identifying composition windows with enhanced properties. Furthermore, and most importantly for alloy design, theoretical models and hypotheses can be critically appraised. Examples of alloy discovery, optimization, and alloy design for functional as well as structural materials are presented. Using Fe-Mn based alloys as an example, we show that the combination of modern electronic-structure calculations with the highly efficient combinatorial thin film composition-spread method constitutes an effective tool for knowledge-based alloy design.
