Abstract: An estimated 4.6 billion tons of non-hazardous solid waste materials are produced annually in the USA. The potential to reuse a portion of these materials in the construction of highways and roads suggests that valuable economic and environmental gains are possible. This paper describes the development of a prototype computer-assisted tool, or expert system, to help manufacturers assess and analyze their industrial residuals as potential road construction material. This represents an expansion of intelligent systems into a domain where a few hard-to-find technical reports have long been the main source of expertise available to practitioners. The system, developed with the object-oriented software shell Level5 Object, was designed in a user-friendly Windows environment that allows users with little or no computer training to evaluate material residuals effectively.
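The kind of rule-based screening such an expert system encodes can be sketched in a few lines. The property names and thresholds below are hypothetical illustrations, not taken from the actual Level5 Object knowledge base:

```python
def evaluate_residual(props):
    """Toy forward-chaining rules of the kind an expert-system shell
    encodes. All property names and thresholds here are hypothetical,
    for illustration only."""
    reasons = []
    if props.get("leachable_heavy_metals", False):
        reasons.append("fails environmental screening")
    if props.get("max_particle_size_mm", 0) > 75:
        reasons.append("requires crushing before use as base material")
    if not reasons:
        return "candidate for highway base/subbase", reasons
    return "needs further treatment", reasons

verdict, why = evaluate_residual({"leachable_heavy_metals": False,
                                  "max_particle_size_mm": 40})
```

A real knowledge base would chain many more rules over material, environmental, and engineering properties; the point is only the if-then structure the shell manages.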
Currently, the design of aesthetic products requires a set of activities in which digital models and physical mockups play a key role. Typically, these are modified (and rebuilt) several times before the desired design is reached, increasing the development time and, consequently, the final product cost. In this paper, we present an innovative design environment for computer-aided design (CAD) surface analysis. Our system relies on a direct visuo-haptic display that enables users to visualize models in a stereoscopic view and to evaluate sectional curves by touch. Profile curves are rendered by a haptic device that deforms a plastic strip, by means of a set of actuators, to reproduce the curvature of the shape co-located with the virtual model. By touching the strip, users can evaluate shape characteristics, such as curvature or discontinuities (rendered using sound), and assess the surface quality. We believe that future computer-aided styling (CAS)/CAD systems based on our approach will contribute to improving the design process at the industrial level. Moreover, they will allow companies to reduce product development time, by reducing the number of physical mockups needed for design evaluation, and to increase the quality of the final product by allowing a wider exploration and comparative evaluation of alternatives in the given time.
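As a rough illustration of the quantity the strip reproduces, the unsigned curvature along a sampled sectional curve can be estimated with finite differences. This is a generic numerical sketch, not the system's actual rendering pipeline:

```python
import numpy as np

def discrete_curvature(points):
    """Estimate unsigned curvature at the samples of a planar polyline
    using central finite differences: kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2).
    A common approximation for inspecting sectional curves; illustrative only."""
    pts = np.asarray(points, dtype=float)
    d1 = np.gradient(pts, axis=0)   # first derivative w.r.t. sample index
    d2 = np.gradient(d1, axis=0)    # second derivative
    num = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0])
    den = (d1[:, 0] ** 2 + d1[:, 1] ** 2) ** 1.5
    return num / den

# Sanity check: a circle of radius 2 has constant curvature 1/2.
t = np.linspace(0, 2 * np.pi, 400)
circle = np.column_stack([2 * np.cos(t), 2 * np.sin(t)])
k = discrete_curvature(circle)
```

Because the formula is invariant under reparameterization, uniform sampling in the index suffices; discontinuities show up as spikes in `k`, which is what the sound rendering described above would flag.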
This research aims to illustrate the potential use of mining concepts, techniques, and tools to improve the systematic review process. A review was performed on two online databases (Scopus and ISI Web of Science) covering 2012 to 2019. A total of 9,649 studies were identified and analyzed using probabilistic topic modeling within a machine learning approach. The Latent Dirichlet Allocation (LDA) method chosen for the modeling required two stages: 1) data cleansing, and 2) modeling the data into topics for coherence and perplexity analysis. All research was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) standards in a fully computerized way. The computational literature review is an integral part of a broader literature review process. The results met three criteria: (1) literature review for a research area, (2) analysis and classification of journals, and (3) analysis and classification of academic and individual research teams. The contribution of the article is to demonstrate how the publication network is formed in this particular field of research, and how the content of abstracts can be automatically analyzed to provide a set of research topics for quick understanding and application in future projects.
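Stage 1 (data cleansing) typically reduces each abstract to a bag of informative tokens before LDA is fitted. A minimal, generic sketch of such a pass (the paper's exact cleaning rules are not specified here):

```python
import re

# A tiny illustrative stopword list; real pipelines use a full one.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on", "with"}

def clean_abstract(text, min_len=3):
    """Minimal cleaning of the kind LDA pipelines apply: lowercase,
    keep alphabetic tokens only, drop stopwords and very short tokens."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return [t for t in tokens if t not in STOPWORDS and len(t) >= min_len]

docs = ["Topic modeling of 9,649 abstracts with LDA.",
        "The review followed PRISMA standards."]
cleaned = [clean_abstract(d) for d in docs]
```

The cleaned token lists would then be vectorized into document-term counts and passed to an LDA implementation, with the topic count chosen by the coherence and perplexity analysis the abstract mentions.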
The simulation of laser wakefield accelerators with particle-in-cell codes in relativistic reference frames is described, with emphasis on the computational speed-ups, which may potentially exceed three orders of magnitude compared with laboratory-frame configurations. The initialization of laboratory quantities in a relativistically moving frame is presented, and the method for comparing results with the plasma rest frame is described. Benchmarks against laboratory-frame simulations and experimental data, in which gains of ∼20 times were obtained, are discussed, and potential numerical issues are analyzed. This method enables numerical simulations with the shorter turnaround times required for parameter scanning and for one-to-one three-dimensional modeling of current and next-generation laser wakefield experiments.
A homogeneous set is a non-trivial module of a graph, i.e. a non-empty, non-unitary, proper subset of a graph's vertices such that all its elements present exactly the same outer neighborhood. Given two graphs, the Homogeneous Set Sandwich Problem (HSSP) asks whether there exists a sandwich graph which has a homogeneous set. In 2001 Tang et al. published a fast algorithm which was recently proven wrong, so that the HSSP's known upper bound would have been reset thereafter to the former bound determined by Cerioli et al. in 1998. We present, notwithstanding, new deterministic algorithms which establish an improved upper bound. We also give two even faster randomized algorithms, whose simplicity might lend them didactic usefulness. We believe that, besides providing efficient, easy-to-implement procedures to solve it, the study of these new approaches allows a fairly thorough understanding of the problem.
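The definition itself is easy to check by brute force on small graphs. The exponential-time sketch below only illustrates the definition; it is none of the paper's algorithms:

```python
from itertools import combinations

def find_homogeneous_set(adj):
    """Brute-force search for a homogeneous set in an undirected graph
    given as {vertex: set of neighbors}: a subset H with 2 <= |H| < |V|
    all of whose members have the same neighborhood outside H.
    Exponential time; for illustrating the definition only."""
    vertices = sorted(adj)
    for size in range(2, len(vertices)):          # non-unitary, proper subsets
        for combo in combinations(vertices, size):
            h = set(combo)
            outer = {frozenset(adj[v] - h) for v in h}
            if len(outer) == 1:                   # identical outer neighborhoods
                return h
    return None

# Two leaves hanging off the same center share the outer neighborhood {3}.
g = {1: {3}, 2: {3}, 3: {1, 2}, 4: set()}
h = find_homogeneous_set(g)
```

By contrast, the path a-b-c-d (the graph P4) is prime: every candidate subset has members with differing outer neighborhoods, so the function returns `None` for it.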
We discuss how to increase and simplify the understanding of the equivalence relations between machine models and/or language representations of formal languages by means of the animation tool SAGEMoLiC. Our new educational tool permits the simulation of models of computation, as many other animation systems do, but its philosophy goes further than that of the usual systems, since it allows a true visualization of the key notions involved in the formal proofs of these equivalences. In contrast with previous systems, our approach to visualizing equivalence theorems is not a simple "step by step animation" of specific conversion algorithms between computational models and/or grammatical representations of formal languages, because we emphasize the key theoretical notions involved in the formal proofs of these equivalences.
The multiple determination of chemical properties is a classical problem in analytical chemistry. The main challenge is to find the subset of variables that best represents the compounds. These variables are obtained with a spectrophotometer, a device that measures hundreds of correlated variables related to physicochemical properties, which can be used to estimate the component of interest. The problem is thus the selection of a subset of informative and uncorrelated variables that helps minimize the prediction error. Classical algorithms select one subset of variables for each compound considered. In this work we propose the use of SPEA-II (Strength Pareto Evolutionary Algorithm II) and show that this variable selection algorithm can select a single subset to be used for multiple determinations via multiple linear regression. The case study uses wheat data obtained by NIR (near-infrared) spectroscopy, where the objective is to determine a variable subgroup carrying information about protein content (%), test weight (kg/hl), WKT (wheat kernel texture) (%), and farinograph water absorption (%). Results of traditional multivariate calibration techniques, namely SPA (successive projections algorithm), PLS (partial least squares), and a mono-objective genetic algorithm, are presented for comparison. For NIR spectral analysis of protein concentration in wheat, SPEA-II reduced the number of selected variables from 775 spectral variables to just 10, and the prediction error decreased from 0.2 with the classical methods to 0.09 with the proposed approach. The model using variables selected by SPEA-II had better prediction performance than the classical algorithms and full-spectrum partial least squares.
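The fitness such a wrapper selector minimizes is the prediction error of a multiple linear regression built on a candidate variable subset. A minimal sketch on synthetic data, using exhaustive search in place of the evolutionary algorithm (illustrative, not the SPEA-II implementation):

```python
import numpy as np
from itertools import combinations

def rmsep(X_train, y_train, X_test, y_test, cols):
    """RMS prediction error of a multiple linear regression fitted on
    the selected variable subset `cols` -- the quantity a wrapper
    selector such as SPEA-II would minimize."""
    A = np.column_stack([X_train[:, cols], np.ones(len(X_train))])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    B = np.column_stack([X_test[:, cols], np.ones(len(X_test))])
    return float(np.sqrt(np.mean((B @ coef - y_test) ** 2)))

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))                 # 8 "spectral" variables
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + 0.05 * rng.normal(size=60)  # only vars 1, 4 matter
best = min(combinations(range(8), 2),
           key=lambda c: rmsep(X[:40], y[:40], X[40:], y[40:], list(c)))
```

With 775 real spectral variables, exhaustive search over subsets is infeasible, which is precisely why an evolutionary multi-objective search such as SPEA-II is used to navigate the subset space.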
One of the most common effects of aphasia is difficulty recalling names or words. Typically, word retrieval problems are treated through word naming therapeutic exercises; indeed, the frequency and intensity of speech therapy are key factors in the recovery of lost communication abilities. In this sense, speech and language technology can make a relevant contribution to the development of automatic therapy methods. In this work, we present an online system designed to behave as a virtual therapist, incorporating automatic speech recognition technology that permits aphasia patients to perform word naming training exercises. We focus on the study of the automatic word naming detector module and on its utility for both global evaluation and treatment. For that purpose, a database of word naming therapy sessions with aphasic Portuguese native speakers was collected. Despite the varied patient characteristics and speech quality conditions of the collected data, encouraging results have been obtained thanks to a calibration method that uses the patients' word naming ability to adapt automatically to their speech particularities.
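The idea of calibrating a detector to a patient's overall ability can be sketched as a per-patient shift of the acceptance threshold. This is purely illustrative; the function, parameters, and strategy below are hypothetical and do not reproduce the paper's calibration method:

```python
def calibrated_threshold(base_threshold, naming_ability, strength=0.5):
    """Toy per-patient calibration: shift the word-naming detector's
    acceptance threshold according to the patient's overall naming
    ability (fraction of words produced correctly, in [0, 1]).
    All parameters are hypothetical, for illustration only."""
    # Weaker patients get a more permissive threshold, stronger ones a stricter one.
    return base_threshold + strength * (naming_ability - 0.5)

t_weak = calibrated_threshold(0.6, naming_ability=0.2)    # more permissive
t_strong = calibrated_threshold(0.6, naming_ability=0.9)  # stricter
```

The design intuition is that a fixed threshold penalizes patients with atypical speech; conditioning it on measured ability trades a few false acceptances for far fewer discouraging false rejections.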
This paper presents a new methodology to evaluate loss-of-load indices, with particular emphasis on LOLC (loss of load cost) assessment, for composite generation and transmission systems, considering time-varying loads for different areas or buses. The proposed approach, named pseudo-chronological simulation, retains the computational efficiency of non-sequential Monte Carlo simulation and the ability of sequential simulation to model chronological load curves. It considers the actual blocks of unsupplied energy per consumer class and per bus, with their respective durations, to accurately characterize the interruption process. Case studies on the IEEE-MRTS (Modified Reliability Test System) and the BSS (Brazilian South-Southeastern System) are presented and discussed.
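The non-sequential state-sampling core that the method builds on can be sketched for a single-bus toy system; the pseudo-chronological extension and the LOLC computation are not reproduced here:

```python
import random

def lolp_nonsequential(units, fors, load, n_samples=200_000, seed=1):
    """Non-sequential Monte Carlo estimate of the loss-of-load
    probability (LOLP) for a single-bus system: each sample draws every
    generator up/down from its forced outage rate (FOR) and checks
    whether the surviving capacity covers the load. A toy version of
    the state-sampling core; composite-system effects are omitted."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        cap = sum(c for c, q in zip(units, fors) if rng.random() >= q)
        if cap < load:
            failures += 1
    return failures / n_samples

# Two 50 MW units with FOR 0.1 serving a 60 MW load: load is lost unless
# both units are up, so the exact LOLP is 1 - 0.9 * 0.9 = 0.19.
p = lolp_nonsequential(units=[50, 50], fors=[0.1, 0.1], load=60)
```

The attraction of non-sequential sampling, preserved by the pseudo-chronological approach, is that each state is drawn independently, so the estimate converges without simulating every hour of a chronological year.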