Found 20 similar documents (search time: 15 ms)
1.
For a non-idealized machine tool, each point in the workspace is associated with a tool point positioning error vector. If this error map can be determined, then it is possible to substantially improve the positioning performance of the machine by introducing suitable compensation into the control loop. This paper explores the possibility of using an artificial neural network (ANN) to compute this mapping. The training set for the ANN is obtained by mounting a physical artifact whose dimensions are precisely known in the machine's workspace. The machine, equipped with a touch trigger probe, measures the positions of features on the artifact. The difference between the machine reading and the known dimension is the machine error at that point in the workspace. Using standard modeling techniques, the kinematic error model for a CNC turning center was developed. This model was parameterized by measurement of the parametric error functions using a laser interferometer, electronic levels and a precision square. The kinematic model was then used to simulate the artifact-measuring process and develop the ANN training set. The effect of changing artifact geometry was explored and a machining operation was simulated using the ANN output to provide compensation. The results show that the ANN is capable of learning the error map of a real machine, and that ANN-based compensation can significantly reduce part dimensional errors.
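The error-mapping idea can be sketched with a small feed-forward network. Here scikit-learn's `MLPRegressor` stands in for the paper's ANN, and the smooth two-axis error map is an invented assumption for illustration, not the turning-center kinematic model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Probed positions in a 2-axis workspace (mm), as if measured on an artifact.
X = rng.uniform(0.0, 500.0, size=(200, 2))
Xn = X / 500.0  # normalized inputs help the network train

# Hypothetical smooth error map (mm): a stand-in for the kinematic error model.
E = np.column_stack([
    1e-4 * X[:, 0] + 5e-5 * X[:, 1],                      # x positioning error
    -8e-5 * X[:, 0] + 2e-4 * np.sin(X[:, 1] / 100.0),     # y positioning error
])

# Train the ANN to reproduce the error map from nominal positions.
ann = MLPRegressor(hidden_layer_sizes=(20, 20), solver="lbfgs",
                   max_iter=5000, random_state=0).fit(Xn, E)

# Compensation: subtract the predicted error from each commanded move.
residual = float(np.abs(E - ann.predict(Xn)).mean())
uncompensated = float(np.abs(E).mean())
```

With the predicted error subtracted in the control loop, the residual error is what remains after compensation; it should be well below the raw error magnitude.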
2.
This paper describes a physics-guided logistic classification method for tool life modeling and process parameter optimization in machining. Tool life is modeled using a classification method since the exact tool life cannot be measured in a typical production environment where tool wear can only be directly measured when the tool is replaced. In this study, laboratory tool wear experiments are used to simulate tool wear data normally collected during part production. Two states are defined: tool not worn (class 0) and tool worn (class 1). The non-linear reduction in tool life with cutting speed is modeled by applying a logarithmic transformation to the inputs for the logistic classification model. A method for interpretability of the logistic model coefficients is provided by comparison with the empirical Taylor tool life model. The method is validated using tool wear experiments for milling. Results show that the physics-guided logistic classification method can predict tool life using limited datasets. A method for pre-process optimization of machining parameters using a probabilistic machining cost model is presented. The proposed method offers a robust and practical approach to tool life modeling and process parameter optimization in a production environment.
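A minimal sketch of the log-transform idea: worn/not-worn labels are generated from an assumed Taylor relation V·T^n = C, a logistic classifier is fit on (ln V, ln T), and the ratio of its coefficients recovers the Taylor exponent. All constants below are illustrative assumptions, not the paper's milling data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_true, C_true = 0.25, 400.0          # assumed Taylor parameters: V * T**n = C

V = rng.uniform(100.0, 300.0, 400)    # cutting speed (m/min)
T = rng.uniform(1.0, 120.0, 400)      # time in cut (min)

# Class 1 (worn) when the Taylor criterion is exceeded; mild noise at the boundary.
worn = (np.log(V) + n_true * np.log(T)
        + rng.normal(0.0, 0.02, V.size) > np.log(C_true)).astype(int)

# Physics-guided step: log-transform the inputs before logistic classification.
X = np.column_stack([np.log(V), np.log(T)])
clf = LogisticRegression(C=1e4, max_iter=5000).fit(X, worn)

b_v, b_t = clf.coef_[0]
n_hat = b_t / b_v                      # boundary slope recovers the exponent n
```

The decision boundary b0 + b_v·ln V + b_t·ln T = 0 has the same form as the log of the Taylor equation, which is what makes the coefficients interpretable.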
3.
Carolyn Rosé Yi-Chia Wang Yue Cui Jaime Arguello Karsten Stegmann Armin Weinberger Frank Fischer 《International Journal of Computer-Supported Collaborative Learning》2008,3(3):237-271
In this article we describe the emerging area of text classification research focused on the problem of collaborative learning process analysis both from a broad perspective and more specifically in terms of a publicly available tool set called TagHelper tools. Analyzing the variety of pedagogically valuable facets of learners’ interactions is a time consuming and effortful process. Improving automated analyses of such highly valued processes of collaborative learning by adapting and applying recent text classification technologies would make it a less arduous task to obtain insights from corpus data. This endeavor also holds the potential for enabling substantially improved on-line instruction both by providing teachers and facilitators with reports about the groups they are moderating and by triggering context sensitive collaborative learning support on an as-needed basis. In this article, we report on an interdisciplinary research project, which has been investigating the effectiveness of applying text classification technology to a large CSCL corpus that has been analyzed by human coders using a theory-based multi-dimensional coding scheme. We report promising results and include an in-depth discussion of important issues such as reliability, validity, and efficiency that should be considered when deciding on the appropriateness of adopting a new technology such as TagHelper tools. One major technical contribution of this work is a demonstration that an important piece of the work towards making text classification technology effective for this purpose is designing and building linguistic pattern detectors, otherwise known as features, that can be extracted reliably from texts and that have high predictive power for the categories of discourse actions that the CSCL community is interested in.
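The feature-based text classification pipeline described above can be sketched with scikit-learn. The tiny corpus and the two discourse categories are invented for illustration; simple unigram counts stand in for TagHelper's hand-designed linguistic pattern detectors:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy transcript segments with hand-coded discourse labels (assumed scheme).
segments = [
    "I think the answer is wrong because the units do not match",
    "you are wrong, that makes no sense",
    "maybe we should check the formula together",
    "let us compare our two solutions and see where they differ",
    "no, your claim contradicts the data we collected",
    "good idea, can you explain your reasoning step by step",
]
labels = ["challenge", "challenge", "coordination",
          "coordination", "challenge", "coordination"]

# Bag-of-words features feed a linear classifier, one model per coding dimension.
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(segments, labels)

acc = model.score(segments, labels)
pred = model.predict(["that contradicts what the data shows"])[0]
```

In practice the features would be richer pattern detectors (cue phrases, syntactic patterns) chosen for high predictive power on the coding scheme's categories.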
4.
Uncertainty evaluation of measurement results is an important part of testing and calibration work. Uncertainty is a parameter, or set of parameters, associated with a measurement result that characterizes the dispersion of the values that could reasonably be attributed to the measurand, and it must be evaluated accurately. This paper presents a practical analysis of how to evaluate the uncertainty of the measured indication error of the most commonly used two-wire pressure transmitters.
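As a worked illustration of the GUM-style evaluation the abstract refers to, the contributions below are hypothetical values for a transmitter indication-error measurement; they are combined in quadrature and expanded with coverage factor k = 2:

```python
import math

# Hypothetical standard-uncertainty contributions (kPa).
u_repeatability = 0.012                       # Type A: std. dev. of the mean
u_reference     = 0.008                       # Type B: reference standard, from certificate
u_resolution    = 0.5 * 0.01 / math.sqrt(3)   # Type B: half a digit, rectangular dist.

# Combined standard uncertainty: quadrature sum of independent contributions.
u_c = math.sqrt(u_repeatability**2 + u_reference**2 + u_resolution**2)

# Expanded uncertainty, coverage factor k = 2 (approx. 95 % coverage).
U = 2.0 * u_c
```

The rectangular divisor √3 applies because a resolution error is equally likely anywhere within ± half a digit.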
5.
6.
Iso-planar piecewise linear NC tool path generation from discrete measured data points (Cited by 7; 0 self-citations, 7 by others)
This article presents a method of generating iso-planar piecewise linear NC tool paths for three-axis surface machining using ball-end milling directly from discrete measured data points. Unlike the existing tool path generation methods for discrete points, both the machining error and the machined surface finish are explicitly considered and evaluated in the present work. The primary direction of the generated iso-planar tool paths is derived from the projected boundary of the discrete points. A projected cutter location net (CL-net) is then created, which groups the data points according to the intended machining error and surface finish requirements. The machining error of an individual data point is evaluated within its bounding CL-net cell from the adjacent tool swept surfaces of the ball-end mill. The positions of the CL-net nodes can thus be optimized and established sequentially by minimizing the machining error of each CL-net cell. Since the linear edges of adjacent CL-net cells are in general not perfectly aligned, weighted averages of the associated CL-net nodes are employed as the CL points for machining. As a final step, the redundant segments on the CL paths are trimmed to reduce machining time. The validity of the tool path generation method has been examined by using both simulated and experimentally measured data points.
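The surface-finish constraint between adjacent ball-end passes can be illustrated with the standard scallop-height relation on a flat region; this is textbook geometry, a simplification of the paper's CL-net error evaluation:

```python
import math

def scallop_height(radius: float, stepover: float) -> float:
    """Cusp height left between two adjacent ball-end passes on a flat surface."""
    return radius - math.sqrt(radius**2 - (stepover / 2.0)**2)

def stepover_for_scallop(radius: float, h: float) -> float:
    """Path spacing that keeps the scallop height at exactly h."""
    return 2.0 * math.sqrt(h * (2.0 * radius - h))

R = 5.0                                  # ball-end mill radius (mm)
s = stepover_for_scallop(R, 0.01)        # spacing for a 10 um finish requirement
h = scallop_height(R, s)                 # round-trip check
```

On curved regions the effective radius changes with local curvature, which is why the paper evaluates the error per CL-net cell rather than with a single closed-form spacing.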
7.
Each actor evaluating potential management strategies brings her/his own distinct set of objectives to a complex decision space of system uncertainties. The diversity of these objectives and uncertainties requires detailed and rigorous analyses that respond to multifaceted challenges. The utility of this information depends on the accessibility of scientific information to decision makers. This paper demonstrates data visualization tools for presenting scientific results to decision makers in two case studies, La Paz/El Alto, Bolivia, and Yuba County, California. Visualization output from the case studies combines spatiotemporal, multivariate and multirun/multiscenario information to produce information corresponding to the objectives and uncertainties described by key actors. These tools can manage complex data and distill scientific information into accessible formats. Using the visualizations, scientists and decision makers can navigate the decision space and potential objective trade-offs to facilitate discussion and consensus building. These efforts can help identify stable negotiated agreements between different stakeholders.
8.
P. Baraldi M. Librizzi E. Zio L. Podofillini V.N. Dang 《Expert systems with applications》2009,36(10):12461-12471
Problems characterized by qualitative uncertainty described by expert judgments can be addressed by the fuzzy logic modeling paradigm, structured within a so-called fuzzy expert system (FES) to handle and propagate the qualitative, linguistic assessments by the experts. Once constructed, the FES model should be verified to ensure that it correctly represents the experts’ knowledge. For FES verification, typically there is not enough data to support and compare directly the expert- and FES-inferred solutions. Thus, it is necessary to develop indirect methods for determining whether the expert system model provides a proper representation of the expert knowledge. A possible way to proceed is to examine the importance of the different input factors in determining the output of the FES model and to verify whether it is in agreement with the expert conceptualization of the model. In this view, two sensitivity and uncertainty analysis techniques applicable to generic FES models are proposed in this paper with the objective of providing appropriate tools of verification in support of the experts in the FES design phase. To analyze the insights gained by using the proposed techniques, a case study concerning a FES developed in the field of human reliability analysis has been considered.
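The verification idea, checking whether input importance matches the experts' expectations, can be sketched with a one-at-a-time sensitivity scan. The scoring function below is an invented stand-in for a real FES, not one of the paper's two techniques:

```python
import numpy as np

def toy_fes(stress: float, experience: float) -> float:
    """Invented stand-in for an FES output (e.g. an error-probability score)."""
    return 0.7 * stress + 0.3 * (1.0 - experience)

# One-at-a-time sensitivity: sweep each input over [0, 1], hold the other at 0.5.
grid = np.linspace(0.0, 1.0, 101)
range_stress = np.ptp([toy_fes(s, 0.5) for s in grid])
range_exper  = np.ptp([toy_fes(0.5, e) for e in grid])

# Verification check: the experts expect stress to dominate the output,
# so its output range should exceed that of experience.
ranking_ok = range_stress > range_exper
```

If the ranking disagreed with the expert conceptualization, the FES rules or membership functions would be revisited in the design phase.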
9.
Soft-sensor modeling for quality control based on support vector machines (Cited by 1; 0 self-citations, 1 by others)
Building on a detailed study of support vector machine theory, a soft-sensing control method based on support vector machines is proposed. To address industrial process variables that cannot be measured online and exhibit large time delays, a corresponding support vector regression model was established. The method was applied to quality control of a synthesis reactor, achieving online prediction of the output values. The effects of parameter tuning and kernel function selection on the model were analyzed, and the simulation results show a model with good generalization.
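A minimal soft-sensor sketch in the spirit of the abstract: support vector regression maps easily measured process variables to a hard-to-measure quality variable. The synthetic reactor relation and the RBF kernel choice are assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)

# Easily measured process variables: e.g. temperature and feed rate (scaled to [0, 1]).
X = rng.uniform(0.0, 1.0, size=(150, 2))

# Hypothetical quality variable that cannot be measured online.
y = np.sin(2.0 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0.0, 0.01, 150)

model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

# Online estimate at a new operating point.
y_hat = float(model.predict([[0.5, 0.5]])[0])
rmse = float(np.sqrt(np.mean((model.predict(X) - y) ** 2)))
```

The kernel and the regularization constant C play the role of the "parameter tuning and kernel function selection" the abstract analyzes.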
10.
Recent advances in state-of-the-art meta-heuristics feature the incorporation of probabilistic operators aiming to diversify search directions or to escape from being trapped in local optima. This feature results in non-deterministic outputs that vary from one run of a meta-heuristic to another. Consequently, both the average and the variation of outputs over multiple runs have to be considered in evaluating the performance of different configurations of a meta-heuristic or of distinct meta-heuristics. To this end, this work considers each algorithm as a decision-making unit (DMU) and develops robust data envelopment analysis (DEA) models taking into account not only the average but also the standard deviation of an algorithm’s output for evaluating relative efficiencies of a set of algorithms. The robust DEA models describe uncertain output using an uncertainty set, and aim to maximize a DMU’s worst-case relative efficiency with respect to that uncertainty set. The proposed models are employed to evaluate a set of distinct configurations of a genetic algorithm and a set of parameter settings of a simulated annealing heuristic. Evaluation results demonstrate that the robust DEA models are able to identify efficient algorithmic configurations. The proposed models contribute not only to the evaluation of meta-heuristics but also to the DEA methodology.
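In the single-input, single-output special case, CCR-style DEA efficiency reduces to an output/input ratio, which makes the robust idea easy to sketch: each algorithm's uncertain output is replaced by its worst case (mean minus one standard deviation) before relative efficiencies are computed. The figures and the one-sigma uncertainty set are invented; the paper's models handle the general multi-input/multi-output LP formulation:

```python
# Each algorithm configuration (DMU) consumes one input (runtime budget, s)
# and produces one uncertain output (solution quality over repeated runs).
dmus = {
    "GA-a": {"input": 10.0, "mean": 95.0, "std": 8.0},
    "GA-b": {"input": 12.0, "mean": 97.0, "std": 2.0},
    "SA-a": {"input": 8.0,  "mean": 88.0, "std": 12.0},
}

# Worst-case output under a one-standard-deviation uncertainty set.
worst = {k: d["mean"] - d["std"] for k, d in dmus.items()}

# Single-input/single-output efficiency: (output/input) relative to the best ratio.
ratios = {k: worst[k] / dmus[k]["input"] for k in dmus}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
```

Note how the ranking depends on both the mean and the spread: a configuration with a high average but large run-to-run variation can lose its lead once the worst case is used.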
11.
We present a heuristic-based multi-objective optimization approach for minimizing staff and maximizing throughput at Points-of-Dispensing (PODs). PODs are sites quickly set up by local health departments to rapidly dispense life-saving medical countermeasures during large-scale public health emergencies. Current modeling tools require decision makers to modify their models and re-run them for each “what if” scenario they are charged with preparing for, e.g. what happens if more/less staff are available. The exploration of these “what if” scenarios becomes tedious if there are many variables to change, and the decision space quickly becomes too large to analyze effectively. Currently, to understand the trade-offs between throughput and staffing levels, public health emergency managers must maximize throughput subject to a specified staffing level. Then, they must repeatedly change the constraint (altering the maximum staff allowed) and re-run the model. In contrast, by approaching the problem from a multi-objective perspective and integrating discrete event and optimization tools, we automate the exploration of the decision space. This approach allows public health emergency planners to examine far more potential solutions and to focus tangible planning resources on areas that show theoretical promise. Such an approach can also expose previously unidentified constraints in existing plans.
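The multi-objective idea can be sketched by enumerating staffing levels, estimating throughput with a simple saturating service model, and keeping only the non-dominated (staff, throughput) pairs. The throughput formula is a stand-in assumption, not the paper's discrete-event model:

```python
def throughput(staff: int, rate_per_staff: float = 40.0,
               capacity: float = 1500.0) -> float:
    """Hypothetical people/hour served: linear in staff, capped by site capacity."""
    return min(rate_per_staff * staff, capacity)

# Enumerate candidate staffing levels and evaluate both objectives.
candidates = [(s, throughput(s)) for s in range(5, 61, 5)]

# Keep the Pareto front: minimize staff, maximize throughput.
front = [(s, t) for s, t in candidates
         if not any(s2 <= s and t2 >= t and (s2, t2) != (s, t)
                    for s2, t2 in candidates)]
```

Instead of one constrained optimum per re-run, the planner sees the whole trade-off curve at once; here every staffing level past the capacity cap drops off the front because extra staff buy no throughput.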
12.
Consensus decision making is complex and challenging in multicriteria group decision making due to the involvement of several decision makers, the presence of multiple, often conflicting, criteria, and the existence of subjectiveness and imprecision in the decision making process. To ensure that effective decisions are made, the interest of all the decision makers, usually represented by the degree of consensus in the decision making process, has to be adequately considered. This paper presents a consensus-based approach for effectively solving the multicriteria group decision making problem. The subjectiveness and imprecision of the decision making process is adequately handled by using intuitionistic fuzzy numbers. An interactive algorithm is developed for consensus building in the group decision making process. A decision support system framework is presented for improving the effectiveness of the consensus building process. An example is presented for demonstrating the applicability of the proposed approach for solving the multicriteria group decision making problem in real world situations.
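A small sketch of consensus measurement with intuitionistic fuzzy numbers: each decision maker's assessment is an IFN (membership μ, non-membership ν), and the consensus degree is taken as one minus the average normalized Hamming distance to the group mean. The aggregation rule and the 0.8 threshold are illustrative assumptions, not the paper's interactive algorithm:

```python
# Each decision maker's assessment of one alternative on one criterion,
# as an intuitionistic fuzzy number (mu, nu) with mu + nu <= 1.
assessments = [(0.7, 0.2), (0.6, 0.3), (0.8, 0.1), (0.65, 0.25)]

# Group position: component-wise mean of the IFNs.
mu_bar = sum(a[0] for a in assessments) / len(assessments)
nu_bar = sum(a[1] for a in assessments) / len(assessments)

def ifn_distance(a, b):
    """Normalized Hamming distance between two IFNs (hesitancy pi = 1 - mu - nu)."""
    pi_a, pi_b = 1 - a[0] - a[1], 1 - b[0] - b[1]
    return 0.5 * (abs(a[0] - b[0]) + abs(a[1] - b[1]) + abs(pi_a - pi_b))

# Consensus degree: 1 minus the average distance to the group position.
consensus = 1 - sum(ifn_distance(a, (mu_bar, nu_bar))
                    for a in assessments) / len(assessments)

reached = consensus >= 0.8   # assumed consensus threshold
```

In an interactive scheme, a decision maker far from the group position would be asked to reconsider, and the consensus degree recomputed, until the threshold is met.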
13.
Hamidreza Zoraghein Ali A. Alesheikh Abbas Alimohammadi Mohammad H. Vahidnia 《Computers, Environment and Urban Systems》2012
Indicator Kriging (IK) is a geostatistical method that uses observation points to quantify the probabilities at which a set of thresholds are exceeded at unmeasured points. To improve IK accuracy, the interpolation process should consider its uncertainty sources. By doing this, we also maintain its ability to provide the conditional cumulative distribution function (ccdf), which is a reliable measure of local uncertainty. This study modeled two IK uncertainty sources, i.e., measurement errors attached to observation points and subjective threshold choices. Soft Indicator Kriging (SIK), which uses a soft transformation for observation points, considers the measurement errors of these two sources. To select the thresholds objectively, a genetic algorithm (GA) was applied to obtain the optimum set of thresholds with respect to an objective function that minimized the mean absolute error (MAE).
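A compact sketch of the GA-for-thresholds idea: candidate threshold sets are evolved to minimize an MAE objective. Here the objective (MAE when each value is represented by its nearest threshold), the operators, and the data are all illustrative assumptions, not the study's kriging-based objective:

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(50.0, 10.0, 300)       # synthetic observations

def mae(thresholds):
    """MAE when each value is represented by its nearest threshold."""
    t = np.asarray(thresholds)
    return float(np.abs(data[:, None] - t[None, :]).min(axis=1).mean())

pop_size, n_thr, n_gen = 30, 4, 60
pop = rng.uniform(data.min(), data.max(), size=(pop_size, n_thr))
first_best = min(mae(ind) for ind in pop)

for _ in range(n_gen):
    fitness = np.array([mae(ind) for ind in pop])
    elite = pop[np.argsort(fitness)][: pop_size // 2]       # truncation selection
    pairs = elite[rng.integers(0, len(elite), size=(pop_size - len(elite), 2))]
    children = pairs.mean(axis=1) \
        + rng.normal(0.0, 1.0, (pop_size - len(elite), n_thr))  # crossover + mutation
    pop = np.vstack([elite, children])                      # elitism keeps the best

best_mae = min(mae(ind) for ind in pop)
```

Because the elite half is carried over unchanged each generation, the best MAE can only improve or stay constant over the run.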
14.
15.
16.
The geometry of cutting flutes and the surfaces of end mills is one of the crucial parameters affecting the quality of the machining in the case of end milling. These are usually represented by two-dimensional models. This paper describes in detail the methodology to model the geometry of a flat end mill in terms of three-dimensional parameters. The geometric definition of the end mill is developed in terms of surface patches; flutes as helicoidal surfaces, the shank as a surface of revolution and the blending surfaces as bicubic Bezier and biparametric sweep surfaces. The proposed model defines the end mill in terms of three-dimensional rotational angles rather than the conventional two-dimensional angles. To validate the methodology, the flat end milling cutter is directly rendered in an OpenGL environment in terms of three-dimensional parameters. Further, an interface is developed that directly pulls the proposed three-dimensional model defined with the help of parametric equations into a commercial CAD modeling environment. This facilitates a wide range of downstream technological applications. The modeled tool is used for finite element simulations to study the cutting flutes under static and transient dynamic load conditions. The results of stress distribution (von Mises stress), translational displacement and deformation are presented for static and transient dynamic analysis for the end mill cutter flute and its body. The method described in this paper offers a simple and intuitive way of generating high-quality end mill models for use in machining process simulations. It can be easily extended to generate other tools without relying on analytical or numerical formulations.
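The helicoidal flute description can be illustrated with the parametric curve of a cutting edge on the cylindrical envelope: for a flat end mill of radius r and helix angle β, the edge advances axially as z = r·θ / tan β. The dimensions below are illustrative, not the paper's surface-patch model:

```python
import math

def helix_edge(radius: float, helix_angle_deg: float, turns: float, n: int = 100):
    """Points along a cutting-edge helix on a cylindrical flat end mill."""
    beta = math.radians(helix_angle_deg)
    pts = []
    for i in range(n + 1):
        theta = 2.0 * math.pi * turns * i / n
        pts.append((radius * math.cos(theta),
                    radius * math.sin(theta),
                    radius * theta / math.tan(beta)))   # axial advance of the edge
    return pts

pts = helix_edge(radius=5.0, helix_angle_deg=30.0, turns=1.0)
lead = pts[-1][2]   # axial advance per revolution: 2*pi*r / tan(beta)
```

Sweeping the flute cross-section along such a curve is what produces the helicoidal surface patch the abstract describes.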
17.
Sustainable management of groundwater-dependent vegetation (GDV) requires the accurate identification of GDVs, characterisation of their water use dynamics and an understanding of associated errors. This paper presents sensitivity and uncertainty analyses of one GDV mapping method which uses temperature differences between time-series of modelled and observed land surface temperature (LST) to detect groundwater use by vegetation in a subtropical woodland. Uncertainty in modelled LST was quantified using the Jacobian method with error variances obtained from literature. Groundwater use was inferred where modelled and observed LST were significantly different using a Student's t-test. Modelled LST was most sensitive to low-range wind speeds (<1.5 m s−1), low-range vegetation height (<=0.5 m), and low-range leaf area index (<=0.5 m2 m−2), limiting the detectability of groundwater use by vegetation under such conditions. The model-data approach was well-suited to detection of GDV because model-data errors were lowest for climatic conditions conducive to groundwater use.
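The model-data significance test at the core of the mapping method can be sketched with SciPy: two LST time series are compared with a Student's t-test, and a significant cooling of observed relative to modelled LST is read as evidence of groundwater use. The synthetic temperatures are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Modelled LST assumes no groundwater access; observed pixels run cooler
# where vegetation transpires groundwater (values in kelvin, illustrative).
modelled = rng.normal(305.0, 1.0, 60)
observed = rng.normal(303.0, 1.0, 60)    # about 2 K cooler than the model

t_stat, p_value = stats.ttest_ind(modelled, observed)
groundwater_use = bool(p_value < 0.05 and observed.mean() < modelled.mean())
```

The sensitivity results in the abstract matter here: where modelled-LST uncertainty is large (low wind, short sparse vegetation), the same 2 K difference may no longer be statistically detectable.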
18.
An approximation method and error analysis for river water quality models (Cited by 2; 0 self-citations, 2 by others)
A simplification method for steady-state river water quality models is proposed. By analyzing the solutions of the Streeter-Phelps model with and without dispersion, the conditions under which dispersion can be neglected are given together with an analysis of the maximum resulting error, and a general expression for a multi-reach model is then derived. Numerical calculations and field validation show that the method yields results satisfactory for engineering purposes.
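The dispersion-free Streeter-Phelps solution the abstract builds on can be written down directly: the dissolved-oxygen deficit D(t) under deoxygenation rate k_d and reaeration rate k_a, together with the critical (maximum-deficit) travel time. Parameter values are illustrative:

```python
import math

def do_deficit(t, L0, D0, kd, ka):
    """Streeter-Phelps DO deficit (mg/L) at travel time t (d), dispersion neglected."""
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

L0, D0, kd, ka = 20.0, 2.0, 0.35, 0.7    # BOD, initial deficit (mg/L), rates (1/d)

# Critical travel time where the deficit peaks (sag point).
tc = (1.0 / (ka - kd)) * math.log((ka / kd) * (1.0 - D0 * (ka - kd) / (kd * L0)))
Dc = do_deficit(tc, L0, D0, kd, ka)
```

At the sag point deoxygenation and reaeration balance, so D(t_c) = (k_d/k_a)·L0·e^(−k_d·t_c), a useful internal consistency check.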
19.
Daily MODIS land surface temperature (LST) is affected by weather, and valid pixels are severely missing, which is especially critical in data-scarce regions. Taking the Gurbantunggut Desert as the study area, a method for spatially downscaling LST using AMSR-2 vertically polarized brightness temperature together with vegetation indices was explored, and the method was used to fill the missing MODIS pixels for 2018. (1) Using ten-fold cross-validation, training models under four machine learning algorithms (Cubist, DBN, SVM, RF), ten band combinations, and two spatial scales (5 km, 10 km) were analyzed; the RF algorithm was clearly more accurate than the other three, and the C09 band combination validated better than the other combinations. (2) Two robust random forest LST downscaling models (5 km|RF|09 and 10 km|RF|09) were built to downscale AMSR-2 brightness temperature to 1 km resolution; the 5 km|RF|09 model gave more reasonable retrievals, with R2 of 0.971 and 0.930 against MODIS and station validation, RMSE of 3.38 K and 4.71 K, and MAE of 2.51 K and 3.84 K, respectively. (3) The downscaled results were used to fill the missing MODIS LST pixels and applied to a long time-series analysis of land surface temperature in the Gurbantunggut Desert, providing a scientific reference for data acquisition in data-scarce regions.
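The downscaling step can be sketched with a random forest trained at the coarse scale and then applied to fine-scale predictors. The relation between brightness temperature, NDVI, and LST below is synthetic, not the study's C09 band combination:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)

# Coarse-scale (e.g. 5 km) training pairs: predictors -> MODIS LST (K).
tb   = rng.uniform(240.0, 290.0, 400)        # AMSR-2 V-pol brightness temperature
ndvi = rng.uniform(0.05, 0.4, 400)           # sparse desert vegetation
lst  = 1.05 * tb - 25.0 * ndvi + 10.0 + rng.normal(0.0, 1.0, 400)

rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(np.column_stack([tb, ndvi]), lst)

# Scale-invariance assumption: apply the coarse-scale model at 1 km predictors.
lst_1km = float(rf.predict([[265.0, 0.2]])[0])
rmse = float(np.sqrt(np.mean((rf.predict(np.column_stack([tb, ndvi])) - lst) ** 2)))
```

The key modeling assumption, as in most statistical downscaling, is that the predictor-LST relation learned at 5 km still holds at 1 km; the gap-filled pixels inherit that assumption.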
20.
An active control system based on parallel error-path modeling (Cited by 1; 0 self-citations, 1 by others)
An active control system based on a parallel online error-path modeling algorithm is proposed. The system uses three adaptive filters simultaneously and introduces a delay unit to guarantee the uniqueness of the filter solutions. Mathematical analysis and simulation results show that the control algorithm obtains an unbiased estimate of the error path and reduces the overall cost of the active noise control system.
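The core modeling ingredient can be sketched with a single LMS identifier: white noise drives an unknown error-path FIR filter, and an adaptive filter converges to its impulse response. The three-filter parallel structure and the delay unit of the paper are not reproduced here; this shows only the basic LMS update on invented coefficients:

```python
import numpy as np

rng = np.random.default_rng(6)

h_true = np.array([0.8, -0.4, 0.2])     # unknown error-path impulse response
n_taps, mu, n_samples = 3, 0.05, 5000

x = rng.normal(0.0, 1.0, n_samples)     # auxiliary white-noise excitation
d = np.convolve(x, h_true)[:n_samples]  # error-path output

w = np.zeros(n_taps)                    # adaptive model of the error path
buf = np.zeros(n_taps)
for k in range(n_samples):
    buf = np.roll(buf, 1)
    buf[0] = x[k]                       # tap-delay line: x[k], x[k-1], x[k-2]
    e = d[k] - w @ buf                  # modeling error
    w += mu * e * buf                   # LMS update

model_error = float(np.max(np.abs(w - h_true)))
```

With white, persistently exciting input and no measurement noise, the LMS estimate is unbiased and converges to the true error-path coefficients, which is the property the paper's parallel structure preserves while the controller runs.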