Similar Documents
A total of 20 similar documents were found (search time: 93 ms).
1.
2.
In this paper, a generic model for the synthesis of tolerances for manufactured parts is presented. The model uses a method of transforming traditional tolerance specifications (as defined in ASME Y14.5M) to a generalized coordinate system (hereinafter referred to as the deviation space). Small displacement torsors (SDTs) are used to represent the deviations. The tolerance synthesis method is formulated as a constrained nonlinear optimization process. Three different types of constraints are considered in the optimization: 1) representation of assemblability; 2) mapping of tolerance specifications to deviations; and 3) functional requirements. A new deviation-based cost-of-manufacturing model is proposed. A working module of the scheme has been implemented, and the process is elaborated with two examples. The possibility of extending the model and the scope for further generalization are discussed. Note to Practitioners: This paper presents an optimization method for finding tolerance values for the different features of an assembly of manufactured parts. Determining the types of tolerances and their exact values for the critical features of a part has long been an ad-hoc process; until recently it has been mostly experience-based. In this paper, efforts have been made to establish a mathematically rigorous method for the synthesis of tolerances. The procedure minimizes the cost of manufacturing while satisfying functionality and assemblability. It is a generalized method that can be applied to the design of rigid manufactured parts. For practical use, this method should be integrated with three-dimensional computer-aided design packages as a tolerance synthesis module for integrated tolerance design.
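As a rough, hypothetical illustration of the kind of constrained nonlinear optimization described in this abstract (not the authors' actual formulation), the Python sketch below allocates two tolerances by minimizing an assumed reciprocal cost-versus-tolerance model subject to a simple stack-up limit; the cost coefficients, bounds, and stack limit are all invented.

# Hypothetical sketch: allocate tolerances t1, t2 by minimizing an assumed
# reciprocal cost model subject to a worst-case stack-up constraint.
from scipy.optimize import minimize

COST_COEFF = [1.0, 2.0]        # assumed cost-vs-tolerance coefficients
STACK_LIMIT = 0.10             # assumed functional requirement (mm)

def cost(t):
    # looser tolerances are cheaper to manufacture
    return sum(c / ti for c, ti in zip(COST_COEFF, t))

constraints = [{"type": "ineq", "fun": lambda t: STACK_LIMIT - (t[0] + t[1])}]
bounds = [(0.005, 0.08), (0.005, 0.08)]     # assumed process capability limits

result = minimize(cost, x0=[0.02, 0.02], bounds=bounds, constraints=constraints)
print("allocated tolerances:", result.x)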

3.
For several years, the Digital Mock-Up (DMU) has been enriched by integrating many tools, such as Finite Element (FE) analysis, Computer Aided Manufacturing (CAM), and Computer Aided Tolerancing (CAT), into the Computer Aided Design (CAD) model. In the geometrical model, the tolerances, which specify the requirements for the proper functioning of mechanical systems, are formally represented. Nominal modeling of parts and assemblies, however, does not allow the impact of tolerances on simulation results, such as the assemblability of the mechanical system, to be predicted. Bringing the CAD model closer to a realistic model is therefore necessary to verify and validate the assemblability of the mechanical system. This paper proposes a new approach for integrating tolerances into the CAD model by determining the defective configurations of a CAD part used in a mechanical system. The realistic parts are computed according to the dimensional and geometrical tolerances, so the resulting assembly is closer to the real assembly of the mechanical system. Replacing the nominal parts with the realistic ones requires the initially defined assembly mating constraints to be redefined; the mating constraints are updated so as to respect an Objective Function of the Assembly (OFA). Integrating tolerances into CAD allows the behavior of mechanical assemblies to be visualized and simulated in their real configuration, and makes it possible to detect interference and collision effects between parts that are undetectable in the nominal state.

4.
Process mining can be seen as the “missing link” between data mining and business process management. The lion's share of process mining research has been devoted to the discovery of procedural process models from event logs. However, often there are predefined constraints that (partially) describe the normative or expected process, e.g., “activity A should be followed by B” or “activities A and B should never be both executed”. A collection of such constraints is called a declarative process model. Although it is possible to discover such models based on event data, this paper focuses on aligning event logs and predefined declarative process models. Discrepancies between log and model are mediated such that observed log traces are related to paths in the model. The resulting alignments provide sophisticated diagnostics that pinpoint where deviations occur and how severe they are. Moreover, selected parts of the declarative process model can be used to clean and repair the event log before applying other process mining techniques. Our alignment-based approach for preprocessing and conformance checking using declarative process models has been implemented in ProM and has been evaluated using both synthetic logs and real-life logs from a Dutch hospital.
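The alignment technique itself is involved; as a toy illustration of the kind of declarative constraint it works with, the Python fragment below merely checks the “response” constraint (every occurrence of A must eventually be followed by B) on a trace. The traces and activity names are invented, and a real alignment would additionally compute the cheapest combination of log moves and model moves.

# Toy check of the declarative "response" constraint on a single trace.
def satisfies_response(trace, a, b):
    pending = False
    for activity in trace:
        if activity == a:
            pending = True       # A observed, a later B is now required
        elif activity == b:
            pending = False      # B discharges any pending A
    return not pending

print(satisfies_response(["A", "C", "B", "A", "B"], "A", "B"))  # True
print(satisfies_response(["B", "A", "C"], "A", "B"))            # False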

5.
A “softening” of a basic formulation of multicriterion optimization and control (multistage decision making) is presented. For optimization, instead of seeking an optimal solution that best satisfies all of the fuzzy objectives, as has been done so far, we seek an optimal solution that best satisfies most, much more than 50%, etc. (in general, a linguistic quantifier) of the fuzzy objectives. For control, we analogously seek an optimal sequence of controls that best satisfies the fuzzy constraints and fuzzy goals at most, at much more than 50%, etc., of the control stages. A calculus of linguistically quantified statements based on fuzzy sets and possibility theory is used. Some applications to softer evidence aggregation in expert systems are also indicated.
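As a small numeric illustration of the quantifier calculus sketched above, the fragment below evaluates the truth of “most of the fuzzy objectives are satisfied” using a relative quantifier in the style of Zadeh; the membership function for “most” and the satisfaction degrees are invented.

# Truth of "most of the fuzzy objectives are satisfied": aggregate the
# satisfaction degrees, then apply the membership function of "most".
def mu_most(r):
    # assumed piecewise-linear membership function for the quantifier "most"
    if r <= 0.3:
        return 0.0
    if r >= 0.8:
        return 1.0
    return (r - 0.3) / 0.5

satisfaction = [0.9, 0.7, 0.4, 0.8]        # degrees to which each objective is met
r = sum(satisfaction) / len(satisfaction)  # mean degree of satisfaction (0.7)
print(mu_most(r))                          # 0.8 = truth of "most objectives are met"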

6.
All manufactured products have geometrical variations that may impact their functional behavior. Tolerance analysis studies the influence of these variations on product behavior, the goal being to evaluate the quality level of the product during its design stage. Analysis methods must verify whether the specified tolerances satisfy the assembly and functional requirements. This paper first gives a literature overview of tolerance analysis methods that rely on a linearized model of the mechanical behavior. It then shows that the linearization affects the computed quality level and may therefore mislead the conclusions of the analysis. Different linearization strategies are considered, and it is shown on an over-constrained 3D mechanism that the strategy must be chosen carefully so as not to over-estimate the quality level. Finally, combining several strategies makes it possible to define a confidence interval containing the true quality level.

7.
Parallel Computing, 2013, 39(10): 567-585
We examine the problem of reliable workflow scheduling with reduced resource redundancy. Since scheduling workflow applications in heterogeneous systems, whether to optimize reliability or to minimize makespan, is NP-complete, we instead look for schedules that meet specific reliability and deadline requirements. First, we analyze the reliability of a given schedule using two definitions: Accumulated Processor Reliability (APR) and Accumulated Communication Reliability (ACR). Second, inspired by this reliability analysis, we present three scheduling algorithms: the RR algorithm schedules the least Resources needed to meet the Reliability requirement; the DRR algorithm extends RR by further considering the Deadline requirement; and the dynamic algorithm schedules tasks dynamically, avoiding the “chain effect” caused by uncertainties in the task execution-time estimates and reducing the impact of inaccurate failure estimation. Finally, an empirical evaluation shows that our algorithms can save a significant amount of computation and communication resources while achieving reliability similar to that of the Fault-Tolerant Scheduling Algorithm (FTSA).
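As a hand-rolled illustration of the Accumulated Processor Reliability idea (assuming, as is common in such models, exponentially distributed failures with a constant rate per processor; the rates, tasks, and execution times below are invented):

# APR of a schedule under a constant-failure-rate model: a task of length t on a
# processor with failure rate lam survives with probability exp(-lam * t).
import math

failure_rate = {"p1": 1e-4, "p2": 5e-4}                # failures per time unit (assumed)
schedule = [("t1", "p1", 120.0),                       # (task, processor, execution time)
            ("t2", "p2", 60.0),
            ("t3", "p1", 200.0)]

apr = 1.0
for task, proc, exec_time in schedule:
    apr *= math.exp(-failure_rate[proc] * exec_time)
print("APR of the schedule:", apr)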

8.
The paper describes a method for the generation of tolerance specifications from product data. The problem is nontrivial due to the increasing adoption of geometric dimensioning criteria, which call for the use of many types of geometric tolerances to completely and unambiguously represent the design intent and the many constraints deriving from manufacturing, assembly and inspection processes. All these issues have to be modeled and explicitly provided to a generative specification procedure, which may thus need a large amount of input data. The proposed approach tries to avoid this difficulty by considering that most precision requirements to be defined relate to the assembly process, and can be automatically derived by analyzing the contact relations between parts and the assembly operations planned for the product. Along with possible user-defined additional requirements relating to function, assembly requirements are used in a rule-based geometric reasoning procedure to select datum reference frames for each part and to assign tolerance types to part features. A demonstrative software tool based on the developed procedure has made it possible to verify its correctness and application scope on some product examples.
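A toy sketch of the rule-based flavor of such a procedure is shown below; the contact-to-tolerance mapping is invented purely for illustration, and the paper's actual rules also select datum reference frames from the assembly analysis.

# Toy rule base: suggest tolerance types for a feature from the kind of
# assembly contact it participates in (illustrative mapping only).
TOLERANCE_RULES = {
    "planar_contact":      ["flatness", "parallelism"],
    "cylindrical_fit":     ["cylindricity", "position"],
    "threaded_connection": ["position", "perpendicularity"],
}

def suggest_tolerances(feature_name, contact_type):
    return {"feature": feature_name,
            "tolerances": TOLERANCE_RULES.get(contact_type, ["profile of a surface"])}

print(suggest_tolerances("mounting_face", "planar_contact"))
print(suggest_tolerances("pilot_hole", "cylindrical_fit"))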

9.
10.
Tolerance analysis aims at checking whether the specified tolerances satisfy functional and assembly requirements. The tolerance analysis approaches discussed in the literature generally do not consider the form defects of parts. This paper presents a new model that takes form defects into account in an assembly simulation. A Metric Modal Decomposition (MMD) method is developed to model the form defects of the various parts in a mechanism. Assemblies including form defects are then assessed using mathematical optimization. The optimization involves two surface models, a real-surface model and a difference-surface-based method, and introduces the concept of signed distance. The optimization algorithms are compared in terms of computation time and accuracy. To illustrate the methods and their respective applications, a simplified over-constrained industrial mechanism in three dimensions is used as a case study.
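A much reduced illustration of the modal-decomposition idea on a 1D profile (the paper works on 3D part surfaces; the cosine mode basis and the synthetic deviation below are assumptions):

# Decompose a measured 1D form deviation onto a few cosine "modes" by least
# squares, then reconstruct it from the modal coefficients.
import numpy as np

n_points, n_modes = 50, 4
x = np.linspace(0.0, 1.0, n_points)
deviation = 0.02 * np.cos(np.pi * x) + 0.005 * np.cos(3 * np.pi * x)   # synthetic defect

basis = np.column_stack([np.cos(k * np.pi * x) for k in range(1, n_modes + 1)])
coeffs, *_ = np.linalg.lstsq(basis, deviation, rcond=None)
reconstructed = basis @ coeffs

print("modal coefficients:", np.round(coeffs, 4))
print("max reconstruction error:", np.abs(reconstructed - deviation).max())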

11.
Automated quality control is a key aspect of industrial maintenance. In manufacturing processes, this is often done by monitoring relevant system parameters to detect deviations from normal behavior. Previous approaches define “normalcy” as statistical distributions for a given system parameter, and detect deviations from normal by hypothesis testing. This paper develops an approach to manufacturing quality control using a newly introduced method: Bayesian Posteriors Updated Sequentially and Hierarchically (BPUSH). This approach outperforms previous methods, achieving reliable detection of faulty parts with low computational cost and low false alarm rates (∼0.1%). Finally, this paper shows that sample size requirements for BPUSH fall well below typical sizes for comparable quality control methods, achieving True Positive Rates (TPR) >99% using as few as n = 25 samples.
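BPUSH itself updates posteriors both sequentially and hierarchically; the fragment below shows only its simplest, non-hierarchical ingredient, a sequential Beta-Bernoulli update of a defect rate with an alarm threshold. The prior, threshold, and observations are invented.

# Sequential Beta-Bernoulli update of a defect probability; raise an alarm when
# the posterior mean of the defect rate drifts above a threshold.
alpha, beta = 1.0, 49.0          # assumed prior: roughly a 2% defect rate
ALARM_LEVEL = 0.05

observations = [0, 0, 1, 0, 1, 1, 0, 1]   # 1 = part out of tolerance (invented data)
for i, defective in enumerate(observations, start=1):
    alpha += defective
    beta += 1 - defective
    posterior_mean = alpha / (alpha + beta)
    if posterior_mean > ALARM_LEVEL:
        print(f"alarm after sample {i}: posterior defect rate {posterior_mean:.3f}")
        break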

12.
The problem of managing and querying inconsistent databases has been deeply investigated in recent years. As the problem of consistent query answering is hard in the general case, most of the techniques proposed so far have exponential complexity. Polynomial techniques have been proposed only for restricted forms of constraints (such as functional dependencies) and queries. In this paper, a technique for computing “approximate” consistent answers in polynomial time is proposed, which works in the presence of a wide class of constraints (namely, full constraints) and Datalog queries. The proposed approach is based on a repairing strategy in which update operations assigning an undefined truth value to the “reliability” of tuples are allowed, along with updates inserting or deleting tuples. The result of a repair can be viewed as a three-valued database which satisfies the specified constraints. In this regard, a new semantics (namely, partial semantics) is introduced for constraint satisfaction in the context of three-valued databases, which aims at capturing the intuitive meaning of constraints under three-valued logic. It is shown that, in order to compute “approximate” consistent query answers, it suffices to evaluate queries by taking into account a unique repair (called the deterministic repair), which in some sense “summarizes” all the possible repairs. The answers thus obtained are “approximate” in the sense that they are safe (true and false atoms in the answers are, respectively, true and false under the classical two-valued semantics) but not complete.

13.
Inaccuracies, or deviations, in the measurements of monitored variables in a control system are facts of life that control software must accommodate. Deviation analysis can be used to determine how a software specification will behave in the face of such deviations. Deviation analysis is intended to answer questions such as “What is the effect on output O if input I is off by 0 to 100?”. This property is best checked with some form of symbolic execution approach. In this report we wish to propose a new approach to deviation analysis using model checking techniques. The key observation that allows us to use model checkers is that the property can be restated as “Will there be an effect on output O if input I is off by 0 to 100?”—this restatement of the property changes the analysis from an exploratory analysis to a verification task suitable for model checking.

14.
15.
Background: Conclusion instability in software effort estimation (SEE) refers to the inconsistent results produced by a diversity of predictors using different datasets. This is largely due to the “ranking instability” problem, which is highly related to the evaluation criteria and the subset of the data being used. Aim: To determine stable rankings of different predictors. Method: 90 predictors are used with 20 datasets and evaluated using 7 performance measures, whose results are subjected to a Wilcoxon rank test (at the 95% level). These results are called the “aggregate results”. The aggregate results are challenged by a sanity check, which focuses on a single error measure (MRE) and uses a newly developed evaluation algorithm called CLUSTER. These results are called the “specific results”. Results: The aggregate results show that: (1) it is now possible to draw stable conclusions about the relative performance of SEE predictors; and (2) regression trees or analogy-based methods are the best performers. The aggregate results are also confirmed by the specific results of the sanity check. Conclusion: This study offers a means to address the conclusion instability issue in SEE, which is an important finding for empirical software engineering.
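As a tiny illustration of the statistical machinery mentioned in this abstract, the fragment below applies a paired Wilcoxon signed-rank test to the MRE values of two predictors on the same projects; the error values are fabricated and unrelated to the study's 90 predictors and 20 datasets.

# Compare the magnitude of relative error (MRE) of two effort predictors on the
# same projects using a paired Wilcoxon signed-rank test.
from scipy.stats import wilcoxon

mre_regression_tree = [0.12, 0.30, 0.25, 0.18, 0.40, 0.22, 0.15, 0.28]
mre_baseline        = [0.35, 0.42, 0.31, 0.29, 0.55, 0.33, 0.27, 0.30]

stat, p_value = wilcoxon(mre_regression_tree, mre_baseline)
print(f"W = {stat}, p = {p_value:.4f}")   # a small p suggests a genuine ranking difference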

16.
Efficient algorithms are presented for factoring polynomials in the skew-polynomial ring F[x;σ], a non-commutative generalization of the usual ring of polynomials F[x], where F is a finite field and σ: F → F is an automorphism (iterated Frobenius map). Applications include fast functional decomposition algorithms for a class of polynomials in F[x] whose decompositions are “wild” and previously thought to be difficult to compute.
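For readers unfamiliar with skew polynomials, the only difference from the ordinary ring F[x] is the commutation rule between x and coefficients; the identity below is the standard definition and is not specific to this paper.

\[
  x \cdot a = \sigma(a)\,x \quad \text{for all } a \in F,
  \qquad\text{so, for example,}\qquad
  (x + a)(x + b) = x^{2} + \bigl(\sigma(b) + a\bigr)x + ab .
\]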

17.
The role of information resource dictionary systems (data dictionary systems) is important in two phases of information resource management. First, information requirements analysis and specification, a complex activity requiring data dictionary support: the end result is the specification of an “Enterprise Model,” which embodies the major activities, processes, information flows, organizational constraints, and concepts. This role is examined in detail after analyzing the existing approaches to requirements analysis and specification. Second, information modeling, which uses the information in the Enterprise Model to construct a formal, implementation-independent database specification: several information models and support tools that may aid in transforming the initial requirements into the final logical database design are examined. The metadata — knowledge about both data and processes — contained in the data dictionary can be used to provide views of data for the specialized tools that make up the database design workbench. The role of data dictionary systems in the integration of tools is discussed.

18.
Allocation of appropriate tolerances is critical to ensure that components fit correctly and function satisfactorily in an assembly involving stacked components. There are numerous techniques available today for modeling assemblies on a computer. What is lacking is a common platform for making use of these computer models to perform tolerance analysis and allocation. This paper describes a technique for automating the tolerance analysis and allocation of an assembly of components stacked one on another and represented in boundary form. An algorithm is developed to track dimension loops in the stacked assembly. Statistical tolerance analysis and allocation is then performed on the interrelated dimensions and tolerances encompassed by a dimension loop. The advantages and limitations of this technique are compared against those of the manual method of tolerance analysis and allocation.
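A minimal sketch of the statistical (root-sum-square) stack-up that such a tool performs once a dimension loop has been extracted is given below; the dimensions, signs, and tolerances are invented.

# Root-sum-square (statistical) analysis of one dimension loop: each entry is
# (nominal, +/- tolerance, sign in the loop); the worst case is shown for contrast.
import math

loop = [(50.0, 0.10, +1), (20.0, 0.05, -1), (29.8, 0.08, -1)]   # invented stack

gap_nominal = sum(sign * nom for nom, _, sign in loop)
worst_case  = sum(tol for _, tol, _ in loop)
rss         = math.sqrt(sum(tol ** 2 for _, tol, _ in loop))

print(f"nominal gap {gap_nominal:.3f}, worst case +/-{worst_case:.3f}, RSS +/-{rss:.3f}")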

19.
“Shear constraints” are used to derive a displacement-based bending element for the analysis of thin and moderately thick plates of general plan form. As a starting point, the eight serendipity modes are adopted for the normal rotations and the nine Lagrangian modes for the transverse displacement, w. Subsequently, the shear constraints are used to eliminate the mid-side and central w variables so that the final element has three degrees of freedom at the corners and two at each mid-side. The bending energy is integrated using the standard formulation for the serendipity Mindlin element (with two-point Gaussian integration), so that the only modifications to that element involve the shear strain-displacement matrix. The constraints used to implement these modifications involve explicit algebraic expressions rather than numerical integration or matrix manipulation. A Fortran subroutine is provided for implementing these changes in a general quadrilateral. Using hierarchical displacement functions, the mid-side displacement variables Δw, which are missing from the standard serendipity element, may simply be constrained to zero as “boundary conditions”. Numerical experiments are presented which show that the element does not “lock” and that it gives excellent results for both thin and moderately thick plates. It also passes the patch test for a general quadrilateral.

20.
Quick GPS: A new CAT system for single-part tolerancing
This paper presents a new CAT (Computer Aided Tolerancing) system called Quick GPS (Geometrical Product Specification) for assisting the designer in specifying the functional tolerances of a single part included in a mechanism, without requiring any complex functional analysis. The mechanism assembly is first described through a positioning-table formalism. In order to create datum reference frames and to respect assembly requirements, an ISO-based 3D tolerancing scheme is then proposed, thanks to a set of rules based on geometric patterns and TTRS (Technologically and Topologically Related Surfaces). Since it remains impossible to determine tolerance chains automatically, the designer must impose links between the frames. The CAT system we developed proposes ISO-based tolerance specifications to help ensure compliance with the designer’s intentions, saving time and eliminating errors. This paper details both the set of tolerancing rules and the designer’s approach. The Quick GPS system has been developed in a CATIA V5 environment using CATIA VBA and CATIA CAA procedures.

