Similar Documents
20 similar documents found (search time: 390 ms)
1.
The generalized differential quadrature rule (GDQR) proposed here is aimed at solving high‐order differential equations. The improved approach completely dispenses with the existing δ‐point technique by applying multiple conditions in a rigorous manner. The GDQR is applied here to static and dynamic analyses of Bernoulli–Euler beams and classical rectangular plates. Numerical error analysis of the method itself is carried out in the beam analysis. Independent variables for the plate are first defined. The explicit weighting coefficients are derived for a fourth‐order differential equation with two conditions at two different points. It is quite evident that the GDQR expressions and weighting coefficients for two‐dimensional problems are not a direct application of those for one‐dimensional problems. The GDQR is demonstrated through a number of examples, with good results. Copyright © 2001 John Wiley & Sons, Ltd.
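The standard differential quadrature rule underlying the GDQR approximates derivatives at grid points with explicit weighting coefficients. A minimal sketch of the classical one-dimensional first-derivative weights via Lagrange polynomials (not the GDQR's multiple-condition modification; `dq_weights` is an illustrative name):

```python
import numpy as np

def dq_weights(x):
    """First-derivative differential quadrature weighting matrix on grid x,
    from the Lagrange-polynomial (Quan-Chang) formulas."""
    n = len(x)
    a = np.zeros((n, n))
    # M'(x_k) = prod_{m != k} (x_k - x_m)
    mp = np.array([np.prod([x[k] - x[m] for m in range(n) if m != k])
                   for k in range(n)])
    for i in range(n):
        for j in range(n):
            if i != j:
                a[i, j] = mp[i] / ((x[i] - x[j]) * mp[j])
        a[i, i] = -a[i].sum()   # each row of weights sums to zero
    return a

x = np.array([0.0, 0.5, 1.0])
D = dq_weights(x)
df = D @ x**2   # exact for polynomials of degree < len(x); here d/dx x^2 = 2x
```

Applying `D` to function values at the nodes gives nodal derivative estimates, exact for polynomials up to degree `len(x) - 1`.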

2.
Batch operations are encountered in many industries, and measurements are often recorded from automated sensors. It is important to determine whether an unknown batch is normal or unusual given a set of reference batches from normal operations. The measurements from a single batch can contain temporal readings that comprise a large time series. A discrete wavelet transform (DWT) is applied to exploit the time and frequency localization of wavelets for feature extraction. Because a large number of coefficients can result, several methods for creating summary features from the denoised DWT coefficients are compared, and a new summary feature that incorporates information from the denoised wavelet coefficients is proposed. The study combines discrete wavelet decompositions with principal component analysis to summarize batch characteristics. Results were validated on an industrial data set. Copyright © 2011 John Wiley & Sons, Ltd.
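The wavelet feature-extraction step can be illustrated with a Haar DWT and per-band energy summaries — a minimal sketch, not the paper's specific summary feature; `wavelet_energy_features` is an assumed helper name:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the orthonormal Haar discrete wavelet transform."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)   # low-pass (approximation)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)   # high-pass (detail)
    return approx, detail

def wavelet_energy_features(signal, levels=3):
    """Summarize a batch trajectory by the energy of each detail band,
    plus the residual approximation energy."""
    a = np.asarray(signal, dtype=float)
    feats = []
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(np.sum(d**2))
    feats.append(np.sum(a**2))
    return np.array(feats)
```

Because the Haar transform is orthonormal, the feature energies sum to the signal energy, so the summary partitions the batch's variation across frequency bands. The per-batch feature vectors would then feed a principal component analysis, as the abstract describes.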

3.
Unlike the traditional topology optimization approach that uses the same discretization for finite element analysis and design optimization, this paper proposes a framework for improving multiresolution topology optimization (iMTOP) via multiple distinct discretizations for: (1) finite elements; (2) design variables; and (3) density. This approach leads to high fidelity resolution with a relatively low computational cost. In addition, an adaptive multiresolution topology optimization (AMTOP) procedure is introduced, which consists of selective adjustment and refinement of design variable and density fields. Various two‐dimensional and three‐dimensional numerical examples demonstrate that the proposed schemes can significantly reduce computational cost in comparison to the existing element‐based approach. Copyright © 2012 John Wiley & Sons, Ltd.

4.
We suggest a new approach to classification based on nonparametrically estimated likelihoods. Because of the scarcity of data in high dimensions, full nonparametric estimation of the likelihood function for each population is impractical. Instead, we propose building a class of estimated nonparametric candidate likelihood models based on a Markov property, providing multiple likelihood estimates that are useful for guiding a classification algorithm. Our density estimates require only estimates of one- and two-dimensional marginal distributions, which effectively circumvents the curse of dimensionality. A classification algorithm based on these estimated likelihoods is presented. A modification that applies variable selection to differences in the logs of the estimated marginal densities is also suggested, specifically to handle high-dimensional data.
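The idea of building likelihoods from low-dimensional marginals can be sketched with a classifier that scores each class by a product of one-dimensional kernel density estimates. This is an independence simplification for illustration only — the paper's Markov construction also uses two-dimensional marginals:

```python
import numpy as np

def kde_logpdf(x, samples, bw=0.5):
    """Log of a 1-D Gaussian kernel density estimate at points x."""
    z = (x[:, None] - samples[None, :]) / bw
    k = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return np.log(k.mean(axis=1) / bw)

def classify(point, class_data, bw=0.5):
    """Pick the class maximizing the product of 1-D marginal KDEs
    (sum of log marginal densities)."""
    scores = {}
    for label, data in class_data.items():
        scores[label] = sum(kde_logpdf(np.array([point[j]]), data[:, j], bw)[0]
                            for j in range(data.shape[1]))
    return max(scores, key=scores.get)
```

Each class density is estimated from its own training sample, so only one-dimensional smoothing is ever performed, regardless of the ambient dimension.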

5.
This paper introduces a method for the synthesis of the multiple tasks of a planar six-bar mechanism by wavelet series. Based on the theory of wavelet series, a mathematical model for assessing the output of multiple tasks of planar linkage mechanisms is developed. The multiple tasks considered combine function generation and rigid-body guidance. For function generation, a numerical atlas database of 31,775 groups of basic dimensional types of a planar slider-crank mechanism was created from the relationships of the wavelet feature parameters. For rigid-body guidance synthesis, an output feature parameter database of planar four-bar motion generation, comprising 178,810 groups of basic dimensional types, was created by analysing the relationships of the wavelet coefficients between the rigid-body rotation angle and the coupler rotation angle of the corresponding basic dimensional types. The actual sizes and installation position parameters of the planar six-bar mechanism were calculated using the fuzzy identification method and theoretical formulas. Finally, an application is presented to illustrate the validity and practicality of the proposed theory.

6.
With the development of sensor networks and manufacturing technology, multivariate processes face a new challenge of high‐dimensional data. Traditional statistical methods based on small or medium-sized samples, such as T2 monitoring statistics, may not be suitable because of the "curse of dimensionality". To overcome this shortcoming, control charts based on variable‐selection (VS) algorithms using penalized likelihood have been suggested for process monitoring and fault diagnosis. Although there has been much effort to improve VS‐based control charts, they usually share the distributional assumption that in‐control observations follow a single multivariate Gaussian distribution. In current manufacturing, however, processes can be multimodal. To handle both high dimensionality and multimodality, this study proposes a VS‐based control chart with a Gaussian mixture model (GMM). We extend the VS‐based control chart framework to processes with multimodal distributions, so that high‐dimensional and multimodal information in the process can be better accounted for.
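Given a fitted mixture, a natural monitoring statistic for multimodal in-control data is the negative log-likelihood of a new observation under the GMM. A minimal sketch with assumed, pre-fitted mixture parameters — not the paper's VS-based chart:

```python
import numpy as np

def gmm_neg_loglik(x, weights, means, covs):
    """Negative log-likelihood of observation x under a Gaussian mixture.
    Large values signal observations far from every in-control mode."""
    x = np.asarray(x, dtype=float)
    total = 0.0
    for w, mu, cov in zip(weights, means, covs):
        d = len(mu)
        diff = x - mu
        quad = diff @ np.linalg.inv(cov) @ diff          # Mahalanobis term
        norm = 1.0 / np.sqrt((2 * np.pi)**d * np.linalg.det(cov))
        total += w * norm * np.exp(-0.5 * quad)
    return -np.log(total)
```

An alarm threshold would be set from the empirical distribution of this statistic over reference (in-control) data, in place of a single-Gaussian T2 limit.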

7.
The Imaging Science Journal, 2013, 61(2): 120–133
Abstract

Image watermarking refers to the process of embedding an authentication message, called a watermark, into a host image to uniquely identify ownership. In this paper, an adaptive, scalable, blind and robust wavelet-based watermarking approach is proposed. The approach enables scalable watermark detection and provides robustness against progressive wavelet image compression. A multiresolution decomposition of the binary watermark is inserted into selected coefficients of the wavelet-decomposed image that represent its high-activity regions. Watermark insertion starts from the lowest-frequency sub-band of the decomposed image, and each decomposed watermark sub-band is inserted into its counterpart sub-band of the decomposed image. In the lowest-frequency sub-band, coefficients with maximum local variance are selected; in the higher-frequency sub-bands, coefficients with maximum magnitude are selected. The watermarked test images are transparent according to human visual system characteristics and show no perceptual degradation. The experimental results demonstrate the robustness of the approach against progressive wavelet image coding, even at very low bit rates. The watermark extraction process is completely blind, and multiple spatial resolutions of the watermark are progressively detectable from the compressed watermarked image. This approach is a suitable candidate for providing efficient authentication for progressive image transmission, especially over heterogeneous networks such as the Internet.
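Blind wavelet-domain embedding can be sketched with quantization index modulation (QIM) on one Haar sub-band. This is a deliberate simplification — the paper selects coefficients by local variance and magnitude and embeds a multiresolution watermark; all function names here are illustrative:

```python
import numpy as np

def haar2_level1(img):
    """One-level 2-D Haar transform: returns LL, LH, HL, HH sub-bands."""
    a = (img[0::2, :] + img[1::2, :]) / 2
    d = (img[0::2, :] - img[1::2, :]) / 2
    ll = (a[:, 0::2] + a[:, 1::2]) / 2
    lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d[:, 0::2] + d[:, 1::2]) / 2
    hh = (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def embed_bits(coeffs, bits, delta=4.0):
    """QIM embedding: force the parity of each coefficient's quantizer
    index to equal the watermark bit."""
    c = coeffs.ravel().copy()
    for i, b in enumerate(bits):
        q = int(np.round(c[i] / delta))
        if q % 2 != b:
            q += 1
        c[i] = q * delta
    return c.reshape(coeffs.shape)

def extract_bits(coeffs, n, delta=4.0):
    """Blind extraction: the bit is the parity of the quantizer index."""
    c = coeffs.ravel()
    return [int(np.round(c[i] / delta)) % 2 for i in range(n)]
```

Extraction needs only the quantization step `delta`, not the original image, which is what makes the scheme blind.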

8.
Multivariate time series and process identification methods are used to develop a dynamic-stochastic model for a packed-bed tubular reactor carrying out highly exothermic hydrogenolysis reactions. A canonical analysis procedure is first applied to the data collected from the reactor to reduce the dimensionality of the identification and control problems. The identified transfer function–ARIMA model is transformed into state-space form and used to develop a multivariable optimal stochastic controller for the reactor. The controlled variables are inferred production rates reconstructed from temperature and flow measurements. The parameters of the inferential equation are updated recursively using measurements of actual concentrations available periodically. The controller is implemented on a process minicomputer and is shown to perform very well.

9.
A new approach is introduced that combines rotational mode shapes with the two‐dimensional wavelet packet transform to detect added mass (simulating damage) in a glass‐fibre‐reinforced polymer composite plate structure. The wavelet packet transform is an advanced signal processing tool that can magnify abnormal features in a signal, and rotational mode shapes are sensitive to damage in beam and plate structures. The proposed method employs an added mass that slides to different locations to alter the local and global dynamic characteristics of the structure. Finite element analysis is carried out to obtain the first three rotational bending mode shapes of the damaged plate structure, which are then used as input to the two‐dimensional wavelet packet transform. The numerical results show that the normalised diagonal-detail wavelet packet coefficients peak at single or multiple added-mass (damage) locations of the plate for two different boundary conditions. The method appears sensitive to relatively small amounts of damage. A simple parametric study is carried out to quantify the damage extent, and investigation with noise‐contaminated signals shows the method's feasibility in real applications.

10.
With the shrinking feature sizes of integrated circuits driven by continuous technology migrations in wafer fabrication, tight control of critical dimensions is essential for yield enhancement, while physical failure analysis becomes increasingly difficult. In particular, the yield ramp-up stage of a new technology node involves new production processes, unstable machine configurations, and big data with multicollinearity and high dimensionality, for which previous experience offers little guidance in detecting root causes. This research proposes a novel data-driven approach for analysing semiconductor manufacturing big data to diagnose low yield (excursions) and detect process root causes for yield enhancement. The proposed approach has shown practical viability in efficiently detecting possible root causes of excursions, reducing troubleshooting time and improving production yield.

11.
A two‐dimensional transient heat conduction problem of multiple interacting circular inhomogeneities, cavities and point sources is considered. In general, a non‐perfect contact at the matrix/inhomogeneity interfaces is assumed, with the heat flux through the interface proportional to the temperature jump. The approach is based on the use of the general solutions to the problems of a single cavity and an inhomogeneity and superposition. Application of the Laplace transform and the so‐called addition theorem results in an analytical transformed solution. The solution in the time domain is obtained by performing a numerical inversion of the Laplace transform. Several numerical examples are given to demonstrate the accuracy and the efficiency of the method. The approximation error decreases exponentially with the number of the degrees of freedom in the problem. A comparison of the companion two‐ and three‐dimensional problems demonstrates the effect of the dimensionality. Copyright © 2009 John Wiley & Sons, Ltd.

12.
Considering the subcritical extension length of the main crack in concrete and the cohesive force in the fictitious crack zone, and using the analytical relationship between the J-integral and the stress intensity factor, a numerical method based on the wavelet-based element-free method is established for computing the equivalent fracture toughness of concrete, and the influence of different wavelet basis resolutions on the computed values is studied. Calculations for eight cracked concrete specimens show that, at the same wavelet basis resolution, the standard deviation and coefficient of variation of the computed values are below 0.0244 and 0.0231 for the four volume-series specimens and below 0.0384 and 0.0362 for the four height-series specimens, indicating good consistency. Compared with the analytical solution, the maximum relative error for the eight specimens is around 5%, indicating high accuracy, and the wavelet basis resolution has little influence on the results. It is concluded that the wavelet-based element-free method is a reliable, high-accuracy numerical method for computing the equivalent fracture toughness of concrete and can serve as a powerful computational tool for concrete fracture mechanics.

13.
A pixel-level method for fusing CT and MR medical images based on the wavelet transform is proposed. The discrete wavelet transform is used to decompose the two source images at multiple scales, and neighbourhood features of the wavelet coefficients then guide the fusion of the low- and high-frequency components: the low-frequency coefficients are fused according to neighbourhood variance and the high-frequency coefficients according to neighbourhood energy. Finally, the fused image is reconstructed from the fused wavelet coefficients. Experiments show that, both in subjective appearance and by the objective measures of information entropy and average gradient, the method outperforms traditional fusion methods; the fused image effectively combines the information in the CT and MR images and clearly displays both the bone and soft tissue of the brain.
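The coefficient-fusion rule described above (neighbourhood variance for the low-frequency band, neighbourhood energy for high-frequency bands) can be sketched as a per-coefficient choice between the two source sub-bands; `fuse_band` and `neighborhood_stat` are illustrative names:

```python
import numpy as np

def neighborhood_stat(c, stat, r=1):
    """Local variance or energy of coefficients over a (2r+1)^2 window,
    clipped at the sub-band borders."""
    h, w = c.shape
    out = np.zeros_like(c, dtype=float)
    for i in range(h):
        for j in range(w):
            win = c[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            out[i, j] = win.var() if stat == "var" else np.sum(win**2)
    return out

def fuse_band(a, b, stat):
    """Per coefficient, keep the source whose neighbourhood statistic is
    larger: 'var' for the low-frequency band, 'energy' for detail bands."""
    pick_a = neighborhood_stat(a, stat) >= neighborhood_stat(b, stat)
    return np.where(pick_a, a, b)
```

The fused sub-bands would then be passed through the inverse wavelet transform to reconstruct the fused image.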

14.
The authors recently developed a method for time‐frequency signal analysis of earthquake records using Mexican hat wavelets. Ground motions in earthquakes are postulated as a sequence of simple penny‐shaped ruptures at different locations along a fault line and occurring at different times. In this article, a wavelet energy spectrum is proposed for time‐frequency localization of the earthquake input energy. The ground acceleration generated by a simple penny‐shaped rupture is used as the basis to form the mother wavelet. The symmetric Mexican hat wavelet is chosen as the mother wavelet. The spectrum is presented pictorially in a two‐dimensional, time‐frequency domain. The proposed wavelet energy spectrum can be used to observe the evolution of the frequency contents of earthquake energy over time and distance of the site from the epicenter in a more accurate manner than the traditional time series (accelerogram) or frequency domain (Fourier amplitude spectrum) representation. It can be viewed as a microscope for looking into the time‐frequency characteristics of earthquake acceleration records. The wavelet energy spectrum provides frequency evolution information to be used in the structural design process. © 2003 Wiley Periodicals, Inc. Int J Imaging Syst Technol 13, 133–140, 2003; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ima.10038
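A wavelet energy spectrum of this kind can be sketched as the squared magnitude of a Mexican-hat continuous wavelet transform over a time-scale grid — a minimal direct-convolution sketch, not the authors' rupture-derived mother wavelet or normalization:

```python
import numpy as np

def mexican_hat(t, s):
    """Mexican hat (Ricker) mother wavelet at scale s, unit L2 norm."""
    x = t / s
    return (2 / (np.sqrt(3 * s) * np.pi**0.25)) * (1 - x**2) * np.exp(-x**2 / 2)

def wavelet_energy_spectrum(accel, dt, scales):
    """|CWT|^2 over a time-scale grid: a time-frequency map showing where
    in time, and at what scale, the record's energy is concentrated."""
    n = len(accel)
    t = (np.arange(n) - n // 2) * dt      # wavelet support centred at zero
    spec = np.zeros((len(scales), n))
    for k, s in enumerate(scales):
        w = mexican_hat(t, s)             # symmetric, so no time reversal needed
        c = np.convolve(accel, w, mode="same") * dt
        spec[k] = c**2
    return spec
```

Rows of `spec` correspond to scales (inversely related to frequency) and columns to time, so ridges in the map trace the evolution of the record's frequency content.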

15.
The analysis and optimization of complex multiphysics systems presents a series of challenges that limit the practical use of computational tools. Specifically, the optimization of such systems involves multiple interconnected components with competing quantities of interest and high‐dimensional spaces and necessitates the use of costly high‐fidelity solvers to accurately simulate the coupled multiphysics. In this paper, we put forth a data‐driven framework to address these challenges leveraging recent advances in machine learning. We combine multifidelity Gaussian process regression and Bayesian optimization to construct probabilistic surrogate models for given quantities of interest and explore high‐dimensional design spaces in a cost‐effective manner. The synergistic use of these computational tools gives rise to a tractable and general framework for tackling realistic multidisciplinary optimization problems. To demonstrate the specific merits of our approach, we have chosen a challenging large‐scale application involving the hydrostructural optimization of three‐dimensional supercavitating hydrofoils. To this end, we have developed an automated workflow for performing multiresolution simulations of turbulent multiphase flows and multifidelity structural mechanics (combining three‐dimensional and one‐dimensional finite element results), the results of which drive our machine learning analysis in pursuit of the optimal hydrofoil shape.
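The single-fidelity building block of such surrogates — Gaussian process regression with a squared-exponential kernel — can be sketched in a few lines (the multifidelity version couples several such models across solver levels; function names here are illustrative):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential (RBF) kernel matrix between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell)**2)

def gp_posterior_mean(x_train, y_train, x_test, ell=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP surrogate at test inputs.
    The jitter `noise` keeps the kernel matrix well conditioned."""
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ell)
    return Ks @ np.linalg.solve(K, y_train)
```

In a Bayesian-optimization loop, the posterior mean and variance of such a surrogate drive an acquisition function that selects the next expensive solver evaluation.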

16.
An optimized DWT-based digital watermarking algorithm robust to print-and-scan    Cited: 8 (self-citations: 7, other citations: 1)
刘真, 丁盈盈. Packaging Engineering (《包装工程》), 2011, 32(11): 93–99
An optimized digital watermarking algorithm based on the discrete wavelet transform (DWT) and resistant to print-and-scan attacks is proposed. The algorithm divides the wavelet coefficients into three classes (low-, mid- and high-frequency) and, exploiting the human visual system's differing sensitivity to these classes, embeds the watermark at a different strength in each, increasing robustness while preserving imperceptibility. A pre-processing step for the scanned image is added before extraction, improving the accuracy of watermark recovery. Experiments show that the algorithm is robust against cropping, noise and compression attacks.

17.
A complete semi‐analytical treatment of the four‐dimensional (4‐D) weakly singular integrals over coincident, edge adjacent and vertex adjacent triangles, arising in the Galerkin discretization of mixed potential integral equation formulations, is presented. The overall analysis is based on the direct evaluation method, utilizing a series of coordinate transformations, together with a re‐ordering of the integrations, in order to reduce the dimensionality of the original 4‐D weakly singular integrals into, respectively, 1‐D, 2‐D and 3‐D numerical integrations of smooth functions. The analytically obtained final formulas can be computed by using typical library routines for Gauss quadrature readily available in the literature. A comparison of the proposed method with singularity subtraction, singularity cancellation and fully numerical methods, often used to tackle the multi‐dimensional singular integrals evaluation problem, is provided through several numerical examples, which clearly highlights the superior accuracy and efficiency of the direct evaluation scheme. Copyright © 2010 John Wiley & Sons, Ltd.
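The reduced 1-D/2-D/3-D integrations of smooth functions can indeed use standard Gauss quadrature routines; a minimal 1-D sketch with NumPy's `leggauss`:

```python
import numpy as np

def gauss_quad(f, a, b, n):
    """Integrate f over [a, b] with an n-point Gauss-Legendre rule,
    exact for polynomials up to degree 2n - 1."""
    x, w = np.polynomial.legendre.leggauss(n)   # nodes/weights on [-1, 1]
    xm, xr = 0.5 * (a + b), 0.5 * (b - a)       # affine map to [a, b]
    return xr * np.sum(w * f(xm + xr * x))
```

Higher-dimensional smooth integrals are handled the same way by tensorizing the 1-D rule over each reduced coordinate.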

18.
Successful identification of the important metabolite features in high-resolution nuclear magnetic resonance (NMR) spectra is a crucial task for the discovery of biomarkers that have the potential for early diagnosis of disease and subsequent monitoring of its progression. Although a number of traditional feature extraction/selection methods are available, most operate in the original frequency domain and disregard the fact that an NMR spectrum comprises a number of local bumps and peaks at different scales. In the present study, a complex wavelet transform, which handles multiscale information efficiently and has an energy shift-insensitive property, is proposed to improve feature extraction and classification in NMR spectra. Furthermore, a multiple testing procedure based on the false discovery rate (FDR) is used to identify important metabolite features in the complex wavelet domain. Experimental results with real NMR spectra showed that classification models constructed with the complex wavelet coefficients selected by the FDR-based procedure yield lower misclassification rates than models constructed with the original features or with conventional wavelet coefficients.
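The FDR-controlling selection step can be sketched with the Benjamini-Hochberg step-up procedure applied to per-feature p-values (a generic sketch; the paper applies such a procedure to complex wavelet coefficients):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: return a boolean mask of
    features declared significant at FDR level q."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m        # i-th threshold: q * i / m
    below = p[order] <= thresh
    # reject all hypotheses up to the largest index passing its threshold
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask
```

Features surviving the mask would then be the inputs to the downstream classification model.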

19.
A Monte Carlo procedure to estimate efficiently the gradient of a generic function in high dimensions is presented. It is shown that, by adopting an orthogonal linear transformation, it is possible to identify a new coordinate system in which a relatively small subset of the variables causes most of the variation of the gradient. This property is exploited in gradient‐based algorithms to reduce the computational effort of gradient evaluation in higher dimensions. Working in this transformed space, only a few function evaluations, i.e. considerably fewer than the dimensionality of the problem, are required. The procedure is simple and can be applied by any gradient‐based method. A number of examples are presented to show the accuracy and efficiency of the proposed approach and its applicability to optimization problems using well‐known gradient‐based algorithms such as gradient descent, quasi‐Newton methods and Sequential Quadratic Programming. Copyright © 2009 John Wiley & Sons, Ltd.
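The core idea — evaluating directional derivatives only along the few transformed coordinates that dominate the gradient — can be sketched with central differences along the leading columns of an orthogonal matrix `Q` (assumed given here; in the paper it comes from the identified linear transformation):

```python
import numpy as np

def reduced_gradient(f, x, Q, k, h=1e-5):
    """Approximate the gradient of f at x using central differences along
    only the first k columns of an orthogonal matrix Q: 2k function
    evaluations instead of 2n for the full finite-difference gradient."""
    g_reduced = np.zeros(Q.shape[1])
    for i in range(k):
        d = Q[:, i]                                    # unit search direction
        g_reduced[i] = (f(x + h * d) - f(x - h * d)) / (2 * h)
    return Q @ g_reduced                               # back to original coordinates
```

The estimate is exact (up to rounding) whenever the gradient truly lies in the span of the first `k` columns, which is the situation the orthogonal transformation is constructed to approximate.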

20.
Two methods are presented for connecting dissimilar three‐dimensional finite element meshes. The first method combines the concept of master and slave surfaces with the uniform strain approach for finite elements. By modifying the boundaries of elements on a slave surface, corrections are made to element formulations such that first‐order patch tests are passed. The second method is based entirely on constraint equations, but only passes a weaker form of the patch test for non‐planar surfaces. Both methods can be used to connect meshes with different element types. In addition, master and slave surfaces can be designated independently of relative mesh resolutions. Example problems in three‐dimensional linear elasticity are presented. Published in 2000 by John Wiley & Sons, Ltd.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) · 京ICP备09084417号