Similar documents
20 similar documents found (search time: 46 ms)
1.
We study the classification problem that arises when two variables—one continuous (x), one discrete (s)—evolve jointly in time. We suppose that the vector x traces out a smooth multidimensional curve, to each point of which the variable s attaches a discrete label. The trace of s thus partitions the curve into different segments whose boundaries occur where s changes value. We consider how to learn the mapping between the trace of x and the trace of s from examples of segmented curves. Our approach is to model the conditional random process that generates segments of constant s along the curve of x. We suppose that the variable s evolves stochastically as a function of the arc length traversed by x. Since arc length does not depend on the rate at which a curve is traversed, this gives rise to a family of Markov processes whose predictions are invariant to nonlinear warpings (or reparameterizations) of time. We show how to estimate the parameters of these models—known as Markov processes on curves (MPCs)—from labeled and unlabeled data. We then apply these models to two problems in automatic speech recognition, where x are acoustic feature trajectories and s are phonetic alignments.
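A minimal sketch of the invariance property this abstract relies on: arc length depends only on the trace of x, not on the rate of traversal, so any nonlinear warping of time leaves it unchanged. (Illustrative only; this is not the paper's MPC estimation procedure.)

```python
import math

def arc_length(points):
    """Polygonal approximation of the arc length of a sampled curve."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# Trace the same half-circle with two parameterizations: a uniform one
# and a nonlinearly warped one (t -> t**3).
N = 20000
uniform = [(math.cos(t), math.sin(t))
           for t in (math.pi * i / N for i in range(N + 1))]
warped = [(math.cos(t), math.sin(t))
          for t in (math.pi * (i / N) ** 3 for i in range(N + 1))]

# Arc length is a property of the trace alone, so both agree (≈ pi).
assert abs(arc_length(uniform) - math.pi) < 1e-4
assert abs(arc_length(warped) - math.pi) < 1e-4
```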

2.
A generalization of a theorem by Pegna and Wolter—called Linkage Curve Theorem—is presented. The new theorem provides a condition for joining two surfaces with high order geometric continuity of arbitrary degree n. It will be shown that the Linkage Curve Theorem can be generalized even for the case when the common boundary curve is only G1.
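For reference, G1 continuity (the weakest case the generalized theorem covers) requires only that tangent directions agree at a joint, not that parametric derivatives match. A minimal numeric check of that condition, as a sketch:

```python
import math

def g1_continuous(d1, d2, tol=1e-9):
    """G1 (tangent-direction) continuity at a joint: the outgoing tangent
    d2 must be a positive scalar multiple of the incoming tangent d1."""
    cross = d1[0] * d2[1] - d1[1] * d2[0]   # zero iff parallel
    dot = d1[0] * d2[0] + d1[1] * d2[1]     # positive iff same direction
    return abs(cross) <= tol * math.hypot(*d1) * math.hypot(*d2) and dot > 0

# Same direction but different speeds: G1 holds even though C1 fails.
assert g1_continuous((1.0, 1.0), (2.0, 2.0))
# Genuinely different directions: not G1.
assert not g1_continuous((1.0, 0.0), (1.0, 1.0))
```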

3.
Sufficient and necessary conditions for the arc length of a polynomial parametric curve to be an algebraic function of the parameter are formulated. It is shown that if the arc length is algebraic, it is no more complicated than the square root of a polynomial. Polynomial curves that have this property encompass the Pythagorean-hodograph curves—for which the arc length is just a polynomial in the parameter—as a proper subset. The algebraically rectifiable cubics, other than Pythagorean-hodograph curves, constitute a single-parameter family of cuspidal curves. The implications of the general algebraic rectifiability criterion are also completely enumerated in the case of quartics, in terms of their cusps and intrinsic shape freedoms. Finally, the characterization and construction of algebraically rectifiable quintics is briefly sketched. These forms offer a rich repertoire of curvilinear profiles, whose lengths are readily determined without numerical quadrature, for practical design problems.
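A worked instance of the Pythagorean-hodograph case mentioned above: choosing the hodograph as x'(t) = u² − v², y'(t) = 2uv with u(t) = 1, v(t) = t gives speed |r'(t)| = u² + v² = 1 + t², a polynomial, so the arc length s(t) = t + t³/3 needs no numerical quadrature.

```python
import math

def speed(t):
    """|r'(t)| for the PH cubic with x'(t) = 1 - t**2, y'(t) = 2t."""
    return math.hypot(1 - t * t, 2 * t)   # equals 1 + t**2 exactly

def s_closed(t):
    """Closed-form polynomial arc length of the same PH cubic."""
    return t + t ** 3 / 3

# Cross-check the closed form against a composite midpoint quadrature.
N = 100000
numeric = sum(speed((i + 0.5) / N) for i in range(N)) / N
assert abs(numeric - s_closed(1.0)) < 1e-9   # both equal 4/3
```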

4.
For the generation of hull forms, a method using rational cubic Bézier curves is chosen because of their superior segmentwise local-weighted behavior. A hull form is defined by two sets of grid lines: transverse grid lines arranged in the length direction and longitudinal grid lines arranged in the depth direction. The transverse lines are defined first; the points on the transverse lines with the same curve parameter values are then fitted to define the longitudinal lines. Thereby, each curve is described by a rational cubic Bézier curve in space. The bilge, flat side and flat bottom can be defined precisely, and more flexibility is provided for defining the bow and stern regions. In this way, a hull surface can be generated that produces the data needed for hydrostatic calculations or panel generation.
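A sketch of the building block assumed above: evaluating a rational cubic Bézier curve and using its weights for local shape control. The control points and weights here are arbitrary illustrative values, not hull data.

```python
import math

def rational_cubic_bezier(ctrl, w, t):
    """Point on a rational cubic Bézier curve with control points ctrl
    (2D tuples) and positive weights w, at parameter t in [0, 1]."""
    b = [(1 - t) ** 3, 3 * t * (1 - t) ** 2, 3 * t * t * (1 - t), t ** 3]
    denom = sum(wi * bi for wi, bi in zip(w, b))
    return tuple(
        sum(wi * bi * p[k] for wi, bi, p in zip(w, b, ctrl)) / denom
        for k in range(2)
    )

ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]

# Endpoint interpolation holds for any weights.
assert rational_cubic_bezier(ctrl, [1, 1, 1, 1], 0.0) == ctrl[0]
assert rational_cubic_bezier(ctrl, [1, 1, 1, 1], 1.0) == ctrl[3]

# Raising one interior weight pulls the curve toward its control point:
# the local-weighted behavior the abstract refers to.
near = rational_cubic_bezier(ctrl, [1, 10, 1, 1], 0.5)
far = rational_cubic_bezier(ctrl, [1, 1, 1, 1], 0.5)
assert math.dist(near, ctrl[1]) < math.dist(far, ctrl[1])
```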

5.
Interpreting line drawings of curved objects
In this paper, we study the problem of interpreting line drawings of scenes composed of opaque regular solid objects bounded by piecewise smooth surfaces with no markings or texture on them. It is assumed that the line drawing has been formed by orthographic projection of such a scene under general viewpoint, that the line drawing is error free, and that there are no lines due to shadows or specularities. Our definition implicitly excludes laminae, wires, and the apices of cones. A major component of the interpretation of line drawings is line labelling. By line labelling we mean (a) classification of each image curve as corresponding to either a depth or orientation discontinuity in the scene, and (b) further subclassification of each kind of discontinuity. For a depth discontinuity we determine whether it is a limb—a locus of points on the surface where the line of sight is tangent to the surface—or an occluding edge—a tangent plane discontinuity of the surface. For an orientation discontinuity, we determine whether it corresponds to a convex or concave edge. This paper presents the first mathematically rigorous scheme for labelling line drawings of the class of scenes described. Previous schemes for labelling line drawings of scenes containing curved objects were heuristic, incomplete, and lacked proper mathematical justification. By analyzing the projection of the neighborhoods of different kinds of points on a piecewise smooth surface, we are able to catalog all local labelling possibilities for the different types of junctions in a line drawing. An algorithm is developed which utilizes this catalog to determine all legal labellings of the line drawing. A local minimum complexity rule—at each vertex select those labellings which correspond to the minimum number of faces meeting at the vertex—is used in order to prune highly counter-intuitive interpretations. The labelling scheme was implemented and tested on a number of line drawings. The labellings obtained are few and by and large in accordance with human interpretations.
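The catalog-plus-search idea described above can be illustrated with a toy constraint filter. The labels and catalog entries below are hypothetical placeholders, not the paper's actual junction catalog; the point is only the mechanism of keeping edge-label assignments consistent with every vertex's allowed tuples.

```python
from itertools import product

# Hypothetical edge labels and per-vertex catalogs for a tiny graph with
# two vertices that each see the same three edges.
LABELS = ('+', '-', '>')
catalog = {
    'A': {('+', '+', '>'), ('-', '-', '>')},
    'B': {('+', '+', '>'), ('+', '-', '-')},
}
edges_at = {'A': (0, 1, 2), 'B': (0, 1, 2)}  # edge indices seen per vertex

def legal_labellings(n_edges):
    """Brute-force search: keep only label assignments whose restriction
    to every vertex appears in that vertex's catalog entry."""
    return [
        lab for lab in product(LABELS, repeat=n_edges)
        if all(tuple(lab[i] for i in edges_at[v]) in catalog[v]
               for v in catalog)
    ]

# Only one of the 27 assignments satisfies both vertices at once.
assert legal_labellings(3) == [('+', '+', '>')]
```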

6.
In this paper, we propose significant extensions to the snake pedal model, a powerful geometric shape modeling scheme introduced in (Vemuri and Guo, 1998). The extensions allow the model to automatically cope with topological changes and, for the first time, introduce the concept of a compact global shape into geometric active models. The ability to characterize the global shape of an object using very few parameters facilitates shape learning and recognition. In this new modeling scheme, object shapes are represented using a parameterized function, called the generator, which accounts for the global shape of an object, while the pedal curve (surface) of this global shape with respect to a geometric snake represents any local detail. Traditionally, pedal curves (surfaces) are defined as the loci of the feet of perpendiculars dropped from a fixed point, called the pedal point, to the tangents of the generator. Local shape control is achieved by introducing a set of pedal points, lying on a snake, one for each point on the generator. The model, dubbed a snake pedal, allows interactive manipulation via forces applied to the snake. In this work, we replace the snake by a geometric snake and derive all the necessary mathematics for evolving the geometric snake when the snake pedal is assumed to evolve as a function of its curvature. Automatic topological changes of the model may be achieved by implementing the geometric snake in a level-set framework. We demonstrate the applicability of this modeling scheme via examples of shape recovery from a variety of 2D and 3D image data.
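The classical pedal-curve construction used above follows directly from its definition: the foot of the perpendicular from the pedal point to each tangent line of the generator. A minimal sketch, checked on the one case whose pedal is easy to verify by hand:

```python
import math

def pedal_point(c, d, p):
    """Foot of the perpendicular from pedal point p to the tangent line
    through curve point c with (not necessarily unit) direction d."""
    dd = d[0] * d[0] + d[1] * d[1]
    s = ((p[0] - c[0]) * d[0] + (p[1] - c[1]) * d[1]) / dd
    return (c[0] + s * d[0], c[1] + s * d[1])

# Sanity check: the pedal curve of the unit circle with respect to its
# own center is the circle itself, because the foot of the perpendicular
# from the center to any tangent line is the point of tangency.
for i in range(8):
    t = 2 * math.pi * i / 8
    c = (math.cos(t), math.sin(t))
    d = (-math.sin(t), math.cos(t))   # tangent direction at t
    foot = pedal_point(c, d, (0.0, 0.0))
    assert math.dist(foot, c) < 1e-12
```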

7.
A linear singular blending (LSB) technique can enhance the shape-control capability of the B-spline. This capability derives from blending parameters defined at the B-spline control vertices, which blend LSB line segments or bilinear surface patches with the B-spline curve or surface. Varying a blending parameter between zero and one applies tension for reshaping. The reshaped curve or surface retains the same smoothness properties as the original B-spline; it possesses the same strict parametric continuities. This differs from the β-spline, which introduces additional control to the B-spline by imposing geometric continuities at the joints of curve segments or surface patches. For applications in which strict parametric continuities cannot be compromised, LSB provides an intuitive way to introduce tension to the B-spline.
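A deliberately simplified sketch of the blending idea (not the paper's exact LSB formulation): a blending parameter interpolates pointwise between the curve and a straight segment, so 0 leaves the curve alone and 1 applies full tension.

```python
def lsb_blend(curve_pt, segment_pt, beta):
    """Hypothetical illustration of linear singular blending: pull a
    curve point toward the corresponding point on a line segment.
    beta = 0 keeps the curve; beta = 1 collapses onto the segment."""
    return tuple((1 - beta) * c + beta * s
                 for c, s in zip(curve_pt, segment_pt))

# A point on some curve and the matching point on its chord:
curve_pt, chord_pt = (0.5, 1.0), (0.5, 0.0)
assert lsb_blend(curve_pt, chord_pt, 0.0) == curve_pt   # no tension
assert lsb_blend(curve_pt, chord_pt, 1.0) == chord_pt   # full tension
assert lsb_blend(curve_pt, chord_pt, 0.5) == (0.5, 0.5) # halfway
```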

8.
This paper is an extension of a previous one presented at the conference Cyberworlds 2014. In that work we addressed the problem of obtaining the rational Bézier curve that best fits a given set of data points in the least-squares sense. Our approach was based on clonal selection theory principles to compute all parameters of the problem, namely the control points of the approximating curve, their corresponding weights, and a suitable parameterization of the data points. Although we were able to obtain results with good accuracy, the scheme can still be significantly improved by hybridizing it with an efficient local search procedure. That is the approach proposed in this paper. In particular, we consider the mesh adaptive direct search algorithm, a direct search method aimed at improving the local search step to refine the quality of the solution. This hybrid strategy has been applied to six illustrative free-form shapes exhibiting challenging features, including the three examples in the previous paper. A comparative analysis of our results with respect to the previous methodology is also reported. Our experimental results show that this hybrid scheme performs extremely well and outperforms the previous approach for all instances in our benchmark.

9.
The subject of this paper is the complementary use of a data base and an expert system in the analysis of urban areas in territorial planning. After a discussion of urban reasoning, the paper states the fundamental principles of the URBYS system: storage of the urban planner's knowledge; use of this knowledge by the expert system to assist decision-making; and transfer of information from the urban data base to the expert system. A special effort was made to keep the system easy to use and close to the expert's own method, allowing URBYS' knowledge to be modified easily without affecting overall coherence.

10.
The Nanostream (Pasadena, CA) Veloce system, together with 24-column Brio cartridges, offers a novel approach to micro parallel liquid chromatography (μPLC). This system allows users to achieve unprecedented throughput for standard assays while matching the performance of conventional LC instrumentation, thus enabling routine compound purity assessment and physicochemical property profiling early in the drug discovery and development process. The Veloce system, which includes instrumentation, software, and replaceable microfluidic cartridges, incorporates pressure-driven flow to achieve chromatograms comparable to conventional high performance liquid chromatography (HPLC) instrumentation for a broad class of analytical applications while offering a dramatic increase in sample analysis capacity. The system enables parallel chromatographic separations and simultaneous, real-time UV detection. Each Nanostream Brio cartridge, made of polymeric materials, incorporates 24 columns packed with standard (C-18) stationary phase material to achieve reverse phase separations. Mixing and distribution of the mobile phase to each of the 24 columns is precisely controlled in each cartridge. The system provides an ideal platform to accelerate assessment of compound purity and physicochemical properties (e.g., log P and CHI) for a large number of compounds. In addition, the 24-fold increase in sample analysis capacity allows standard curve generation and simultaneous analysis of multiple replicates of samples in a single run.

11.
Integer subdivision algorithm for rendering NURBS curves
An integer version of the well-known subdivision algorithm for NURBS curves is presented here. The algorithm is used to render NURBS curves of any degree on a raster device using a piecewise linear approximation. The approximation is independent of the parametrization, that is, independent of the weights used. The maximum deviation between a precisely sampled curve and the subdivision-based rendering is one pixel, an inherent feature of the subdivision technique. The algorithm works entirely in the screen coordinate system and produces smooth renderings of curves without oversampling. The integer arithmetic allows relatively complex curves of degree 2 to 8 to be rendered within a fraction of a second on an i80286/386 processor, and is a good candidate for hardware implementation.
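The underlying subdivide-until-flat idea can be sketched in floating point for a plain Bézier curve. Note the hedge: the paper's contribution is an integer, weight-aware NURBS version; this sketch is neither integer nor rational, it only shows the recursive de Casteljau subdivision with a flatness test.

```python
import math

def de_casteljau_split(pts, t=0.5):
    """Split a Bézier curve of any degree at parameter t via de Casteljau."""
    left, right, level = [pts[0]], [pts[-1]], list(pts)
    while len(level) > 1:
        level = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
                 for p, q in zip(level, level[1:])]
        left.append(level[0])
        right.insert(0, level[-1])
    return left, right

def render(pts, flat_tol, out):
    """Recursively subdivide until the control polygon is nearly as short
    as its chord, then emit one line segment per flat piece."""
    chord = math.dist(pts[0], pts[-1])
    poly = sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))
    if poly - chord <= flat_tol:
        out.append((pts[0], pts[-1]))
    else:
        l, r = de_casteljau_split(pts)
        render(l, flat_tol, out)
        render(r, flat_tol, out)

segs = []
render([(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)], 1e-3, segs)
# The polyline starts and ends at the curve's endpoints, and consecutive
# segments share endpoints exactly (subdivision reuses the split point).
assert segs[0][0] == (0.0, 0.0) and segs[-1][1] == (2.0, 0.0)
assert all(a[1] == b[0] for a, b in zip(segs, segs[1:]))
```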

12.
It is shown, for a model problem, that the consistency error of a finite element method based on noncompatible trial functions vanishes if the underlying variational principle is extended to the noncompatible trial function set.

13.
14.
The tension leg platform (TLP) is a new concept in platform design that combines the functions of floating, drilling, and production platforms. The deck is secured by vertical tension legs attached to pile foundation templates on the sea bed. This paper calculates the short- and long-term statistics of the stress response to natural short- and long-crested seas, and the resulting fatigue damage, for certain critical members of a proposed TLP. A probabilistic approach is adopted, utilising a spectral method to calculate the number of wave-induced cycles at different stress levels. The nonlinear finite element method is used to determine root mean square deflections and stresses in randomly excited TLPs. The program RANDOM is developed to solve the various theoretical equations and to evaluate the fatigue life. The basic input to the program consists of material properties, the S-N curve, and environmental data.
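The fatigue-life step described above (cycle counts per stress level plus an S-N curve) is conventionally combined via Palmgren–Miner cumulative damage. A sketch under that standard assumption; the S-N parameters and cycle histogram below are made-up illustrative values, not the paper's data.

```python
def sn_cycles_to_failure(stress, a, m):
    """S-N curve in the common power-law form N = a * S**(-m)."""
    return a * stress ** (-m)

def miner_damage(cycle_counts, a, m):
    """Palmgren-Miner cumulative damage: sum of n_i / N_i over levels."""
    return sum(n / sn_cycles_to_failure(s, a, m) for s, n in cycle_counts)

# Hypothetical S-N parameters and annual wave-induced cycle histogram.
a, m = 1e12, 3.0
cycles = [(50.0, 1e5), (100.0, 1e4), (200.0, 1e3)]  # (stress MPa, cycles/yr)

damage_per_year = miner_damage(cycles, a, m)
fatigue_life_years = 1.0 / damage_per_year   # failure when damage reaches 1
assert abs(damage_per_year - 0.0305) < 1e-9
```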

15.
Outdoor Visual Position Estimation for Planetary Rovers
This paper describes (1) a novel, effective algorithm for outdoor visual position estimation; (2) the implementation of this algorithm in the Viper system; and (3) the extensive tests that have demonstrated the superior accuracy and speed of the algorithm. The Viper system (Visual Position Estimator for Rovers) is geared towards robotic space missions, and the central purpose of the system is to increase the situational awareness of a rover operator by presenting accurate position estimates. The system has been extensively tested with terrestrial and lunar imagery, in terrains ranging from moderate (the rounded hills of Pittsburgh and the high deserts of Chile) to rugged (the dramatic relief of the Apollo 17 landing site) to extreme (the jagged peaks of the Rockies). Results have consistently demonstrated that the visual estimation algorithm estimates position with an accuracy and reliability that greatly surpass previous work.

16.
Predictive models for assessing the risk of developing lung cancer can help identify high-risk individuals with the aim of recommending further screening and early intervention. To facilitate pre-hospital self-assessment, some studies have exploited predictive models trained on non-clinical data (e.g., smoking status and family history). The performance of these models is limited because they do not consider clinical data (e.g., blood test and medical imaging results). Deep learning has shown potential in processing complex data that combine both clinical and non-clinical information. However, predicting lung cancer remains difficult due to the severe lack of positive samples among follow-ups. To tackle this problem, this paper presents a generative-discriminative framework for improving the ability of deep learning models to generalize. In the proposed framework, two nonlinear generative models, one based on the generative adversarial network and another on the variational autoencoder, are used to synthesize auxiliary positive samples for the training set. Then, several discriminative models, including a deep neural network (DNN), are used to assess lung cancer risk based on a comprehensive list of risk factors. The framework was evaluated on over 55,000 subjects questioned between January 2014 and December 2017, with 699 subjects clinically diagnosed with lung cancer between January 2014 and August 2019. The best performing predictive model built using the proposed framework was based on the DNN. It achieved an average sensitivity of 76.54% and an area under the curve of 69.24% in distinguishing between lung cancer cases and normal cases on the test sets.
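The class-imbalance step above can be illustrated with a toy stand-in: instead of a learned GAN/VAE, jitter real positive samples to synthesize auxiliary positives until the classes are balanced. This is only a sketch of the augmentation mechanics, not the paper's generative models.

```python
import random

def augment_minority(X, y, target_ratio=1.0, noise=0.05, seed=0):
    """Synthesize minority-class (label 1) samples by adding Gaussian
    jitter to real positives until positives reach target_ratio times
    the number of negatives. A GAN/VAE would instead sample from a
    learned distribution; jitter is a toy stand-in."""
    rng = random.Random(seed)
    pos = [x for x, label in zip(X, y) if label == 1]
    n_pos, n_neg = len(pos), len(X) - len(pos)
    X2, y2 = list(X), list(y)
    while n_pos < target_ratio * n_neg:
        base = rng.choice(pos)
        X2.append([v + rng.gauss(0, noise) for v in base])
        y2.append(1)
        n_pos += 1
    return X2, y2

# Imbalanced toy dataset: 8 negatives, 2 positives.
X = [[0.0, 0.0]] * 8 + [[1.0, 1.0]] * 2
y = [0] * 8 + [1] * 2
X2, y2 = augment_minority(X, y)
assert sum(y2) == 8 and len(X2) == 16   # balanced after augmentation
```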

17.
This paper explores the macro data flow approach for solving numerical applications on distributed memory systems. We discuss the problems of this approach using a sophisticated 'real-life' algorithm: the adaptive full multigrid method.

It is shown that the nonnumeric parts of the algorithm (the initialization, the termination, and the mapping of processes to processors) are very important for the overall performance.

To avoid unnecessary global synchronization points, we propose using distributed supervisors. We compare this solution with more centralized algorithms. The performance evaluation is done for nearest-neighbour and bus-connected multiprocessors using a simulation system.


18.
Financial restatements have been a major concern for regulators, investors and market participants. Most previous studies focus only on fraudulent (intentional) restatements, and the literature has largely ignored unintentional restatements. Earlier studies have shown that large-scale unintentional restatements can be equally detrimental and may erode investors' confidence, so it is important to pay close attention to significant unintentional restatements as well. A lack of focus on unintentional restatements could lead to a more relaxed internal control environment and lessen efforts to curb managerial oversights and instances of misreporting. To address this research gap, we develop predictive models based on both intentional (fraudulent) and unintentional (erroneous) financial restatements using a comprehensive real dataset of 3,513 restatement cases over the period 2001 to 2014. To the best of our knowledge, this is the most comprehensive dataset used in financial restatement predictive models. Our study also contributes to the data mining literature by (i) applying various data mining techniques and presenting a comparative analysis, and (ii) ensuring the robustness of the predictive models over different time periods. We employ the widely used data mining techniques in this area, namely Decision Tree (DT), Artificial Neural Network (ANN), Naïve Bayes (NB), Support Vector Machine (SVM), and Bayesian Belief Network (BBN) classifiers, while developing the predictive models. We find that the ANN outperforms the other algorithms in our empirical setup in terms of accuracy and area under the ROC curve. Notably, our models remain consistent over the full sample period (2001-2014), the pre-financial-crisis period (2001-2008), and the post-financial-crisis period (2009-2014). We believe this study will benefit academics, regulators, policymakers and investors. In particular, regulators and policymakers can pay close attention to suspected firms, and investors can take action in advance to reduce their investment risks. The results can also help improve expert and intelligent systems by providing more insight into both intentional and unintentional financial restatements.
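The area-under-the-ROC-curve metric used above has a compact rank-based formulation: it is the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case (ties counted as one half). A minimal sketch of that computation:

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney formulation: fraction of positive/negative
    pairs in which the positive outscores the negative (ties count 1/2)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfect ranking of positives above negatives gives AUC = 1.0;
# a perfectly inverted ranking gives 0.0.
assert auc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]) == 1.0
assert auc([0.1, 0.9], [1, 0]) == 0.0
```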

19.
It is shown, for a model problem, that the consistency error of a finite element method based on noncompatible trial functions vanishes if the underlying variational principle is extended to the noncompatible trial function set.

20.
This paper presents a simulation-aided approach for designing organizational structures in manufacturing systems. The approach is based on a detailed modeling and characterization of the forecasted order program, especially of elementary processes, activity networks and manufacturing orders. Using the organization modeling system FORM, developed at the ifab-Institute of Human and Industrial Engineering of the University of Karlsruhe, structuring strategies (e.g., a process-oriented strategy) can be applied to design organizational structures in manufacturing systems in a flexible and efficient way. A dynamical analysis of the created manufacturing structures can then be carried out with the simulation tool FEMOS, which has also been developed at the ifab-Institute. The evaluation module of FEMOS measures the designed solutions against logistical (e.g., lead-time degree) and organizational (e.g., degree of autonomy) key data. This evaluation is the basis for identifying effective manufacturing systems and improvement potential. Finally, a case study is presented in which different organizational structures of a manufacturing system producing gear boxes and robot grip arms are designed and analyzed.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号