Similar Documents
20 similar documents found (search time: 0 ms)
1.
This paper presents a scheme for decomposing polyhedra called multi-LREP. The scheme is based on the L-REP decomposition, which classifies the triangular faces of a polyhedron into a set of layered tetrahedra. In the multi-LREP these layered tetrahedra are grouped into regions of a space subdivision. The paper also describes an efficient method for constructing the L-REP decomposition and how the multi-LREP can be applied to speed up two L-REP applications: the point-in-polyhedron inclusion test and the ray-scene intersection. An experimental comparison with other point-in-polyhedron tests is presented as well.
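The layered-tetrahedra classification ultimately reduces point location to orientation tests against individual tetrahedra. As a minimal sketch of that primitive (not the paper's multi-LREP code; function names are our own), a point-in-tetrahedron test via signed volumes might look like:

```python
import numpy as np

def signed_volume(a, b, c, d):
    # Proportional to the signed volume of tetrahedron (a, b, c, d).
    return np.linalg.det(np.array([b - a, c - a, d - a]))

def point_in_tetrahedron(p, a, b, c, d):
    # p lies inside (or on the boundary of) tetrahedron abcd iff substituting
    # p for each vertex in turn never flips the orientation.
    ref = signed_volume(a, b, c, d)
    tests = (signed_volume(p, b, c, d),
             signed_volume(a, p, c, d),
             signed_volume(a, b, p, d),
             signed_volume(a, b, c, p))
    return all(s * ref >= 0 for s in tests)
```

An inclusion test for a whole polyhedron would then iterate such checks over the tetrahedra stored in the region of the space subdivision containing the query point.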

2.
Color vision supports two distinct visual functions: discrimination and constancy. Discrimination requires that the visual response to distinct objects within a scene be different. Constancy requires that the visual response to any object be the same across scenes. Across changes in scene, adaptation can improve discrimination by optimizing the use of the available response range. Similarly, adaptation can improve constancy by stabilizing the visual response to any fixed object across changes in illumination. Can common mechanisms of adaptation achieve these two goals simultaneously? We develop a theoretical framework for answering this question and present several example calculations. In the examples studied, the answer is largely yes when the change of scene consists of a change in illumination and considerably less so when the change of scene consists of a change in the statistical ensemble of surface reflectances in the environment.

3.
Level set methods are a popular and powerful class of numerical algorithms for dynamic implicit surfaces and solution of Hamilton-Jacobi PDEs. While the advanced level set schemes combine both efficiency and accuracy, their implementation complexity makes it difficult for the community to reproduce new results and make quantitative comparisons between methods. This paper describes the Toolbox of Level Set Methods, a collection of Matlab routines implementing the basic level set algorithms on fixed Cartesian grids for rectangular domains in arbitrary dimension. The Toolbox’s code and interface are designed to permit flexible combinations of different schemes and PDE forms, allow easy extension through the addition of new algorithms, and achieve efficient execution despite the fact that the code is entirely written as m-files. The current contents of the Toolbox and some coding patterns important to achieving its flexibility, extensibility and efficiency are briefly explained, as is the process of adding two new algorithms. Code for both the Toolbox and the new algorithms is available from the Web.
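The Toolbox itself is Matlab; purely as an illustrative sketch of the simplest scheme in this family (first-order upwind advection of a level set function on a fixed 1-D Cartesian grid; the function name and grid setup are our own, not the Toolbox API), one basic building block looks like:

```python
import numpy as np

def upwind_advect(phi, v, dx, dt, steps):
    """First-order upwind scheme for phi_t + v * phi_x = 0 on a periodic 1-D grid."""
    phi = phi.copy()
    for _ in range(steps):
        if v >= 0:
            dphi = (phi - np.roll(phi, 1)) / dx   # backward difference (upwind for v > 0)
        else:
            dphi = (np.roll(phi, -1) - phi) / dx  # forward difference (upwind for v < 0)
        phi = phi - dt * v * dphi
    return phi
```

The zero level set of `phi` implicitly tracks the moving interface; advanced schemes in the Toolbox replace this first-order stencil with high-order ENO/WENO approximations and TVD Runge-Kutta time integrators.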

4.
Despite the growing importance of ergonomics and ergonomists worldwide, the position of ergonomics in companies is often not clear. Today, in many countries ergonomics is mainly (or even only) associated with the reduction of risks of work-related musculoskeletal disorders (WMSDs). Therefore, many companies consider ergonomics a part of occupational safety and health (OSH) that focuses mainly on the reduction of risks. This paper aims to analyse the links between occupational ergonomics and OSH. The position of occupational ergonomics in legislation, the presence of ergonomics in OSH networks, and the position of ergonomics in OSH company services are discussed. In addition, the added value of ergonomics to companies is examined. From these discussions, it becomes clear that ergonomics should be part of the OSH policy of companies, and should be integrated into today's company strategies to improve labour conditions. If ergonomics is considered as a discipline in its own right, a clear legislative context should be developed that goes beyond voluntary guidelines and the goodwill of employers, and necessitates the presence of ergonomics professionals in companies.

5.
In this paper we study the optimal stochastic control problem for stochastic differential equations on Riemannian manifolds. The cost functional is specified by controlled backward stochastic differential equations in Euclidean space. Under some suitable assumptions, we conclude that the value function is the unique viscosity solution to the associated Hamilton–Jacobi–Bellman equation which is a fully nonlinear parabolic partial differential equation on Riemannian manifolds.
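In the Euclidean special case, an HJB equation of the kind referred to here typically takes the following fully nonlinear parabolic form (a generic sketch in our own notation: V is the value function, b and σ the controlled drift and diffusion, f the BSDE driver, h the terminal cost; on a Riemannian manifold the gradient and Hessian would be taken with respect to the Levi-Civita connection):

```latex
\partial_t V(t,x)
  + \inf_{u \in U} \Big\{
      \tfrac{1}{2}\,\mathrm{tr}\!\big(\sigma\sigma^{\top}(x,u)\,\nabla^{2} V(t,x)\big)
      + \big\langle b(x,u), \nabla V(t,x) \big\rangle
      + f\big(x, u, V(t,x), \sigma^{\top}(x,u)\,\nabla V(t,x)\big)
    \Big\} = 0,
\qquad V(T,x) = h(x).
```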

6.
The method of harmonic linearization is used to obtain an approximate optimal control law for a second-order non-linear state-regulator problem.

7.
The array of Normalized Difference Vegetation Index (NDVI) products now being derived from satellite imagery opens up new opportunities for the study of short- and long-term variability in climate. Using a time series analysis procedure based on the Principal Components transform, and a sequence of monthly Advanced Very High Resolution Radiometer (AVHRR)-derived NDVI imagery from 1986 through 1990, we examine trends in variability of vegetation greenness for Africa for evidence of climatic trends. In addition to the anticipated seasonal trends, we identify signals of interannual variability. The most readily identified is one that periodically affects Southern Africa. It is shown that the temporal loadings for this component exhibit a very strong relationship with the El Niño/Southern Oscillation (ENSO) Index derived from atmospheric pressure patterns in the Pacific, with Pacific sea surface temperature (SST) anomalies, and with anomalous Outgoing Longwave Radiation (OLR). However, we have also detected a second interannual variation, affecting most particularly East Africa and the Sahel, that does not exhibit a consistent ENSO relationship. The results show the teleconnection patterns between climatic conditions in the Pacific Ocean basin and vegetation conditions at specific regional locations over Africa. The comprehensive spatial character and high temporal resolution of these data offer exciting prospects for deriving a land surface index of ENSO and mapping the impacts of ENSO activity at continental scale. This study illustrates that vegetation reflectance data derived from polar orbiting satellites can serve as a good proxy for the study of interannual climate variability.
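The Principal Components transform of an image time series can be sketched with a standard SVD (a hedged illustration, not the authors' code; `temporal_pca` and its arguments are our own names): each retained component yields the temporal loadings that the study correlates with ENSO indices, together with a spatial weight map.

```python
import numpy as np

def temporal_pca(series, n_components=2):
    """PCA of a (time x pixel) image stack.
    Returns the temporal loadings and the spatial patterns of the leading components."""
    x = series - series.mean(axis=0)                     # remove each pixel's mean
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    loadings = u[:, :n_components] * s[:n_components]    # time series of each component
    patterns = vt[:n_components]                         # spatial weight maps
    return loadings, patterns
```

For a stack dominated by a single oscillation (e.g. a seasonal or ENSO-like signal), the first component's loadings recover that shared time series up to sign and scale.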

8.
In industrial applications, optimal control problems frequently appear in the context of decision-making under incomplete information. In such a framework, decisions must be adapted dynamically to account for possible regime changes in the underlying dynamics. Using stochastic filtering theory, Markovian evolution can be modelled in terms of latent variables, which naturally leads to a high-dimensional state space, making practical solutions to these control problems notoriously challenging. In our approach, we exploit a specific structure of this problem class to present a solution in terms of simple, reliable, and fast algorithms. The algorithms presented in this paper have already been implemented in an R package.

9.
Phase synchronization analysis has been demonstrated to be a useful method to infer brain function and neural activity from electroencephalography (EEG) signals. The phase locking value (PLV) is one of the most important tools for phase synchronization analysis. Although the traditional method (TM) of calculating the PLV, which is based on the Hilbert transform, has been applied extensively, some of the methodological problems of the TM remain unsolved. To address these problems, this paper proposes an improved method (IM) to calculate the PLV based on the Hilbert–Huang transform. In the IM, the Hilbert–Huang transform, instead of the Hilbert transform, is used to process non-stationary EEG signals, and empirical mode decomposition, rather than a band-pass filter, is used to extract the target frequency band. The performance of the IM is evaluated by comparing normal and hypoxia EEG signals. The PLVs are used as features for a least squares support vector machine to recognize normal and hypoxia EEG. Experimental results show that the PLVs calculated by the IM distinguish the EEG signals better than those calculated by the TM.
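The Hilbert-transform baseline (the TM above, not the paper's Hilbert–Huang IM) is compact enough to sketch: extract each signal's instantaneous phase from its analytic signal, then measure how consistent the phase difference is over time.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two equally sampled signals via the analytic (Hilbert) phase.
    Returns a value in [0, 1]: 1 = perfectly locked phase difference, 0 = none."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
```

In the IM described above, the band-limited inputs would instead come from empirical mode decomposition of the raw EEG, with the Hilbert–Huang transform supplying the instantaneous phases.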

10.
Although investors in financial markets have access to information from both mass media and social media, trading platforms that curate and provide this information have little to go by in terms of understanding the difference between these two types of media. This paper compares social media with mass media in the stock market, focusing on information coverage diversity and predictive value with respect to future stock absolute returns. Based on a study of nearly a million stock-related news articles from the Sina Finance news platform and 12.7 million stock-related social media messages from the popular Weibo platform in China, we find that social media covers fewer stocks than mass media, and this effect is amplified as the volume of media information increases. We find that there is some short-term predictive value from these sources, but they differ. Although mass media information coverage is more predictive than social media information coverage over a one-day horizon, the reverse holds over a two- to five-day horizon. These empirical results suggest that social media and mass media serve stock market investors differently. We draw connections to theories related to how crowds and experts differ and offer practical implications for the design of media-related IS systems.

11.
Consider a resource allocation problem on the following system. The system consists of m identical parallel machines and is alive only when all the machines are alive. To keep a machine alive, it requires resources (material, fuel, etc.). Resources of various sizes arrive one by one and the goal is to keep the system alive as long as possible. The problem has applications in many areas such as the sequencing of maintenance actions for modular gas turbine aircraft engines [1]. Using scheduling term…
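One natural online policy for this setting (a hedged sketch of a plausible heuristic, not necessarily the algorithm analysed in the paper) is to give each arriving resource to the machine with the least fuel so far; since the system dies as soon as any machine runs dry, the achieved lifetime is proportional to the minimum total any machine receives.

```python
def greedy_assign(resources, m):
    """Online greedy: route each arriving resource to the least-supplied machine.
    Returns the minimum total received by any machine, which bounds the
    system's lifetime (the system is alive only while all m machines are)."""
    totals = [0.0] * m
    for r in resources:
        totals[totals.index(min(totals))] += r   # least-loaded machine first
    return min(totals)
```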

12.
In many decision problems a set of actions is evaluated with respect to a set of points of view, called criteria. This paper has two aims: first, to compare the level-dependent Choquet integral recently introduced by Greco et al. with another transformation of the Choquet integral proposed by Havranová and Kalina; second, to look for an appropriate utility function in a given setting. We illustrate our approach on a practical example, utilizing the level-dependent Choquet integral.
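For reference, the classical (level-independent) discrete Choquet integral that these transformations generalize can be computed by sorting the criterion scores and weighting each increment by the capacity of the corresponding upper level set (a minimal sketch in our own notation; the level-dependent variant would let the capacity change with the level):

```python
def choquet_integral(values, capacity):
    """Discrete Choquet integral of `values` (criterion -> score >= 0)
    with respect to `capacity` (frozenset of criteria -> weight in [0, 1])."""
    items = sorted(values.items(), key=lambda kv: kv[1])   # scores ascending
    total, previous = 0.0, 0.0
    for i, (criterion, score) in enumerate(items):
        level_set = frozenset(c for c, _ in items[i:])     # criteria scoring >= score
        total += (score - previous) * capacity[level_set]
        previous = score
    return total
```

With an additive capacity the Choquet integral reduces to an ordinary weighted mean; non-additive capacities model interaction (synergy or redundancy) between criteria.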

13.
This paper reports the results of novel quantitative research on multiple people’s personal note-taking in meetings with the long-term aim of aiding the creation of innovative meeting understanding applications. We present three experiments using a large number of group meetings taken from the Augmented Multi-party Interaction meeting corpus. Statistical techniques were employed for this work. Our findings suggest that temporal note-taking overlap information and the semantic content of the written private notes taken by many meeting participants both point to the majority of the most informative meeting events. Thus, the characteristics of note-taking can be seen as a contributing feature for new automatic meeting summarisation approaches and for the development of future meeting browser environments that better support the needs of individuals and organisations.

14.
We consider the problem of learning in repeated general-sum matrix games when a learning algorithm can observe the actions but not the payoffs of its associates. Due to the non-stationarity of the environment caused by learning associates in these games, most state-of-the-art algorithms perform poorly in some important repeated games due to an inability to make profitable compromises. To make these compromises, an agent must effectively balance competing objectives, including bounding losses, playing optimally with respect to current beliefs, and taking calculated, but profitable, risks. In this paper, we present, discuss, and analyze M-Qubed, a reinforcement learning algorithm designed to overcome these deficiencies by encoding and balancing best-response, cautious, and optimistic learning biases. We show that M-Qubed learns to make profitable compromises across a wide range of repeated matrix games played with many kinds of learners. Specifically, we prove that M-Qubed’s average payoffs meet or exceed its maximin value in the limit. Additionally, we show that, in two-player games, M-Qubed’s average payoffs approach the value of the Nash bargaining solution in self-play. Furthermore, it performs very well when associating with other learners, as evidenced by its robust behavior in round-robin and evolutionary tournaments of two-player games. These results demonstrate that an agent can learn to make good compromises, and hence receive high payoffs, in repeated games by effectively encoding and balancing best-response, cautious, and optimistic learning biases.
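The maximin value that bounds M-Qubed's limiting payoff is the row player's security level, computable by a small linear program (a standard construction, sketched here in our own notation, not the paper's code): maximize v subject to every opponent column yielding expected payoff at least v under the mixed strategy p.

```python
import numpy as np
from scipy.optimize import linprog

def maximin_value(payoff):
    """Security level of the row player in a matrix game, via linear programming.
    payoff[i, j] = row player's payoff for row i against column j."""
    n_rows, n_cols = payoff.shape
    # Variables: p_1..p_n (mixed strategy) and v (guaranteed payoff); maximise v.
    c = np.zeros(n_rows + 1)
    c[-1] = -1.0                                         # linprog minimises, so use -v
    a_ub = np.hstack([-payoff.T, np.ones((n_cols, 1))])  # v - sum_i p_i A[i,j] <= 0
    b_ub = np.zeros(n_cols)
    a_eq = np.ones((1, n_rows + 1))
    a_eq[0, -1] = 0.0                                    # probabilities sum to 1
    res = linprog(c, A_ub=a_ub, b_ub=b_ub, A_eq=a_eq, b_eq=[1.0],
                  bounds=[(0, 1)] * n_rows + [(None, None)])
    return res.x[-1]
```

An agent whose average payoff never falls below this value has bounded its losses regardless of what its associates learn to play.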

15.
In light of multi-continued fraction theory, we classify and count the multi-strict continued fractions corresponding to multi-sequences of multiplicity m and length n. Based on this counting, we develop an iterative formula for fast computation of the linear complexity distribution of multi-sequences. As an application, we obtain the linear complexity distributions and expectations of multi-sequences of any given length n and multiplicity m less than 12 on a personal computer, though only the results for m = 3 and 4 are given in this paper.
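For a single binary sequence (multiplicity m = 1), the linear complexity whose distribution is studied here is classically computed by the Berlekamp-Massey algorithm, sketched below over GF(2); the paper's multi-continued-fraction approach generalizes this notion to m parallel sequences.

```python
def linear_complexity(seq):
    """Linear complexity of a binary sequence (Berlekamp-Massey over GF(2)):
    the length of the shortest LFSR generating the sequence."""
    n = len(seq)
    c, b = [1] + [0] * n, [1] + [0] * n   # current / previous connection polynomials
    l, m = 0, -1                           # current LFSR length, last length-change position
    for i in range(n):
        d = seq[i]                         # discrepancy of the current LFSR at step i
        for j in range(1, l + 1):
            d ^= c[j] & seq[i - j]
        if d:
            t = c[:]
            shift = i - m
            for j in range(n + 1 - shift):
                c[j + shift] ^= b[j]       # c(x) += x^shift * b(x)  (mod 2)
            if 2 * l <= i:
                l, m, b = i + 1 - l, i, t
    return l
```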

16.
We present a hybrid approach to simulate global illumination and soft shadows at interactive frame rates. The strengths of hardware-accelerated GPU techniques are combined with CPU methods to achieve physically consistent results while maintaining reasonable performance. The process of image synthesis is subdivided into multiple passes accounting for the different illumination effects. While direct lighting is rendered efficiently by rasterization, soft shadows are simulated using a novel approach combining the speed of shadow mapping and the accuracy of visibility ray tracing. A shadow refinement mask is derived from the result of the direct lighting pass and from a small number of shadow maps to identify the penumbral region of an area light source. This region is accurately rendered by ray tracing. For diffuse indirect illumination, we introduce radiosity photons to profit from the flexibility of a point-based sampling while maintaining the benefits of interpolation over scattered data approximation or density estimation. A sparse sampling of the scene is generated by particle tracing. An area is approximated for each point sample to compute the radiosity solution using a relaxation approach. The indirect illumination is interpolated between neighboring radiosity photons, stored in a multidimensional search tree. We compare different neighborhood search algorithms in terms of image quality and performance. Our method yields interactive frame rates and results consistent with path tracing reference solutions.

17.
Pointfree formulation means suppressing domain variables to focus on higher-level objects (functions, relations). Advantages are algebraic-style calculation and abstraction of the kind formal logics pursue by axiomatization. Various specific uses are considered, starting with quantification in the wider sense (∀, ∃, ∑, etc.). Pointfree style is achieved by suitable functionals that prove superior to pointwise conventions such as the Eindhoven notation. Pointwise calculations from the literature are reworked in pointfree form. The second use considered is in describing systems, with generic functionals capturing signal flow patterns. An illustration is the mathematics behind a neat magician’s trick, whose implementation allows comparing the pointfree style in Funmath, LabVIEW, TLA+, Haskell and Maple. The third use is making temporal logic calculational, with a simple generic Functional Temporal Calculus (FTC) for unification. Specific temporal logics are then captured via endosemantic functions. The example is TLA+. Calculation is illustrated by deriving various theorems, most related to liveness issues, and discovering results by calculation rather than proving them afterwards. To conclude, various ramifications, style and abstraction issues are discussed, in relation to engineering mathematics in general and to categorical formulations.

18.
Constitutional design and redesign is constant. Over the last 200 years, countries have replaced their constitutions an average of every 19 years and some have amended them almost yearly. A basic problem in the drafting of these documents is the search and analysis of model text deployed in other jurisdictions. Traditionally, this process has been ad hoc and the results suboptimal. As a result, drafters generally lack systematic information about the institutional options and choices available to them. In order to address this informational need, the investigators developed a web application, Constitute [online at http://www.constituteproject.org], with the use of semantic technologies. Constitute provides searchable access to the world’s constitutions using the conceptualization, texts, and data developed by the Comparative Constitutions Project. An OWL ontology represents 330 “topics” (e.g. the right to health) with which the investigators have tagged relevant provisions of nearly all constitutions in force as of September 2013. The tagged texts were then converted to an RDF representation using R2RML mappings and Capsenta’s Ultrawrap. The portal implements semantic search features to allow constitutional drafters to read, search, and compare the world’s constitutions. The goal of the project is to improve the efficiency and systemization of constitutional design and, thus, to support the independence and self-reliance of constitutional drafters.

19.
We consider triply-nested loops of the type that occur in the standard Gaussian elimination algorithm, which we denote by GEP (the Gaussian Elimination Paradigm). We present two related cache-oblivious methods, I-GEP and C-GEP, both of which reduce the number of cache misses incurred (or I/Os performed) by the computation over standard GEP by a factor of √M, where M is the size of the cache. Cache-oblivious I-GEP computes in-place and solves most of the known applications of GEP, including Gaussian elimination and LU-decomposition without pivoting, and Floyd-Warshall all-pairs shortest paths. Cache-oblivious C-GEP uses a modest amount of additional space, but is completely general and applies to any code in GEP form. Both I-GEP and C-GEP produce system-independent cache-efficient code, and are potentially usable by optimizing compilers for loop transformation.
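The Floyd-Warshall algorithm is the canonical example of a GEP-style triply-nested loop; the sketch below shows the plain (cache-unaware) form that I-GEP and C-GEP restructure recursively, not the cache-oblivious versions themselves.

```python
def floyd_warshall(dist):
    """All-pairs shortest paths: the standard GEP-style triply-nested loop.
    `dist` is an n x n matrix of edge weights (float('inf') where no edge);
    it is updated in place and returned."""
    n = len(dist)
    for k in range(n):          # the "pivot" loop, analogous to elimination steps
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```

The cache-oblivious variants obtain their √M savings by replacing this row-by-row sweep with a recursive decomposition of the index space into submatrix-sized blocks.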

20.
Similarity is one of the most important abstract concepts in human perception of the world. In computer vision, numerous applications deal with comparing objects observed in a scene with some a priori known patterns. Often, it happens that while two objects are not similar, they have large similar parts, that is, they are partially similar. Here, we present a novel approach to quantify partial similarity using the notion of Pareto optimality. We exemplify our approach on the problems of recognizing non-rigid geometric objects, images, and analyzing text sequences.
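The Pareto-optimality machinery underlying such partial-similarity criteria reduces to extracting the non-dominated set of candidate trade-offs (e.g. part dissimilarity versus part size). As a generic sketch (our own minimal formulation, with lower values better in every coordinate, not the paper's specific objectives):

```python
def dominates(q, p):
    # q dominates p if q is at least as good everywhere and strictly better somewhere.
    return (all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p)))

def pareto_front(points):
    """Non-dominated subset of a list of objective tuples (lower is better)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Each point on the resulting front represents an optimal compromise; partial similarity is then read off as the whole front rather than a single scalar score.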


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号