Similar Documents
A total of 20 similar documents were retrieved (search time: 15 ms).
1.
A distributed implementation of the Spatially-Explicit Individual-Based Simulation Model of Florida Panther and White-Tailed Deer in the Everglades and Big Cypress Landscapes (SIMPDEL) is presented. SIMPDEL models the impact of different water management strategies in the South Florida region on the white-tailed deer and Florida panther populations. It models the interaction of four interrelated components – vegetation, hydrology, white-tailed deer and Florida panther – over time spans of up to several decades. The serial and distributed models produced very similar bioenergetic and survival statistics. A performance evaluation of the two models revealed moderate speed improvements for the distributed model (referred to as DSIMPDEL). On an ATM-based network of SUN Ultra 2 workstations, the 4-processor configuration attained a speedup of 3.83 over the serial model executing on a single SUN Ultra 2 workstation for small deer populations.
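For reference (this follows directly from the reported figures rather than being stated in the abstract), a speedup of \(S=3.83\) on \(p=4\) processors corresponds to a parallel efficiency of
\[ E=\frac{S}{p}=\frac{3.83}{4}\approx 0.96, \]
i.e. roughly 96% of ideal linear scaling for the small-population case.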

2.
Interest matching is an important data-filtering mechanism for large-scale distributed virtual environments. Many existing algorithms perform interest matching at discrete timesteps and may therefore suffer from the missing-event problem: failing to report events that occur between two consecutive timesteps. Some algorithms mitigate this by setting short timesteps, but at low computational efficiency; even then they cannot capture all events, and spurious events may also be reported. In this paper, we present an accurate interest matching algorithm, called the predictive interest matching (PIM) algorithm, which is able to capture the missing events between discrete timesteps. The PIM algorithm uses polynomial functions to model the movements of virtual entities and accurately predicts the time intervals of the region overlaps associated with those entities. Based on this prediction of the space–time intersection of regions, our algorithm captures all missing events while reporting no spurious events. To improve runtime performance, a technique called region pruning is proposed and used in our algorithm. In experiments, we compare the new algorithm with the frequent interest matching algorithm and the space–time interest matching algorithm on the HLA/RTI distributed infrastructure. The results show that, although an additional matching effort is required, the new algorithm outperforms the baselines in terms of event-capturing ability, redundant-matching avoidance, runtime efficiency and scalability.
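The abstract does not reproduce the PIM algorithm itself; the sketch below only illustrates the underlying idea of predicting overlap intervals from polynomial motion models, here for two one-dimensional regions whose centres follow polynomial trajectories. The function name, the root-finding approach and the midpoint test are illustrative assumptions, not the paper's method.

```python
import numpy as np

def overlap_intervals(c1, c2, extent, t0, t1):
    """Sub-intervals of [t0, t1] during which two 1-D regions overlap.

    c1, c2 : polynomial coefficients (highest degree first) modelling the
             region centres as functions of time.
    extent : sum of the two regions' half-widths; the regions overlap
             whenever |c1(t) - c2(t)| <= extent.
    """
    diff = np.polysub(c1, c2)
    # Overlap boundaries are real roots of (c1 - c2) = +extent or -extent.
    candidates = np.concatenate([np.roots(np.polyadd(diff, [-extent])),
                                 np.roots(np.polyadd(diff, [extent]))])
    cuts = sorted({t0, t1} | {r.real for r in candidates
                              if abs(r.imag) < 1e-9 and t0 < r.real < t1})
    intervals = []
    for a, b in zip(cuts[:-1], cuts[1:]):
        mid = 0.5 * (a + b)
        # Keep the sub-interval if the regions overlap at its midpoint.
        if abs(np.polyval(c1, mid) - np.polyval(c2, mid)) <= extent:
            intervals.append((a, b))
    return intervals

# Example: two entities approaching and then separating between t = 0 and 10.
print(overlap_intervals([1.0, -3.0], [0.0, 2.0], extent=1.0, t0=0.0, t1=10.0))
```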

3.
The increasing demand for higher-resolution images and higher frame-rate videos will always challenge the available computational power when real-time performance is required to solve the stereo-matching problem in 3D reconstruction applications. Asymptotic analysis is therefore needed to measure the time and space performance of stereo-matching algorithms regardless of the input size and of the computational power available. In this paper, we survey several classic stereo-matching algorithms with regard to time–space complexity, and we report running-time experiments for several algorithms whose results are consistent with our complexity analysis. We present a new dense stereo-matching algorithm based on a greedy heuristic path computation in disparity space, together with a procedure that improves disparity maps in depth-discontinuity regions; this procedure works as a post-processing step for any technique that solves the dense stereo-matching problem. We prove that our algorithm and the post-processing procedure have optimal O(n) time–space complexity, where n is the size of a stereo image. Our algorithm performs only a constant number of computations per pixel, since it avoids a brute-force search over the disparity range. Hence, it is faster than "real-time" techniques while producing comparable results when evaluated against ground-truth benchmarks. The correctness of our algorithm is demonstrated with experiments on real and synthetic data.
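No pseudocode is given in the abstract. Purely to illustrate what "a constant number of computations per pixel" means in contrast to a brute-force search over the whole disparity range, here is a toy greedy scan-line matcher; the absolute-difference cost and the +/-1 candidate window around the previous pixel's disparity are arbitrary illustrative choices, not the authors' algorithm.

```python
import numpy as np

def greedy_scanline_disparity(left, right, d_max, radius=1):
    """Toy greedy matcher: each pixel examines only a constant number of
    disparity candidates around the previous pixel's disparity, instead of
    scanning all d_max disparities (brute force)."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(h):
        d_prev = 0
        for x in range(w):
            best_d, best_cost = d_prev, np.inf
            for d in range(max(0, d_prev - radius), min(d_max, d_prev + radius) + 1):
                if x - d < 0:
                    continue
                cost = abs(int(left[y, x]) - int(right[y, x - d]))
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = d_prev = best_d
    return disp
```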

4.
This paper considers the problem of suppressing complex jamming, which comprises sidelobe blanket jamming (SLJ), multiple near-mainlobe blanket jamming (multiple-NMLJ) and self-defensive false-target jamming (SDJ). We propose a blind source separation (BSS)-based space–time multi-channel algorithm for complex-jamming suppression. The space–time multi-channel structure consists of multiple spatial beams and multiple temporally adjacent pulse repetition intervals (PRIs). The source signals can be separated by BSS owing to their statistical independence. The real target and the SDJ can then be recovered by pulse compression and distinguished simultaneously by echo identification. A remarkable feature of the proposed approach is that it requires no prior knowledge about the real target or the jamming signals, and it is easy to implement in engineering applications.

5.
As more and more real-time spatio-temporal datasets become available at increasing spatial and temporal resolutions, providing high-quality predictive information about spatio-temporal processes becomes an increasingly feasible goal. However, many sensor networks that collect spatio-temporal information are prone to failure, resulting in missing data. To complicate matters, the missing data are often not missing at random and are characterised by long periods in which no data are observed. The performance of traditional univariate forecasting methods such as ARIMA models decreases with the length of the missing-data period because they do not have access to local temporal information. However, if spatio-temporal autocorrelation is present in a space–time series, then spatio-temporal approaches have the potential to offer better forecasts. In this paper, a non-parametric spatio-temporal kernel regression model is developed to forecast the future unit journey times of road links in central London, UK, under the assumption of sensor malfunction: only the current traffic patterns of the upstream and downstream neighbouring links are used to inform the forecasts. The model's performance is compared with another form of non-parametric regression, K-nearest neighbours, which is also effective for forecasting under missing data. The methods show promising forecasting performance, particularly in periods of high congestion.
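The paper's exact kernel specification is not given in the abstract; as a rough illustration of the general approach, a Nadaraya-Watson-style kernel regression that forecasts a link's journey time from the current readings of its upstream and downstream neighbours could look like the sketch below (the Gaussian kernel, the variable names and the one-step-ahead setup are assumptions).

```python
import numpy as np

def kernel_forecast(neighbour_now, history_neighbours, history_target, bandwidth=1.0):
    """Forecast a link's journey time from neighbouring links only.

    neighbour_now      : (d,) current journey times of the upstream/downstream links.
    history_neighbours : (n, d) past neighbour patterns.
    history_target     : (n,) journey time of the target link observed one step
                         after each historical neighbour pattern.
    """
    # Gaussian kernel weights based on distance to the historical patterns.
    dists = np.linalg.norm(history_neighbours - neighbour_now, axis=1)
    weights = np.exp(-0.5 * (dists / bandwidth) ** 2)
    # Weighted average of the historically observed target values.
    return float(np.dot(weights, history_target) / weights.sum())
```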

6.
7.
In this paper, we propose a fast algorithm for the efficient and accurate solution of space–time fractional diffusion equations defined on a rectangular domain. The spatial discretization uses the central finite difference scheme and the matrix transfer technique. Due to its nonlocality, numerical discretization of the spectral fractional Laplacian \((-\Delta)^{\alpha/2}\) results in a large dense matrix. This causes considerable challenges not only for storing the matrix but also for computing matrix–vector products in practice. By utilizing the compact structure of the discrete system and the discrete sine transform, our algorithm avoids storing the large matrix arising from the discretized nonlocal operator and also significantly reduces the computational cost. We then use the Laplace transform method for time integration of the semi-discretized system and a weighted trapezoidal method to numerically compute the convolutions required by the resulting scheme. Various experiments are presented to demonstrate the efficiency and accuracy of our method.
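The abstract only states that the discrete sine transform is used to avoid forming the dense matrix. A minimal one-dimensional illustration of that idea, assuming homogeneous Dirichlet boundary conditions and the standard second-order Laplacian stencil (both assumptions of this sketch, not details taken from the paper), is:

```python
import numpy as np
from scipy.fft import dst, idst

def fractional_laplacian_1d(u, h, alpha):
    """Apply the spectral fractional Laplacian (-Delta)^(alpha/2) to u.

    u     : values on an interior grid of N points with spacing h,
            homogeneous Dirichlet boundary conditions assumed.
    alpha : fractional order in (0, 2].
    """
    n = u.size
    k = np.arange(1, n + 1)
    # Eigenvalues of the standard 1-D finite-difference Laplacian, which is
    # diagonalised by the type-I discrete sine transform; no dense matrix needed.
    lam = (4.0 / h**2) * np.sin(k * np.pi / (2 * (n + 1))) ** 2
    return idst(lam ** (alpha / 2) * dst(u, type=1), type=1)

# Sanity check: alpha = 2 reproduces the discrete Laplacian on a sine eigenmode.
n, h = 127, 1.0 / 128
x = np.linspace(h, 1 - h, n)
u = np.sin(np.pi * x)
print(np.allclose(fractional_laplacian_1d(u, h, 2.0),
                  (4.0 / h**2) * np.sin(np.pi * h / 2) ** 2 * u))
```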

8.
The critical dimensions for describing space–time activities are "what", "where", "when" and "who", and they are frequently applied to collect data about the basic functions people perform in space over the course of a day. Collecting data about these dimensions through activity-based surveys has presented researchers with a number of technical and social limitations, ranging from the restricted period of time participants have to record their activities to the level of accuracy with which participants complete a survey. This paper proposes a new streaming data processing workflow for querying space–time activities (STA) as a by-product of microblogging communication. It allows a large volume of geotagged tweets to be explored in order to discover STA patterns of daily life in a systematic manner. A sequence of tasks has been implemented using different cloud-based computing resources to handle over one million daily geotagged tweets from Canada over a period of six months. The STA patterns reveal activity choices that might be attributable to personal motivations for communicating an activity on social networks.

9.
10.
In general, inverting an arbitrary function is a difficult problem. Using the inverse implication operation, we present in this paper a quantum algorithm that solves the inversion of a function via a time–space trade-off. The details are as follows. Let the function \(f(x)=y\) have k solutions, where \(x\in \{0, 1\}^{n}\) and \(y\in \{0, 1\}^{m}\) for any integers n, m. We show that an iterative algorithm can be used to invert f(x) with success probability \(1-\left( 1-\frac{k}{2^{n}}\right) ^{L}\) for \(L\in Z^{+}\). The space complexity of the proposed quantum iterative algorithm is O(Ln), where L is the number of iterations. The paper concludes that, by using a time–space trade-off strategy, the success probability of the algorithm is improved.
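As a concrete reading of this bound (the numbers below are chosen only for illustration and are not from the paper): with \(n=10\), a single solution \(k=1\) and \(L=512\) iterations, the success probability is
\[ 1-\left(1-\frac{1}{2^{10}}\right)^{512} \approx 0.394, \]
while doubling the number of iterations to \(L=1024\) raises it to about \(1-e^{-1}\approx 0.632\), with the O(Ln) space requirement growing linearly in L; this is the time–space trade-off referred to above.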

11.
Many industrial processes are distributed parameter systems (DPS) with strong spatio-temporal dynamics. Modeling of DPS is difficult but essential for simulation, control and optimization. First-principles modeling of known DPS often leads to partial differential equations (PDEs). Because such systems are infinite-dimensional, model reduction (MR) is necessary for practical implementation. Model reduction usually works together with the selection of basis functions (BF), and combining different BF and MR techniques yields different approaches. For unknown DPS, system identification is usually used to determine the unknown structure and parameters, and various methods lead to different approaches. Finally, a novel kernel-based approach is proposed for complex DPS. This paper provides a brief review of these DPS modeling methods and categorizes them from the viewpoint of time–space separation.

12.
The paper aims to demonstrate the importance of behavioural issues in environmental modelling. These issues can relate both to the modeller and to the modelling process, including the social interaction within the modelling team. Behavioural effects can originate in cognitive and motivational biases, in the social systems that are created, and in the visual and verbal communication strategies used. The possible occurrence of these phenomena in the context of environmental modelling is discussed, and suggestions for research topics are provided.

13.
14.
Applied Soft Computing, 2008, 8(1): 202–215
This paper presents a new approach to time series data mining and knowledge discovery. The relevant features of non-stationary time series data from power network disturbances are extracted using a multiresolution S-transform, which can be treated either as a phase-corrected wavelet transform or as a variable-window short-time Fourier transform. After the relevant features are extracted from the time series data, an integrated LVQ neural network and various feed-forward neural network architectures are used for pattern recognition of the disturbance waveform data. The fuzzy MLP outperforms all the other connectionist models and is used in the final stage to encode knowledge in the connection weights, which are then used to generate rules for fuzzy inference on the disturbance patterns. An overall pattern classification accuracy of 99% is achieved for the power signal time series data. Knowledge discovery from the data is then presented for selected patterns using the new quantification procedures. The approach presented in this paper is a general one and can be applied to any time series for mining similarities in the data.
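For reference, the S-transform in question is usually written in its continuous form as (standard definition, not quoted from this paper)
\[ S(\tau, f)=\int_{-\infty}^{\infty} x(t)\,\frac{|f|}{\sqrt{2\pi}}\,e^{-\frac{(\tau-t)^{2}f^{2}}{2}}\,e^{-i2\pi f t}\,dt, \]
where the Gaussian window narrows as |f| grows, which is what allows it to be viewed either as a variable-window short-time Fourier transform or as a phase-corrected wavelet transform.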

15.
Matrix–matrix multiplication (MMM) is a highly important kernel in linear algebra algorithms, and the performance of its implementations depends on memory utilization and data locality. There are MMM algorithms, such as the standard algorithm and the Strassen–Winograd variant, and many recursive array layouts, such as Z-Morton or U-Morton; however, their data locality is lower than that of the proposed methodology. Moreover, several state-of-the-art (SOA) self-tuning libraries exist, such as ATLAS for the MMM algorithm, which tests many MMM implementations. Installing ATLAS requires an extremely complex empirical tuning step and uses a large number of compiler options, both of which are outside the scope of this paper. In this paper, a new methodology using the standard MMM algorithm is presented that achieves improved performance by focusing on data locality (both temporal and spatial); it finds the schedule that conforms to optimal memory management. Compared with (Chatterjee et al. in IEEE Trans. Parallel Distrib. Syst. 13:1105, 2002; Li and Garzaran in Proc. of Lang. Compil. Parallel Comput., 2005; Bilmes et al. in Proc. of the 11th ACM Int. Conf. Super-comput., 1997; Aberdeen and Baxter in Concurr. Comput. Pract. Exp. 13:103, 2001), the proposed methodology has two major advantages. First, the scheduling used at the tile level differs from that used at the element level, giving better data locality suited to the sizes of the memory hierarchy. Second, its exploration time is short, because it searches only for the number of tiling levels used and, within the range (1, 2) (Sect. 4), for the best tile size for each cache level. A software tool (C code) implementing the above methodology was developed, taking the hardware model and the matrix sizes as input. The methodology outperforms others across a wide range of architectures: compared with the best existing related work, which we implemented, performance gains of up to 55% over the standard MMM algorithm and up to 35% over Strassen's are observed, both under recursive data array layouts.
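The methodology is only described at a high level in the abstract; the sketch below merely shows the basic idea of tiling an MMM loop nest for cache locality. A single tile level, a fixed tile size and NumPy sub-blocks are simplifications of mine, not the paper's schedule.

```python
import numpy as np

def tiled_matmul(A, B, tile=64):
    """Blocked C = A @ B: each (tile x tile) block of A and B is reused while
    it is still cache-resident, improving temporal and spatial locality."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m), dtype=A.dtype)
    for i in range(0, n, tile):
        for j in range(0, m, tile):
            for p in range(0, k, tile):
                C[i:i+tile, j:j+tile] += A[i:i+tile, p:p+tile] @ B[p:p+tile, j:j+tile]
    return C

# Quick check against NumPy's own multiply.
A, B = np.random.rand(200, 300), np.random.rand(300, 150)
print(np.allclose(tiled_matmul(A, B), A @ B))
```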

16.
This paper presents a real-time, computationally inexpensive environment for accurate simulations of sheet materials on a personal computer. The approach described differs from other techniques through its novel use of multilayer sheet structures; the ultimate aim is to incorporate into the environment the capacity to simulate a range of temperatures. A pseudo-immersive Window on World (WoW) environment is used to handle the implementation of the real-time, aesthetically accurate deformation algorithm (MaSSE, the Mass-Spring Simulation Engine). The motion of the sheet is controlled by simulated gravity and through its interaction with objects that have been inserted into a virtual room. In addition, the WoW interface is used to adjust environmental parameters dynamically and to change the scene-viewing perspective. An obvious use of the environment is in mechanical-engineering-based real-time simulations of heat-sensitive sheet materials, which would allow a wide range of applications in virtual manufacturing, including the clothing industry and hostile environments.
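MaSSE's update equations are not given in the abstract; a generic mass-spring integration step of the kind such engines are typically built on (semi-implicit Euler, gravity plus Hooke springs, with illustrative parameter names) looks like this sketch:

```python
import numpy as np

def step(pos, vel, springs, rest_len, mass=0.01, k=50.0, damping=0.02,
         gravity=(0.0, -9.81, 0.0), dt=1e-3):
    """One semi-implicit Euler step of a mass-spring sheet.

    pos, vel : (n, 3) particle positions and velocities.
    springs  : (m, 2) index pairs of connected particles.
    rest_len : (m,)  rest length of each spring.
    """
    springs = np.asarray(springs)
    rest_len = np.asarray(rest_len)
    # Gravity and simple velocity damping on every particle.
    force = np.tile(np.asarray(gravity) * mass, (len(pos), 1)) - damping * vel
    d = pos[springs[:, 1]] - pos[springs[:, 0]]           # spring vectors
    length = np.linalg.norm(d, axis=1, keepdims=True)
    # Hooke's law along each spring, applied with opposite signs at its ends.
    f = k * (length - rest_len[:, None]) * d / np.maximum(length, 1e-12)
    np.add.at(force, springs[:, 0], f)
    np.add.at(force, springs[:, 1], -f)
    vel = vel + dt * force / mass
    return pos + dt * vel, vel
```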

17.
The current paper takes an introspective look at the human–computer interaction (HCI) issues of mobile computing in a variable work context. We catalogue the current research into four major categories. The major findings of our study are as follows. (1) A majority of HCI issues, about 58%, fall under the category of computer systems and interface architecture implications. (2) 23% of the articles focus on development and implementation issues. (3) 13% of the articles focus on use and context of computers issues. (4) 6% of the articles focus on human characteristics issues. Further, the literature indicates that field services are the main application of mobile computing (46%), followed by sales force (21%), health care (17%), fieldwork (8%), insurance claims (4%) and journalism (4%).

18.
Gazetteers, i.e., lists of place names, provide a global view of places of interest by assigning a point, or a region, to each place name. However, identifying the location corresponding to a place name is often a difficult task. There is no one-to-one correspondence between the two sets, places and names, because of name variants, different names for the same place, and homonymy; the location corresponding to a place name may vary over time, changing its extent or even its position; and, in general, there is the imprecision that derives from associating a linguistic concept (the place name) with a precise one (the spatial location). The situation is similar for named time periods, e.g., the early Bronze Age, which are in current use in archaeology: they depend on the location to which they refer, as the same period may have different time-spans in different locations. The present paper makes use of a recent extension of the CIDOC CRM called CRMgeo, which embeds events in a four-dimensional spatio-temporal framework. The paper uses concepts from CRMgeo and introduces extensions to model gazetteers and period thesauri. This approach enables time-varying location appellations, as well as space-varying period appellations, to be handled on a robust basis. For this purpose a refinement/extension of CRMgeo is proposed, and a discretization of space and time is used to approximate the real space–time extents occupied by events. This approach solves the problem and suggests further investigations in various directions.

19.
Common to all tests of space–time interaction is the assumption that the population underlying the events of interest exhibits a trajectory of growth that is consistent through time and across space. In practice, however, this assumption is often untenable and, when violated, can introduce population shift bias into the results of these tests. While this problem is widely recognized, more work remains to compare its effect across tests and to determine the extent to which it is a problem for short study periods. This paper quantifies and compares the population shift bias present in the results of the Knox, Mantel and Jacquez tests of space–time interaction. A simulation study is carried out that quantifies the bias present in each test across a variety of population movement scenarios. The results show a positive relationship between population shift bias and heterogeneity in population growth for all the tests, and they demonstrate variability in the size of the bias across the three space–time interaction tests considered. Finally, the results illustrate that population shift bias can be a serious problem even for short study periods. Collectively, these findings suggest that an unbiased approach to assessing the significance of space–time interaction test results is needed whenever spatially heterogeneous population change is identified within a study area.
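For context, the Knox statistic in its usual form, together with the conventional permutation-based significance assessment, is sketched below. The critical distances and the time-permutation null are generic choices, not the paper's bias-corrected procedure; the permutation null is exactly the step whose validity rests on the stable-population assumption discussed above.

```python
import numpy as np

def knox_statistic(coords, times, space_crit, time_crit):
    """Number of event pairs that are close in both space and time.

    coords : (n, 2) event locations, times : (n,) event times.
    A pair counts when its spatial distance <= space_crit AND its
    temporal distance <= time_crit.
    """
    coords, times = np.asarray(coords), np.asarray(times)
    i, j = np.triu_indices(len(times), k=1)
    near_space = np.linalg.norm(coords[i] - coords[j], axis=1) <= space_crit
    near_time = np.abs(times[i] - times[j]) <= time_crit
    return int(np.sum(near_space & near_time))

def knox_pvalue(coords, times, space_crit, time_crit, n_perm=999, seed=0):
    """Monte-Carlo p-value obtained by permuting event times, which implicitly
    assumes the population at risk is stable over the study period."""
    rng = np.random.default_rng(seed)
    observed = knox_statistic(coords, times, space_crit, time_crit)
    perm = [knox_statistic(coords, rng.permutation(np.asarray(times)),
                           space_crit, time_crit) for _ in range(n_perm)]
    return (1 + sum(p >= observed for p in perm)) / (n_perm + 1)
```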

20.
In this paper we develop a unified difference-spectral method for stably solving time–space fractional sub-diffusion and super-diffusion equations. Based on the equivalence between Volterra integral equations and fractional ordinary differential equations with initial conditions, the proposed method is constructed by combining the spectral Galerkin method in space with the fractional trapezoid formula in time. Numerical experiments are carried out to verify the effectiveness of the method and demonstrate that the unified method achieves spectral accuracy in space and second-order accuracy in time for both kinds of time–space fractional diffusion equations.
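The equivalence invoked here is the standard one for a Caputo-type fractional ordinary differential equation of order \(\alpha\in(0,1)\) (stated in its textbook form; the paper's exact setting is not spelled out in the abstract):
\[ {}^{C}D_{t}^{\alpha}u(t)=f(t,u(t)),\quad u(0)=u_{0} \quad\Longleftrightarrow\quad u(t)=u_{0}+\frac{1}{\Gamma(\alpha)}\int_{0}^{t}(t-s)^{\alpha-1}f(s,u(s))\,\mathrm{d}s, \]
and the fractional trapezoid formula mentioned above is a quadrature rule for the weakly singular integral on the right-hand side, consistent with the second-order temporal accuracy reported.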
