20 similar documents found (search time: 17 ms)
1.
Compared with conventional cameras, spectral imagers provide many more features in the spectral domain. They have been used in fields such as material identification, remote sensing, precision agriculture, and surveillance. Traditional imaging spectrometers generally use scanning systems, which cannot meet the demands of dynamic scenarios and thus limit the practical applications of spectral imaging. Recently, with rapid developments in computational photography theory and semiconductor techniques, spectral video acquisition has become feasible. This paper offers a review of state-of-the-art spectral imaging technologies, especially those capable of capturing spectral video. Finally, we evaluate the performance of existing spectral acquisition systems and discuss directions for future work.
2.
We introduce a new representation for time series, the Multiresolution Vector Quantized (MVQ) approximation, along with a distance function. Like the Discrete Wavelet Transform, MVQ keeps both local and global information about the data. However, instead of keeping low-level time series values, it maintains high-level feature information (key subsequences), facilitating more meaningful similarity measures. The method is fast and scales linearly with the database size and dimensionality. Contrary to previous methods, the vast majority of which use the Euclidean distance, MVQ uses a multiresolution/hierarchical distance function. In our experiments, the proposed technique consistently outperforms the other major methods.
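A minimal sketch of the MVQ idea, not the authors' implementation: learn a small codebook of window shapes (standing in for "key subsequences"), encode each series as a histogram of codeword occurrences at several window widths, and sum weighted histogram distances across resolutions. The function names, window widths, codebook size, and level weights below are all illustrative choices.

```python
import numpy as np

def learn_codebook(windows, k, iters=20, seed=0):
    # Tiny k-means: the centroids play the role of MVQ's "key subsequences".
    rng = np.random.default_rng(seed)
    centers = windows[rng.choice(len(windows), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((windows[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = windows[labels == j].mean(axis=0)
    return centers

def encode(series, centers, w):
    # Frequency histogram of codewords over all sliding windows of width w.
    wins = np.lib.stride_tricks.sliding_window_view(series, w)
    labels = np.argmin(((wins[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    h = np.bincount(labels, minlength=len(centers)).astype(float)
    return h / h.sum()

def mvq_distance(x, y, widths=(8, 16, 32), k=4, level_weights=(1.0, 2.0, 4.0)):
    # Hierarchical distance: compare codeword histograms at several window
    # widths, weighting the coarser (more global) levels more heavily.
    d = 0.0
    for w, lam in zip(widths, level_weights):
        wx = np.lib.stride_tricks.sliding_window_view(x, w)
        wy = np.lib.stride_tricks.sliding_window_view(y, w)
        cb = learn_codebook(np.concatenate([wx, wy]), k)
        d += lam * np.abs(encode(x, cb, w) - encode(y, cb, w)).sum()
    return d
```

Because the distance compares codeword frequencies rather than raw values, it can judge two series similar when they contain the same characteristic shapes, even if the shapes occur at different offsets.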
3.
J. Gao 《International journal of remote sensing》2013,34(14):2823-2833
Mangrove forests in the western Waitemata Harbour, Auckland, New Zealand were mapped into lush and stunted categories from SPOT HRV and Landsat TM images at 10, 20 and 30 m resolution using the maximum likelihood method. The TM-derived results were found to be the most accurate, at 95% for lush mangroves and 87.5% for stunted mangroves. The corresponding accuracies dropped to 77.5% and 67.5% in the 20 m SPOT XS-derived results. Both percentages improved to 80% after the PAN band was incorporated in the classification at 10 m. These results suggest that high spectral resolution is more important than fine spatial resolution for accurately mapping mangroves in a temperate zone, because finer spatial detail enhances the visibility of non-mangrove vegetation and thus increases its confusion with mangroves.
4.
The current computational power and some recently developed algorithms allow a new automatic spectral analysis method for randomly missing data. Accurate spectra and autocorrelation functions are computed from the estimated parameters of time series models, without user interaction. If only a few data points are missing, the accuracy is almost the same as when all observations are available. For larger missing fractions, low-order time series models can still be estimated with good accuracy if the total observation time is long enough. Autoregressive models are best estimated with the maximum likelihood method when data are missing. Maximum likelihood estimates of moving average and of autoregressive moving average models are not very useful with missing data; those models are estimated most accurately if they are derived from the parameters of an intermediate autoregressive model. With statistical criteria for the selection of model order and model type, a completely automatic and numerically reliable algorithm is developed that estimates the spectrum and the autocorrelation function in randomly missing data problems. The accuracy was better than that obtainable with other methods, including the well-known expectation–maximization (EM) algorithm.
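The parametric route from time-series model to spectrum can be sketched for complete data (the missing-data maximum-likelihood estimation itself is well beyond a few lines): fit an AR(2) model by conditional least squares, then evaluate its power spectral density S(f) = σ² / |1 − Σₖ aₖ e^(−2πifk)|². The simulated process and parameter values are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
a_true = np.array([0.75, -0.5])   # AR(2): x[t] = 0.75 x[t-1] - 0.5 x[t-2] + e[t]
n = 5000
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + e[t]

# Conditional least-squares fit of the AR(2) parameters.
X = np.column_stack([x[1:-1], x[:-2]])
a_hat, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
sigma2 = (x[2:] - X @ a_hat).var()   # innovation variance estimate

def ar_spectrum(a, sigma2, f):
    # Parametric PSD: S(f) = sigma^2 / |1 - sum_k a_k exp(-2*pi*i*f*k)|^2
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - (a[:, None] *
                        np.exp(-2j * np.pi * f[None] * k[:, None])).sum(0)) ** 2
    return sigma2 / denom

f = np.linspace(0.0, 0.5, 256)       # frequencies in cycles per sample
S = ar_spectrum(np.asarray(a_hat), sigma2, f)
```

This is the key property the abstract exploits: once the model parameters are estimated (by whatever means, including from incomplete data), the whole spectrum and autocorrelation function follow analytically from them.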
5.
Linear discriminant analysis (LDA) is a popular technique that works for both dimensionality reduction and classification. However, LDA faces the small sample size problem when dealing with high-dimensional data. Several approaches have been proposed to overcome this issue, but the resulting transformation matrices fail to extract structures shared among data samples. In this paper, we propose trace-norm-regularized LDA, which not only tackles the small sample size problem but also uncovers the underlying structures between target classes. Specifically, our formulation characterizes the intrinsic dimensionality of the transformation matrix owing to the appealing properties of the trace norm. Evaluations on nine real data sets demonstrate the effectiveness of our algorithm.
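One common way to realize a trace-norm (nuclear-norm) penalty, sketched here under assumptions that may differ from the paper's exact formulation: proximal gradient descent on a least-squares surrogate min_W ½‖XW − Y‖² + λ‖W‖_*, where Y is a class-indicator matrix and the proximal step is singular value thresholding. The surrogate objective, function names, and parameter values are choices made for this sketch.

```python
import numpy as np

def svt(W, tau):
    # Singular value thresholding: the proximal operator of tau * ||W||_*.
    # Soft-thresholding the spectrum drives small singular values to exactly
    # zero, which is what gives the learned transformation low rank.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def trace_norm_lda(X, Y, lam=1.0, step=None, iters=500):
    # Proximal gradient (ISTA) on a least-squares LDA surrogate:
    #   min_W  0.5 * ||X W - Y||_F^2 + lam * ||W||_*
    # The trace-norm term pushes W toward low rank, exposing structure
    # shared across the target classes.
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(iters):
        grad = X.T @ (X @ W - Y)                 # gradient of the smooth part
        W = svt(W - step * grad, step * lam)     # gradient step, then prox
    return W
```

Raising `lam` shrinks the spectrum of W harder; at the extreme the solution collapses to the zero matrix, and at `lam = 0` the method reduces to ordinary least squares.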
6.
A classification-based assessment of the optimal spectral and spatial resolutions for Great Lakes coastal wetland imagery (cited 2 times: 0 self-citations, 2 by others)
We analyzed hyperspectral airborne imagery (CASI 2, with 46 contiguous VIS/NIR bands) acquired over a Lake Huron coastal wetland. To support detailed Great Lakes coastal wetland mapping, the optimal spatial resolution of the imagery was determined to be less than 2 m. There was a 23% change in classification resiliency with the SAM classifier upon resampling the original 1-meter, 18-band imagery to 2-meter pixels, and further classifications with larger pixels (4 and 8 m) increased the overall classification change to 35% and 50%, respectively.

We performed a series of image classification experiments incorporating three independent band selection methodologies (derivative magnitude, fixed interval and derivative histogram) in order to explore the effects of spectral resampling on classification resiliency. This research verified that a minimum of seven strategically located bands in the VIS-NIR wavelength region (425.4, 514.9, 560.1, 685.5, 731.5, 812.3 and 916.7 nm) are necessary to maintain classification resiliency above the 85% threshold. Significantly, these seven bands produced the highest classification resiliency with the fewest bands of any of the 63 band-reduction strategies tested.

Analyzing derivative magnitudes alone proved unreliable for identifying optimal bands. The fixed interval method was adversely influenced by the starting band location, making its implementation problematic. The combined use of derivative magnitude and frequency of occurrence appears to be the best method for determining the "optimal" bands for a hyperspectral wetland mapping application.
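As a toy illustration of the derivative-magnitude baseline discussed above, the sketch below ranks the bands of a synthetic reflectance curve by first-derivative magnitude and keeps the top seven. The curve, band count, and variable names are invented for illustration; note the abstract's own finding is that derivative magnitude alone is unreliable, so this is the baseline strategy, not the recommended one.

```python
import numpy as np

# Synthetic smooth "reflectance" curve over 46 bands (400-950 nm) with a
# steep, red-edge-like rise near 700 nm. The band count mirrors CASI 2,
# but the curve itself is made up.
wavelengths = np.linspace(400.0, 950.0, 46)
reflectance = 0.1 + 0.4 / (1.0 + np.exp(-(wavelengths - 700.0) / 15.0))

# Derivative-magnitude ranking: score each band by how fast the spectrum
# changes there (central differences), then keep the n highest-scoring bands.
scores = np.abs(np.gradient(reflectance, wavelengths))
n_bands = 7
selected = np.sort(np.argsort(scores)[-n_bands:])
selected_nm = wavelengths[selected]
```

On this synthetic curve the method simply piles all seven bands onto the one steep feature, which illustrates its weakness: it has no notion of spreading bands across distinct spectral features.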
7.
Real-time 3D imaging is becoming increasingly important in areas such as medical science, entertainment, homeland security, and manufacturing. Numerous 3D imaging techniques have been developed, but only a few of them have the potential to achieve real-time performance. Of these few, fringe-analysis-based techniques stand out, having many advantages over the rest. This paper explains the principles behind fringe-analysis-based techniques and provides experimental results from systems using them.
8.
9.
Performance analysis of R*-trees with arbitrary node extents (cited 1 time: 0 self-citations, 1 by others)
Yufei Tao, Dimitris Papadias 《IEEE Transactions on Knowledge and Data Engineering》2004,16(6):653-668
Existing analysis for R-trees is inadequate for several traditional and emerging applications, including temporal, spatio-temporal, and multimedia databases, because it is based on the assumption that the extents of a node are identical on all dimensions, which does not hold in these domains. We propose analytical models that can accurately predict R*-tree performance without this assumption. Our derivation is based on the novel concept of an extent regression function, which computes the node extents as a function of the number of node splits. Detailed experimental evaluation reveals that the proposed models are accurate, even in cases where previous methods fail completely.
10.
Hussain Ishfaq, Awan Muhammad Ali, Souto Pedro F., Bletsas Konstantinos, Akesson Benny, Tovar Eduardo 《Real-Time Systems》2021,57(1-2):141-189
Real-Time Systems - The well-known model of Vestal aims to avoid excessive pessimism in the quantification of the processing requirements of mixed-criticality systems, while still guaranteeing the...
11.
The correctness of a real-time system depends not only on the system's output but also on the time at which results are produced. A hard real-time system is required to complete its operations before all of its timing deadlines. For a given task set, it is useful to know what changes can be made to a task that will result in a system that is borderline schedulable. It is also beneficial in an engineering context to know the minimum processor speed that will deliver a schedulable system. We address the following sensitivity analyses (parameter computations) for EDF-scheduled systems on a uniprocessor: task execution times, processor speed, task periods, and task relative deadlines. We prove that an optimal (minimum or maximum) system parameter can be determined by a single run of the Quick convergence Processor-demand Analysis (QPA) algorithm. This algorithm provides efficient and exact sensitivity analysis for arbitrary-deadline real-time systems. We also improve the implementation of this sensitivity analysis by using various starting values for the algorithms. The approaches developed for task parameter computation are therefore as efficient as QPA and are easily incorporated into a system design support tool.
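The abstract's core tool, processor-demand analysis with the QPA iteration, can be sketched for the simplest setting: synchronous periodic tasks given as (C, D, T) triples on a uniprocessor under EDF. This is a simplified reading of the published QPA schedulability test; the task representation, the La-style checking bound, and the helper names are choices made for this sketch, not the authors' implementation.

```python
from math import floor, lcm

def demand(t, tasks):
    # Processor demand bound h(t): total work of all jobs whose release time
    # and absolute deadline both fall inside [0, t] (synchronous release at 0).
    return sum(max(0, floor((t - D) / T) + 1) * C for (C, D, T) in tasks)

def deadlines_before(t, tasks):
    # All absolute deadlines strictly earlier than t.
    return [D + k * T for (C, D, T) in tasks
            for k in range(int(t // T) + 1) if D + k * T < t]

def qpa_schedulable(tasks):
    # tasks: (C, D, T) = (worst-case execution time, relative deadline, period)
    U = sum(C / T for (C, D, T) in tasks)
    if U > 1:
        return False
    if U < 1:   # bound on the interval that must be checked
        L = max(max(D for (_, D, _) in tasks),
                sum((T - D) * C / T for (C, D, T) in tasks) / (1 - U))
    else:       # U == 1: fall back to the hyperperiod
        L = lcm(*(T for (_, _, T) in tasks))
    Dmin = min(D for (_, D, _) in tasks)
    ds = deadlines_before(L, tasks)
    if not ds:
        return True
    t = max(ds)
    # QPA iteration: walk t downward; the set is schedulable iff the
    # iteration reaches h(t) <= Dmin without ever finding h(t) > t.
    while demand(t, tasks) <= t and demand(t, tasks) > Dmin:
        if demand(t, tasks) < t:
            t = demand(t, tasks)
        else:
            ds = deadlines_before(t, tasks)
            if not ds:
                break
            t = max(ds)
    return demand(t, tasks) <= Dmin
```

The sensitivity analyses described above then amount to searching over one task parameter (an execution time, a period, a deadline, or a speed scaling factor) for the value at which this test flips between True and False.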
12.
Nikolić Borislav, Tobuschat Sebastian, Soares Indrusiak Leandro, Ernst Rolf, Burns Alan 《Real-Time Systems》2019,55(1):63-105
Real-Time Systems - Nowadays available multiprocessor platforms predominantly use a network-on-chip (NoC) architecture as an interconnect medium, due to its good scalability and performance. During...
13.
Spectral super-resolution is an important technique for obtaining hyperspectral images from multispectral images alone, which can effectively mitigate the high acquisition cost and low spatial resolution of hyperspectral imaging. However, in practice, multispectral channels or images captured by the same sensor often have different spatial resolutions, which poses a severe challenge to spectral super-resolution. This paper proposes PoNet, a universal spectral super-resolution network based on physical optimization unfolding, for arbitrary multispectral images, including single-resolution and cross-scale multispectral images. Furthermore, two new strategies are proposed to make full use of the spectral information: cross-dimensional channel attention and cross-depth feature fusion. Experimental results on five data sets show the superiority and stability of PoNet in addressing any spectral super-resolution situation.
14.
Hyun-Jung Kim, Yu-Deok Seo, Sung-Kie Youn 《Computer Methods in Applied Mechanics and Engineering》2010,199(45-48):2796-2812
The trimming technique is a powerful and effective way of endowing arbitrarily complex topology to CAD models created using NURBS. In the present work, it is shown that any complex multiply-connected NURBS domain can be described using trimming curves only. Isogeometric analysis of linear elasticity problems on complex topologies described in this way is presented. For fully communicative interaction between CAD and CAE, a specific searching algorithm and an integration scheme for trimmed elements are introduced to utilize IGES files exported from a CAD system for isogeometric analysis. Schemes for imposing essential and traction boundary conditions on trimming curves are presented. It is demonstrated that, with the presented schemes, trimmed cases in arbitrarily complicated situations can be treated successfully. With examples of complex topology that can be described using trimming curves only, the effectiveness and robustness of the present method are demonstrated.
15.
Sergey V. Sevastyanov, Bertrand M.T. Lin, Hsiao-Lan Huang 《Theoretical computer science》2011,412(35):4536-4544
The paper considers makespan minimization on a single machine subject to release dates in the relocation problem, which originated from a resource-constrained redevelopment project in Boston. Each job consumes a certain amount of resource from a common pool at the start of its processing and returns another amount of resource to the pool at its completion. In this sense, our type of resource constraint extends the well-known constraints on resumable resources, in which the two amounts are equal for each job. In this paper, we undertake the first complexity analysis of this problem in the case of arbitrary release dates. We develop an algorithm based on a multi-parametric dynamic programming technique (in which the number of parameters whose values are enumerated in the DP procedure can be arbitrarily large). It is shown that the algorithm runs in pseudo-polynomial time when the number m of distinct release dates is bounded by a constant. This result is tight: (1) it cannot be extended to the case when m is part of the input, since the problem then becomes strongly NP-hard, and (2) it cannot be strengthened to a polynomial-time algorithm for any constant m>1, since the problem remains NP-hard for m=2. A polynomial-time algorithm is designed for the special case where the overall contribution of each job to the resource pool is nonnegative. As a counterpart, the case where the contributions of all jobs are negative is shown to be strongly NP-hard.
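The resource mechanics of the relocation problem are easy to state in code. The sketch below checks whether a given single-machine job order is feasible for a given initial pool and, if so, returns its makespan; the tuple layout (release, processing, consume, give_back) and the function name are illustrative choices, not from the paper.

```python
def feasible_sequence(jobs, pool):
    # jobs: list of (release, processing, consume, give_back) tuples,
    # processed one at a time on a single machine in the given order.
    # Returns the makespan, or None if the sequence is infeasible.
    t = 0
    for r, p, a, b in jobs:
        t = max(t, r)          # wait for the job's release date
        if a > pool:
            return None        # not enough resource to start the job
        pool -= a              # resource taken at the start of processing
        t += p                 # the machine is busy for p time units
        pool += b              # resource returned at completion
    return t
```

Note how order matters: a job that returns more than it consumes can "fund" a later, resource-hungry job, which is exactly what the dynamic program above has to track. In the special case where every job's net contribution give_back − consume is nonnegative, greedy orderings over the consumption amounts become natural candidates, in line with the polynomial-time result mentioned in the abstract.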
16.
This paper considers a multiserver queueing model with abandonment, retrials and after-call work for call centers. Upon a phone call, customers that find a free call line occupy the line immediately, while those who see all the call lines busy are blocked and join an orbit. Customers holding a call line are served according to the first-come first-served discipline. After completing a call, the customer leaves the system, while the server must start an after-call work and the call line is released for a newly arrived customer. Waiting customers may abandon after some waiting time and then either join the orbit or leave forever. Customers in the orbit retry to seize a free call line after some time. We formulate the queueing system as a continuous-time level-dependent quasi-birth-and-death process, for which a sufficient condition for ergodicity is derived. We obtain a numerical solution for the stationary distribution, from which performance measures such as the waiting time distribution and the blocking probability are derived. Using Little's law, we obtain explicit formulae that verify the accuracy of the numerical solution. We compare our model with simpler models that do not fully take these human behaviors into account; the comparison shows significant differences, underscoring the importance of our model. Numerical results offer various insights into the performance of call centers.
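The Little's-law consistency check mentioned above can be illustrated on a far simpler model than the paper's level-dependent QBD. The sketch below simulates a plain M/M/1 FCFS queue (no retrials, abandonment, or after-call work) and compares the time-average number in system L against λW; all names and parameter values here are illustrative.

```python
import random

def mm1_simulate(lam, mu, n_customers=20000, seed=7):
    # Single-server FCFS M/M/1 queue, simulated customer by customer.
    rng = random.Random(seed)
    t_arr, depart_prev = 0.0, 0.0
    arrivals, departures = [], []
    for _ in range(n_customers):
        t_arr += rng.expovariate(lam)              # Poisson arrivals
        start = max(t_arr, depart_prev)            # wait for the server
        depart_prev = start + rng.expovariate(mu)  # exponential service
        arrivals.append(t_arr)
        departures.append(depart_prev)
    horizon = departures[-1]
    sojourn_total = sum(d - a for a, d in zip(arrivals, departures))
    W = sojourn_total / n_customers        # mean time in system per customer
    # Time-average number in system: the integral of N(t) over [0, horizon]
    # equals the summed sojourn times, so L = total sojourn time / horizon.
    L = sojourn_total / horizon
    return L, W
```

Checking that L ≈ λW, and that both agree with any analytically computed stationary distribution, is exactly the kind of sanity check the abstract describes for its numerical solution.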
17.
18.
Enver Ever 《The Journal of supercomputing》2017,73(5):2130-2156
The ability to deliver acceptable levels of quality of service is crucial for cloud systems, and this requires performance as well as availability analysis. Existing modeling attempts mainly focus on pure performance analysis; however, the software and hardware components of cloud infrastructures may have limited reliability. In this study, analytical models are presented for performability evaluation of cloud centers. A novel approximate solution approach is introduced that allows consideration of large numbers of servers. The challenges for analytical modeling of cloud systems mentioned in the literature are considered. The analytical models and solutions are therefore capable of handling large numbers of facility nodes, typically on the order of hundreds or thousands, and can incorporate various traffic loads while evaluating quality of service for cloud centers together with server availabilities. For validation, the results obtained from the analytical models are compared with results from discrete event simulations.
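As a hint of why analytical models can scale to very large server counts, the classical Erlang-B recursion computes the blocking probability of an M/M/c/c loss system in O(c) time and with no numerical blow-up. This is textbook background, not the performability model of the paper (which additionally accounts for server availability).

```python
def erlang_b(servers, offered_load):
    # Erlang-B recursion for an M/M/c/c loss system:
    #   B(0, a) = 1;  B(k, a) = a*B(k-1, a) / (k + a*B(k-1, a))
    # Runs in O(c) time and is numerically stable even for thousands
    # of servers, unlike direct evaluation of the factorial formula.
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b
```

For example, the recursion evaluates blocking for a thousand-server center instantly, which is the regime of "hundreds or thousands of facility nodes" that the abstract targets.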
19.
Cormac Smyth, Evgeny Kudryashov, Breda O'Driscoll, Vitaly Buckin 《Journal of The Association for Laboratory Automation》2004,9(2):87-90
This article describes the application of high-resolution ultrasonic spectroscopy (HR-US) to the analysis of industrial emulsions and suspensions. The benefits of HR-US are discussed, including the ability to perform direct analyses of emulsions and suspensions that would otherwise be impossible, require significantly more effort in the laboratory, or produce erroneous results. The HR-US 102 spectrometer is also introduced; manufacture of this laboratory-scale instrument has become possible due to recent technological advances in HR-US. The article outlines the principles of the HR-US technique and illustrates the application of the HR-US 102 spectrometer to the analysis of adsorption of ligands on the surface of particles, thermal stability, effects of thermal history on the microstructure of emulsions, crystallization, and particle sizing in dilute and concentrated emulsions.
20.
Moshe Dubiner 《Journal of scientific computing》1987,2(1):3-31
A detailed asymptotic analysis of spectral methods for prototype problems is presented. Asymptotic error behavior throughout the solution regime is given. A number of surprising results are presented, including the O(N) boundedness of the eigenvalues of collocation on Legendre points.