Similar Documents
20 similar documents found (search time: 156 ms)
1.
This paper studies online group-variable selection for high-dimensional streaming data, proposes an online estimation method for logistic regression with a Group Lasso penalty, and presents the GFTPRL (Group Follow the Proximally Regularized Leader) algorithm. By deriving a regret bound for GFTPRL, the algorithm is shown to be theoretically sound. Experimental results show that, on sparse models, the predictive classification accuracy of GFTPRL is clearly better than that of other mainstream sparse online algorithms.
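As a rough illustration of the kind of update such a method uses, the following is a minimal sketch of FTRL-Proximal online logistic regression with a per-group soft-thresholding step standing in for the Group Lasso penalty. The class name `GroupFTRL`, the hyper-parameters `alpha`, `beta`, `lam` and the exact closed-form group update are illustrative assumptions and need not match the paper's GFTPRL.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GroupFTRL:
    """Sketch of FTRL-Proximal logistic regression with a group-level penalty.
    The per-group closed form below is a common heuristic, not the paper's."""
    def __init__(self, groups, alpha=0.1, beta=1.0, lam=0.5):
        self.groups = groups                     # list of index arrays, one per feature group
        self.alpha, self.beta, self.lam = alpha, beta, lam
        dim = sum(len(g) for g in groups)
        self.z = np.zeros(dim)                   # accumulated (shifted) gradients
        self.n = np.zeros(dim)                   # accumulated squared gradients
        self.w = np.zeros(dim)

    def _update_weights(self):
        for g in self.groups:
            zg = self.z[g]
            norm = np.linalg.norm(zg)
            if norm <= self.lam:                 # whole group shrunk to zero
                self.w[g] = 0.0
            else:                                # group soft-thresholding
                eta = (self.beta + np.sqrt(self.n[g])) / self.alpha
                self.w[g] = -(1.0 - self.lam / norm) * zg / eta

    def fit_one(self, x, y):
        """One online step on example (x, y), with y in {0, 1}."""
        self._update_weights()
        p = sigmoid(x @ self.w)
        grad = (p - y) * x
        sigma = (np.sqrt(self.n + grad ** 2) - np.sqrt(self.n)) / self.alpha
        self.z += grad - sigma * self.w
        self.n += grad ** 2
        return p

# e.g. two feature groups of three coordinates each (hypothetical grouping):
model = GroupFTRL(groups=[np.arange(0, 3), np.arange(3, 6)])
```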

2.
With the development of information technology, spatial data are growing geometrically, in particular the vector data common in digital maps. How to quickly detect topological relationships within massive vector data has become one of the pressing problems in spatial analysis. To meet current needs in spatial data quality inspection, a new fast geometric topology checking algorithm, a line-scan (sweep-line) detection algorithm, is proposed. Building on existing common algorithms, it reduces computation time by improving computational efficiency. Experiments comparing the line-scan algorithm with several common algorithms demonstrate its efficiency advantage in topology checking of large volumes of vector data.

3.
In rolling-bearing performance degradation assessment, operating conditions affect how sensitive vibration-signal features are to fault severity, so selecting features suitable for condition assessment from early, limited samples is the key to online assessment of bearing degradation. This paper first proposes an RMS-based method for determining the early limited sample set, Limited Feature Select Sample (LFSS), and then an improved Binary Bat Algorithm (BBA) for degradation-assessment feature selection, the Feedback Seeking Binary Bat Algorithm (FSBBA), which is applied to fault feature selection on early limited samples of rolling bearings and overcomes the original BBA's tendency to become trapped in local optima. Based on LFSS and FSBBA, an online condition-assessment model for rolling bearings is built and applied to feature selection on two full-lifetime rolling-bearing data sets; analysis of the degradation-assessment indicators demonstrates the effectiveness of the proposed method.
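The paper's FSBBA is not reproduced here; as a hedged sketch of the underlying idea, the following implements a plain, simplified Binary Bat Algorithm wrapper for feature selection (no loudness/pulse-rate schedule and no feedback-seeking mechanism). The k-NN cross-validation fitness and all parameter values are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y):
    """Illustrative fitness: cross-validated accuracy minus a small penalty
    on the fraction of selected features (both choices are assumptions)."""
    if mask.sum() == 0:
        return -np.inf
    acc = cross_val_score(KNeighborsClassifier(3), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.mean()

def binary_bat_select(X, y, n_bats=20, n_iter=50, fmin=0.0, fmax=2.0, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.integers(0, 2, size=(n_bats, d))            # binary positions (feature masks)
    vel = np.zeros((n_bats, d))
    fit = np.array([fitness(p, X, y) for p in pos])
    best, best_fit = pos[fit.argmax()].copy(), fit.max()
    for _ in range(n_iter):
        for i in range(n_bats):
            f = fmin + (fmax - fmin) * rng.random()        # random frequency
            vel[i] += (pos[i] - best) * f                  # velocity update toward best
            prob = np.abs(np.tanh(vel[i]))                 # V-shaped transfer function
            cand = pos[i].copy()
            flip = rng.random(d) < prob
            cand[flip] = 1 - cand[flip]                    # flip bits with that probability
            cand_fit = fitness(cand, X, y)
            if cand_fit > fit[i]:
                pos[i], fit[i] = cand, cand_fit
            if cand_fit > best_fit:
                best, best_fit = cand.copy(), cand_fit
    return best, best_fit
```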

4.
To address load balancing in the data-partitioning stage of parallel vector spatial analysis, this paper studies the partitioning of vector spatial data and proposes a partitioning method based on spatial clustering. The method accounts for the influence of data volume and spatial proximity on the efficiency of parallel spatial analysis: it first encodes two-dimensional spatial data with a space-filling curve to preserve the spatial proximity of features, and then fills the feature stream with feature sets so that the data volume of each subtask is relatively balanced. Comparative experiments were designed using the point-polygon, line-polygon and polygon-polygon operations of parallel overlay analysis. The results show that the method effectively improves the load balance of parallel algorithms operating on line and polygon features and raises the overall running efficiency of the parallel algorithms.
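A hedged sketch of the encode-then-split idea follows. The specific curve (Z-order/Morton here), the coordinate scaling and the cut-by-count chunking are assumptions; the paper balances subtasks by filling feature sets rather than simply cutting the sorted stream.

```python
def morton_key(ix, iy, bits=16):
    """Interleave the bits of integer cell coordinates (ix, iy) to get a
    Z-order (Morton) key; a Hilbert curve could be used the same way."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b) | ((iy >> b) & 1) << (2 * b + 1)
    return key

def partition_features(features, n_parts, bits=16):
    """features: list of (x, y, payload) with x, y already scaled to [0, 1).
    Sort by Morton key so spatially close features stay together, then cut
    the sorted stream into n_parts chunks of (almost) equal size."""
    scale = (1 << bits) - 1
    keyed = sorted(features,
                   key=lambda f: morton_key(int(f[0] * scale), int(f[1] * scale), bits))
    size = -(-len(keyed) // n_parts)          # ceiling division
    return [keyed[i:i + size] for i in range(0, len(keyed), size)]
```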

5.
In complex scenes, traditional particle-filter tracking algorithms have difficulty locating the target. To address this, a particle-filter tracking algorithm based on online feature selection is proposed. The algorithm first uses the Fisher discriminant criterion to select, online and adaptively, from 16 different colour feature spaces the single feature space that best separates the target from its neighbouring background, and then tracks the target in that space with a histogram-based particle filter. Experimental results show that the algorithm is robust and accurate, and can track the target reliably under illumination changes, target deformation and occlusion.
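As a small illustrative sketch of the online feature-selection step only (the construction of the 16 colour feature spaces and the paper's exact criterion are not reproduced; the function and parameter names are assumptions):

```python
import numpy as np

def fisher_score(fg, bg):
    """Fisher discriminant ratio of one candidate feature: squared difference of
    the target/background means over the sum of their variances."""
    return (fg.mean() - bg.mean()) ** 2 / (fg.var() + bg.var() + 1e-12)

def select_best_feature(candidate_features):
    """candidate_features: dict mapping a feature-space name to a pair
    (target_pixels, background_pixels) of 1-D arrays of feature values.
    Returns the name of the space that best separates target from background."""
    return max(candidate_features, key=lambda k: fisher_score(*candidate_features[k]))
```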

6.
张岩 《中国科技博览》2010,(12):295-296
Based on OLAP and data-mining technology, this paper builds a data-mining model for a campus-card consumption system, proposes a Web-based mining algorithm for the campus-card consumption system that uses OLAP and data-mining techniques, and analyses the process of implementing data mining in the campus-card consumption system.

7.
This paper discusses the organizational structure of a data warehouse in a CIMS environment and explains that online analytical processing (OLAP) based on the data warehouse is the foundation of an intelligent decision support system. A design and implementation scheme for the enterprise data-warehouse environment is proposed, together with a maintenance scheme based on multi-agent technology. A Web-based distributed three-tier OLAP architecture is also introduced to provide convenient decision support. An application example is given at the end of the paper.

8.
To effectively capture customers' Kansei (affective) preferences in product-form design in an online setting, a model is proposed that identifies customers' unstructured affective expressions in a structured way. First, online, the semantic differential method and multidimensional scaling are used to map the customers' Kansei space onto the design Kansei space; next, orthogonal design reduces the number of experiments, and a shifting procedure builds choice sets composed of multiple orthogonal designs; finally, a discrete choice model analyses the choice results to obtain the functional relationship between Kansei images and form-design features, i.e., the mapping from the design Kansei space to the design-parameter space. The whole identification model matches how customers express themselves online, turning unstructured evaluation into structured choice. A passenger-car application verifies the feasibility and effectiveness of the proposed model.

9.
黄舟  陈斌  方裕  彭霞  张珂  解学通 《高技术通讯》2007,17(10):1013-1018
Based on the distribution characteristics of spatial data in the new-generation GIS technology framework, a new hybrid heuristic optimization algorithm (HHOA) for distributed spatial query processing is proposed. The algorithm draws on traditional heuristic optimization based on relational-algebra transformations and, to overcome the drawbacks of relying solely on either direct joins or semi-joins, introduces a strategy that mixes the two to achieve higher query-execution efficiency. A case study shows that the algorithm effectively supports distributed spatial query processing, especially for queries involving spatial joins.

10.
To address the poor I/O performance and low scheduling efficiency of parallel algorithms in parallel vector spatial overlay analysis, a merge-free general parallel computing architecture (NJ-GPCA) is proposed. The architecture first designs an in-memory vector spatial data model on top of the in-memory database Redis; it then reduces process waiting and improves I/O performance through data preprocessing and task-dispatching techniques; finally, it reorganizes task allocation and process scheduling so that result data need not be merged and collected, reducing the time complexity of the merge-and-collect stage of parallel overlay analysis from O(nlogn) to O(n). Experiments show that, for parallel overlay analysis on real geographic data, the method reduces I/O time by at least 75% and markedly improves overall algorithm performance.

11.
Hudomalj  Emil  Vidmar  Gaj 《Scientometrics》2003,58(3):609-622
The application of online analytical processing (OLAP) technology to bibliographic databases is addressed. We show that OLAP tools can be used by librarians for periodic and ad hoc reporting, quality assurance, and data integrity checking, as well as by research policy makers for monitoring the development of science and evaluating or comparing disciplines, fields or research groups. It is argued that traditional relational database management systems, used mainly for day-to-day data storage and transactional processing, are not appropriate for performing such tasks on a regular basis. For the purpose, a fully functional OLAP solution has been implemented on Biomedicina Slovenica, a Slovenian national bibliographic database. We demonstrate the system's usefulness by extracting data for studying a selection of scientometric issues: changes in the number of published papers, citations and pure citations over time, their dependence on the number of co-operating authors and on the number of organisations the authors are affiliated to, and time-patterns of citations. Hardware, software and feasibility considerations are discussed and the phases of the process of developing bibliographic OLAP applications are outlined. This revised version was published online in June 2006 with corrections to the Cover Date.

12.
A data warehouse (DW) is designed primarily to meet the informational needs of an organization’s decision support system. Most queries posed on such systems are analytical in nature. These queries are long and complex, and are posed in an exploratory and ad-hoc manner. The response time of these queries is high when processed directly against a continuously growing DW. In order to reduce this time, materialized views are used as an alternative. It is infeasible to materialize all views due to storage space constraints. Further, optimal view selection is an NP-Complete problem. Alternatively, a subset of views, from amongst all possible views, needs to be selected that improves the response time for analytical queries. In this paper, a quantum-inspired evolutionary view selection algorithm (QIEVSA) that selects Top-K views from a multidimensional lattice has been proposed. Experimental comparison of QIEVSA with other evolutionary view selection algorithms shows that QIEVSA is able to select Top-K views that are comparatively better in reducing the response times for analytical queries. This in turn aids in efficient decision making.

13.
There are a variety of analytical models for supplier selection, ranging from simple weighted techniques to complex mathematical programming approaches. However, these models are specifically aimed at supporting a decision maker in a single phase, especially the final selection phase, and fail to consider the supplier selection process from a holistic point of view. Although the methodology presented in this paper focuses primarily on the prequalification of potential suppliers, the outputs of the previous phases, namely problem definition and formulation of criteria, are used as inputs to this methodology. The methodology utilises a fuzzy analytic hierarchy process (AHP) method to determine the weights of the pre-selected decision criteria, a max-min approach to maximise and minimise supplier performances against these weighted criteria, and a non-parametric statistical test to identify an effective supplier set. This information supports decision makers in making the final selection from effective alternative choices. A potential application of the proposed methodology is demonstrated at Audio Electronics in Turkey's electronics industry.
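As one hedged illustration of the criteria-weighting step, the sketch below uses Buckley's geometric-mean variant of fuzzy AHP with triangular fuzzy numbers; the paper may use a different fuzzy AHP variant, and the centroid defuzzification here is an assumption.

```python
import numpy as np

def fuzzy_ahp_weights(tfn_matrix):
    """Buckley's geometric-mean fuzzy AHP (one common variant, not necessarily the
    paper's). tfn_matrix[i][j] is a triangular fuzzy number (l, m, u) comparing
    criterion i against criterion j. Returns crisp, normalised weights."""
    M = np.asarray(tfn_matrix, dtype=float)            # shape (n, n, 3)
    n = M.shape[0]
    geo = np.prod(M, axis=1) ** (1.0 / n)              # fuzzy geometric mean per row, (n, 3)
    total = geo.sum(axis=0)                            # (sum of l, sum of m, sum of u)
    # fuzzy weight of criterion i: (l_i / sum_u, m_i / sum_m, u_i / sum_l)
    fuzzy_w = np.stack([geo[:, 0] / total[2],
                        geo[:, 1] / total[1],
                        geo[:, 2] / total[0]], axis=1)
    crisp = fuzzy_w.mean(axis=1)                       # centroid defuzzification
    return crisp / crisp.sum()
```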

14.
This paper describes a new method for estimating the extreme values of a Gaussian random field in both space and time. The method relies on the use of data provided by measurement or Monte Carlo simulation combined with a technique for estimating the extreme value distribution of a recorded time series. The time series in question represents the spatial extremes of the random field at each point in time. The time series is constructed by sampling the available realization of the random field over a suitable grid defining the domain in question and extracting the extreme value. This is done for each time point of a suitable time grid. Thus, the time series of spatial extremes is produced. This time series provides the basis for estimating the extreme value distribution using available techniques for time series, which results in an accurate practical procedure for solving a very difficult problem. This is demonstrated by comparison with the results obtained from analytically derived expressions for the extreme values of a Gaussian random field. Properties of these analytical formulas are also discussed.
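A minimal sketch of the workflow described, assuming a synthetic and temporally independent Gaussian field, and using a simple block-maxima GEV fit as a stand-in for the time-series extreme-value technique the paper refers to:

```python
import numpy as np
from scipy import stats

def spatial_extreme_series(field):
    """field: array of shape (n_time, nx, ny) holding one realization of the random
    field sampled on a space-time grid. Returns the spatial maximum at each time."""
    return field.reshape(field.shape[0], -1).max(axis=1)

# Synthetic stand-in for a measured or Monte Carlo realization of the field.
rng = np.random.default_rng(0)
field = rng.standard_normal((5000, 20, 20))
series = spatial_extreme_series(field)

# Fit an extreme-value distribution to block maxima of the series of spatial extremes.
blocks = series[: len(series) // 50 * 50].reshape(-1, 50).max(axis=1)
shape, loc, scale = stats.genextreme.fit(blocks)
print(stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale))   # 99% quantile estimate
```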

15.
One of the major obstacles contributing to the cost, time and efficiency of improving the quality output of manufacturing systems is the propagation of defectives or errors through the system. Design of Experiments (DoE), the response surface plot and a Neural Network Metamodel (NNM) can be used to automatically detect the interrelationships within the system without the need for complex analytical tools and costly intervention. A case study is conducted here to demonstrate the capability of DoE, the response surface plot and the NNM in building a decision-support model for achieving six-sigma quality in a manufacturing system with a significant shift in the mean number of defectives produced. The case study is based on a discrete event simulation model of an actual manufacturing system. A response surface plot is used as an off-line decision support tool. Alternatively, a grid search method implemented on the NNM can be used as an on-line decision support tool in a manufacturing system with a real-time database system.
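A hedged sketch of the grid-search-on-metamodel idea; the factor ranges, the toy response and the network size are illustrative assumptions, not the case study's settings:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Fit a small neural network metamodel on (factor settings -> simulated defect rate)
# pairs collected from a DoE, then grid-search the metamodel for the setting with the
# lowest predicted defect rate.
rng = np.random.default_rng(1)
X = rng.uniform([0, 0], [1, 1], size=(60, 2))            # two coded factors from the DoE
y = (X[:, 0] - 0.3) ** 2 + 0.5 * (X[:, 1] - 0.7) ** 2 + 0.01 * rng.standard_normal(60)

metamodel = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                         random_state=0).fit(X, y)

g = np.linspace(0, 1, 51)
grid = np.array([(a, b) for a in g for b in g])           # exhaustive grid over the factors
pred = metamodel.predict(grid)
print("best setting:", grid[pred.argmin()], "predicted response:", pred.min())
```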

16.
Understanding the underlying structure of single vehicle crashes (SVCs) is essential for improving safety on the roads. Past research has found that SVCs tend to cluster both spatially and temporally. However, limited research has been conducted to investigate the interaction between the location of SVCs and the time they occur, especially at different scales or spatial extents. This paper applied spatial, temporal and spatio-temporal techniques to investigate patterns of SVCs in Western Australia between 1999 and 2008, at different levels of scale. Spider graphs were adapted to identify temporal patterns of vehicle crashes at two temporal scales, daily and weekly, with respect to their causes. The spatial structures of vehicle crashes were analysed using Kernel Density Estimation at three different scales: Western Australia, the Metropolitan area, and the Perth Local Government Area (LGA). These are illustrated using spatial zooming theory. Comap was then used to demonstrate the spatio-temporal interaction effect on vehicle crashes. The results show significant differences in spatio-temporal patterns of SVCs for various crash causes. The techniques used here have the potential to help decision makers in developing effective road safety strategies.
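As an illustrative sketch of the Kernel Density Estimation step only (the grid and bandwidth choices are assumptions):

```python
import numpy as np
from scipy.stats import gaussian_kde

def crash_density(xy, grid_x, grid_y, bandwidth=None):
    """KDE surface of crash locations. xy: array of shape (n, 2) of crash
    coordinates; grid_x, grid_y: 1-D arrays defining the evaluation grid."""
    kde = gaussian_kde(xy.T, bw_method=bandwidth)          # default is Scott's rule
    gx, gy = np.meshgrid(grid_x, grid_y)
    return kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
```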

17.
Quantile functions are important in characterizing the entire probability distribution of a random variable, especially when the tail of a skewed distribution is of interest. This article introduces new quantile function estimators for spatial and temporal data with a fused adaptive Lasso penalty to accommodate the dependence in space and time. This method penalizes the difference among neighboring quantiles, hence it is desirable for applications with features ordered in time or space without replicated observations. The theoretical properties are investigated and the performances of the proposed methods are evaluated by simulations. The proposed method is applied to particulate matter (PM) data from the Community Multiscale Air Quality (CMAQ) model to characterize the upper quantiles, which are crucial for studying spatial association between PM concentrations and adverse human health effects.
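One common way to write this kind of objective, as an illustrative sketch rather than the paper's exact formulation:

$$\hat{q}_\tau = \arg\min_{q} \; \sum_{i=1}^{n} \rho_\tau\!\left(y_i - q_i\right) \;+\; \lambda \sum_{(i,j)\in \mathcal{E}} w_{ij}\,\lvert q_i - q_j\rvert,$$

where $\rho_\tau(u) = u\,(\tau - \mathbf{1}\{u<0\})$ is the quantile check loss, $\mathcal{E}$ is the set of neighbouring location or time pairs, and $w_{ij}$ are adaptive weights on the fused differences.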

18.
A new method to modify coordinates of detectors in any positron emission tomography (PET) system using a radioactive point source is developed. This method is based on selecting a centered detector in each detector block of PET and measuring coincidence counts between the two centered detectors in opposite detector blocks to find the coordinates of their LOR (Line of Response). Due to slight misalignment of detector positions, measured LORs may not intersect at a single point. Based on the proposed method, the coordinates of detectors can be measured with very high accuracy and the coordinate of the center of the gantry (which is normally the same as the center of field of view) can be defined correctly. The results of the application of our method to a small animal PET system (FinePET), which was recently developed at Tohoku University, Japan, are shown here. The method is expected to contribute to the design and development of PET systems which can realize a very high spatial resolution of less than 1 mm FWHM.

19.
Structural type selection with a SOFM-based case-based reasoning system   Cited by: 1 (self-citations: 0, others: 1)
Taking the optimal selection of two classes of large complex structural systems, long-span spatial structures and high-rise building structures, as its subject, this paper establishes a case-based reasoning system whose retrieval mechanism is based on a self-organizing feature map (SOFM) neural network and applies it to structural type-selection decisions. Object-oriented design is used to build the case base, and the SOFM network performs case retrieval and matching, forming an effective computational reasoning framework. Finally, a worked example of generating a structural scheme for a high-rise building illustrates how the system works.

20.
The objectives of this study are (1) to develop an incident duration model which can account for the spatial dependence of duration observations, and (2) to investigate the impacts of a hurricane on incident duration. Highway incident data from New York City and its surrounding regions before and after Hurricane Sandy were used for the study. Moran’s I statistics confirmed that durations of neighboring incidents were spatially correlated. Moreover, Lagrange Multiplier tests suggested that the spatial dependence should be captured in a spatial lag specification. A spatial error model, a spatial lag model and a standard model without consideration of spatial effects were developed. The spatial lag model is found to outperform the others by capturing the spatial dependence of incident durations via a spatially lagged dependent variable. It was further used to assess the effects of hurricane-related variables on incident duration. The results show that incidents during and after the hurricane are expected to have 116.3% and 79.8% longer durations, respectively, than those that occurred during regular periods. However, no significant increase in incident duration is observed in the evacuation period before Sandy’s landfall. Results of temporal stability tests further confirm significant changes in incident duration patterns during and after the hurricane. These findings can provide insights to aid in the development of hurricane evacuation plans and emergency management strategies.
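For reference, the two spatial specifications compared here take the standard forms (the notation is generic, not taken from the paper):

$$y = \rho W y + X\beta + \varepsilon \quad \text{(spatial lag)}, \qquad y = X\beta + u,\;\; u = \lambda W u + \varepsilon \quad \text{(spatial error)},$$

where $y$ is the vector of incident durations, $W$ is a spatial weights matrix over neighbouring incidents, $X$ contains the explanatory variables (including the hurricane-period indicators), $\rho$ and $\lambda$ are the spatial autoregressive coefficients, and $\varepsilon$ is an i.i.d. error term.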

