Similar Documents
20 similar documents found
1.
In a multinode data sharing environment, different buffer coherency control schemes based on various lock retention mechanisms can be designed to exploit the concept of deferring the propagation or writing of dirty pages to disk to improve normal performance. Two types of deferred write policies are considered. One policy propagates dirty pages to disk only when they are flushed out of the buffer under LRU buffer replacement. The other policy also performs writes when dirty pages are transferred across nodes. The dirty page propagation policy can have significant implications for the database recovery time. In this paper, we provide an analytical modeling framework for the analysis of the recovery times under the two deferred write policies. We demonstrate how these policies can be mapped onto a unified analytic modeling framework. The main challenge in the analysis is to obtain the pending update count distribution, which can be used to determine the average numbers of log records and data I/Os that must be applied during recovery. The analysis goes beyond previous work on modeling buffer hit probability in a data sharing system, where only the average buffer composition, not the distribution, needs to be estimated, and on recovery analysis in a single node environment, where the complexities of tracking the propagation of dirty pages across nodes and the buffer invalidation effect do not arise.
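The pending update count at the heart of the analysis can be illustrated with a toy Monte Carlo sketch (not the paper's analytical model; the page count, buffer size, and transfer probability below are made-up parameters):

```python
from collections import OrderedDict
import random

def pending_updates(write_on_transfer, n_pages=200, buf_size=50,
                    n_updates=20000, transfer_p=0.3, seed=7):
    """Toy stand-in for the analytical model: track each buffered page's
    pending (not yet propagated) update count under LRU replacement.
    Policy A writes a dirty page only on eviction; policy B
    (write_on_transfer=True) also writes it when the page is shipped to
    another node, modeled here as a coin flip per update.  Recovery work
    grows with the total pending update count returned."""
    rng = random.Random(seed)
    pending = OrderedDict()                       # page id -> pending updates
    for _ in range(n_updates):
        page = rng.randrange(n_pages)
        pending[page] = pending.pop(page, 0) + 1  # apply update, move to MRU
        if write_on_transfer and rng.random() < transfer_p:
            pending[page] = 0                     # write on cross-node transfer
        if len(pending) > buf_size:
            pending.popitem(last=False)           # LRU eviction writes the page
    return sum(pending.values())

# Average over a few seeds so the comparison is stable.
policy_a = sum(pending_updates(False, seed=s) for s in range(5))
policy_b = sum(pending_updates(True, seed=s) for s in range(5))
```

The write-on-transfer policy leaves fewer pending updates in the buffer, so less redo work remains at recovery time, at the cost of extra writes during normal operation.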

2.
Bayesian approaches have been proposed by several functional magnetic resonance imaging (fMRI) researchers in order to overcome the fundamental limitations of the popular statistical parametric mapping method. However, the difficulties associated with subjective prior elicitation have prevented the widespread adoption of the Bayesian methodology by the neuroimaging community. In this paper, we present a Bayesian multilevel model for the analysis of brain fMRI data. The main idea is to consider that all the estimated group effects (fMRI activation patterns) are exchangeable. This means that all the collected voxel time series are considered manifestations of a few common underlying phenomena. In contradistinction to other Bayesian approaches, we think of the estimated activations as multivariate random draws from the same distribution without imposing specific prior spatial and/or temporal information for the interaction between voxels. Instead, a two-stage empirical Bayes prior approach is used to relate voxel regression equations through correlations between the regression coefficient vectors. The adaptive shrinkage properties of the Bayesian multilevel methodology are exploited to deal with spatial variations and noise outliers. The characteristics of the proposed model are evaluated by considering its application to two real data sets.
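The adaptive shrinkage idea can be sketched in a one-dimensional analogue (a hypothetical simplification, not the paper's multivariate two-stage model): each voxel's effect estimate is pulled toward the group mean, with the pull strength set empirically from the data.

```python
import numpy as np

def eb_shrink(beta_hat, se2):
    """Hypothetical 1-D analogue of an empirical-Bayes prior: voxel-wise
    effect estimates beta_hat (with noise variances se2) are shrunk toward
    the group mean, and noisier voxels are shrunk harder."""
    mu = beta_hat.mean()
    # Method-of-moments estimate of the between-voxel (prior) variance.
    tau2 = max(beta_hat.var() - se2.mean(), 1e-12)
    w = tau2 / (tau2 + se2)          # per-voxel reliability weight in [0, 1)
    return w * beta_hat + (1.0 - w) * mu

beta_hat = np.array([1.0, 2.0, 3.0, 10.0])   # last voxel is an outlier
se2 = np.array([1.0, 1.0, 1.0, 1.0])
shrunk = eb_shrink(beta_hat, se2)
```

The farther an estimate sits from the group mean, the more it moves, which is how multilevel shrinkage guards against noise outliers without any hand-specified spatial prior.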

3.
In this paper, a multilevel fuzzy control (MLFC) system is developed and implemented to deal with real-world nonlinear plants with intrinsic uncertainties and time-varying parameters. The proposed fuzzy control strategy has a hierarchical structure with an adaptation mechanism embedded in the lower level to tune the output membership functions (MFs) of the first-layer fuzzy controller, and it can be used to control a system with an input-output monotonic relationship or a piecewise monotonic relationship. The stability of the closed-loop system under the proposed MLFC is theoretically proven. Simulations are carried out by applying the proposed MLFC to uncertain nonlinear plants, and it is shown that much better system performance is achieved compared with conventional fuzzy logic controllers (FLCs), even in the presence of disturbance and noise.

4.
Presents a simulation-based performance analysis of a concurrent file reorganization algorithm. We examine the effect on throughput of (a) buffer size, (b) degree of reorganization, (c) write probability of transactions, (d) multiprogramming level, and (e) degree of clustered transactions. The problem of file reorganization that we consider involves altering the placement of records on pages of a secondary storage device. In addition, we want this reorganization to be done in place, i.e. using the file's original storage space for the newly reorganized file. Our approach is appropriate for a non-in-place reorganization as well. The motivation for such a physical change, i.e. record clustering, is to improve the database system's performance, i.e. minimizing the number of page accesses made in answering a set of queries. There are numerous record clustering algorithms, but they usually do not solve the entire problem, i.e. they do not specify how to efficiently reorganize the file to reflect the clustering assignment that they determine. In previous work, we have presented an algorithm that is a companion to general record clustering algorithms, i.e. it actually transforms the file. In this work we show through simulation that our algorithm, when run concurrently with user transactions, provides an acceptable level of overall database system performance.

5.
《微型机与应用》2017,(6):58-61
To analyze the performance of chaotic spread-spectrum sequences based on multilevel quantization, the Chebyshev chaotic map is taken as an example, and a multilevel quantization method is derived from the characteristics of its probability density function. The balance and correlation properties of the multilevel-quantized chaotic sequences are analyzed by simulation; the bit error rates of spread-spectrum communication systems using conventional binary-quantized and multilevel-quantized chaotic sequences are simulated over an additive white Gaussian noise channel; and the anti-interception performance of the multilevel-quantized sequences is analyzed from the perspectives of low probability of detection and low probability of exploitation. The simulation results show that multilevel-quantized chaotic sequences achieve better bit error rate performance and improved anti-interception performance.
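A minimal sketch of the quantization step (parameter values are illustrative): using the known invariant density rho(x) = 1/(pi*sqrt(1-x^2)) of the Chebyshev map, thresholds are placed so that each of the m symbols is equiprobable, which is exactly the balance property the simulations check.

```python
import numpy as np

def chebyshev_sequence(x0, k=4, n=10000):
    """Iterate the Chebyshev map x_{t+1} = cos(k * arccos(x_t)) on [-1, 1]."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = np.cos(k * np.arccos(x))
        xs[i] = x
    return xs

def multilevel_quantize(xs, m=4):
    """Quantize to m equiprobable symbols: by the invariant density of the
    map, the thresholds t_i = -cos(pi * i / m) split [-1, 1] into m regions
    of equal probability mass."""
    thresholds = -np.cos(np.pi * np.arange(1, m) / m)
    return np.digitize(xs, thresholds)

xs = chebyshev_sequence(0.123456)
symbols = multilevel_quantize(xs, m=4)
freqs = np.bincount(symbols, minlength=4) / len(symbols)
```

With m = 4 the empirical symbol frequencies come out near 0.25 each, confirming the balance of the multilevel-quantized sequence.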

6.
Multimedia content understanding research requires a rigorous approach to deal with the complexity of the data. At the crux of this problem is the method to deal with multilevel data whose structure exists at multiple scales and across data sources. A common example is modeling tags jointly with images to improve retrieval, classification and tag recommendation. Associated contextual observations, such as metadata, are rich and can be exploited for content analysis. A major challenge is the need for a principled approach to systematically incorporate associated media with the primary data source of interest. Taking a factor modeling approach, we propose a framework that can discover low-dimensional structures for a primary data source together with other associated information. We cast this task as a subspace learning problem under the framework of Bayesian nonparametrics; thus the subspace dimensionality and the number of clusters are automatically learnt from data instead of setting these parameters a priori. Using Beta processes as the building block, we construct random measures in a hierarchical structure to generate multiple data sources and capture their shared statistics at the same time. The model parameters are inferred efficiently using a novel combination of Gibbs and slice sampling. We demonstrate the applicability of the proposed model in three applications: image retrieval, automatic tag recommendation and image classification. Experiments using two real-world datasets show that our approach outperforms various state-of-the-art related methods.
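The "learnt from data, not set a priori" property of the Beta-process construction can be illustrated with its marginal, the Indian buffet process (a standard sketch, not the paper's hierarchical multi-source model or its Gibbs/slice sampler):

```python
import numpy as np

def sample_ibp(n_objects, alpha, rng):
    """Indian buffet process, the marginal of a Beta-process feature model:
    object i reuses an existing feature k with probability count_k / i and
    adds Poisson(alpha / i) brand-new features, so the number of latent
    features is learnt rather than fixed in advance."""
    counts = []                                   # usage count per feature
    for i in range(1, n_objects + 1):
        reused = [rng.random() < c / i for c in counts]
        counts = [c + int(r) for c, r in zip(counts, reused)]
        counts += [1] * rng.poisson(alpha / i)    # newly created features
    return counts

rng = np.random.default_rng(0)
counts = sample_ibp(100, alpha=2.0, rng=rng)
```

The expected number of features grows as alpha times the n-th harmonic number, so model complexity adapts to the amount of data.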

7.
Addressing the diversity of network structures and the complexity of network data, this paper proposes a network security situation analysis method based on multilevel data fusion. The method abstracts the network into a hierarchical structure and fuses data using an expert-system data fusion approach. A layered evaluation scheme matching the hierarchical network structure is then proposed and computed quantitatively. Finally, experimental data verify the soundness and effectiveness of the method.
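The layered, quantified evaluation can be sketched as bottom-up weighted fusion over the abstracted hierarchy (the weights and scores below are invented for illustration; in the paper the expert-system fusion would supply them):

```python
def situation_score(node):
    """Bottom-up weighted fusion: a leaf carries its own threat score in
    [0, 1]; an internal node's score is the weighted mean of its children."""
    if "score" in node:
        return node["score"]
    total_weight = sum(w for w, _ in node["children"])
    return sum(w * situation_score(child)
               for w, child in node["children"]) / total_weight

# Hypothetical network: two hosts, host A running two services.
network = {
    "children": [
        (0.6, {"children": [(0.7, {"score": 0.8}),    # critical service
                            (0.3, {"score": 0.2})]}),  # minor service
        (0.4, {"score": 0.5}),                         # host B
    ]
}
overall = situation_score(network)
```

Host A fuses to 0.7*0.8 + 0.3*0.2 = 0.62, and the network level fuses to 0.6*0.62 + 0.4*0.5 = 0.572.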

8.
This paper describes a new way to design and fabricate compliant micromechanisms and material structures with negative Poisson's ratio (NPR). The design of compliant mechanisms and material structures is accomplished in an automated way using a numerical topology optimization method. The procedure allows the user to specify the elastic properties of materials or the mechanical advantages (MA's) or geometrical advantages (GA's) of compliant mechanisms and returns the optimal structures. The topologies obtained by the numerical procedure require practically no interaction by the engineer before they can be transferred to the fabrication unit. Fabrication is carried out by patterning a sputtered silicon on a plasma-enhanced chemical vapor deposition (PECVD) glass with a laser micromachining setup. Subsequently, the structures are etched into the underlying PECVD glass, and the glass is underetched, all in one two-step reactive ion etching (RIE) process. The components are tested using a probe placed on an x-y stage. This fast prototyping allows newly developed topologies to be fabricated and tested within the same day.

9.
10.
Static analysis of declarative languages deals with the detection, at compile time, of program properties that can be used to better understand the program semantics and to improve the efficiency of program evaluation. In logical update languages, an interesting problem is the detection of conflicting updates, inserting and deleting the same fact, for transactions based on set-oriented updates and active rules. In this paper, we investigate this topic in the context of the U-Datalog language, a set-oriented update language for deductive databases, based on a deferred semantics. We first formally define relevant properties of U-Datalog programs, mainly related to update conflicts. Then, we prove that the defined properties are decidable and we propose an algorithm to detect such conditions. Finally, we show how the proposed techniques can be applied to other logical update languages. Our results are based on the concept of labeling and query-tree.
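The core notion of an update conflict, a transaction that both inserts and deletes the same fact, can be sketched directly (a runtime check over a ground update set; the paper's contribution is deciding this at compile time via labeling and the query-tree):

```python
def conflicting_facts(updates):
    """Return the facts that a deferred-update transaction both inserts
    ('+') and deletes ('-'), i.e. the update conflicts to be flagged."""
    inserts = {fact for op, fact in updates if op == "+"}
    deletes = {fact for op, fact in updates if op == "-"}
    return inserts & deletes

# A transaction over hypothetical ground facts emp(Name, Dept):
txn = [("+", ("emp", "ann", "sales")),
       ("-", ("emp", "ann", "sales")),   # conflicts with the insert above
       ("+", ("emp", "bob", "hr"))]
conflicts = conflicting_facts(txn)
```

Under a deferred semantics both updates would be applied at commit time, which is why the pair must be detected and rejected beforehand.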

11.
To meet the need for slope stability analysis over large regions and to address the shortcomings of current slope stability computation methods, a three-dimensional analysis method based on digital elevation model (DEM) data is proposed. The method searches for slip surfaces using spheres instead of ellipsoids, computes the three-dimensional slope safety factor by integrating the results of two-dimensional analyses, and finally determines the likely location and shape of a landslide from the safety factor. Practical applications show that the method simplifies the slip-body search algorithm, preserves the accuracy of slope stability analysis, and improves computational efficiency.
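A hypothetical simplification of the integration step: treat each 2D cross-section through the spherical slip surface as contributing a resisting and a driving moment, and take the 3D factor of safety as the ratio of their sums (all moment values below are invented):

```python
def factor_of_safety_3d(sections):
    """Simplified aggregation of per-cross-section (2D) stability results
    into one 3D factor of safety: F = sum(resisting) / sum(driving).
    Each section is a (resisting_moment, driving_moment) pair."""
    resisting = sum(r for r, _ in sections)
    driving = sum(d for _, d in sections)
    return resisting / driving

# Hypothetical per-section moments (arbitrary consistent units):
sections = [(120.0, 100.0), (90.0, 60.0), (50.0, 40.0)]
fos = factor_of_safety_3d(sections)
```

A factor below 1.0 would mark the corresponding slip surface as a likely landslide location, which is how candidate surfaces are ranked after the spherical search.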

12.
13.
《Information Systems》1999,24(5):377-400
Multilevel relations, based on the current multilevel secure (MLS) relational data models, can present a user with information that is difficult to interpret and may display an inconsistent outlook about the views of other users. Such ambiguity is due to the lack of a comprehensive method for asserting and interpreting beliefs about information at lower security levels. In this paper we present a belief-consistent MLS relational database model which provides an unambiguous interpretation of all visible information and gives the user access to the beliefs of users at lower security levels, neither of which was possible in any of the existing models. We identify different beliefs that can be held by users at higher security levels about information at lower security levels, and introduce a mechanism for asserting beliefs about all accessible tuples. This mechanism provides every user with an unambiguous interpretation of all viewable information and presents a consistent account of the views at all levels visible to the user. In order to implement this assertion mechanism, new database operations, such as verify true and verify false, are presented. We specify the constraints for the write operations, such as update and delete, that maintain belief consistency and redefine the relational algebra operations, such as select, project, union, difference and join.
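As background, the baseline read-down rule that MLS relational models start from can be sketched as follows (this is not the paper's belief-assertion mechanism or its verify true/verify false operations, just the access rule they refine; field names are invented):

```python
LEVELS = {"U": 0, "C": 1, "S": 2, "TS": 3}   # unclassified .. top secret

def visible(relation, clearance):
    """Read-down rule of an MLS relation: a subject sees exactly the tuples
    whose tuple classification 'tc' is at or below its clearance."""
    return [t for t in relation if LEVELS[t["tc"]] <= LEVELS[clearance]]

emp = [
    {"name": "ann", "dept": "sales", "tc": "U"},
    {"name": "bob", "dept": "ops",   "tc": "S"},
]
low_view = visible(emp, "C")     # confidential user: only the U tuple
high_view = visible(emp, "TS")   # top-secret user: everything
```

The ambiguity the paper targets arises because a high user sees the low tuples without knowing whether to believe them; the belief-assertion operations attach that missing interpretation.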

14.
To make full use of the massive historical data held by a telecommunications enterprise and to provide strong support for management's scientific decision making, data warehouse technology is applied to extract potentially useful information. A performance synthesis and analysis system (PSAS) based on a data warehouse is designed and implemented; the design and implementation workflow of the data warehouse is described in detail, and technical problems such as data warehouse construction and data analysis are solved. Practice shows that the system works well.

15.
Performance Analysis and Research of Parallel Data Transfer Based on GridFTP
This paper analyzes the data transfer requirements of wide-area grid networks and the performance characteristics of the grid data transfer protocol GridFTP, focusing on its most important feature, the parallel transfer mechanism. Extensive experiments compare transfer time, bandwidth, throughput, and total data volume under different degrees of parallelism; the influence of the degree of parallelism on transfer performance is discussed; and the limitations of, and caveats for, using parallel transfer to improve performance are identified.
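The diminishing returns the experiments report can be reproduced with a toy cost model (every parameter below is hypothetical): each extra stream adds bandwidth until a bottleneck saturates, while also adding fixed per-connection setup overhead.

```python
def effective_throughput(n_streams, per_stream_mbps=12.0, bottleneck_mbps=80.0,
                         setup_s_per_stream=0.15, volume_mb=400.0):
    """Toy model of parallel-stream transfer: aggregate rate is capped by
    the bottleneck link, and every stream adds fixed setup time, so the
    effective throughput rises, plateaus, then degrades with parallelism."""
    rate_mbps = min(n_streams * per_stream_mbps, bottleneck_mbps)
    transfer_time_s = setup_s_per_stream * n_streams + volume_mb * 8.0 / rate_mbps
    return volume_mb * 8.0 / transfer_time_s
```

In this model throughput climbs steeply from one stream to a few, plateaus once the bottleneck is saturated, and then falls as setup overhead dominates, matching the reported limitation of the parallel transfer mechanism.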

16.
Based on the idea of polyinstantiation, a new fuzzy-level multilevel secure data model is proposed. Tuples that differ only in their classification level are merged into a single tuple, and a security pattern records the set of levels at which the tuple applies; a subject may access the tuple as long as its clearance matches the security pattern. This model resolves the high data redundancy and covert channel problems of existing multilevel secure data models.
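The merging idea can be sketched as follows (a minimal reading of the abstract; field names are invented): tuples identical except for their classification collapse into one tuple carrying a security pattern, the set of levels at which it holds.

```python
def merge_by_pattern(tuples):
    """Merge tuples that differ only in 'level' into one tuple with a
    security pattern (the frozenset of levels at which it applies);
    a subject whose clearance is in the pattern may access the tuple."""
    merged = {}
    for t in tuples:
        data = tuple(sorted((k, v) for k, v in t.items() if k != "level"))
        merged.setdefault(data, set()).add(t["level"])
    return [dict(data, pattern=frozenset(levels))
            for data, levels in merged.items()]

rel = [{"name": "ann", "dept": "sales", "level": "U"},
       {"name": "ann", "dept": "sales", "level": "S"},   # redundant copy
       {"name": "bob", "dept": "ops",   "level": "S"}]
merged = merge_by_pattern(rel)
```

The two identical "ann" tuples become one tuple with pattern {U, S}, which is how the model removes the redundancy that polyinstantiation otherwise introduces.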

17.
As we delve deeper into the 'Digital Age', we witness an explosive growth in the volume, velocity, and variety of the data available on the Internet. For example, in 2012 about 2.5 quintillion bytes of data was created on a daily basis, originating from a myriad of sources and applications including mobile devices, sensors, individual archives, social networks, the Internet of Things, enterprises, cameras, software logs, etc. Such 'data explosions' have led to one of the most challenging research issues of the current Information and Communication Technology era: how to optimally manage (e.g., store, replicate, filter, and the like) such large amounts of data and identify new ways to analyze them for unlocking information. It is clear that such large data streams cannot be managed by setting up on-premises enterprise database systems, as that leads to a large up-front cost in buying and administering the hardware and software systems. Therefore, next-generation data management systems must be deployed on the cloud. The cloud computing paradigm provides scalable and elastic resources, such as data and services, accessible over the Internet. Every cloud service provider must assure that data is efficiently processed and distributed in a way that does not compromise end-users' Quality of Service (QoS) in terms of data availability, data search delay, data analysis delay, and the like. In this perspective, data replication is used in the cloud to improve the performance (e.g., read and write delay) of applications that access data. Through replication, a data-intensive application or system can achieve high availability, better fault tolerance, and data recovery. In this paper, we survey data management and replication approaches (from 2007 to 2011) developed by both industrial and research communities.
The focus of the survey is to discuss and characterize the existing approaches to data replication and management that tackle resource usage and QoS provisioning with different levels of efficiency. Moreover, the breakdown of both influential expressions (data replication and management) into the different QoS attributes they provide is deliberated. Furthermore, the performance advantages and disadvantages of data replication and management approaches in cloud computing environments are analyzed. Open issues and future challenges related to data consistency, scalability, load balancing, processing and placement are also reported.

18.
Coding Performance Analysis of CCSDS-IDC on Complex-Valued SAR Image Data and Quadtree Coding
Objective: CCSDS-IDC (Consultative Committee for Space Data Systems, Image Data Compression) is a NASA-established space image data compression standard built on the inter-scale decay of discrete wavelet transform (DWT) coefficients, and it is suitable for compressing synthetic aperture radar (SAR) magnitude images and many kinds of remote sensing images. Unlike optical images, however, common SAR products are complex-valued image data, which are widely used in applications such as interferometric height measurement, so analyzing the coding performance of CCSDS-IDC on complex-valued SAR data has significant practical value. Method: Complex-valued SAR data do not exhibit inter-scale decay, so CCSDS-IDC performs poorly when applied to them. Observing that the DWT coefficients of complex-valued SAR data exhibit clustering, quadtree coding (QC) is applied to complex SAR data in the DWT domain and is found to compress it efficiently. Results: Experiments show that, at the same bit rate, for DWT-based compression of complex SAR data, QC improves the magnitude peak signal-to-noise ratio over CCSDS-IDC by up to 4.4 dB and reduces the mean phase error by up to 0.368; compared with CCSDS-IDC based on the directional lifting wavelet transform (DLWT), QC improves the peak signal-to-noise ratio by 3.08 dB and reduces the mean phase error by 0.25. For other types of images, clustering-based QC still achieves good coding performance. Conclusion: CCSDS-IDC codes complex-valued SAR data poorly, whereas QC performs well. Inter-scale decay, which corresponds to the smooth part of an image, may be absent in some special images, whereas the clustering property, which corresponds to image structure, is always present; DWT-based image coding algorithms should therefore give priority to exploiting the clustering of wavelet coefficients, enabling efficient coding of more kinds of images.
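The clustering argument can be made concrete with a minimal quadtree significance coder (an illustrative sketch, not the QC coder evaluated in the paper): when the significant coefficients cluster, whole quadrants are dismissed with a single bit.

```python
import numpy as np

def quadtree_code(block, thresh):
    """Significance-map coding that exploits clustering rather than
    inter-scale decay: an all-insignificant block costs one '0' bit;
    otherwise emit '1' and recurse into the four quadrants
    (block sides must be powers of two)."""
    if np.max(np.abs(block)) < thresh:
        return "0"
    if block.size == 1:
        return "1"
    h, w = block.shape
    return "1" + "".join(quadtree_code(block[r:r + h // 2, c:c + w // 2], thresh)
                         for r in (0, h // 2) for c in (0, w // 2))

coeffs = np.zeros((4, 4))
coeffs[0, 0] = 9.0                  # a single clustered significant coefficient
bits = quadtree_code(coeffs, thresh=1.0)
```

Here one clustered coefficient in a 4x4 block costs 9 significance-map bits instead of 16, and the saving grows with block size as long as the significant coefficients stay clustered.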

19.
Research on third-generation HF communication technology in China is still in its infancy, and most domestic HF communication systems are based on second-generation HF standards. Compared with second-generation data link protocols, the automatic link establishment system of third-generation HF communication solves the various problems faced by second-generation systems with a simple and efficient design, greatly improving the efficiency, stability, and reliability of data link establishment. Following the third-generation HF technology standard given in the U.S. military standard MIL-STD-188-141B, this paper studies the high-rate data link protocol (HDL) among the third-generation HF data link protocols. Combining theoretical analysis with MATLAB simulation, the packet error rate, packet loss rate, average number of transmissions, and throughput under different packet groupings are analyzed.
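Two of the analyzed quantities follow from a memoryless-channel assumption and can be sketched directly (the packet size below is hypothetical, not the HDL packet format of MIL-STD-188-141B):

```python
def packet_error_rate(ber, bits_per_packet):
    """Packet error probability for independent bit errors:
    PER = 1 - (1 - BER)^L."""
    return 1.0 - (1.0 - ber) ** bits_per_packet

def mean_transmissions(per):
    """Expected number of transmissions of a packet under
    retransmit-until-success ARQ (geometric distribution): 1 / (1 - PER)."""
    return 1.0 / (1.0 - per)

per = packet_error_rate(ber=1e-4, bits_per_packet=1880)   # hypothetical size
avg_tx = mean_transmissions(per)
```

At BER = 1e-4 a 1880-bit packet fails roughly 17% of the time, so retransmit-until-success ARQ averages about 1.21 transmissions per packet; throughput estimates then follow from the per-transmission timing.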

20.