Found 20 similar documents (search time: 15 ms)
1.
Grid architecture integrates geographically distributed nodes to manage and provide resources for executing scientific applications. To maintain data locality, applications with different computational phases require data redistribution for realignment, which brings a tradeoff between highly efficient computation and the communication cost of redistribution. This paper introduces a research model and two methods that derive new lists of processor logical ids according to the characteristics of a heterogeneous network. Both methods enlarge the choice of low-cost communication schedules in a grid, and simulations show that both proposed methods yield outstanding performance.
2.
《Decision Support Systems》1986,2(2):145-158
To ease access to databases for the casual user, a system called SMARTY has been developed as an intelligent front end to the relational database management system SQL/DS. A user only needs to specify the data attributes, and possibly their value ranges, that he wants displayed. SMARTY tests the meaningfulness of this request for information and subsequently generates and executes the SQL statements for deriving the information from SQL relations. SMARTY is capable of extracting or aggregating detailed information in a company database. It can be regarded as an expert system on top of a relational DBMS that improves the user interface, or as a query system to a universal relation. SMARTY was written in Prolog. Both Prolog and SQL/DS run under the IBM operating system VM/SP 3.
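The abstract describes SMARTY's query generation only at a high level; as a rough illustration of the idea (the table, attribute, and function names below are hypothetical, not SMARTY's actual Prolog interface), a front end of this kind maps an attribute/range specification to a generated SQL statement:

# Hypothetical sketch of attribute/range-driven SQL generation, loosely in
# the spirit of the SMARTY front end described above. All names are invented.

def generate_sql(table, attributes, ranges=None):
    """Build a SELECT statement from requested attributes and value ranges."""
    ranges = ranges or {}
    select_clause = ", ".join(attributes)
    where_parts = [
        f"{attr} BETWEEN {lo} AND {hi}" for attr, (lo, hi) in ranges.items()
    ]
    sql = f"SELECT {select_clause} FROM {table}"
    if where_parts:
        sql += " WHERE " + " AND ".join(where_parts)
    return sql

print(generate_sql("EMPLOYEES", ["NAME", "SALARY"], {"SALARY": (30000, 50000)}))
# SELECT NAME, SALARY FROM EMPLOYEES WHERE SALARY BETWEEN 30000 AND 50000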
3.
M.DE LA SEN 《International journal of control》2013,86(3):737-765
Two possible optimization techniques for on-line adjustment of the design parameters involved in the adaptation algorithms of adaptive control schemes for minimum-phase plants are discussed. Sensitivity corrections adapted to this particular problem are introduced to correct inaccuracies in an auxiliary model derived so that classical optimization techniques can be applied to the whole scheme. The main objective of such techniques is to improve the adaptation transient performance. The resulting strategies are discussed from the point of view of performance and possible implementation. Simulations illustrate the feasibility of the proposed optimizing procedures, which are an extension, using a more general optimization theory and/or a sensitivity approach, of previous results, and an alternative to the adaptive sampling approach of De la Sen (1984 c).
4.
5.
Sholom M. Weiss Robert J. Baseman Fateh Tipu Christopher N. Collins William A. Davies Raminderpal Singh John W. Hopkins 《Applied Intelligence》2010,33(3):318-329
We describe an automated system for improving yield, power consumption and speed characteristics in the manufacture of semiconductors. Data are continually collected in the form of a history of tool usage and of electrical and other real-valued measurements, a space of tens of thousands of features. Unique to this approach is the inference of patterns in the form of binary regression rules that demonstrate a significantly higher or lower performance value for tools relative to the overall mean for that manufacturing step. Results are filtered by knowledge-based constraints, increasing the likelihood that empirically validated rules will prove interesting and worth further investigation. This system is currently installed in the IBM 300 mm fab, manufacturing game chips and microprocessors. It has detected numerous opportunities for yield and performance improvement, saving many millions of dollars.
6.
In this paper, we present a particle swarm optimizer (PSO) to solve the variable weighting problem in projected clustering of high-dimensional data. Many subspace clustering algorithms fail to yield good cluster quality because they do not employ an efficient search strategy. In this paper, we are interested in soft projected clustering. We design a suitable k-means objective weighting function in which a change of variable weights is exponentially reflected. We also transform the original constrained variable weighting problem into a problem with bound constraints, using a normalized representation of variable weights, and we utilize a particle swarm optimizer to minimize the objective function in order to search for global optima of the variable weighting problem in clustering. Our experimental results on both synthetic and real data show that the proposed algorithm greatly improves cluster quality. In addition, the results of the new algorithm are much less dependent on the initial cluster centroids. In an application to text clustering, we show that the algorithm can be easily adapted to other similarity measures, such as the extended Jaccard coefficient for text data, and can be very effective.
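The abstract does not give the exact objective function; the following minimal sketch (the weight exponent beta and the row normalization are assumptions, not the paper's formulation) shows how variable weights can enter a k-means-style objective exponentially, which a PSO would then minimize over the bound-constrained weights:

import numpy as np

# Minimal sketch of an exponentially weighted k-means objective for soft
# projected clustering. The exponent beta and the normalization of W are
# illustrative assumptions, not the paper's exact formulation.

def weighted_kmeans_objective(X, labels, centroids, W, beta=8.0):
    """Sum over clusters of per-dimension squared error scaled by w**beta.

    X: (n, d) data; labels: (n,) cluster assignments;
    centroids: (k, d); W: (k, d) per-cluster variable weights, rows sum to 1.
    """
    total = 0.0
    for k in range(centroids.shape[0]):
        members = X[labels == k]
        if len(members) == 0:
            continue
        sq_err = (members - centroids[k]) ** 2      # (m, d)
        total += np.sum((W[k] ** beta) * sq_err)    # weights enter exponentially
    return total

A PSO particle would encode the flattened weight matrix W, with positions kept in the bound-constrained box [0, 1] and each row renormalized to sum to one before evaluating the objective.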
7.
Dionysios Efstathiou Andreas Koutsopoulos Sotiris Nikoletseas 《Simulation Modelling Practice and Theory》2011,19(10):2226-2243
We study the problem of greedy, single-path data propagation in wireless sensor networks, aiming mainly to minimize energy dissipation. In particular, we first mathematically analyze and experimentally evaluate the energy efficiency and latency of three characteristic protocols, each one selecting the next-hop node with respect to a different criterion (minimum projection, minimum angle, and minimum distance to the destination). Our analytic and simulation findings suggest that no single criterion simultaneously satisfies both energy efficiency and low latency. Towards parameterized energy–latency trade-offs, we also provide hybrid combinations of the two criteria (direction and proximity to the sink). Our hybrid protocols achieve significant performance gains and allow fine-tuning of the desired performance. They also have nice energy-balance properties and can prolong the network lifetime.
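As one plausible formalization of the three criteria (a sketch based on the abstract's wording, not the authors' exact definitions), next-hop selection for a current node u, its candidate neighbors, and a sink s might look like:

import math

# Sketch of greedy next-hop selection under three criteria. The geometric
# reading of "projection" below is a plausible interpretation, not the
# paper's exact definition.

def angle(u, v, s):
    """Angle at u between the directions u->v and u->s."""
    a = (v[0] - u[0], v[1] - u[1])
    b = (s[0] - u[0], s[1] - u[1])
    dot = a[0] * b[0] + a[1] * b[1]
    na, nb = math.hypot(*a), math.hypot(*b)
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def next_hop(u, neighbors, s, criterion="distance"):
    if criterion == "distance":      # neighbor closest to the sink
        key = lambda v: math.dist(v, s)
    elif criterion == "angle":       # neighbor best aligned with u->s
        key = lambda v: angle(u, v, s)
    elif criterion == "projection":  # smallest remaining projected distance
        key = lambda v: math.dist(u, s) - math.dist(u, v) * math.cos(angle(u, v, s))
    return min(neighbors, key=key)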
8.
《Computers & Education》2013,60(4):1246-1256
In this paper, an online game was developed in the form of a competitive board game for conducting web-based problem-solving activities. The participants of the game determined their moves by throwing a die. Each location on the game board corresponds to a gaming task, which could be a web-based information-searching question or a mini-game; the former was used to guide the participants to search for information to answer a series of questions related to the target learning issue, while the latter was used to provide supplementary materials during the gaming process. To evaluate the performance of the proposed approach, an experiment was conducted in an elementary school natural science course. The experimental results showed that the proposed approach not only significantly promoted the flow experience, learning attitudes, learning interest and technology acceptance of the students, but also improved their learning achievement in the web-based problem-solving activity.
9.
10.
Fuu-Cheng Jiang Chu-Hsing Lin Der-Chen Huang Chao-Tung Yang 《The Journal of supercomputing》2012,59(1):268-296
The operational patterns of multifarious backup strategies for AODV-based (Ad hoc On-Demand Distance Vector) routing protocols are elaborated in this article. To give a broader picture of the relevant routing protocols, variants of AODV-based backup routing protocols are formulated by corresponding algorithms, and each of them is simulated to obtain the performance metrics needed for comparison in terms of packet delivery ratio, average latency, and normalized routing load. Then, to make the process of data salvation more efficient in case of link failure, we explore the possibility of combining the AODV backup routing strategy with on-demand node-disjoint multipath routing protocols. This article proposes an improved approach named DPNR (Dual Paths Node-disjoint Routing) for data salvation, a routing protocol that maintains only the two shortest backup paths at the source and destination nodes. The DPNR scheme can alleviate the redundant-frame overhead incurred during data salvation by the neighboring intermediate nodes. Our simulation results demonstrate that the DPNR scheme delivers good data delivery performance while restricting the impact of transmission collisions and channel contention. The mathematical rationale for the proposed approach is stated as well.
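DPNR itself is an on-demand protocol, but its core idea of keeping two node-disjoint paths can be illustrated at the graph level. The sketch below (shortest path first, then a second path avoiding the first path's intermediate nodes) is a simplistic approximation for illustration only, not the paper's route-discovery procedure:

from collections import deque

# Illustrative graph-level approximation of selecting two node-disjoint
# paths between a source and a destination. DPNR maintains such paths via
# on-demand routing messages; this sketch only conveys the disjointness idea.

def bfs_path(adj, src, dst, banned=frozenset()):
    parent = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v in adj[u]:
            if v not in parent and v not in banned:
                parent[v] = u
                q.append(v)
    return None

def two_node_disjoint_paths(adj, src, dst):
    p1 = bfs_path(adj, src, dst)
    if p1 is None:
        return None, None
    p2 = bfs_path(adj, src, dst, banned=set(p1[1:-1]))  # skip p1's relays
    return p1, p2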
11.
The Naïve Bayes Classifier (NBC) is widely used for classification in machine learning. It is considered a first choice for many classification problems because of its simplicity and its classification accuracy compared to other supervised learning methods. However, for high-dimensional data such as gene expression data, it does not perform well due to two major limitations, namely underflow and overfitting. To address underflow, the existing approach is to add the logarithms of probabilities rather than multiplying the probabilities, and an estimation approach is used to mitigate overfitting. In practice, however, these approaches do not perform well for gene expression data. In this paper, a novel approach is proposed to overcome these limitations using a robust function for estimating probabilities in the Naïve Bayes Classifier. The proposed method not only resolves the limitations of NBC but also improves classification accuracy for gene expression data. The method has been tested on several high-dimensional benchmark gene expression datasets. Comparative results of the proposed Robust Naïve Bayes Classifier (R-NBC) and the existing NBC on gene expression data are presented to highlight the effectiveness of R-NBC. A simulation study is also performed to demonstrate the robustness of R-NBC over the existing approaches.
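The underflow remedy mentioned above, summing log-probabilities instead of multiplying raw probabilities, can be sketched as follows; the Gaussian class-conditionals and the variance floor are illustrative assumptions, not the paper's robust estimating function:

import numpy as np

# Underflow-safe naive Bayes scoring: sum log-probabilities instead of
# multiplying probabilities, so thousands of tiny factors never vanish to 0.
# Gaussian likelihoods and the variance floor are illustrative assumptions.

def log_posteriors(x, class_priors, means, variances, var_floor=1e-9):
    """Return unnormalized log P(class | x) for one sample x of shape (d,)."""
    scores = []
    for prior, mu, var in zip(class_priors, means, variances):
        var = np.maximum(var, var_floor)
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        scores.append(np.log(prior) + log_lik)  # sum of logs, no underflow
    return np.array(scores)

def predict(x, class_priors, means, variances):
    return int(np.argmax(log_posteriors(x, class_priors, means, variances)))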
12.
《Computer Standards & Interfaces》2014,36(1):110-121
Data-Centric Publish–Subscribe (DCPS) is an architectural and communication paradigm in which applications exchange data content through a common data space using publish–subscribe interactions. Due to its focus on data content, DCPS is especially suitable for deploying IoT systems. However, some problems must be solved to support large deployments. In this paper we define a novel extension to the IETF REsource LOcation And Discovery (RELOAD) protocol specification for providing content discovery and transfer in large-scale IoT deployments. We have conducted a set of experiments over multiple simulated networks of 500 to 10,000 nodes that demonstrate the viability, scalability, and robustness of our proposal.
13.
A New Method of Solving Kernels in Algebraic Decomposition for the Synthesis of Logic Cell Array
In this paper, an improvement is made to the algorithm for solving the kernels of the new functions that are generated after a common divisor appearing in the original functions is replaced by a new intermediate variable, and an efficient method based on kernel heritage is presented. This method has been successfully used in the synthesis of LCAs (Logic Cell Arrays).
14.
Data are considered to be important organizational assets because of their assumed value, including their potential to improve organizational decision-making processes. Such potential value, however, comes with various costs, including those of acquiring, storing, securing and maintaining the given assets at appropriate quality levels. Clearly, if these costs outweigh the value that results from using the data, it would be counterproductive to acquire, store, secure and maintain the data. Thus cost–benefit assessment is particularly important in data warehouse (DW) development; yet very few techniques are available for determining the value that the organization will derive from storing a particular data table, and hence for determining which data set should be loaded into the DW. This research seeks to address the issue of identifying the set of data with the potential for producing the greatest net value for the organization by offering a model that can be used to perform a cost–benefit analysis on the decision support views that the warehouse can support, and by providing techniques for estimating the parameters necessary for this model.
15.
In some retrospective observational studies, the subject is asked to recall the age at a particular landmark event. The resulting data may be partially incomplete because of the inability of the subject to recall. This type of incompleteness may be regarded as interval censoring, where the censoring is likely to be informative. The problem of fitting Cox’s relative risk regression model to such data is considered. While a partial likelihood is not available, a method of semi-parametric inference of the regression parameters as well as the baseline distribution is proposed. Monte Carlo simulations show reasonable performance of the regression parameters, compared to Cox estimators of the same parameters computed from the complete version of the data. The proposed method is illustrated through the analysis of data on age at menarche from an anthropometric study of adolescent and young adult females in Kolkata, India.
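For reference, the relative risk model being fitted is the standard Cox form, and under interval censoring a subject known only to have experienced the event in (L, R] contributes the corresponding probability mass (a textbook formulation, not the paper's exact semi-parametric likelihood):

% Cox relative risk (proportional hazards) model
\lambda(t \mid x) = \lambda_0(t)\, \exp(\beta^\top x),
\qquad S(t \mid x) = S_0(t)^{\exp(\beta^\top x)}

% likelihood contribution of a subject interval-censored into (L, R]
P(L < T \le R \mid x) = S(L \mid x) - S(R \mid x)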
16.
The main focus of data distribution management (DDM) in the HLA is to reduce the amount of data received by federates in large-scale distributed simulations. The use of limited multicast resources plays a key role in DDM performance. To improve the performance of DDM by using communication protocols effectively, a hybrid multicast–unicast data transmission problem and its formal definition are presented, and a hybrid multicast–unicast assignment approach is proposed. The approach uses a new adaptive communication protocol selection (ACPS) strategy to exploit the advantages of multicast and unicast, avoid their disadvantages, and consider the inter-relationships between connections. It includes an ACPS static assignment algorithm and an ACPS dynamic assignment algorithm, reflecting the difference between static and dynamic connections. In our approach, a concept of distance is introduced to measure the inter-relationship between connections for multicast and the message redundancy for unicast; this distance is the core of the two algorithms, used to gather connections into a multicast group or to balance the use of unicast and multicast for best performance. As a result, our algorithms can more effectively decide whether a new connection should use unicast or multicast communication, and whether adjusting a previous assignment can further improve performance. In addition, a control mechanism is introduced to deal with connection changes during dynamic assignment. The experimental results indicate that our algorithms utilize multicast and unicast communication resources effectively and achieve better performance than existing methods in a real running environment.
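As a hypothetical reading of the distance-based assignment idea (the distance measure, threshold, and receiver-set representation below are assumptions, not the paper's definitions), a new connection could be routed to the closest multicast group when receiver sets overlap enough, and over unicast otherwise:

# Hypothetical sketch of adaptive protocol selection in the spirit of ACPS:
# join the nearest multicast group if the "distance" (here an assumed
# overlap-based dissimilarity of receiver sets) is small enough, otherwise
# fall back to unicast. Measure and threshold are not the paper's.

def jaccard_distance(receivers_a, receivers_b):
    a, b = set(receivers_a), set(receivers_b)
    return 1.0 - len(a & b) / len(a | b) if (a | b) else 0.0

def assign_connection(new_receivers, multicast_groups, threshold=0.3):
    """multicast_groups: list of receiver sets, one per existing group."""
    if multicast_groups:
        d, idx = min(
            (jaccard_distance(new_receivers, g), i)
            for i, g in enumerate(multicast_groups)
        )
        if d <= threshold:
            return ("multicast", idx)  # join the closest group
    return ("unicast", None)           # too dissimilar: send point-to-point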
17.
It has been experimentally demonstrated by Faugère that his F5 algorithm is the fastest algorithm for computing Gröbner bases. The computational efficiency of F5 is due not only to applying linear algebra but also to using the new F5 criterion for revealing useless zero reductions. At the ISSAC 2010 conference, Gao, Guan, and Volny presented G2V, a new version of the F5 algorithm that is simpler than the original. However, the incremental structure of G2V, used for applying the F5 criterion, is a serious obstacle to applying Buchberger's second criterion. In this paper, a modification of the G2V algorithm is presented that makes it possible to use both Buchberger criteria. To study the computational effect of the proposed modification experimentally, we implemented the modified algorithm in Maple. Results of comparing G2V and its modified version on a number of test examples are presented.
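For context, the two Buchberger criteria in question are the standard ones (textbook statements, independent of G2V's specifics):

% First (product) criterion: if the leading monomials of f and g are coprime,
%   \operatorname{lcm}(\operatorname{lm} f, \operatorname{lm} g)
%     = \operatorname{lm} f \cdot \operatorname{lm} g,
% then the S-polynomial S(f, g) reduces to zero and the pair may be skipped.

% Second (chain) criterion: if some h in the current basis satisfies
%   \operatorname{lm} h \mid \operatorname{lcm}(\operatorname{lm} f, \operatorname{lm} g)
% and the pairs (f, h) and (h, g) have already been treated,
% then the pair (f, g) may also be discarded.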
18.
19.
Jörg Becker Patrick Delfmann Hanns-Alexander Dietrich Matthias Steinhorst Mathias Eggert 《Information Systems Frontiers》2016,18(2):359-405
Given the strong increase in regulatory requirements for business processes, the management of business process compliance is becoming an increasingly prominent field in IS research. Several methods have been developed to support compliance checking of conceptual models. However, their focus on distinct modeling languages and mostly linear (i.e., predecessor-successor related) compliance rules may hinder widespread adoption and application in practice. Furthermore, hardly any of them has been evaluated in a real-world setting. We address this issue by applying a generic pattern matching approach for conceptual models to business process compliance checking in the financial sector. It consists of a model query language, a search algorithm and a corresponding modelling tool prototype. It is applicable (1) to all graph-based conceptual modeling languages and (2) to different kinds of compliance rules. Furthermore, based on an applicability check, we (3) evaluate the approach in a financial industry project setting with respect to its relevance for decision support in audit and compliance management tasks.
20.
R. K. Teruiya A. R. Dos Santos R. Dall'Agnol P. Veneziani 《International journal of remote sensing》2013,34(13):3957-3974
This study examines the value of integrating airborne C-band Synthetic Aperture Radar (SAR), Landsat Thematic Mapper (TM) and airborne gamma spectroradiometric data with field, petrographic and geochemical data for geological mapping of the Cigano batholith, an important representative of the Paleoproterozoic granitic magmatism in the Carajás Mineral Province (CMP), Amazon region. Distinct schemes for the integration of radar/optical and radar/optical/gamma data were evaluated, and the geological information derived from the integrated products was verified in the field. The investigation allowed the re-evaluation of previous geological information and the definition of distinct domains within the Cigano pluton and country rocks. The importance of brittle structures related to the tectonic evolution of the area and the location of intensely altered zones in the pluton was emphasized, favouring new insights into current geological and exploratory models of the area. The application of a similar approach as an operational routine in exploration programmes in the Amazon region is justified considering the limited geological information, the availability of aerogeophysical data and airborne/spaceborne remote sensing data (radar, optical), and the high costs of field mapping in this kind of terrain.