1.
Data Grid is a geographically distributed environment that deals with large-scale data-intensive applications. Effective scheduling in the Grid can reduce the amount of data transferred among nodes by submitting a job to a node where most of the requested data files are available. Data replication is another key optimization technique for reducing access latency and managing large data sets by storing data in a wise manner. In this paper two algorithms are proposed. The first is a novel job scheduling algorithm called the Combined Scheduling Strategy (CSS), which uses hierarchical scheduling to reduce the search time for an appropriate computing node. It considers the number of jobs waiting in the queue, the location of the data required by the job, and the computing capacity of the sites. The second is a dynamic data replication strategy called the Modified Dynamic Hierarchical Replication Algorithm (MDHRA), which improves file access time. This strategy is an enhanced version of the Dynamic Hierarchical Replication (DHR) strategy. Data replication must be used wisely because the storage capacity of each Grid site is limited, so it is important to design an effective replica replacement strategy. MDHRA replaces replicas based on the last time the replica was requested, the number of accesses, and the size of the replica. It selects the best replica location from among the many replicas based on response time, which is determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. The simulation results demonstrate that the proposed replication and scheduling strategies give better performance than the other algorithms.
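The replacement and selection rules described in this abstract can be sketched as follows. This is an illustrative reading only: the scoring formula, weights, and field names are assumptions, not the paper's actual MDHRA definitions.

```python
def replica_value(last_request, access_count, size, now):
    # Assumed scoring: recently and frequently requested replicas are more
    # valuable, and among equally used replicas larger ones are kept because
    # re-fetching them costs more. The multiplicative form is an assumption.
    age = max(now - last_request, 1.0)
    return (access_count * size) / age

def evict_until_fits(replicas, needed, free, now):
    # Delete the lowest-value replicas until the new replica fits
    # (sketch of the replacement step; each replica is a dict with
    # hypothetical keys "name", "last_request", "accesses", "size").
    victims = []
    for r in sorted(replicas, key=lambda r: replica_value(
            r["last_request"], r["accesses"], r["size"], now)):
        if free >= needed:
            break
        free += r["size"]
        victims.append(r["name"])
    return victims, free

def best_replica(sites):
    # Estimated response time: transfer time (scaled by hop count as a
    # stand-in for distance) + storage latency + queueing delay for the
    # requests already waiting. All field names are hypothetical.
    def response_time(s):
        transfer = s["size_mb"] / s["bandwidth_mbps"] * s["hops"]
        queue = s["queued_requests"] * s["avg_service_s"]
        return transfer + s["storage_latency_s"] + queue
    return min(sites, key=response_time)
```

Under this reading, replacement and selection are independent greedy decisions: eviction frees just enough space, and the read path picks the site with the smallest estimated response time.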
2.
The Data Grid provides massive aggregated computing resources and distributed storage space to deal with data-intensive applications. Because of the limited resources available in the Grid and the large volumes of data produced, efficient use of Grid resources becomes an important challenge. Data replication is a key optimization technique for reducing access latency and managing large data sets by storing data in a wise manner. Effective scheduling in the Grid can reduce the amount of data transferred among nodes by submitting a job to a node where most of the requested data files are available. In this paper two strategies are proposed. The first is a novel job scheduling strategy called the Weighted Scheduling Strategy (WSS), which uses hierarchical scheduling to reduce the search time for an appropriate computing node. It considers the number of jobs waiting in a queue, the location of the data required by the job, and the computing capacity of the sites. The second is a dynamic data replication strategy called Enhanced Dynamic Hierarchical Replication (EDHR), which improves file access time. This strategy is an enhanced version of the Dynamic Hierarchical Replication strategy. It uses an economic model for file deletion when there is not enough space for a new replica; the economic model is based on the future value of a data file. Best replica placement plays an important role in obtaining maximum benefit from replication as well as in reducing storage cost and mean job execution time, so it is also considered in this paper. The proposed strategies are implemented in OptorSim, the European Data Grid simulator. Experimental results show that the proposed strategies achieve better performance by minimizing data access time and avoiding unnecessary replication.
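The economic-model deletion step described above can be sketched as follows. The prediction function and field names are assumptions for illustration; the abstract does not give EDHR's actual future-value formula.

```python
def predicted_future_value(access_history, window=5):
    # Assumed economic model: estimate a file's future demand from the
    # mean of its recent access counts (the paper's exact predictor is
    # not specified in the abstract).
    recent = access_history[-window:]
    return sum(recent) / len(recent) if recent else 0.0

def make_room(files, needed, free):
    # Delete the files with the lowest predicted future value until the
    # new replica fits. Each file is a dict with hypothetical keys
    # "name", "history" (per-period access counts), and "size".
    deleted = []
    for f in sorted(files, key=lambda f: predicted_future_value(f["history"])):
        if free >= needed:
            break
        free += f["size"]
        deleted.append(f["name"])
    return deleted, free
```

The point of such a model, compared with plain LRU, is that a file with a rising access trend is kept even if its last access is not the most recent.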
3.
In this article, the ballistic behavior of glass/epoxy/nanoclay hybrid nanocomposites is studied. The glass fiber used is a 200 g/m² plain weave, while the nanoclay is an organically modified montmorillonite (Cloisite 30B). The epoxy resin system consists of Epon 828 as the epoxy prepolymer and Jeffamine D‐400 as the curing agent. Nanoclay particles are dispersed in the epoxy resin at 0, 3, 5, 7, and 10 wt%. Ballistic tests are performed using flat‐ended projectiles at impact velocities of 134 m/s and 169 m/s. The results show that the energy absorption capability and mechanical properties of the composite can be significantly enhanced by adding nanoparticles. At an impact velocity of 134 m/s, near the ballistic limit, the greatest increase in energy absorption capability is observed at 3 wt% nanoclay, while at 169 m/s, beyond the ballistic limit, the greatest increase is observed at 10 wt% nanoclay. POLYM. COMPOS., 37:1173–1179, 2016. © 2014 Society of Plastics Engineers
4.
Catalysts have a major role in the polymerization of olefins and exert their influence in three ways: (1) polymerization behaviour, including polymerization activity and kinetics; (2) polymer particle morphology, including bulk density, particle size, particle size distribution and particle shape; and (3) polymer microstructure, including molecular weight regulation, chemical composition distribution and short‐ and long‐chain branching. By tailoring the catalyst structure, such as creating a bridge or introducing a substituent on the ligand, metallocene catalysts can play a major role in achieving desirable properties. Kinetic profiles of the metallocene catalyst used in this study showed decay‐type behaviour for copolymerization of ethylene/α‐olefins. It was observed that increasing the comonomer ratio in the feedstock affected physical properties, reducing the melting temperature, crystallinity, density and molecular weight of the copolymers. It was also observed that the heterogeneity of the chemical composition distribution and the physical properties were enhanced as the comonomer molecular weight was increased. In particular, 2‐phenyl substitution on the indenyl ring somewhat reduced the melting point of the copolymers. In addition, the copolymer produced using the bis(2‐phenylindenyl)zirconium dichloride (bis(2‐PhInd)ZrCl2) catalyst exhibited a narrower distribution of lamellae (0.3–0.9 nm) than the polymer produced using the bisindenylzirconium dichloride catalyst (0.5–3.6 nm). The results obtained indicate that the bis(2‐PhInd)ZrCl2 catalyst showed good comonomer incorporation ability. The heterogeneity of the chemical composition distribution and the physical properties were influenced by the type of comonomer and the type of substituent in the catalyst. Copyright © 2010 Society of Chemical Industry
5.
In this paper we have designed an acceptance single sampling plan with inspection errors for the case where the fraction of defective items is a fuzzy number. We have shown that the operating characteristic curve of this plan is a band with upper and lower bounds, whose width depends on the ambiguity of the proportion parameter in the lot when the sample size and acceptance number are fixed. Single sampling plans with and without inspection errors were compared to study the effect of the errors on the plan's characteristics. The comparison shows that, for good process quality, the sampling plan with inspection errors has a lower operating characteristic band than the plan without inspection errors. We have also shown that incorrect classification of a good item reduces the fuzzy probability of acceptance, while incorrect classification of a defective item results in a higher fuzzy probability of acceptance.
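The effect of inspection errors on the operating characteristic (OC) band can be sketched with the standard inspection-error model, taking the fuzzy fraction defective as a simple interval. This is an assumed formulation, not the paper's fuzzy-set machinery: e1 denotes the probability of classifying a good item as defective, e2 the probability of classifying a defective item as good.

```python
from math import comb

def apparent_defect_rate(p, e1, e2):
    # Standard inspection-error model (assumed): an item appears
    # defective if it is defective and not missed, or good but
    # misclassified.
    return p * (1 - e2) + (1 - p) * e1

def accept_prob(p, n, c, e1=0.0, e2=0.0):
    # Binomial probability of at most c apparent defectives in a
    # sample of n (the single sampling plan's acceptance rule).
    pe = apparent_defect_rate(p, e1, e2)
    return sum(comb(n, d) * pe**d * (1 - pe)**(n - d) for d in range(c + 1))

def oc_band(p_low, p_high, n, c, e1=0.0, e2=0.0):
    # With an interval-valued fraction defective the OC "curve" becomes
    # a band: the worst case (p_high) gives the lower bound, the best
    # case (p_low) the upper bound.
    return accept_prob(p_high, n, c, e1, e2), accept_prob(p_low, n, c, e1, e2)
```

This reproduces the abstract's two qualitative claims: e1 > 0 inflates the apparent defect rate when true quality is good, lowering the acceptance probability, while e2 > 0 deflates it, raising the acceptance probability.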
6.
SAPO-34 nanocrystals (an inorganic filler) were incorporated into polyurethane membranes and the permeation properties of CO2, CH4, and N2 gases were explored. The synthesized PU–SAPO-34 mixed matrix membranes (MMMs) were characterized via SEM, AFM, TGA, XRD and FTIR analyses. Gas permeation properties of PU–SAPO-34 MMMs with SAPO-34 contents of 5 wt%, 10 wt% and 20 wt% were investigated. The permeation results revealed that the presence of 20 wt% SAPO-34 resulted in 4.45%, 18.24% and 40.2% reductions in the permeability of CO2, CH4, and N2, respectively, compared to the neat polyurethane membrane. The findings also showed that at a pressure of 1.2 MPa, incorporating 20 wt% SAPO-34 into the polyurethane membranes enhanced the CO2/CH4 and CO2/N2 selectivities by 14.43% and 37.46%, respectively. In this research, PU containing 20 wt% SAPO-34 showed the best separation performance. For the first time, polynomial regression (PR), a simple yet accurate tool, yielded a mathematical equation for predicting the permeabilities with high accuracy (R² > 99%).
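A polynomial regression of the kind mentioned above can be sketched as an ordinary least-squares fit of permeability against an operating variable (e.g. filler loading or pressure). The model degree and the data used in any call are hypothetical; the paper's fitted equation and coefficients are not reproduced here.

```python
def polyfit(xs, ys, degree):
    # Ordinary least-squares polynomial fit via the normal equations
    # (pure-Python sketch). Returns coefficients [c0, c1, ..., cd] for
    # c0 + c1*x + ... + cd*x^d.
    m = degree + 1
    # Build the normal-equation system X^T X a = X^T y.
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution on the upper-triangular system.
    coeffs = [0.0] * m
    for i in range(m - 1, -1, -1):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, m))) / A[i][i]
    return coeffs

def predict(coeffs, x):
    # Evaluate the fitted polynomial at x.
    return sum(c * x ** i for i, c in enumerate(coeffs))
```

Once fitted, such a closed-form polynomial lets permeability at intermediate filler loadings be estimated without further permeation experiments, which is the practical appeal the abstract claims for PR.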
7.
Network anomaly detection is one of the most challenging fields in cyber security. Most of the proposed techniques have high computational complexity or are based on heuristic approaches. This paper proposes a novel two-tier classification model based on machine learning approaches: Naïve Bayes, a certainty-factor voting version of the KNN classifier, and Linear Discriminant Analysis for dimensionality reduction. Experimental results show a desirable and promising gain in detection rate and false alarm rate compared with other existing models. The model is also trained on two balanced training sets generated with the SMOTE method, to evaluate the chosen similarity measure for dealing with imbalanced network anomaly data sets. The two-tier model provides low computation time, thanks to optimal dimensionality reduction and feature selection, as well as a good detection rate against rare and complex attack types such as User to Root and Remote to Local, which are especially dangerous because of their close similarity to normal behavior. All evaluations are performed on the NSL-KDD data set.
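The two-tier idea can be sketched as follows. Distance-weighted voting is used here as a simple stand-in for the paper's certainty-factor voting (whose exact formula is not given in the abstract), and the tier split (normal vs. anomaly first, attack type second) is an assumed reading.

```python
from collections import defaultdict
from math import dist

def knn_predict(train, query, k=3):
    # Distance-weighted KNN vote over the k nearest training samples;
    # train is a list of (feature_tuple, label) pairs.
    neighbors = sorted(train, key=lambda t: dist(t[0], query))[:k]
    votes = defaultdict(float)
    for features, label in neighbors:
        votes[label] += 1.0 / (1e-9 + dist(features, query))
    return max(votes, key=votes.get)

def two_tier_classify(sample, tier1, tier2):
    # Tier 1 separates normal from anomalous traffic; only anomalies
    # reach tier 2 for fine-grained attack-type classification, which
    # keeps the expensive classifier off the (majority) normal traffic.
    if tier1(sample) == "normal":
        return "normal"
    return tier2(sample)
```

In the paper's pipeline, LDA would first project the NSL-KDD features into a lower-dimensional space before either tier runs; that projection step is omitted in this sketch.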
8.
The rapid growth in demand for computational power has led to a shift to the cloud computing model, established by large-scale virtualized data centers. Such data centers consume enormous amounts of electrical energy. Cloud providers must ensure that their service delivery is flexible enough to meet various consumer requirements; however, to support green computing, they also need to minimize the energy consumption of the cloud infrastructure while delivering those services. In this paper, a novel QoS-aware VM consolidation approach for cloud environments is proposed that adopts a method based on the resource utilization history of virtual machines. The proposed algorithms have been implemented and evaluated using the CloudSim simulator. Simulation results show improvement in QoS metrics and energy consumption, and demonstrate that there is a trade-off between energy consumption and quality of service in the cloud environment.
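A utilization-history-based consolidation step can be sketched as below. The threshold rule and the migration heuristic are illustrative assumptions; the abstract does not specify the paper's actual detection or selection policies.

```python
from statistics import mean, stdev

def is_overloaded(cpu_history, safety=1.0):
    # Hypothetical history-based test: a host is flagged as overloaded
    # when its recent mean CPU utilization plus a (scaled) standard
    # deviation exceeds full capacity (1.0). Using the history rather
    # than the instantaneous value damps reaction to transient spikes.
    if len(cpu_history) < 2:
        return bool(cpu_history) and cpu_history[-1] > 1.0
    return mean(cpu_history) + safety * stdev(cpu_history) > 1.0

def select_vm_to_migrate(vms):
    # Assumed heuristic: migrate the VM with the highest CPU share, so
    # a single migration frees the most capacity on the overloaded host.
    return max(vms, key=lambda v: v["cpu"])
```

The QoS/energy trade-off the abstract mentions shows up directly in the `safety` parameter: a larger value migrates earlier (better QoS, more hosts kept on), a smaller value consolidates more aggressively (less energy, more SLA risk).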
9.
Using a continuous flow apparatus, the ternary solubility of mono- and di-tert-butyl ethers of glycerol (MTBG and DTBG, respectively) in supercritical carbon dioxide was measured at temperatures of 313.15, 333.15, and 348.15 K; over a pressure range of 80-200 bar; and at an expanded gas flow rate of 180 ± 10 mL min−1, at an average laboratory temperature of 300.15 K and pressure of 0.89 bar. The ternary solubility of the ethers at the constant temperatures of 333.15 and 348.15 K increased with increasing pressure up to the crossover point (152 bar for MTBG and 170 bar for DTBG). MTBG exhibited higher solubility than DTBG in scCO2. The experimental ternary solubility data for MTBG and DTBG were correlated using the Bartle equation.
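The Bartle correlation used above is commonly written as ln(y·p/p_ref) = a + b/T + c·(ρ − ρ_ref), where y is the solute mole fraction, ρ the CO2 density, and a, b, c solute-specific fitted constants. A minimal sketch of the model follows; any coefficient values passed in would be hypothetical, not the paper's fitted values for MTBG or DTBG.

```python
from math import exp

P_REF = 1.0      # reference pressure, bar (conventional choice)
RHO_REF = 700.0  # reference CO2 density, kg/m^3 (conventional choice)

def bartle_solubility(T, p, rho, a, b, c):
    # Bartle semi-empirical density-based correlation (assumed form):
    #   ln(y * p / P_REF) = a + b / T + c * (rho - RHO_REF)
    # Solving for the solute mole fraction y at temperature T (K),
    # pressure p (bar), and CO2 density rho (kg/m^3).
    return exp(a + b / T + c * (rho - RHO_REF)) * P_REF / p
```

Because c is positive for these systems, solubility rises with CO2 density at fixed temperature, which is consistent with the pressure dependence reported up to the crossover point.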