Similar Documents
20 similar documents found (search time: 15 ms)
1.
Computers & Chemistry, 1993, 17(3): 257-263
The use of a combination of computer modelling and tritium radiolabelling is described as a technique for the investigation of the kinetics and mechanism of the autocatalytic reactions occurring in epoxy resin cure. Rate constants for substituted anilines reacting with phenylglycidylether have been derived and the effect of steric hindrance investigated. The program is described and its use in the explanation and understanding of the experimental data is illustrated.
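Autocatalytic cure kinetics of the kind investigated here are commonly described by a Kamal-type rate law, dα/dt = (k1 + k2·α^m)(1 − α)^n. A minimal sketch of integrating such a model (the rate constants k1, k2 and exponents m, n below are illustrative, not values from the paper):

```python
# Kamal-type autocatalytic cure model: da/dt = (k1 + k2*a^m) * (1 - a)^n,
# integrated with a simple forward-Euler loop. All constants are illustrative.

def cure_profile(k1, k2, m, n, dt=0.01, t_end=50.0):
    """Return lists of times and degrees of cure alpha(t), starting from alpha=0."""
    alpha, t = 0.0, 0.0
    times, alphas = [t], [alpha]
    while t < t_end:
        rate = (k1 + k2 * alpha**m) * (1.0 - alpha)**n
        alpha = min(1.0, alpha + rate * dt)  # cure cannot exceed 100%
        t += dt
        times.append(t)
        alphas.append(alpha)
    return times, alphas

times, alphas = cure_profile(k1=0.01, k2=0.3, m=1.0, n=2.0)
```

The k1 term gives the non-zero initial rate; the k2·α^m term produces the characteristic autocatalytic acceleration as cure proceeds.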

2.
Fully atomistic molecular dynamics (MD) simulations were used to predict the properties of diglycidyl ether of bisphenol F (DGEBF) crosslinked with curing agent diethyltoluenediamine (DETDA). This polymer is a commercially important epoxy resin and a candidate for applications in nanocomposites. The calculated properties were density and bulk modulus (at near-ambient pressure and temperature) and glass transition temperature (at near-ambient pressure). The molecular topology, degree of curing, and MD force-field were investigated as variables. The models were created by densely packing pre-constructed oligomers of different composition and connectivity into a periodic simulation box. For high degrees of curing (greater than 90%), the density was found to be insensitive to the molecular topology and precise value of degree of curing. Of the two force-fields that were investigated, cff91 and COMPASS, the latter clearly gave more accurate values for the density as compared to experiment. In fact, the density predicted by COMPASS was within 6% of reported experimental values for the highly crosslinked polymer. The predictions of both force-fields for glass transition temperature were within the range of reported experimental values, with the predictions of cff91 being more consistent with a highly cured resin.
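In MD studies like this one, the glass transition temperature is typically extracted by fitting straight lines to the glassy and rubbery branches of the density-temperature curve and taking their intersection. A sketch of that extraction with synthetic bilinear data (not the paper's results):

```python
import numpy as np

def tg_from_density(T, rho, t_split):
    """Fit lines to the glassy (T < t_split) and rubbery (T >= t_split)
    branches of rho(T); their intersection estimates Tg."""
    glassy = T < t_split
    m1, b1 = np.polyfit(T[glassy], rho[glassy], 1)
    m2, b2 = np.polyfit(T[~glassy], rho[~glassy], 1)
    return (b2 - b1) / (m1 - m2)

# Synthetic density data with a slope change (kink) at 400 K, illustrative only.
T = np.linspace(300, 500, 21)
rho = np.where(T < 400, 1.20 - 2e-4 * (T - 300), 1.18 - 6e-4 * (T - 400))
tg = tg_from_density(T, rho, t_split=400)
```

In practice the split point is chosen by inspecting the cooling curve; the steeper thermal-expansion slope above Tg is what makes the intersection well defined.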

3.

4.
A fiber Bragg grating (FBG) was packaged in epoxy resin mixed with a curing/toughening agent at a fixed mass ratio. After packaging, the FBG showed excellent linearity in both strain and temperature sensing, with correlation coefficients above 0.99; the strain and temperature sensitivity coefficients reached 1.8 pm/με and 144.9 pm/°C, respectively. Compared with test results for a bare FBG, the strain sensitivity coefficient was 1.64 times higher and the temperature sensitivity coefficient 14.3 times higher, and the compressive stren…
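The reported sensitivity coefficients can be applied directly to convert a measured Bragg-wavelength shift into strain or temperature. A single-parameter sketch that ignores strain-temperature cross-sensitivity (i.e. one quantity is held fixed while the other is read out):

```python
K_EPS = 1.8    # strain sensitivity, pm per microstrain (value from the abstract)
K_T   = 144.9  # temperature sensitivity, pm per degree Celsius (from the abstract)

def strain_from_shift(d_lambda_pm):
    """Bragg-wavelength shift (pm) -> strain (microstrain), temperature held fixed."""
    return d_lambda_pm / K_EPS

def temperature_from_shift(d_lambda_pm):
    """Bragg-wavelength shift (pm) -> temperature change (degC), strain held fixed."""
    return d_lambda_pm / K_T

# Example: a 180 pm shift under pure strain corresponds to 100 microstrain.
strain = strain_from_shift(180.0)
```

Separating the two effects in a real measurement requires a reference grating or a dual-parameter scheme, which is outside this sketch.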

5.
Molding technologies associated with fabricating macro scale polymer components such as injection molding and hot embossing have been adapted with considerable success for fabrication of polymer microparts. While the basic principles of the process remain the same, the precision with which the processing parameters need to be controlled especially in the case of molding high aspect ratio (HAR) polymer microparts into polymer sheets is much greater than in the case of macro scale parts. It is seen that the bulk effects of the mold insert fixture and molding machine have a dominant influence on the molding parameters and that differences in material parameters such as the glass transition temperature (T g) of polymer sheets are critical for the success and typically differ from sheet to sheet. This makes it very challenging to establish standard processing parameters for hot embossing of sheet polymers. In the course of this paper, a methodology for developing a hot embossing process for HAR microstructures based on known material properties and considering the cumulative behavior of mold, material, and machine will be presented. Using this method force–temperature–deflection curves were measured with the intent of fine tuning the hot embossing process. Tests were carried out for different materials using a dummy mold insert yielding information that could be directly transferred to the actual mold insert with minimum development time and no risk of damage to the actual microstructures.

6.
To achieve reliable attachment of a fiber Bragg grating (FBG) to a metal component under test, an embedded packaging technique using epoxy resin doped with metal powder is proposed and the packaging process is described. Strain experiments on a pure-bending beam were carried out with both a bare FBG and the packaged FBG. The results show that the strain sensitivity of the FBG sensor packaged in metal-powder-doped epoxy resin is 1.3 times that of the bare FBG, reaching 1.53 pm/με, with good repeatability. The method also improves the mechanical strength of the metal component after the FBG is embedded.

7.
8.
The synthetic-resin product database system, proposed by the resin institute of a petrochemical company, consists of two parts: a back-end management program and web pages for remote query and data entry. The system maintains performance data for synthetic-resin products and raw materials, and provides query functions to users on the local area network. This paper describes the development process from four aspects: requirements analysis, workflow analysis, selection of the development and runtime environment, and design and implementation, and finally reports on the system in operation. The system is now in practical use and runs well; experience shows that managing the data with it significantly improves productivity.
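A system of this kind centres on product/property tables with maintenance and query paths. A minimal sketch using Python's sqlite3 (the table name and columns are hypothetical, not the system's actual schema):

```python
import sqlite3

# Hypothetical single-table schema for resin product performance data.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE resin_product (
    id         INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    grade      TEXT,
    melt_index REAL)""")

# Maintenance path: insert a product record.
conn.execute(
    "INSERT INTO resin_product (name, grade, melt_index) VALUES (?, ?, ?)",
    ("polypropylene", "T30S", 3.0))

# Query path: the kind of lookup a LAN user would issue.
rows = conn.execute("SELECT name, melt_index FROM resin_product").fetchall()
```

The described system adds a web front end over such queries; the storage and query layer is the part sketched here.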

9.
When evaluating new protocols and algorithms it is often desirable to be able to rapidly build a specialized simulator from scratch. The advantage of this approach is that software designers can avoid the often‐steep learning curve associated with using existing simulators, and that specialized simulators can have significant computational advantages over general‐purpose programs. The drawback is that simulators can be complex to develop and debug. This paper describes our experiences developing SESAME, a Java‐based simulator designed to analyze multicast protocol performance. By using an appropriate set of design patterns we were able to rapidly develop a reusable and easy‐to‐implement framework for multicast algorithm and protocol analysis. We developed several useful approaches for rapid simulation development, including a multicast visualization tool and techniques for simulating concurrent processing within a computer network that substantially simplified code development and maintenance. Owing to performance concerns, we also ran an extensive series of performance tests of different priority queue implementations. Our design approach and the lessons that we learned in developing SESAME can be applied to rapid simulation development in general. Further, our performance results suggest that by using commercial off‐the‐shelf Java development environments it is possible to obtain sufficient performance for many time‐sensitive applications such as discrete event simulation. Copyright © 2001 John Wiley & Sons, Ltd.
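The priority-queue comparison reported for SESAME can be reproduced in miniature. A sketch (in Python rather than the paper's Java) contrasting a binary heap with a naive re-sorted list for ordering simulation events by timestamp:

```python
import heapq
import random

def run_heap(events):
    """Schedule and drain events with a binary heap: O(n log n) total."""
    q = []
    for t in events:
        heapq.heappush(q, t)
    out = []
    while q:
        out.append(heapq.heappop(q))
    return out

def run_sorted_list(events):
    """Naive alternative: keep the pending list fully sorted after every insert."""
    q = []
    for t in events:
        q.append(t)
        q.sort()  # re-sorting each time is the cost a heap avoids
    return q

events = [random.random() for _ in range(2000)]  # random event timestamps
a = run_heap(events)
b = run_sorted_list(events)
# Both drain events in timestamp order; only their cost profiles differ.
```

In a discrete event simulator the queue holds (timestamp, event) pairs and is hit on every scheduling operation, which is why its implementation dominated the performance tests described above.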

10.
A discriminant analysis method for multisensor data   Times cited: 3 (self-citations: 2, other citations: 3)
In multisensor data processing, an object is sometimes observed from several aspects by multiple sensors, and its attributes must be inferred from the resulting data. The essence of such problems is to determine the class to which the object belongs. Drawing on multivariate statistical theory, a concise discriminant method for this problem is proposed.
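A simple concrete instance of such a multivariate discriminant rule assigns an observation to the class whose mean is nearest in Mahalanobis distance, under a pooled-covariance assumption. A sketch (illustrative, not necessarily the paper's exact rule):

```python
import numpy as np

def discriminant(x, means, cov_inv):
    """Assign observation x to the class whose mean is nearest
    in Mahalanobis distance (pooled inverse covariance cov_inv)."""
    d = [float((x - m) @ cov_inv @ (x - m)) for m in means]
    return int(np.argmin(d))

# Two classes observed by two sensors; the means and covariance are illustrative.
means = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
cov_inv = np.linalg.inv(np.array([[1.0, 0.2], [0.2, 1.0]]))
label = discriminant(np.array([2.8, 3.1]), means, cov_inv)
```

Using the Mahalanobis rather than Euclidean distance lets correlated sensor channels (the off-diagonal covariance terms) be weighted correctly.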

11.
This paper reports a new idea-screening method for new product development (NPD) with a group of decision makers having imprecise, inconsistent and uncertain preferences. The traditional NPD analysis method determines the solution using the membership function of fuzzy sets which cannot treat negative evidence. The advantage of vague sets, with the capability of representing negative evidence, is that they support the decision makers with the ability of modeling uncertain opinions. In this paper, we present a new method for new-product screening in the NPD process by relaxing a number of assumptions so that imprecise, inconsistent and uncertain ratings can be considered. In addition, a new similarity measure for vague sets is introduced to produce a ratings aggregation for a group of decision makers. Numerical illustrations show that the proposed model can outperform conventional fuzzy methods. It is able to provide decision makers (DMs) with consistent information and to model situations where vague and ill-defined information exist in the decision process.
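A vague value carries both a truth-membership degree t and a false-membership degree f (with t + f ≤ 1), which is how negative evidence is represented. One common textbook form of a similarity measure over such pairs, shown below as a sketch, is not necessarily the new measure introduced in the paper:

```python
def vague_similarity(a, b):
    """Similarity of two vague values a=(t_a, f_a) and b=(t_b, f_b), where
    t is the truth-membership and f the false-membership (t + f <= 1).
    Returns a value in [0, 1]; identical values give 1."""
    (ta, fa), (tb, fb) = a, b
    return 1.0 - (abs(ta - tb) + abs(fa - fb)) / 2.0

def aggregate(ratings, ideal=(1.0, 0.0)):
    """Average similarity of a group's vague ratings to an ideal rating:
    one simple way to turn group opinions into a screening score."""
    sims = [vague_similarity(r, ideal) for r in ratings]
    return sum(sims) / len(sims)
```

Ranking candidate ideas by such an aggregate score against the ideal rating is the screening step the abstract describes.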

12.
Logistics industry is an integral sector encompassing transportation, warehousing, handling, circulation and processing, delivery and information technology. With the progress of economic globalization and integration, logistics industry has become a new momentum driving the fast development of national and regional economy. The close relationship between economic development and logistics advancement receives wide attention from the academia. However, current research on the coordination between economy and logistics mostly focuses on concept interpretation, and qualitative discussions. Very rarely do scholars conduct quantitative analysis on the coordination of metropolitan economy and logistics. To fill this gap, we first examine whether there exist interactions between metropolitan logistics and economy by building evaluation index systems for metropolitan logistics and economy. Then we introduce the entropy method and Granger causality test to evaluate and test the level of logistics and economic development in five cities: Beijing, Shanghai, Guangzhou, Chongqing, and Tianjin from 2009 to 2013. From the dimensions of regional economic investment, regional economic capacity and strength, we finally test the relationship between three economic subsystems and three logistics subsystems to further validate the relationship between metropolitan economy and logistics.  相似文献   
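The entropy method referred to here weights evaluation indicators objectively by their dispersion: an indicator that varies little across cities carries little information and receives a small weight. A minimal sketch of entropy weighting (the indicator matrix is synthetic, not the study's data):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method. X is an (n_samples, n_indicators) matrix of
    positive indicator values; indicators with more dispersion get larger weights."""
    P = X / X.sum(axis=0)                                   # column-normalize
    n = X.shape[0]
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)    # entropy per indicator
    d = 1.0 - E                                             # degree of diversification
    return d / d.sum()                                      # normalized weights

# Three cities, two indicators: the second is constant across cities.
X = np.array([[1.0, 10.0],
              [2.0, 10.0],
              [3.0, 10.0]])
w = entropy_weights(X)
# The constant indicator carries no information, so nearly all weight
# goes to the first indicator.
```

In the study, such weights combine the logistics and economic indicators into composite development scores before the Granger causality tests are run.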

13.
A fuzzy group-preferences analysis method for new-product development   Times cited: 1 (self-citations: 0, other citations: 1)

14.
A hardware development method for the NetMagic platform   Times cited: 2 (self-citations: 0, other citations: 2)
As an experimental platform for network research, NetMagic has been widely adopted. The platform follows the principle of separating decision-making from execution and extends the Rule interface, further reducing the complexity of hardware development on NetMagic. The proposed hardware development method based on user-defined module (UM) logic and the accompanying UM development interface specification not only effectively simplify user logic design, but also support optimized use of NetMagic resources. N Probe, designed on the NetMagic platform, provides a typical reference model for NetMagic hardware development.

15.
A tool that bridges the gap between the theory and practice of program analysis specifications is described. The tool supports a high-level specification language that enables clear and concise expression of analysis algorithms. The denotational nature of the specifications eases the derivation of formal proofs of correctness for the analysis algorithm. SPARE (structured program analysis refinement environment) is based on a hybrid approach that combines the positive aspects of both the operational and the semantics-driven approach. An extended denotational framework is used to provide specifications in a modular fashion. Several extensions to the traditional denotational specification language have been designed to allow analysis algorithms to be expressed in a clear and concise fashion. This extended framework eases the design of analysis algorithms as well as the derivation of correctness proofs. The tool provides automatic implementation for testing purposes

16.
We present a scheme for building a decision model for analyzing the quality of short-term liquidity from domain experts. This scheme combines the model building features of both process tracing approach and output analysis approach. The process tracing component applies the Concurrent Verbal Protocol Analysis to build a decision model by tracing through verbalized decision procedures from domain experts. Its performance consistency is then verified by the application of Probability Neural Network as an output analysis method. The results of verification provide the basis for further refinement of the decision model. This scheme retains the explanation capability of the Protocol Analysis, and, at the same time, provides an opportunity for researchers to rectify some of the inherent problems associated with it.
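The output-analysis component here is a probabilistic neural network, which classifies by comparing Gaussian Parzen-window density estimates per class. A minimal PNN sketch (synthetic data and an illustrative smoothing parameter, not the study's model):

```python
import numpy as np

def pnn_predict(x, X_train, y_train, sigma=0.5):
    """Probabilistic neural network: one Gaussian kernel per training example,
    summed per class; predict the class with the largest mean kernel response."""
    classes = sorted(set(y_train))
    scores = []
    for c in classes:
        Xc = X_train[np.array(y_train) == c]
        k = np.exp(-np.sum((Xc - x) ** 2, axis=1) / (2.0 * sigma ** 2))
        scores.append(k.mean())
    return classes[int(np.argmax(scores))]

# Two well-separated classes in a 2-D feature space (illustrative).
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y_train = [0, 0, 1, 1]
label = pnn_predict(np.array([5.0, 5.5]), X_train, y_train)
```

Because a PNN has no iterative training, it suits the verification role described: the traced decision model's outputs can be checked for consistency directly against expert-labelled cases.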

17.
18.
The paper considers the use of a method for the identification and verification of significant patterns to find clear and measurable differences between groups of countries by the nature of the relationship of the dynamics of their macroeconomic indicators. The analysis is conducted using panel data that include annual values of a number of economic indicators in a specified time intervals. An approach based on permutation tests is used to take into account the effect of multiple testing. A technology that combines correlation analysis with the detection of significant patterns has made it possible to reveal statistically significant differences between groups of countries with two types of institutional matrices as identified by sociologists.
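The permutation-test idea can be illustrated for a single correlation: shuffle one series repeatedly and count how often the shuffled correlation is at least as extreme as the observed one. A minimal sketch (illustrative data, not the paper's panel data):

```python
import random

def perm_test_corr(x, y, n_perm=2000, seed=0):
    """Permutation p-value for the absolute Pearson correlation of x and y."""
    rng = random.Random(seed)

    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        da = sum((ai - ma) ** 2 for ai in a) ** 0.5
        db = sum((bi - mb) ** 2 for bi in b) ** 0.5
        return num / (da * db)

    observed = abs(corr(x, y))
    hits = 0
    for _ in range(n_perm):
        yp = y[:]
        rng.shuffle(yp)          # break any real association
        if abs(corr(x, yp)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction keeps p > 0

x = list(range(20))
y = [2 * v + 1 for v in x]       # perfectly correlated series
p = perm_test_corr(x, y)
```

For multiple testing, as in the paper, the same machinery is applied to the maximum statistic over all indicator pairs per permutation, which controls the family-wise error rate.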

19.
A preliminary numerical analysis method for information security risk assessment   Times cited: 2 (self-citations: 0, other citations: 2)
A numerical method for analyzing information security risk is presented, covering asset valuation, threat and vulnerability assessment, and the final risk calculation and curve-fitting step. Building on traditional approaches, the method quantitatively computes the degree of risk of different information assets, for use in designing security solutions and guiding investment.
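The abstract describes risk as a function of asset value, threat, and vulnerability. A common quantitative form multiplies the three scores; the sketch below illustrates that general idea and is not the paper's exact risk function:

```python
def risk_score(asset_value, threat, vulnerability):
    """Simple multiplicative risk model. Each input is scored 1-5,
    so the result lies in [1, 125]; higher means riskier."""
    for v in (asset_value, threat, vulnerability):
        if not 1 <= v <= 5:
            raise ValueError("scores must be in [1, 5]")
    return asset_value * threat * vulnerability

# Hypothetical assets scored as (asset_value, threat, vulnerability).
assets = {"web server": (4, 3, 2), "HR database": (5, 2, 4)}
ranked = sorted(assets, key=lambda a: risk_score(*assets[a]), reverse=True)
```

Ranking assets by such a score is what lets the method prioritize security spending across assets, as the abstract suggests.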

20.
A patent quality analysis for innovative technology and product development   Times cited: 1 (self-citations: 0, other citations: 1)
Enterprises evaluate intellectual property rights and the quality of patent documents in order to develop innovative products and discover state-of-the-art technology trends. The product technologies covered by patent claims are protected by law, and the quality of the patent insures against infringement by competitors while increasing the worth of the invention. Thus, patent quality analysis provides a means by which companies determine whether or not to customize and manufacture innovative products. Since patents provide significant financial protection for businesses, the number of patents filed is increasing at a fast pace. Companies which cannot process patent information or fail to protect their innovations by filing patents lose market competitiveness. Current patent research is needed to estimate the quality of patent documents. The purpose of this research is to improve the analysis and ranking of patent quality. The first step of the proposed methodology is to collect technology specific patents and to extract relevant patent quality performance indicators. The second step is to identify the key impact factors using principal component analysis. These factors are then used as the input parameters for a back-propagation neural network model. Patent transactions help judge patent quality and patents which are licensed or sold with intellectual property usage rights are considered high quality patents. This research collected 283 patents sold or licensed from the news of patent transactions and 116 patents which were unsold but belong to the technology specific domains of interest. After training the patent quality model, 36 historical patents are used to verify the performance of the trained model. The match between the analytical results and the actual trading status reached an 85% level of accuracy. Thus, the proposed patent quality methodology evaluates the quality of patents automatically and effectively as a preliminary screening solution. The approach saves domain experts valuable time targeting high value patents for R&D commercialization and mass customization of products.
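The second step of the methodology, identifying key impact factors with principal component analysis, can be sketched in plain NumPy; the indicator matrix below is synthetic, standing in for the extracted patent quality indicators:

```python
import numpy as np

def pca_components(X, k):
    """Return the top-k principal-component scores and their
    explained-variance ratios, via eigendecomposition of the covariance."""
    Xc = X - X.mean(axis=0)                 # center each indicator
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]          # sort eigenvalues descending
    vals, vecs = vals[order], vecs[:, order]
    ratio = vals / vals.sum()
    return Xc @ vecs[:, :k], ratio[:k]

rng = np.random.default_rng(0)
base = rng.normal(size=(50, 1))
# Five indicators that are all noisy copies of one latent quality factor.
X = base @ np.ones((1, 5)) + 0.1 * rng.normal(size=(50, 5))
scores, ratio = pca_components(X, k=2)
# The first component captures almost all of the variance here.
```

In the described pipeline, the retained component scores become the input features of the back-propagation neural network that predicts trading status.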
