Similar Documents
12 similar documents found (search time: 187 ms)
1.
To address the poor performance of conventional broadband energy detection for line-spectrum targets at low signal-to-noise ratio (SNR), this paper analyses the information entropy of the distribution of direction-of-arrival (DOA) estimates of the target line spectrum and proposes a line-spectrum target detection method weighted by the DOA distribution entropy. The detection performance of the method is analysed through simulation comparisons, and its effectiveness is verified with sea-trial data. The results show that, when the target bearing is relatively stable, the method can effectively detect low-SNR line-spectrum targets and remains applicable when several line-spectrum targets share the same frequency.
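As a rough illustration of the weighting idea only (not the authors' implementation; the bin count, weighting function, and variable names are assumptions), the sketch below computes the normalized Shannon entropy of the DOA-estimate histogram and uses it to weight a line-spectrum energy statistic: a stable bearing concentrates the estimates in a few bins (low entropy, weight near 1), whereas noise spreads them over bearing and is suppressed.

import numpy as np

def doa_entropy_weight(doa_estimates, n_bins=36):
    """Shannon entropy of the DOA-estimate histogram, mapped to a [0, 1] weight.

    A stable target bearing gives low entropy -> weight near 1;
    uniformly spread (noise-driven) estimates give high entropy -> weight near 0.
    """
    hist, _ = np.histogram(doa_estimates, bins=n_bins, range=(0.0, 360.0))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log(p))
    return 1.0 - entropy / np.log(n_bins)

def weighted_line_spectrum_statistic(line_energy, doa_estimates):
    """Weight the broadband/line-spectrum energy detector output by the DOA entropy weight."""
    return doa_entropy_weight(doa_estimates) * line_energy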

2.
In this study, copper indium gallium diselenide photovoltaic (PV) modules were subjected to a thorough indoor assessment procedure. The assessment is to be used as a baseline for future evaluation of the modules deployed outdoors as part of an ongoing evaluation of device performance and degradation. The main focus of the study is the long-term monitoring of the devices to determine service lifetime. In this paper we present initial results of the baseline evaluation, namely I-V characteristics, a thorough visual inspection and an analysis of performance parameters. The results obtained revealed that the performance of one of the modules was inferior to the others evaluated. To investigate this further, laser beam induced current (LBIC) measurements were conducted on regions that had a non-uniform appearance when inspected visually.

3.
In order to accurately evaluate the station blackout (SBO) event frequency of a multi-unit nuclear power plant that has a shared alternate AC (AAC) power source, an approach has been developed which accommodates the complex inter-unit behavior of the shared AAC power source under multi-unit loss-of-offsite-power conditions. The SBO frequency at the target unit of a probabilistic safety assessment could be underestimated if the inter-unit dependency of the shared AAC power source is not properly modeled. The approach is illustrated for two cases, 2 units and 4 units at a single site, and generalized for a multi-unit site. Furthermore, the SBO frequency of the first unit of the 2-unit site is quantified. The methodology suggested in the present paper is believed to be very useful in evaluating the SBO frequency and the core damage frequency resulting from the SBO event. This approach is also applicable to the probabilistic evaluation of other shared systems in a multi-unit nuclear power plant.
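A minimal point-estimate sketch of why the shared-AAC dependency matters is given below; all frequencies, probabilities, and the allocation rule are assumed values for illustration, not the paper's model or data.

# Hedged illustration: SBO frequency at unit 1 of a 2-unit site sharing one AAC source,
# given a site-wide loss of offsite power (LOOP). All inputs are assumed values.

f_loop     = 2.0e-2   # site LOOP frequency (/yr)
p_edg      = 1.0e-3   # probability a unit's emergency diesel generators fail
p_aac      = 5.0e-2   # probability the shared AAC source fails to start/run
p_lose_aac = 0.5      # probability the AAC is aligned to the other unit when both need it

# Unit 1 is in SBO if its EDGs fail AND it cannot get AAC power:
# either the AAC itself fails, or the AAC works but is committed to unit 2
# because unit 2's EDGs also failed and unit 2 received the AAC.
p_sbo_shared      = p_edg * (p_aac + (1.0 - p_aac) * p_edg * p_lose_aac)
p_sbo_independent = p_edg * p_aac   # treating the AAC as if dedicated to unit 1

print(f"SBO frequency with shared-AAC dependency: {f_loop * p_sbo_shared:.3e} /yr")
print(f"SBO frequency ignoring the dependency:    {f_loop * p_sbo_independent:.3e} /yr")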

4.
The sustained transmission and spread of environmentally mediated infectious diseases is governed in part by the dispersal of parasites, disease vectors and intermediate hosts between sites of transmission. Functional geospatial models can be used to quantify and predict the degree to which environmental features facilitate or limit connectivity between target populations, yet typical models are limited in their geographical and analytical approach, providing simplistic, global measures of connectivity and lacking methods to assess the epidemiological implications of fine-scale heterogeneous landscapes. Here, functional spatial models are applied to problems of surveillance and control of the parasitic blood fluke Schistosoma japonicum and its intermediate snail host Oncomelania hupensis in western China. We advance functional connectivity methods by providing an analytical framework to (i) identify nodes of transmission where the degree of connectedness to other villages, and thus the potential for disease spread, is higher than is estimated using Euclidean distance alone and (ii) (re)organize transmission sites into disease surveillance units based on second-order relationships among nodes using non-Euclidean distance measures, termed effective geographical distance (EGD). Functional environmental models are parametrized using ecological information on the target organisms, and pair-wise distributions of inter-node EGD are estimated. A Monte Carlo rank product analysis is presented to identify nearby nodes under alternative distance models. Nodes are then iteratively embedded into EGD space and clustered using a k-means algorithm to group villages into ecologically meaningful surveillance groups. A consensus clustering approach is taken to derive the most stable cluster structure. The results indicate that novel relationships between nodes are revealed when non-Euclidean, ecologically determined distance measures are used to quantify connectivity in heterogeneous landscapes. These connections are not evident when analysing nodes in Euclidean space, and thus surveillance and control activities planned using Euclidean distance measures may be suboptimal. The methods developed here provide a quantitative framework for assessing the effectiveness of ecologically grounded surveillance systems and of control and prevention strategies for environmentally mediated diseases.
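A minimal sketch of the embedding-and-consensus-clustering step described above, assuming scikit-learn is available and that the pairwise EGD matrix has already been produced by the functional environmental model; the function name, 2-D embedding, and co-assignment consensus are illustrative choices, not the authors' code.

import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

def consensus_surveillance_groups(egd_matrix, k, n_runs=50, seed=0):
    """Group transmission nodes into surveillance units from a symmetric
    pairwise effective-geographical-distance (EGD) matrix.

    Pipeline sketch: embed the non-Euclidean EGD matrix in a low-dimensional
    space, run k-means repeatedly, accumulate a co-assignment (consensus)
    matrix, and extract the most stable partition from it.
    """
    rng = np.random.default_rng(seed)
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=seed).fit_transform(egd_matrix)
    n = egd_matrix.shape[0]
    coassign = np.zeros((n, n))
    for _ in range(n_runs):
        labels = KMeans(n_clusters=k,
                        random_state=int(rng.integers(10**6))).fit_predict(coords)
        coassign += (labels[:, None] == labels[None, :])
    coassign /= n_runs
    # Final consensus partition: cluster the rows of the co-assignment matrix.
    return KMeans(n_clusters=k, random_state=seed).fit_predict(coassign)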

5.
Human dental tissues consist of inorganic constituents (mainly crystallites of hydroxyapatite, HAp) and an organic matrix. In addition, synthetic HAp powders are frequently used in medical and chemical applications. Insights into the ultrastructural alterations of skeletal hard tissues exposed to thermal treatment are crucial for estimating the temperature of exposure in forensic and archaeological studies. However, at present, only limited data exist on the heat-induced structural alterations of human dental tissues. In this paper, advanced non-destructive small- and wide-angle X-ray scattering (SAXS/WAXS) synchrotron techniques were used to investigate in situ the ultrastructural alterations in thermally treated human dental tissues and synthetic HAp powders. The crystallographic properties were probed by WAXS, whereas HAp grain size distribution changes were evaluated by SAXS. The results demonstrate the important role of the organic matrix that binds together the HAp crystallites in responding to heat exposure. This is highlighted by the difference in thermal behaviour between human dental tissues and synthetic HAp powders. The X-ray analysis results are supported by thermogravimetric analysis. The results concerning the HAp crystalline architecture in natural and synthetic HAp powders provide a reliable basis for deducing the heating history of dental tissues in the forensic and archaeological context, and a foundation for further development and optimization of biomimetic material design.

6.
Sensitivity analysis plays an important role in reliability evaluation, structural optimization and structural design. The local sensitivity, i.e., the partial derivative of the quantity of interest with respect to parameters or basic variables, is inadequate when the basic variables are random in nature. Therefore, global sensitivity measures such as the Sobol' indices based on the decomposition of variance and the moment-independent importance measure, among others, have been extensively studied. However, these indices are usually computationally expensive, and the information provided by them has some limitations for decision making. Specifically, all these indices are positive, and therefore they cannot reveal whether the effects of a basic variable on the quantity of interest are positive or adverse. In the present paper, a novel global sensitivity index is proposed for the case where randomness is involved in structural parameters. Specifically, a functional perspective is first advocated, in which the probability density function (PDF) of the output quantity of interest is regarded as the output of an operator acting on the PDF of the source basic random variables. The Fréchet derivative is then naturally taken as a measure of the global sensitivity. In some sense such a functional perspective provides a unified view of the concepts of global sensitivity and local sensitivity. In the case where the change of the PDF of a basic random variable is due to a change of the parameters of that PDF, the computation of the Fréchet-derivative-based global sensitivity index can be implemented with high efficiency by incorporating the probability density evolution method (PDEM) and the change of probability measure (COM). The numerical algorithms are elaborated. Several examples are presented, demonstrating the effectiveness of the proposed method.
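Schematically, the functional perspective described above can be written as follows (the notation is illustrative rather than the paper's):

\[
  p_Y = \mathcal{A}\!\left[p_{\boldsymbol{\Theta}}\right], \qquad
  \mathcal{A}\!\left[p_{\boldsymbol{\Theta}} + \delta p\right] - \mathcal{A}\!\left[p_{\boldsymbol{\Theta}}\right]
  = \mathcal{A}'_{p_{\boldsymbol{\Theta}}}\!\left[\delta p\right] + o\!\left(\lVert \delta p \rVert\right),
\]

so that when a distribution parameter \(\theta\) of one basic variable changes by \(\Delta\theta\), a first-order estimate of the resulting change of the output PDF is

\[
  \Delta p_Y(y) \approx \mathcal{A}'_{p_{\boldsymbol{\Theta}}}\!\left[\frac{\partial p_{\boldsymbol{\Theta}}}{\partial \theta}\right]\!(y)\,\Delta\theta .
\]

Unlike variance-based indices, this quantity is signed, so it can indicate whether the change shifts probability mass towards or away from adverse values of the response.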

7.
This experimental study was carried out to investigate the reliability and effectiveness of a new method that uses a photo-coupler for detecting frost formation in an air-source heat pump, and further to determine the most efficient initiation point of the defrost cycle. The new method of using a photo-coupler as a frost-sensing device is evaluated by comparing its performance with a conventional time-controlled defrost system, in which the defrost cycle is set to start at a predetermined interval, e.g. approximately every 1–1.5 h. Results indicate that the overall heating capacity of the photo-coupler detection method (case IV) is 5.5% higher than that of the time-control method. It is also shown that, for maximum efficiency, the defrost cycle must be initiated before the frost build-up area exceeds 45% of the total front surface of the outdoor coil.

8.
The aim of this work is to prepare a new type of phosphogypsum-sulfur polymer cement (PG-SPC) to be utilised in the manufacture of building materials. Physico-chemical and radiological characterization was performed on the phosphogypsum and the phosphogypsum-sulfur polymer concretes, and modeling of exhalation rates was also carried out. An optimized mixture of the materials was obtained; the solidified material with the optimal mixture (sulfur/phosphogypsum = 1:0.9, phosphogypsum dosage = 10-40 wt.%) yields the highest strength (54-62 MPa) and a low total porosity (2.8-6.8%). The activity concentration index (I) of the PG-SPC is lower than the reference value in most international regulations; therefore, these cements can be used without radiological restrictions in the manufacture of building materials. Under normal ventilation conditions, the contribution to the expected indoor radon concentration in a standard room is below the international recommendations, so the building materials studied in this work can be applied to houses built under normal ventilation conditions. Additionally, taking into account that the PG is enriched in several natural radionuclides such as 226Ra, leaching experiments have demonstrated that the environmental impact of using SPC cements with PG is negligible.
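The abstract does not state which definition of the activity concentration index is used; a common choice for bulk building materials is the index recommended in European Commission Radiation Protection 112, with activity concentrations in Bq/kg and I ≤ 1 as the usual screening value:

\[
  I = \frac{C_{\mathrm{Ra\text{-}226}}}{300\ \mathrm{Bq/kg}}
    + \frac{C_{\mathrm{Th\text{-}232}}}{200\ \mathrm{Bq/kg}}
    + \frac{C_{\mathrm{K\text{-}40}}}{3000\ \mathrm{Bq/kg}} .
\]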

9.
Over the past few decades there have been considerable efforts to use adsorption (solid/vapor) for cooling and heat pump applications, but intensified efforts were initiated only after the imposition of international restrictions on the production and utilization of CFCs and HCFCs. In this paper, a dual-mode silica gel–water adsorption chiller design is outlined along with the performance evaluation of the innovative chiller. This adsorption chiller effectively utilizes low-temperature solar or waste heat sources at temperatures between 40 and 95 °C. Two operation modes are possible for the advanced chiller. In the first operation mode it works as a highly efficient conventional chiller when the driving source temperature is between 60 and 95 °C. In the second operation mode it works as an advanced three-stage adsorption chiller when the available driving source temperature is very low (between 40 and 60 °C). With this very low driving source temperature in combination with a coolant at 30 °C, no cycle other than an advanced adsorption cycle with staged regeneration would be operational. The drawback of this operational mode is its poor efficiency in terms of cooling capacity and COP. Simulation results show that the optimum COP values are obtained at driving source temperatures between 50 and 55 °C in three-stage mode, and between 80 and 85 °C in single-stage, multi-bed mode.
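For reference, the COP quoted for a heat-driven adsorption chiller is the usual thermal cooling COP, i.e. the ratio of the cooling effect delivered at the evaporator to the driving heat supplied for bed regeneration (pumping work, which is comparatively small, is neglected here):

\[
  \mathrm{COP} = \frac{Q_{\mathrm{evap}}}{Q_{\mathrm{driving}}} .
\]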

10.
This paper addresses the modeling of the probability of dangerous failure on demand (PFD) and the spurious trip rate (STR) of safety instrumented systems that include MooN voting redundancies in their architecture. MooN systems are a special case of k-out-of-n systems. The first part of the article is devoted to the development of a time-dependent probability of dangerous failure on demand model capable of handling MooN systems. The model can explicitly represent common cause failure and diagnostic coverage, as well as different test frequencies and strategies. It includes quantification of both detected and undetected failures, and puts emphasis on quantifying the contribution of common cause failure to the system probability of dangerous failure on demand as an additional component. In order to accommodate changes in testing strategies, special treatment is devoted to the analysis of system reconfiguration (including common cause failure) during the test of one of its components, which is then included in the model. Another model, for the spurious trip rate, is also analyzed and extended under the same methodology in order to give it similar capabilities. These two models are powerful yet simple enough to be suitable for handling dependability measures in multi-objective optimization of both system design and test strategies for safety instrumented systems. The level of modeling detail considered permits compliance with the requirements of the standard IEC 61508. The two models are applied to brief case studies to demonstrate their effectiveness. The results obtained demonstrated that the first model is adequate to quantify the time-dependent PFD of MooN systems during different system states (i.e. full operation, test and repair) and different MooN configurations, whose values are averaged to obtain the PFDavg. It was also demonstrated that the second model is adequate to quantify the STR, including spurious trips induced by internal component failure and by the test itself. Both models were tested for different architectures with 1≤N≤5 and 2≤M≤5 subject to uniform staggered testing. The results also showed the effects that modifying M and N has on both PFDavg and STR, and demonstrated the conflicting nature of these two measures with respect to one another.
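For orientation only, the sketch below evaluates the familiar low-demand approximation of the average PFD for a few MooN groups with a simple beta-factor common cause term; it is a steady-state simplification, not the time-dependent model developed in the paper, and all numerical values are assumed.

from math import comb

def pfd_avg_moon(m, n, lam_du, ti_hours, beta=0.0):
    """Low-demand approximation of the average PFD of a MooN group
    (M of N identical channels must work), counting only dangerous
    undetected failures with proof-test interval ti_hours.
    Common cause failure is added via a beta-factor term.
    Order-of-magnitude illustration only, not the paper's model."""
    k = n - m + 1                                   # channel failures that fail the group
    lam_ind = lam_du if n == 1 else (1.0 - beta) * lam_du
    independent = comb(n, k) * (lam_ind * ti_hours) ** k / (k + 1)
    ccf = beta * lam_du * ti_hours / 2.0 if n > 1 else 0.0
    return independent + ccf

# Illustrative inputs: lambda_DU = 2e-6 /h, annual proof test, beta = 5 %
for m, n in [(1, 1), (1, 2), (2, 3)]:
    print(f"{m}oo{n}: PFDavg ~ {pfd_avg_moon(m, n, 2e-6, 8760, beta=0.05):.1e}")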

11.
This paper reviews the historical development of probabilistic risk assessment (PRA) methods and applications in the nuclear industry. A review of nuclear safety and regulatory developments in the early days of nuclear power in the United States is presented. It is argued that, due to technical difficulties in measuring and characterizing uncertainties and concerns over legal challenges, the safety design and regulation of nuclear power plants primarily relied upon conservative safety assessment methods derived from a set of design and safety principles. Further, it is noted that the conservatism adopted in safety and design assessments allowed the use of deterministic performance assessment methods. This approach worked successfully in the early years of the nuclear power epoch, as the reactor designs proved to be safe enough. However, as the conservative approach to design and safety criteria proved arbitrary and yielded inconsistencies in the degree to which different safety measures in nuclear power plants protect safety and public health, the need for a more consistent assessment of safety became apparent in the late 1960s. In the early 1970s, as a result of public and political pressures, the then US Atomic Energy Commission initiated a new look at the safety of nuclear power plants through a comprehensive study called the 'Reactor Safety Study' (WASH-1400, or the 'Rasmussen Study', after its charismatic study leader Professor Norman Rasmussen of MIT) to demonstrate the safety of nuclear power plants. Completed in October 1975, this landmark study introduced a novel probabilistic, systematic and holistic approach to the assessment of safety, which ultimately resulted in a sweeping paradigm shift in the safety design and regulation of nuclear power in the United States at the turn of the century. Technical issues of historic significance and concerns raised by the subsequent reviews of the Rasmussen Study are discussed, as is the effect of major events and developments, such as the Three Mile Island accident and the studies sponsored by the Nuclear Regulatory Commission and the nuclear industry, on the tools, techniques and applications of PRA that culminated in the present-day risk-informed initiatives.

12.
In this paper, we propose a novel methodology to define and estimate a crash surrogate measure. By imposing a hypothetical disturbance on the leading vehicle, the following vehicle's action is represented as a probabilistic causal model. A tree is then built to describe the eight possible conflict types under the model. The surrogate measure, named the Aggregated Crash Index (ACI), is proposed to measure crash risk; the index reflects the ability of the freeway traffic state to accommodate a traffic disturbance. We further apply this measure to evaluate crash risks on a freeway section of the Pacific Motorway, Australia. The results show that the proposed indicator outperforms three traditional crash surrogate measures (i.e., Time to Collision, Proportion of Stopping Distance, and Crash Potential Index) in representing rear-end crash risks. Applications of this measure are also discussed.
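For context, Time to Collision, one of the traditional surrogate measures the ACI is compared against, is straightforward to compute for a car-following pair; the function below is a generic textbook definition, not the paper's implementation.

def time_to_collision(gap_m, v_follower, v_leader):
    """Time to Collision (s) for a car-following pair.

    Defined only when the follower is closing on the leader;
    otherwise there is no collision course.

    gap_m       -- bumper-to-bumper spacing in metres
    v_follower  -- following vehicle speed in m/s
    v_leader    -- leading vehicle speed in m/s
    """
    closing_speed = v_follower - v_leader
    if closing_speed <= 0.0:
        return float("inf")            # not closing: no collision course
    return gap_m / closing_speed

# Example: 20 m gap, follower at 30 m/s, leader at 25 m/s -> TTC = 4 s
print(time_to_collision(20.0, 30.0, 25.0))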

