Similar documents
20 similar documents found (search time: 31 ms)
1.
Recent Dirac-Hartree-Slater and Dirac-Hartree-Fock calculations of ionization cross-sections, fluorescence and Coster-Kronig yields and X-ray emission rates offer a “self-consistent” theoretical database for relative intensities of proton-induced L X-rays. Interpolation schemes for use with tables of these quantities are described. Theoretical intensity ratios for major line groups agree with compiled experimental data to within a few percent, justifying use of the database in PIXE.
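One interpolation scheme commonly used with such tables is linear interpolation in log-log space, which respects the smooth, roughly power-law energy dependence of ionization cross sections. The sketch below illustrates that idea only; the energy grid and cross-section values are placeholders, not the Dirac-Hartree-Slater tabulations, and the abstract does not specify that this exact scheme is the one described.

```python
# A minimal sketch (assumptions, not the paper's scheme or data):
# linear interpolation of a tabulated cross section in log-log space.
import numpy as np

def interp_loglog(e, e_tab, sigma_tab):
    """Interpolate a tabulated cross section at proton energy e (same units as e_tab)."""
    return np.exp(np.interp(np.log(e), np.log(e_tab), np.log(sigma_tab)))

e_tab = np.array([0.5, 1.0, 2.0, 4.0])        # MeV, hypothetical grid
sigma_tab = np.array([1.2, 4.8, 15.0, 38.0])  # barns, hypothetical L-shell values
print(interp_loglog(1.5, e_tab, sigma_tab))    # interpolated value at 1.5 MeV
```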

2.
Constitutive equations for representing the inelastic behavior (elastic-plastic and creep) of fast-reactor structural alloys are discussed with an emphasis on 2 Cr-1 Mo steel. Equations that are recommended for use in current design activities are outlined, and background information on their selection is discussed. Some results from ongoing efforts to establish improved methods are also described. Sample experimental data illustrate specific features of material behavior, and a constitutive equation model that is qualitatively capable of representing these observations is described. The model is a flow potential formulation, and its relationship to the works of other investigators is identified.

3.
Albedo data were calculated with ANISN for an iron-covered concrete slab as well as for iron and concrete single-layer slabs. Neutrons are incident on a slab in each of 14 energy groups ranging from 10 MeV to 1 keV and are reflected with energies between and including the incident group and the lowest group. Neutron directions are described with 8 discrete angles for incidence and reflection, respectively. The dependence on slab thickness and on the angle and energy of incidence and reflection is discussed. The albedo data calculated by ANISN showed good agreement with other similar data and were therefore concluded to be sufficiently valid for use in shield design calculations.
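As a rough illustration of how such discretized albedo data might be stored and queried, here is a minimal sketch assuming a four-dimensional table over 14 incident/reflected energy groups and 8 incident/reflected angles; the layout and the reflected-group restriction follow the abstract, but the code is not from the paper.

```python
# A minimal sketch (assumed layout, not the paper's data): differential albedo
# indexed by (incident group, incident angle, reflected group, reflected angle).
import numpy as np

N_GROUPS, N_ANGLES = 14, 8
albedo = np.zeros((N_GROUPS, N_ANGLES, N_GROUPS, N_ANGLES))  # filled from a transport code

def total_albedo(g_in, a_in):
    """Sum the differential albedo over all allowed reflected groups and angles.
    Reflection is only allowed into the incident group or lower-energy groups
    (higher group index), as stated in the abstract."""
    return albedo[g_in, a_in, g_in:, :].sum()
```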

4.
Calculation of the rotational centers in computed tomography sinograms
An efficient method for accurately calculating the center-of-rotation, or projection center, for parallel computed tomography projection data, or sinograms, is described. This method uses all the data in the sinogram to estimate the center by a least-squares technique and requires no previous calibration scan. The method also finds the object's center-of-mass without reconstructing its image. Since the method uses the measured data, it is sensitive to noise in the measurements, but that sensitivity is relatively small compared to other techniques. Examples of its use on simulated and actual data are included. For fan-beam data over 360°, two related methods are described to find the center in the presence or absence of a midline offset.
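A minimal sketch of the centroid/least-squares idea summarized above: for parallel-beam data, the per-projection centre of mass traces a sinusoid whose constant offset is the centre of rotation. The array layout (rows = projection angles, columns = detector bins) is an assumption, and this is an illustration of the general technique rather than the paper's exact formulation.

```python
# A minimal sketch (assumed array layout, not the paper's implementation).
import numpy as np

def center_of_rotation(sinogram, angles_rad):
    bins = np.arange(sinogram.shape[1])
    mass = sinogram.sum(axis=1)
    centroids = (sinogram * bins).sum(axis=1) / mass          # per-projection centre of mass
    # Fit centroids(theta) = c0 + a*cos(theta) + b*sin(theta) by linear least squares.
    A = np.column_stack([np.ones_like(angles_rad), np.cos(angles_rad), np.sin(angles_rad)])
    coeffs, *_ = np.linalg.lstsq(A, centroids, rcond=None)
    return coeffs[0]   # c0 is the estimated centre of rotation (in detector bins)
```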

5.
This paper describes a microcomputer-based two-parameter nuclear spectrum data acquisition and processing system. Its main features are low cost, ease of use, and a spectrum display with a strong three-dimensional appearance. The data acquisition channel and the three-dimensional spectrum display are described in detail, and application examples are given.
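For illustration only, the following sketch shows how a two-parameter spectrum can be rendered as a pseudo-three-dimensional surface, the kind of display the abstract refers to; the synthetic coincidence data and channel ranges are invented, not the system's code.

```python
# A minimal illustrative sketch (synthetic data, not the original system):
# a two-parameter spectrum displayed as a 3D surface of counts.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
e1 = rng.normal(100, 10, 50_000)             # hypothetical parameter-1 values (channels)
e2 = rng.normal(200, 15, 50_000)             # hypothetical parameter-2 values (channels)
hist, xedges, yedges = np.histogram2d(e1, e2, bins=64)

x, y = np.meshgrid(xedges[:-1], yedges[:-1], indexing="ij")
ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(x, y, hist)
ax.set_xlabel("parameter 1 (channel)")
ax.set_ylabel("parameter 2 (channel)")
ax.set_zlabel("counts")
plt.show()
```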

6.
This paper describes upgrades to the Data Acquisition System for the Experimental Projects Department at PPPL, especially in support of the PBX-M upgrade to be completed this year. Hardware and software maintenance problems with the old configuration, consisting of a DEC KL-10 and eight PDP-11's, are described. The real-time software and hardware performance requirements and projections for CAMAC I/O and data analysis and display are presented. Described are three applications that have realtime requirements and are located on separate processors, connected to PPPL's VAX Cluster by an Ethernet link. Building upon a previous large software base, general-purpose subroutine libraries and utilities are being emphasized. The most useful of these are described. The use of software packages from DEC, third-party vendors, and the fusion community, is also described. The new approaches to software development that are being incorporated into the DAS efforts are discussed. Specific future challenges are also described.

7.
Different roles for internal standards in PIXE analysis of fluid residues are discussed. The efficacy of internal standards is predicated on having homogeneous targets of uniform thickness; the use of lecithin additives to achieve this is described and data are presented to illustrate the dependence of analytical precision on the mode of specimen preparation. Determination of actual thickness via measurement of the energy loss of transmitted protons is demonstrated; this provides a means of correcting for departures from the much-quoted “thin target” criterion.
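The thickness determination mentioned above rests on the thin-target relation t ≈ ΔE / (dE/dx). A minimal sketch with assumed numbers (not the paper's measurements or stopping powers):

```python
# A minimal sketch (hypothetical values): areal density from the measured
# energy loss of transmitted protons, thin-target approximation t ≈ ΔE / (dE/dx).
E_in = 2.50            # MeV, incident proton energy (hypothetical)
E_out = 2.41           # MeV, measured transmitted energy (hypothetical)
stopping_power = 30.0  # MeV cm^2/g near the mean proton energy (hypothetical value)

areal_density = (E_in - E_out) / stopping_power   # g/cm^2
print(f"areal density ≈ {areal_density * 1e3:.2f} mg/cm^2")
```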

8.
The collection of high count rate data on a scanning proton microprobe from a number of detectors simultaneously requires much attention to both hardware and software detail. With fast data acquisition and rapid data handling coupled with high resolution graphics, a microprobe provides a powerful analytical instrument. Several considerations favour the use of event-by-event mode for data collection. Rapid data handling requires the provision of individual, fast ADCs for each detector. A controlling host computer must record each energy event together with the positional coordinates of the event as it was generated at the specimen. The task of the resident host software ranges from dumping the data on to magnetic tape or disk to a full online, real-time SORT to enable the mapping of elemental distributions on high resolution colour monitors as data are being collected. The evolution of total quantitative scanning analysis will be described from its inception on a small mini-computer to the present time. The principles of high speed data capture and processing associated with the more powerful, present-day computers integrating high resolution graphics will be discussed, including the development of fast data acquisition hardware and multidimensional display programs.
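As a rough sketch of an event-by-event SORT, the code below bins recorded events (position, detector, energy) into per-element maps using X-ray energy windows; the event format and window values are assumptions for illustration, not the system described.

```python
# A minimal sketch (hypothetical event format and windows) of an offline
# event-by-event SORT into elemental distribution maps.
import numpy as np

def sort_events(events, windows, nx=256, ny=256):
    """events: iterable of (x, y, detector, energy); windows: {element: (e_lo, e_hi)} in keV."""
    maps = {el: np.zeros((ny, nx), dtype=np.uint32) for el in windows}
    for x, y, det, energy in events:
        for el, (lo, hi) in windows.items():
            if lo <= energy < hi:
                maps[el][y, x] += 1       # accumulate a count at the beam position
    return maps

maps = sort_events([(10, 20, 0, 6.41), (10, 21, 0, 1.75)],
                   {"Fe": (6.2, 6.6), "Si": (1.6, 1.9)})
```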

9.
Lucifer, a fastbus module functioning as part of the Delphi track trigger hardware in the first- and second-level trigger sequences, is described. The module is designed for flexibility and ease of testing and use. These aims are achieved in a limited area by extensive use of application-specific IC devices. Hardware implementation, control requirements, and data bus architecture are described.

10.
Application of visual modeling technology in a reactor thermal-hydraulic analysis code
徐珍  杨燕华  林萌  杨晓 《核动力工程》2006,27(6):38-41,51
Using visual modeling technology and the Extensible Markup Language (XML), a component model library suited to the thermal-hydraulic safety analysis system code RELAP5 is established. By calling components from the library, a model system diagram is constructed and a visual human-machine interface for entering thermal-hydraulic parameters is built; after each component is connected to its corresponding interface, the parameters are entered and, after error checking, the RELAP5 input deck is finally generated. An analysis of liquid flow in a simple pipe demonstrates that this technique simplifies the use of the RELAP5 code.
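For illustration, a minimal sketch of the underlying idea: a component description held as XML is turned into lines of a text input deck. The XML schema and the card layout shown are placeholders, not the actual tool or the real RELAP5 input format.

```python
# A minimal illustrative sketch (placeholder schema and card layout, not the
# real RELAP5 input format): serialize an XML component into deck lines.
import xml.etree.ElementTree as ET

pipe_xml = """<component type="pipe" id="100">
                <param name="length" value="2.0"/>
                <param name="area"   value="0.01"/>
              </component>"""

def component_to_cards(xml_text):
    comp = ET.fromstring(xml_text)
    lines = [f"* {comp.get('type')} component {comp.get('id')}"]
    for p in comp.findall("param"):
        lines.append(f"{comp.get('id')}  {p.get('name')}  {p.get('value')}")
    return "\n".join(lines)

print(component_to_cards(pipe_xml))
```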

11.
The peculiar characteristics of an autonomous CAMAC system based on the use of an Intel 8080 microprocessor are described. The system memory for program store, scratch work and data acquisition is contained in the autonomous CAMAC controller. Memory can be accessed in DMA mode with 24-bit data utilizing three bytes with a unique address. Suitable 3-byte floating point mathematical subroutines have been developed for general use and in particular for the requirements of the system. Details on hardware and software of the system are illustrated together with a brief mention of the generalizations.

12.
The use of the Chernobyl experience in emergency data management is presented. Information technologies for generalizing practical experience in the protection of the population after the Chernobyl accident are described. The two main components of this work are the development of the administrative information system (AIS) and the creation of the central data bank. The current state of the AIS, the data bank and the bank of models is described. The accumulated data and models are used to estimate the consequences of radiation accidents and to provide different types of prognosis. Experience with the accumulated analysis data has allowed special software to be developed for large-scale simulation of the radiation consequences of major radiation accidents and for organizing practical exercises. Some examples of such activity are presented.

13.
A number of methods for compressing binary data on board spacecraft are discussed. Five systems, ranging in complexity from the simple use of prescalers to a floating-point system, are described. The systems are compared, assuming that an accumulated count must be read into an eight-bit telemetry word. Maximum count, reading error and hardware complexity are examined for each system. General formulae are presented which allow the calculation of the maximum count for each system assuming that N-bit telemetry words are available. The effect of a larger telemetry word on the performance of some of the systems is examined.
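As an illustration of the floating-point end of this range, here is a minimal sketch of one possible scheme (not necessarily any of the five systems in the paper): the telemetry word holds a small exponent and a truncated mantissa, trading reading error for a larger maximum count.

```python
# A minimal sketch (one possible scheme, not the paper's): floating-point
# compression of an accumulated count into an N-bit telemetry word.
def compress(count, n_bits=8, exp_bits=3):
    man_bits = n_bits - exp_bits
    exponent = max(count.bit_length() - man_bits, 0)
    exponent = min(exponent, 2**exp_bits - 1)          # clamp to the exponent field
    mantissa = count >> exponent                       # truncation causes the reading error
    return (exponent << man_bits) | min(mantissa, 2**man_bits - 1)

def expand(word, n_bits=8, exp_bits=3):
    man_bits = n_bits - exp_bits
    exponent, mantissa = word >> man_bits, word & (2**man_bits - 1)
    return mantissa << exponent                        # reading error grows with the exponent

# With an 8-bit word and a 3-bit exponent the maximum count is (2**5 - 1) << 7.
print(expand(compress(3000)), (2**5 - 1) << 7)
```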

14.
Small perturbations of the environmental radiation field by artificial radionuclides have been successfully quantified using high pressure ionization chambers and in situ semiconductor detector gamma-ray spectra. The calibration and use of these instruments for the detection of ground-deposited and airborne sources of activity is described and general methods for data interpretation are discussed. Specific examples are given in which the exposure rate from fallout radionuclides deposited on the soil surface and from noble gases released by nuclear facilities are determined and unambiguously separated from variations in the underlying background.

15.
In order to handle the vast amount of information collected by JET diagnostics, which can exceed 10 Gbytes of data per shot, a series of new soft computing methods are being developed. They cover various aspects of the data analysis process, ranging from information retrieval to statistical confidence and machine learning. In this paper some recent developments are described. History effects in the plasma evolution leading to disruptions have been investigated with the use of Artificial Neural Networks. New image processing algorithms, based on optical flow techniques, are being used to derive quantitative information about the movement of objects like filaments at the edge of JET plasmas. Adaptive filters, mainly of the Kalman type, have been successfully implemented for the online filtering of MSE data for real time purposes.
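As an illustration of the online filtering mentioned last, here is a minimal scalar Kalman filter for a slowly varying signal; the process and measurement noise values are illustrative, not the JET MSE settings.

```python
# A minimal sketch (scalar random-walk model, illustrative noise values):
# online Kalman filtering of a noisy measurement stream.
def kalman_filter(measurements, q=1e-4, r=1e-2):
    x, p = measurements[0], 1.0          # initial state estimate and variance
    filtered = []
    for z in measurements:
        p += q                           # predict: state assumed constant, variance grows
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update with the new measurement
        p *= (1 - k)
        filtered.append(x)
    return filtered

print(kalman_filter([1.0, 1.1, 0.9, 1.05, 0.95])[-1])
```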

16.
Details of organizing the work to liquidate a hard-to-reach repository for high-level waste at a special site at the Institute are described. The bulk of the waste in the pit was encased in a high-strength concrete slab and, together with low- and medium-level waste, contained a large number of metal cans of high-level waste. Special arrangements for radiation protection set up around the pit and the techniques used to break up the concrete casing and extract the waste are described. Video cameras and a gamma visualizer were used to find high-level waste and fragments in the demolished concrete casing and to guide remotely controlled robotic equipment to them. Changes in the radiation environment in the work area were monitored operationally with a gamma locator; data from this detector were fed in real time to a personal computer, where they were analyzed and processed for use in carrying out the work. Translated from Atomnaya énergiya, Vol. 105, No. 3, pp. 164–169, September, 2008.

17.
18.
A check of whether the 2005 look-up table, primarily designed for heated pipes, can also be used for heated rod bundles gives the surprising result that the bundle critical power for five data sets from three different bundles and different power distributions is predicted by a simple method using the 2005 look-up table to within the accuracy reported by the authors of the table.

19.
Predicting the separation of uranium isotopes in a gas centrifuge process with a mathematical model is a difficult task. The gas motion can be described by analytical or numerical solutions of the system of equations defined by the equation of continuity, the Navier-Stokes equation and the equation of energy. However, these calculations cannot be performed for actual centrifuges.

Neural networks are an alternative for modelling complex problems that are too difficult to solve with phenomenological models.

The authors propose the use of neural networks for the simulation and prediction of the separative and operational parameters of a gas centrifuge separating uranium isotopes. The results from the uranium separation experiments (Zippe data) are compiled and presented to the neural network in the learning and testing processes. The prediction using the neural network model shows good agreement with the experimental data.
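A minimal sketch of this kind of model: operational parameters in, a separative quantity out, fitted with a small feed-forward network. The input/output choices and the synthetic training data are assumptions for illustration; the actual work uses the Zippe experimental data.

```python
# A minimal sketch (hypothetical inputs/outputs, synthetic data, not the Zippe data):
# a small feed-forward regression model for centrifuge separative performance.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform([0.1, 0.2, 400.0], [1.0, 0.8, 700.0], size=(200, 3))  # feed, cut, speed (made up)
y = 0.5 * X[:, 0] * X[:, 1] * (X[:, 2] / 700.0) ** 2                  # surrogate "separative power"

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X[:150], y[:150])                 # learning set
print("test R^2:", model.score(X[150:], y[150:]))   # testing set
```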

20.
The online system of the underground neutrino detector Super-Kamiokande is scheduled to be upgraded in 2008, together with the front-end electronics. The detector consists of 50 000 tons of pure water equipped with about 13 000 photomultipliers (PMTs) to detect Cherenkov light. The new online system is required to accept a dataflow of up to 800 MB/s from the front-end electronics and process it for offline analysis. We will utilize a Gigabit Ethernet network and parallel data processing to handle this large flow. In the new data acquisition scheme, we will not use a hardware event trigger but will read out every hit from the front-end electronics and process the hits in the online farm. Therefore, there is no threshold on the number of PMT hits, and the detector will become more sensitive to important events such as supernova relic neutrinos and low-energy solar neutrinos. In addition, a dead-time-free system is desirable for continuous measurement 365 days a year. In this paper, the detailed design of the upgraded online system and testing activities using prototypes are described.
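A minimal sketch of what a trigger applied purely in software can look like: collect every PMT hit with its time stamp, then scan for time windows whose hit multiplicity exceeds a programmable threshold. The window length and threshold below are illustrative, not the actual Super-Kamiokande values.

```python
# A minimal sketch (illustrative window and threshold, not the real trigger):
# find candidate events as time windows with a high PMT-hit multiplicity.
def find_triggers(hit_times_ns, window_ns=200, n_hits_min=25):
    hits = sorted(hit_times_ns)
    triggers, i = [], 0                      # i marks the start of the sliding window
    for j, t in enumerate(hits):
        while t - hits[i] > window_ns:       # shrink the window from the left
            i += 1
        if j - i + 1 >= n_hits_min:          # enough hits inside the window
            triggers.append(hits[i])         # record the window start time
            i = j + 1                        # skip ahead so one event fires only once
    return triggers
```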
