81.
Early clinical results with time-of-flight (TOF) positron emission tomography (PET) systems have demonstrated the advantages of TOF information in PET reconstruction. Reconstruction approaches in TOF-PET systems include list-mode and binned iterative algorithms as well as confidence-weighted analytic methods. List-mode iterative TOF reconstruction retains the resolution of the data in the spatial and temporal domains without any binning approximations but is computationally intensive. We have developed an approach [DIRECT (direct image reconstruction for TOF)] to speed up TOF-PET reconstruction that takes advantage of the reduced angular sampling requirement of TOF data by grouping list-mode data into a small number of azimuthal views and co-polar tilts and depositing the grouped events into histo-images, arrays with the sampling and geometry of the final image. All physical effects are included in the system model and deposited in the same histo-image structure. Using histo-images allows efficient computation during reconstruction without ray-tracing or interpolation operations. The DIRECT approach was compared with 3-D list-mode TOF ordered-subsets expectation maximization (OSEM) reconstruction for phantom and patient data taken on the University of Pennsylvania research LaBr3 TOF-PET scanner. The total processing and reconstruction time for these studies with DIRECT, without attention to code optimization, is approximately 25%-30% that of list-mode TOF-OSEM for comparable image quality. Furthermore, the reconstruction time for DIRECT is independent of the number of events and of the sizes of the spatial and TOF kernels, while the time for list-mode TOF-OSEM increases with more events or larger kernels. The DIRECT approach reproduces the image quality of list-mode iterative TOF reconstruction, both qualitatively and quantitatively, in measured data in substantially less time.
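The deposition step described above can be sketched in a few lines. The following is a minimal 2-D illustration of the histo-image idea, assuming a simplified event format (detector endpoints plus a TOF difference) and a sign convention that are not taken from the paper; the actual DIRECT implementation models the full 3-D geometry, groups events into views, and deposits TOF kernels rather than single-voxel increments.

```python
import numpy as np

C_MM_PER_PS = 0.2998  # speed of light, mm per picosecond

def deposit_events(events, shape, voxel_mm):
    """Deposit list-mode TOF events directly into a histo-image.

    Each event is a tuple ((x1, y1), (x2, y2), dt_ps): the two detector
    positions of the coincidence and their TOF difference in picoseconds.
    The event is placed at the most-likely annihilation point implied by
    the TOF difference, in an array with the sampling and geometry of the
    final image, so that later iterations need no ray tracing.
    """
    histo = np.zeros(shape)
    for (x1, y1), (x2, y2), dt_ps in events:
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0   # midpoint of the LOR
        dx, dy = x2 - x1, y2 - y1
        norm = np.hypot(dx, dy)
        s = C_MM_PER_PS * dt_ps / 2.0               # TOF offset from midpoint
        px, py = mx + s * dx / norm, my + s * dy / norm
        i = int(round(px / voxel_mm)) + shape[0] // 2
        j = int(round(py / voxel_mm)) + shape[1] // 2
        if 0 <= i < shape[0] and 0 <= j < shape[1]:
            histo[i, j] += 1.0
    return histo

# One event on a horizontal LOR with a 100 ps TOF difference lands
# about 15 mm from the center of a 65x65 histo-image with 4 mm voxels.
img = deposit_events([((-300.0, 0.0), (300.0, 0.0), 100.0)], (65, 65), 4.0)
```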
82.
We report on the combined magnetization and electron paramagnetic resonance characterization of a novel Ti-O organic-inorganic gel hybrid and the related electron-hole generation process upon UV illumination. We find that electrons are injected into the conduction band of the Ti-O framework, photoreducing Ti(IV) to Ti(III). Ti(III) sites are mainly located on the surface, owing to the nanometric dimensions of the inorganic component. Surprisingly, the symmetry of the TiO6 octahedra depends on the level of illumination: in lightly UV-exposed samples, Ti(III) sits in weakly distorted TiO6 octahedra to which methanoate groups are bonded, as suggested by electron spin echo envelope modulation (ESEEM) experiments. Extensive illumination causes structural rearrangements, leading to enhanced tetragonal TiO6 distortion and shifting the Ti(III) interaction towards the hydroxide groups or water. The results provide clear evidence for interfacial charge transfer between the quantum-size Ti-O lattice and coordinated species upon in situ and ex situ UV illumination at temperatures from room temperature down to 5 K.
83.
In order to assess the present situation of schistosomiasis in the Zona da Mata Sul, Pernambuco State, Brazil, a study was conducted in the following phases: origin, historical and temporal evolution, and basic determinants of this health/disease process; critical assessment of comprehensive intervention programs implemented by the State in the region since 1970; and a case study in 17 counties, representing 1,424 communities and 485,200 inhabitants, and Brazil's second most endemic region based on prevalence rates for schistosomiasis. Temporal series over a 14-year period were used to analyze results of intervention programs. Conclusions were: a) current positivity rates are higher than those observed in the early 1980s; b) the programs' strategy focused almost exclusively on mass treatment, thus allowing for reinfestation and occurrence of new cases; c) proposals such as the PCDEN (Program for Control of Endemic Diseases in the Northeast) aimed at decentralization to the municipal level in the 1990s were not effectively implemented, helping to leave this persistent endemic out of control.
84.
Metabolomics encompasses the study of small molecules in a biological sample. Liquid chromatography coupled with mass spectrometry (LC-MS) profiling is an important approach for the identification and quantification of metabolites from complex biological samples. The amount and complexity of data produced in an LC-MS profiling experiment demand automatic tools for the preprocessing, analysis, and extraction of useful biological information. Data preprocessing (a topic that covers noise filtering, peak detection, deisotoping, alignment, identification, and normalization) is thus an active area of metabolomics research. Recent years have witnessed the development of many software tools for data preprocessing, yet there is still a need for further improvement of the data preprocessing pipeline. This review presents an overview of selected software tools for preprocessing LC-MS based metabolomics data and suggests future directions.
85.
Data processing and the identification of unknown compounds in comprehensive two-dimensional gas chromatography combined with time-of-flight mass spectrometry (GC×GC/TOFMS) analysis are a major challenge, particularly when large sample sets are analyzed. Herein, we present a method for the efficient treatment of large data sets produced by GC×GC/TOFMS, implemented as a freely available open-source software package, Guineu. To handle large data sets and to efficiently utilize all the features available in the vendor software (baseline correction, mass spectral deconvolution, peak picking, integration, library search, and signal-to-noise filtering), data preprocessed by the instrument software are used as a starting point for further processing. Our software affords alignment of the data, normalization, data filtering, and utilization of retention indexes in the verification of identification, as well as a novel tool for automated group-type identification of the compounds. The different features of the software are studied in detail, and the performance of the system is verified by the analysis of a large set of standard samples as well as a large set of authentic biological samples, including control samples. The quantitative features of our GC×GC/TOFMS methodology are also studied to further demonstrate the method's performance, and the experimental results confirm the reliability of the developed procedure. The methodology has already been successfully used for the analysis of several thousand samples in the field of metabolomics.
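The abstract mentions using retention indexes to verify identifications. As a toy illustration (the abstract does not give Guineu's exact formula), the standard van den Dool-Kratz retention index for temperature-programmed GC interpolates the analyte's retention time between the n-alkanes bracketing it:

```python
def retention_index(t, t_n, t_next, n):
    """van den Dool-Kratz retention index for temperature-programmed GC.

    t      : retention time of the analyte
    t_n    : retention time of the n-carbon alkane eluting before it
    t_next : retention time of the (n+1)-carbon alkane eluting after it
    n      : carbon number of the earlier alkane
    """
    return 100.0 * (n + (t - t_n) / (t_next - t_n))

# An analyte eluting midway between C10 (7.0 min) and C11 (8.0 min):
print(retention_index(7.5, 7.0, 8.0, 10))  # 1050.0
```

A compound whose computed index deviates strongly from the library value for its putative identity can then be flagged as a likely misidentification.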
86.
This paper deals with the influence of a phase-modulated synthetic jet on the aerodynamics of a hump in the closed test section of an Eiffel-type wind tunnel. Three measurement techniques were used to study this phenomenon: pressure profiles using a Kiel total-pressure probe, velocity profiles using a CTA (constant temperature anemometry) probe, and visualization of the flow field using hot film and a thermal camera. Experimental results with and without the influence of the synthetic jet were compared, as well as the impact of the phase shift between neighbouring synthetic jets. As the reference case, the flow around the hump without the influence of the synthetic jet was selected. The results of the measurements are presented in figures and compared.
87.
Electroporation-based applications require the use of specific pulse parameters for a successful outcome. When recommended values of pulse parameters cannot be set, similar outcomes can be obtained by using equivalent pulse parameters. We determined the relations between the amplitude and the duration/number of pulses resulting in the same fraction of electroporated cells. Pulse duration was varied from 150 ns to 100 ms, and the number of pulses from 1 to 128. Fura 2-AM was used to detect electroporation of cells via Ca2+ uptake. With longer pulses or a higher number of pulses, lower amplitudes are needed for the same fraction of electroporated cells. The expression derived from the model of electroporation could describe the measured data over the whole interval of pulse durations. In a narrower range (0.1-100 ms), less complex logarithmic or power functions could be used instead. The relation between amplitude and number of pulses was best described by a power function or an exponential function. We show that relatively simple two-parameter power or logarithmic functions are useful when equivalent pulse parameters for electroporation are sought. Such mathematical relations between pulse parameters can be important in the planning of electroporation-based treatments, such as electrochemotherapy and nonthermal irreversible electroporation.
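A two-parameter power function of the kind mentioned above can be fitted by ordinary linear regression in log-log space. The data below are synthetic values following an assumed power-law trend, not measurements from the study; the fit simply recovers the parameters used to generate them:

```python
import numpy as np

# Hypothetical trend: amplitude E (in arbitrary units) needed for the same
# electroporated fraction as the number of pulses n grows, E = a * n**b.
n_pulses = np.array([1, 2, 4, 8, 16, 32, 64, 128], dtype=float)
amplitude = 1.2 * n_pulses ** -0.25   # synthetic, illustrative values

# Fit the power function by linear regression on log-transformed data:
# log E = log a + b * log n.
b, log_a = np.polyfit(np.log(n_pulses), np.log(amplitude), 1)
a = np.exp(log_a)

print(round(a, 3), round(b, 3))  # recovers a = 1.2, b = -0.25
```

With measured equivalence data, the fitted (a, b) pair gives a compact rule for trading pulse number against amplitude when the recommended settings cannot be delivered.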
88.
The point-in-polygon problem is often encountered in geographical information systems. The algorithms usually work on polygons defined by straight edges. In some situations, however, polygons containing circular arcs are applied. In geographical information systems these polygons are usually considered as geometric buffers, geodesic offsets, or geodesic parallels. This paper presents three algorithms suitable for providing information about the containment of a point in geometric buffers: the Ray-crossing method, the Cell-Based Algorithm and the Approximate approach. An extensive experimental section allows the reader to select the most efficient algorithm for practical problems.
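For reference, the Ray-crossing method for a polygon with straight edges is the classic even-odd test sketched below; the paper's contribution is extending such tests to boundaries containing circular arcs, which this sketch does not cover.

```python
def point_in_polygon(px, py, vertices):
    """Even-odd ray-crossing test for a simple polygon with straight edges.

    Casts a horizontal ray to the right of (px, py) and counts how many
    edges it crosses; an odd count means the point is inside.
    """
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # The edge crosses the ray's height only if its endpoints straddle py.
        if (y1 > py) != (y2 > py):
            # x-coordinate where the edge crosses the horizontal line y = py.
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True
print(point_in_polygon(5, 2, square))  # False
```

For arc-bounded buffers, the crossing test for each boundary element must instead intersect the ray with a circular arc, but the even-odd counting logic is unchanged.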
89.
Iterative image reconstruction algorithms play an increasingly important role in modern tomographic systems, especially in emission tomography. With the fast growth in the size of tomographic data sets, reducing the computational demands of the reconstruction algorithms is of great importance. Fourier-based forward- and back-projection methods have the potential to considerably reduce the computation time in iterative reconstruction. Additional substantial speed-up of these approaches can be obtained by utilizing powerful and cheap off-the-shelf fast Fourier transform (FFT) processing hardware. Fourier reconstruction approaches are based on the relationship between the Fourier transform of the image and the Fourier transform of the parallel-ray projections. The two critical steps are the estimation of the samples of the projection transform, on the central section through the origin of Fourier space, from the samples of the transform of the image, and vice versa for back-projection. Interpolation errors are a limitation of Fourier-based reconstruction methods. We have applied min-max optimized Kaiser-Bessel interpolation within the nonuniform FFT (NUFFT) framework and devised ways of incorporating resolution models into the Fourier-based iterative approaches. Numerical and computer simulation results show that the min-max NUFFT approach provides substantially lower approximation errors in tomographic forward- and back-projection than conventional interpolation methods. Our studies further confirm that Fourier-based projectors using the NUFFT approach provide accurate approximations to their space-based counterparts but with about ten times faster computation, and that they are viable candidates for fast iterative image reconstruction.
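The central-section relationship the method rests on can be checked numerically in the axis-aligned case: the 1-D FFT of a parallel projection equals the corresponding central row of the image's 2-D FFT. For tilted view angles the central section falls between the Cartesian samples of the image transform, which is why the choice of interpolation (here, min-max optimized Kaiser-Bessel within the NUFFT) governs accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((32, 32))

# Parallel projection along the y-axis: sum the image over its rows.
proj = img.sum(axis=0)

# Central-section (Fourier-slice) theorem: the 1-D FFT of the projection
# equals the ky = 0 row of the image's 2-D FFT.
print(np.allclose(np.fft.fft(proj), np.fft.fft2(img)[0, :]))  # True
```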
90.
Spatial databases contain geocoded data. Geocoded data play a major role in numerous engineering applications, such as transportation and environmental studies, where geospatial information systems (GIS) are used for spatial modeling and analysis because they contain spatial information (e.g., latitude and longitude) about objects. The information that a GIS produces is affected by the quality of the geocoded data (e.g., coordinates) stored in its database. To make appropriate and reasonable decisions using geocoded data, it is important to understand the sources of uncertainty in geocoding. There are two major sources of uncertainty in geocoding: one related to the database that is used as a reference data set to geocode objects, and one related to the interpolation technique used. Factors such as the completeness, correctness, consistency, currency, and accuracy of the data in the reference database contribute to the former, whereas the specific logic and assumptions used in an interpolation technique contribute to the latter. The primary purpose of this article is to understand the uncertainties associated with interpolation techniques used for geocoding. To that end, three geocoding algorithms were used and tested, and the results were compared with data collected by the Global Positioning System (GPS). The overall comparison indicated no significant differences between the three algorithms.
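The article does not spell out the three interpolation algorithms, but the simplest and most common geocoding interpolation is linear address-range interpolation along a street segment; the sketch below uses hypothetical address ranges and coordinates.

```python
def interpolate_address(house_number, range_start, range_end, seg_start, seg_end):
    """Linear address-range interpolation for geocoding.

    Assumes house numbers are distributed uniformly between the street
    segment's address range, an assumption that is itself one source of
    the interpolation uncertainty the article discusses.
    """
    # Fraction of the way along the segment implied by the house number.
    t = (house_number - range_start) / (range_end - range_start)
    x = seg_start[0] + t * (seg_end[0] - seg_start[0])
    y = seg_start[1] + t * (seg_end[1] - seg_start[1])
    return (x, y)

# House 150 on a block numbered 100-200 running from (0.0, 0.0) to (1.0, 0.0):
print(interpolate_address(150, 100, 200, (0.0, 0.0), (1.0, 0.0)))  # (0.5, 0.0)
```

Comparing such interpolated positions against GPS fixes, as the article does, quantifies how much the uniform-spacing assumption costs in practice.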