Subscription full text: 68 articles
Free full text: 6 articles

Articles by subject:
  Chemical industry: 21
  Machinery and instruments: 2
  Light industry: 5
  Water conservancy engineering: 1
  Radio and electronics: 11
  General industrial technology: 6
  Metallurgical industry: 9
  Atomic energy technology: 1
  Automation technology: 18

Articles by year:
  2021: 4
  2020: 1
  2019: 2
  2018: 3
  2017: 1
  2016: 4
  2015: 2
  2014: 2
  2013: 4
  2012: 5
  2011: 6
  2010: 3
  2009: 8
  2008: 1
  2007: 1
  2006: 2
  2004: 1
  2003: 3
  2002: 3
  2001: 2
  2000: 2
  1998: 2
  1997: 2
  1996: 2
  1995: 2
  1994: 2
  1993: 1
  1989: 1
  1988: 1
  1975: 1

74 query results found (search time: 15 ms)
1.
Minimum variance beamformers are usually complemented with diagonal loading in order to provide robustness against impairments such as imprecise knowledge of the steering vector or finite sample size effects. This paper concentrates on the latter application: it is assumed that the steering vector is perfectly known and that diagonal loading is used to alleviate finite sample size impairments. The analysis is asymptotic in the sense that both the number of antennas and the number of samples are assumed to be large but of the same order of magnitude. Borrowing results from random matrix theory, the authors first derive a deterministic expression for the asymptotic signal-to-interference-plus-noise ratio (SINR) at the output of the diagonally loaded beamformer. Then, using the statistical theory of large observations (also known as general statistical analysis, or G-analysis), they derive an estimator of the optimum loading factor that is consistent when both the number of antennas and the sample size grow without bound at the same rate. As a result, the estimator performs excellently even when the number of observations is low relative to the number of elements of the array.
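As a companion to this abstract, here is a minimal NumPy sketch of a diagonally loaded MVDR beamformer and its output SINR. The array size, snapshot count, interference scenario, and loading factor `delta` are illustrative assumptions, not values or estimators from the paper.

```python
import numpy as np

def loaded_mvdr_weights(R_hat, a, delta):
    """Diagonally loaded MVDR: w proportional to (R_hat + delta*I)^{-1} a,
    normalized so that w^H a = 1 (distortionless toward the known steering vector)."""
    R_loaded = R_hat + delta * np.eye(R_hat.shape[0])
    w = np.linalg.solve(R_loaded, a)
    return w / (a.conj() @ w)

rng = np.random.default_rng(0)
M, N = 16, 32                       # antennas and snapshots of the same order:
                                    # the asymptotic regime the abstract studies
a   = np.exp(1j * np.pi * np.arange(M) * np.sin(0.0))   # known steering vector
a_i = np.exp(1j * np.pi * np.arange(M) * np.sin(0.5))   # interferer direction

def cgauss(*shape):                 # unit-variance circular complex Gaussian
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

P_i = 100.0                                                   # interferer power (assumed)
X = np.sqrt(P_i) * np.outer(a_i, cgauss(N)) + cgauss(M, N)    # observed snapshots
R_hat = X @ X.conj().T / N                                    # sample covariance

w = loaded_mvdr_weights(R_hat, a, delta=1.0)                  # delta: assumed loading
R_true = P_i * np.outer(a_i, a_i.conj()) + np.eye(M)          # true interference+noise
sinr = np.abs(w.conj() @ a) ** 2 / np.real(w.conj() @ R_true @ w)
print(f"output SINR (unit-power signal): {10 * np.log10(sinr):.1f} dB")
```

Sweeping `delta` in this toy setup and comparing the resulting SINR against the unloaded case (`delta=0`) reproduces the qualitative effect the paper quantifies: with N comparable to M, some loading is needed to stabilize the inversion of the sample covariance.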
2.
A simple, rapid and sensitive procedure for the simultaneous determination of total cholesterol, tocopherols and β-carotene in meat is described. The method involves direct saponification of the meat, a single n-hexane extraction and analysis of the extracted compounds by normal-phase HPLC, using fluorescence (tocopherols) and UV–Vis photodiode array (cholesterol and β-carotene) detection in tandem. Recovery rates from spiked meat samples were 93% for cholesterol, 83–86% for α-, β- and γ-tocopherols, and 89% for β-carotene. Repeatability was high (CV < 6%) for all determined compounds except δ-tocopherol; this tocopherol, which is not usually present in meat, showed a much lower recovery (73%) and repeatability (CV of 12.8%). The methodology was applied to the quantification of total cholesterol, tocopherols and β-carotene in three muscles (longissimus thoracis, longissimus lumborum and semitendinosus) of the Portuguese traditional Barrosã-PDO veal, obtained from autochthonous calves raised extensively during summer (when green pastures are least abundant) and slaughtered in early autumn (October). Barrosã-PDO veal showed moderate contents of total cholesterol (0.50–0.56 mg/g) and, depending on the muscle analysed, moderate to high contents of α-tocopherol (3.3–3.9 μg/g) and β-carotene (0.07–0.09 μg/g), suggesting a high sensorial and hygienic quality.
3.
To acquire maximum information on the geometrical errors of industrially made surfaces at minimum cost, a method for estimating conditional probabilities of a random signal (Bayesian prediction) is applied to three-dimensional metrology. First, a surface is interpolated between data points acquired on a coordinate measuring machine (CMM). Then, for a given probability, limit surfaces are computed that bound a region of space containing the known data and the most probable interpolation of the missing data of the surface. These bounds can be treated as the surface itself: their points can be used as if they were actual CMM data when fitting a tolerance zone or a datum feature. For Bayesian prediction, the basic hypotheses on the signal are stationarity, ergodicity, and Gaussian density. Deviations from these hypotheses and their consequences for the prediction are taken into account, and corrections are proposed.
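A minimal one-dimensional sketch of the kind of conditional (kriging-style) Gaussian prediction the abstract describes, under the stationarity and Gaussian hypotheses it names. The covariance parameters and measurement values are illustrative assumptions, and the real method works on full surfaces rather than single scan lines.

```python
import numpy as np

def gp_predict(x_obs, y_obs, x_new, ell=2.0, sigma2=1e-4, noise=1e-8):
    """Conditional mean and variance of a stationary Gaussian random signal
    (simple kriging) at unmeasured points x_new, given CMM data (x_obs, y_obs)."""
    k = lambda a, b: sigma2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    K = k(x_obs, x_obs) + noise * np.eye(len(x_obs))     # covariance of the data
    Ks = k(x_new, x_obs)                                 # cross-covariance
    mean = Ks @ np.linalg.solve(K, y_obs)
    var = sigma2 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, np.maximum(var, 0.0)

# Sparse form-deviation measurements along one CMM scan line (illustrative, mm)
x_obs = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
y_obs = np.array([0.002, -0.001, 0.003, 0.000, -0.002])

x_new = np.linspace(0.0, 20.0, 81)
mean, var = gp_predict(x_obs, y_obs, x_new)
z = 1.96                                   # ~95% probability level
upper, lower = mean + z * np.sqrt(var), mean - z * np.sqrt(var)
# upper/lower play the role of the "limit surfaces": their points can be fed
# to tolerance-zone or datum-feature fitting as if they were actual CMM data.
```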
4.
Deduplication is the task of identifying the entities in a data set that refer to the same real-world object. Over the last decades, this problem has been widely investigated and many techniques have been proposed to improve the efficiency and effectiveness of deduplication algorithms. As data sets become larger, such algorithms may face critical bottlenecks in memory usage and execution time. In this context, cloud computing environments have been used to scale out data quality algorithms. In this paper, we investigate the efficacy of different machine learning techniques for scaling out virtual clusters for the execution of deduplication algorithms under predefined time restrictions. We also propose specific heuristics (Best Performing Allocation, Probabilistic Best Performing Allocation, Tunable Allocation, Adaptive Allocation and Sliced Training Data) which, together with the machine learning techniques, are able to tune the virtual cluster estimations as demand fluctuates over time. The experiments we carried out using data sets of multiple scales provided many insights into the adequacy of the considered machine learning algorithms and the proposed heuristics for tackling cloud computing provisioning.
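The paper's named heuristics are not reproduced here; the sketch below only illustrates the general provisioning idea the abstract describes, using an assumed runtime history and scikit-learn regression to pick the smallest virtual cluster whose predicted deduplication runtime meets a deadline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Historical executions: (record_count, vm_count) -> observed runtime in seconds.
# All values are illustrative, not measurements from the paper.
history_X = np.array([[1e5, 2], [1e5, 4], [5e5, 4], [5e5, 8], [1e6, 8], [1e6, 16]])
history_t = np.array([600.0, 330.0, 1500.0, 800.0, 1700.0, 900.0])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(history_X, history_t)

def provision(record_count, deadline_s, max_vms=32):
    """Smallest VM count whose predicted dedup runtime meets the deadline."""
    for vms in range(1, max_vms + 1):
        if model.predict([[record_count, vms]])[0] <= deadline_s:
            return vms
    return max_vms   # deadline unreachable within the budget; allocate the cap

print(provision(record_count=8e5, deadline_s=1000))
```

Retraining the regressor as new executions complete is one simple way to let the estimates track fluctuating demand, which is the role the paper's heuristics play in a more refined form.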
5.
We present a 4-approximation algorithm for the problem of placing the fewest guards on a 1.5D terrain so that every point of the terrain is seen by at least one guard. This improves on the previous best approximation factor of 5 (see King, Proceedings of the 13th Latin American Symposium on Theoretical Informatics, pp. 629–640, 2006). Unlike most previous techniques, our method is based on rounding the linear programming relaxation of the corresponding covering problem. Besides the simplicity of the analysis, which mainly relies on decomposing the constraint matrix of the LP into totally balanced matrices, our algorithm, unlike previous work, generalizes to the weighted and partial versions of the basic problem.
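A small sketch of the LP-relaxation-and-rounding pattern the abstract describes, using SciPy on a toy covering instance. The visibility matrix is made up, and the naive threshold rounding shown is a generic illustration only, not the paper's rounding (which exploits totally balanced constraint submatrices to obtain the factor 4).

```python
import numpy as np
from scipy.optimize import linprog

def lp_cover(A, w):
    """LP relaxation of weighted set cover: minimize w^T x s.t. A x >= 1,
    0 <= x <= 1. Rows of A are terrain points, columns are candidate guards;
    A[i, j] = 1 iff guard j sees point i."""
    n_pts, n_guards = A.shape
    res = linprog(c=w, A_ub=-A, b_ub=-np.ones(n_pts),
                  bounds=[(0, 1)] * n_guards, method='highs')
    return res.x

# Tiny illustrative visibility matrix: 4 terrain points, 3 candidate guards.
A = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 0],
              [0, 1, 1]], dtype=float)
w = np.ones(3)                        # unweighted: minimize the number of guards
x = lp_cover(A, w)
# Naive frequency-based rounding: each point here is seen by at most 4 guards
# would justify the 1/4 threshold in the classical set-cover argument.
chosen = np.where(x >= 0.25)[0]
print("fractional LP solution:", np.round(x, 3), "-> guards chosen:", chosen)
```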
6.
Multimedia Tools and Applications - Microsoft has recently released a mixed reality headset called HoloLens. This semi-transparent visor headset allows the user who wears it to view the projection...
7.
8.
The effects of composition and furnace temperature on Ni1−ΨCoΨCr2−2ΨAlO4 (0 ≤ Ψ ≤ 1) pigments prepared by solution combustion synthesis (SCS) were studied. As-synthesized samples showed a spinel-like spongy structure that was very easy to grind. However, important differences in crystallinity, crystal size, and microstructure were observed depending on composition and furnace temperature. All pigments developed intense tones covering a wide color palette owing to the influence of composition, although little effect of furnace temperature was observed. Stable crystalline structures, suitable grain size, and high resistance to synthesis variables and ceramic glazes make SCS pigments excellent candidates for ceramic ink-jet decoration.
9.
The oxidation of diacetone-L-sorbose (DAS) into diacetone-2-keto-L-gulonic acid (DAG) is a step in the synthesis of vitamin C. This oxidation was carried out electrochemically in a stirred cell with the redox couple Ni(OH)2/β-NiOOH as an electrochemical mediator. The anode consisted of a current collector (nickel foam or platinized titanium) and a suspension of Ni(OH)2 in aqueous KOH. The mediated electrooxidation of DAS was achieved by the Ni(OH)2/β-NiOOH couple in solution, which was regenerated at the current collector. This process required a two-compartment reactor but afforded high chemical conversion yields: 96% was obtained with a 48% faradaic yield. Previous work on the electrooxidation of DAS to DAG at nickel anodes is also reviewed.
10.
BACKGROUND: The concept of health transition is intended to define, from a plural point of view, the changes in health conditions that have contributed to the decrease in mortality associated with the demographic transition. The purpose of this study is to analyse the health transition in Spain during the twentieth century (1900-1990). METHOD: The different components of the health transition (the epidemiological, risk, and health care transitions) were studied using historical series from the Natural Population Changes records, Annual Statistics, and Housing Census reports. RESULTS: Overall and child mortality rates tended to decrease over the entire period: overall mortality fell by 70%, while child mortality dropped by 96%. Life expectancy increased by 42 years, from 35 in 1900 to 77 in 1990, a relative increase of 120%. Infectious disease-related deaths decreased by 95%, while non-infectious disease-related deaths increased by 134%. The epidemiological transition in Spain can therefore be said to have concluded in the fifties, ending the previous pattern of high mortality (especially among children) with infectious diseases as the main cause of death, and giving way to a new situation in which mortality rates dropped considerably and non-infectious diseases became the main cause of death (the turning point was in 1945). CONCLUSIONS: The new epidemiological pattern that emerged over the period studied appears to result from improved sanitary infrastructure, increased spending, and better medical services, but it also includes new health problems related to working conditions, massive urban development (particularly from the sixties onwards) and changes in lifestyle.