  Paid full text   189 papers
  Free   2 papers
Chemical industry   22 papers
Building science   2 papers
Energy and power   18 papers
Light industry   17 papers
Water conservancy engineering   1 paper
Radio electronics   28 papers
General industrial technology   21 papers
Metallurgical industry   33 papers
Automation technology   49 papers
  2023   1 paper
  2021   6 papers
  2020   2 papers
  2019   4 papers
  2018   6 papers
  2017   4 papers
  2016   4 papers
  2015   4 papers
  2014   3 papers
  2013   12 papers
  2012   3 papers
  2011   1 paper
  2010   3 papers
  2009   8 papers
  2008   7 papers
  2006   6 papers
  2005   1 paper
  2004   3 papers
  2003   2 papers
  2002   8 papers
  2001   12 papers
  2000   3 papers
  1999   2 papers
  1998   22 papers
  1997   9 papers
  1996   6 papers
  1995   4 papers
  1994   3 papers
  1992   6 papers
  1991   4 papers
  1990   4 papers
  1989   3 papers
  1988   1 paper
  1986   3 papers
  1985   2 papers
  1984   1 paper
  1982   4 papers
  1978   2 papers
  1977   2 papers
  1976   1 paper
  1974   1 paper
  1973   2 papers
  1971   1 paper
  1970   1 paper
  1967   1 paper
  1961   1 paper
  1956   1 paper
  1955   1 paper
Sort order:   191 query results in total; search took 31 ms
1.
Approximate maximum likelihood (ML) hidden Markov modeling using the most likely state sequence (MLSS) is examined and compared with the exact ML approach that considers all possible state sequences. It is shown that for any hidden Markov model (HMM), the difference between the approximate and the exact normalized likelihood functions cannot exceed the logarithm of the number of states divided by the dimension of the output vectors (frame length), which is negligible for typically used values of vector dimension (128–256) and number of states (2–30). Furthermore, for Gaussian HMMs and a given observation sequence, the MLSS is typically the sequence of nearest neighbor states in the Itakura-Saito sense, and the posterior probability of any state sequence which departs from the MLSS in a single time instant decays exponentially with the frame length. Hence, for a sufficiently large frame length the exact and approximate ML approaches provide similar model estimates and likelihood values. The results and their implications on speech recognition are demonstrated in a set of experiments.
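The state-sequence bound quoted in this abstract is easy to check numerically. The sketch below is a toy construction of mine, not the authors' code: for a small discrete-output HMM it compares the exact log-likelihood (forward algorithm, summing over all state sequences) with the Viterbi/MLSS approximation. Since the exact likelihood sums at most K^T sequence terms, the gap per time step can never exceed log K; this is the scalar-observation analogue of the paper's log(states)/frame-length bound. All parameter values are invented for illustration.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
K, M, T = 3, 4, 200                        # states, symbols, sequence length

A = rng.dirichlet(np.ones(K), size=K)      # K x K transition matrix
B = rng.dirichlet(np.ones(M), size=K)      # K x M emission matrix
pi = rng.dirichlet(np.ones(K))             # initial state distribution
obs = rng.integers(0, M, size=T)           # arbitrary observation sequence
logA, logB = np.log(A), np.log(B)

def forward_loglik(obs):
    """Exact log p(obs): log-domain forward recursion over all sequences."""
    alpha = np.log(pi) + logB[:, obs[0]]
    for o in obs[1:]:
        alpha = logsumexp(alpha[:, None] + logA, axis=0) + logB[:, o]
    return logsumexp(alpha)

def viterbi_loglik(obs):
    """Approximate log p(obs): keeps only the most likely state sequence."""
    delta = np.log(pi) + logB[:, obs[0]]
    for o in obs[1:]:
        delta = (delta[:, None] + logA).max(axis=0) + logB[:, o]
    return delta.max()

exact, approx = forward_loglik(obs), viterbi_loglik(obs)
gap = exact - approx
assert 0 <= gap <= T * np.log(K)           # at most log K per time step
print(f"exact={exact:.1f}  MLSS={approx:.1f}  "
      f"gap/T={gap / T:.4f}  bound log(K)={np.log(K):.4f}")
```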
2.
The extended Ziv-Zakai bound for vector parameters is used to develop a lower bound on the mean square error in estimating the 2-D bearing of a narrowband planewave signal using planar arrays of arbitrary geometry. The bound has a simple closed-form expression that is a function of the signal wavelength, the signal-to-noise ratio (SNR), the number of data snapshots, the number of sensors in the array, and the array configuration. Analysis of the bound suggests that there are several regions of operation, and expressions for the thresholds separating the regions are provided. In the asymptotic region where the number of snapshots and/or SNR are large, estimation errors are small, and the bound approaches the inverse Fisher information. This is the same as the asymptotic performance predicted by the local Cramer-Rao bound for each value of bearing. In the a priori performance region where the number of snapshots or SNR is small, estimation errors are distributed throughout the a priori parameter space and the bound approaches the a priori covariance. In the transition region, both small and large errors occur, and the bound varies smoothly between the two extremes. Simulations of the maximum likelihood estimator (MLE) demonstrate that the bound closely predicts the performance of the MLE in all regions.
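To make the regions of operation concrete, here is a hedged Monte Carlo sketch under assumptions of my own choosing: a single source, a 10-element half-wavelength uniform linear array (so 1-D bearing rather than the paper's 2-D case), and the textbook single-source conditional Cramer-Rao bound standing in for the paper's closed-form Ziv-Zakai expression. At low SNR the RMSE of the grid-search MLE saturates near the a priori spread (large outlier errors), while at high SNR it tracks sqrt(CRB), mirroring the a priori, transition, and asymptotic regions described above.

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 10, 20                          # sensors, snapshots (assumed values)
theta = np.deg2rad(20.0)               # true bearing
m = np.arange(N)                       # element indices, half-wavelength apart

def steer(th):
    """Steering vector of a half-wavelength ULA."""
    return np.exp(1j * np.pi * m * np.sin(th))

grid = np.deg2rad(np.linspace(-90.0, 90.0, 2001))
A_grid = np.stack([steer(th) for th in grid])          # (2001, N)

def mle(X):
    """Grid-search ML bearing estimate: maximise beamformer output power."""
    power = (np.abs(A_grid.conj() @ X) ** 2).sum(axis=1)
    return grid[np.argmax(power)]

for snr_db in (-20, -10, 0, 10):
    snr, sigma = 10.0 ** (snr_db / 10.0), 1.0
    s = np.sqrt(snr) * sigma * np.ones(L)              # constant-power waveform
    errs = []
    for _ in range(300):
        noise = (rng.standard_normal((N, L))
                 + 1j * rng.standard_normal((N, L))) * sigma / np.sqrt(2)
        errs.append(mle(np.outer(steer(theta), s) + noise) - theta)
    rmse = np.sqrt(np.mean(np.square(errs)))
    # Textbook single-source conditional CRB (variance in rad^2).
    crb = 1.0 / (2 * L * snr * (np.pi * np.cos(theta)) ** 2
                 * (m @ m - m.sum() ** 2 / N))
    print(f"SNR {snr_db:+3d} dB: RMSE {np.rad2deg(rmse):6.2f} deg, "
          f"sqrt(CRB) {np.rad2deg(np.sqrt(crb)):6.2f} deg")
```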
3.
Although direct evidence of carcinogenic risk from mammography is lacking, there is a hypothetical risk from screening because excess breast cancers have been demonstrated in women receiving doses of 0.25-20 Gy. These high-level exposures to the breast occurred from the 1930s to the 1950s due to atomic bomb radiation, multiple chest fluoroscopies, and radiation therapy treatments for benign disease. Using a risk estimate provided by the Biological Effects of Ionizing Radiation (BEIR) V Report of the National Academy of Sciences and a mean breast glandular dose of 4 mGy from a two-view per breast bilateral mammogram, one can estimate that annual mammography of 100,000 women for 10 consecutive years beginning at age 40 will result in at most eight breast cancer deaths during their lifetime. On the other hand, researchers have shown a 24% mortality reduction from biennial screening of women in this age group; this will result in a benefit-to-risk ratio of 48.5 lives saved per life lost and 121.3 years of life saved per year of life lost. An assumed mortality reduction of 36% from annual screening would result in 36.5 lives saved per life lost and 91.3 years of life saved per year of life lost. Thus, the theoretical radiation risk from screening mammography is extremely small compared with the established benefit from this life-saving procedure and should not unduly distract women under age 50 who are considering screening.
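The quoted ratios can be sanity-checked with back-of-envelope arithmetic. The sketch below only rearranges the figures quoted in the abstract and introduces no new data.

```python
radiation_deaths = 8.0        # deaths per 100,000 women screened (from text)
lives_ratio_biennial = 48.5   # lives saved per life lost, 24% reduction
lives_ratio_annual = 36.5     # lives saved per life lost, 36% reduction

# Benefit-to-risk ratio = lives saved / lives lost, so:
print("implied lives saved, biennial:", radiation_deaths * lives_ratio_biennial)
print("implied lives saved, annual:  ", radiation_deaths * lives_ratio_annual)

# Both years-of-life ratios exceed the lives ratios by the same factor of
# about 2.5, presumably because radiation-induced cancer deaths, with their
# decades-long latency, occur later in life and so cost fewer remaining
# years than the deaths screening prevents.
print("years/lives ratio:", 121.3 / 48.5, 91.3 / 36.5)
```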
4.
An in-depth experimental study of heat transfer in ovens has provided basic data that is directly applicable to design. Heat transfer coefficients were measured for thermal loads having either black or highly reflective surface finishes. Approximately 100 different data runs were carried out. These heat transfer coefficients enabled the separation of the heat transfer into convective and radiative components, with radiation being the dominant transfer mechanism for blackened loads. The thermal response of the load to the presence of blockages situated either below or above the load was quantified. This response was only slightly affected by the blockages when they were empty of water, but major effects were observed when the blockages were water filled. Major effects were also encountered when the load was supported from below by cookie sheets. On the other hand, extensive investigation of various positions throughout the oven indicated a very weak effect of load position on the thermal response.
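As a rough illustration of the convective/radiative split that the measured coefficients enable, here is a sketch using textbook relations (Newton cooling and small-grey-body radiative exchange) with invented temperatures and coefficient values; it is not the paper's data, but it reproduces the qualitative finding that radiation dominates for a blackened load.

```python
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

def flux_components(T_oven, T_load, h_conv, emissivity):
    """Split the net heat flux to the load (W/m^2) into its two parts."""
    q_conv = h_conv * (T_oven - T_load)                    # Newton cooling
    q_rad = emissivity * SIGMA * (T_oven**4 - T_load**4)   # grey-body exchange
    return q_conv, q_rad

# Blackened load (emissivity ~0.95) vs reflective load (~0.1); the oven and
# load temperatures and h_conv are illustrative assumptions, not measurements.
for eps, label in ((0.95, "black"), (0.10, "reflective")):
    qc, qr = flux_components(T_oven=450.0, T_load=320.0,
                             h_conv=10.0, emissivity=eps)
    print(f"{label:10s} q_conv={qc:6.0f} W/m^2   q_rad={qr:6.0f} W/m^2")
```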
5.
Since DeLone and McLean (D&M) developed their model of IS success, there has been much research on the topic of success as well as extensions and tests of their model. Using the technique of a qualitative literature review, this research reviews 180 papers found in the academic literature for the period 1992–2007 dealing with some aspect of IS success. Using the six dimensions of the D&M model – system quality, information quality, service quality, use, user satisfaction, and net benefits – 90 empirical studies were examined and the results summarized. Measures for the six success constructs are described and 15 pairwise associations between the success constructs are analyzed. This work builds on the prior research related to IS success by summarizing the measures applied to the evaluation of IS success and by examining the relationships that comprise the D&M IS success model in both individual and organizational contexts.
6.
Computational models of emotions have been thriving and increasingly popular since the 1990s. Such models have typically been concerned with the emotions of individual agents as they interact with other agents. Out of the array of models of emotions, we devote special attention to the approach in Adamatzky's Dynamics of Crowd-Minds. The reason it stands out is that it considers the crowd rather than the individual agent. It fits within computational intelligence. It works by mathematical simulation on a crowd of simple artificial agents: by letting the computer program run, the agents evolve, and crowd behaviour emerges. Adamatzky's purpose is to give an account of the emergence of allegedly "irrational" behaviour. This is not without problems, as the irrational to one person may seem entirely rational to another, an insight that, in the history of crowd psychology, has indeed shaped the competition among theories of crowd dynamics. Quite importantly, Adamatzky's book argues for the transition from individual agencies to a crowd's or a mob's coalesced mind as such, and at any rate for a coalesced crowd agency.
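For readers unfamiliar with this style of modelling, the sketch below is a deliberately generic toy (emphatically not Adamatzky's actual model): many simple agents on a ring each update an affective state from their neighbours, and a crowd-level mood emerges from the purely local rule.

```python
import random

random.seed(0)
N = 100
# Each agent holds one of two affective states; start from a mixed crowd.
crowd = [random.choice(["calm", "agitated"]) for _ in range(N)]
print("initial agitated fraction:", crowd.count("agitated") / N)

for _ in range(5000):                     # asynchronous local updates
    i = random.randrange(N)
    left, right = crowd[(i - 1) % N], crowd[(i + 1) % N]
    if left == right:                     # conform to a mood shared by
        crowd[i] = left                   # both neighbours

print("final agitated fraction:  ", crowd.count("agitated") / N)
```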
7.
Lennart Åqvist (1992) proposed a logical theory of legal evidence, based on the Bolding-Ekelöf degrees of evidential strength. This paper reformulates Åqvist's model in terms of the probabilistic version of the kappa calculus. Proving its acceptability in the legal context is beyond the present scope, but the epistemological debate about Bayesianism in law is clearly relevant. While the present model is a possible link to that line of inquiry, we offer some considerations about the broader picture of the potential of AI & Law in the evidentiary context. Whereas probabilistic reasoning is well-researched in AI, calculations about the threshold of persuasion in litigation, whatever their value, are just the tip of the iceberg. The bulk of the modeling desiderata is arguably elsewhere, if one is to ideally make the most of AI's distinctive contribution as envisaged for legal evidence research.
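For context, the probabilistic reading of the kappa calculus treats a rank kappa(w) as the order of magnitude of P(w), i.e. P(w) is proportional to eps**kappa(w) for an infinitesimal eps. The sketch below implements the standard Spohn/Goldszmidt-Pearl combination rules on a toy evidence example; how Åqvist's Bolding-Ekelöf degrees map onto ranks is the paper's own contribution and is not reproduced here.

```python
def kappa_of_event(world_ranks, event):
    """Rank of an event = minimum rank of the worlds satisfying it
    (mirrors P(A) = sum over worlds, dominated by the likeliest one)."""
    return min(world_ranks[w] for w in event)

def kappa_conditional(world_ranks, a_and_b, a):
    """kappa(B|A) = kappa(A and B) - kappa(A), the analogue of Bayes' rule."""
    return (kappa_of_event(world_ranks, a_and_b)
            - kappa_of_event(world_ranks, a))

# Toy worlds (guilty?, evidence?) with illustrative ranks, where rank 0
# means unsurprising and higher ranks mean increasingly surprising.
ranks = {("g", "e"): 1, ("g", "-e"): 3, ("-g", "e"): 2, ("-g", "-e"): 0}
e_worlds = [("g", "e"), ("-g", "e")]      # worlds where the evidence holds
g_and_e = [("g", "e")]
print("kappa(guilty | evidence) =",
      kappa_conditional(ranks, g_and_e, e_worlds))   # 1 - 1 = 0: unsurprising
```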
8.
9.
10.
Ontologies, and legal ontologies as a particular class of application of these, have become fairly popular. Are they fit for purpose? As with all kinds of tools from legal computing, one must be cautious, and consider very attentively what the likely receptions among users are going to be. Maurice v. Judd (New York, 1818), in which a jury was called to decide whether whale oil is fish oil and decided that whales are indeed fish, is a trial analysed in Graham Burnett's Trying Leviathan: The nineteenth-century New York court case that put the whale on trial and challenged the order of nature (Princeton, NJ: Princeton University Press). It is a highly readable book, and it has something important to teach developers of ontologies in the legal domain. The intended public of users of any software, or of ontologies in particular, is paramount. The intended users you are catering to with your new tool are going to make or break it, just as happened, for example, to sentencing information systems in Canadian provinces.