11.
Parameter estimation underlies many of the most fundamental problems of statistical research and practice. In particular, finite mixture models have long relied on deterministic approaches such as expectation maximization (EM). Despite their successful use in a wide spectrum of areas, these approaches tend to converge to local solutions. An alternative is Bayesian inference, which naturally addresses data uncertainty while ensuring good generalization. To this end, in this paper we propose a fully Bayesian approach for Langevin mixture model estimation and selection via an MCMC algorithm based on the Gibbs sampler, Metropolis–Hastings, and Bayes factors. We demonstrate the effectiveness and merits of the proposed learning framework on synthetic data and on challenging applications involving topic detection and tracking and image categorization.
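The Gibbs-sampling machinery the abstract describes can be sketched on a simpler stand-in: a two-component Gaussian mixture (not the paper's Langevin mixture) with conjugate priors, alternating draws of component assignments, means, and mixing weights. All priors, initial values, and data settings below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a two-component Gaussian mixture (unit variances).
x = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.0, 100)])
n = len(x)

mu = np.array([-1.0, 1.0])   # initial component means
pi = np.array([0.5, 0.5])    # initial mixing weights
mu_draws = []

for it in range(500):
    # 1. Sample assignments z_i | mu, pi.
    log_w = np.log(pi) - 0.5 * (x[:, None] - mu[None, :]) ** 2
    w = np.exp(log_w - log_w.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    z = (rng.random(n) < w[:, 1]).astype(int)

    # 2. Sample means mu_k | z  (N(0, 10^2) prior, unit obs. variance).
    for k in range(2):
        xk = x[z == k]
        prec = 1.0 / 100.0 + len(xk)          # posterior precision
        mu[k] = rng.normal(xk.sum() / prec, 1.0 / np.sqrt(prec))

    # 3. Sample weights pi | z  (Dirichlet(1, 1) prior).
    counts = np.array([(z == 0).sum(), (z == 1).sum()])
    pi = rng.dirichlet(counts + 1)
    mu_draws.append(mu.copy())

mu_post = np.mean(mu_draws[100:], axis=0)     # posterior means after burn-in
```

The fully Bayesian framework of the paper would add Metropolis–Hastings steps for non-conjugate Langevin parameters and Bayes factors for choosing the number of components; this sketch shows only the Gibbs backbone.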
12.
The use of graphical processing unit (GPU) parallel processing is becoming a part of mainstream statistical practice. The reliance of Bayesian statistics on Markov Chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that there are substantial gains in improved computational time for MCMC and other methods of evaluation by computing the likelihood using GPU parallel processing. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010 and a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
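The key idea, computing the likelihood as one data-parallel array expression, can be illustrated on a plain negative-binomial log-likelihood (simpler than the paper's convolution of two negative-binomial processes). NumPy on the CPU stands in for the GPU here; the same elementwise structure is what a GPU kernel would exploit.

```python
import numpy as np
from scipy.special import gammaln

def nbinom_loglik(counts, r, p):
    """Vectorized negative-binomial log-likelihood.

    Every observation's log-pmf is evaluated in a single array
    expression -- the data-parallel pattern that maps directly onto
    GPU threads. Parameterization: pmf(k) = C(k+r-1, k) p^r (1-p)^k.
    """
    counts = np.asarray(counts, dtype=float)
    return float(np.sum(
        gammaln(counts + r) - gammaln(r) - gammaln(counts + 1)
        + r * np.log(p) + counts * np.log1p(-p)
    ))
```

Inside an MCMC loop, only this likelihood call needs to be offloaded; the sampler logic itself can remain serial, which is why the paper's speedups come from the likelihood alone.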
13.
Subset Simulation is an adaptive simulation method that efficiently solves structural reliability problems with many random variables. The method requires sampling from conditional distributions, which is achieved through Markov Chain Monte Carlo (MCMC) algorithms. This paper discusses different MCMC algorithms proposed for Subset Simulation and introduces a novel approach for MCMC sampling in the standard normal space. Two variants of the algorithm are proposed: a basic variant, which is simpler than existing algorithms with equal accuracy and efficiency, and a more efficient variant with adaptive scaling. It is demonstrated that the proposed algorithm improves the accuracy of Subset Simulation, without the need for additional model evaluations.
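A sketch of how MCMC sampling in standard normal space can avoid a density-based accept/reject step, under our reading of the basic variant: the candidate is drawn from a conditional normal that leaves the standard normal density invariant by construction, so it is rejected only if it leaves the intermediate failure domain. The correlation parameter, limit-state function, and threshold below are illustrative assumptions.

```python
import numpy as np

def conditional_sampling_step(u, rho, limit_state, b, rng):
    """One MCMC move in standard normal space.

    Candidate: v = rho * u + sqrt(1 - rho^2) * eps,  eps ~ N(0, I).
    This transition leaves N(0, I) invariant, so no Metropolis ratio
    is needed; the move is rejected only if v falls outside the
    current Subset Simulation level {g(u) <= b}.
    """
    eps = rng.standard_normal(u.shape)
    v = rho * u + np.sqrt(1.0 - rho ** 2) * eps
    return v if limit_state(v) <= b else u

# Toy limit-state function: "failure" when the mean coordinate is large.
g = lambda u: -u.mean()
b = -0.5
rng = np.random.default_rng(1)

u0 = np.full(10, 1.0)            # seed already inside {g <= b}
chain = [u0]
for _ in range(200):
    chain.append(conditional_sampling_step(chain[-1], 0.8, g, b, rng))
```

The paper's adaptive variant would tune the correlation (step size) on the fly from the observed acceptance rate; this sketch keeps it fixed.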
14.
In agricultural and environmental sciences, dispersal models are often used for risk assessment, both to predict the risk associated with a given configuration and to test scenarios that are likely to minimise those risks. Like any biological process, dispersal is subject to biological, climatic and environmental variability, and its prediction relies on models and parameter values that can only approximate the real processes. In this paper, we present a Bayesian method for modelling dispersal from spatial configuration and climatic data (distances between emitters and receptors; main wind direction) while accounting for uncertainty, with an application to predicting the adventitious presence rate of genetically modified (GM) maize in a non-GM field. The method includes the design of candidate models and their calibration, selection and evaluation on an independent dataset. A group of models was identified that is sufficiently robust to be used for prediction. This group of models can incorporate local information and reflects the observed variability in the data reliably enough that probabilistic model predictions can be performed and used to quantify risk under different scenarios or to derive optimal sampling schemes.
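A dispersal sub-model of the general kind such a study would calibrate, distance decay inflated downwind, can be written in a few lines. The functional form, parameter names (`alpha`, `beta`, `kappa`), and values are purely hypothetical illustrations, not the authors' candidate models.

```python
import numpy as np

def dispersal_rate(distance, bearing, wind_dir,
                   alpha=0.05, beta=0.1, kappa=1.0):
    """Hypothetical expected adventitious-presence rate at a receptor.

    Exponential decay with emitter-receptor distance (metres), scaled
    by a von-Mises-style anisotropy term that is largest when the
    receptor lies downwind of the main wind direction.
    """
    base = alpha * np.exp(-beta * distance)
    downwind = np.exp(kappa * np.cos(bearing - wind_dir))
    # Normalise so the downwind maximum equals the isotropic rate.
    return base * downwind / np.exp(kappa)
```

In a Bayesian calibration of the kind the abstract describes, `alpha`, `beta`, and `kappa` would get priors and be sampled given observed presence rates in receptor fields.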
15.
Doubly-censored data refers to time-to-event data for which both the originating and failure times are censored. In studies of AIDS incubation time or survival after dementia onset, for example, data are frequently doubly-censored because the date of the originating event is interval-censored and the date of the failure event is usually right-censored. The primary interest is in the distribution of elapsed times between the originating and failure events and its relationship to exposures and risk factors. The estimating-equation approach [Sun et al. (1999). Regression analysis of doubly censored failure time data with applications to AIDS studies. Biometrics 55, 909-914] and its extensions assume the same distribution of originating event times for all subjects. This paper demonstrates the importance of using additional covariates to impute originating event times: more accurate estimation of originating event times may lead to less biased parameter estimates for the elapsed time. The Bayesian MCMC method is shown to be a suitable approach for analyzing doubly-censored data and accommodates a rich class of survival models. The performance of the proposed estimation method is compared to that of other conventional methods through simulations. Two examples, an AIDS cohort study and a population-based dementia study, are used for illustration. Sample code is shown in Appendix A and Appendix B.
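The structure of the problem can be made concrete with the simplest imputation scheme: draw each interval-censored origin time uniformly within its interval and form the elapsed time to the (here, observed) failure. The paper's point is that covariate-informed draws, e.g. from a regression of origin time on risk factors, reduce bias relative to this flat baseline; the numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Interval-censored originating times: event known to lie in [L, R].
L = np.array([0.0, 2.0, 5.0])
R = np.array([1.0, 4.0, 6.0])
failure = np.array([8.0, 9.0, 12.0])       # observed failure times

# Flat multiple imputation: origin ~ Uniform(L, R), repeated n_imp times.
n_imp = 100
elapsed_draws = np.empty((n_imp, len(L)))
for m in range(n_imp):
    origin = rng.uniform(L, R)             # one imputed origin per subject
    elapsed_draws[m] = failure - origin    # elapsed time per subject

elapsed_mean = elapsed_draws.mean(axis=0)  # MI point estimate per subject
```

In the Bayesian MCMC formulation the abstract advocates, the origin times would instead be sampled from their full conditional given covariates and the current survival-model parameters, inside the sampler.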
16.
In this paper, we present an improved procedure for collecting observations from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor that have little or no atmosphere and snow contamination. The resultant time series of daily MODIS data of a temperate deciduous broadleaf forest (the Bartlett Experimental Forest) in 2004 show strong seasonal dynamics of surface reflectance in the green, near-infrared and shortwave-infrared bands, and clearly delineate leaf phenology and the length of the plant growing season. We also estimate the fractions of photosynthetically active radiation (PAR) absorbed by the vegetation canopy (FAPARcanopy), leaf (FAPARleaf), and chlorophyll (FAPARchl), respectively, using a coupled leaf-canopy radiative transfer model (PROSAIL-2) and daily MODIS data. The Markov Chain Monte Carlo (MCMC) method (the Metropolis algorithm) is used for model inversion, which provides probability distributions of the retrieved variables. A two-step procedure is used to estimate the fractions of absorbed PAR: (1) retrieve biophysical and biochemical variables from MODIS images using the PROSAIL-2 model; and (2) calculate the fractions with the estimated model variables from the first step. Inversion and forward simulations of the PROSAIL-2 model are carried out for the temperate deciduous broadleaf forest during day of year (DOY) 184 to 201 in 2005. The reproduced reflectance values from the PROSAIL-2 model agree well with the observed MODIS reflectance for the five spectral bands (green, red, NIR1, NIR2, and SWIR1). The estimated leaf area index, leaf dry matter, leaf chlorophyll content and FAPARcanopy values are close to field measurements at the site. The results also show significant differences between FAPARcanopy and FAPARchl at the site. Our results show that MODIS imagery provides important information on biophysical and biochemical variables at both leaf and canopy levels.
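Metropolis-based model inversion of the kind used here can be sketched with a toy forward model standing in for PROSAIL-2: a single "biophysical" parameter maps to reflectance in several bands, and a random-walk Metropolis chain recovers its posterior given noisy observations. The forward model, noise level, and prior below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model standing in for PROSAIL-2 (illustrative only):
# reflectance as a known function of one variable theta.
def forward(theta, wavelengths):
    return 0.1 + 0.05 * theta * np.exp(-wavelengths / 2.0)

wl = np.linspace(0.5, 2.5, 20)                 # "bands", arbitrary units
theta_true = 3.0
sigma = 0.005                                  # observation noise s.d.
obs = forward(theta_true, wl) + rng.normal(0.0, sigma, wl.size)

def log_post(theta):
    if not 0.0 < theta < 10.0:                 # flat prior on (0, 10)
        return -np.inf
    resid = obs - forward(theta, wl)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis: accept with prob. min(1, post(prop)/post(cur)).
theta, lp = 5.0, log_post(5.0)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

posterior = np.array(samples[1000:])           # discard burn-in
```

The chain's retained samples approximate the posterior of the retrieved variable, which is exactly what allows the paper to report probability distributions, rather than point values, for leaf area index and the other PROSAIL-2 variables.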
17.
Neural networks provide a tool for describing non-linearity in volatility processes of financial data and help to answer the question "how much" non-linearity is present in the data. Non-linearity is studied under three different specifications of the conditional distribution: Gaussian, Student-t and mixture of Gaussians. To rank the volatility models, a Bayesian framework is adopted to perform a Bayesian model selection within the different classes of models. In the empirical analysis, the return series of the Dow Jones Industrial Average index, FTSE 100 and NIKKEI 225 indices over a period of 16 years are studied. The results show different behavior across the three markets. In general, if a statistical model accounts for non-normality and explains most of the fat tails in the conditional distribution, then there is less need for complex non-linear specifications.
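Why the choice of conditional distribution matters can be shown with a rough BIC comparison (an approximation to Bayesian model selection, not the paper's full marginal-likelihood procedure): on fat-tailed simulated returns, a Student-t conditional beats a Gaussian one decisively. The simulation settings are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated "returns" with fat tails: Student-t, 3 degrees of freedom.
returns = stats.t.rvs(df=3, scale=0.01, size=2000, random_state=rng)
n = returns.size

def bic(loglik, k, n):
    """Bayesian information criterion; lower is better."""
    return k * np.log(n) - 2.0 * loglik

# Gaussian conditional distribution (ML fit: sample mean and s.d.).
mu, sd = returns.mean(), returns.std(ddof=0)
bic_norm = bic(stats.norm.logpdf(returns, mu, sd).sum(), 2, n)

# Student-t conditional distribution (df, loc, scale fitted by ML).
df_, loc_, scale_ = stats.t.fit(returns)
bic_t = bic(stats.t.logpdf(returns, df_, loc_, scale_).sum(), 3, n)
```

This mirrors the abstract's conclusion in miniature: once the conditional distribution absorbs the fat tails, a more elaborate (e.g. non-linear neural-network) specification has less left to explain.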
18.
We investigate explicit segment duration models to address the problem of fragmentation in musical audio segmentation. The resulting probabilistic models are optimised using Markov Chain Monte Carlo methods; in particular, we introduce a modification to Wolff's algorithm that makes it applicable to a segment classification model with an arbitrary duration prior. We apply this to a collection of pop songs and show experimentally that the generated segmentations suffer much less from fragmentation than those produced by segmentation algorithms based on clustering, and are closer to an expert listener's annotations, as evaluated by two different performance measures. Editor: Gerhard Widmer
19.
In the analysis of two-way contingency tables, a popular class of models for describing the structure of the association between the two categorical variables is the class of so-called "association" models. Such models assign scores to the classification variables; the scores can be either fixed and prespecified, or unknown parameters to be estimated. Under the row-column (RC) association model, both row and column scores are unknown parameters without any restriction concerning their ordinality. It is natural to impose order restrictions on the scores when the classification variables are ordinal. The Bayesian approach is adopted for the RC model, both unrestricted and restricted, and MCMC methods are employed to estimate the parameters. Furthermore, an alternative parametrization of the association models is proposed. This new parametrization simplifies computation in the MCMC procedure and leads to a natural parameter space for the order-constrained model. The proposed methodology is illustrated via a popular dataset.
20.
Given the growing maturity of Bayesian survival analysis theory, and the shortcomings of traditional methods for storage reliability evaluation, a Bayesian survival analysis method is proposed to build regression models for reliability under randomly truncated testing. These models can reflect the influence of different storage environments on ammunition lifetime. As an example, the common exponential distribution is used, and a Markov chain Monte Carlo (MCMC) method based on Gibbs sampling dynamically simulates the Markov chain of the parameters' posterior distribution. The parameters' Bayesian estimates are then calculated under the random truncation condition. Simulation results show that the proposed method is effective and intuitive.
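For the exponential-lifetime example, the Gibbs step for the failure rate has a closed form: with a Gamma(a, b) prior, the full conditional of the rate given d observed failures and total exposure time T is Gamma(a + d, b + T), so sampling reduces to repeated conjugate draws. The test length, true rate, and prior below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated storage-life test: exponential lifetimes, right-censored
# (randomly truncated) at the end of an 8-year observation window.
lam_true = 0.2
t = rng.exponential(1.0 / lam_true, size=100)
cens = np.minimum(t, 8.0)           # recorded time on test
failed = t <= 8.0                   # True where a failure was observed

d = int(failed.sum())               # number of observed failures
total_time = float(cens.sum())      # total exposure time T

# Gamma(a, b) prior on the exponential rate lambda; the full
# conditional is Gamma(a + d, b + T), so "Gibbs sampling" here is
# simply repeated draws from this conjugate posterior.
a, b = 1.0, 1.0
draws = rng.gamma(a + d, 1.0 / (b + total_time), size=5000)
lam_post = float(draws.mean())      # posterior-mean estimate of lambda
```

In the paper's regression setting, the rate would depend on storage-environment covariates and the conditionals would no longer all be conjugate, which is where full Gibbs/MCMC simulation earns its keep.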