Sorted by relevance: 201 results found (search time: 15 ms)
81.
Much of the work on statistical machine translation (SMT) from morphologically rich languages has shown that morphological tokenization and orthographic normalization help improve SMT quality because of the sparsity reduction they contribute. In this article, we study the effect of these processes on SMT when translating into a morphologically rich language, namely Arabic. We explore a space of tokenization schemes and normalization options. We also examine a set of six detokenization techniques and evaluate on detokenized and orthographically correct (enriched) output. Our results show that the best performing tokenization scheme is that of the Penn Arabic Treebank. Additionally, training on orthographically normalized (reduced) text then jointly enriching and detokenizing the output outperforms training on enriched text.
82.
Nizar Bouguila. Pattern Recognition, 2011, 44(6): 1183-1200
Recently, hybrid generative-discriminative approaches have emerged as an efficient knowledge representation and data classification engine. However, little attention has been devoted to the modeling and classification of non-Gaussian, and especially proportional, vectors. Our main goal in this paper is to discover the true structure of this kind of data by building probabilistic kernels from generative mixture models based on the Liouville family, from which we develop the Beta-Liouville distribution, which includes the well-known Dirichlet as a special case. The Beta-Liouville has a more general covariance structure than the Dirichlet, which makes it more practical and useful. Our learning technique is based on a principled, purely Bayesian approach, and the resulting models are used to generate support vector machine (SVM) probabilistic kernels based on information divergence. In particular, we show the existence of closed-form expressions for the Kullback-Leibler and Rényi divergences between two Beta-Liouville distributions, and hence between two Dirichlet distributions as a special case. Through extensive simulations and a number of experiments involving synthetic data, visual scenes, and texture image classification, we demonstrate the effectiveness of the proposed approaches.
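The Dirichlet special case of the closed-form divergence mentioned in the abstract is well known and can be sketched as follows; the function names and the hand-rolled digamma approximation are illustrative, not taken from the paper:

```python
import math

def _digamma(x):
    """Digamma via upward recurrence plus an asymptotic expansion
    (accurate to roughly 1e-12 for x > 0); stands in for scipy.special.digamma."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

def kl_dirichlet(alpha, beta):
    """Closed-form KL( Dir(alpha) || Dir(beta) ) for two Dirichlet
    distributions with positive parameter vectors of equal length."""
    a0, b0 = sum(alpha), sum(beta)
    kl = math.lgamma(a0) - sum(math.lgamma(a) for a in alpha)
    kl -= math.lgamma(b0) - sum(math.lgamma(b) for b in beta)
    kl += sum((a - b) * (_digamma(a) - _digamma(a0))
              for a, b in zip(alpha, beta))
    return kl
```

Such a divergence can then be plugged into an information-divergence kernel, e.g. exp(-KL), for an SVM.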
83.
84.
Four-neighborhood clique kernel: A general framework for Bayesian and variational techniques of noise reduction in magnetic resonance images of the brain
Michael Osadebey, Nizar Bouguila, Douglas Arnold, for the Alzheimer's Disease Neuroimaging Initiative. International Journal of Imaging Systems and Technology, 2014, 24(3): 224-238
Several algorithms have been proposed in the literature for image denoising, but none exhibits optimal performance for all ranges and types of noise and for all image acquisition modes. We describe a new general framework, built from a four-neighborhood clique system, for denoising medical images. The kernel quantifies the smoothness energy of spatially continuous anatomical structures. Scalar- and vector-valued quantification of smoothness energy configures images for Bayesian and variational denoising modes, respectively. Within the variational mode, the choice of norm adapts images for either the total variation or the Tikhonov technique. Our proposal makes three significant contributions. First, it demonstrates that the four-neighborhood clique kernel is a basic filter, in the same class as Gaussian and wavelet filters, from which state-of-the-art denoising algorithms are derived. Second, we formulate a theoretical analysis that connects and integrates Bayesian and variational techniques into a two-layer structured denoising system. Third, our proposal reveals that the first layer of the new denoising system is a hitherto unknown form of Markov random field model, referred to as the single-layer Markov random field (SLMRF). The new model denoises a specific type of medical image by minimizing energy subject to knowledge of a mathematical model that describes the relationship between the image smoothness energy and the noise level, but without reference to a classical prior model. SLMRF was applied to and evaluated on two real brain magnetic resonance imaging datasets acquired with different protocols. Comparative performance evaluation shows that our proposal is comparable to state-of-the-art algorithms. SLMRF is simple and computationally efficient because it does not incorporate a regularization parameter. Furthermore, it preserves edges, and its output is devoid of the blurring and ringing artifacts associated with Gaussian-based and wavelet-based algorithms. The denoising system is potentially applicable to speckle reduction in ultrasound images and extendable to a three-layer structure that accounts for texture features in medical images. © 2014 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 24, 224-238, 2014
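As a rough illustration of the scalar (Tikhonov-type) smoothness quantification described in the abstract, a four-neighborhood clique energy can be computed as the sum of squared intensity differences between each pixel and its horizontal and vertical neighbors. The sketch below is an assumption about the general idea, not the authors' implementation:

```python
def clique_energy(img):
    """Sum of squared differences over horizontal and vertical
    neighbor pairs (the four-neighborhood clique system).
    img is a 2-D list of pixel intensities."""
    rows, cols = len(img), len(img[0])
    e = 0.0
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:            # right neighbor
                e += (img[i][j] - img[i][j + 1]) ** 2
            if i + 1 < rows:            # bottom neighbor
                e += (img[i][j] - img[i + 1][j]) ** 2
    return e
```

A denoiser in this family would trade this smoothness energy off against fidelity to the observed noisy image.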
85.
Omar Waleed Abdulwahhab, Nizar Hadi Abbas. International Journal of Control, Automation and Systems, 2018, 16(6): 2790-2800
In this paper, a new structure of a Fractional Order State Feedback Controller (FOSFC) is designed to solve the problem of trajectory...
86.
Nuha Zamzami, Rua Alsuroji, Oboh Eromonsele, Nizar Bouguila. Computational Intelligence, 2020, 36(2): 459-485
This paper proposes an unsupervised algorithm for learning a finite mixture of scaled Dirichlet distributions. Parameter estimation is based on the maximum likelihood approach, and the minimum message length (MML) criterion is proposed for selecting the optimal number of components. This research work is motivated by the flexibility issues of the Dirichlet distribution, the widely used model for multivariate proportional data, which have prompted a number of scholars to search for generalizations of the Dirichlet. By introducing the extra parameters of the scaled Dirichlet, several useful statistical models can be obtained. Experimental results are presented using both synthetic and real datasets. Moreover, challenging real-world applications are empirically investigated to evaluate the efficiency of our proposed statistical framework.
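For reference, the scaled Dirichlet density with shape parameters alpha and scale parameters beta reduces to the ordinary Dirichlet when all scales are equal; it can be evaluated as sketched below. The helper is illustrative and independent of the paper's estimation code:

```python
from math import lgamma, log

def scaled_dirichlet_logpdf(x, alpha, beta):
    """Log-density of the scaled Dirichlet at a point x on the simplex
    (entries of x positive and summing to one)."""
    a_plus = sum(alpha)
    out = lgamma(a_plus) - sum(lgamma(a) for a in alpha)
    out += sum(a * log(b) + (a - 1.0) * log(xi)
               for xi, a, b in zip(x, alpha, beta))
    out -= a_plus * log(sum(b * xi for xi, b in zip(x, beta)))
    return out
```

Note the density is invariant to rescaling all beta entries by a common constant, which is why the extra parameters add flexibility without changing the support.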
87.
We developed a variational Bayesian learning framework for the infinite generalized Dirichlet mixture model (i.e. a weighted mixture of Dirichlet process priors based on the generalized inverted Dirichlet distribution) that has proven its capability to model complex multidimensional data. We also integrate a “feature selection” approach to highlight the features that are most informative in order to construct an appropriate model in terms of clustering accuracy. Experiments on synthetic data as well as real data generated from visual scenes and handwritten digits datasets illustrate and validate the proposed approach.
88.
On Bayesian analysis of a finite generalized Dirichlet mixture via a Metropolis-within-Gibbs sampling (Total citations: 1; self-citations: 0; citations by others: 1)
In this paper, we present a fully Bayesian approach for generalized Dirichlet mixture estimation and selection. The estimation of the parameters is based on the Monte Carlo simulation technique of Gibbs sampling mixed with a Metropolis-Hastings step. Also, we obtain a posterior distribution which is conjugate to a generalized Dirichlet likelihood. For the selection of the number of clusters, we use the integrated likelihood. The performance of our Bayesian algorithm is tested and compared with the maximum likelihood approach by the classification of several synthetic and real data sets. The generalized Dirichlet mixture is also applied to the problems of IR eye modeling and introduced as a probabilistic kernel for Support Vector Machines.
Riad I. Hammoud
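The Metropolis-within-Gibbs scheme named in the title alternates exact Gibbs draws for conditionals available in closed form with Metropolis-Hastings updates for those that are not. A toy illustration on a bivariate normal target follows; everything here (target, step size, function name) is an assumption for illustration, not the paper's sampler:

```python
import math
import random

def metropolis_within_gibbs(n_iter=20000, rho=0.5, step=1.0, seed=0):
    """Toy sampler for a bivariate standard normal with correlation rho.
    x | y is drawn exactly (Gibbs step); y | x is updated with a
    random-walk Metropolis-Hastings step, as if its conditional
    had no closed form."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    sd = math.sqrt(1.0 - rho * rho)
    samples = []
    for _ in range(n_iter):
        # Gibbs step: exact conditional x | y ~ N(rho * y, 1 - rho^2)
        x = rng.gauss(rho * y, sd)

        # MH step for y | x: symmetric random-walk proposal,
        # accepted with probability min(1, density ratio)
        def log_cond(v):  # log conditional density of y given x, up to a constant
            return -(v - rho * x) ** 2 / (2.0 * (1.0 - rho * rho))

        y_prop = y + rng.gauss(0.0, step)
        if math.log(rng.random() + 1e-300) < log_cond(y_prop) - log_cond(y):
            y = y_prop
        samples.append((x, y))
    return samples
```

In the mixture setting, the same pattern applies with the conjugate posterior blocks sampled directly and the generalized Dirichlet parameters updated by MH.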
89.
90.
High-dimensional unsupervised selection and estimation of a finite generalized Dirichlet mixture model based on minimum message length (Total citations: 1; self-citations: 0; citations by others: 1)
Bouguila N, Ziou D. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(10): 1716-1731
We consider the problem of determining the structure of high-dimensional data, without prior knowledge of the number of clusters. Data are represented by a finite mixture model based on the generalized Dirichlet distribution. The generalized Dirichlet distribution has a more general covariance structure than the Dirichlet distribution and offers high flexibility and ease of use for the approximation of both symmetric and asymmetric distributions. This makes the generalized Dirichlet distribution more practical and useful. An important problem in mixture modeling is the determination of the number of clusters. Indeed, a mixture with too many or too few components may not be appropriate to approximate the true model. Here, we consider the application of the minimum message length (MML) principle to determine the number of clusters. The MML is derived so as to choose the number of clusters in the mixture model which best describes the data. A comparison with other selection criteria is performed. The validation involves synthetic data, real data clustering, and two interesting real applications: classification of web pages, and texture database summarization for efficient retrieval.
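The MML-style selection described above can be caricatured as a two-part message length: the cost of encoding the model plus the cost of encoding the data given the model. The sketch below keeps only the dominant complexity penalty and is a hedged approximation, not the paper's derived criterion:

```python
import math

def message_length(log_likelihood, n_params, n_points):
    """Approximate two-part message length: data cost (-log L) plus an
    assertion cost growing with model complexity. The constant terms of
    the full MML expression are omitted in this sketch."""
    return -log_likelihood + 0.5 * n_params * math.log(n_points)

def select_k(fits, params_per_component, n_points):
    """fits maps a candidate number of components k to the maximized
    log-likelihood of a k-component mixture. Returns the k with the
    smallest approximate message length (k - 1 free mixing weights)."""
    return min(fits, key=lambda k: message_length(
        fits[k], k * params_per_component + (k - 1), n_points))
```

A richer likelihood always decreases the data cost, so the penalty term is what stops the criterion from always preferring more components.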