1.
We construct equivalent localized versions of a formula, adding assumptions simultaneously to various locations, where the particular location determines what is added. Inference rules that take advantage of localized formulas are presented for sequent calculi in which the left hand side of sequents can be used to accumulate the background assumptions (or contexts) of assertions. The intended application is to the automatic generation of tractable justifying lemmas for substitution operations for interactive proof development systems, especially those concerned with mathematical topics where manipulation of deeply embedded terms is desirable.
2.
Most research on Weibo sentiment analysis concentrates on sentiment words, adjectives, and negation words, and overlooks the influence that connectives have on sentiment orientation. To improve the accuracy of Weibo sentiment analysis, this paper proposes a method that incorporates connectives, taking into account the combinations of adjectives, degree adverbs, and connectives in Weibo text. Drawing on the structural characteristics of connectives and building on existing lexicons, we construct a Weibo sentiment lexicon, a negation lexicon, and a connective lexicon dedicated to Weibo sentiment analysis; to capture the influence of new Internet words on sentiment orientation, we also build a new-word lexicon. A support vector machine (SVM) classifies Weibo posts into three classes (negative, positive, and neutral), and combining the sentiment lexicons with the SVM raises the accuracy of the analysis. Experiments on the COAE 2014 data show that the method achieves good results for Weibo sentiment analysis.
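The lexicon-plus-connective scoring stage that this abstract describes can be sketched in a few lines. The toy lexicons, the degree and clause weights, and the adversative-connective split below are illustrative assumptions, not the paper's actual dictionaries or parameters, and the SVM classification stage is omitted:

```python
# Hypothetical toy lexicons; the paper builds dedicated Weibo dictionaries.
SENTIMENT = {"好": 1.0, "差": -1.0, "喜欢": 1.0, "讨厌": -1.0}
DEGREE = {"很": 1.5, "非常": 2.0}       # degree adverbs scale the next sentiment word
NEGATION = {"不", "没"}                  # negation words flip polarity
ADVERSATIVE = {"但是", "但", "却"}       # adversative connectives split the sentence

def clause_score(tokens):
    """Score one clause: degree adverbs scale, negation flips, sentiment words add."""
    score, weight = 0.0, 1.0
    for tok in tokens:
        if tok in DEGREE:
            weight *= DEGREE[tok]
        elif tok in NEGATION:
            weight *= -1.0
        elif tok in SENTIMENT:
            score += weight * SENTIMENT[tok]
            weight = 1.0
    return score

def weibo_polarity(tokens):
    """Classify a tokenized post; the clause after an adversative connective dominates."""
    for i, tok in enumerate(tokens):
        if tok in ADVERSATIVE:
            # 0.3 / 0.7 clause weighting is an illustrative assumption.
            s = 0.3 * clause_score(tokens[:i]) + 0.7 * clause_score(tokens[i + 1:])
            break
    else:
        s = clause_score(tokens)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"
```

In a full pipeline, scores of this kind would be one feature among others fed to the SVM.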
3.
At present, most fuzzy inference models the logical connectives with t-norms and t-conorms or with refinements of them. These models cannot carry the correlation information between the antecedent and consequent sets of a fuzzy rule into the inference process, so some of the information embodied in the rules is lost, and the inference results may even contradict practical experience. To address this problem, the paper first introduces the notion of an object-oriented transformation of fuzzy sets and generalizes it to establish a composite type-2 fuzzy set model. On this basis, a consequent-oriented fuzzy inference mechanism is proposed for interval type-2 fuzzy logic systems; it brings the correlation information between antecedent and consequent sets (covering both the crisp-number and the fuzzy-number case) into the inference process. Simulation results show that the method captures more of the uncertainty information contained in the fuzzy rules and provides greater freedom in the design of fuzzy logic systems.
4.
The paper contains the first complete proof of strong normalization (SN) for full second order linear logic (LL): Girard’s original proof uses a standardization theorem which is not proven. We introduce sliced pure structures (sps), a very general version of Girard’s proof-nets, and we apply Gandy’s method to sps to infer SN from weak normalization (WN). We prove a standardization theorem for sps: if WN without erasing steps holds for an sps, then it enjoys SN. A key step in our proof of standardization is a confluence theorem for sps obtained by using only a very weak form of correctness, namely acyclicity slice by slice. We conclude by showing how standardization for sps allows one to prove SN of LL, using as usual Girard’s reducibility candidates.
5.
The aim of this paper is to introduce the concepts of interval additive generators of interval t-norms and interval t-conorms, as interval representations of additive generators of t-norms and t-conorms, respectively, considering both the correctness and the optimality criteria. The formalization of interval fuzzy connectives in terms of their interval additive generators provides a more systematic methodology for the selection of interval t-norms and interval t-conorms in the various applications of fuzzy systems. We also prove that interval additive generators satisfy the main properties of additive generators discussed in the literature.
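The construction behind additive generators can be made concrete with a minimal sketch (this is standard textbook material, not the paper's interval formalism): a decreasing generator f with f(1) = 0 induces the t-norm T(x, y) = f⁽⁻¹⁾(f(x) + f(y)), and applying T endpoint-wise yields an interval t-norm, which is correct because t-norms are increasing in each argument:

```python
def t_norm_from_generator(f, f_pinv):
    """T(x, y) = f^(-1)(f(x) + f(y)) for a decreasing additive generator f, f(1) = 0."""
    return lambda x, y: f_pinv(f(x) + f(y))

# The Lukasiewicz t-norm max(0, x + y - 1) arises from the generator
# f(x) = 1 - x with pseudo-inverse f^(-1)(y) = max(0, 1 - y).
luk = t_norm_from_generator(lambda x: 1.0 - x, lambda y: max(0.0, 1.0 - y))

def interval_t_norm(T):
    """Endpoint-wise interval extension [T(a, c), T(b, d)] of a t-norm T."""
    return lambda X, Y: (T(X[0], Y[0]), T(X[1], Y[1]))

ILuk = interval_t_norm(luk)
```

For example, `ILuk((0.5, 0.7), (0.8, 0.9))` evaluates the Łukasiewicz t-norm at both endpoint pairs.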
6.
This paper uses a partially ordered set of syntactic categories to accommodate optionality and licensing in natural language syntax. A complex but well-studied data set pertaining to the syntax of quantifier scope and negative polarity licensing in Hungarian is used to illustrate the proposal. The presentation is geared towards both linguists and logicians. The paper highlights that the main ideas can be implemented in different grammar formalisms, and discusses in detail an implementation where the partial ordering on categories is given by the derivability relation of a calculus with residuated and Galois-connected unary operators.
7.
Connectives are cohesive devices that signal the relations between clauses and are critical to the construction of a coherent representation of a text's meaning. The authors investigated young readers' knowledge, processing, and comprehension of temporal, causal, and adversative connectives using offline and online tasks. In a cloze task, 10-year-olds were more accurate than 8-year-olds on temporal and adversative connectives, but both age groups differed from adult levels of performance (Experiment 1). When required to rate the “sense” of 2-clause sentences linked by connectives, 10-year-olds and adults were better at discriminating between clauses linked by appropriate and inappropriate connectives than were 8-year-olds. The 10-year-olds differed from adults only on the temporal connectives (Experiment 2). In contrast, online reading time measures indicated that 8-year-olds' processing of text is influenced by connectives as they read, in much the same way as 10-year-olds'. Both age groups read text more quickly when target 2-clause sentences were linked by an appropriate connective compared with texts in which a connective was neutral (and), inappropriate to the meaning conveyed by the 2 clauses, or not present (Experiments 3 and 4). These findings indicate that although knowledge and comprehension of connectives is still developing in young readers, connectives aid text processing in typically developing readers. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
8.
In fuzzy logic in the wider sense, i.e. in the field of fuzzy set applications, t-norms have acquired a prominent role in recent times. In many-valued logic, the Łukasiewicz systems, the Gödel systems, and also product logic are all t-norm based systems. The present paper discusses the more general problem of adequate axiomatizability for such t-norm based logical systems, surveying results of recent years. The main emphasis is on propositional logic.
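The three t-norms named in this abstract, together with their residual implications (which interpret the conditional in the corresponding logics), can be written down directly; this is standard material, sketched here for reference:

```python
# The three fundamental continuous t-norms.
def luk_t(x, y):      return max(0.0, x + y - 1.0)   # Lukasiewicz
def godel_t(x, y):    return min(x, y)               # Godel (minimum)
def product_t(x, y):  return x * y                   # product

# Residuum of each t-norm: x -> y  =  sup{ z : T(x, z) <= y }.
def luk_impl(x, y):      return min(1.0, 1.0 - x + y)
def godel_impl(x, y):    return 1.0 if x <= y else y
def product_impl(x, y):  return 1.0 if x <= y else y / x
```

Each implication is the largest z with T(x, z) ≤ y, the residuation property that makes these logics t-norm based.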
9.
Fang-Fang Wang, Engineering Optimization (《工程优选》), 2014, 46(11): 1501–1519
The fuzzy-connective-based aggregation network is similar to the human decision-making process. It is capable of aggregating and propagating degrees of satisfaction of a set of criteria in a hierarchical manner. Its interpreting ability and transparency make it especially desirable. To enhance its effectiveness and further applicability, a learning approach is successfully developed based on particle swarm optimization to determine the weights and parameters of the connectives in the network. By experimenting on eight datasets with different characteristics and conducting further statistical tests, it has been found to outperform the gradient- and genetic algorithm-based learning approaches proposed in the literature; furthermore, it is capable of generating more accurate estimates. The present approach retains the original benefits of fuzzy-connective-based aggregation networks and is widely applicable. The characteristics of the learning approaches are also discussed and summarized, providing better understanding of the similarities and differences among these three approaches.
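A generic particle swarm optimizer of the kind used for such weight learning can be sketched as follows. The inertia and acceleration constants are common textbook defaults, not the paper's settings, and the fuzzy aggregation network itself is abstracted into an arbitrary loss function over a parameter vector:

```python
import random

def pso_minimize(loss, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize loss over R^dim with a basic global-best particle swarm."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Inertia + cognitive pull (own best) + social pull (swarm best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val
```

In the paper's setting, `loss` would measure the aggregation network's estimation error as a function of its connective weights and parameters.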
10.
This paper sets forth a new theory of quantifiers and term connectives, called shadow theory, which should help simplify various semantic theories of natural language by greatly reducing the need of Montagovian proper names, type-shifting, and λ-conversion. According to shadow theory, conjunctive, disjunctive, and negative noun phrases such as John and Mary, John or Mary, and not both John and Mary, as well as determiner phrases such as every man, some woman, and the boys, are all of semantic type e and denote individual-like objects, called shadows: conjunctive, disjunctive, or negative shadows, such as John-and-Mary, John-or-Mary, and not-(John-and-Mary). There is no essential difference between quantification and denotation: quantification is nothing but denotation of shadows. Individuals and shadows constitute a Boolean structure. Formal language LSD (Language for Shadows with Distributivity), which takes compound terms to denote shadows, is investigated. Expansions and enrichments of LSD are also considered toward the end of the paper.