New generations of video compression algorithms, such as those included in the High Efficiency Video Coding (HEVC) standard currently under development, provide substantially higher compression than their predecessors. The gain is achieved by improved prediction of pixels, both within a frame and between frames. The novel coding tools that contribute to this gain produce highly uncorrelated prediction residuals, for which classical frequency decomposition methods, such as the discrete cosine transform, may not be able to supply a compact representation with few significant coefficients. To further increase the compression gains, this paper proposes transform skip modes that allow one or both of the constituent 1-D transforms (i.e., vertical and horizontal) to be skipped, which is better suited to sparse residuals. The proposed transform skip mode is tested in the HEVC codec and provides bitrate reductions of up to 10% at the same objective quality when compared with the application of 2-D block transforms only. Moreover, the proposed transform skip mode outperforms the full transform skip currently under investigation for possible adoption in the HEVC standard.
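A minimal sketch of this kind of mode decision is shown below: an unnormalized 1-D DCT-II is applied along rows, along columns, along both, or skipped entirely, and the mode producing the fewest significant coefficients is kept. This is an illustration under our own assumptions (the mode names, the significance threshold, and the lack of quantization and rate-distortion weighting are all simplifications), not the HEVC implementation.

```python
import math

def dct1d(x):
    # Unnormalized 1-D DCT-II of a sequence (enough to compare sparsity).
    n = len(x)
    return [sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n) for i in range(n))
            for k in range(n)]

def transform(block, do_horizontal, do_vertical):
    # Apply the 1-D transform along rows and/or columns, or skip it.
    if do_horizontal:
        block = [dct1d(row) for row in block]
    if do_vertical:
        block = [list(row) for row in zip(*[dct1d(col) for col in zip(*block)])]
    return block

def best_skip_mode(block, thresh=1.0):
    # Choose the skip mode yielding the fewest significant coefficients.
    modes = {"2d": (True, True), "horizontal_only": (True, False),
             "vertical_only": (False, True), "skip_both": (False, False)}
    def n_significant(mode):
        coeffs = transform(block, *modes[mode])
        return sum(abs(c) > thresh for row in coeffs for c in row)
    return min(modes, key=n_significant)

# A residual concentrated in a single row: skipping the vertical transform
# keeps the representation maximally sparse (one DC coefficient in that row).
residual = [[0, 0, 0, 0], [5, 5, 5, 5], [0, 0, 0, 0], [0, 0, 0, 0]]
```

For this residual, the full 2-D transform smears the energy of the single nonzero row across a whole column of coefficients, while the horizontal-only mode leaves exactly one significant coefficient, which is the effect that motivates skipping a constituent transform for sparse residuals.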
Much research has been devoted over the years to investigating and advancing the techniques and tools that analysts use when they model. In contrast to what academics, software providers, and their resellers promote as best practice, the aim of this research was to determine whether practitioners still take conceptual modeling seriously. In addition, what are the most popular techniques and tools used for conceptual modeling, and what are the major purposes for which conceptual modeling is used? The study found that the six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (a proxy for large, complex projects). Technique use was also found to follow a significant inverted U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence analysts' decisions to continue using modeling, viz., communication (using diagrams) to/from stakeholders, (lack of) internal knowledge of techniques, user expectations management, understanding of models' integration into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development.
High-quality domain ontologies are essential for the successful employment of semantic Web services. However, their acquisition is difficult and costly, which hampers the development of this field. In this paper we report on the first stage of research that aims to develop (semi-)automatic ontology learning tools in the context of Web services that can support domain experts in the ontology building task. The goal of this first stage was to gain a better understanding of the problem at hand and to determine which techniques might be feasible to use. To this end, we developed a framework for (semi-)automatic ontology learning from the textual sources attached to Web services. The framework exploits the fact that these sources are expressed in a specific sublanguage, making them amenable to automatic analysis. We implemented two methods in this framework, which differ in the complexity of the linguistic analysis employed. We evaluated the methods in two different domains, verifying the quality of the extracted ontologies against high-quality hand-built ontologies of those domains.
Our evaluation led to a set of valuable conclusions on which further work can be based. First, it appears that our method, while tailored for the Web services context, might be applicable across different domains. Second, we concluded that deeper linguistic analysis is likely to lead to better results. Finally, the evaluation metrics indicate that good results can be achieved using only relatively simple, off-the-shelf techniques. Indeed, the novelty of our work lies not in the natural language processing methods used, but rather in the way they are put together in a generic framework specialized for the context of Web services.
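As a concrete illustration of the kind of shallow analysis such a framework can host, the sketch below mines a small is-a hierarchy from camel-cased Web service operation names: functional verbs are dropped, the rightmost token of each name is taken as the head concept (following the English rule that the head of a compound is rightmost), and multi-word terms are filed under their head. This is a toy sketch under our own assumptions; the operation names, the verb list, and the `extract_hierarchy` helper are invented for illustration and are not the method evaluated in the paper.

```python
import re
from collections import defaultdict

def split_camel(name):
    # "GetWeatherForecast" -> ["Get", "Weather", "Forecast"]
    return re.findall(r"[A-Z][a-z]*|[a-z]+|\d+", name)

# Functional words typical of the Web services sublanguage, dropped before
# concept extraction (an assumed, illustrative list).
VERBS = {"Get", "Set", "Find", "Create", "Delete"}

def extract_hierarchy(operation_names):
    # Map each multi-word concept to its head noun (the rightmost token).
    isa = defaultdict(set)
    for name in operation_names:
        tokens = [t for t in split_camel(name) if t not in VERBS]
        if not tokens:
            continue
        concept, head = "".join(tokens), tokens[-1]
        if concept != head:
            isa[head].add(concept)  # e.g. WeatherForecast is-a Forecast
    return dict(isa)

ops = ["GetWeatherForecast", "GetCityForecast", "FindHotel", "CreateHotelBooking"]
```

Running `extract_hierarchy(ops)` groups `WeatherForecast` and `CityForecast` under `Forecast`, showing how even a purely lexical pass over a constrained sublanguage can propose candidate concepts and subsumption links for a domain expert to review.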
This paper provides new insights into the semantic characteristics of two- and three-noun compounds. An analysis is performed using two sets of semantic classification categories: a list of 8 prepositional paraphrases previously proposed by Lauer [Designing Statistical Language Learners: Experiments on Noun Compounds, Ph.D. thesis, Macquarie University, Australia] and a new set of 35 semantic relations that we introduce. We show the distribution of these semantic categories on a corpus of noun compounds and present several models for the bracketing and semantic classification of noun compounds. The results are compared against state-of-the-art models reported in the literature.
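Bracketing models for three-noun compounds are commonly built on association scores between adjacent noun pairs. The sketch below shows one such scheme, an adjacency-style comparison over toy corpus counts; the counts and the `lookup` helper are invented for illustration and are not the paper's models or data.

```python
def bracket(n1, n2, n3, count):
    # Adjacency-style bracketing: compare how strongly (n1, n2) and (n2, n3)
    # associate in a corpus; the more strongly associated pair is grouped first.
    if count(n1, n2) >= count(n2, n3):
        return ((n1, n2), n3)   # left bracketing: [[n1 n2] n3]
    return (n1, (n2, n3))       # right bracketing: [n1 [n2 n3]]

# Toy bigram counts standing in for real corpus statistics.
COUNTS = {("computer", "science"): 120, ("science", "department"): 45,
          ("home", "health"): 10, ("health", "care"): 300}
lookup = lambda a, b: COUNTS.get((a, b), 0)
```

With these counts, "computer science department" is bracketed left because "computer science" outscores "science department", while "home health care" is bracketed right; real models refine the raw counts with probabilities or smoothing.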
A new method of electrode modification and DNA immobilization for a biosensor is reported. The outer layer of a conventional carbon paste electrode (CPE) was modified with carboxyl groups by mixing stearic acid into the paste. Single-stranded deoxyribonucleic acid was attached to the modified electrode through a linker, ethylenediamine. The immobilization was performed in the presence of two activators: water-soluble 1-ethyl-3-(3′-dimethylaminopropyl)carbodiimide (EDC) and N-hydroxysulfosuccinimide (NHS). The stearic acid concentration and the other experimental parameters of the procedure were optimized. Covalent immobilization of DNA on the electrode surface offers advantages over simple adsorption, mainly because the nucleic acid chains are bound to the electrode surface by one end only, which ensures structural flexibility and increases hybridization without DNA leakage. The modified electrode, with an immobilized 21-mer oligonucleotide as a specific probe, was successfully applied in preliminary investigations for the detection of the bar gene commonly used in genetically modified food.