Similar Documents
 14 similar documents found; search time: 62 ms
1.
Research on Layout Analysis Methods for Architectural Structure Drawings   (cited in total: 1; self-citations: 0; citations by others: 1)
Computer recognition and understanding of engineering drawings is a research hotspot in applying computers to engineering, and layout analysis of a drawing is the foundation of drawing understanding. This paper presents two layout analysis and partitioning algorithms, one graphics-based and one image-based, and compares them in terms of speed and effectiveness. Experiments show that the image-based algorithm outperforms the graphics-based one when the number of line segments is large.
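The image-based route can be illustrated with a projection-profile split, a common building block for raster layout analysis (a minimal sketch, not the paper's algorithm; `horizontal_bands` and its ink threshold are hypothetical):

```python
# Sketch of an image-based layout split using horizontal projection profiles.
# The page is a binary raster (1 = ink); rows whose ink count falls below a
# threshold are treated as gaps separating horizontal bands of content.

def horizontal_bands(page, min_ink=1):
    """Return (start, end) row ranges of bands separated by blank rows."""
    bands, start = [], None
    for y, row in enumerate(page):
        inked = sum(row) >= min_ink
        if inked and start is None:
            start = y                      # band begins
        elif not inked and start is not None:
            bands.append((start, y))       # band ends at a blank row
            start = None
    if start is not None:
        bands.append((start, len(page)))   # band runs to the bottom edge
    return bands

page = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
print(horizontal_bands(page))  # → [(1, 3), (4, 5)]
```

Splitting each band vertically by the same profile trick, recursively, yields the classic X-Y cut partition.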

2.
Research on the Principles and Applications of Computer-Based Visual Inspection Technology   (cited in total: 1; self-citations: 0; citations by others: 1)
Visual inspection based on image processing and computer vision offers marked advantages, notably intelligence and speed, and with the rapid development of science and technology it has attracted wide attention. As computers become ubiquitous and related technologies advance, people will increasingly rely on computers, sensors, and similar devices to capture and process visual information. Starting from the state of computer visual inspection technology in China, this paper briefly introduces its principles and application areas.

3.
This paper proposes a layout analysis method based on mathematical morphology. The method is primarily bottom-up and incorporates morphological ideas: using the dilation operation together with a search algorithm, it analyzes complex layouts quickly and accurately. Business-card images were used as test samples for layout analysis, and the expected results were achieved.
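The dilation step can be sketched in a few lines: dilating a binary image with a square structuring element merges nearby character pixels into solid blocks that a subsequent search pass can pick up (illustrative only; the paper's actual operator sizes and search are not given):

```python
def dilate(img, k=1):
    """Binary dilation with a (2k+1) x (2k+1) square structuring element."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                # stamp the structuring element, clipped at the borders
                for dy in range(-k, k + 1):
                    for dx in range(-k, k + 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            out[ny][nx] = 1
    return out

# Two nearby "character" pixels merge into one solid block after dilation.
img = [[1, 0, 1],
       [0, 0, 0]]
print(dilate(img))  # → [[1, 1, 1], [1, 1, 1]]
```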

4.
5.
This paper designs a touchscreen implementation based on binocular vision: two ordinary cameras serve as sensors that capture the presence and motion of a touching object in real time, and computer vision techniques give an ordinary display screen virtual touch capability, supporting single click, double click, dragging, and multi-touch.
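With two rectified cameras, the distance of the touch point follows from its horizontal disparity under the pinhole model, Z = f·B/d (a minimal sketch; the focal length, baseline, and pixel values below are made up):

```python
def triangulate_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a point from its horizontal disparity in two rectified
    camera images (pinhole stereo model): Z = f * B / d."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must project with positive disparity")
    return focal_px * baseline_m / disparity

# e.g. 700 px focal length, 10 cm baseline, 35 px disparity -> 2.0 m
print(triangulate_depth(400, 365, 700, 0.10))  # → 2.0
```

Comparing the recovered depth against the known screen plane is one way such a system could decide whether the object is actually touching.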

6.
The Xixia (Tangut) script, modeled on Chinese character forms, comprises some 6,000 characters; information processing of Xixia characters benefits Tangut studies and the publication of Xixia texts. Layout analysis for Chinese, English, and other scripts already has established results, and layout analysis of ancient books is an active research topic. This paper presents a systematic study and implementation of layout analysis for Xixia documents.

7.
Curve and surface reconstruction is an important problem in reverse engineering; following the point-line-surface progression in computer graphics, curve reconstruction is a crucial step that lays the groundwork for subsequent surface reconstruction. This paper studies and implements a curve reconstruction algorithm that incorporates the proximity and continuity properties of human vision. Experimental results demonstrate the algorithm's effectiveness.
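The proximity cue can be sketched as greedy nearest-neighbor chaining of the scattered samples (a toy stand-in for the paper's algorithm, which also exploits continuity; the function name is hypothetical):

```python
import math

def order_by_proximity(points, start=0):
    """Greedily chain scattered samples into a curve ordering by always
    stepping to the nearest unvisited point (the Gestalt proximity cue)."""
    remaining = set(range(len(points)))
    path = [start]
    remaining.discard(start)
    while remaining:
        cx, cy = points[path[-1]]
        nxt = min(remaining, key=lambda i: math.dist((cx, cy), points[i]))
        path.append(nxt)
        remaining.discard(nxt)
    return path

# Points given out of order along the x-axis are chained left to right.
pts = [(0, 0), (3, 0), (1, 0), (2, 0)]
print(order_by_proximity(pts))  # → [0, 2, 3, 1]
```

A continuity term would additionally penalize sharp turns relative to the current chaining direction.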

8.
徐兆军, 业宁, 王厚立. 《计算机应用》, 2004, 24(Z2): 274-275
This paper analyzes traditional layout analysis algorithms and proposes a new one based on neural networks. The algorithm first performs edge detection on the original image to emphasize information in text regions and suppress information in picture regions, then samples with 8×8 rectangles, taking each sample's mean and variance as training features; after classification, the result is filtered using connected-component counts. Experimental results show the method is effective.
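The 8×8 sampling step can be sketched directly: tile the (edge) image and take each tile's mean and variance as a feature pair (illustrative; the neural network itself is omitted and `block_features` is a hypothetical name):

```python
def block_features(img, size=8):
    """Mean and variance of each size x size tile, as in the described
    sampling step (real input would be the edge-detected image)."""
    h, w = len(img), len(img[0])
    feats = []
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            vals = [img[y + dy][x + dx]
                    for dy in range(size) for dx in range(size)]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            feats.append((mean, var))
    return feats

# A uniform 8x8 tile has zero variance; textured (text) tiles would not.
flat = [[1] * 8 for _ in range(8)]
print(block_features(flat))  # → [(1.0, 0.0)]
```

Each (mean, variance) pair would then be fed to the classifier to label the tile as text or picture.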

9.
An engineering-drawing recognition algorithm is proposed that, based on analysis of line and run-length-segment features, picks up entire lines directly from run-length segments. The algorithm determines intersections from whole-line information, without first splitting lines into segments, and determines line types by analyzing the patterns of dashes, gaps, and so on. Finally, lines are corrected using engineering-drawing knowledge. The algorithm has been put into practice with good results.
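The run-length primitive underlying such pickup is easy to state: each binary scanline decomposes into (start, length) runs of black pixels, and stacking runs across rows yields whole lines (a minimal sketch, not the paper's full tracing algorithm):

```python
def runs(scanline):
    """Extract (start, length) run-length segments of black pixels
    from one binary scanline."""
    out, start = [], None
    for x, v in enumerate(scanline):
        if v and start is None:
            start = x                          # run begins
        elif not v and start is not None:
            out.append((start, x - start))     # run ends
            start = None
    if start is not None:
        out.append((start, len(scanline) - start))
    return out

print(runs([0, 1, 1, 1, 0, 0, 1, 0]))  # → [(1, 3), (6, 1)]
```

The alternation of run lengths and gap lengths along a traced line is what distinguishes, say, a dashed line type from a solid one.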

10.
This paper introduces a computer-vision-based security scheme for the entrances and exits of intelligent parking lots and the core technologies of its implementation. It focuses on license plate recognition and briefly describes two other key technologies: vehicle detection and vehicle image matching.

11.
Taking a fresh perspective on system reconstruction, this paper proposes a reconstruction strategy based on in-field trace analysis, systematically expounding the method's theoretical basis, design ideas, and implementation strategy. From an engineering standpoint, it describes how the trace-analysis method applies to system reconstruction, giving the overall system structure, component framework, and detailed workflow. Applied to system reconstruction, the technique offers a new solution for large-scale logic-chip security testing, discovery of hardware vulnerabilities and backdoors, and fault diagnosis and repair.

12.
Research on TMS-Based Methods for Classifying and Retrieving Information Resources   (cited in total: 1; self-citations: 0; citations by others: 1)
This paper studies topic-map-based classification, management, and retrieval of information resources in multiple formats. It focuses on using topic maps to represent the content of information resources and the relationships among them, and on implementing automated inference. Finally, it designs an example classification and retrieval system based on topic map technology that embodies the technology's characteristics: it divides information resources into a resource layer and a topic layer, making resources easy to organize and manage, improving retrieval performance, and providing a degree of intelligent retrieval and content management.
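The topic-layer/resource-layer split can be sketched as a tiny in-memory topic map: topics connected by typed associations, plus occurrences linking topics to resources (all names hypothetical; real Topic Maps follow ISO/IEC 13250):

```python
class TopicMap:
    """Toy topic map: topics, typed associations, and occurrences."""

    def __init__(self):
        self.assocs = []       # (topic_a, relation, topic_b)
        self.occurrences = {}  # topic -> list of resource locators

    def associate(self, a, rel, b):
        self.assocs.append((a, rel, b))

    def add_occurrence(self, topic, resource):
        self.occurrences.setdefault(topic, []).append(resource)

    def related(self, topic):
        """Follow associations one step - a tiny stand-in for the
        automated inference over the topic layer."""
        out = set()
        for a, _, b in self.assocs:
            if a == topic:
                out.add(b)
            if b == topic:
                out.add(a)
        return out

tm = TopicMap()
tm.associate("layout analysis", "subtopic-of", "document recognition")
tm.add_occurrence("layout analysis", "paper-001.pdf")
print(sorted(tm.related("layout analysis")))  # → ['document recognition']
```

Retrieval then navigates the topic layer first and only touches the resource layer through occurrences, which is the organizational gain the abstract describes.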

13.
In reverse engineering, constructing geometric models of parts with complex curved surfaces is a key research topic, and extracting the surface boundary from the digitized surface data is a crucial step in building such models. This paper proposes a trinocular-vision technique for extracting and constructing the boundaries of complex surfaces: feature points reflecting the object boundary are extracted from one image, image matching then yields the spatial coordinates of these feature points, and finally a B-spline curve of the object boundary is constructed from them.
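The last step, evaluating the boundary B-spline from its control points, can be sketched with de Boor's algorithm (the standard evaluation scheme, not anything specific to the paper):

```python
def de_boor(k, t, knots, ctrl, p):
    """Evaluate a degree-p B-spline at parameter t, where k is the knot
    span index with knots[k] <= t < knots[k+1] (de Boor's algorithm)."""
    d = [ctrl[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            i = j + k - p
            alpha = (t - knots[i]) / (knots[i + p - r + 1] - knots[i])
            d[j] = tuple((1 - alpha) * a + alpha * b
                         for a, b in zip(d[j - 1], d[j]))
    return d[p]

# Clamped quadratic B-spline through 4 collinear control points:
# at t = 0.5 the curve passes midway between the middle control points.
knots = [0, 0, 0, 0.5, 1, 1, 1]
ctrl = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
print(de_boor(3, 0.5, knots, ctrl, 2))  # → (1.5, 0.0)
```

Fitting (choosing the control points from the matched feature points) is the harder half; evaluation like this is only the sanity check on the result.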

14.
Subjective pattern recognition is a class of pattern recognition problems in which we know few, if any, of the strategies our brains employ when making decisions in daily life, and have only limited ideas about the standards our brains use to judge equality or inequality among objects. Face recognition is a typical example. When solving a subjective pattern recognition problem by machine, application accuracy is the standard metric for evaluating algorithms; however, we do not actually know the connection between algorithm design and application accuracy in subjective pattern recognition. Consequently, research in this area generally follows a trial-and-error process: try different parameters of an algorithm, try different algorithms, and try different algorithms with different parameters. This phenomenon is clearly visible in nearly 30 years of face recognition research: despite huge advances, no algorithm has shown the potential to be consistently better than most earlier algorithms, and a naïve algorithm has even been shown to work, in terms of accuracy, at least as well as many newly developed ones on several benchmarks. We argue that the primary objective of subjective pattern recognition research should shift from application accuracy to theoretical robustness, so that algorithms can be evaluated and compared with few or no trial-and-error steps. This paper introduces an analytical model for studying the theoretical stability of multicandidate Electoral College and Direct Popular Vote schemes (also known as regional and national voting schemes, respectively), expressed as the a posteriori probability that a winning candidate continues to be chosen after the system is subjected to noise.
The model shows that, in multicandidate elections, the Electoral College is generally more stable than Direct Popular Vote: its stability rises above that of Direct Popular Vote as the subdivided regions shrink from the original national size, up to a certain point, and then falls back toward the stability of Direct Popular Vote as the region size approaches the unit cell size; in the two extremes, where the region size equals the national size or the unit cell size, the stabilities of the two schemes coincide. The model also identifies a special situation of white-noise dominance with negligibly small concentrated noise in which Direct Popular Vote is, surprisingly, more stable than the Electoral College, although whether this situation arises in practice is questionable. We observe that high stability in theory consistently reveals itself as high accuracy in applications. Extensive experiments were conducted on two human-face benchmark databases, applying an Electoral College framework embedded with standard baseline and newly developed holistic algorithms. The marked improvement of the Electoral College framework over regular holistic algorithms verifies the stability theory of the voting schemes, and lends evidential support to adopting theoretical stability, rather than application accuracy, as the primary objective of subjective pattern recognition research.
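The stability contrast between the two schemes can be illustrated with a toy tally: concentrated noise that flips ballots in one region changes the popular-vote winner but not the regional winner (illustrative only, not the paper's analytical model; candidate names and counts are made up):

```python
def national_winner(votes):
    """Direct Popular Vote: the candidate with the most ballots overall."""
    tally = {}
    for v in votes:
        tally[v] = tally.get(v, 0) + 1
    return max(tally, key=tally.get)

def regional_winner(regions):
    """Electoral College style: each region votes winner-take-all with one
    electoral unit; the candidate winning the most regions wins."""
    return national_winner([national_winner(r) for r in regions])

def flat(regions):
    return [v for r in regions for v in r]

clean = [["A", "A", "B"], ["A", "A", "B"], ["A", "B", "B"]]
noisy = [["A", "A", "B"], ["A", "A", "B"], ["B", "B", "B"]]  # region 3 flipped

print(national_winner(flat(clean)), regional_winner(clean))  # → A A
print(national_winner(flat(noisy)), regional_winner(noisy))  # → B A
```

The flipped region was already lost under the regional scheme, so the perturbation is absorbed there, while the national tally crosses the majority threshold, which is the intuition behind the paper's stability result.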


Copyright©北京勤云科技发展有限公司  京ICP备09084417号