Similar Documents
20 similar documents found (search time: 15 ms)
1.
Stereo imaging of the optic disc is a gold-standard examination for glaucoma, and glaucoma progression can be detected from stereo images acquired over time. A Java-based software system is reported here that automatically aligns the left and right stereo retinal images and presents the aligned images side by side, along with the anaglyph computed from them. Moreover, the disparity between the two aligned images is computed and used as the depth cue to render the optic-disc images, which can be interactively edited, panned, zoomed, rotated, and animated, allowing one to examine the surface of the optic nerve head from different viewing angles. Measurements, including the length, area, and volume of regions of interest, can also be performed interactively.
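The abstract does not give implementation details; as a rough illustration of the final presentation step only, the sketch below builds a red-cyan anaglyph from an already-aligned stereo pair. The file names and the simple channel-combination rule are assumptions, not taken from the paper.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Minimal sketch: combine an aligned left/right pair into a red-cyan anaglyph.
// Assumes both images have identical dimensions after alignment.
public class Anaglyph {
    public static BufferedImage combine(BufferedImage left, BufferedImage right) {
        int w = left.getWidth(), h = left.getHeight();
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int l = left.getRGB(x, y);
                int r = right.getRGB(x, y);
                // Red channel from the left image, green/blue from the right image.
                out.setRGB(x, y, (l & 0x00FF0000) | (r & 0x0000FFFF));
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        BufferedImage left = ImageIO.read(new File("left_aligned.png"));   // hypothetical input
        BufferedImage right = ImageIO.read(new File("right_aligned.png")); // hypothetical input
        ImageIO.write(combine(left, right), "png", new File("anaglyph.png"));
    }
}
```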

2.
Besides the width of the selected image field, the resolution of individual objects is also of major importance for automatic reconstruction and other sophisticated histological work. The software solution presented here allows the user to create image mosaics by combining several photographs. Optimum control is achieved by combining two procedures with several control mechanisms. In sample tests involving 50 image pairs, all images were mosaicked without error. The program is available for public download.
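The two procedures and control mechanisms are not named in the abstract. As a hedged sketch of one way neighbouring photographs could be aligned before mosaicking, the code below brute-force searches for the translation that minimizes the mean squared intensity difference in the overlap; the search range and subsampling step are assumptions.

```java
import java.awt.image.BufferedImage;

// Minimal sketch: find the (dx, dy) offset that best aligns image b onto image a.
public class OffsetSearch {
    static int gray(BufferedImage img, int x, int y) {
        int rgb = img.getRGB(x, y);
        return ((rgb >> 16 & 0xFF) + (rgb >> 8 & 0xFF) + (rgb & 0xFF)) / 3;
    }

    /** Returns {dx, dy} within +/-range pixels that minimizes the mean squared difference. */
    public static int[] bestOffset(BufferedImage a, BufferedImage b, int range) {
        long bestCost = Long.MAX_VALUE;
        int[] best = {0, 0};
        for (int dx = -range; dx <= range; dx++) {
            for (int dy = -range; dy <= range; dy++) {
                long cost = 0, n = 0;
                // Subsample every 4th pixel of the overlap region for speed.
                for (int y = Math.max(0, -dy); y < Math.min(a.getHeight(), b.getHeight() - dy); y += 4) {
                    for (int x = Math.max(0, -dx); x < Math.min(a.getWidth(), b.getWidth() - dx); x += 4) {
                        int d = gray(a, x, y) - gray(b, x + dx, y + dy);
                        cost += (long) d * d;
                        n++;
                    }
                }
                if (n > 0 && cost / n < bestCost) {
                    bestCost = cost / n;
                    best = new int[]{dx, dy};
                }
            }
        }
        return best;
    }
}
```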

3.
4.
Program mutation is a fault-based technique for measuring the effectiveness of test cases that, although powerful, is computationally expensive. The principal expense of mutation is that many faulty versions of the program under test, called mutants, must be created and repeatedly executed. This paper describes a tool, called JavaMut, that implements 26 traditional and object-oriented mutation operators to support mutation analysis of Java programs. The current version of the tool relies on syntactic analysis and reflection to implement the mutation operators. JavaMut is interactive; it provides a graphical user interface to make mutation analysis faster and less painful. With such automated tools, mutation analysis can be performed at reasonable cost.
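For readers unfamiliar with mutation analysis, the fragment below illustrates the effect of one classic mutation operator (relational-operator replacement). It is purely illustrative; JavaMut's actual operator set and generated code are not described in the abstract.

```java
// Illustrative only: the effect of a relational-operator-replacement mutant.
public class Account {
    // Original method under test.
    boolean canWithdraw(int balance, int amount) {
        return amount <= balance;
    }

    // One possible mutant: "<=" replaced by "<".
    boolean canWithdrawMutant(int balance, int amount) {
        return amount < balance;
    }
}
// A test suite "kills" this mutant only if it exercises the boundary case
// amount == balance, where the original and the mutant disagree.
```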

5.
iw3d is a tool for 3-D interpolation of scattered data. The program has a user-friendly graphical interface. The interpolation algorithm is a modified version of the inverse-distance weighting method and supports quality-weighted interpolation. The interpolation can be carried out with a sector-search approach to reduce clustering effects. A semivariogram is calculated for the three Cartesian main directions to give an estimate for a selectable search distance. Tests of the program with a synthetic 3-D data set and a large set of measured subsurface temperatures of varying quality show good results. iw3d is written in Java and is GPL-licensed open source. It has been successfully tested with different Java versions on Microsoft Windows, Sun Solaris, and several Linux distributions. The current version is available at http://geomath.onlinehome.de/iw3d/.
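A minimal sketch of plain quality-weighted inverse-distance weighting in 3-D is shown below. It does not reproduce iw3d's specific modifications (sector search, semivariogram-based search distance); the quality factors q[i] and the power parameter p are assumptions for illustration.

```java
// Quality-weighted inverse-distance weighting at a query point (x, y, z).
public class Idw3d {
    public static double interpolate(double[][] pts, double[] values, double[] q,
                                     double x, double y, double z, double p) {
        double num = 0.0, den = 0.0;
        for (int i = 0; i < pts.length; i++) {
            double dx = pts[i][0] - x, dy = pts[i][1] - y, dz = pts[i][2] - z;
            double dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
            if (dist == 0.0) return values[i];            // exact hit at a data point
            double w = q[i] / Math.pow(dist, p);          // quality-weighted inverse distance
            num += w * values[i];
            den += w;
        }
        return num / den;
    }
}
```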

6.
Immunochemical staining techniques are commonly used to assess neuronal, astrocytic, and microglial alterations in experimental neuroscience research and, in particular, are applied to tissues from animals subjected to ischemic stroke. Immunoreactivity of brain sections can be measured from digitized immunohistology slides so that quantitative assessment can be carried out by computer-assisted analysis. Conventional methods of analyzing immunohistology are based on image classification techniques applied to a specific anatomic location at high magnification. Such localized micro-scale image analysis limits further correlative studies with other imaging modalities on whole brain sections, which are of particular interest in experimental stroke research. This report presents a semi-automated image analysis method that performs convolution-based image classification on micro-scale images, extracts numerical data representing positive immunoreactivity from the processed micro-scale images, and creates a corresponding quantitative macro-scale image. The method uses several image-processing techniques to cope with variance in intensity distribution, as well as artifacts caused by light scattering or heterogeneity of antigen expression, which are commonly encountered in immunohistology. Micro-scale images are composed by a tiling function in a mosaic manner. Image classification is accomplished by the K-means clustering method at a relatively low-magnification micro-scale level to increase computational efficiency. The quantitative macro-scale image is suitable for correlative analysis with other imaging modalities. This method was applied to different immunostaining antibodies, such as endothelial barrier antigen (EBA), lectin, and glial fibrillary acidic protein (GFAP), on histology slides from animals subjected to middle cerebral artery occlusion by the intraluminal suture method. Reliability tests show that the results obtained from immunostained images at high magnification and at relatively low magnification are virtually the same.
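As a rough illustration of the classification step only, the sketch below runs a 1-D K-means on grey levels to split pixels into clusters (e.g., background vs. positive immunoreactivity). The tiling, artifact handling, and macro-scale assembly described in the abstract are omitted, and the number of clusters and iteration count are assumptions.

```java
// Minimal 1-D K-means over pixel grey levels (k >= 2 assumed).
public class KMeansGray {
    public static int[] cluster(int[] gray, int k, int iterations) {
        double[] centers = new double[k];
        for (int c = 0; c < k; c++) centers[c] = 255.0 * c / (k - 1);  // spread initial centres
        int[] label = new int[gray.length];
        for (int it = 0; it < iterations; it++) {
            double[] sum = new double[k];
            int[] count = new int[k];
            for (int i = 0; i < gray.length; i++) {
                int best = 0;
                for (int c = 1; c < k; c++)
                    if (Math.abs(gray[i] - centers[c]) < Math.abs(gray[i] - centers[best])) best = c;
                label[i] = best;
                sum[best] += gray[i];
                count[best]++;
            }
            for (int c = 0; c < k; c++)
                if (count[c] > 0) centers[c] = sum[c] / count[c];
        }
        return label;  // cluster index per pixel
    }
}
```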

7.
In this work, we investigate a new ranking method for principal component analysis (PCA). Instead of sorting the principal components in decreasing order of their corresponding eigenvalues, we propose using the discriminant weights given by separating hyperplanes to select the most discriminant principal components. The method is not restricted to any particular probability density function of the sample groups because it can be based on either a parametric or a non-parametric separating-hyperplane approach. In addition, the number of meaningful discriminant directions is not limited to the number of groups, providing additional information for understanding group differences extracted from high-dimensional problems. To evaluate the discriminant principal components, separation tasks have been performed using face images and three different databases. Our experimental results show that the principal components selected by the separating hyperplanes allow robust reconstruction and interpretation of the data, as well as higher recognition rates using fewer linear features, in situations where the differences between the sample groups are subtle and consequently most difficult for standard and state-of-the-art PCA selection methods.
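The core ranking idea can be sketched in a few lines: given the weight vector w of a separating hyperplane trained on PCA-projected data, order the components by the magnitude of their discriminant weight rather than by eigenvalue. Computing the PCA projection and training the hyperplane (e.g., Fisher LDA or a linear SVM) are assumed to be done elsewhere; this is not the paper's implementation.

```java
import java.util.Arrays;
import java.util.Comparator;

// Rank principal components by the absolute discriminant weight of a separating hyperplane.
public class DiscriminantRanking {
    public static Integer[] rankComponents(double[] w) {
        Integer[] order = new Integer[w.length];
        for (int i = 0; i < w.length; i++) order[i] = i;
        Arrays.sort(order, Comparator.comparingDouble(i -> -Math.abs(w[i])));
        return order;  // indices of principal components, most discriminant first
    }
}
```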

8.
Three examples are given of the state of the art in pattern recognition applied to images of human and animal tissue: (1) counting cell nuclei of normal and abnormal rabbit kidney; (2) detecting boundaries of endothelial cells of the living human cornea; and (3) differentiating normal and abnormal cells in the human liver. The methodology employed is cellular logic matched filtering using the Carnegie-Mellon SUPRPIC image processing system.

9.
A new graphical tool (Multimedia University's RSIMANA—Remote-Sensing Image Analyzer) developed for image analysis is described in this paper. MATLAB and ENVI are among the commercially available tools that aid image processing and analysis, but their current versions offer limited assistance in image analysis; for example, MATLAB can extract the area of irregular objects and patterns in images, but not their length. ENVI is more focused on image processing than on image analysis functions. Other commercially available tools are prohibitively expensive. This indicates the need to develop a user-friendly graphical tool that meets research objectives in an educational environment. The text was submitted by the author in English. Hema Nair. Born 1965. Educational qualifications: B.Tech. (Electrical Engineering) from Government Engineering College affiliated to the University of Calicut, Kerala State, India, 1986; MSc (Electrical Engineering) from the National University of Singapore, 1993; MSc (Computer Science) from Clark Atlanta University, USA, 1996. Previous employment: Researcher and Project Leader at AT&T, New Jersey, USA, for about 5 years; before that, Teaching Faculty at Apple Information Technology Ltd., Bangalore, India. Current employment: Lecturer, Faculty of Engineering and Technology, Multimedia University, Malaysia. Currently in the final stages of her PhD in Computer Science at Multimedia University. Scientific interests: image analysis, pattern recognition, databases, AI, and data mining. Member of IEEE (USA) since 1997, Professional Member of ACM (USA) since 1997, Member of the Institution of Engineers (India) since 1986. Reviewer for the IASTED International Conference 2004. Her current PhD project, "Pattern Extraction and Concept Clustering in Linguistic Terms from Mined Images," is funded by an Intensive Research in Priority Area (IRPA) grant from the Government of Malaysia. Her MSc research in Computer Science in the USA was funded by a research grant from the US Army. Author of three international conference papers accepted in Portugal, Belgium, and India.

10.
Existing information flow analysis work for Java requires modifying the compiler or the runtime environment, has poor compatibility with existing systems, and lacks formal analysis and security proofs. First, a finite-state-automaton-based information flow analysis method for Java is proposed: the taint value space of all program variables is abstracted as the automaton's state space, and Java bytecode instructions are treated as automaton state-transition actions. Then, information flow security rules for the automaton transitions are given, and the non-interference security of program execution under these rules is proved. Finally, a prototype system, IF-JVM, is implemented using static insertion of taint-tracking instructions together with dynamic taint tracking and control; it requires neither the Java application's source code nor modifications to the Java compiler or runtime environment, and is independent of the client operating system. Experimental results show that the prototype correctly achieves fine-grained information flow tracking and control for Java, with a performance overhead of 53.1%.
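To make the automaton view concrete, the sketch below models an operand stack whose slots carry taint labels and treats bytecode-level operations as transitions over that abstract state. It does not reproduce IF-JVM's instrumentation or its actual security rules; the opcodes and propagation policy shown are illustrative assumptions.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Tiny taint-propagation model over an operand stack.
public class TaintStack {
    enum Taint { UNTAINTED, TAINTED }

    private final Deque<Taint> stack = new ArrayDeque<>();

    void push(Taint t) { stack.push(t); }

    // Binary operation (e.g. IADD): the result is tainted if either operand is.
    void binaryOp() {
        Taint a = stack.pop(), b = stack.pop();
        stack.push(a == Taint.TAINTED || b == Taint.TAINTED ? Taint.TAINTED : Taint.UNTAINTED);
    }

    // A sink (e.g. a network write): reject tainted data, enforcing the flow policy.
    void checkSink() {
        if (stack.peek() == Taint.TAINTED)
            throw new SecurityException("information flow policy violation");
    }
}
```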

11.
In this paper, an experience in developing an engineering application using Java is presented. Higher-order asymptotic analysis of elastic–plastic crack-tip fields includes a computationally intensive part related to obtaining the three-term asymptotic expansion. It also requires the input of a large amount of data for fitting the asymptotic field to numerically determined stresses in the near-crack-tip region. The developed Java applet implements a graphical user interface using Abstract Windowing Toolkit components. The applet can be downloaded over the Internet and run on different computer platforms inside a Java-enabled Web browser. Using simple examples, we explain how to overcome some drawbacks of Java API classes that can arise during the development of engineering applications.

12.
Designing a JEE (Java Enterprise Edition)-based enterprise application capable of achieving its performance objectives is rather hard. Predicting the performance of this type of system at the design level is difficult and sometimes not viable, because it requires precise knowledge of the expected load conditions and the underlying software infrastructure. Besides, the requirement for rapid time-to-market leads to postponing performance tuning until systems are developed, packaged, and running. In this paper we present a novel approach for automatically detecting performance problems in JEE-based applications and, in turn, suggesting courses of action to correct them. The idea is to allow developers to smoothly identify and eradicate performance anti-patterns by automatically analyzing execution traces. The approach has been implemented as a tool called JEETuningExpert and validated using three well-known JEE reference applications. Specifically, we evaluated the effectiveness of JEETuningExpert at detecting performance problems, measured the overhead imposed by online monitoring of each application, and measured the improvements achieved after following the suggested corrective actions. These results empirically show that the refactored applications are 40.08%, 76.94%, and 61.13% faster, on average.

13.
Information Fusion, 2001, 2(2): 135–149
In this paper, some image registration algorithms are investigated for the purpose of image fusion in a digital camera application. A hybrid scheme that uses both feature-based and intensity-based methods is proposed. In particular, an edge-based image registration approach is developed to guide the intensity-based registration, which uses optical-flow estimation. The idea of coarse-to-fine multi-scale iterative refinement is also used. The combination of these different methods tends to compensate for deficiencies in the individual methods. Experiments show that our approach provides accurate registration for the digital camera application. It is also demonstrated that the approach is useful for registering some multi-spectral images.

14.
Research on the Combined Application of MATLAB and Java   (Cited 5 times: 0 self-citations, 5 by others)
Engineering computation in networked environments arises frequently in science and engineering. MATLAB offers strong numerical computing capabilities, while Java is a widely used tool for developing networked applications. This paper studies ways of combining MATLAB and Java for engineering computation and proposes three methods for their joint use. Research and practical applications show that these methods combine MATLAB's computing power with Java's network development capabilities, complementing each other's strengths, broadening the range of applications, enhancing the processing capability of application programs, and effectively solving engineering computation problems in networked environments. Since each of the three methods has its own advantages, the appropriate one should be chosen in practice according to the specific problem and requirements.
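The abstract does not name the three methods. As one hedged example of calling MATLAB's numerical routines from Java (not necessarily one of the paper's methods), the sketch below uses the MATLAB Engine API for Java that ships with recent MATLAB releases; it requires engine.jar on the classpath and a local MATLAB installation.

```java
import com.mathworks.engine.MatlabEngine;

// Delegate a numerical computation to MATLAB from a Java program.
public class MatlabFromJava {
    public static void main(String[] args) throws Exception {
        MatlabEngine eng = MatlabEngine.startMatlab();
        double[] data = {1.0, 2.0, 3.0, 4.0};
        double m = eng.feval("mean", data);   // run MATLAB's mean() on the Java array
        System.out.println("mean = " + m);
        eng.close();
    }
}
```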

15.
Java has begun to open up new possibilities for accessing applications on the Web. With Java, developers can write applications as applets and insert them into Web pages. The user can then retrieve and execute them with local computing resources. We show how developers can use this feature to create a network computing platform that lets Web users share applications not specifically devised for network use, including those that are computationally intensive. With our approach, the network is not involved as long as the user executes operations on the graphical interface, which runs locally on the client. Only when users require some computational response from the server do they need to access it. Access is straightforward; authorized users can access the application from any node connected to the Internet as long as they have a Java-enabled Web browser. We have used one such network computing platform to port an existing tool and to develop a new application.
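A minimal sketch of the interaction pattern described above is shown below: the client handles the GUI locally and contacts the server only when a computationally intensive result is needed. The host name, port, and use of Java object serialization are assumptions for illustration, not the platform's actual protocol.

```java
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.net.Socket;

// Client-side helper: ship the problem data to the server only when computation is required.
public class ComputeClient {
    public static double[] solveRemotely(String host, double[] input) throws Exception {
        try (Socket socket = new Socket(host, 9000);
             ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream());
             ObjectInputStream in = new ObjectInputStream(socket.getInputStream())) {
            out.writeObject(input);              // send input to the compute server
            out.flush();
            return (double[]) in.readObject();   // receive the computed result
        }
    }
}
```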

16.
朱明凯, 高振华, 柴志雷. Journal of Computer Applications (《计算机应用》), 2010, 30(11): 2873–2875
Java technology is receiving increasing attention from image processing researchers, who hope to use it to improve development efficiency and portability. However, a software-based Java virtual machine runs slowly and has poor real-time performance, and cannot meet the performance demands of complex image-processing computations. To address this, a Java processor architecture that executes bytecode directly in hardware is proposed, and its simulator and pre-processor are implemented to form a complete test platform. The experimental results show that the platform's execution efficiency is 860 times that of the virtual-machine approach, indicating that using a Java processor for embedded image processing is a feasible option.

17.
Cosatto E., Graf H.P. IEEE Micro, 1995, 15(3): 32–38
At the heart of this image analysis system are two NET32K analog neural network chips, computing over 100 billion multiply-accumulates per second. The system simultaneously scans sixty-four 16×16-pixel templates over bi-level images, producing feature maps that mark matches between the image and the templates. When simple, generic shapes are coded into the templates, the feature maps allow quick, robust analysis of complex, noisy images.
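The matched-filtering idea can be expressed in software as follows: slide a 16×16 binary template over a bi-level image and record, for each position, how many pixels agree. The NET32K chips do this for 64 templates in parallel in analog hardware; the thresholding step and template contents are left to the caller and are assumptions here.

```java
// Software sketch of template matching over a bi-level image (0/1 values).
public class TemplateMatch {
    /** Returns a feature map of per-position match counts for one template. */
    public static int[][] featureMap(int[][] image, int[][] template) {
        int th = template.length, tw = template[0].length;
        int rows = image.length - th + 1, cols = image[0].length - tw + 1;
        int[][] map = new int[rows][cols];
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++) {
                int score = 0;
                for (int i = 0; i < th; i++)
                    for (int j = 0; j < tw; j++)
                        if (image[r + i][c + j] == template[i][j]) score++;
                map[r][c] = score;  // a match can be declared where score exceeds a threshold
            }
        return map;
    }
}
```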

18.
A large portion of high-level computer programs consists of data declarations, so an increased focus on testing the data-flow aspects of programs is warranted. In this paper, we consider testing the data flow in Java programs dynamically. Data-flow analysis has been applied to testing procedural and some object-oriented programs. We have extended the dynamic data-flow analysis technique to test Java programs and show how it can be applied to detect data-flow anomalies.
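As a rough illustration of what dynamic detection of data-flow anomalies involves, the sketch below tracks each variable through a small state machine and reports two classic anomalies: a reference before any definition (ur) and a redefinition with no intervening use (dd). The instrumentation that feeds the define()/reference() events, and the anomaly set itself, are assumptions, not the paper's implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Runtime monitor for simple define/reference data-flow anomalies.
public class DataFlowMonitor {
    enum State { UNDEFINED, DEFINED, REFERENCED }

    private final Map<String, State> states = new HashMap<>();

    public void define(String var) {
        if (states.get(var) == State.DEFINED)
            System.out.println("dd anomaly: " + var + " redefined without use");
        states.put(var, State.DEFINED);
    }

    public void reference(String var) {
        State s = states.getOrDefault(var, State.UNDEFINED);
        if (s == State.UNDEFINED)
            System.out.println("ur anomaly: " + var + " referenced before definition");
        states.put(var, State.REFERENCED);
    }
}
```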

19.
Static analyses based on denotational semantics can naturally model the functional behaviours of code in a compositional and completely context- and flow-sensitive way. But they only model the functional, i.e., input/output, behaviour of a program P, which is not enough if one needs P's internal behaviours, i.e., from the input to some internal program points. This is, however, a frequent requirement for a useful static analysis. In this paper, we overcome this limitation, for the case of mono-threaded Java bytecode, with a technique used until now only for logic programs. Namely, we define a program transformation that adds new magic blocks of code to the program P, whose functional behaviours are the internal behaviours of P. We prove the transformation correct w.r.t. an operational semantics and define an equivalent denotational semantics, devised for abstract interpretation, whose denotations for the magic blocks are hence the internal behaviours of P. We implement our transformation and instantiate it with abstract domains modelling sharing of two variables, non-cyclicity of variables, nullness of variables, class-initialisation information, and the size of the values bound to program variables. We obtain a static analyser for full mono-threaded Java bytecode that is faster and scales better than another operational pair-sharing analyser. It has the same speed as, but is more precise than, a constraint-based nullness analyser. It makes a polyhedral size analysis of Java bytecode scale up to 1300 methods in a couple of minutes, and a zone-based size analysis scale to still larger applications.

20.