Search results: 22 records.
1.
Real-time processing and compression of DNA microarray images.   Cited by: 1 (self-citations: 0; other citations: 1)
In this paper, we present a pipeline architecture specifically designed to process and compress DNA microarray images. Many pixelated image generation methods produce one row of the image at a time. The proposed pipeline fully exploits this property: it takes in one row of the produced image at each clock pulse and performs the necessary image processing steps on it, removing the current need for sluggish software routines that are a major bottleneck in microarray technology. Moreover, two different structures are proposed for compressing DNA microarray images. The proposed architecture is shown to be highly modular, scalable, and suited to a standard-cell VLSI implementation.
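The row-at-a-time streaming that the abstract describes can be illustrated in software. This is a minimal sketch, not the paper's VLSI pipeline; the per-row stages (background subtraction, thresholding) and their parameters are hypothetical stand-ins for the unspecified processing steps:

```python
def row_pipeline(rows, stages):
    """Feed image rows through processing stages, one row per 'clock'."""
    for row in rows:
        for stage in stages:
            row = stage(row)
        yield row

# Hypothetical per-row stages for illustration only.
def subtract_background(row, bg=10):
    return [max(p - bg, 0) for p in row]

def threshold(row, t=50):
    return [255 if p >= t else 0 for p in row]

image = [[12, 80, 200], [10, 10, 90]]
processed = list(row_pipeline(image, [subtract_background, threshold]))
```

Because each row is consumed and emitted independently, buffering needs scale with the image width only, which mirrors the hardware argument made in the abstracts below.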
2.
Multimedia Tools and Applications - Image retargeting is the task of making images capable of being displayed on screens with different sizes. This work should be done so that high-level visual...
3.
An adaptive LSB matching steganography based on octonary complexity measure   Cited by: 1 (self-citations: 0; other citations: 1)
Adaptive steganography methods tend to increase security against attacks. Most adaptive methods use LSB flipping (LSB-F) in the embedding part of their algorithms. LSB-F is highly vulnerable to simple steganalysis methods, but it allows adaptive algorithms to remain extractable at the receiver side. Using LSB matching (LSB-M) would increase security, but extraction of the data at the receiver becomes difficult or, on occasion, impossible, and there are numerous attacks against LSB-M. In this paper we propose an adaptive algorithm which, unlike most adaptive methods, uses LSB-M as its embedding method. The proposed method uses a complexity measure based on a local neighborhood analysis to determine secure locations in an image. Comparable adaptive methods that use LSB-M suffer from possible changes in the complexity of pixels once embedding is performed. The proposed algorithm ensures that when a pixel is categorized as complex at the transmitter and embedded, the receiver also identifies it as complex, so the data is correctly retrieved. The better performance of the algorithm is shown by higher PSNR values for embedded images relative to comparable adaptive algorithms. The security of the algorithm against numerous attacks is shown to be higher than that of LSB-M. It is also compared with a recent adaptive method and shown to be advantageous at most embedding rates.
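For readers unfamiliar with the embedding primitive the abstract builds on, plain (non-adaptive) LSB matching works as follows: when a pixel's LSB already equals the message bit, the pixel is left alone; otherwise the pixel is randomly incremented or decremented by 1, which flips the LSB either way. This sketch shows only the basic LSB-M primitive; the paper's octonary complexity measure and adaptive location selection are not reproduced here:

```python
import random

def embed_lsb_match(pixels, bits, rng=None):
    """Plain LSB matching: ±1 a pixel only when its LSB disagrees with the bit."""
    rng = rng or random.Random(1)  # fixed seed for a reproducible demo
    out = []
    for p, b in zip(pixels, bits):
        if p & 1 == b:
            out.append(p)            # LSB already carries the bit
        elif p == 0:
            out.append(1)            # stay inside [0, 255]
        elif p == 255:
            out.append(254)
        else:
            out.append(p + rng.choice((-1, 1)))  # ±1 flips the LSB either way
    return out

def extract(pixels):
    """The receiver simply reads LSBs."""
    return [p & 1 for p in pixels]
```

Unlike LSB flipping, the ±1 choice means the embedding change is not confined to the LSB plane, which is what makes LSB-M harder to detect with simple steganalysis.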
4.
An optimal algorithm based on a branch-and-bound approach is presented in this paper to determine lot sizes for a single item in material requirements planning environments with deterministic time-phased demand, constant ordering cost, and zero lead time, where all-units discounts are available from vendors and backlogging is not permitted. On the basis of proven properties of the optimal order policy, a tree-search procedure is presented to construct the sequence of optimal orders. Several useful fathoming rules are proven, which make the algorithm very efficient. To compare the performance of this algorithm with other existing optimal algorithms, an experimental design covering various environments has been developed. Experimental results show that our optimal algorithm performs much better than the other existing optimal algorithms. With computational time as the performance measure, this algorithm is the best among existing optimal algorithms for real problems with large dimensions (i.e. a large number of periods and discount levels).
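The all-units discount structure assumed in this problem means the bracket price applies to every unit in the order, not just the units above the break point. A small sketch of the resulting ordering-cost function (the price breaks and setup cost are invented for illustration; this is not the paper's algorithm):

```python
# Hypothetical price breaks: (minimum quantity, unit price).
# Under an all-units discount, ALL units are charged the bracket price.
BREAKS = [(0, 10.0), (100, 9.0), (500, 8.0)]

def unit_price(q, breaks):
    price = breaks[0][1]
    for m, p in breaks:
        if q >= m:
            price = p  # highest applicable break wins
    return price

def order_cost(q, breaks, setup=50.0):
    """Setup cost plus discounted purchase cost; zero if nothing is ordered."""
    return 0.0 if q == 0 else setup + q * unit_price(q, breaks)
```

Note that ordering 100 units (950.0) is cheaper than ordering 99 (1040.0); this non-monotonicity at the break points is what complicates the optimal-lot-size search that the branch-and-bound procedure addresses.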
5.
Earned value analysis is a crucial technique for analyzing and controlling the performance of a project, as it allows a more accurate measurement of both the performance and the progress of a project. This paper presents a new fuzzy-based earned value model with the advantage of developing and analyzing the earned value indices, and the time and cost estimates at completion, under uncertainty. As uncertainty is inherent in real-life activities, the developed model is very useful in evaluating the progress of a project where uncertainty arises. A small example illustrates how the new model can be implemented in practice.
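The crisp (non-fuzzy) earned value indices that the model extends are standard: CPI = EV/AC, SPI = EV/PV, and a common estimate at completion is EAC = BAC/CPI. This sketch shows only the crisp baseline with illustrative numbers; the paper's fuzzy extension is not reproduced here:

```python
def earned_value_indices(pv, ev, ac, bac):
    """Crisp earned value indices.

    pv: planned value, ev: earned value, ac: actual cost,
    bac: budget at completion.
    """
    cpi = ev / ac    # cost performance index (>1 means under budget)
    spi = ev / pv    # schedule performance index (>1 means ahead of schedule)
    eac = bac / cpi  # estimate at completion, assuming current CPI persists
    return cpi, spi, eac

# Illustrative figures only.
cpi, spi, eac = earned_value_indices(pv=200.0, ev=180.0, ac=240.0, bac=1000.0)
```

Here CPI = 0.75 and SPI = 0.9, so the project is over budget and behind schedule; the fuzzy model replaces such point estimates with fuzzy numbers to express the uncertainty the abstract mentions.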
6.
Multimedia Tools and Applications - Nowadays, due to widespread usage of the Internet, digital contents are distributed quickly and inexpensively throughout the world. Watermarking techniques can...
7.
Motion estimation plays a vital role in reducing temporal correlation in video codecs, but it requires high computational complexity. Different algorithms have tried to reduce this complexity; however, these reduced-complexity routines are not as regular as the full search algorithm (FSA). Regularity is very important for a hardware implementation of an algorithm, even at the price of a higher complexity burden. The goal of this paper is to develop an efficient and regular algorithm which mimics FSA by searching a small area exhaustively. Our proposed algorithm is designed based on two observations. The first is that the motion vector of a block falls within a specific rectangular area designated by the prediction vectors. The second is that, in most cases, this rectangular area is smaller than one fourth of the FSA's search area. Therefore, the search area of the proposed method is found adaptively for each block of a frame. To find the search area, the temporal and spatial correlations among the motion vectors of blocks are exploited. Based on these correlations, a rectangular search area is determined and the best matching block in this area is selected. The proposed algorithm is similar to FSA in terms of regularity but requires less computational complexity due to its smaller search area. It is also as simple as FSA to implement and is comparable with many of the existing fast search algorithms. Simulation results confirm the claimed performance and efficiency of the algorithm.
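The full search that the proposed method restricts to a smaller adaptive window can be sketched as an exhaustive SAD (sum of absolute differences) scan over a rectangular area. This is the generic FSA primitive, not the paper's prediction-vector-based window selection, and the window size parameter is illustrative:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(a - b)
               for ra, rb in zip(block_a, block_b)
               for a, b in zip(ra, rb))

def full_search(cur_block, ref, x, y, win):
    """Exhaustively search a (2*win+1)^2 area of the reference frame
    centered at (x, y); returns the best motion vector and its SAD cost."""
    n = len(cur_block)
    best_cost, best_mv = None, (0, 0)
    for dy in range(-win, win + 1):
        for dx in range(-win, win + 1):
            ry, rx = y + dy, x + dx
            if ry < 0 or rx < 0 or ry + n > len(ref) or rx + n > len(ref[0]):
                continue  # candidate block falls outside the frame
            cand = [row[rx:rx + n] for row in ref[ry:ry + n]]
            cost = sad(cur_block, cand)
            if best_cost is None or cost < best_cost:
                best_cost, best_mv = cost, (dx, dy)
    return best_mv, best_cost
```

Shrinking `win` per block, as the paper proposes via prediction vectors, keeps this doubly-nested scan (and hence the hardware datapath) unchanged while cutting the number of candidate positions, which is exactly the regularity argument in the abstract.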
8.
In this paper we present a pipeline architecture specifically designed for processing DNA microarray images. Many pixelated image generation methods produce one row of the image at a time. This property is fully exploited by a pipeline which takes in one row of the produced image at each clock pulse and performs the necessary image processing steps on it, removing the current need for sluggish software routines that are a major bottleneck in microarray technology. The size of the proposed structure is a function of the width of the image, not its length. The proposed architecture is shown to be highly modular, scalable, and suited to a standard-cell VLSI implementation.
9.
The contourlet transform (CT) is a powerful image processing tool. Even though many promising applications have been proposed, no hardware implementation of CT has been reported. This paper analyzes CT to derive a structure that is hardware-implementable. CT consists of two main parts: the Laplacian pyramid (LP) and the directional filter bank (DFB). In both parts, novel algorithmic changes are proposed to realize an efficient hardware architecture. The proposed LP structure reduces the number of arithmetic operations by 50% and operates twice as fast as existing implementations. To the best of our knowledge, the DFB has not been comprehensively studied for hardware implementation so far; thus, we first analyze the DFB to derive its hardware-oriented structure and then propose a DFB architecture. Finally, analysis and simulation results demonstrate that the proposed CT architecture achieves real-time performance (40 frames/s) operating at 76 MHz, which is verified through an FPGA implementation. Moreover, since all stages use fixed-point arithmetic operations, a comprehensive quantization analysis is performed to keep the MSE and PSNR values within an acceptable range.
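The Laplacian pyramid stage of the contourlet transform decomposes a signal into a coarse approximation plus a detail residual, and reconstruction adds the residual back to the upsampled prediction. A minimal one-level 1D sketch, using a trivial 2-tap averaging filter rather than the filters of the paper's architecture:

```python
def lp_analysis(signal):
    """One Laplacian pyramid level: coarse approximation + detail residual.
    Assumes len(signal) is even; uses a simple 2-tap averaging lowpass."""
    # Lowpass and downsample by 2.
    coarse = [(signal[2 * i] + signal[2 * i + 1]) / 2
              for i in range(len(signal) // 2)]
    # Upsample by sample-and-hold, then store the prediction residual.
    predicted = [c for c in coarse for _ in (0, 1)]
    detail = [s - p for s, p in zip(signal, predicted)]
    return coarse, detail

def lp_synthesis(coarse, detail):
    """Perfect reconstruction: upsampled prediction plus stored residual."""
    predicted = [c for c in coarse for _ in (0, 1)]
    return [p + d for p, d in zip(predicted, detail)]
```

Because the detail band stores the exact prediction error, reconstruction is perfect regardless of the lowpass filter used, which is the property that makes the LP attractive as the first stage of the contourlet transform.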
10.
One of the main issues in hybrid wireless networks is vertical handoff. Dropping probability is one of the important parameters that must be considered in planning wireless communication systems. However, there has not been much effort toward reducing the dropping rate in loosely coupled hybrid wireless networks. In loosely coupled WLAN/cellular systems, the WLAN's system administrator is different from the cellular system's. Therefore, in these situations, reducing the dropping probability with classical methods such as reserved guard channels is difficult. A handoff from a WLAN to a cellular system occurs when a multi-mode device moves out of the WLAN's coverage area; this is an upward vertical handoff in a hybrid network. In this paper, we propose to employ ad hoc relaying during the upward vertical handoff in a hybrid WLAN/cellular system. The two-hop and multi-hop relaying approaches we propose improve the dropping probability regardless of the number of reserved channels. Simulation results support the effectiveness of the proposed methods. Practical routing protocols are also proposed to implement the suggested relaying methods.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号