Similar Documents
20 similar documents found (search time: 15 ms)
1.
2.

With the rapid growth of information technology and the Internet, the digital image has become a vital medium for communication. Hence, there is an increasing demand to protect digital images as they are transmitted over insecure media such as the Internet. This paper proposes a Genetic Algorithm influenced image encryption scheme. Both crossover and mutation operations are performed to enhance the statistical measures of the grayscale cipher images. Intra-pixel bit manipulation improves the diffusion property during mutation, while crossover generates offspring by combining image intensities and keys. The chaotic Logistic and Tent maps enlarge the keyspace through their role in generating the initial seeds. Notably, the initial seeds are derived from features of the input image: the minimum and maximum numbers of occurrences of an intensity, and the minimum and maximum intensity levels. Consequently, every image is accompanied by a unique key sequence, used as a session key, which must be shared with the intended receiver for distortion-free recovery of the original image. In addition, multi-point genetic crossover is employed for diffusion, producing a pair of offspring. A fitness test, which decides the number of generations, is executed by performing correlation and entropy analyses at thresholds of 0.01 and 7.99, respectively. The experimental results confirm that the proposed scheme not only yields simple and optimised encryption but also defends against various distinctive attacks. The proposed image security solution can be incorporated into banking, medical insurance, e-healthcare, and e-governance sectors for document confidentiality and integrity-checking mechanisms.
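The seed-derivation and key-stream idea can be illustrated with a minimal Python sketch. The exact mapping from image statistics to map seeds, the map parameters, and the final XOR diffusion step are illustrative assumptions; the paper's actual crossover and mutation operators are not reproduced here.

```python
import numpy as np

def derive_seeds(img):
    """Derive chaotic-map seeds from image statistics, as the scheme
    describes (min/max intensity and min/max occurrence counts).
    The normalisation below is an assumption for illustration."""
    hist = np.bincount(img.ravel(), minlength=256)
    occ = hist[hist > 0]                          # occurrence counts of used intensities
    lo, hi = int(img.min()), int(img.max())
    x0 = ((lo + occ.min()) % 255 + 1) / 257.0     # seed for the Logistic map
    y0 = ((hi + occ.max()) % 255 + 1) / 257.0     # seed for the Tent map
    return x0, y0

def logistic(x, n, r=3.99):
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def tent(x, n, mu=1.99):
    out = np.empty(n)
    for i in range(n):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        out[i] = x
    return out

# Per-image key stream for one 256x256 grayscale image
img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
x0, y0 = derive_seeds(img)
key = (np.floor(logistic(x0, img.size) * 256).astype(np.uint8)
       ^ np.floor(tent(y0, img.size) * 256).astype(np.uint8))
cipher = (img.ravel() ^ key).reshape(img.shape)   # simple XOR diffusion (illustrative)
```

Because the seeds depend on the input image's own statistics, each image yields a distinct session key, matching the scheme's requirement that the key sequence accompany the image to the receiver.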


3.
Context: In large software organizations with a product-line development approach, system test planning and scope selection is a complex task. Because testing is repeated across different test levels, over time (regression testing), and across different variants, the risk of redundant testing is large, as is the risk of overlooking important tests hidden in the huge space of possible tests.
Aims: This study assesses the amount and type of overlaid manual testing across feature, integration, and system test in such a context, explores the causes of potential redundancy, and elaborates on how to provide decision support, in the form of visualization, for avoiding redundancy.
Method: An in-depth case study was launched, including both qualitative and quantitative observations.
Results: A high degree of test overlay was identified, originating from distributed test responsibilities, poor documentation and structure of test cases, parallel work, and insufficient delta analysis. The amount of test overlay depends on the level of abstraction studied.
Conclusions: Avoiding redundancy requires tool support, e.g. visualization of test design coverage, test execution progress, and priorities of coverage items, as well as visualized priorities of variants, to support test case selection.

4.
The detection of ground fog from satellite data is of interest in operational nowcasting applications, as well as in studies of the climate system. Discriminating between fog at the ground and other low-stratus situations from satellite data requires information on cloud vertical geometry, to establish whether the cloud touches the ground. This article introduces a technique that discriminates between low stratus and (ground) fog on the basis of geostationary satellite imagery. The cloud-base height is derived using a subadiabatic model of cloud microphysics: the cloud base is varied until the model liquid-water path matches that retrieved from satellite data. The performance of this technique is shown to be good in a comparison with METeorological Aerodrome Report (METAR) data comprising 1030 satellite scenes. With a hit rate of 81% and a threat score of 0.62, the skill is satisfactory.
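The matching step lends itself to a short sketch: assuming liquid water content grows roughly linearly above the cloud base (damped by a subadiabatic factor), the base height can be found by bisection until the modelled liquid-water path equals the satellite retrieval. The parameter values and the fog threshold below are illustrative assumptions, not the article's calibration.

```python
def model_lwp(cloud_base, cloud_top, gamma_ad=2.0e-3, f=0.7):
    """LWP (g m^-2) of a subadiabatic cloud: liquid water content grows
    roughly linearly with height above cloud base, damped by factor f.
    gamma_ad (g m^-3 per m) and f are illustrative values."""
    depth = max(cloud_top - cloud_base, 0.0)
    return 0.5 * f * gamma_ad * depth**2

def retrieve_cloud_base(lwp_sat, cloud_top, tol=1.0):
    """Bisect the cloud-base height (m) until the modelled LWP matches
    the satellite-retrieved LWP, as in the article's matching step."""
    lo, hi = 0.0, cloud_top
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if model_lwp(mid, cloud_top) > lwp_sat:
            lo = mid          # modelled cloud too thick -> raise the base
        else:
            hi = mid          # modelled cloud too thin  -> lower the base
    return 0.5 * (lo + hi)

base = retrieve_cloud_base(lwp_sat=60.0, cloud_top=800.0)
is_fog = base <= 50.0   # flag fog when the base is near the ground (assumed threshold)
```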

5.
6.
Multimedia Tools and Applications - Recently, there is an increasing demand for efficient and secure transreception of medical images in telemedicine applications. Though a fixed spectrum is...

7.
8.
This paper presents a new approach to image enhancement based on a maximum likelihood identification method. It is assumed that the images are corrupted by a white Gaussian noise field. A two-dimensional extension of the classical ARMA model is developed as a mathematical model for the image fields. Since maximum likelihood identification leads to a parametric optimization problem, Davidon's algorithm is applied for numerical solution. The advantage of the present method is that the enhanced images, based on the predicted estimates, are obtained directly from the noise-corrupted images, so the autocovariance function of the original image is not required. To improve the quality of the enhanced images, a filtering algorithm is also derived. Digital simulation studies are carried out on various artificial images to show the feasibility of this approach.
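A minimal sketch of the prediction idea, with least-squares fitting standing in for the paper's maximum likelihood identification via Davidon's algorithm: a causal 2-D AR model predicts each pixel from its already-scanned neighbours, and the predicted estimate serves as the enhanced value.

```python
import numpy as np

def fit_causal_ar(img):
    """Fit a 3-coefficient causal 2-D AR model
        x[i,j] ~ a*x[i,j-1] + b*x[i-1,j] + c*x[i-1,j-1]
    by least squares -- a simple stand-in for the paper's ML identification."""
    x = img.astype(float)
    y = x[1:, 1:].ravel()
    X = np.stack([x[1:, :-1].ravel(),    # left neighbour
                  x[:-1, 1:].ravel(),    # upper neighbour
                  x[:-1, :-1].ravel()],  # diagonal neighbour
                 axis=1)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict(img, coef):
    """One-step predicted estimate of each pixel from its causal
    neighbours; using it in place of the noisy pixel smooths the noise."""
    x = img.astype(float)
    pred = x.copy()
    pred[1:, 1:] = (coef[0] * x[1:, :-1] +
                    coef[1] * x[:-1, 1:] +
                    coef[2] * x[:-1, :-1])
    return pred

noisy = np.clip(np.random.randn(64, 64) * 10 + 128, 0, 255)
enhanced = predict(noisy, fit_causal_ar(noisy))
```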

9.
Landsat TM and ETM+ imagery was used to distinguish areas of high vs. low cover of Amur honeysuckle (Lonicera maackii), taking advantage of the late leaf retention of this invasive shrub. L. maackii cover was measured in eight stands and compared to 15 Landsat 5 TM and Landsat 7 ETM+ images from spring and autumn dates from 1999 to 2006. Jeffries–Matusita (JM) distance calculations showed potential separability between high vs. low/zero cover classes of L. maackii on some late fall images. The Soil Adjusted Atmospheric Resistant Vegetation Index (SARVI2) revealed higher levels of green biomass in high L. maackii cover plots than low/zero cover plots for November images only. These findings justify further investigation of the effectiveness of late fall images to map the historical spread of L. maackii and other forest understory invasives with similar phenology.
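The Jeffries–Matusita distance used for the separability test has a standard closed form for Gaussian classes, sketched below; the two-band statistics in the example are made up for illustration.

```python
import numpy as np

def jm_distance(m1, c1, m2, c2):
    """Jeffries-Matusita distance between two Gaussian classes
    (e.g. high vs. low/zero L. maackii cover) from band means and
    covariances; JM ranges from 0 to 2, with 2 = full separability."""
    m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
    c = 0.5 * (c1 + c2)
    dm = m1 - m2
    # Bhattacharyya distance, then JM = 2(1 - exp(-B))
    b = (0.125 * dm @ np.linalg.solve(c, dm)
         + 0.5 * np.log(np.linalg.det(c)
                        / np.sqrt(np.linalg.det(c1) * np.linalg.det(c2))))
    return 2.0 * (1.0 - np.exp(-b))

# Toy two-band example (reflectance means and covariances are invented)
high = ([0.32, 0.28], np.array([[0.002, 0.0005], [0.0005, 0.003]]))
low  = ([0.20, 0.15], np.array([[0.003, 0.0008], [0.0008, 0.004]]))
print(jm_distance(high[0], high[1], low[0], low[1]))
```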

10.
A distributed implementation of the Spatially-Explicit Individual-Based Simulation Model of Florida Panther and White-Tailed Deer in the Everglades and Big Cypress Landscapes (SIMPDEL) is presented. SIMPDEL models the impact of different water management strategies in the South Florida region on the white-tailed deer and Florida panther populations, through the interaction of four interrelated components (vegetation, hydrology, white-tailed deer, and Florida panther) over time spans of up to several decades. The serial and distributed models produced very similar bioenergetic and survival statistics. A performance evaluation of the two models revealed moderate speed improvements for the distributed model (referred to as DSIMPDEL): the 4-processor configuration attained a speedup of 3.83 with small deer populations on an ATM-based network of SUN Ultra 2 workstations, relative to the serial model executing on a single SUN Ultra 2 workstation.

11.
Multimedia Tools and Applications - GAN-based image colorization techniques are capable of producing highly realistic color in real-time. Subjective assessment of these approaches has demonstrated...

12.
This study proposes a new method designed to assess quantitative changes in mountain glaciers. The Glacier Mapper–Change Detector method has a four-step algorithm: (i) image ratioing on two time-comparable satellite images; (ii) digitization of glacier outlines from the ratio images using a threshold value; (iii) change detection using the positions of the glacier outlines at the two moments in time; and (iv) derivation of a future-evolution model from relief-related parameters of the glacier changes. The method was calibrated and verified for the Elbrus (Greater Caucasus) glaciers over the period 1985–2007 using Landsat TM images. The results indicated altitude as the most important relief control on glacial retreat, and the modelled most probable future losses of glacial area reflect the 1985–2007 retreat patterns.
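Steps (i)-(iii) can be sketched compactly. The TM3/TM5 band pair and the threshold value below are common choices in the glacier-mapping literature, assumed here for illustration rather than taken from the paper.

```python
import numpy as np

def glacier_mask(tm3, tm5, threshold=2.0):
    """Steps (i)-(ii): a TM3/TM5 band ratio highlights snow and ice,
    and thresholding the ratio image yields a glacier mask.  Band pair
    and threshold are assumed, not the paper's calibrated values."""
    ratio = tm3.astype(float) / np.maximum(tm5.astype(float), 1e-6)
    return ratio > threshold

def change_detection(mask_t1, mask_t2):
    """Step (iii): compare glacier outlines at two dates."""
    retreat = mask_t1 & ~mask_t2   # glacier at t1, gone by t2
    advance = ~mask_t1 & mask_t2
    return retreat, advance
```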

13.
This paper considers the eigenvalue distribution of a linear time-invariant (LTI) system with time delays and its application to controller design for a delay plant via eigenvalue assignment. First, a new result on the root distribution of a class of quasi-polynomials is developed, based on an extension of the Hermite–Biehler theorem. This result is then applied to proportional–integral (PI) controller design for a first-order plant with time delay through pole placement: the complete region of PI gains can be obtained such that the rightmost eigenvalues in the infinite eigenspectrum of the closed-loop system are assigned to desired positions in the complex plane. On this basis, the paper also extends the PI design to proportional–integral–derivative (PID) control. It is worth pointing out that this work aims to improve the performance of the closed-loop system while guaranteeing its stability.
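For intuition, the rightmost closed-loop eigenvalues of the PI-controlled delay plant can be approximated numerically. The sketch below replaces the delay term with a first-order Padé approximation, which is an illustrative shortcut, not the paper's Hermite–Biehler-based method; the plant and gain values are arbitrary.

```python
import numpy as np

def rightmost_roots(K, T, L, Kp, Ki):
    """Approximate rightmost closed-loop eigenvalues for the plant
    G(s) = K e^{-Ls} / (Ts + 1) under PI control C(s) = Kp + Ki/s.
    The true quasi-polynomial  s(Ts+1) + K(Kp*s + Ki) e^{-Ls}  has an
    infinite spectrum; a (1,1) Pade approximation of e^{-Ls} turns it
    into an ordinary polynomial whose roots np.roots can find."""
    # s(Ts+1)(1 + Ls/2) + K(Kp*s + Ki)(1 - Ls/2) = 0
    p_delayfree = np.polymul([T, 1, 0], [L / 2, 1])
    p_control   = np.polymul([K * Kp, K * Ki], [-L / 2, 1])
    poly = np.polyadd(p_delayfree, p_control)
    return sorted(np.roots(poly), key=lambda r: -r.real)

# Arbitrary example values, rightmost eigenvalue first
print(rightmost_roots(K=1.0, T=1.0, L=0.5, Kp=0.8, Ki=0.4))
```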

14.
The investigation of innovative Human-Computer Interfaces (HCI) provides a challenge for future multimedia research and development. Brain-Computer Interfaces (BCI) enable communication and control that bypass the classical neuromuscular channels. In general, BCIs offer a means of communication for people with severe neuromuscular disorders, such as Amyotrophic Lateral Sclerosis (ALS) or spinal cord injury. Beyond medical applications, a BCI in conjunction with exciting multimedia applications, e.g. a dexterity game, could define a new level of control possibilities for healthy users as well, decoding information directly from the user's brain as reflected in electroencephalographic (EEG) signals recorded non-invasively from the scalp. This contribution introduces the Berlin Brain-Computer Interface (BBCI) and presents setups in which the user is provided with intuitive control strategies in plausible gaming applications that use biofeedback. Although still at an early stage, the BBCI thus adds a new dimension to multimedia research by offering the user an additional, independent communication channel based on brain activity alone. First successful experiments have already yielded inspiring proofs of concept. A diversity of multimedia application models, such as computer games, and their specific intuitive control strategies, as well as various Virtual Reality (VR) scenarios, are now open for BCI research aiming at a further speed-up of user adaptation and an increase in learning success and transfer bit rates.

15.
Varying illumination geometry affects spectral measurements of target reflectance, and the intensity of solar radiation is the most important factor for in-field spectral measurements. This paper reports the effect of bidirectional electromagnetic radiation on an image-based reflectance sensor designed for plant nitrogen assessment. The results show that reflectance is a nonlinear function of the solar zenith angle. Ambient illumination was analysed and compensated for using fixed nadir-view positions of a solar radiometer and a 3-charge-coupled-device (CCD) multispectral imaging sensor (MSIS). A compensation algorithm was developed to correct for the nonlinearity of both sensors. The compensated reflectance remained linearly consistent with varying solar zenith angle throughout the daytime, within a maximum standard deviation of 0.62% for all three (green, red and near-infrared) spectral channels, when tested with a 20% reflectance panel. Consistent reflectance was recovered under both sunny and cloudy conditions.
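A minimal sketch of a panel-based reflectance computation with an irradiance correction of the kind described: the function name, the linear-response assumption, and the example values are illustrative, not the paper's algorithm.

```python
def compensated_reflectance(dn_target, dn_panel, panel_reflectance=0.20,
                            irr_target=1.0, irr_panel=1.0):
    """Reflectance factor from camera digital numbers, normalised by a
    reference panel and corrected for the change in ambient solar
    irradiance (as measured by the nadir-view radiometer) between the
    panel and target acquisitions.  A linear sensor response is assumed
    here; the paper additionally corrects each sensor's nonlinearity."""
    return (dn_target / dn_panel) * (irr_panel / irr_target) * panel_reflectance

# Per-channel use, e.g. the green channel with a 5% drop in irradiance
r_green = compensated_reflectance(112.0, 150.0, irr_target=0.95, irr_panel=1.0)
```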

16.
We are currently on the verge of a revolution in digital photography. Developments in computational imaging and the adoption of artificial intelligence have spawned new editing techniques that give impressive results in astonishingly short time-frames. The advent of multi-sensor and multi-lens cameras will further challenge many existing integrity-verification techniques. As a result, it will be necessary to re-evaluate our notion of image authenticity and look for new techniques that can work efficiently in this new reality. The goal of this paper is to thoroughly review existing techniques for the protection and verification of digital image integrity. In contrast to other recent surveys, the discussion covers the most important developments both in active protection and in passive forensic analysis techniques. Existing approaches are analyzed with respect to their capabilities, fundamental limitations, and prospective attack vectors. Wherever possible, the discussion is supplemented with real operation examples and a list of available implementations. Finally, the paper reviews resources available to the research community, including public datasets and commercial or open-source software, and concludes by discussing relevant developments in computational imaging and highlighting future challenges and open research problems.

17.
Continuous improvements in semiconductor fabrication density are supporting new classes of System-on-a-Chip (SoC) architectures that combine extensive processing logic with high-density memory. Such architectures, generally called Processor-in-Memory (PIM) or Intelligent Memory (I-RAM), can support high-performance computing by reducing the performance gap between the processor and the memory. A PIM architecture combines various processors in a single system, and these processors differ in their computation and memory-access capabilities. A strategy is therefore needed to identify each processor's capabilities and dispatch the most appropriate jobs to it in order to exploit it fully. Accordingly, this study presents an automatic source-to-source parallelizing system, called statement-analysis-grouping-evaluation (SAGE), to exploit the advantages of PIM architectures. Unlike conventional iteration-based parallelizing systems, SAGE adopts statement-based analysis. This study addresses the configuration of a PIM architecture with one host processor (the main processor in state-of-the-art computer systems) and one memory processor (the computing logic integrated with the memory). The strategy of the SAGE system, in which the original program is decomposed into blocks and a feasible execution schedule is produced for the host and memory processors, is investigated as well, and experimental results for real benchmarks are discussed.
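A toy sketch of the dispatch decision SAGE must make: profile each statement block by how much of its work is memory access, and send memory-bound blocks to the memory processor. The block representation and the 0.5 threshold are hypothetical, not SAGE's actual analysis.

```python
def dispatch(blocks, threshold=0.5):
    """Assign each statement block to the host or the memory processor
    based on the fraction of its work that is memory access.  Blocks are
    (name, compute_ops, memory_ops) triples -- a hypothetical profile
    format used only for illustration."""
    schedule = {"host": [], "memory": []}
    for name, compute_ops, memory_ops in blocks:
        mem_ratio = memory_ops / (compute_ops + memory_ops)
        target = "memory" if mem_ratio > threshold else "host"
        schedule[target].append(name)
    return schedule

blocks = [("B1", 900, 100),   # compute-bound loop nest
          ("B2", 100, 700),   # streaming copy, memory-bound
          ("B3", 400, 450)]
print(dispatch(blocks))       # {'host': ['B1'], 'memory': ['B2', 'B3']}
```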

18.
The paper presents a method for the FPGA implementation of Self-Organizing Map (SOM) artificial neural networks with an on-chip learning algorithm. The method builds up a specific neural network from generic blocks designed in the MathWorks Simulink environment. The main characteristics of this original solution are on-chip implementation of the learning algorithm, high reconfiguration capability, and operation under real-time constraints. An extended analysis is carried out on the hardware resources used to implement the whole SOM network as well as each individual component block.
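As a software reference for the learning rule such generic blocks implement, a plain SOM training loop is sketched below; the grid size and the learning-rate and neighbourhood schedules are illustrative choices, not the paper's hardware parameters.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0):
    """Standard SOM learning rule: find the best-matching unit (BMU),
    then pull neighbouring weights toward the input with a Gaussian
    neighbourhood that shrinks over time."""
    rows, cols = grid
    dim = data.shape[1]
    w = np.random.rand(rows, cols, dim)
    coords = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols),
                                   indexing="ij")).astype(float)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighbourhood
        for x in data:
            d = np.linalg.norm(w - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), (rows, cols))
            h = np.exp(-np.linalg.norm(coords - np.array(bmu), axis=2)**2
                       / (2 * sigma**2))
            w += lr * h[..., None] * (x - w)
    return w

weights = train_som(np.random.rand(200, 3))
```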

19.
High chip-level integration enables the implementation of large-scale parallel processing architectures with 64 and more processing nodes on a single chip or on an FPGA device. These parallel systems require a cost-effective yet high-performance interconnection scheme to provide the needed communications between processors. The massively parallel Network on Chip (mpNoC) was proposed to address the demand for parallel irregular communications in the massively parallel processing System on Chip (mppSoC). Targeting FPGA-based design, an efficient low-level RTL implementation of the mpNoC is proposed, taking design constraints into account. The proposed network is designed as an FPGA-based Intellectual Property (IP) block that can be configured in different communication modes: it can communicate between processors and also perform parallel I/O data transfer, which is clearly a key issue in an SIMD system. The mpNoC RTL implementation shows good performance in terms of area, throughput, and power consumption, which are important metrics for an on-chip implementation, and its flexibility makes it suitable for FPGA-based parallel systems. This paper introduces the basic mppSoC architecture, focusing on the flexible IP-based design of the mpNoC, its implementation on FPGA, and its integration into mppSoC. Implementation results on a Stratix II FPGA device are given for three data-parallel applications run on mppSoC. The good performance obtained justifies the effectiveness of the proposed parallel network and shows that the mpNoC is a lightweight parallel network suitable for both small and large FPGA-based parallel systems.

20.

This paper analyzes the relationship between Information and Communications Technology (ICT) market drivers and the level of ICT usage in small states. The main arguments in the literature refer to institutional, socioeconomic, and demographic factors to explain ICT usage. Through a multi-method approach, this paper uses a dataset of small states to explore the relationship between ICT market variables and Internet usage. The findings highlight the importance of underlying digital platforms, and a typology of small states is provided for ICT market influencers and usage, showing a varying degree of heterogeneity. Because the sample size available for studying small states limits the use of traditional analytical techniques, this paper utilizes Qualitative Comparative Analysis and demonstrates that high usage levels of digital applications for businesses, consumers, and social media are sufficient, but not necessary, conditions for ICT advancement to occur. Implications for theory development and policy making are discussed in terms of market incentives.
