Multilayered wire-grid polarizers (WGPs) find application as low-reflection polarizers in projection-type liquid crystal display devices. A multilayered WGP is formed by adding thin layers on top of the metal ridges of an ordinary WGP. The ordinary WGP consists of a periodic array of parallel metal ridges, where the period of the array and the width of any individual ridge are typically less than the wavelength of the incident light. Such WGPs are often used as efficient polarizers. However, in certain applications it is important to reduce the reflection from the WGP while preserving the polarization efficiency. One way to achieve this goal is to add thin layers on top of the metal ridges of the ordinary WGP. The reduction in reflection from the multilayered WGP depends on the number and material of these additional layers. In this paper, we describe a design method for multilayered WGPs based on an effective medium theory, a thin-film computation method, and a monochromatic recursive convolution finite-difference time-domain algorithm. The goal of the design process is to identify suitable materials and thicknesses for the additional thin layers needed to lower the reflection appreciably. The design method is illustrated with bilayered WGPs.
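As an illustration of the thin-film computation step, the sketch below evaluates the reflectance of a layered stack with the standard characteristic-matrix method at normal incidence; the layer indices and thicknesses are hypothetical examples, not the paper's WGP design:

```python
import cmath

def layer_matrix(n, d, wavelength):
    """Characteristic 2x2 matrix of a single thin film (normal incidence)."""
    delta = 2 * cmath.pi * n * d / wavelength  # phase thickness of the layer
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def reflectance(layers, n_in=1.0, n_sub=1.5, wavelength=550e-9):
    """Reflectance of a stack; layers = [(refractive_index, thickness_m), ...]."""
    # Multiply the characteristic matrices of all layers in order.
    M = [[1, 0], [0, 1]]
    for n, d in layers:
        L = layer_matrix(n, d, wavelength)
        M = [[M[0][0]*L[0][0] + M[0][1]*L[1][0], M[0][0]*L[0][1] + M[0][1]*L[1][1]],
             [M[1][0]*L[0][0] + M[1][1]*L[1][0], M[1][0]*L[0][1] + M[1][1]*L[1][1]]]
    B = M[0][0] + M[0][1] * n_sub
    C = M[1][0] + M[1][1] * n_sub
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# A quarter-wave low-index layer (n = 1.38) on glass reduces reflection
# relative to the bare interface.
bare = reflectance([])                                  # uncoated substrate
coated = reflectance([(1.38, 550e-9 / (4 * 1.38))])     # quarter-wave coating
```

The same matrix product extends to any number of layers, which is what makes the method convenient for exploring candidate materials and thicknesses before a full FDTD run.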
Samples of three Indian coals, of widely differing origin and rank, were subjected to flash pyrolysis at a temperature of about 1150 °C for 30 s in vacuo, and under atmospheres of nitrogen, argon, ammonia, and perdeuterobenzene. The gaseous products of the pyrolyses were analysed by infra-red spectroscopy, mass spectrometry, and gas chromatography. Observed variations in gas composition are discussed in relation to the possible mode of influence of the pyrolytic atmospheres. It would appear that the pyrolytic atmosphere is an important factor in determining the composition of the pyrolysis products; the influence of nitrogen, argon and perdeuterobenzene is a physical one, leading especially to higher yields of olefins.
Over the last few years, there has been a rapid growth in digital data. Images with quotes are spreading virally through online social media platforms. Misquotes found online often spread like a forest fire through social media, which highlights the lack of responsibility of web users when circulating poorly cited quotes. Thus, it is important to authenticate the content of the images being circulated online. There is therefore a need to retrieve the information within such textual images so that quotes can be verified before use, in order to differentiate a fake or misquote from an authentic one. Optical Character Recognition (OCR) is used in this paper for converting textual images into readable text format, but no OCR tool is perfect at extracting information from images accurately. In this paper, a method of post-processing the retrieved text to improve the accuracy of the text detected in images has been proposed. Google Cloud Vision has been used for recognizing text from images. It has also been observed that post-processing the extracted text improved the accuracy of text recognition by approximately 3.5%. A web-based text similarity approach (URLs and domain names) has been used to examine the authenticity of the content of the quoted images. An accuracy of approximately 96.26% has been achieved in classifying quoted images as verified or misquoted. Also, a ground-truth dataset of authentic site names has been created. In this research, images with quotes by famous celebrities and global leaders have been used. A comparative analysis has been performed to show the effectiveness of the proposed algorithm.
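The abstract does not specify the post-processing step in detail; a minimal sketch of one common approach corrects OCR-damaged words against a lexicon by Levenshtein distance (the lexicon and input word below are invented for illustration):

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def correct(word, lexicon, max_dist=2):
    """Replace an OCR-damaged word with the closest lexicon entry, if close enough."""
    best = min(lexicon, key=lambda w: edit_distance(word.lower(), w))
    return best if edit_distance(word.lower(), best) <= max_dist else word

lexicon = ["imagination", "important", "knowledge"]
fixed = correct("lmaginat1on", lexicon)   # typical OCR confusions: i->l, i->1
```

A real pipeline would build the lexicon from a quote corpus and weight character confusions that OCR engines commonly make, but the dictionary-lookup idea is the same.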
For production of fine-grained and corrosion-resistant tungsten carbide (WC) based cemented carbides, addition of chromium carbide (Cr3C2) in small amounts is standard practice. No systematic study, however, has been made of the effects of large additions (maximum 6 wt%) of Cr3C2 as a substitute for tungsten carbide. This study focuses on the effect of hard-phase substitution by Cr3C2 in WC-10Co cemented carbide. An attempt is also made to modify the binder metal cobalt by partial or complete substitution with nickel. Specimens were prepared using the standard liquid-phase sintering process and were tested for sintered porosity, mechanical properties, corrosion resistance, and microstructural parameters. Results confirm the findings of earlier workers regarding grain refinement and improvement of mechanical properties upon the addition of small amounts (<2 wt%) of Cr3C2. Modification of the binder phase improves indentation fracture toughness and corrosion resistance. Addition of Cr3C2, independent of the binder type, improves corrosion resistance.
With the increasing acceptance of Wireless Sensor Networks (WSNs), the health of individual sensors is becoming critical for identifying important events in the region of interest. One of the key challenges in detecting an event in a WSN is how to detect it accurately while transmitting the minimum information that provides sufficient detail about the event. At the same time, it is also important to devise a strategy to handle multiple events occurring simultaneously. In this paper, we propose a Polynomial-based scheme that addresses these problems of Event Region Detection (PERD) by using an aggregation tree of sensor nodes. We employ a data aggregation scheme, TREG (proposed in our earlier work), to perform function approximation of the event using multivariate polynomial regression. Only the coefficients of the polynomial (P) are passed instead of aggregated data. PERD includes two components: event recognition and event reporting with boundary detection. This can be performed for multiple simultaneously occurring events. We also identify faulty sensor(s) using the aggregation tree. Performing further mathematical operations on the computed P can identify the maximum (max) and minimum (min) values of the sensed attribute and their locations. Therefore, if any sensor reports a data value outside the [min, max] range, it can be identified as faulty. Since PERD is implemented over a polynomial tree on a WSN in a distributed manner, it is easily scalable and its computation overhead is marginal. Results reveal that events can be detected by PERD with the detection error remaining almost constant, staying within a threshold of 10% as the communication range increases. Results also show that a faulty sensor can be detected with an average accuracy of 94%, which increases with node density.
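The fault-detection idea can be sketched as follows, assuming the aggregated polynomial P is represented as a grid of coefficients; the coefficients, region, and sensor readings below are hypothetical, not TREG's actual output:

```python
def poly_eval(coeffs, x, y):
    """Evaluate a bivariate polynomial sum_{i,j} coeffs[i][j] * x**i * y**j."""
    return sum(c * x**i * y**j
               for i, row in enumerate(coeffs)
               for j, c in enumerate(row))

def attribute_range(coeffs, region, step=0.5):
    """Approximate [min, max] of the fitted attribute over a rectangular region."""
    (x0, x1), (y0, y1) = region
    xs = [x0 + k * step for k in range(int((x1 - x0) / step) + 1)]
    ys = [y0 + k * step for k in range(int((y1 - y0) / step) + 1)]
    vals = [poly_eval(coeffs, x, y) for x in xs for y in ys]
    return min(vals), max(vals)

def faulty(readings, coeffs, region):
    """Flag sensors whose reading falls outside the [min, max] of the polynomial."""
    lo, hi = attribute_range(coeffs, region)
    return [sensor_id for sensor_id, value in readings if not (lo <= value <= hi)]

# Hypothetical aggregate: temperature ~ 20 + 0.5*x + 0.2*y over a 10x10 region.
coeffs = [[20.0, 0.2], [0.5, 0.0]]
bad = faulty([("s1", 23.0), ("s2", 99.0)], coeffs, ((0, 10), (0, 10)))
```

Passing only the coefficient grid instead of raw samples is what keeps the per-hop message size fixed, independent of the number of sensors under a subtree.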
Large scale grid computing systems may provide multitudinous services, from different providers, whose quality of service
will vary. Moreover, services are deployed and undeployed in the grid with no central coordination. Thus, to find out the
most suitable service to fulfill their needs, or to find the most suitable set of resources on which to deploy their services,
grid users must resort to a Grid Information Service (GIS). This service allows users to submit rich queries that are normally
composed of multiple attributes and range operations. The ability to efficiently execute complex searches in a scalable and
reliable way is a key challenge for current GIS designs. Scalability issues are normally dealt with by using peer-to-peer
technologies. However, the more reliable peer-to-peer approaches do not cater for rich queries in a natural way. On the other
hand, approaches that can easily support these rich queries are less robust in the presence of failures. In this paper we
present the design of NodeWiz, a GIS that allows multi-attribute range queries to be performed efficiently in a distributed
manner, while maintaining load balance and resilience to failures. 相似文献
With the exponential growth of end users and web data, the internet is undergoing a paradigm shift from a user-centric model to a content-centric one, popularly known as information-centric networking (ICN). Current ICN research revolves around three key issues, namely (i) content request searching, (ii) content routing, and (iii) in-network caching schemes to deliver the requested content to the end user. These improve the user experience in obtaining requested content because they lower the download delay and provide higher throughput. Existing research has mainly focused on on-path congestion or the expected delivery time of a content to determine the optimized path towards the custodian. However, this ignores the cumulative effect of the link-state parameters and the state of the caches, and consequently degrades delay performance. To overcome this shortfall, we consider both the congestion of a link and the state of on-path caches to determine the best possible routes. We introduce a generic term, entropy, to quantify the effects of link congestion and the state of on-path caches. Thereafter, we develop a novel entropy-dependent algorithm, ENROUTE, for searching a content request triggered by any user, routing the content, and caching for the delivery of the requested content to the user. The entropy value of an intra-domain node indicates how many popular contents are already cached in the node, which, in turn, signifies the degree of enrichment of that node with popular contents. On the other hand, the entropy of a link indicates how congested the link is with the traversal of contents. To reduce delay, we enhance the entropy of caches in nodes and use paths with low entropy for downloading contents.
We evaluate the performance of our proposed ENROUTE algorithm against state-of-the-art schemes for various network parameters and observe an improvement of 29–52% in delay, 12–39% in hit rate, and 4–39% in throughput.
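The abstract does not give ENROUTE's exact entropy definition; the sketch below only illustrates the general idea of selecting the path with the lowest entropy, using plain Shannon entropy over hypothetical per-link load samples:

```python
import math

def entropy(weights):
    """Shannon entropy (bits) of a distribution given as non-negative weights."""
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    return -sum(p * math.log2(p) for p in probs)

def best_path(paths):
    """Pick the path with the lowest total link entropy.
    Each path is a list of links; each link is a list of load samples."""
    return min(paths, key=lambda links: sum(entropy(link) for link in links))

# Two hypothetical 2-hop paths described by recent per-link load samples.
path_a = [[1, 1, 1, 1], [1, 1, 1, 1]]   # uniform load: maximal entropy per link
path_b = [[8, 1, 1], [9, 1]]            # skewed load: lower entropy
chosen = best_path([path_a, path_b])
```

The actual algorithm additionally folds the cache state of on-path nodes into the route cost; the single-score comparison above is only the skeleton of that decision.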
Microsystem Technologies - Micro-electro-mechanical systems (MEMS) based piezoresistive pressure sensors have significant importance in several real-world pressure sensor devices, e.g., aviation, IoT...
In this paper, the authors study the problem of testing the hypothesis of a block compound symmetry covariance matrix with two-level multivariate observations, taken for m variables over u sites or time points. Through the use of a suitable block-diagonalization of the hypothesis matrix, it is possible to decompose the main hypothesis into two sub-hypotheses. Using this decomposition, the likelihood ratio test statistic, as well as its exact moments, can be obtained in a much simpler way. The exact distribution of the likelihood ratio test statistic is then analyzed. Because this distribution is quite elaborate, yielding a non-manageable distribution function, a manageable but very precise near-exact distribution is developed. Numerical studies conducted to evaluate the closeness between this near-exact distribution and the exact distribution show the very good performance of this approximation even for very small sample sizes. The approach followed also allows its validity to be extended to situations where the population distributions are elliptically contoured. A real-data example is presented and a simulation study is also conducted.
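For reference, the block compound symmetry structure being tested, and the transform underlying the block-diagonalization, can be written as follows (a standard formulation, stated here as an assumption about the paper's setup):

```latex
% Block compound symmetry (BCS) for u sites and m variables (um x um):
\Sigma \;=\;
\begin{pmatrix}
\Sigma_0 & \Sigma_1 & \cdots & \Sigma_1\\
\Sigma_1 & \Sigma_0 & \cdots & \Sigma_1\\
\vdots   & \vdots   & \ddots & \vdots\\
\Sigma_1 & \Sigma_1 & \cdots & \Sigma_0
\end{pmatrix}
\;=\; I_u \otimes (\Sigma_0-\Sigma_1) \;+\; J_u \otimes \Sigma_1 ,
% where J_u is the u x u matrix of ones.
%
% An orthogonal transform of the form (\Gamma \otimes I_m), with \Gamma a
% Helmert-type matrix, block-diagonalizes \Sigma into
%   \Delta_1 = \Sigma_0 + (u-1)\,\Sigma_1   (appearing once), and
%   \Delta_2 = \Sigma_0 - \Sigma_1          (repeated u-1 times),
% which is what splits the BCS hypothesis into two simpler sub-hypotheses.
```

The two sub-hypotheses then amount to the transformed covariance matrix being block-diagonal and the last u-1 diagonal blocks being equal.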