Hydrogen fuel is a promising alternative to fossil fuels because of its high energy content, clean combustion, and fuel efficiency. However, it is not readily available: most current production processes are very energy intensive and emit carbon dioxide. This article therefore reviews eco-friendly technological options for producing clean hydrogen fuel. Biological methods, such as different fermentation processes and photolysis, are discussed together with the required substrates and the process efficiencies.
The electrochemical reduction of carbon dioxide (CO2) to hydrocarbons is a challenging task because of the difficulty of controlling the efficiency and selectivity of the products. Among the various transition metals, copper has attracted attention because it yields more reduced and C2 products even when mononuclear copper centers are used as catalysts. In addition, the reversible formation of copper nanoparticles has been found to act as the real catalytically active site for the conversion of CO2 to reduced products. Here, it is demonstrated that a dinuclear molecular copper complex immobilized on graphitized mesoporous carbon can catalyze the conversion of CO2 to hydrocarbons (methane and ethylene) with up to 60% efficiency. Interestingly, high selectivity toward the C2 product (40% faradaic efficiency) is achieved by this molecular-complex-based hybrid material from CO2 in 0.1 M KCl. In addition, the roles of local pH, porous structure, and carbon support in limiting mass transport to achieve the highly reduced products are demonstrated. Although spectroscopic analysis of the catalyst indicates that the complex retains its molecular nature after 2 h of bulk electrolysis, morphological study reveals that newly generated copper clusters are the real active sites during the catalytic reactions.
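The faradaic efficiency figure quoted above is conventionally obtained from bulk-electrolysis data as the fraction of the total charge passed that ends up in a given product. The sketch below illustrates that standard calculation; the product amount and charge are invented for the example and are not taken from the paper.

```python
# Faradaic efficiency from bulk-electrolysis data (illustrative numbers only).
F = 96485.0  # Faraday constant, C/mol

def faradaic_efficiency(mol_product, n_electrons, charge_C):
    """FE = charge consumed forming the product / total charge passed."""
    return n_electrons * F * mol_product / charge_C

# Ethylene from CO2 requires 12 electrons per molecule:
# 2 CO2 + 12 H+ + 12 e- -> C2H4 + 4 H2O
fe = faradaic_efficiency(mol_product=3.5e-6, n_electrons=12, charge_C=10.0)
print(f"{fe:.0%}")
```

The same function applies to methane by setting `n_electrons=8` (CO2 + 8 H+ + 8 e- → CH4 + 2 H2O).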
Catalysis Letters - Industrial Cu/ZnO/Al2O3 or novel rate catalysts, prepared with a photochemical deposition method, were studied under functional CH3OH synthesis conditions at the set temperature...
The classical chemical refining of vegetable oils consists of degumming, alkali neutralization, bleaching, and deodorization. Conventional alkali refining of rice bran oil gives oil of acceptable quality, but the refining losses are very high. Oils intended for physical refining should have a low phosphorus content, which is not readily achievable by the conventional acid/water degumming process. A critical study was therefore carried out on the application of membrane technology, already applied successfully to other vegetable oils, in the pretreatment of crude rice bran oil. Ceramic membranes, which are important from the commercial point of view, were examined for this purpose. The results showed that the membrane-filtered oils met the requirements of physical refining, with a substantial reduction in color, and that most of the waxy material was also rejected. Experiments were carried out to establish the relationship of permeate flux and rejection with membrane pore size, trans-membrane pressure, and micellar solute concentration.
The seeds and extracted oils of Carissa spinarum (Apocynaceae) (I), Leucaena leucocephala (Leguminosae) (II) and Physalis minima (Solanaceae) (III) were analyzed for characteristics and compositions. The seeds of I, II and III contained 22.4, 6.4 and 40.0% oil and 10.1, 27.6 and 17.9% protein, respectively. The oils of I, II and III had, respectively, iodine values of 70.1, 113.5 and 122.5; saponification values of 186, 188 and 189; unsaponifiable matter of 5.2, 2.5 and 0.8%; and the following fatty acid compositions (area %): palmitic 12.6, 14.2, 10.5; stearic 7.6, 6.1, 8.6; oleic 72.7, 20.1, 17.3; linoleic 5.2, 53.8, 61.4; linolenic 0.9, 1.8, 0.0; and arachidic 1.0, 2.3, 0.0. II contained 1.7% lignoceric acid. III contained small amounts of hexadecenoic (0.1%), epoxy (0.6%) and hydroxy (1.5%) fatty acids.
In this paper, a novel pyramid-coding-based rate control scheme is proposed for video streaming applications constrained by a constant channel bandwidth. To achieve the target bit rate with the best quality, the initial quantization parameter (QP) is determined from the average spatio-temporal complexity of the sequence, its resolution, and the target bit rate. Simple linear estimation models are then used to predict the number of bits needed to encode a frame of a given complexity at a given QP. The experimental results demonstrate that the proposed rate control scheme significantly outperforms the existing rate control scheme in the Joint Model (JM) reference software in terms of Peak Signal-to-Noise Ratio (PSNR) and consistency of perceptual visual quality while achieving the target bit rate. Finally, the proposed scheme is validated through experimental evaluation on a miniature test-bed.
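The initial-QP selection described above can be sketched as follows. This is a hedged illustration, not the paper's model: it assumes the common linear rate model in which bits per frame are proportional to spatio-temporal complexity and inversely proportional to the quantization step, with an H.264-style QP-to-Qstep mapping; the coefficient `a` and all numbers are invented for the example.

```python
# Sketch of linear-model initial-QP selection (assumed model, not the paper's).

def qp_to_qstep(qp):
    """H.264-style mapping: the quantization step doubles every 6 QP values."""
    return 0.625 * 2 ** (qp / 6.0)

def predict_bits(complexity, qp, a=1.0):
    """Linear estimate: bits ~ a * complexity / Qstep."""
    return a * complexity / qp_to_qstep(qp)

def initial_qp(avg_complexity, target_bits_per_frame, a=1.0):
    """Pick the QP in [0, 51] whose predicted bits are closest to the target."""
    return min(range(52),
               key=lambda qp: abs(predict_bits(avg_complexity, qp, a)
                                  - target_bits_per_frame))

qp = initial_qp(avg_complexity=5000.0, target_bits_per_frame=2000.0)
```

In a full rate controller the coefficient `a` would be updated after each encoded frame from the actually produced bits, so the predictions track the content.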
Noise in textual data, such as that introduced by multilinguality, misspellings, abbreviations, deletions, phonetic spellings, non-standard transliteration, etc., poses considerable problems for text mining. Such corruptions are very common in instant-messenger and short-message-service data, and they adversely affect off-the-shelf text-mining methods. Most techniques address this problem with supervised methods that make use of hand-labeled corrections, but human-generated labels and corrections are very expensive and time consuming to obtain because of the multilinguality and complexity of the corruptions. While we do not champion unsupervised methods over supervised ones when quality of results is the singular concern, we demonstrate that unsupervised methods can provide cost-effective results without the expensive human intervention needed to generate a parallel labeled corpus. A generative-model-based unsupervised technique is presented that maps non-standard words to their corresponding conventional frequent forms. A hidden Markov model (HMM) over a "subsequencized" representation of words is used, where a word is represented as a bag of weighted subsequences. The approximate maximum-likelihood inference algorithm is such that the training phase involves clustering over vectors rather than the customary and expensive dynamic programming (Baum–Welch algorithm) over sequences that HMMs normally require. A principled transformation of the maximum-likelihood-based "central clustering" cost function of Baum–Welch into a "pairwise similarity" based clustering is proposed. This transformation makes it possible to apply "subsequence kernel" based methods, which model delete and insert corruptions well. The novelty of this approach lies in the fact that the expensive Baum–Welch iterations required for HMMs can be avoided through an approximation of the log-likelihood function and by establishing a connection between the log-likelihood and a pairwise distance. Anecdotal evidence of efficacy is provided on public and proprietary data.
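The "bag of weighted subsequences" representation can be sketched concretely. The snippet below is an illustration of the general idea rather than the paper's exact algorithm: each word becomes a bag of length-k character subsequences weighted by a gap-decay factor (so widely separated characters count less), and a noisy token is mapped to the conventional word with the highest cosine similarity. The subsequence length `k`, decay `lam`, and the tiny lexicon are assumptions made for the example.

```python
# Illustrative bag-of-weighted-subsequences normalization (not the paper's
# exact model): gap-decayed character subsequences + cosine similarity.
from itertools import combinations
from collections import Counter
import math

def subseq_bag(word, k=2, lam=0.7):
    """Bag of length-k character subsequences, each weighted by lam**span,
    where span is the distance covered in the original word."""
    bag = Counter()
    for idx in combinations(range(len(word)), k):
        span = idx[-1] - idx[0] + 1
        bag[''.join(word[i] for i in idx)] += lam ** span
    return bag

def cosine(a, b):
    dot = sum(a[s] * b[s] for s in a if s in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def normalize(noisy, lexicon):
    """Map a noisy token to its most similar conventional frequent form."""
    return max(lexicon, key=lambda w: cosine(subseq_bag(noisy), subseq_bag(w)))

result = normalize("tmrw", ["tomorrow", "today", "morning"])
```

Because subsequences survive character deletions and insertions, a vowel-dropped token like "tmrw" still shares most of its features with "tomorrow", which is what makes pairwise-similarity clustering over such vectors a workable substitute for sequence-level dynamic programming.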