21.
The distribution of the heating potential generated by an incident electromagnetic plane wave on a conducting sphere simulating the human head was investigated. It was found that for a sphere of 10-cm radius having the same electrical characteristics as those of biological tissues, no hot spots are generated inside. While at lower frequencies the heating is relatively uniform with some polarization effects, for frequencies above 1000 MHz only skin heating takes place. For a sphere of the same size but of conductivity σ = 10 mmho/cm (which for f > 1000 MHz is lower than that of biological tissues), hot spots occur inside for f > 1000 MHz. Intense hot spots also occur inside spheres of radius 5 cm having the same electrical characteristics as those of biological tissues in the frequency region of 250 MHz.
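As a quick back-of-the-envelope illustration (not from the paper) of why high frequencies heat only the surface, the good-conductor skin depth δ = √(2/ωμσ) can be evaluated at the conductivity quoted above (10 mmho/cm = 1 S/m); tissue at these frequencies is really a lossy dielectric, so treat the numbers as order-of-magnitude only:

```python
import math

MU0 = 4 * math.pi * 1e-7    # vacuum permeability (H/m)
SIGMA = 1.0                 # 10 mmho/cm = 1 S/m, the conductivity quoted above

def skin_depth(freq_hz, sigma=SIGMA, mu=MU0):
    """Good-conductor skin depth: delta = sqrt(2 / (omega * mu * sigma))."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2.0 / (omega * mu * sigma))

for f in (100e6, 250e6, 1000e6, 3000e6):
    print(f"{f / 1e6:6.0f} MHz: skin depth ~ {skin_depth(f) * 100:.1f} cm")
```

At 1000 MHz this gives roughly 1.6 cm, small compared with the 10-cm sphere radius, which is consistent with the surface-heating behaviour described above.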
22.
This study presents the results of applying deep learning methodologies in the ecotoxicology field, with the objective of training predictive models that can support hazard assessment and, eventually, the design of safer engineered nanomaterials (ENMs). A workflow applying two different deep learning architectures to microscopic images of Daphnia magna is proposed that can automatically detect possible malformations, such as effects on tail length and overall size, as well as uncommon lipid concentrations and lipid deposit shapes, caused by direct or parental exposure to ENMs. Next, classification models assign specific objects (heart, abdomen/claw) to classes that depend on lipid densities and compare the results with controls. The models are statistically validated in terms of their prediction accuracy on external D. magna images and illustrate that deep learning technologies can be useful in the nanoinformatics field, because they can automate time-consuming manual procedures, accelerate the investigation of adverse effects of ENMs, and facilitate the process of designing safer nanostructures. It may even be possible in the future to predict impacts on subsequent generations from images of parental exposure, reducing the time and cost involved in long-term reproductive toxicity assays over multiple generations.
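The abstract does not disclose the architectures used; as a minimal sketch of the classification component it describes, here is a small convolutional network in PyTorch (the class count, crop size, and all names are assumptions for illustration):

```python
import torch
import torch.nn as nn

class DaphniaClassifier(nn.Module):
    """Toy CNN assigning a cropped object (e.g. heart, abdomen/claw) to a
    lipid-density class; the architecture and sizes are illustrative only."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):                      # x: (batch, 3, 64, 64)
        return self.head(self.features(x).flatten(1))

model = DaphniaClassifier()
logits = model(torch.randn(4, 3, 64, 64))     # 4 dummy 64x64 RGB crops
print(logits.argmax(dim=1))                   # predicted class per crop
```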
23.
The purpose of atmospheric correction is to produce more accurate surface reflectance and to potentially improve the extraction of surface parameters from satellite images. To achieve this goal, the influences of the atmosphere, solar illumination, sensor viewing geometry and terrain have to be taken into account. Although much information can be extracted from satellite imagery without atmospheric correction, the physically based approach offers advantages, especially when dealing with multitemporal data and/or when a comparison of data provided by different sensors is required. The use of atmospheric correction models is limited by the need to supply data on the condition of the atmosphere at the time of imaging. Such data are not always available and their collection is costly, so atmospheric correction is often performed with standard atmospheric profiles, at some loss of accuracy. Therefore, site-dependent databases of atmospheric parameters are needed to calibrate and adjust atmospheric correction methods for local-level applications. In this article, the methodology and results of the project Adjustment of Atmospheric Correction Methods for Local Studies: Application in ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) (ATMOSAT) for the area of Crete are presented. ATMOSAT aimed at comparing several atmospheric correction methods for the area of Crete, as well as investigating the effects of atmospheric correction on land cover classification and change detection. Databases of the spatio-temporal distributions of all required input parameters (atmospheric humidity, aerosols, spectral signatures, land cover and elevation) were developed, and four atmospheric correction methods were applied and compared. The baseline for this comparison is the spatial distribution of surface reflectance, emitted radiance and brightness temperature as derived from ASTER Higher Level Products (HLPs). The comparison showed that a simple image-based method, adjusted for the study area, provided satisfactory results in the visible, near-infrared and short-wave infrared spectral regions and can therefore be used for local-level applications. Finally, the effects of atmospheric correction on land cover classification and change detection were assessed using a time series of ASTER multispectral images acquired in 2000, 2002, 2004 and 2006. The results agree with past studies, indicating that for this type of application, where a common radiometric scale is assumed among the multitemporal images, atmospheric correction should be taken into account during pre-processing.
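The "simple image based method" is not named in the abstract. Dark-object subtraction is one classic image-based correction; a minimal sketch under that assumption, on synthetic data:

```python
import numpy as np

def dark_object_subtraction(dn, percentile=0.01):
    """Estimate path radiance as the darkest observed value per band and
    subtract it: a common first-order, image-based atmospheric correction."""
    dn = dn.astype(float)
    corrected = np.empty_like(dn)
    for b in range(dn.shape[0]):
        dark = np.percentile(dn[b], percentile)   # near-minimum DN of band b
        corrected[b] = np.clip(dn[b] - dark, 0, None)
    return corrected

bands = np.random.randint(40, 255, size=(3, 512, 512))  # synthetic 3-band scene
print(dark_object_subtraction(bands).min(axis=(1, 2)))  # darkest pixels -> ~0
```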
24.
Flavonoid fatty esters were prepared by acylation of flavonoids (rutin and naringin) with fatty acids (C8, C10, C12), catalyzed by immobilized lipase from Candida antarctica in various solvent systems. The reaction parameters affecting the conversion of the enzymatic process, such as the nature of the organic solvent and acyl donor used, the water activity (aw) of the system, and the acyl donor concentration, were investigated. At optimum reaction conditions, the conversion of flavonoids was 50-60% in tert-butanol at aw below 0.11. In all cases studied, only the flavonoid monoester was identified, which indicates that this lipase-catalyzed esterification is regioselective.
25.
In this paper, model predictive control (MPC) technology is used to tackle the optimal drug administration problem. The important advantage of MPC compared to other control technologies is that it explicitly takes into account the constraints of the system. In particular, for drug treatments of living organisms, MPC can guarantee satisfaction of minimum toxic concentration (MTC) constraints. A whole-body physiologically based pharmacokinetic (PBPK) model serves as the dynamic prediction model of the system after it is formulated as a discrete-time state-space model. Only plasma measurements are assumed to be available on-line; the rest of the states (drug concentrations in other organs and tissues) are estimated in real time by designing a state observer. The complete system (observer and MPC controller) is able to drive the drug concentration to the desired levels in the organs of interest while satisfying the imposed constraints, even in the presence of modelling errors, disturbances and noise.
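A minimal sketch of the observer idea, assuming a discrete-time state-space model x(k+1) = Ax(k) + Bu(k), y(k) = Cx(k) with only the plasma compartment measured; the matrices and the observer gain below are synthetic placeholders, not the paper's PBPK model:

```python
import numpy as np

# Synthetic 3-compartment discrete-time model: plasma, liver, tissue.
A = np.array([[0.90, 0.03, 0.02],
              [0.05, 0.92, 0.00],
              [0.03, 0.00, 0.95]])
B = np.array([[0.1], [0.0], [0.0]])   # drug infusion enters via plasma
C = np.array([[1.0, 0.0, 0.0]])       # only plasma concentration is measured
L = np.array([[0.4], [0.2], [0.1]])   # observer gain (assumed, not designed here)

x = np.array([[1.0], [0.5], [0.2]])   # true state (unknown to the controller)
x_hat = np.zeros((3, 1))              # observer estimate

for k in range(50):
    u = np.array([[0.5]])             # constant infusion, for illustration
    y = C @ x                         # plasma measurement
    # Luenberger update: correct the model prediction with the output error.
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x = A @ x + B @ u

print(np.round((x - x_hat).ravel(), 4))   # estimation error shrinks toward 0
```

Because only the plasma output is measured, the remaining compartments are reconstructed through the output-error correction term; the error dynamics are governed by A - LC, which must be chosen stable.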
26.
There are many processes in a pulp and paper mill where an on-line parameter analyzer cannot be used, for several reasons:

• The analyzer is very expensive.
• It cannot survive in the environment in which it must operate.
• It is not operational due to hardware problems, maintenance, etc.
• Such an analyzer does not exist on the market.

In all these situations it would be valuable for the mill to have an alternative way of measuring those parameters in real time. Neural network models can serve as virtual sensors that infer process parameters from other variables that can be measured on-line. One excellent application of inferential sensors in the pulp and paper industry is the on-line prediction of paper properties such as tensile strength, stretch, brightness and opacity. In tissue machines, the most important quality parameter is softness, which is usually measured in a very subjective manner by the touch of a human finger. In this work we examine how neural networks can be deployed to build on-line virtual sensors for softness and other tissue quality properties. The results are promising and show that neural network technology can improve productivity and minimize out-of-spec production in a tissue machine by providing accurate real-time monitoring of quality parameters.
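A minimal sketch of such a virtual (soft) sensor, with scikit-learn and synthetic data standing in for the on-line process measurements and the softness target (all names and data are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-ins for on-line measurements (e.g. speeds, flows, pressures)
X = rng.normal(size=(500, 6))
# Synthetic "softness" target: an unknown nonlinear function plus noise
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
sensor = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
sensor.fit(X_tr, y_tr)                 # train on historical lab/process data
print(f"R^2 on held-out data: {sensor.score(X_te, y_te):.2f}")
```

Once trained on historical data, such a model can be evaluated on each new set of on-line measurements to give a real-time estimate of the quality parameter between (or instead of) laboratory tests.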
27.
Hash functions are special cryptographic algorithms, applied wherever message integrity and authentication are critical. Implementations of these functions are cryptographic primitives widely used in common cryptographic schemes and security protocols such as Internet Protocol Security (IPSec) and Virtual Private Networks (VPNs). In this paper, a novel FPGA implementation of the Secure Hash Algorithm 1 (SHA-1) is proposed. The proposed architecture exploits the benefits of pipelining and re-timing of execution through pre-computation of intermediate temporal values. Pipelining allows division of the hash computation into four discrete stages, corresponding to the four required rounds of the algorithm. Re-timing is based on the decomposition of the SHA-1 expression to separate information dependencies and independencies. This allows pre-computation of intermediate temporal values in parallel with the calculation of other independent values. Exploiting the information dependencies, the fundamental operational block of SHA-1 is modified so that the maximum operating frequency is increased by approximately 30% with negligible area penalty compared to other academic and commercial implementations. The proposed SHA-1 hash function was prototyped and verified on a XILINX FPGA device. The implementation's characteristics are compared to alternative implementations proposed by academia and industry, which are available in the international IP market. The proposed implementation achieved a throughput exceeding 2.5 Gbps, the highest among all similar IP cores for the targeted XILINX technology.
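For reference, a plain-software rendering of the SHA-1 compression loop, with a comment marking the partial sum whose early computation enables the re-timing described above; this is standard SHA-1 (FIPS 180), not the paper's FPGA design:

```python
import struct

def _rotl(x, n):
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def sha1_compress(state, block):
    """One SHA-1 compression: 80 rounds over a single 64-byte block."""
    w = list(struct.unpack(">16I", block))
    for t in range(16, 80):                     # message schedule expansion
        w.append(_rotl(w[t - 3] ^ w[t - 8] ^ w[t - 14] ^ w[t - 16], 1))
    a, b, c, d, e = state
    for t in range(80):
        if t < 20:
            f, k = (b & c) | (~b & d), 0x5A827999
        elif t < 40:
            f, k = b ^ c ^ d, 0x6ED9EBA1
        elif t < 60:
            f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
        else:
            f, k = b ^ c ^ d, 0xCA62C1D6
        # w[t] + k + e does not depend on the 'a' value produced in the
        # previous round, so hardware can pre-compute this partial sum one
        # cycle early -- the kind of re-timing the paper exploits.
        pre = (w[t] + k + e) & 0xFFFFFFFF
        a, b, c, d, e = (_rotl(a, 5) + f + pre) & 0xFFFFFFFF, a, _rotl(b, 30), c, d
    return [(s + v) & 0xFFFFFFFF for s, v in zip(state, (a, b, c, d, e))]

# One-block message "abc" with standard SHA-1 padding; matches hashlib.sha1.
IV = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
msg = b"abc"
block = msg + b"\x80" + b"\x00" * (55 - len(msg)) + struct.pack(">Q", 8 * len(msg))
digest = b"".join(struct.pack(">I", v) for v in sha1_compress(IV, block))
print(digest.hex())   # a9993e364706816aba3e25717850c26c9cd0d89d
```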
28.
Supply chains are complicated dynamical systems triggered by customer demand. Proper selection of equipment, machinery, buildings and transportation fleets is a key component of the success of such systems. However, the efficiency of supply chains mostly depends on management decisions, which are often based on intuition and experience. Due to the increasing complexity of supply chain systems (a result of changing customer preferences, the globalization of the economy and strong competition among companies), these decisions are often far from optimal. Another factor that complicates decision making is that different stages of a supply chain are often supervised by different groups of people with different management philosophies. From the early 1950s it became evident that a rigorous framework for analyzing the dynamics of supply chains and making sound decisions could substantially improve the performance of these systems. Due to the resemblance of supply chains to engineering dynamical systems, control theory has provided a solid background for building such a framework. During the last half century, many mathematical tools emerging from the control literature have been applied to the supply chain management problem. These tools range from classical transfer function analysis to highly sophisticated control methodologies, such as model predictive control (MPC) and neuro-dynamic programming. The aim of this paper is to provide a review of this effort. The reader will find representative references for many alternative control philosophies and can identify the advantages, weaknesses and complexities of each one. The bottom line of this review is that joint co-operation between control experts and supply chain managers has the potential to introduce more realism into the dynamical models and to develop improved supply chain management policies.
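As a small illustration of the dynamical-systems view, a textbook order-up-to inventory policy with an exponential-smoothing demand forecast, simulated below (a generic toy model, not taken from any reviewed paper), reproduces the demand amplification that motivates control-theoretic analysis:

```python
import numpy as np

LEAD_TIME = 2                  # periods between placing and receiving an order
ALPHA = 0.3                    # exponential-smoothing gain for the forecast

rng = np.random.default_rng(1)
demand = 10 + rng.normal(0, 1, size=200)      # noisy customer demand

inventory = 30.0
pipeline = [10.0] * LEAD_TIME  # orders already placed but not yet received
forecast = 10.0
orders = []

for d in demand:
    inventory += pipeline.pop(0) - d          # receive oldest order, ship demand
    forecast += ALPHA * (d - forecast)        # update the demand forecast
    target = (LEAD_TIME + 1) * forecast       # cover lead time plus one period
    order = max(target - inventory - sum(pipeline), 0.0)
    pipeline.append(order)
    orders.append(order)

# Order variance exceeds demand variance: the bullwhip effect.
print(f"var(demand) = {demand.var():.2f}, var(orders) = {np.var(orders[20:]):.2f}")
```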
29.
In this paper, we construct and evaluate all nonisomorphic Latin hypercube designs with n ≤ 16 runs whose use guarantees that the estimates of the first-order effects are uncorrelated with each other and also uncorrelated with the estimates of the second-order effects in polynomial regression models. The constructed designs are evaluated using well-known and popular criteria, and optimal designs are presented in every case studied. An effort has also been made to construct nonisomorphic small Latin hypercubes in which only the estimates of the first-order effects are required to be uncorrelated with each other, and new designs are presented. All the constructed designs, besides their stand-alone properties, are useful for the construction of larger orthogonal Latin hypercubes with desirable properties, using well-known techniques proposed in the literature.
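A minimal sketch of the property in question: generate a Latin hypercube (random here, unlike the paper's constructed designs) and measure the correlations between the first-order columns and the second-order (quadratic and interaction) columns:

```python
import numpy as np

def random_lhd(n, k, rng):
    """n-run, k-factor Latin hypercube with centered, symmetric levels."""
    levels = np.arange(n) - (n - 1) / 2
    return np.column_stack([rng.permutation(levels) for _ in range(k)])

rng = np.random.default_rng(0)
D = random_lhd(16, 3, rng)

first = D                                         # first-order model columns
second = np.column_stack([D**2,                   # quadratic columns
                          D[:, 0] * D[:, 1],      # two-factor interactions
                          D[:, 0] * D[:, 2],
                          D[:, 1] * D[:, 2]])

corr = np.corrcoef(np.column_stack([first, second]).T)[:3]
print(np.round(corr, 2))   # rows: correlations of x1..x3 with every column;
                           # the paper's designs make the off-diagonal
                           # first-order/second-order entries exactly zero
```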
30.
This paper presents a new stochastic algorithm for solving hierarchical multiobjective optimization problems. The algorithm is based on the simulated annealing concept and returns a single solution that corresponds to the lexicographic ordering approach. The algorithm optimizes the multiple objectives simultaneously by assigning a different initial temperature to each one, according to its position in the hierarchy. A major advantage of the proposed method is its low computational cost, which is critical, particularly for on-line applications, where the time available for decision making is limited. The method is tested on a number of benchmark problems, which illustrate its ability to find near-optimal solutions even in nonconvex multiobjective optimization problems. The results are comparable with those produced by state-of-the-art multiobjective evolutionary algorithms, such as the Nondominated Sorting Genetic Algorithm II. The algorithm is further applied to the solution of a large-scale problem that is formulated on-line, when a multiobjective adaptive model predictive control (MPC) configuration is adopted. This particular control scheme involves an adaptive discrete-time model of the system, which is developed using the radial-basis-function neural-network architecture. A key issue in the success of the adaptation strategy is the introduction of a persistent excitation constraint, which is transformed into a top-priority objective. The overall methodology is applied to the control problem of a pH reactor and proves superior to conventional MPC configurations.
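A minimal sketch of the core idea on a toy problem: each objective gets its own temperature, cooler for higher priority, and a candidate move must survive a Metropolis test at every level in priority order. The problem, schedule and acceptance rule here are illustrative assumptions, not the paper's exact algorithm:

```python
import math
import random

random.seed(0)

# Two objectives in lexicographic order: f1 has two minimizers (x = +/-1);
# f2 breaks the tie and prefers x = -1, the lexicographic optimum.
f1 = lambda x: (abs(x) - 1.0) ** 2
f2 = lambda x: x

x = 0.0
T = [1.0, 2.0]        # one temperature per objective; priority gets the cooler
COOL = 0.999

for _ in range(5000):
    cand = x + random.gauss(0.0, 0.2)
    # Accept only if the move survives a Metropolis test at every level,
    # checked in priority order with that level's own temperature.
    for f, t in zip((f1, f2), T):
        delta = f(cand) - f(x)
        if delta > 0 and random.random() >= math.exp(-delta / t):
            break                         # rejected at this priority level
    else:
        x = cand                          # passed all levels: accept
    T = [t * COOL for t in T]

print(round(x, 2))   # typically settles near -1, the lexicographic optimum
```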