  Fee-based full text   389
  Free access   62
  Domestic free access   1
Electrical engineering   6
Chemical industry   182
Metalworking   7
Machinery and instrumentation   17
Building science   14
Mining engineering   2
Energy and power engineering   7
Light industry   18
Water conservancy engineering   6
Radio and electronics   26
General industrial technology   91
Metallurgical industry   13
Automation technology   63
  2023   5
  2022   9
  2021   6
  2020   11
  2019   15
  2018   26
  2017   18
  2016   30
  2015   29
  2014   64
  2013   75
  2012   36
  2011   21
  2010   35
  2009   19
  2008   22
  2007   6
  2006   2
  2005   2
  2004   1
  2003   2
  2002   1
  2001   3
  1999   1
  1998   1
  1997   2
  1996   1
  1992   2
  1988   1
  1987   1
  1985   2
  1978   1
  1976   1
  1975   1
Sort order: 452 results in total, search time 281 ms
391.
The slope movement at Sedrun (Switzerland) has been studied with ortho-rectified images. Displacement maps compiled for two periods (1973–1990 and 1990–2003), based on a correlation of the aerial images, indicate maximum average slope movements of 60 cm/year, similar to those obtained by traditional photogrammetry over the period 1973–1990. The limits of the most active zones determined by image correlation correspond to those obtained by fieldwork. Comparison of the two displacement maps shows that the instability has accelerated by 150% since 1990. This paper demonstrates the value of using more than one technique to help understand the evolution of this long-term instability.
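The image-correlation step behind such displacement maps can be illustrated with a minimal sketch (an assumption-laden toy, not the authors' processing chain): a brute-force normalized cross-correlation search for the integer pixel shift between two co-registered orthoimage patches. A displacement map repeats this per pixel window, then scales the shift by ground pixel size and the time interval.

```python
import numpy as np

def ncc_displacement(img_a, img_b, max_shift):
    """Find the integer (dy, dx) shift maximizing normalized cross-correlation
    between two co-registered image patches."""
    best, best_shift = -2.0, (0, 0)
    h, w = img_a.shape
    m = max_shift
    core = img_a[m:h - m, m:w - m]
    core_n = (core - core.mean()) / (core.std() + 1e-12)
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            patch = img_b[m + dy:h - m + dy, m + dx:w - m + dx]
            patch_n = (patch - patch.mean()) / (patch.std() + 1e-12)
            score = (core_n * patch_n).mean()  # NCC score in [-1, 1]
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# Synthetic check: shift a random texture by (2, -1) pixels and recover it.
rng = np.random.default_rng(0)
img = rng.random((40, 40))
shifted = np.roll(np.roll(img, 2, axis=0), -1, axis=1)
dy, dx = ncc_displacement(img, shifted, max_shift=4)
```

With, say, a 50 cm ground pixel and the 13-year interval 1990–2003, an annual rate would follow as `hypot(dy, dx) * 50 / 13` cm/year (both figures are illustrative, not from the study).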
392.
Abstract. Numerous methods exist for the analysis of change in time series of satellite images. After a brief review of these methods, a new approach is proposed. This approach is based on residuals computed from a PCA (Principal Component Analysis) of an NDVI table. It consists of: computing the NDVI variable for each date; building a space × time table which joins the NDVI variables; carrying out a PCA on this table; choosing the number, k, of PCA factors which explain the time-invariant landscape structures; and computing the residuals between the NDVI table and the table reconstructed from the k factors. Applied to a series of three Landsat TM scenes acquired in 1983, 1984 and 1993 over the Camargue region (France), this method allows the permanent land-use structures to be separated from their annual variations. The specificity of the so-called 'île Camargue', which is entirely under the control of man-made irrigation systems, is clearly shown; this control of water exchanges alters the land use every year.
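The five steps listed in the abstract can be sketched directly (a minimal NumPy version, assuming NDVI values are already computed per pixel and per date; the synthetic data are illustrative):

```python
import numpy as np

def ndvi_pca_residuals(ndvi_by_date, k):
    """Residuals of a space x time NDVI table after removing the first k
    principal components. ndvi_by_date: list of 1-D arrays, one NDVI value
    per pixel, one array per date."""
    # Steps 1-2: build the space x time table (pixels as rows, dates as columns).
    X = np.column_stack(ndvi_by_date).astype(float)
    # Step 3: PCA via SVD of the column-centred table.
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    # Steps 4-5: reconstruct from k factors; the residuals are what the
    # time-invariant structures do not explain.
    Xk = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :] + mu
    return X - Xk

# Synthetic scene: one stable spatial pattern plus small annual fluctuations.
rng = np.random.default_rng(1)
stable = rng.random(100)                      # permanent landscape structure
dates = [stable + 0.01 * rng.standard_normal(100) for _ in range(3)]
res = ndvi_pca_residuals(dates, k=1)
```

With one factor retained, the stable pattern is absorbed into the reconstruction and the residuals are left on the scale of the annual variations.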
393.
Intense rainfall on urban areas can generate severe flooding in the city, and under the right conditions the flow in the streets can be supercritical. The redistribution of the flow in street intersections determines the flow rates and water levels in the street network. We have investigated the flow that occurs when two supercritical flows collide in a 90° junction formed by streets of identical cross section. Several flow configurations within the intersection are possible, depending on the position of the hydraulic jumps that form in and upstream of the intersection. Previous work has identified three flow types, with Type II flows being further classified into three subregimes. Hydraulic models based on the conservation of flow and momentum flux in the intersection have been developed to predict the angles at which the jumps will form; these models can be used to determine which flow type will occur. Additional models have been developed for computing the outflow discharge distribution. For Type I flows, it has not been possible to develop such a hydraulic model for the discharge distribution, but some data are provided for one configuration to indicate the influence of different parameters. For Type II and Type III flows, such models are developed, and their predictions agree with data obtained from the channel-intersection facility at the Laboratory of Fluid Mechanics and Acoustics in Lyon.
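The two textbook relations underlying such momentum-based jump models can be sketched numerically (the street-flow values below are assumed for illustration, not taken from the paper): the Froude number decides whether the flow is supercritical, and the momentum balance across a jump in a rectangular channel gives the sequent depth.

```python
import math

def froude(q, h, g=9.81):
    """Froude number for unit discharge q (m^2/s per metre width) at depth h (m)."""
    return q / (h * math.sqrt(g * h))

def sequent_depth(h1, fr1):
    """Sequent (downstream) depth of a hydraulic jump in a rectangular channel,
    from the momentum balance: h2/h1 = (sqrt(1 + 8*Fr1^2) - 1) / 2."""
    return h1 * (math.sqrt(1.0 + 8.0 * fr1 ** 2) - 1.0) / 2.0

# Illustrative street flow: 0.2 m^2/s of unit discharge at 5 cm depth
# is strongly supercritical, and a jump raises the depth several-fold.
fr1 = froude(0.2, 0.05)
h2 = sequent_depth(0.05, fr1)
```

Here `fr1` is about 5.7 and the jump raises the 5 cm inflow depth to roughly 38 cm, which is why the position of the jumps controls the water levels in the junction.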
394.
This paper addresses the cure of an in-plane isotropic carbon-epoxy tooling material presenting a specific mesostructure. The Eshelby-Kröner self-consistent model (EKSC) is used in a two-step scale-transition procedure that relates the microscopic to the macroscopic properties of the material and estimates its multi-scale mechanical states. This procedure is used to predict the local residual stresses due to thermal and chemical shrinkage of the resin, depending on the manufacturing process conditions. An experimental investigation provides the BMI resin cure kinetics and mechanical properties as a function of temperature and degree of conversion. The consequences of these evolutions on the local mechanical states are investigated and discussed.
395.
Plasticized corn flour-based materials were prepared by extrusion and injection molding. Extrusion of corn flour blends (75% wet basis (wb), glycerol (5 or 10% wb), water) was performed in a twin-screw extruder with either one or three shearing zones. Native corn flour is mainly composed of corn starch granules surrounded by protein layers. The destructuration of corn flour by thermomechanical treatment was therefore analyzed (i) by techniques that essentially monitor corn starch amorphization (differential scanning calorimetry, X-ray diffractometry, determination of water sorption isotherms, susceptibility to hydrolysis by amylolytic enzymes) and (ii) via the role and distribution of the protein layers, observed by confocal scanning laser microscopy and by comparing the susceptibility of corn starch to hydrolysis by amylolytic enzymes in the presence or absence of a protease. Both the amorphization of corn starch granules and the dispersion and aggregation of proteins were more pronounced for materials extruded with a screw profile with three shearing zones. For materials extruded with a screw profile with one shearing zone, the amorphization of starch was higher in materials made with 5% wb glycerol, whereas the dispersion and aggregation of proteins were more pronounced in materials made with 10% wb glycerol. A barrier role of the proteins against hydrolysis of corn starch by amylolytic enzymes was demonstrated and discussed. © 2011 Wiley Periodicals, Inc. J Appl Polym Sci, 2012
396.
We provide bounds on the probability that accumulated errors in numerical algorithms never rise above a given threshold. Such algorithms are used, for example, in aircraft and nuclear power plants. This report contains simple formulas based on Lévy's, Markov's and Hoeffding's inequalities, and it presents a formal theory of random variables with a special focus on producing concrete results. We select three very common applications that cover the common practices of systems that evolve for a long time. For the first two applications, we compute the number of bits that remain continuously significant with a probability of failure around one in a billion, whereas worst-case analysis concludes that no significant bit remains. We use PVS, as such formal tools force the explicit statement of all hypotheses and prevent incorrect uses of theorems.
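The flavour of such a bound can be shown with Hoeffding's inequality alone (a back-of-the-envelope sketch with assumed parameters, far simpler than the paper's formal PVS development): for a sum of n independent errors each bounded in [-u, u], P(|S_n| ≥ t) ≤ 2·exp(-t²/(2nu²)).

```python
import math

def hoeffding_tail(n, u, t):
    """Hoeffding bound P(|S_n| >= t) <= 2 exp(-t^2 / (2 n u^2)) for a sum
    of n independent errors, each bounded in [-u, u]."""
    return 2.0 * math.exp(-t ** 2 / (2.0 * n * u ** 2))

def threshold_for(n, u, eps):
    """Smallest t such that the Hoeffding bound equals eps."""
    return u * math.sqrt(2.0 * n * math.log(2.0 / eps))

n = 10 ** 6           # a million accumulated roundings (illustrative)
u = 2.0 ** -53        # unit roundoff of binary64
t = threshold_for(n, u, eps=1e-9)   # exceeded with prob. < one in a billion
worst = n * u                       # worst-case accumulated error
# t is roughly 2**-40 while the worst case is near 2**-33: the probabilistic
# analysis guarantees about 7 more continuously significant bits than
# worst-case analysis does.
```

The probabilistic threshold `t` grows like √n rather than n, which is why the gap over worst-case analysis widens for long-running systems.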
397.
398.
We propose an efficient hardware-oriented method for evaluating complex polynomials. The method is based on iteratively solving a system of linear equations. The solutions are obtained digit by digit on simple and highly regular hardware. The operations performed are defined over the reals. We describe a complex-to-real transform, a complex polynomial evaluation algorithm, the convergence conditions, and a corresponding design and implementation. The latency and the area are estimated for the radix-2 case. The main features of the method are: a latency of about m cycles for m-bit precision; a cycle time independent of the precision; a design consisting of identical modules; and digit-serial connections between the modules. The number of modules, each roughly corresponding to a serial-parallel multiplier without a carry-propagate adder, is 2(n + 1) for evaluating an n-th degree complex polynomial. The method can also be used to compute all successive integer powers of the complex argument with the same latency and a similar implementation cost. The design allows straightforward tradeoffs between latency and cost: a factor-k decrease in cost leads to a factor-k increase in latency. A similar tradeoff between precision, latency and cost exists. The proposed method is attractive for programmable platforms because of its regular and repetitive structure of simple hardware operators.
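As a software reference point (Horner's scheme, not the paper's digit-serial hardware algorithm), the two computations the method targets, evaluating a complex polynomial and producing all successive powers of the argument, can be sketched in a few lines:

```python
def horner(coeffs, z):
    """Evaluate a complex polynomial at z, coefficients given highest
    degree first: [a_n, ..., a_1, a_0]."""
    acc = 0j
    for a in coeffs:
        acc = acc * z + a
    return acc

def powers(z, n):
    """All successive powers z^1 .. z^n, the companion computation the
    hardware method supports with the same latency."""
    out, p = [], 1 + 0j
    for _ in range(n):
        p *= z
        out.append(p)
    return out

# p(z) = z^2 + 2z + 3 at z = 1 + 1j
val = horner([1, 2, 3], 1 + 1j)
pw = powers(2j, 3)
```

Horner needs n sequential multiply-adds per evaluation; the appeal of the hardware scheme above is trading that serial dependence for 2(n + 1) identical digit-serial modules with latency of about m cycles regardless of n.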
399.
We consider the problem of the global minimization of a function observed with noise. This problem occurs, for example, when the objective function is estimated through stochastic simulations. We propose an original method for iteratively partitioning the search domain when this domain is a finite union of simplexes. On each subdomain of the partition, we compute an indicator measuring whether the subdomain is likely to contain a global minimizer. The next areas to be explored are chosen in accordance with this indicator. Confidence sets for minimizers are given. Numerical applications show empirical convergence results and illustrate the compromise to be made between global exploration of the search domain and focusing on potential minimizers of the problem.
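The partition-and-refine idea can be caricatured in one dimension (intervals standing in for the simplexes of the actual method; the indicator, constants, and objective below are all illustrative assumptions, not the authors' construction):

```python
import math
import random

def noisy_branch_min(f, lo, hi, rounds=40, samples=12, seed=0):
    """Toy sketch: repeatedly split the most promising cell, scoring each
    cell by an optimistic lower confidence bound on the mean of noisy
    samples drawn inside it."""
    rng = random.Random(seed)

    def score(a, b):
        vals = [f(rng.uniform(a, b)) for _ in range(samples)]
        mean = sum(vals) / samples
        var = sum((v - mean) ** 2 for v in vals) / samples
        return mean - 2.0 * math.sqrt(var / samples)  # exploration indicator

    cells = [(score(lo, hi), lo, hi)]
    for _ in range(rounds):
        cells.sort()                 # lowest bound = most promising first
        _, a, b = cells.pop(0)
        m = (a + b) / 2.0
        cells.append((score(a, m), a, m))
        cells.append((score(m, b), m, b))
    _, a, b = min(cells)
    return (a + b) / 2.0             # centre of the best remaining cell

# Noisy quadratic whose true global minimizer is x = 0.7:
noise = random.Random(7)
est = noisy_branch_min(lambda x: (x - 0.7) ** 2 + noise.gauss(0.0, 0.01),
                       0.0, 2.0)
```

The variance term in the indicator is what encodes the exploration/focalization compromise the abstract mentions: noisy, under-sampled cells keep an optimistic bound and so are not discarded too early.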
400.
Super-crosslinked epoxy nanocomposites containing N-octadecyl-N′-octadecyl imidazolium iodide (IM)-functionalized montmorillonite (MMT-IM) nanoplatelets were developed and examined for cure kinetics, viscoelastic behavior and thermal degradation kinetics. The structure and morphology of MMT-IM were characterized by FTIR, XRD, TEM, and TGA. The synthesized MMT-IM revealed synergistic effects on network formation, the glass transition temperature (Tg) and the thermal stability of the epoxy. The cure and viscoelastic behaviors of epoxy nanocomposites containing 0.1 wt% MMT and MMT-IM were compared based on DSC and DMA, respectively. An activation energy profile as a function of the extent of cure was obtained. DMA results indicated a strong interface between the imidazole groups of MMT-IM and the epoxy, which caused a significant improvement in the storage modulus and the Tg of the epoxy. The network degradation kinetics of epoxy containing 0.5, 2.0, and 5.0 wt% MMT and MMT-IM were compared using the Friedman, Kissinger-Akahira-Sunose (KAS), Flynn-Wall-Ozawa (FWO) and modified Coats-Redfern methods. Addition of MMT to epoxy was detrimental to the Tg, which fell from 94.1°C to 89.7°C as measured by DMA and from 103.3°C to 97.9°C as measured by DSC. By contrast, the super-crosslinked epoxy/MMT-IM systems showed meaningful increases in the same values, from 94.1°C to 94.7°C and from 103.3°C to 104.7°C.
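The model-free kinetic methods named above all reduce to fitting a straight line whose slope carries the activation energy. A minimal sketch of the Kissinger variant (synthetic numbers, not the paper's data): ln(β/Tp²) = const − E/(R·Tp) over several heating rates β and DSC peak temperatures Tp.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def kissinger_activation_energy(betas, peak_temps):
    """Activation energy (J/mol) from the Kissinger relation
    ln(beta / Tp**2) = const - E / (R * Tp), fitted by least squares over
    heating rates beta and exotherm peak temperatures Tp (in kelvin)."""
    xs = [1.0 / tp for tp in peak_temps]
    ys = [math.log(b / tp ** 2) for b, tp in zip(betas, peak_temps)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * R  # slope of the Kissinger plot is -E/R

# Synthetic check: heating rates generated from the relation itself with a
# known E = 60 kJ/mol must be recovered by the fit.
E_true = 60_000.0
temps = [400.0, 410.0, 420.0, 430.0]
betas = [tp ** 2 * math.exp(20.0 - E_true / (R * tp)) for tp in temps]
E_fit = kissinger_activation_energy(betas, temps)
```

Friedman, KAS, and FWO differ mainly in which quantity is plotted against 1/T at fixed conversion, so the same least-squares core serves all of them.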
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号