81.
Polylactide (PLA) is the most widely used biodegradable and biobased food-packaging polymer for rigid containers and films. However, its low ductility hinders its wider application in flexible food packaging; one solution is the use of additives. Palm oil deodorizer distillate (PODC), a by-product of vegetable oil refining that is available in consistent quality and in sufficient quantity, proves to be an excellent additive for promoting PLA ductility. Amorphous PLA/PODC blends reached an elongation at break of around 130%, and even semi-crystalline blends reached around 55%, compared with the initial 5% of neat PLA, while the rigidity and high glass transition temperature of PLA were retained. PODC was also a very efficient processing aid, enabling blown film extrusion. The blends' properties remained stable over six months without exudation. They complied with the legal requirements for food contact materials (EU Regulation 10/2011) and induced no sensorial alteration of packed food. PODC is therefore a very interesting alternative to common plasticizers for the production of flexible PLA packaging films. © 2016 Society of Chemical Industry
82.
This paper introduces an application of Computational Intelligence (CI) to Moving Picture Experts Group-4 (MPEG-4) video compression over IEEE 802.15.1 wireless communication, known as Bluetooth 1.2, in order to improve picture quality. The IEEE 802.15.1 standard uses the 2.4 GHz Industrial, Scientific and Medical frequency band and can therefore suffer noise and interference from neighboring wireless devices sharing the same carrier frequency. This noise and interference make it difficult to ascertain an accurate real-time transmission rate at the receiving end. Furthermore, the MPEG-4 codec is an object-oriented compression system that demands high bandwidth, so excessive delay, image-quality degradation and/or data loss are hard to avoid during MPEG-4 video transmission over standard systems. A new buffer, termed the ‘buffer added’, is introduced at the input of the Bluetooth 1.2 device. This buffer is controlled by a rule-based fuzzy (RBF) logic controller at its input and a neural-fuzzy controller (NFC) at its output; the two fuzzy rule sets manipulate and supervise the flow of video over the Bluetooth 1.2 standard. Computer simulation results comparing non-CI video transmission over Bluetooth 1.2 with the proposed design confirm that the RBF and NFC improve image quality and reduce both data loss and time delay.
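The fuzzy buffer-control idea above can be illustrated with a minimal sketch. This is not the paper's actual controller: the triangular membership functions, the three-rule base, and the crisp rate-scale consequents below are invented for illustration, and a zero-order Sugeno weighted average stands in for whatever defuzzification the authors use.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def buffer_controller(occupancy):
    """Map buffer occupancy in [0, 1] to a video bit-rate scale factor.

    Zero-order Sugeno inference: each rule's firing strength weights a
    crisp consequent, and the output is their weighted average.
    """
    # Fuzzify: degree to which the buffer is LOW, MEDIUM, or HIGH.
    low = tri(occupancy, -0.5, 0.0, 0.5)
    med = tri(occupancy, 0.0, 0.5, 1.0)
    high = tri(occupancy, 0.5, 1.0, 1.5)
    # Rule base (illustrative): LOW -> raise rate, MEDIUM -> hold, HIGH -> throttle.
    strengths = [low, med, high]
    consequents = [1.2, 1.0, 0.5]
    num = sum(w * c for w, c in zip(strengths, consequents))
    den = sum(strengths)
    return num / den if den else 1.0
```

An empty buffer yields the full 1.2x rate and a full buffer throttles to 0.5x, with a smooth blend in between, which is the behavior a rate controller in front of a lossy Bluetooth link would want.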
83.
This paper demonstrates the value of fuzzy inference systems (FIS) for system modeling when human interaction is important. It discusses what makes FIS distinctive, namely their capability to integrate expertise and rule learning from data into a single framework, and situates them relative to competing approaches. An open-source software implementation is presented, with a focus on the features useful for modeling. Two real-world case studies illustrate the approach and the utility of the software.
84.
We address the problem of verifying planning domains as used in model-based planning, for example in space missions. We propose a self-contained methodology for testing the flight rules of planning domains: flight rules are verified using a planner itself, with no external tools required. We review and analyse coverage conditions for requirements-based testing, and reason in detail about "Unique First Cause" (UFC) coverage for test suites. We characterise flight rules using patterns encoded in LTL, and provide UFC coverage for them. We then present a translation of LTL formulae into planning goals, and illustrate the approach on a case study.
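To make the pattern idea concrete, here is a toy finite-trace check for the common "response" pattern G(trigger → F response). This is illustration only: the paper's methodology compiles such LTL patterns into planning goals and derives UFC-covering test suites, which this sketch does not attempt.

```python
def holds_response(trace, trigger, response):
    """Check the flight-rule pattern G(trigger -> F response) on a finite
    plan trace, given as a list of sets of propositions: every state in
    which `trigger` holds must be followed (at that state or later) by a
    state in which `response` holds."""
    for i, state in enumerate(trace):
        if trigger in state and not any(response in later for later in trace[i:]):
            return False
    return True
```

A plan trace where every "cmd" is eventually acknowledged satisfies the rule; a trace ending with an unacknowledged "cmd" violates it.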
85.
86.
We show that observed co-variations at sub-hourly time scales between the photochemical reflectance index (PRI) and canopy light use efficiency (LUE) over a Douglas-fir forest result directly from sub-hourly leaf reflectance changes in a spectral window roughly 50 nm wide centered at 531 nm. We conclude that, over a forest stand, we are observing the direct effects of photosynthetic down-regulation on leaf-level reflectance at 531 nm. Key to this conclusion are our ability to simultaneously measure the LUE and reflectance of the Douglas-fir stand as a function of shadow fraction, from the “hot spot” to the “dark spot”, and a new finding herein, based on radiative transfer theory, that the magnitude of a normalized reflectance difference index (NDRI) such as PRI can vary with shadow fraction only if the reflectance of the shaded and sunlit leaves differs in at least one of the NDRI bands.

Our spectrometer measurements over a nearly six-month period show that, at the forest stand scale, only two NDRIs (both containing a band near 570 nm) vary with shadow fraction and correlate with LUE: one with a band roughly 50 nm wide centered at 531 nm, and another with a band near 705 nm. We therefore conclude that only these two bands' reflectances differ between the sunlit and shaded elements of the canopy, and that their reflectance changes on time scales of a few minutes or less. Our observations also show that the reflectance changes at 531 nm are more highly correlated with variations in canopy light use efficiency when only sunlit canopy elements are viewed (the hot spot) than when only shaded elements are viewed (the dark spot).

Taken together, these results demonstrate that the observed sub-hourly changes in foliage reflectance at 531 nm and 705 nm can only result from corresponding variations in photosynthetic rates. The importance of our results is twofold: (1) we show that variations in PRI with LUE are a direct result of rapid changes in foliage reflectance at 531 nm caused by photosynthetic down-regulation, and that these can be observed at forest scales; (2) our findings suggest a new sensor and methodology for the direct retrieval from space of changes in forest LUE, by measuring PRI as a function of shadow fraction with a multi-angle spectrometer that simultaneously retrieves both shadow fraction and PRI.
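For reference, PRI is one member of the NDRI family discussed above. A minimal sketch, assuming the standard PRI formulation with the 531 nm signal band and a 570 nm reference band:

```python
def ndri(r_band1, r_band2):
    """Normalized reflectance difference index of two band reflectances."""
    return (r_band1 - r_band2) / (r_band1 + r_band2)

def pri(r531, r570):
    """Photochemical reflectance index: the NDRI of the 531 nm and 570 nm bands.

    Down-regulation lowers reflectance at 531 nm while 570 nm is (roughly)
    unaffected, so PRI decreases as photosynthetic efficiency drops.
    """
    return ndri(r531, r570)
```

As the text argues, such an index can vary with shadow fraction only if sunlit and shaded leaves differ in at least one of its two bands.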
87.
The idea of decomposed software pipelining is to decouple the software pipelining problem into a cyclic scheduling problem without resource constraints and an acyclic scheduling problem with resource constraints. In terms of loop transformation and code motion, the technique can be formulated as a combination of loop shifting and loop compaction. Loop shifting amounts to moving statements between iterations, thereby changing some loop-independent dependences into loop-carried dependences and vice versa. Loop compaction then schedules the body of the loop considering only loop-independent dependences, but taking into account the details of the target architecture. In this paper, we show how loop shifting can be optimized so as to minimize both the length of the critical path and the number of dependences for loop compaction. The first problem is well known and can be solved by an algorithm due to Leiserson and Saxe. We show that the second optimization (and its combination with the first) is also polynomially solvable with a fast graph algorithm, a variant of minimum-cost flow algorithms. Finally, we analyze the improvements obtained on loop compaction through experiments on random graphs.
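The effect of loop shifting on dependences can be seen in a small sketch. The two-statement loop below is a hypothetical example, not taken from the paper: shifting S1 one iteration earlier (with a prologue and epilogue) turns the loop-independent dependence S1 → S2 into a loop-carried one, and vice versa, while both versions compute the same values.

```python
def original(n):
    """Original loop: S1 then S2 each iteration.
    S1 -> S2 is loop-independent (via a[i]); S2 -> S1 is loop-carried (via c[i-1])."""
    a = [0] * (n + 1)
    c = [0] * (n + 1)
    for i in range(1, n + 1):
        a[i] = c[i - 1] + 1   # S1
        c[i] = a[i] * 2       # S2
    return a, c

def shifted(n):
    """Shifted loop: S1 moved one iteration earlier.
    In the steady-state body, the former loop-carried dependence S2 -> S1
    is now loop-independent (S1 for i+1 reads c[i] written by S2 for i),
    while S1 -> S2 has become loop-carried."""
    a = [0] * (n + 1)
    c = [0] * (n + 1)
    a[1] = c[0] + 1           # prologue: S1 for i = 1
    for i in range(1, n):
        c[i] = a[i] * 2       # S2 for i
        a[i + 1] = c[i] + 1   # S1 for i + 1
    c[n] = a[n] * 2           # epilogue: S2 for i = n
    return a, c
```

Loop compaction would then schedule each body under the new loop-independent dependences only, which is where the freedom gained by shifting pays off.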
88.
89.
Numerical weather prediction (NWP) is in a period of transition. As resolutions increase, global models are moving towards fully nonhydrostatic dynamical cores, with local and global models using the same governing equations; we have therefore reached a point where it will be necessary to use a single model for both applications. The new dynamical cores at the heart of these unified models are designed to scale efficiently on clusters with hundreds of thousands or even millions of CPU cores and GPUs. Operational and research NWP codes currently use a wide range of numerical methods: finite differences, spectral transform, finite volumes and, increasingly, finite/spectral elements and discontinuous Galerkin, which together constitute element-based Galerkin (EBG) methods. Given their important role in this transition, will EBG methods be the dominant force behind NWP in the next 10 years, or will they be just one of many methods to choose from? One decade after the review of numerical methods for atmospheric modeling by Steppeler et al. (Meteorol Atmos Phys 82:287–301, 2003), this review discusses EBG methods as a viable numerical approach for next-generation NWP models. One well-known weakness of EBG methods is the generation of unphysical oscillations in advection-dominated flows; special attention is hence devoted to dissipation-based stabilization methods. Since EBG methods are geometrically flexible and allow both conforming and non-conforming meshes, as well as grid adaptivity, the review concludes with a short overview of how mesh generation and dynamic mesh refinement are becoming as important for atmospheric modeling as they have long been for engineering applications.
90.
Digital Elevation Models (DEMs) are used to compute the hydro-geomorphological variables required by distributed hydrological models. However, the resolution of the most precise DEMs is too fine to run these models over regional watersheds, so DEMs need to be aggregated to coarser resolutions, which affects both the representation of the land surface and the hydrological simulations. In the present paper, six aggregation algorithms (mean, median, mode, nearest neighbour, maximum and minimum) are used to aggregate the Shuttle Radar Topography Mission (SRTM) DEM from 3″ (90 m) to 5′ (10 km) in order to simulate the water balance of the Lake Chad basin (2.5 million km²). Each method is assessed with respect to selected hydro-geomorphological properties that influence Terrestrial Hydrology Model with Biogeochemistry (THMB) simulations, namely the drainage network, the Lake Chad bottom topography and the floodplain extent.

The results show that the mean and median methods produce a smoother representation of the topography. This smoothing removes the depressions governing the floodplain dynamics (floodplain area < 5000 km²), but it also eliminates the spikes and wells responsible for deviations in the drainage network. By contrast, the other aggregation methods yield a rougher relief representation that enables the simulation of a larger floodplain area (> 14,000 km² with the maximum or nearest neighbour methods) but introduces anomalies in the drainage network. An aggregation procedure based on a variographic analysis of the SRTM data is therefore suggested: the 3″ DEM is first filtered to smooth spikes and wells, then resampled to 5′ via the nearest neighbour method so as to preserve the representation of depressions. With the resulting DEM, the drainage network, the Lake Chad bathymetric curves and the simulated floodplain hydrology are consistent with observations (a 3% underestimation of simulated evaporation volumes).
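The block-aggregation idea compared above can be sketched as follows. This is a toy pure-Python version, not the study's pipeline: the actual work operates on the SRTM grid with geospatial tooling, and "nearest" is simplified here to the block's top-left cell as a stand-in for nearest-neighbour resampling.

```python
from statistics import mean, median

def aggregate(dem, factor, method):
    """Aggregate a fine-resolution DEM (a list of equal-length rows of
    elevations) to a coarser grid: each coarse cell summarizes one
    factor x factor block of fine cells using the chosen reducer."""
    reduce_block = {
        "mean": mean,
        "median": median,
        "min": min,
        "max": max,
        "nearest": lambda block: block[0],  # top-left cell of the block
    }[method]
    rows, cols = len(dem), len(dem[0])
    coarse = []
    for r in range(0, rows, factor):
        coarse_row = []
        for c in range(0, cols, factor):
            block = [dem[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            coarse_row.append(reduce_block(block))
        coarse.append(coarse_row)
    return coarse
```

On the same grid, "mean" smooths away local depressions and spikes alike, while "max", "min" and "nearest" each preserve one kind of extreme, which is exactly the trade-off the paper measures against the drainage network and floodplain extent.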