11.
Sacrificial etching is one of the most important process steps in micro-electro-mechanical systems (MEMS) technology, since it enables the generation of free-standing structures. These structures are often the main part of micro-mechanical devices intended to sense or induce a mechanical movement. The etching process transforms an initial multi-segmented geometry and depends on material properties and several process conditions. One of the crucial issues for etching is the etching selectivity between different materials. The major task for the simulation is to answer how sacrificial-layer surfaces regress in time under the influence of process parameters and to what extent surrounding material segments are affected by the etching process. For this purpose we have developed a fully three-dimensional topography simulation tool, Etcher-Topo3D, which is capable of dealing with realistic process conditions. The main concept is demonstrated in this work. During simulation the topography of the initial multi-segment geometry evolves, which is handled by a level-set algorithm. After a simulation is finished, the level-set representation usually has to be converted back to a mesh representation to enable further analysis. To illustrate the main features of our simulation tool, several examples of MEMS structures with a sacrificial layer are presented.
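The level-set idea behind such topography simulators can be sketched in a few lines: the etch front is the zero contour of a scalar field φ, which is advected with the local etch speed. This is an illustrative, first-order 2-D sketch (a uniform etch rate, periodic boundaries), not the Etcher-Topo3D implementation:

```python
import numpy as np

def evolve_level_set(phi, speed, dx, dt, steps):
    """Advance a 2-D level-set function under phi_t + speed*|grad phi| = 0
    with a first-order Godunov upwind scheme; speed > 0 moves the zero
    contour (the etch front) outward from the phi < 0 region."""
    for _ in range(steps):
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx   # backward difference, x
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx  # forward difference, x
        dym = (phi - np.roll(phi, 1, axis=1)) / dx   # backward difference, y
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx  # forward difference, y
        # Godunov upwinding for a positive normal speed
        grad = np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2 +
                       np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)
        phi = phi - dt * speed * grad
    return phi

# Circular hole in the sacrificial layer as the initial geometry
n = 64
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi0 = np.sqrt(X**2 + Y**2) - 0.3          # signed distance to a circle, r = 0.3
phi1 = evolve_level_set(phi0, speed=1.0, dx=x[1] - x[0], dt=0.01, steps=10)
# The etched (phi < 0) region has grown by roughly speed * dt * steps
```

In a full simulator the speed field would depend on the material segment and the process conditions, and the final φ would be converted back to a mesh for further analysis, as described above.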
12.
We introduce two new concepts of ε-efficiency and investigate these and some known concepts under the following aspects: (1) What are the relationships between the efficient and the ε-efficient point sets? (2) If a sequence (v_ε) of ε-efficient points converges for ε → 0, is the limit efficient, or weakly efficient? (3) Can the distance between the ε-efficient and the efficient point sets be estimated in terms of ε?
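As a reference point for the questions above, one widely used notion of ε-efficiency for minimizing a vector-valued objective can be written as follows; the abstract's two new concepts are variations on this theme, so their exact definitions may differ:

```latex
% A common definition of epsilon-efficiency for minimizing
% f : X -> R^m; the paper's own concepts may differ in detail.
\[
  v_\varepsilon \in X \ \text{is $\varepsilon$-efficient} \iff
  \nexists\, u \in X : \;
  f(u) \leq f(v_\varepsilon) - \varepsilon e
  \ \text{and}\
  f(u) \neq f(v_\varepsilon) - \varepsilon e,
  \qquad e = (1,\dots,1)^{\top},
\]
% with "\leq" understood componentwise; epsilon = 0 recovers
% ordinary (Pareto) efficiency.
```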
13.
14.
The field of data mining has become accustomed to specifying constraints on patterns of interest. A large number of systems and techniques have been developed for solving such constraint-based mining problems, especially for mining itemsets. The approach taken in the field of data mining contrasts with the constraint programming principles developed within the artificial intelligence community. While most data mining research focuses on algorithmic issues and aims at developing highly optimized and scalable implementations tailored towards specific tasks, constraint programming employs a more declarative approach. The emphasis lies on developing high-level modeling languages and general solvers that specify what the problem is rather than outlining how a solution should be computed, yet are powerful enough to be used across a wide variety of applications and application domains.

This paper contributes a declarative constraint programming approach to data mining. More specifically, we show that it is possible to employ off-the-shelf constraint programming techniques for modeling and solving a wide variety of constraint-based itemset mining tasks, such as frequent, closed, discriminative, and cost-based itemset mining. In particular, we develop a basic constraint programming model for specifying frequent itemsets and show that this model can easily be extended to realize the other settings. This contrasts with typical procedural data mining systems, where the underlying procedures need to be modified in order to accommodate new types of constraints, or novel combinations thereof.

Even though state-of-the-art data mining systems outperform the constraint programming approach on some standard tasks, we also show that there exist problems where the constraint programming approach leads to significant performance improvements over state-of-the-art methods in data mining, as well as to new insights into the underlying data mining problems. Many such insights can be obtained by relating the underlying search algorithms of data mining and constraint programming systems to one another. We discuss a number of interesting new research questions and challenges raised by the declarative constraint programming approach to data mining.
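The declarative model can be illustrated with a small generate-and-test sketch (plain Python, not the paper's CP formulation): Boolean decisions select items, a coverage relation links each candidate itemset to the transactions containing it, and a frequency constraint filters the candidates. A real CP solver would prune this search space by constraint propagation rather than enumerate it exhaustively:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Declarative-style frequent itemset mining: enumerate candidate
    itemsets and keep those whose coverage satisfies the frequency
    constraint. Returns (itemset, support) pairs."""
    items = sorted({i for t in transactions for i in t})
    result = []
    for k in range(1, len(items) + 1):
        for itemset in combinations(items, k):
            s = set(itemset)
            # Coverage: transactions that contain every item of the set
            support = sum(1 for t in transactions if s <= set(t))
            if support >= min_support:          # frequency constraint
                result.append((s, support))
    return result

data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
print(frequent_itemsets(data, min_support=2))
```

Extensions such as closed or cost-based mining correspond, in the CP view, to adding further constraints over the same decision variables rather than rewriting the search procedure.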
15.
Epidemiological studies indicate a correlation between cruciferous vegetable consumption and a reduced incidence of cancer. This study was designed to investigate molecular mechanisms that may help to explain the beneficial effects of Brussels sprout consumption. In order to avoid the limitations of in vitro model systems, we performed a dietary intervention study with five participants and investigated whether sprout consumption affects the proteome profile of primary white blood cells. In order to achieve maximal sensitivity in detecting specific adaptive proteome alterations, we metabolically labelled freshly isolated cells in the presence of ³⁵S-methionine/cysteine and performed autoradiographic quantification of protein synthesis. Proteins were separated by 2-DE, and spots of interest were cut out, digested, and identified by MS. After the intervention, we found a significant up-regulation of the synthesis of manganese superoxide dismutase (MnSOD; 1.56-fold) and a significant down-regulation of the synthesis of heat shock 70 kDa protein (Hsp70; 2.27-fold). Both proteins play a role in the malignant transformation of cells: Hsp70 is involved in the regulation of apoptosis, which leads to the elimination of cancer cells, while MnSOD plays a key role in protection against effects mediated by reactive oxygen species. Our findings indicate that the altered synthesis of these proteins may be involved in the anticarcinogenic effects of cruciferous vegetables observed in earlier laboratory studies with animals.
16.
Transport in single- and double-barrier devices is studied using a Monte Carlo solver for the Wigner transport equation. This approach allows the effects of tunneling and scattering to be included. Several numerical methods have been improved to render the Wigner Monte Carlo technique more robust, including a newly developed particle annihilation algorithm. A self-consistent iteration scheme with the Poisson equation was introduced. The role of scattering and space-charge effects in the electrical characteristics of n-i-n nanostructures, ultra-scaled double-gate MOSFETs, and GaAs resonant tunneling diodes is demonstrated. An erratum to this article can be found at
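The annihilation step in signed-particle Wigner Monte Carlo can be sketched as follows (an illustrative reconstruction, not the authors' actual algorithm): particles of opposite sign that fall into the same phase-space cell carry cancelling contributions to the Wigner function and can be removed, keeping the particle count bounded:

```python
from collections import defaultdict

def annihilate(particles, dx, dk):
    """Pairwise annihilation of signed particles in phase space:
    particles of opposite sign in the same (x, k) cell cancel, and
    only the net surplus per cell is re-emitted at the cell centre."""
    cells = defaultdict(int)
    for x, k, sign in particles:
        cells[(int(x // dx), int(k // dk))] += sign
    survivors = []
    for (ix, ik), net in cells.items():
        sign = 1 if net > 0 else -1
        for _ in range(abs(net)):   # net == 0 means full cancellation
            survivors.append(((ix + 0.5) * dx, (ik + 0.5) * dk, sign))
    return survivors

# Two opposite-sign particles share a cell and cancel; the third survives
parts = [(0.1, 0.1, +1), (0.2, 0.15, -1), (0.9, 0.1, +1)]
print(annihilate(parts, dx=0.5, dk=0.5))
```

Production codes refine this with momentum-resolved grids and re-sampling strategies, but the cancellation principle is the same.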
17.
Carbon nanotube field-effect transistors (CNTFETs) have been studied in recent years as a potential alternative to CMOS devices because of their capability for ballistic transport. The ambipolar behavior of Schottky-barrier CNTFETs limits the performance of these devices, and a double-gate design is proposed to suppress it. In this structure, the first gate, located near the source contact, controls carrier injection, while the second gate, located near the drain contact, suppresses parasitic carrier injection. To avoid the ambipolar behavior, the voltage of the second gate must be higher than or at least equal to the drain voltage. The behavior of these devices has been studied by solving the coupled Schrödinger-Poisson equation system. We investigated the effect of the second gate voltage on the performance of the device, and finally the advantages and disadvantages of this design are discussed.
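A coupled Schrödinger-Poisson system is typically solved by a self-consistent fixed-point loop: the quantum solve yields a carrier density for the current potential, the electrostatic solve yields a new potential for that density, and the update is damped until the potential stops changing. The sketch below uses toy stand-in solvers (not the authors' implementation) to show the loop structure:

```python
import numpy as np

def self_consistent(solve_schroedinger, solve_poisson, v0, mix=0.3,
                    tol=1e-8, max_iter=200):
    """Generic damped fixed-point iteration for a coupled quantum /
    electrostatic problem. Linear mixing (under-relaxation) damps the
    update so the alternation converges instead of oscillating."""
    v = v0
    for _ in range(max_iter):
        n = solve_schroedinger(v)          # density from current potential
        v_new = solve_poisson(n)           # potential from that density
        if np.max(np.abs(v_new - v)) < tol:
            return v_new                   # converged: v reproduces itself
        v = (1 - mix) * v + mix * v_new    # damped update
    return v

# Toy stand-ins: density decays with potential, potential grows with density
rho = lambda v: np.exp(-v)
pot = lambda n: 0.5 * n
v = self_consistent(rho, pot, v0=np.zeros(4))
# At convergence v satisfies v = 0.5 * exp(-v) on every grid point
```

In the real device problem `solve_schroedinger` would return the quantum-mechanical charge density from the subband wavefunctions and `solve_poisson` would solve the electrostatics of the double-gate geometry.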
18.
Investigating the dynamical and physical properties of cosmic dust can reveal a great deal of information about both the dust and its many sources. Over recent years, several spacecraft (e.g., Cassini, Stardust, Galileo, and Ulysses) have successfully characterised interstellar, interplanetary, and circumplanetary dust using a variety of techniques, including in situ analyses and sample return. Charge, mass, and velocity measurements of the dust are performed either directly (induced-charge signals) or indirectly (mass and velocity from impact ionisation signals or crater morphology) and constrain the dynamical parameters of the dust grains. Dust compositional information may be obtained via either time-of-flight mass spectrometry of the impact plasma or direct sample return. The accurate and reliable interpretation of collected spacecraft data requires a comprehensive programme of terrestrial instrument calibration. This process involves accelerating suitable solar system analogue dust particles to hypervelocity speeds in the laboratory, an activity performed at the Max-Planck-Institut für Kernphysik in Heidelberg, Germany. Here, a 2 MV Van de Graaff accelerator electrostatically accelerates charged micron- and submicron-sized dust particles to speeds of up to 80 km s⁻¹. Recent advances in dust production and processing have allowed solar system analogue dust particles (silicates and other minerals) to be coated with a thin conductive shell, enabling them to be charged and accelerated. Refinements and upgrades to the beam-line instrumentation and electronics now allow for the reliable selection of particles at velocities of 1-80 km s⁻¹ and with diameters of between 0.05 μm and 5 μm. This ability to select particles for subsequent impact studies based on their charges, masses, or velocities is provided by a particle selection unit (PSU).

The PSU contains a field-programmable gate array, capable of monitoring the particles' speeds and charges in real time, and is controlled remotely by a custom, platform-independent software package. The new control instrumentation and electronics, together with the wide range of accelerable particle types, allow the controlled investigation of hypervelocity impact phenomena across a hitherto unobtainable range of impact parameters.
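The underlying selection arithmetic is simple: an induced-charge pickup gives the charge via q = C·U, and the transit time through a tube of known length gives the speed via v = L/t; a gate then passes only particles inside the chosen window. The sketch below uses hypothetical names and parameter values, not the PSU's actual interface:

```python
def particle_from_pickup(amplitude_v, transit_time_s, tube_length_m,
                         capacitance_f):
    """Charge and speed from an induced-charge pickup signal: the plateau
    amplitude gives q = C*U, the transit time gives v = L/t."""
    charge = capacitance_f * amplitude_v
    speed = tube_length_m / transit_time_s
    return charge, speed

def in_window(speed, charge, v_min, v_max, q_min):
    """Gate logic of the kind a PSU-style FPGA applies: pass the particle
    on to the target only if it falls inside the selection window."""
    return v_min <= speed <= v_max and charge >= q_min

# 1 mV plateau, 5 us transit through a 10 cm tube, 1 pF pickup capacitance
q, v = particle_from_pickup(1e-3, 5e-6, 0.1, 1e-12)
print(q, v)                                   # charge in C, speed in m/s
print(in_window(v, q, v_min=1e3, v_max=8e4, q_min=1e-16))
```

The real unit evaluates these quantities in the FPGA within the particle's flight time, so the downstream deflection gate can act before the particle reaches the target.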
19.
20.
Chocolate mass of low viscosity is preferred for most applications. Milk powder influences the processing behaviour, flow properties, and taste of milk chocolate. The project aimed to investigate the influence of skim milk powders containing amorphous or crystalline lactose on flow properties, with samples produced either by roller milling and conching or by ball milling. For the first route, it was found that the mass consistency before roller milling is strongly influenced by the lactose type; producers must specify it and adapt the initial fat content of the mass. Little impact on the final products was found after processing milk powders at equilibrium moisture. If pre-dried powders are used to reduce conching time, crystalline lactose leads to chocolate with slightly lower viscosity. In ball-mill processing, crystalline lactose resulted in significantly lower viscosity, for example 15% lower at 40 s⁻¹; thus, for this process, it can be recommended to use special milk powders with a high crystalline lactose content.
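Apparent viscosity at a stated shear rate, as in the 40 s⁻¹ figure above, is commonly derived for chocolate from the Casson model, sqrt(τ) = sqrt(τ₀) + sqrt(η_ca·γ̇). The sketch below uses illustrative parameter values, not the study's measurements:

```python
import math

def casson_apparent_viscosity(shear_rate, yield_stress, casson_viscosity):
    """Apparent viscosity eta = tau / gamma_dot from the Casson model:
    sqrt(tau) = sqrt(tau_0) + sqrt(eta_ca * gamma_dot)."""
    tau = (math.sqrt(yield_stress) +
           math.sqrt(casson_viscosity * shear_rate)) ** 2
    return tau / shear_rate

# Illustrative (not measured) parameters for a milk chocolate mass:
# Casson yield stress 10 Pa, Casson plastic viscosity 2 Pa*s
eta_40 = casson_apparent_viscosity(40.0, yield_stress=10.0,
                                   casson_viscosity=2.0)
# A 15 % reduction, as reported for crystalline-lactose powders in
# ball milling, would correspond to:
eta_crystalline = 0.85 * eta_40
```

At high shear rates the apparent viscosity approaches the Casson plastic viscosity, which is why viscosity differences between powder types are quoted at a fixed shear rate.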