Article Search
By access type (number of articles):
  Paid full text: 4665
  Free: 232
  Free (domestic): 10
By subject (number of articles):
  Electrical engineering: 57
  General: 3
  Chemical industry: 959
  Metalworking: 92
  Machinery and instrumentation: 68
  Building science: 274
  Mining engineering: 4
  Energy and power: 197
  Light industry: 481
  Water conservancy engineering: 30
  Petroleum and natural gas: 9
  Radio electronics: 451
  General industrial technology: 845
  Metallurgical industry: 480
  Nuclear technology: 36
  Automation technology: 921
By publication year (number of articles):
  2023: 46
  2022: 67
  2021: 127
  2020: 83
  2019: 101
  2018: 111
  2017: 103
  2016: 144
  2015: 128
  2014: 162
  2013: 295
  2012: 288
  2011: 345
  2010: 305
  2009: 267
  2008: 284
  2007: 246
  2006: 214
  2005: 184
  2004: 159
  2003: 139
  2002: 129
  2001: 66
  2000: 80
  1999: 63
  1998: 89
  1997: 65
  1996: 64
  1995: 50
  1994: 54
  1993: 36
  1992: 29
  1991: 29
  1990: 24
  1989: 30
  1988: 20
  1987: 29
  1986: 23
  1985: 23
  1984: 25
  1983: 21
  1982: 24
  1981: 16
  1980: 14
  1979: 22
  1978: 9
  1977: 16
  1976: 13
  1974: 8
  1973: 7
A total of 4907 results were found (search time: 15 ms). Results 91-100 are shown below.
91.
Diamond-dispersed copper matrix (Cu/D) composite materials with different interfacial configurations are fabricated through powder metallurgy and their thermal performances are evaluated. An innovative solution for chemically bonding copper (Cu) to diamond (D) is investigated and compared with the traditional Cu/D bonding process involving carbide-forming additives such as boron (B) or chromium (Cr). The proposed solution consists of coating diamond reinforcements with Cu particles through a gas–solid nucleation and growth process. The Cu particle coating acts as a chemical bonding agent at the Cu–D interface during hot pressing, leading to cohesive and thermally conductive Cu/D composites with no carbide-forming additives. Investigation of the microstructure of the Cu/D materials through scanning electron microscopy, transmission electron microscopy, and atomic force microscopy is coupled with thermal performance evaluations through thermal diffusivity measurements, dilatometry, and thermal cycling. Cu/D composites fabricated with 40 vol% of Cu-coated diamonds exhibit a thermal conductivity of 475 W m⁻¹ K⁻¹ and a thermal expansion coefficient of 12 × 10⁻⁶ °C⁻¹. These promising thermal performances are superior to those of the B-carbide-bonded Cu/D composites and similar to those of the Cr-carbide-bonded Cu/D composites fabricated in this study. Moreover, the Cu/D composites fabricated with Cu-coated diamonds exhibit higher thermal cycling resistance than the carbide-bonded materials, which are affected by the brittleness of the carbide interphase upon repeated heating and cooling cycles. The as-developed materials are applicable as heat spreaders for thermal management of power electronic packages. The copper–carbon chemical bonding solution proposed in this article may also be of interest to other areas of electronic packaging, such as brazing solders, direct bonded copper substrates, and polymer coatings.
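As a point of reference for the 475 W m⁻¹ K⁻¹ value reported above, the sketch below evaluates the classical Maxwell–Garnett effective-medium estimate for a particulate composite. The matrix and filler conductivities and the 40 vol% loading are nominal handbook-style assumptions, not values taken from the paper, and the model assumes perfect interfaces, which is exactly the aspect the Cu–D bonding strategy addresses.

```python
# Maxwell-Garnett effective-medium estimate for the thermal conductivity of a
# particulate composite (spherical inclusions, perfect interfaces).
# The numeric inputs below are illustrative assumptions, not values from the paper.

def maxwell_garnett_k(k_matrix: float, k_particle: float, vol_frac: float) -> float:
    """Effective thermal conductivity (W/m/K) of a matrix filled with spherical particles."""
    num = k_particle + 2 * k_matrix + 2 * vol_frac * (k_particle - k_matrix)
    den = k_particle + 2 * k_matrix - vol_frac * (k_particle - k_matrix)
    return k_matrix * num / den

if __name__ == "__main__":
    k_cu = 400.0    # W/m/K, typical value for copper (assumed)
    k_dia = 1800.0  # W/m/K, typical value for synthetic diamond (assumed)
    f_dia = 0.40    # 40 vol% diamond, as in the composites described above
    print(f"Ideal-interface estimate: {maxwell_garnett_k(k_cu, k_dia, f_dia):.0f} W/m/K")
    # The estimate (~730 W/m/K) exceeds the measured 475 W/m/K, which is consistent
    # with a finite interfacial thermal resistance at the Cu-D boundary.
```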
92.
In the information age, the storage and accessibility of data are of vital importance, and there are several ways to fulfill this task. Magnetic data storage is a well-established method, and the range of materials used for it is continuously being extended. In this study, the magnetic remanence of thermally sprayed tungsten carbide–cobalt (WC–Co) coatings is examined as a function of their thickness. Two magnetic fields differing in strength and geometry are imprinted into the coatings and the resulting remanence field is measured. Two effects are identified which, in combination, determine the effective value of the magnetic remanence usable for magnetic data storage.
93.
94.
It is shown that several recursive least squares (RLS) type equalization algorithms, such as decision-directed schemes and orthogonalized constant modulus algorithms, possess a common algorithmic structure and are therefore rather straightforwardly implemented on a triangular array (filter structure) for RLS estimation with inverse updating. While the computational complexity of such algorithms is O(N²), where N is the problem size, the throughput rate of the array implementation is O(1), i.e., independent of the problem size. Such a throughput rate cannot be achieved with standard (Gentleman–Kung-type) RLS/QR-updating arrays because of feedback loops in their computational schemes.
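For context on the O(N²)-per-update cost mentioned above, the following is a minimal sketch of a conventional RLS filter update using the matrix-inversion-lemma recursion. It is not the inverse-updating triangular-array formulation of the paper, just the standard sequential algorithm whose per-sample cost scales with N²; the filter length, forgetting factor, and initialization constant are illustrative choices.

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One recursive-least-squares update (matrix inversion lemma form).

    w   : current weight vector, shape (N,)
    P   : current inverse correlation matrix, shape (N, N)
    x   : new input (regressor) vector, shape (N,)
    d   : desired response (scalar)
    lam : forgetting factor, 0 < lam <= 1
    Cost per call is O(N^2) in both time and memory.
    """
    Px = P @ x
    k = Px / (lam + x @ Px)              # gain vector
    e = d - w @ x                        # a priori error
    w_new = w + k * e                    # weight update
    P_new = (P - np.outer(k, Px)) / lam  # inverse-correlation update
    return w_new, P_new, e

# Toy usage: identify a 4-tap FIR channel from noisy observations (assumed setup).
rng = np.random.default_rng(0)
h_true = np.array([0.8, -0.4, 0.2, 0.1])
N = len(h_true)
w = np.zeros(N)
P = 1e3 * np.eye(N)   # large initial P, i.e. a weak prior (assumed value)
x_buf = np.zeros(N)
for n in range(500):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = h_true @ x_buf + 0.01 * rng.standard_normal()
    w, P, _ = rls_update(w, P, x_buf, d)
print(np.round(w, 3))  # should approach h_true
```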
95.
We demonstrate controlled transport of superparamagnetic beads against the direction of a laminar flow. A permanent magnet assembles 200 nm magnetic particles into bead chains about 200 μm long that are aligned parallel to the magnetic field lines. Due to a magnetic field gradient, the bead chains are attracted towards the wall of a microfluidic channel. A rotation of the permanent magnet results in a rotation of the bead chains in the opposite direction to the magnet. Due to friction at the surface, the bead chains roll along the channel wall, even in the counter-flow direction, up to a maximum counter-flow velocity of 8 mm s⁻¹. Based on this approach, magnetic beads can be accurately manoeuvred within microfluidic channels. This counter-flow motion can be used efficiently in Lab-on-a-Chip systems, e.g. for implementing washing steps in DNA purification.
96.
A procedure to find the optimal design of a flywheel with a split-type hub is presented. Since cost plays a decisive role in stationary flywheel energy storage applications, a trade-off between energy and cost is required. Applying a scaling technique, the multi-objective design problem is reduced to the maximization of the energy-per-cost ratio as a single objective. Both an analytical model and a finite element model were studied. The latter was found to be more than three orders of magnitude more computationally expensive than the analytical model, while the analytical model can only be regarded as a coarse approximation. Multifidelity approaches were therefore examined to reduce the computational expense while retaining the high accuracy and large modeling depth of the finite element model. Using a surrogate-based optimization strategy, the computational cost was reduced to one third of that incurred when using the finite element model alone. A nonlinear interior-point method was employed to find the optimal rim thicknesses and rotational speed. The benefits of the split-type hub architecture were demonstrated.
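The surrogate-based strategy mentioned above can be illustrated with a minimal sketch: sample an "expensive" model at a few design points, fit a cheap radial-basis-function surrogate, and optimize the surrogate instead. The placeholder objective, sample count, and variable bounds are invented for illustration; the paper's actual analytical and finite element flywheel models and its interior-point solver are not reproduced here (a bounded quasi-Newton method is used instead).

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

# "Expensive" model stand-in: in the study this would be the finite element model
# of the flywheel; here it is just an analytic placeholder (assumption) returning
# the negative energy-per-cost ratio for a rim thickness t and speed omega.
def expensive_model(x):
    t, omega = x
    return -(t * omega**2) / (1.0 + 200.0 * t**2)

# 1) Sample the expensive model on a small design-of-experiments set.
rng = np.random.default_rng(1)
samples = rng.uniform([0.005, 100.0], [0.10, 3000.0], size=(20, 2))
values = np.array([expensive_model(s) for s in samples])

# 2) Fit a cheap radial-basis-function surrogate to those samples
#    (in practice the inputs would be normalized first).
surrogate = RBFInterpolator(samples, values)

# 3) Optimize the surrogate instead of the expensive model, then verify.
res = minimize(lambda x: surrogate(x[None, :])[0], x0=[0.05, 1500.0],
               bounds=[(0.005, 0.10), (100.0, 3000.0)])
print("surrogate optimum:", res.x)
print("check against expensive model:", expensive_model(res.x))
```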
97.
During the past few years, several works have derived string kernels from probability distributions. For instance, the Fisher kernel uses a generative model M (e.g. a hidden Markov model) and compares two strings according to how they are generated by M. On the other hand, marginalized kernels allow the computation of the joint similarity between two instances by summing conditional probabilities. In this paper, we adapt this approach to edit distance-based conditional distributions and present a way to learn a new string edit kernel. We show that the practical computation of such a kernel between two strings x and x′ built from an alphabet Σ requires (i) learning edit probabilities in the form of the parameters of a stochastic state machine and (ii) calculating an infinite sum over Σ* by resorting to the intersection of probabilistic automata, as done for rational kernels. We show on a handwritten character recognition task that our new kernel outperforms not only state-of-the-art string kernels and string edit kernels but also the standard edit distance used by a neighborhood-based classifier.
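As a baseline illustration of the edit-distance kernels discussed above, the sketch below computes a simple similarity of the form k(x, x′) = exp(−γ · d(x, x′)), where d is the standard Levenshtein distance with unit costs. This is the conventional, non-learned variant; the paper's kernel instead learns stochastic edit parameters and sums over Σ* via probabilistic automata, which is not reproduced here. The value of γ is an arbitrary choice.

```python
import math

def levenshtein(x: str, y: str) -> int:
    """Standard edit distance with unit insertion/deletion/substitution costs."""
    prev = list(range(len(y) + 1))
    for i, cx in enumerate(x, start=1):
        curr = [i]
        for j, cy in enumerate(y, start=1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (cx != cy)))  # substitution / match
        prev = curr
    return prev[-1]

def edit_similarity(x: str, y: str, gamma: float = 0.5) -> float:
    """exp(-gamma * d_edit); note this is not guaranteed to be a positive semidefinite kernel."""
    return math.exp(-gamma * levenshtein(x, y))

print(edit_similarity("kernel", "kernels"))   # distance 1, high similarity
print(edit_similarity("kernel", "marginal"))  # larger distance, lower similarity
```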
98.
The SHARC framework for data quality in Web archiving  (Cited: 1; self-citations: 0; citations by others: 1)
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality of Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit–revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit–revisit strategies revisit pages after their initial downloads to check for intermediate changes; the revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change prediction. All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
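The "coherence" measure described above — the number of pages that did not change while the site was being captured — can be sketched directly from page metadata. The data layout below (per-page download times plus known change timestamps) is an assumption for illustration; SHARC's blur estimation and its scheduling strategies are not reproduced here.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PageCapture:
    url: str
    downloaded_at: float       # time (e.g. seconds since epoch) the crawler fetched the page
    change_times: List[float]  # known (or predicted) change timestamps of the page

def coherence(captures: List[PageCapture]) -> int:
    """Number of pages with no change during the capture interval of the snapshot.

    The capture interval runs from the first to the last download of the snapshot;
    a page is coherently captured if none of its changes fall inside that window.
    """
    start = min(c.downloaded_at for c in captures)
    end = max(c.downloaded_at for c in captures)
    return sum(1 for c in captures
               if not any(start <= t <= end for t in c.change_times))

snapshot = [
    PageCapture("/index.html", 100.0, change_times=[40.0]),         # changed before the crawl
    PageCapture("/news.html",  160.0, change_times=[150.0, 400.0]), # changed mid-crawl
    PageCapture("/about.html", 220.0, change_times=[]),             # never changed
]
print(coherence(snapshot))  # 2 of the 3 pages are coherently captured
```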
99.
Enterprise Architecture (EA) is increasingly being used by large organizations to get a grip on the complexity of their business processes, information systems, and technical infrastructure. Although seen as an important instrument to help solve major organizational problems, effectively applying EA appears to be no easy task. Active participation of EA stakeholders is one of the main critical success factors for EA. This participation depends on the degree to which EA helps stakeholders achieve their individual goals. A closely related topic is the effectiveness of EA, the degree to which EA helps to achieve the collective goals of the organization. In this article we present our work on EA stakeholder satisfaction and EA effectiveness, and compare these two topics. We found that, regarding EA, the individual goals of stakeholders map quite well onto the collective goals of the organization. In a case study we conducted, we found that the organization is primarily concerned with the final results of EA, while individual stakeholders are also concerned with the way the architects operate.
100.