A total of 527 query results were found.
1.
The cure kinetics of an epoxy novolac/anhydride system were studied by differential scanning calorimetry under isothermal conditions. The system consisted of an epoxy novolac resin (DEN431) with nadic methyl anhydride as hardener and benzyldimethylamine as accelerator. Kinetic parameters, including the reaction order, activation energy and kinetic rate constants, were determined. The cure reaction was described as a function of catalyst concentration, and a normalized kinetic model was developed for it. The results show that the cure reaction depends on both cure temperature and catalyst concentration and proceeds through an autocatalytic mechanism. The kinetic rate constants and cure activation energies were obtained from the Arrhenius model. The proposed kinetic model, extended with a diffusion term, successfully described and predicted the cure kinetics of the epoxy novolac compositions as a function of catalyst content and temperature. Copyright © 2003 Society of Chemical Industry
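The abstract does not give the model's functional form; as a hedged sketch, an autocatalytic (Kamal-type) rate law with Arrhenius rate constants and a diffusion correction is consistent with the behavior described above. The specific form and symbols below are assumptions, not taken from the paper:

$$\frac{d\alpha}{dt} = \bigl(k_1 + k_2\,\alpha^{m}\bigr)\,(1-\alpha)^{n}\, f_d(\alpha), \qquad k_i = A_i \exp\!\Bigl(-\frac{E_{a,i}}{RT}\Bigr), \qquad f_d(\alpha) = \frac{1}{1 + \exp\bigl[C\,(\alpha - \alpha_c)\bigr]}$$

where $\alpha$ is the degree of cure, $m$ and $n$ are reaction orders, and $f_d(\alpha)$ is a diffusion factor that suppresses the rate as the conversion approaches the diffusion-controlled limit $\alpha_c$.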
2.
We consider the problem of scheduling jobs on related machines owned by selfish agents. We provide a 5-approximation deterministic truthful mechanism, the first deterministic truthful result for the problem. Previously, Archer and Tardos gave a 2-approximation randomized mechanism that is truthful only in expectation (a weaker notion of truthfulness). When the number of machines is constant, we provide a deterministic Fully Polynomial-Time Approximation Scheme (FPTAS) and a suitable payment scheme that together yield a truthful mechanism for the problem. This result, which is based on converting the FPTAS into a monotone FPTAS, improves a previous result of Auletta et al., who gave a (4 + ε)-approximation truthful mechanism.
3.
Christian Azar, Energy, 1994, 19(12): 1255–1261
Haraden's model for estimating the economic cost of global warming is analysed. We change his method of discounting and some of his input parameters in a manner consistent with physical and economic theory as well as with empirical data, and we then find much higher costs than Haraden did. These costs are compared with the cost of reducing CO2 emissions, and we find that deep cuts in CO2 emissions are preferable. A sensitivity check of our results with respect to some crucial parameter values does not alter that conclusion.
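The abstract does not state the discounting scheme used; as an illustrative aside on why the choice matters, the present value of a stream of annual warming damages $D_t$ discounted at a constant rate $r$ is

$$PV = \sum_{t=0}^{T} \frac{D_t}{(1+r)^{t}}$$

so damages occurring far in the future are weighted by $(1+r)^{-t}$, and lowering $r$ (or letting it decline over time) can raise the estimated cost of warming substantially. The symbols here are illustrative assumptions, not taken from the paper.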
4.
Type 2 diabetes (T2D) typically occurs in the setting of obesity and insulin resistance, where hyperglycemia is associated with decreased pancreatic β-cell mass and function. Loss of β-cell mass has variably been attributed to β-cell dedifferentiation and/or death. In recent years, it has been proposed that circulating epigenetically modified DNA fragments arising from β cells might be able to report on the potential occurrence of β-cell death in diabetes. Here, we review the published literature on DNA-based β-cell death biomarkers that have been evaluated in human cohorts of islet transplantation, type 1 diabetes, and obesity and type 2 diabetes. In addition, we provide new data on the applicability of one of these biomarkers (cell-free unmethylated INS DNA) in adult cohorts across a spectrum from obesity to T2D, in which no significant differences were observed, and compare these findings with those previously published in youth cohorts, where differences were observed. Our analysis of the literature and our own data suggest that β-cell death may occur in subsets of individuals with obesity and T2D; however, a more sensitive method or refined study designs are needed to better align sampling with disease-progression events.
5.
The rheological and morphological properties of blends based on high‐density polyethylene (HDPE) and a commercial ethylene–octene copolymer (EOC) produced by metallocene technology were investigated. The rheological properties were evaluated in steady and dynamic shear experiments at 190°C, at shear rates ranging from 90 s⁻¹ to 1500 s⁻¹ and frequencies between 10⁻¹ rad/s and 10² rad/s, respectively. The blends showed a high level of homogeneity in the molten state, and their rheological behavior was generally intermediate between those of the pure components. Scanning electron microscopy (SEM) showed that the blends exhibit dispersed morphologies, with EOC domains distributed homogeneously and with particle sizes smaller than 2 μm. © 2002 Wiley Periodicals, Inc. J Appl Polym Sci 86: 2240–2246, 2002
6.
This paper proposes a color image encryption scheme using one-time keys based on a crossover operator, chaos, and the Secure Hash Algorithm (SHA-2). SHA-2 is employed to generate a 256-bit hash value from both the plain image and the secret hash keys, so that the key stream changes in each encryption process. The hash value is then used to generate three initial values for the chaotic system. The permutation and diffusion stages are based on the crossover operator and the XOR operator, respectively. Experimental results and security analysis show that the scheme achieves a good encryption result with only one round of encryption, and that the key space is large enough to resist common attacks, so the scheme is suitable for image encryption and secure communication.
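The abstract does not specify the chaotic map, the digest-splitting rule, or the crossover operator; the sketch below only illustrates the general hash-seeded permutation-diffusion idea, with a chaotic sort standing in for the paper's crossover-based permutation. All function names, parameters, and map choices are assumptions, not the authors' scheme.

```python
import hashlib
import numpy as np

def derive_seeds(image_bytes: bytes, secret_key: bytes):
    """SHA-256 over plain image + secret key; split the 32-byte digest into
    three 8-byte chunks and map each to a seed in (0, 1).  The splitting
    rule and the mapping are illustrative assumptions."""
    digest = hashlib.sha256(image_bytes + secret_key).digest()
    return [(int.from_bytes(digest[i:i + 8], "big") + 1) / (2**64 + 2)
            for i in (0, 8, 16)]

def logistic_keystream(x0: float, n: int, r: float = 3.99) -> np.ndarray:
    """Generate n pseudo-random bytes from a logistic map (illustrative chaotic map)."""
    x = x0
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def encrypt(image: np.ndarray, secret_key: bytes) -> np.ndarray:
    """Toy permutation-diffusion pass over a uint8 image: a chaotic sort
    stands in for the crossover-based permutation, and the diffusion stage
    XORs the permuted pixels with a chaotic keystream."""
    flat = image.reshape(-1)
    x1, x2, x3 = derive_seeds(flat.tobytes(), secret_key)
    perm = np.argsort(logistic_keystream(x1, flat.size), kind="stable")
    keystream = logistic_keystream(x2, flat.size) ^ logistic_keystream(x3, flat.size)
    return (flat[perm] ^ keystream).reshape(image.shape)
```

Because the seeds depend on the plain image, the digest (or the derived seeds) would presumably have to accompany the ciphertext for decryption, which is consistent with the abstract's framing of the keys as one-time keys.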
7.
There is growing evidence that face recognition is "special" but less certainty concerning the way in which it is special. The authors review and compare previous proposals and their own more recent hypothesis, that faces are recognized "holistically" (i.e., using relatively less part decomposition than other types of objects). This hypothesis, which can account for a variety of data from experiments on face memory, was tested with 4 new experiments on face perception. A selective attention paradigm and a masking paradigm were used to compare the perception of faces with the perception of inverted faces, words, and houses. Evidence was found of relatively less part-based shape representation for faces. The literatures on machine vision and single-unit recording in monkey temporal cortex are also reviewed for converging evidence on face representation. The neuropsychological literature is reviewed for evidence on the question of whether face representation differs in degree or kind from the representation of other types of objects.
8.
We consider buffer management of unit packets with deadlines in a multi-port device with reconfiguration overhead. The goal is to maximize the throughput of the device, i.e., the number of packets delivered by their deadlines. For a single port, or when reconfiguration is free, the problem reduces to the well-known packet scheduling problem, for which the celebrated earliest-deadline-first (EDF) strategy is 1-competitive and hence optimal. However, EDF is not 1-competitive when there is a reconfiguration overhead. We design an online algorithm that achieves a competitive ratio of 1−o(1) when the ratio between the minimum laxity of the packets and the number of ports tends to infinity. This is one of the rare cases where one can design an almost 1-competitive algorithm. One ingredient of our analysis, which may be of interest in its own right, is a perturbation theorem on EDF for the classical packet scheduling problem. Specifically, we show that a small perturbation in the release and deadline times cannot significantly degrade the optimal throughput. This implies that EDF is robust, in the sense that its throughput is close to the optimum even when the deadlines are not precisely known.
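For reference, below is a minimal sketch of the classical single-port EDF baseline that the paper perturbs (no multiple ports, no reconfiguration overhead). The deadline convention and data layout are illustrative assumptions.

```python
import heapq

def edf_throughput(packets):
    """packets: list of (release_time, deadline) pairs for unit-length packets,
    where 'deadline' is assumed to be the last time slot in which the packet
    may be sent.  In each slot, transmit the pending packet with the earliest
    deadline and count how many packets are delivered on time."""
    packets = sorted(packets)                       # order by release time
    pending, delivered, i, t = [], 0, 0, 0
    while i < len(packets) or pending:
        if not pending and i < len(packets):
            t = max(t, packets[i][0])               # jump to the next release
        while i < len(packets) and packets[i][0] <= t:
            heapq.heappush(pending, packets[i][1])  # store deadlines
            i += 1
        while pending and pending[0] < t:           # drop expired packets
            heapq.heappop(pending)
        if pending:
            heapq.heappop(pending)                  # send earliest deadline
            delivered += 1
        t += 1
    return delivered
```

In the multi-port setting with reconfiguration overhead described above, this greedy rule is no longer 1-competitive, which is what motivates the paper's perturbation analysis.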
9.
We consider the on-line version of the maximum vertex disjoint path problem when the underlying network is a tree. In this problem, a sequence of requests arrives in an on-line fashion, where every request is a path in the tree. The on-line algorithm may accept a request only if it does not share a vertex with a previously accepted request. The goal is to maximize the number of accepted requests. It is known that no on-line algorithm can have a competitive ratio better than Ω(log n) for this problem, even if the algorithm is randomized and the tree is simply a line. Obviously, it is desirable to beat the logarithmic lower bound. Adler and Azar (Proc. of the 10th ACM-SIAM Symposium on Discrete Algorithms, pp. 1–10, 1999) showed that if preemption is allowed (namely, previously accepted requests may be discarded, but once a request is discarded it can no longer be accepted), then there is a randomized on-line algorithm that achieves constant competitive ratio on the line. In the current work we present a randomized on-line algorithm with preemption that has constant competitive ratio on any tree. Our results carry over to the related problem of maximizing the number of accepted paths subject to a capacity constraint on vertices (in the disjoint path problem this capacity is 1). Moreover, if the available capacity is at least 4, randomization is not needed and our on-line algorithm becomes deterministic.
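For concreteness, here is a minimal sketch of the non-preemptive greedy accept/reject rule that defines the online model above; it is the baseline that the Ω(log n) lower bound applies to, not the paper's randomized preemptive algorithm, and the representation of requests as iterables of vertex ids is an assumption.

```python
def greedy_disjoint_paths(requests):
    """Non-preemptive greedy baseline for online vertex-disjoint paths:
    each request is an iterable of vertex ids along a path in the tree,
    and it is accepted only if it shares no vertex with any previously
    accepted request.  Returns the number of accepted requests."""
    used = set()
    accepted = 0
    for path in requests:
        vertices = set(path)
        if used.isdisjoint(vertices):
            used |= vertices
            accepted += 1
    return accepted
```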
10.
Summary: Non-casein protein fractions of raw skimmed milk, obtained according to the Aschaffenburg and Drewry procedure, were studied by discontinuous polyacrylamide electrophoresis. Differences between the electropherograms obtained and the data of the above authors were observed in the fractions "non-casein nitrogen minus proteose-peptone nitrogen" and "total albumin nitrogen plus non-protein nitrogen". In the first fraction, proteose-peptone was present instead of immunoglobulin, and in the second fraction, immunoglobulin and proteose-peptone were present in addition to the total albumin. In our opinion, the differences observed in the two fractions are due to incomplete salting out.
Verification of the Aschaffenburg and Drewry method for the determination of non-casein proteins by polyacrylamide electrophoresis