Paid full text: 511 articles
Free full text: 21 articles
By subject area (number of articles):
  Electrical engineering: 6
  General: 1
  Chemical industry: 148
  Metalworking: 5
  Machinery and instruments: 15
  Building science: 16
  Energy and power: 38
  Light industry: 62
  Hydraulic engineering: 3
  Petroleum and natural gas: 1
  Radio and electronics: 17
  General industrial technology: 77
  Metallurgical industry: 43
  Atomic energy technology: 7
  Automation technology: 93
By publication year (number of articles):
  2024: 2
  2023: 8
  2022: 10
  2021: 31
  2020: 13
  2019: 23
  2018: 17
  2017: 17
  2016: 21
  2015: 13
  2014: 22
  2013: 28
  2012: 29
  2011: 36
  2010: 26
  2009: 29
  2008: 28
  2007: 14
  2006: 15
  2005: 15
  2004: 12
  2003: 10
  2002: 17
  2001: 11
  2000: 3
  1999: 3
  1998: 5
  1997: 6
  1996: 2
  1995: 3
  1994: 5
  1993: 2
  1992: 3
  1991: 3
  1990: 3
  1989: 5
  1988: 2
  1986: 5
  1985: 2
  1984: 3
  1983: 3
  1980: 5
  1979: 4
  1977: 3
  1976: 3
  1975: 3
  1973: 2
  1971: 2
  1962: 1
  1949: 1
532 results in total (search time: 15 ms).
11.
The biosynthesis of fatty acids in the diatom Phaeodactylum tricornutum was studied. The diatom was incubated with sodium [1-14C] acetate and with [1-14C] palmitic, [1-14C] stearic, [1-14C] linoleic and [1-14C] α-linolenic acids. The distribution of radioactivity in the products was determined by gas liquid radiochromatography. The diatom synthesized "de novo" not only saturated and monounsaturated fatty acids, but also linoleic, α-linolenic and other fatty acids, including the highly polyunsaturated 20∶5ω3 and 22∶6ω3. When labeled acetate, stearic, α-linolenic or even linoleic acid was incubated with the diatom, the polyunsaturated C20 fatty acids synthesized belonged predominantly to the ω3 family. The existence of Δ9, Δ6, Δ5, Δ4, ω6 and possibly ω3 desaturases in P. tricornutum is suggested. Member of the Carrera del Investigador Científico of the Comisión de Investigaciones Científicas de la Provincia de Buenos Aires. Member of the Carrera del Investigador Científico of the Consejo Nacional de Investigaciones Científicas y Técnicas.
12.
Two important issues in computational modelling in cognitive neuroscience are: first, how to formally describe neuronal networks (i.e. biologically plausible models of the central nervous system), and second, how to analyse complex models, in particular their dynamics and capacity to learn. We make progress towards these goals by presenting a communicating automata perspective on neuronal networks. Specifically, we describe neuronal networks and their biological mechanisms using Data-rich Communicating Automata, which extend classic automata theory with rich data types and communication. We use two case studies to illustrate our approach. In the first case study, we model a number of learning frameworks that vary in their degree of biological detail, for instance the Backpropagation (BP) and the Generalized Recirculation (GeneRec) learning algorithms. We then use the SPIN model checker to investigate a number of behavioral properties of the neural learning algorithms. SPIN is a well-known model checker for reactive distributed systems that has been successfully applied to many non-trivial problems. The verification results show that the biologically plausible GeneRec learning is less stable than BP learning. In the second case study, we present a large-scale (cognitive-level) neuronal network that models an attentional spotlight mechanism in the visual system. A set of properties of this model was verified using Uppaal, a popular real-time model checker. The results show that the asynchronous processing supported by concurrency theory is not only a more biologically plausible way to model neural systems, but also provides better performance in cognitive modelling of the brain than conventional artificial neural networks that use synchronous updates. Finally, we compare our approach with several other related theories that apply formal methods to cognitive modelling. The practical implications of the approach are also discussed in the context of neuronal-network-based controllers.
13.
The effect of ethanol on fatty acid desaturation by rat liver has been studied using liquid diets of different composition. Acute ethanol administration increased the triacylglycerols of total liver lipids but did not significantly modify the lipid composition of microsomes. The Δ6 and Δ5 desaturases were inhibited by ethanol, whereas the Δ9 desaturase and fatty acid synthetase were apparently modified only by diet composition. NADH-cytochrome (cyt.) c reductase was partially inhibited, whereas NADH-cyt. b5 reductase remained practically unaltered and NADPH-cyt. c reductase activity was enhanced. A decreased supply of electrons from the microsomal cyt. b5 electron transport chain would therefore not be the reason for the inhibition of the Δ6 and Δ5 desaturases by ethanol.
14.
The effect of oral administration, for 24 or 48 hr, of different octadeca (C18) fatty acids containing a 9,12-dienoic structure on the fatty acid composition and Δ9 desaturation activity of liver microsomes of rats fed a fat-free diet was studied. The ethyl esters of linoelaidic and γ-linolenic acids, the methyl ester of linoleic acid and free columbinic acid were administered to rats maintained on a fat-free diet. Supplementation of the fat-free diet with linoelaidate produced no relevant changes in the fatty acid composition pattern of liver microsomes and did not modify the percentage conversion of palmitic to palmitoleic acid. The addition of linoleate or γ-linolenate to the fat-free diet returned liver microsomal Δ9 desaturation activity toward the control value and partially restored the liver microsomal fatty acid spectrum found in the fat-free diet. Columbinic acid (5-trans,9-cis,12-cis-18∶3), which cannot be transformed into arachidonic acid, also decreased the Δ9 desaturation activity enhanced by the fat-free diet and evoked changes in the microsomal fatty acid composition similar to those produced by the ω6 fatty acids. These results suggest that the modulation of Δ9 desaturase activity evoked by dietary administration of unsaturated acids of the ω6 series would depend on the cis double bond configuration of these acids.
15.
Ves-Losada A, Maté SM, Brenner RR. Lipids, 2001, 36(3): 273-282
Liver nuclear incorporation of stearic (18∶0), linoleic (18∶2n−6), and arachidonic (20∶4n−6) acids was studied by in vitro incubation of the [1-14C] fatty acids with nuclei, with or without the cytosol fraction, for different times. The [1-14C] fatty acids were incorporated into the nuclei as free fatty acids in the following order: 18∶0>20∶4n−6≫18∶2n−6, and were esterified into nuclear lipids by an acyl-CoA pathway. All [1-14C] fatty acids were esterified mainly to phospholipids and triacylglycerols and, to a lesser extent, to diacylglycerols. Only [1-14C] 18∶2n−6-CoA was incorporated into cholesterol esters. The incorporation was not modified by cytosol addition. The incorporation of 20∶4n−6 into nuclear phosphatidylcholine (PC) pools was also studied by in vitro incubation of liver nuclei with [1-14C]20∶4n−6-CoA, and the labeled nuclear PC molecular species were determined. Of the 15 nuclear PC molecular species determined, five were labeled with [1-14C]20∶4n−6-CoA: 18∶0–20∶4, 16∶0–20∶4, 18∶1–20∶4, 18∶2–20∶4, and 20∶4–20∶4. The highest specific radioactivity was found in 20∶4–20∶4 PC, which is a minor species. In conclusion, liver cell nuclei possess the necessary enzymes to incorporate exogenous saturated and unsaturated fatty acids into lipids by an acyl-CoA pathway, showing specificity for each fatty acid. Liver cell nuclei also utilize exogenous 20∶4n−6-CoA to synthesize the major molecular species of PC with 20∶4n−6 at the sn-2 position. However, the most actively synthesized is 20∶4–20∶4 PC, which is a quantitatively minor component. The labeling pattern of 20∶4–20∶4 PC would indicate that this molecular species is synthesized mainly by the de novo pathway.
16.
The potential benefits of using human resources efficiently in the service sector give decision makers in this industry an incentive to manage their employees' work shifts intelligently, especially for staff dealing directly with customers. In the long term, they should attempt to find the right balance between employing as few labor resources as possible and keeping a high level of service. In the short run (e.g., one week), however, contracted staff levels cannot be adjusted, and management efforts thus focus on the efficient assignment of shifts and activities to each employee. This article proposes a mixed integer programming (MIP) model that solves the short-term multi-skilled workforce tour scheduling problem, enabling decision makers to simultaneously design workers' shifts and days off, assign activities to shifts and assign those shifts to employees so as to maximize and balance coverage of a firm's demand for on-duty staff across multiple activities. Our model is simple enough to be solved with a commercial MIP solver at default settings, without resorting to complex methodologies such as extended reformulations or exact and/or heuristic column generation subroutines. Extensive computational testing on 1000 randomly generated instances suggests that the model's solution times are compatible with daily use and that multi-skilling is a significant source of labor flexibility for improving coverage of labor requirements, in particular when such requirements are negatively correlated and part-time workers are a scarce resource.
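To make the structure of such a model concrete, here is a deliberately tiny sketch of a coverage-maximizing assignment MIP in Python with PuLP. It is not the paper's formulation: the sets, the demand data and all names (employees, shifts, activities, skills, demand) are hypothetical, and a real tour scheduling model adds shift design, days-off rules and coverage-balancing terms omitted here.

```python
# Minimal sketch (not the paper's model): maximize covered demand when
# assigning multi-skilled employees to (shift, activity) pairs.
from pulp import (LpProblem, LpMaximize, LpVariable, LpBinary, LpInteger,
                  lpSum, PULP_CBC_CMD)

employees = ["e1", "e2", "e3"]                      # hypothetical data
shifts = ["mon_am", "mon_pm"]
activities = ["checkout", "helpdesk"]
skills = {"e1": {"checkout"}, "e2": {"checkout", "helpdesk"}, "e3": {"helpdesk"}}
demand = {("mon_am", "checkout"): 2, ("mon_am", "helpdesk"): 1,
          ("mon_pm", "checkout"): 1, ("mon_pm", "helpdesk"): 2}

model = LpProblem("toy_tour_scheduling", LpMaximize)

# x[e, s, a] = 1 if employee e performs activity a during shift s.
x = {(e, s, a): LpVariable(f"x_{e}_{s}_{a}", cat=LpBinary)
     for e in employees for s in shifts for a in activities}
# covered[s, a] counts assigned staff, capped at the demand for (s, a).
covered = {(s, a): LpVariable(f"cov_{s}_{a}", lowBound=0,
                              upBound=demand[s, a], cat=LpInteger)
           for s in shifts for a in activities}

# Objective: maximize total covered demand over all shift/activity pairs.
model += lpSum(covered.values())

for s in shifts:
    for a in activities:
        # Coverage cannot exceed the number of assigned, qualified employees.
        model += covered[s, a] <= lpSum(x[e, s, a] for e in employees
                                        if a in skills[e])
    for e in employees:
        # Each employee performs at most one activity per shift.
        model += lpSum(x[e, s, a] for a in activities) <= 1
for e in employees:
    for s in shifts:
        for a in activities:
            if a not in skills[e]:
                model += x[e, s, a] == 0   # skill feasibility

model.solve(PULP_CBC_CMD(msg=False))
for (e, s, a), var in x.items():
    if var.value() and var.value() > 0.5:
        print(e, "->", s, a)
```

The skill-feasibility constraints are where multi-skilling enters: widening an employee's skill set enlarges the feasible region and can only improve the covered demand, which mirrors the flexibility effect reported in the computational study.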
17.
This paper describes an inverse procedure to determine the constitutive constants and the friction conditions in machining processes using finite element (FE) simulations. In general, FE modeling of machining processes is an effective tool for analyzing the machinability of materials under different cutting conditions; however, the use of reliable rheological and friction models is the basis of a correct numerical investigation. The presented inverse procedure is based on the numerical results obtained with a commercial FE code and is formulated as an optimization problem in which the objective function to be minimized is the experimental/numerical error. The problem was solved by a routine developed in commercial optimization software. To verify the quality and robustness of the methodology, it was applied to the orthogonal machining of a super duplex stainless steel (SDSS) and of an austenitic stainless steel (AUSS). The work thus focused on the identification of the Johnson-Cook (JC) coefficients (A, B, C, n and m) and on the calibration of a Coulomb friction model for the specific cases of the SAF2507 SDSS and of an AISI 316-based AUSS alloy (AISI 316 ASBA). The identification phases used experimental force and temperature data collected in two experimental campaigns in which orthogonal cutting tests were carried out under different cutting parameter conditions.
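For orientation, the flow stress law conventionally associated with the coefficients A, B, C, n and m named above is the standard Johnson-Cook form (the abstract does not spell out the exact variant calibrated in the paper, so take this as the usual textbook expression):

$$\sigma = \left(A + B\,\varepsilon_{p}^{\,n}\right)\left(1 + C\,\ln\frac{\dot{\varepsilon}}{\dot{\varepsilon}_{0}}\right)\left[1 - \left(\frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}\right)^{m}\right],$$

where $\varepsilon_{p}$ is the equivalent plastic strain, $\dot{\varepsilon}/\dot{\varepsilon}_{0}$ the strain rate normalized by a reference rate, and the last bracket the thermal softening term. The inverse procedure tunes A, B, C, n, m (together with the Coulomb friction coefficient) until the simulated cutting forces and temperatures match the measured ones.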
18.
We present in this paper an analysis of a semi-Lagrangian second-order Backward Difference Formula combined with the hp-finite element method to calculate the numerical solution of convection-diffusion equations in ℝ². Using mesh-dependent norms, we prove that the a priori error estimate has two components: one corresponds to the approximation of the exact solution along the characteristic curves, which is $O(\Delta t^{2}+h^{m+1}(1+\frac{|\log h|}{\Delta t}))$; the second, which is $O(\Delta t^{p}+\|\vec{u}-\vec{u}_{h}\|_{L^{\infty}})$, represents the error committed in the calculation of the characteristic curves. Here, m is the degree of the polynomials in the finite element space, $\vec{u}$ is the velocity vector, $\vec{u}_{h}$ is the finite element approximation of $\vec{u}$, and p denotes the order of the method employed to calculate the characteristic curves. Numerical examples support the validity of our estimates.
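As a quick reminder of the scheme's general shape (this is the standard semi-Lagrangian BDF2 form and an assumption on our part, not a formula quoted from the paper; ν and f stand for a hypothetical diffusion coefficient and source term):

$$\frac{3u^{n+1}(x) - 4u^{n}\!\left(X^{n}(x)\right) + u^{n-1}\!\left(X^{n-1}(x)\right)}{2\Delta t} = \nu\,\Delta u^{n+1}(x) + f^{n+1}(x),$$

where $X^{n}(x)$ and $X^{n-1}(x)$ are the feet, at times $t_{n}$ and $t_{n-1}$, of the characteristic curve driven by $\vec{u}$ that reaches $x$ at $t_{n+1}$; the $O(\Delta t^{p})$ term in the estimate above accounts for the accuracy of the method used to approximate these feet.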
19.
In this paper we will focus on the notion of "implicit", or lexically unexpressed, linguistic elements that are nonetheless necessary for a complete semantic interpretation of a text. We refer to "entities" and "events" because the recovery of the implicit material may affect all the modules of a system for semantic processing, from the grammatically guided components to the inferential and reasoning ones. Reference to the system GETARUNS offers one possible implementation of the algorithms and procedures needed to cope with the problem and enables us to deal with the full spectrum of phenomena. The paper will first address the following three types of "implicit" entities and events:
  • the grammatical ones, as suggested by linguistic theories like LFG or similar generative theories;
  • the semantic ones suggested in the FrameNet project, i.e. CNI, DNI, INI;
  • the pragmatic ones: here we will present a theory and an implementation for the recovery of implicit entities and events of (non-)standard implicatures.
In particular, we will show how the use of commonsense knowledge may fruitfully contribute to finding relevant implied meanings. The last implicit entity, only touched on for lack of space, is the Subject of Point of View, which is computed by Semantic Informational Structure and contributes the intended entity from whose point of view a given subjective statement is expressed.
20.
This paper analyzes the application of Moran's index and Geary's coefficient to the characterization of lung nodules as malignant or benign in computerized tomography images. The characterization method is based on verifying, using stepwise discriminant analysis, which combination of the proposed measures best discriminates between benign and malignant nodules. A linear discriminant analysis procedure was then performed using the selected features to evaluate their ability to predict the classification of each nodule. To verify this application, we also describe tests carried out on a sample of 36 nodules: 29 benign and 7 malignant. A leave-one-out procedure was used to provide a less biased estimate of the linear discriminator's performance. The two analyzed functions and their combinations provided accuracy above 90% and an area under the receiver operating characteristic (ROC) curve above 0.85, which indicates promising potential for use as nodule signature measures. The preliminary results of this approach are very encouraging for characterizing nodules using the two functions presented.
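For reference, the two spatial autocorrelation statistics named above have standard definitions (the paper's adaptation to nodule texture may differ). For values $x_{i}$ at the $n$ analyzed pixels, spatial weights $w_{ij}$ and $W=\sum_{i\neq j} w_{ij}$:

$$I = \frac{n}{W}\,\frac{\sum_{i}\sum_{j} w_{ij}\,(x_{i}-\bar{x})(x_{j}-\bar{x})}{\sum_{i}(x_{i}-\bar{x})^{2}}, \qquad C = \frac{n-1}{2W}\,\frac{\sum_{i}\sum_{j} w_{ij}\,(x_{i}-x_{j})^{2}}{\sum_{i}(x_{i}-\bar{x})^{2}}.$$

Moran's I near +1 (with Geary's C near 0) indicates strong positive spatial autocorrelation, while I near 0 and C near 1 indicate spatial randomness, which is why the two measures can serve as complementary texture signatures.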