  Fee-based full text   796 articles
  Free access   44 articles
  Free access (domestic)   1 article
Electrical engineering   10 articles
Chemical industry   138 articles
Metalworking   19 articles
Machinery and instrumentation   24 articles
Building science   20 articles
Mining engineering   1 article
Energy and power engineering   19 articles
Light industry   155 articles
Hydraulic engineering   1 article
Petroleum and natural gas   1 article
Radio and electronics   39 articles
General industrial technology   149 articles
Metallurgical industry   51 articles
Atomic energy technology   11 articles
Automation technology   203 articles
  2024   3 articles
  2023   9 articles
  2022   19 articles
  2021   26 articles
  2020   22 articles
  2019   24 articles
  2018   25 articles
  2017   25 articles
  2016   25 articles
  2015   19 articles
  2014   34 articles
  2013   63 articles
  2012   68 articles
  2011   55 articles
  2010   52 articles
  2009   59 articles
  2008   39 articles
  2007   32 articles
  2006   30 articles
  2005   21 articles
  2004   24 articles
  2003   22 articles
  2002   17 articles
  2001   12 articles
  2000   3 articles
  1999   11 articles
  1998   12 articles
  1997   10 articles
  1996   10 articles
  1995   7 articles
  1994   3 articles
  1993   7 articles
  1992   3 articles
  1991   2 articles
  1990   3 articles
  1989   3 articles
  1988   2 articles
  1987   3 articles
  1986   4 articles
  1985   4 articles
  1984   6 articles
  1983   4 articles
  1982   2 articles
  1979   5 articles
  1978   2 articles
  1977   2 articles
  1976   2 articles
  1974   2 articles
  1973   1 article
  1972   1 article
A total of 841 results were found (search time: 125 ms).
71.
This article describes the application of the ionic liquid 1‐decyl‐3‐methylimidazolium tetrafluoroborate in the preparation of polypropylene‐silica composites. Sol‐gel technology was used to prepare the xerogel silica‐ionic liquid hybrid S1, which was obtained as a free-flowing powder of aggregated spherical particles. Ionic-liquid-free silica S2 was obtained by extraction and calcination of S1. Melt blending of isotactic polypropylene with S1 and S2 afforded the composites C1 (with ionic liquid) and C2 (without ionic liquid), respectively. The presence of ionic liquid on the S1 silica surface significantly improved silica dispersion in the polymer matrix and prevented compression of the silica particles. Furthermore, the crystallization temperature of composite C1 was significantly higher, indicating that the silica‐ionic liquid filler S1 acted as a nucleating agent. The resistance to thermal decomposition was increased for both composites, and more so in the presence of the ionic liquid. These results show that liquid salts can function as coupling agents and compatibilizers for the preparation of polymeric composites with differentiated properties. © 2009 Wiley Periodicals, Inc. J Appl Polym Sci, 2010
72.
73.
Summary: Principal component analysis has been applied to analyze the correlation matrix obtained from an 8 × 43 data matrix. The 8 trace metals are Mn, Co, Ni, Cu, Zn, Cd, Hg and Pb, which are contained in the soft part of mussels (Mytilus galloprovincialis Lamarck). Mussels were sampled from two sites in the Gulf of Trieste. In both samples, 76–78% of the total variance is explained by four principal components. The orthogonally rotated factor matrix indicates that Co and Ni are associated with the first principal component, and Cd and Pb with the first (site 2) or second principal component (site 1). The origin of trace metals in the soft part of mussels from the Gulf of Trieste is discussed.
Principal component analysis for identifying the sources of contamination of mussels via trace metals
Summary: Principal component analysis was applied to the correlation matrix derived from the 8 × 43 data matrix. The 8 trace metals are Mn, Co, Ni, Cu, Zn, Cd, Hg and Pb, which were found in the soft tissue of the mussels (Mytilus galloprovincialis Lamarck). The mussels came from two areas of the Gulf of Trieste. Four principal components explain 76–78% of the total variance of the two samples. The orthogonally rotated factor matrix shows that Co and Ni are associated with the first principal component, and Cd and Pb with the first (site 2) or the second principal component (site 1). The origin of the trace metals in the mussel tissue from the Gulf of Trieste is discussed.


L. Felician is the author of the statistical analysis for site 1, as part of his thesis in Commodity Science, Faculty of Economics, University of Trieste. L. Gabrielli Favretto is the author of the statistical analysis of the data for site 2. All authors contributed to the rest of the paper.
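To make the method concrete, here is a minimal sketch of a principal component analysis run on the correlation matrix of an 8-variable data set, using NumPy; the random data are a placeholder standing in for the mussel measurements, not the study's actual values.

```python
import numpy as np

# Hypothetical stand-in for the 8 x 43 data matrix: 8 variables (the trace
# metals Mn, Co, Ni, Cu, Zn, Cd, Hg, Pb) measured on 43 samples.
rng = np.random.default_rng(0)
data = rng.normal(size=(8, 43))

# Correlation matrix of the 8 variables (rows of `data`).
corr = np.corrcoef(data)

# Principal components = eigenvectors of the correlation matrix,
# ordered by decreasing eigenvalue (explained variance).
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of the total variance explained by the first four components
# (the study retains four components, explaining 76-78%).
print(f"variance explained by 4 PCs: {eigvals[:4].sum() / eigvals.sum():.1%}")

# Unrotated loadings (correlations of variables with components); an
# orthogonal rotation such as varimax would then be applied in the study.
loadings = eigvecs[:, :4] * np.sqrt(eigvals[:4])
print(loadings)
```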
74.
In this work, sugarcane bagasse fibers were used as filler in composites having recycled high‐density polyethylene (PEr) as the matrix. Because of the poor interaction between the fiber surface and the PEr, the surface of the bagasse was chemically modified. The modification consists of washing with water at 80°C, mercerization with sodium hydroxide, and acetylation with acetic anhydride. The chemical modification was characterized by Fourier transform infrared–horizontal attenuated total reflectance (FTIR‐HATR) and 13C nuclear magnetic resonance (NMR) spectroscopies, thermogravimetric analysis (TGA), and scanning electron microscopy (SEM). The composites were prepared by incorporating modified and unmodified fibers into the PEr matrix at 5, 10, and 20% (w/w) fiber content. The samples were processed by extrusion and injection molded in order to perform mechanical tests. These materials were analyzed by SEM and TGA, and their water uptake was evaluated. Their mechanical properties were also analyzed. Morphological analysis indicated that the chemical modification of the sugarcane bagasse increased the compatibility between matrix and reinforcement. Tensile, flexural, and impact tests showed that the mechanical properties of the composites were improved relative to PEr due to the presence of the fibers. POLYM. COMPOS., 35:768–774, 2014. © 2013 Society of Plastics Engineers
75.
Interval methods are one option for managing uncertainty in optimization problems and in decision management. The precise numerical estimation of coefficients may be meaningless in real-world applications, because data sources are often uncertain, vague and incomplete. In this paper we introduce a comparison index for interval ordering based on the generalized Hukuhara difference; we show that the new index includes the order relations commonly proposed in the literature. The definition of a risk measure makes it possible to quantify a worst-case loss when solving maximization or minimization problems with intervals.
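As a rough illustration of the machinery involved (not the specific index proposed in the paper), the sketch below implements the standard generalized Hukuhara difference of two intervals and uses it to express the classical lower-upper order relation; the interval values are placeholders.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

def gh_difference(a: Interval, b: Interval) -> Interval:
    """Generalized Hukuhara difference A -gH B =
    [min(a.lo - b.lo, a.hi - b.hi), max(a.lo - b.lo, a.hi - b.hi)]."""
    d_lo, d_hi = a.lo - b.lo, a.hi - b.hi
    return Interval(min(d_lo, d_hi), max(d_lo, d_hi))

def leq_lu(a: Interval, b: Interval) -> bool:
    """Lower-upper order: A <= B iff a.lo <= b.lo and a.hi <= b.hi,
    i.e. iff the lower endpoint of (B -gH A) is nonnegative."""
    return gh_difference(b, a).lo >= 0.0

# Example: two uncertain coefficients expressed as intervals.
A, B = Interval(1.0, 3.0), Interval(2.0, 5.0)
print(gh_difference(B, A))   # Interval(lo=1.0, hi=2.0)
print(leq_lu(A, B))          # True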
76.
The collapse load of masonry arches with limited compressive strength and externally bonded reinforcement, such as FRP, is evaluated by solving the minimization problem obtained by applying the upper bound theorem of limit analysis. The arch is composed of a finite number of blocks. The nonlinearity of the problem (no-tension material, frictional sliding and crushing) is concentrated at the interface between two adjacent blocks. Crushing in the collapse mechanism is schematised by the interpenetration of the blocks, with the formation of hinges at internal or boundary points of the interface. The minimization problem is solved with linear optimization, taking advantage of the robust algorithms offered by linear programming (LP). The optimal solution of the linear programming problem approximates the exact solution to any degree of accuracy. The dual of the minimization problem is also formulated and solved in order to present the statics (thrust curve, locus of feasible internal reactions, etc.) of the reinforced arch as a consequence of the kinematical assumptions used in the primal minimization problem. Numerical examples are presented in order to show the effectiveness of the proposed method. Finally, it is shown that the results provided by the proposed LP are in good agreement with an experiment on an FRP-strengthened arch characterized by crushing failure of the masonry.
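The key computational step is the reduction of the upper-bound (kinematic) problem to a linear program. The sketch below only illustrates that reduction in the abstract, with tiny placeholder matrices handed to scipy.optimize.linprog; it is not the paper's actual arch formulation with per-interface hinging, sliding and crushing modes.

```python
import numpy as np
from scipy.optimize import linprog

# Kinematic (upper-bound) limit analysis reduces to an LP of the form
#   minimize    c @ x              (internal dissipation minus dead-load work)
#   subject to  A_eq @ x == b_eq   (compatibility / flow rule, plus the
#                                   normalization "unit work of the live load")
#               x >= 0             (non-negative plastic multipliers)
# whose optimal value bounds the collapse-load multiplier from above.
# The data below are placeholders for two competing collapse mechanisms.
c = np.array([2.0, 3.0])        # dissipation associated with each mechanism
A_eq = np.array([[1.0, 1.0]])   # normalization: unit live-load work
b_eq = np.array([1.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 2)
print("upper bound on the collapse-load multiplier:", res.fun)  # 2.0
```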
77.
State space exploration is often used to prove properties about the sequential behavior of Finite State Machines (FSMs). For example, equivalence of two machines is proved by analyzing the reachable state set of their product machine. Nevertheless, reachability analysis is infeasible on large practical examples. Combinational verification is far less expensive, but its application is limited to combinational circuits or particular design schemes. Finally, approximate techniques imply sufficient, not strictly necessary, conditions. The purpose of this paper is to extend the applicability of purely combinational checks. This is generally achieved through state minimization, partitioning, and re-encoding of the FSMs to factor out their differences. We focus on re-encoding. In particular, we present an incremental approach to re-encoding for verification that transforms the product machine traversal into a combinational verification in the best case, and into a computationally simpler product machine traversal in the general case. Experimental results demonstrate the effectiveness of this technique on medium-to-large circuits where other techniques may fail.
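A minimal sketch of the product-machine check referred to above: explicit breadth-first reachability over the product of two small Moore machines, reporting inequivalence if a reachable product state has disagreeing outputs. The toy machines are hypothetical, and real tools perform this traversal symbolically rather than over explicit state sets.

```python
from collections import deque

def equivalent(fsm_a, fsm_b, inputs):
    """Each FSM is (initial state, transitions {(state, input): state},
    outputs {state: output}); Moore machines for simplicity."""
    init_a, delta_a, out_a = fsm_a
    init_b, delta_b, out_b = fsm_b
    start = (init_a, init_b)
    seen, frontier = {start}, deque([start])
    while frontier:                      # BFS over the product machine
        sa, sb = frontier.popleft()
        if out_a[sa] != out_b[sb]:
            return False                 # reachable state with differing outputs
        for x in inputs:
            nxt = (delta_a[(sa, x)], delta_b[(sb, x)])
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True                          # no reachable disagreement

# Two toy machines that both output 1 exactly when an even number of 'a's
# has been read; they use different state encodings but are equivalent.
A = ('e', {('e', 'a'): 'o', ('e', 'b'): 'e',
           ('o', 'a'): 'e', ('o', 'b'): 'o'}, {'e': 1, 'o': 0})
B = ('p', {('p', 'a'): 'q', ('p', 'b'): 'p',
           ('q', 'a'): 'p', ('q', 'b'): 'q'}, {'p': 1, 'q': 0})
print(equivalent(A, B, inputs='ab'))     # True
```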
78.
Software testing is essential to guarantee high quality products. However, it is a very expensive activity, particularly when performed manually. One way to cut down costs is to reduce the input test suites, which are usually large in order to fully satisfy the test goals. Since large test suites usually contain redundancies (i.e., two or more test cases (TCs) covering the same requirement or piece of code), it is possible to reduce them in order to respect time and personnel constraints without severely compromising coverage. In this light, we formulated the TC selection problem as a constrained search-based optimization task, using requirements coverage as the fitness function to be maximized (quality of the resultant suite), and the execution effort (time) of the selected TCs as a constraint in the search process. Our work is based on the Particle Swarm Optimization (PSO) algorithm, which is simple and efficient compared to other widespread search techniques. Despite that, apart from our previous works, we did not find any other proposals using PSO for TC selection, nor did we find solutions treating this task as a constrained optimization problem. We implemented a Binary Constrained PSO (BCPSO) for functional TC selection, and two hybrid algorithms integrating BCPSO with local search mechanisms in order to refine the solutions provided by BCPSO. These algorithms were evaluated using two different real-world test suites of functional TCs from the mobile devices domain. In the performed experiments, BCPSO obtained promising results for the optimization tasks considered, and the hybrid algorithms obtained statistically better results than the individual search techniques.
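The sketch below is a bare-bones binary PSO for test case selection, with the execution-time constraint handled as a death penalty in the fitness function; it is only meant to illustrate the shape of the approach described above, not the authors' BCPSO or its hybrid variants, and the coverage matrix, costs and budget are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic instance: 12 test cases, 8 requirements. cover[i, j] = 1 if TC i
# covers requirement j; cost[i] is its execution time; budget is the limit.
n_tc, n_req = 12, 8
cover = rng.integers(0, 2, size=(n_tc, n_req))
cost = rng.uniform(1.0, 5.0, size=n_tc)
budget = 0.5 * cost.sum()

def fitness(sel):
    """Fraction of requirements covered by a 0/1 selection; suites that
    exceed the time budget (or are empty) score zero."""
    if not sel.any() or sel @ cost > budget:
        return 0.0
    return cover[sel.astype(bool)].any(axis=0).mean()

# Plain binary PSO with a sigmoid transfer function.
n_part, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = rng.integers(0, 2, size=(n_part, n_tc)).astype(float)
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = np.clip(w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos), -6, 6)
    pos = (rng.random(pos.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(float)
    f = np.array([fitness(p) for p in pos])
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

print("coverage:", fitness(gbest), "time:", gbest @ cost, "budget:", round(budget, 2))
```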
79.
80.
Several methods perform the integration of multiple range scans of an object with the aim of generating a reconstructed triangle mesh; however, achieving high-fidelity digital reconstructions is still a challenge. This is mostly due to the existence of outliers in the acquired range data and their harmful effects on the integration algorithms. In this work, we first discuss artifacts usually found in real range data captured with 3D scanners based on laser triangulation. We then assess two widely used volumetric integration techniques (VRIP and Consensus Surface) and suggest improvements to them. We also present a novel, hybrid approach that combines strengths from both VRIP and Consensus Surface, named the IMAGO Volumetric Integration Algorithm (IVIA). Our novel algorithm adds new ideas while improving the detection and elimination of artifacts. Further, IVIA works in close cooperation with the subsequent hole-filling process, which greatly improves the overall quality of the generated 3D models. Our technique leads to better results when assessed in different situations, compared to VRIP, Consensus Surface, and also to a well-known state-of-the-art surface-based method, Poisson Surface Reconstruction.
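For readers unfamiliar with volumetric integration, the core step shared by VRIP-style methods is a per-voxel weighted average of truncated signed distances to each scan's surface, followed by extraction of the zero crossing. The sketch below shows that step in one dimension on synthetic noisy range samples; it is not IVIA itself, and all values are placeholders.

```python
import numpy as np

# Synthetic 1D example: several noisy range measurements of a surface at
# x = 1.0, each with a confidence weight.
depths = 1.0 + np.array([0.03, -0.02, 0.01, 0.05])
weights = np.array([1.0, 1.0, 1.0, 0.3])      # e.g. a lower-confidence sample

voxels = np.linspace(0.0, 2.0, 201)           # 1D voxel grid
num = np.zeros_like(voxels)
den = np.zeros_like(voxels)
for depth, w in zip(depths, weights):
    d = np.clip(depth - voxels, -0.1, 0.1)    # truncated signed distance
    num += w * d                              # accumulate weighted distances
    den += w                                  # accumulate weights
fused = num / den                             # D(x) = sum(w_i d_i) / sum(w_i)

# The reconstructed surface is the zero crossing of the fused field.
print("fused surface estimate:", voxels[np.argmin(np.abs(fused))])
```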