71.
A simulation program for analyzing the chaotic behaviour of the Duffing equation was developed in Visual Basic 6.0. The software not only conveniently displays the system's trajectories in phase space, but also plots time-history curves, amplitude-frequency diagrams, Poincaré sections, and so on. It is simple to operate and runs reliably.
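The kind of analysis the abstract describes can be sketched in a few lines of Python instead of Visual Basic: a fourth-order Runge-Kutta integrator advances the forced Duffing oscillator, and strobing the state once per forcing period yields the Poincaré section. The parameter values below are common textbook choices for the chaotic regime, not those used by the paper's software.

```python
import math

def duffing_rhs(t, x, v, delta=0.3, alpha=-1.0, beta=1.0, gamma=0.5, omega=1.2):
    """Right-hand side of x'' + delta*x' + alpha*x + beta*x**3 = gamma*cos(omega*t)."""
    return v, -delta * v - alpha * x - beta * x**3 + gamma * math.cos(omega * t)

def rk4_step(t, x, v, h, **p):
    """One classical fourth-order Runge-Kutta step for the Duffing system."""
    k1x, k1v = duffing_rhs(t, x, v, **p)
    k2x, k2v = duffing_rhs(t + h / 2, x + h / 2 * k1x, v + h / 2 * k1v, **p)
    k3x, k3v = duffing_rhs(t + h / 2, x + h / 2 * k2x, v + h / 2 * k2v, **p)
    k4x, k4v = duffing_rhs(t + h, x + h * k3x, v + h * k3v, **p)
    return (x + h / 6 * (k1x + 2 * k2x + 2 * k3x + k4x),
            v + h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v))

def poincare_section(n_periods=200, steps_per_period=400, omega=1.2):
    """Strobe the trajectory once per forcing period to build the Poincaré section."""
    T = 2 * math.pi / omega
    h = T / steps_per_period
    t, x, v = 0.0, 1.0, 0.0
    points = []
    for _ in range(n_periods):
        for _ in range(steps_per_period):
            x, v = rk4_step(t, x, v, h, omega=omega)
            t += h
        points.append((x, v))   # (x, x') at t = n * T
    return points
```

The same loop, sampled at every step rather than once per period, gives the time-history and phase-space plots the abstract mentions.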
73.
Non-uniform sampling mesh simplification   Cited by: 2 (self-citations: 0, citations by others: 2)
A new mesh-simplification method is proposed that performs non-uniform sampling driven by important viewpoints in viewpoint space. Building on the Garland-Heckbert method, it is a simplification algorithm that takes appearance similarity into account. Two boundary-determination theorems are stated and proved, providing a theoretical basis for the sampling. During simplification, the algorithm samples the model from selected important viewpoints so that the sampled vertex pairs (those near silhouettes) receive appropriate protection. Besides retaining the strengths of the Garland-Heckbert method, the algorithm preserves the model's important appearance features even when very few triangles remain (around 50). A formula for computing the appearance-similarity error of 0-1 images is given; comparisons of simplification results using this formula show that the proposed algorithm is effective at preserving the model's appearance features. Finally, the time and space complexity of the algorithm is analyzed.
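The appearance-similarity error for 0-1 (silhouette) images mentioned above can be illustrated with a simple pixel-disagreement measure. The paper's exact formula is not reproduced in the abstract, so the definition below is only a plausible stand-in:

```python
def appearance_error(img_a, img_b):
    """Appearance-similarity error between two 0-1 (silhouette) images of equal
    size: the fraction of pixels on which the two rendered silhouettes disagree.
    Illustrative definition; the paper's exact formula may differ."""
    assert len(img_a) == len(img_b) and len(img_a[0]) == len(img_b[0])
    diff = sum(a != b for row_a, row_b in zip(img_a, img_b)
                       for a, b in zip(row_a, row_b))
    total = len(img_a) * len(img_a[0])
    return diff / total
```

Rendering the original and simplified models from each important viewpoint as binary images and averaging such errors gives a view-dependent quality score in the spirit of the comparison described.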
74.
Logical Representation of a Conceptual Model for Spatial Data Warehouses   总被引:2,自引:2,他引:0  
The MultiDimER model is a conceptual model used for representing a multidimensional view of data for Data Warehouse (DW) and On-Line Analytical Processing (OLAP) applications. This model includes a spatial extension allowing spatiality in levels, hierarchies, fact relationships, and measures. In this way, decision-making users can express their analysis needs in an abstract manner without considering complex implementation issues, and spatial OLAP tool developers can share a common vision for representing spatial data in a multidimensional model. In this paper we propose the transformation of a conceptual schema based on the MultiDimER constructs to an object-relational schema. We base our mapping on the SQL:2003 and SQL/MM standards, giving examples of a commercial implementation using Oracle 10g with its spatial extension. Further, we use spatial integrity constraints to ensure the semantic equivalence of the conceptual and logical schemas. We also show some examples of Oracle spatial functions, including the aggregation functions required for the manipulation of spatial data. The described mappings to the object-relational model, along with the examples using a commercial system, show the feasibility of implementing spatial DWs in current commercial DBMSs. Further, using integrated architectures, where spatial and thematic data are defined within the same DBMS, facilitates system management by simplifying data definition and manipulation.
Esteban Zimányi
75.
Published online: 25 July 2001
76.
Clive A J Fletcher, Sadhana, 1993, 18(3-4): 657-681
A turbulent gas-particle finite-volume flow simulation of a representative coal classifier is presented. Typical values of the loading ratio permit a one-way coupling analysis. As a case study, the computational fluid dynamics code, RANSTAD, and the modelling aspects are discussed in some detail. The simulation indicates that small (≈ 30 μm) coal particles pass through the classifier to the furnace but that large (≈ 300 μm) particles are captured and remilled. The computational simulation indicates that the classifier performance can be improved by internal geometric modification. The commitment of the Electricity Commission of New South Wales (Pacific Power) to the exploitation of Computational Engineering for the improvement of all aspects of electricity generation is acknowledged.
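The size cut the simulation reports can be illustrated with a one-way-coupled Stokes-drag estimate: the particle response time τ = ρ_p d²/(18 μ) sets a terminal slip velocity τg, and a particle is carried upward only if the local gas updraft exceeds it. The density, viscosity, and updraft values below are illustrative assumptions, not data from the paper's simulation, and Stokes drag is strictly valid only for the smaller particles.

```python
def stokes_response_time(d, rho_p=1300.0, mu=1.8e-5):
    """Particle momentum response time tau = rho_p * d**2 / (18 * mu).
    rho_p ~ coal density [kg/m^3], mu ~ air viscosity [Pa s] (assumed values)."""
    return rho_p * d * d / (18.0 * mu)

def terminal_slip(d, g=9.81):
    """Terminal settling velocity relative to the gas, Stokes regime: v_t = tau * g."""
    return stokes_response_time(d) * g

def is_carried_to_furnace(d, updraft=2.0):
    """One-way coupling: the gas field is fixed, the particle feels drag + gravity.
    The particle passes the classifier if the updraft exceeds its settling speed.
    The 2 m/s updraft is a made-up figure for illustration."""
    return updraft > terminal_slip(d)
```

With these numbers a 30 μm particle settles at a few cm/s and is swept to the furnace, while a 300 μm particle settles at several m/s and falls back for remilling, matching the qualitative behaviour the abstract describes.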
77.
The development of massively parallel supercomputers provides a unique opportunity to advance the state of the art in N-body simulations. These N-body codes are of great importance for simulations in stellar dynamics and plasma physics. For systems with long-range forces, such as gravity or electromagnetic forces, it is important to increase the number of particles to N ≈ 10^7 particles. Significantly improved modeling of N-body systems can be expected by increasing N, arising from a more realistic representation of physical transport processes involving particle diffusion and energy and momentum transport. In addition, it will be possible to guarantee that physically significant portions of complex physical systems, such as Lindblad resonances of galaxies or current sheets in magnetospheres, will have an adequate population of particles for a realistic simulation. Particle-mesh (PM) and particle-particle particle-mesh (P3M) algorithms present the best prospects for the simulation of large-scale N-body systems. As an example we present a two-dimensional PM simulation of a disk galaxy that we have developed on the Connection Machine-2, a massively parallel boolean hypercube supercomputer. The code is scalable to any CM-2 configuration available and, on the largest configuration, simulations with N = 128M = 2^27 particles are possible in reasonable run times.
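The first stage of any PM code is mass assignment from particles to the mesh. A minimal cloud-in-cell (CIC) deposit on a periodic one-dimensional grid, written in plain Python rather than for the CM-2, looks like this:

```python
def cic_deposit(positions, masses, n_cells, box=1.0):
    """Cloud-in-cell mass assignment: each particle's mass is shared between its
    two nearest grid points, weighted linearly by distance. This is the first
    step of a particle-mesh (PM) solver; the density grid is then fed to a
    Poisson solver (typically an FFT) to obtain the long-range force."""
    dx = box / n_cells
    rho = [0.0] * n_cells
    for x, m in zip(positions, masses):
        s = (x % box) / dx          # position in cell units, periodic box
        i = int(s)                  # left grid point
        f = s - i                   # fractional offset within the cell
        rho[i % n_cells] += m * (1.0 - f)
        rho[(i + 1) % n_cells] += m * f
    return rho
```

The scheme conserves total mass exactly, which is what makes PM codes cheap enough to push N toward the 10^7-particle regime discussed above; the real 2-D galaxy code would use the same idea on a 2-D grid.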
78.
Some significant progress related to multidimensional data analysis has been achieved in the past few years, including the design of fast algorithms for computing datacubes, selecting some precomputed group-bys to materialize, and designing efficient storage structures for multidimensional data. However, little work has been carried out on multidimensional query optimization issues. In particular, the response time (or evaluation cost) for answering several related dimensional queries simultaneously is crucial to OLAP applications. Recently, Zhao et al. first explored this problem by presenting three heuristic algorithms. In this paper we first consider in detail two cases of the problem in which all the queries are either hash-based star joins or index-based star joins only. In the case of the hash-based star join, we devise a polynomial approximation algorithm which delivers a plan whose evaluation cost is O(n^ε) times the optimal, where n is the number of queries and ε is a fixed constant. We also present an exponential algorithm which delivers a plan with the optimal evaluation cost. In the case of the index-based star join, we present a heuristic algorithm which delivers a plan whose evaluation cost is n times the optimal, and an exponential algorithm which delivers a plan with the optimal evaluation cost. We then consider a general case in which both hash-based star-join and index-based star-join queries are included. For this case, we give a possible improvement on the work of Zhao et al., based on an analysis of their solutions. We also develop another heuristic and an exact algorithm for the problem. We finally conduct a performance study by implementing our algorithms. The experimental results demonstrate that the solutions delivered for the restricted cases are always within two times of the optimal, which confirms our theoretical upper bounds. In fact, these experiments produce much better results than our theoretical estimates.
To the best of our knowledge, this is the only development of polynomial algorithms for the first two cases that are able to deliver plans with deterministic performance guarantees in terms of the quality of the plans generated. Previous approaches, including that of [ZDNS98], may generate a feasible plan for the problem in these two cases, but they do not provide any performance guarantee, i.e., the plans generated by their algorithms can be arbitrarily far from the optimal one. Received: July 21, 1998 / Accepted: August 26, 1999
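The intuition behind evaluating related star-join queries together rather than one at a time is that hash tables on shared dimension tables can be built once and probed by every query that needs them. A toy cost model (illustrative numbers only, not the paper's algorithms) shows the gap:

```python
def naive_cost(queries, build_cost, probe_cost=1.0):
    """Evaluate each hash-based star join independently: every query rebuilds
    the hash tables of all its dimension tables. Toy cost model: a fixed build
    cost per table plus a unit probe cost per query."""
    return sum(sum(build_cost[t] for t in q) + probe_cost for q in queries)

def shared_cost(queries, build_cost, probe_cost=1.0):
    """Evaluate the queries as one plan: each dimension table's hash table is
    built once and shared by every query that references it."""
    used = set().union(*queries) if queries else set()
    return sum(build_cost[t] for t in used) + probe_cost * len(queries)
```

The optimization problem the abstract addresses is essentially choosing how to group and order such shared work so the total evaluation cost is provably close to optimal; this sketch only demonstrates why sharing pays off at all.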
79.
By combining linear graph theory with the principle of virtual work, a dynamic formulation is obtained that extends graph-theoretic modelling methods to the analysis of flexible multibody systems. The system is represented by a linear graph, in which nodes represent reference frames on rigid and flexible bodies, and edges represent components that connect these frames. By selecting a spanning tree for the graph, the analyst can choose the set of coordinates appearing in the final system of equations. This set can include absolute, joint, or elastic coordinates, or some combination thereof. If desired, all non-working constraint forces and torques can be automatically eliminated from the dynamic equations by exploiting the properties of virtual work. The formulation has been implemented in a computer program, DynaFlex, that generates the equations of motion in symbolic form. Three examples are presented to demonstrate the application of the formulation, and to validate the symbolic computer implementation.
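The spanning-tree selection step can be sketched independently of DynaFlex: pick a tree by breadth-first search, and the edges left out of it (the cotree) are the ones that close kinematic loops. The three-body graph below is a made-up example, not one of the paper's test cases.

```python
from collections import deque

def spanning_tree(adj, root):
    """Select a spanning tree of a linear graph by breadth-first search.
    `adj` maps each node to its neighbours; the result is the set of tree
    edges as (parent, child) pairs. In graph-theoretic multibody modelling,
    the coordinates associated with the chosen tree edges become the unknowns
    of the final system of equations."""
    tree, seen, queue = set(), {root}, deque([root])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                tree.add((u, v))
                queue.append(v)
    return tree
```

For a connected graph with n nodes the tree always has n - 1 edges; choosing a different tree changes which coordinates (absolute, joint, or elastic) appear in the equations, exactly the freedom the abstract describes.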
80.
The Maximum Likelihood Estimator (MLE) and Extended Quasi-Likelihood (EQL) estimator have commonly been used to estimate the unknown parameters within the joint modeling of mean and dispersion framework. However, these estimators can be very sensitive to outliers in the data. In order to overcome this disadvantage, the usage of the maximum Trimmed Likelihood Estimator (TLE) and the maximum Extended Trimmed Quasi-Likelihood (ETQL) estimator is recommended to estimate the unknown parameters in a robust way. The superiority of these approaches in comparison with the MLE and EQL estimator is illustrated by an example and a simulation study. As a prominent measure of robustness, the finite sample Breakdown Point (BDP) of these estimators is characterized in this setting.
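For a normal location parameter, the maximum trimmed likelihood estimate reduces to refitting on the h observations with the smallest residuals, which makes the robustness against outliers easy to demonstrate. The concentration-style iteration below is a minimal sketch of that idea, not the TLE/ETQL estimators studied in the paper:

```python
def trimmed_mean_estimate(data, trim=0.2, n_iter=20):
    """Trimmed-likelihood estimate of a normal location parameter: repeatedly
    refit on the h observations closest to the current fit. For the normal
    likelihood this is trimmed least squares; `trim` is the fraction of
    observations discarded as potential outliers."""
    h = max(1, int(len(data) * (1.0 - trim)))   # observations kept
    subset = sorted(data)[:h]                   # arbitrary start; refined below
    mu = sum(subset) / h
    for _ in range(n_iter):
        # keep the h points with the smallest residuals, then refit
        kept = sorted(data, key=lambda x: abs(x - mu))[:h]
        new_mu = sum(kept) / h
        if new_mu == mu:
            break
        mu = new_mu
    return mu
```

A single gross outlier drags the ordinary sample mean (the MLE here) far from the bulk of the data, while the trimmed fit stays near it, which is the behaviour the breakdown-point analysis in the paper quantifies.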
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号