11.
To mine useful information from multispectral remote sensing imagery of geological hazards and to build standardized, quantitative knowledge for remote sensing interpretation, this study investigates characteristic image spectra (feature atlases) of earthquake-induced secondary geological hazards. Taking landslides, debris flows and collapses in the area most severely struck by the "5·12" Wenchuan earthquake as examples, remote sensing images of individual hazard bodies were first extracted, and 20 image feature parameters covering spectral, textural and colour characteristics were selected and their values computed. The parameters were then normalized, and principal component analysis was applied to obtain, for each hazard type, a feature atlas expressed in terms of the first, second and third principal components. Validation on test samples shows that these atlases can characterize the image features of the corresponding hazard types. The work reveals the relationships between the remote sensing image features of different types of earthquake-induced secondary geological hazards and provides quantitative, visualizable knowledge for identifying hazard bodies in remote sensing imagery.
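The processing chain described above (20 spectral/texture/colour parameters, normalization, projection onto the first three principal components) can be sketched roughly as follows. All data, feature counts and class names in this sketch are placeholders, not the study's actual measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

# Hypothetical feature matrix: one row per hazard-body image sample,
# 20 columns for the spectral, texture and colour parameters described above.
X = np.random.rand(60, 20)          # placeholder data, not the study's measurements
labels = np.repeat(["landslide", "debris_flow", "collapse"], 20)  # assumed classes

# Step 1: normalize each feature parameter to [0, 1].
X_norm = MinMaxScaler().fit_transform(X)

# Step 2: project onto the first three principal components.
pca = PCA(n_components=3)
scores = pca.fit_transform(X_norm)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))

# Step 3: a per-class "feature atlas" as the mean PC1-PC3 signature of each hazard type.
for hazard in np.unique(labels):
    atlas = scores[labels == hazard].mean(axis=0)
    print(hazard, np.round(atlas, 3))
```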
12.
Because different pipe segments of an urban water supply network contribute differently to the post-earthquake connectivity reliability of the whole network, failure-probability importance and criticality importance measures are introduced to better evaluate the contribution of individual segments, and a model based on the historical-experience method is established for computing network reliability together with the two importance measures for each pipe segment. An example calculation shows that the network connectivity reliability obtained with this model is essentially consistent with results reported in other literature, which verifies the rationality and validity of the model. The model is then used to compute the failure-probability importance and criticality importance of each pipe segment, to classify pipes into disaster-prevention grades, and to propose a network partitioning method and a post-earthquake restoration strategy. The results provide a basis for preparing seismic disaster-prevention plans for water supply networks, post-earthquake restoration plans, and corresponding emergency countermeasures.
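The two importance measures named in this abstract are standard reliability-importance measures; a minimal sketch of how they can be computed for a toy network is given below, assuming the Birnbaum definition for failure-probability importance and the usual criticality-importance normalisation. The network structure function and the pipe survival probabilities are invented for illustration and are not the paper's example.

```python
from itertools import product

# Toy structure function for a hypothetical 4-pipe network (not the paper's example):
# water reaches the demand node if pipe 0 works AND (pipe 1 works OR (pipes 2 and 3 work)).
def system_works(state):
    return state[0] and (state[1] or (state[2] and state[3]))

def system_reliability(p, fixed=None):
    """Exact connectivity reliability by enumerating component states;
    `fixed` pins selected pipes in the working (1) or failed (0) state."""
    rel = 0.0
    for state in product([0, 1], repeat=len(p)):
        if fixed and any(state[i] != v for i, v in fixed.items()):
            continue
        prob = 1.0
        for i, s in enumerate(state):
            if fixed and i in fixed:
                continue
            prob *= p[i] if s else (1 - p[i])
        rel += prob * system_works(state)
    return rel

p = [0.95, 0.90, 0.85, 0.80]            # assumed post-earthquake pipe survival probabilities
R = system_reliability(p)
Q = 1 - R                               # system failure probability
for i in range(len(p)):
    birnbaum = system_reliability(p, {i: 1}) - system_reliability(p, {i: 0})
    criticality = birnbaum * (1 - p[i]) / Q   # criticality (failure-probability) importance
    print(f"pipe {i}: Birnbaum = {birnbaum:.4f}, criticality = {criticality:.4f}")
```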
13.
Genesis analysis of earthquake surface ruptures: a case study of the Yushu earthquake
Based on observations at four typical surface-rupture sites in the Yushu earthquake area, the surface ruptures can be divided into co-seismic surface ruptures and post-seismic ruptures formed under gravity and related effects, such as ruptures at the rear edge of landslides and ruptures in collapsed river-bank masses. Most co-seismic surface ruptures coincide with the surface trace of the seismogenic fault and were produced by fault slip, but some do not coincide with the fault and may have a different origin. A fault is a relatively weak zone in the crust, and it dissipates more energy than it transmits. Such co-seismic surface ruptures most likely form when the energy released by rupture of the seismic source propagates as seismic waves through the highly heterogeneous upper crust, undergoes complicated refraction and reflection at faults, fractures, lithological interfaces and fold surfaces, is locally amplified by superposition, and finally breaks out at the surface to release energy. This process, in which seismic waves breaking out at the surface create surface ruptures, may be termed the "wave-breakout" origin of earthquake surface ruptures. Because seismic waves carry information about the seismogenic structure, even co-seismic surface ruptures of wave-breakout origin can, through their geometric pattern, reflect some basic characteristics of the seismogenic structure.
14.
Performance-Based Design (PBD) methodologies are the contemporary trend in designing better and more economical earthquake-resistant structures, where the main objective is to achieve more predictable and reliable levels of safety and operability against natural hazards. Reliability-based optimization (RBO) methods, on the other hand, directly account for the variability of the design parameters in the formulation of the optimization problem. The objective of this work is to incorporate PBD methodologies under seismic loading into the framework of RBO, in conjunction with innovative tools for treating computationally intensive problems of real-world structural systems. Two types of random variables are considered: those which influence the level of seismic demand and those that affect the structural capacity. Reliability analysis is required for the assessment of the probabilistic constraints within the RBO formulation. The Monte Carlo Simulation (MCS) method is considered the most reliable method for estimating probabilities of exceedance and other statistical quantities, albeit, in many cases, with excessive computational cost. First- and Second-Order Reliability Methods (FORM, SORM) constitute alternative approaches but require an explicit limit-state function, which is not available for complex problems. In this study, in order to find the most efficient methodology for performing reliability analysis in conjunction with performance-based optimum design under seismic loading, a Neural Network approximation of the limit-state function is proposed and combined with either MCS or FORM for handling the uncertainties. These two methodologies are applied to RBO problems with sizing and topology design variables, resulting in a two-orders-of-magnitude reduction of the computational effort.
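A minimal sketch of the surrogate idea described here, i.e. training a neural network on limit-state evaluations and then running Monte Carlo Simulation on the cheap surrogate: the limit-state function, its dimension and the sampling distributions are illustrative stand-ins for the expensive structural analysis, not the authors' model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical limit-state function g(x): failure when g(x) < 0.
# A cheap analytic stand-in replaces the expensive structural analysis.
def limit_state(x):
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# Step 1: evaluate g on a modest design of experiments (the expensive part in practice).
X_train = rng.normal(size=(300, 2))
g_train = limit_state(X_train)

# Step 2: fit a neural-network surrogate of the limit-state function.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X_train, g_train)

# Step 3: crude Monte Carlo simulation on the cheap surrogate.
X_mc = rng.normal(size=(200_000, 2))
pf_surrogate = np.mean(surrogate.predict(X_mc) < 0.0)
pf_exact = np.mean(limit_state(X_mc) < 0.0)   # available here only because g is analytic
print(f"P_f (surrogate MCS) = {pf_surrogate:.4f},  P_f (direct MCS) = {pf_exact:.4f}")
```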
15.
When clusters with different densities and noise lie in a spatial point set, the major obstacle to classifying these data is the determination of the thresholds for classification, which may form a series of bins for allocating each point to a different cluster. Much of the previous work has adopted a model-based approach, but is either incapable of estimating the thresholds automatically or limited to only two point processes, i.e. noise and clusters of the same density. In this paper, we present a new density-based clustering method (DECODE), in which a spatial data set is presumed to consist of different point processes, and clusters with different densities belong to different point processes. DECODE is based upon a reversible-jump Markov Chain Monte Carlo (MCMC) strategy and is divided into three steps. The first step is to map each point in the data to its mth nearest distance, i.e. the distance between a point and its mth nearest neighbor. In the second step, classification thresholds are determined via the reversible-jump MCMC strategy. In the third step, clusters are formed by spatially connecting the points whose mth nearest distances fall into a particular bin defined by the thresholds. Four experiments, including two simulated data sets and two seismic data sets, are used to evaluate the algorithm. Results on simulated data show that our approach is capable of discovering the clusters automatically. Results on seismic data suggest that the clustered earthquakes identified by DECODE either imply the epicenters of forthcoming strong earthquakes or indicate the areas with the most intensive seismicity; this is consistent with the tectonic state and estimated stress distribution in the associated areas. The comparison between DECODE and other state-of-the-art methods, such as DBSCAN, OPTICS and Wavelet Cluster, illustrates the contribution of our approach: although DECODE can be computationally expensive, it is capable of identifying the number of point processes and simultaneously estimating the classification thresholds with little prior knowledge.
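A rough sketch of steps 1 and 3 of DECODE follows, assuming the classification thresholds from the reversible-jump MCMC step (step 2) are already available; here a simple quantile stands in for that estimator, and the point set is synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(1)
# Synthetic point set: a dense cluster, a sparser cluster, and background noise.
pts = np.vstack([rng.normal([0, 0], 0.1, (80, 2)),
                 rng.normal([3, 3], 0.4, (80, 2)),
                 rng.uniform(-2, 5, (60, 2))])

m = 4
tree = cKDTree(pts)
# Step 1: mth nearest distance of every point (column 0 is the point itself).
d_m = tree.query(pts, k=m + 1)[0][:, m]

# Step 2 (reversible-jump MCMC threshold estimation) is replaced by an assumed threshold:
# points whose mth nearest distance falls below it are treated as the densest process.
threshold = np.quantile(d_m, 0.4)         # placeholder, not the paper's estimator
dense = np.where(d_m <= threshold)[0]

# Step 3: spatially connect the selected points (linked if within the threshold distance).
pairs = cKDTree(pts[dense]).query_pairs(threshold, output_type="ndarray")
n = len(dense)
adj = csr_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
n_clusters, labels = connected_components(adj, directed=False)
print("clusters found among dense points:", n_clusters)
```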
16.
In this paper we investigate the surface displacement related to the 2006 Machaze earthquake using Synthetic Aperture Radar Interferometry (InSAR) and sub-pixel correlation (SPC) of radar amplitude images. We focus on surface displacement measurement during three stages of the seismic cycle. First, we examined the co-seismic stage, using the Advanced SAR (ASAR) sensor onboard the Envisat satellite. Then we investigated the post-seismic stage using the Phased Array L-band SAR sensor (PALSAR) onboard the ALOS satellite. Lastly, we focused on the inter-seismic stage, prior to the earthquake, by analysing the L-band JERS-1 SAR data. The high degree of signal decorrelation in the C-band co-seismic interferogram hinders correct positioning of the surface rupture and correct phase unwrapping. The post-seismic L-band interferograms reveal a time-constant surface displacement, causing subsidence of the surface at a rate of ∼5 cm/yr. This phenomenon continued to affect the near-rupture field for at least two years following the earthquake and intrinsically reveals a candidate seismogenic fault trace that we use as a proxy for an inversion against an elastic dislocation model. Prior to the earthquake, the JERS interferograms do not indicate any traces of pre-seismic slip on the seismogenic fault. Therefore, the slip observed after the earthquake is post-seismic and was triggered by the Machaze earthquake. This represents a prominent post-seismic slip event, rarely observed in such a geodynamic context.
17.
Integrated earthquake simulation (IES) is a seamless simulation of the three earthquake processes, namely the earthquake hazard process, the earthquake disaster process, and the anti-disaster action process. High performance computing (HPC) is essential if IES, and in particular the simulation of the earthquake disaster process, is applied to an urban area in which 10^4–10^6 structures are located. IES is enhanced with parallel computation, and its performance is examined, so that virtual earthquake disaster simulation can be carried out for a model of an actual city by inputting observed strong ground motion. It is shown that parallel IES has fairly good scalability even when advanced non-linear seismic structural analysis is used.
18.
A two-dimensional (2-D) cellular automata (CA) dynamic system composed of cells-charges has been proposed for the simulation of the earthquake process. In this paper, the study focuses on the optimal parameterisation of the model through the use of a genetic algorithm (GA). Optimising the CA model parameterisation with a standard GA extends its ability to study various hypotheses concerning the seismicity of the region under consideration. The GA evolves an initially random population of candidate solutions of the model parameters, so that appropriate solutions emerge over time. The quality criterion is realised by taking into account the extent to which the simulation results match the Gutenberg-Richter (GR) law derived from recorded data of the area under test. The simulation results presented here concern regions of Greece with different seismic and geophysical characteristics. The results are in good quantitative and qualitative agreement with the GR scaling relations.
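A compact sketch of the GA-based parameterisation loop described above: the cells-charges CA simulation is replaced by a stand-in function, and the GA operators and the b-value-based quality criterion are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)
target_b = 1.0      # Gutenberg-Richter b-value observed in the study region (assumed)

def simulate_ca(params, n_events=2000):
    """Placeholder for the 2-D cells-charges CA model: returns a synthetic magnitude
    catalogue whose b-value depends (artificially) on the two model parameters."""
    b_effective = 0.5 + params[0] * 1.5 + 0.2 * np.sin(5 * params[1])
    beta = b_effective * np.log(10)
    return 3.0 + rng.exponential(1.0 / beta, n_events)   # GR magnitudes above Mc = 3.0

def fitness(params):
    """Quality criterion: closeness of the simulated b-value to the observed GR b-value."""
    mags = simulate_ca(params)
    b_sim = np.log10(np.e) / (mags.mean() - 3.0)          # Aki maximum-likelihood b-value
    return -abs(b_sim - target_b)

# A bare-bones GA: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform(0, 1, (40, 2))
for gen in range(30):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[[max(rng.choice(len(pop), 3), key=lambda i: scores[i]) for _ in range(40)]]
    children = 0.5 * (parents + parents[rng.permutation(40)])   # blend crossover
    children += rng.normal(0, 0.05, children.shape)             # mutation
    pop = np.clip(children, 0, 1)

best = max(pop, key=fitness)
print("best parameters:", np.round(best, 3), "fitness:", round(fitness(best), 4))
```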
19.
Traditional clustering methods based on true (physical) distance are not well suited to the accurate computation of rupture propagation and healing velocities on different earthquake faults. To improve the accuracy of earthquake prediction, a clustering method based on soft-distance computation is proposed and established. The soft-distance clustering procedure, the soft-distance calculation method, and the concrete soft-distance clustering algorithm are presented. Using samples of real strong earthquakes as the clustering data source, the sample data are clustered with both the proposed method and conventional clustering methods. The analysis shows that the cluster centres obtained with the proposed method are closer to the objective reality of the evolution of the crustal stress field, and the method provides a sound computational basis for accurately estimating the next strong earthquake on a fault zone.
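The abstract does not define the soft-distance measure, so the sketch below uses standard fuzzy c-means, i.e. membership-weighted (soft) distances to cluster centres, purely as an illustrative stand-in; it is not necessarily the authors' algorithm, and the epicentre samples are placeholders.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means: soft memberships weight the distance of every
    point to every centre (illustrative stand-in; not the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))            # soft membership matrix
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centres, U

# Hypothetical strong-earthquake epicentre samples (lon, lat); placeholder values.
X = np.random.default_rng(3).normal([[100, 30]], [[1.5, 1.0]], (200, 2))
centres, memberships = fuzzy_c_means(X, c=3)
print("soft cluster centres:\n", np.round(centres, 3))
```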
20.
The Wenchuan earthquake damaged the Bikou, Qilinsi and Miaojiaba hydropower stations to varying degrees. The organization and design of the repair work on the damaged parts and the effectiveness of its implementation are evaluated and discussed, providing a reference for the repair of similar projects.