51.
This paper presents a tool that enables the direct editing of surface features in large point-clouds or meshes. This is made possible by a novel multi-scale analysis of unstructured point-clouds that automatically extracts the number of relevant features together with their respective scale all over the surface. Then, combining this ingredient with an adequate multi-scale decomposition allows us to directly enhance or reduce each feature in an independent manner. Our feature extraction is based on the analysis of the scale-variations of locally fitted surface primitives combined with unsupervised learning techniques. Our tool may be applied either globally or locally, and millions of points are handled in real-time. The resulting system enables users to accurately edit complex geometries with minimal interaction.
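The abstract describes extracting features from the scale variations of locally fitted surface primitives and then grouping that behaviour with unsupervised learning. The sketch below is a loose, hypothetical illustration of that general idea only, not the authors' pipeline: it computes a per-point surface-variation measure at several neighbourhood radii and clusters the resulting scale profiles. The function names, radii and clustering choice are all assumptions made for the example.

```python
# Hypothetical sketch: per-point surface variation at several neighbourhood radii,
# followed by a simple clustering of the resulting scale profiles. Illustrates the
# general idea of analysing scale variations of locally fitted primitives only.
import numpy as np
from scipy.spatial import cKDTree
from scipy.cluster.vq import kmeans2

def scale_profiles(points, radii):
    """For each point, the smallest-eigenvalue ratio of the local covariance
    (a standard 'surface variation' measure) at every radius in `radii`."""
    tree = cKDTree(points)
    profiles = np.zeros((len(points), len(radii)))
    for j, r in enumerate(radii):
        for i, p in enumerate(points):
            idx = tree.query_ball_point(p, r)
            if len(idx) < 4:                      # too few neighbours to fit a plane
                continue
            nbrs = points[idx] - points[idx].mean(axis=0)
            evals = np.linalg.eigvalsh(nbrs.T @ nbrs)   # ascending eigenvalues
            profiles[i, j] = evals[0] / max(evals.sum(), 1e-12)
    return profiles

# Toy usage: points sampled on a unit sphere.
rng = np.random.default_rng(0)
pts = rng.normal(size=(2000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
prof = scale_profiles(pts, radii=[0.05, 0.1, 0.2, 0.4])
labels = kmeans2(prof, 3, minit="++")[1]          # unsupervised grouping of scale behaviour
```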
52.
53.
While scan-based compression is widely utilized in order to alleviate the test time and data volume problems, the overall compression level is dictated not only by the chain-to-channel ratio but also by the ratio of encodable patterns. Aggressively increasing the number of scan chains in an effort to raise the compression levels may reduce the ratio of encodable patterns, degrading the overall compression level. In this paper, we present various methods to improve the ratio of encodable patterns. These methods are b...
54.
The assumption of proportional hazards (PH) fundamental to the Cox PH model may not always hold in practice. In this paper, we propose a generalization of the Cox PH model in terms of the cumulative hazard function, taking a form similar to the Cox PH model with the extension that the baseline cumulative hazard function is raised to a power function. Our model allows for interaction between covariates and the baseline hazard, and it also includes, for the two-sample problem, the case of two Weibull distributions and two extreme value distributions differing in both scale and shape parameters. The partial likelihood approach cannot be applied here to estimate the model parameters. We use the full likelihood approach via a cubic B-spline approximation for the baseline hazard to estimate the model parameters. A semi-automatic procedure for knot selection based on Akaike's information criterion is developed. We illustrate the applicability of our approach using real-life data.
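The abstract describes the model only verbally: a Cox-type cumulative hazard in which the baseline cumulative hazard is raised to a power function of the covariates. One plausible way to write such a model is sketched below; the specific parameterization, in particular the choice of exp(γᵀx) for the power, is an illustrative assumption rather than necessarily the authors' exact formulation.

```latex
% Hedged reading of the described generalization: Cox-type cumulative hazard with
% the baseline cumulative hazard H_0(t) raised to a covariate-dependent power.
% The parameterization g(x) = exp(gamma' x) is an illustrative assumption.
\[
  H(t \mid x) \;=\; \bigl[ H_0(t) \bigr]^{\exp(\gamma^{\top} x)} \exp(\beta^{\top} x),
  \qquad
  H(t \mid x) = \int_0^{t} \lambda(u \mid x)\, du .
\]
```

Under this reading, setting γ = 0 recovers the ordinary Cox proportional hazards model, which is consistent with the abstract's statement that the proposal extends the Cox PH form.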
55.
The contribution of this paper is three empirical evaluations of a reference model for the practice of software reuse. Our research thesis is that software development based upon a software reuse reference model improves quality of products, productivity of processes and product time-to-market for many software development enterprises. The definition and investigation of such a model have been carried out in three steps. First, the reference model is developed based on existing software reuse concepts. Second, this reference model is empirically evaluated using three studies: one using a survey method, one using a case studies method, and one using a legacy studies method. Third, the impact of the reference model on software development productivity, quality, and time-to-market is empirically derived. This revised version was published online in June 2006 with corrections to the Cover Date.
56.
Traffic congestion is one of the main problems in large cities, and several approaches have been proposed to address it. Park-and-ride is one of the most effective approaches, as it can remove traffic from the road network. Park-and-ride facilities are an important part of urban mass transit systems, effectively extending the service area and attracting commuters and other people who may not otherwise have used mass transit. However, their efficiency depends on their location in the urban network. In this research, we focus on the travel time of shortest paths, instead of the distance criterion, for computing network traffic, and develop a model for finding the best location(s) for siting park-and-ride systems so as to minimize the network traffic. The model is formulated based on population points, potential sites for park-and-ride establishment, and several Central Business Districts (CBDs). We then present a genetic algorithm, which has proved efficient in solving large problems. Finally, the proposed model is used to locate park-and-ride facilities in the city of Isfahan, Iran.
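The abstract outlines a facility-location model (population points, candidate park-and-ride sites, CBDs) solved with a genetic algorithm. The sketch below is a loose, self-contained illustration of how such a GA could be set up, not the paper's formulation: it evolves binary site-selection vectors that minimize total travel time, and all data, names and GA parameters are invented for the example.

```python
# Hypothetical GA sketch: choose p park-and-ride sites from a candidate set so the
# total travel time from demand points to the nearest open site is minimized.
# The travel-time matrix and GA parameters are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_pop, n_sites, p = 50, 12, 3                        # demand points, candidate sites, sites to open
travel = rng.uniform(5, 60, size=(n_pop, n_sites))   # synthetic travel times (minutes)

def cost(mask):
    """Total travel time from every demand point to its nearest open site."""
    return travel[:, mask].min(axis=1).sum()

def random_solution():
    mask = np.zeros(n_sites, dtype=bool)
    mask[rng.choice(n_sites, p, replace=False)] = True
    return mask

def crossover(a, b):
    """Child opens p sites drawn from the union of its parents' open sites."""
    union = np.flatnonzero(a | b)
    child = np.zeros(n_sites, dtype=bool)
    child[rng.choice(union, p, replace=False)] = True
    return child

def mutate(mask):
    """Swap one open site for one currently closed site."""
    open_idx, closed_idx = np.flatnonzero(mask), np.flatnonzero(~mask)
    mask = mask.copy()
    mask[rng.choice(open_idx)] = False
    mask[rng.choice(closed_idx)] = True
    return mask

population = [random_solution() for _ in range(40)]
for _ in range(200):                                 # generations
    population.sort(key=cost)
    parents = population[:20]                        # truncation selection
    children = [mutate(crossover(parents[rng.integers(20)], parents[rng.integers(20)]))
                for _ in range(20)]
    population = parents + children

best = min(population, key=cost)
print("open sites:", np.flatnonzero(best), "total travel time:", round(cost(best), 1))
```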
57.
Caron JN, Namazi NM, Rollins CJ. Applied Optics, 2002, 41(32): 6884-6889.
A signal-processing algorithm has been developed in which a filter function is extracted from degraded data through mathematical operations. The filter function can then be used to restore much of the degraded content of the data through use of a deconvolution algorithm. This process can be performed without prior knowledge of the detection system, a technique known as blind deconvolution. The extraction process, designated the self-deconvolving data reconstruction algorithm, has been used successfully to restore digitized photographs, digitized acoustic waveforms, and other forms of data. The process is noniterative, computationally efficient, and requires little user input. Implementation is straightforward, allowing inclusion into many types of signal-processing software and hardware. The novelty of the invention is the application of a power law and smoothing function to the degraded data in frequency space. Two methods for determining the value of the power law are discussed. The first method assumes the power law is frequency dependent; the function is derived by comparing the frequency spectrum of the degraded data with the spectrum of a signal having the desired frequency response. The second method assumes this function is constant with respect to frequency. This approach requires little knowledge of the original data or the degradation.
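As a rough illustration of the general idea described here (building a restoration filter in frequency space from a smoothed, power-law-scaled spectrum estimate and then applying a regularised inverse), the Python sketch below assumes, for simplicity, a flat reference spectrum and an arbitrary power of 0.5. It is not the published algorithm, and the function names, smoothing choice and parameters are assumptions; real restorations would need a proper reference spectrum and tuning.

```python
# Hypothetical sketch of frequency-domain blind restoration: estimate a degradation
# response as a power of the smoothed magnitude spectrum (flat reference assumed),
# then apply a regularised inverse. Results on real data would be crude without tuning.
import numpy as np

def restore(signal, power=0.5, smooth_width=9, eps=1e-6):
    spectrum = np.fft.rfft(signal)
    mag = np.abs(spectrum)

    # Smooth the magnitude spectrum with a simple moving average.
    kernel = np.ones(smooth_width) / smooth_width
    mag_smooth = np.convolve(mag, kernel, mode="same")

    # Assumed degradation response: normalised smoothed spectrum raised to a power.
    degradation = (mag_smooth / (mag_smooth.max() + eps)) ** power
    inverse = degradation / (degradation ** 2 + eps)      # regularised inverse
    return np.fft.irfft(spectrum * inverse, n=len(signal))

# Toy usage: a step edge blurred by a 15-sample moving average.
x = np.zeros(256); x[128:] = 1.0
blurred = np.convolve(x, np.ones(15) / 15, mode="same")
restored = restore(blurred)
```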
58.
Hydraulic jumps in density currents are technically referred to as density jumps. These jumps significantly influence the dynamic and quality characteristics of gravity currents. In this study, the density jump is investigated theoretically and experimentally, taking bed roughness into account. Experiments were performed in a rectangular laboratory flume (0.4 m width; 0.9 m depth; 8.3 m length). Four rough beds comprised of closely packed gravel particles glued onto the horizontal part of the bed were examined. For both smooth and rough beds, a simple relationship was obtained for estimating the conjugate depth ratio as a function of the relative roughness and the upstream densimetric Froude number. The conjugate depth ratio was found to decrease with increasing relative roughness. The results also indicated that, if the entrainment ratio is specified, the minimum value of the upstream densimetric Froude number increases with increasing relative bed roughness. An equation for calculating the maximum possible value of the relative roughness was also determined. The spatial development of the density current for smooth beds was analysed in both super-critical and sub-critical flow regimes. Good similarity collapses of velocity and concentration profiles were obtained for the super-critical section just upstream of the jump. The concentration distributions located just downstream of the jump, however, exhibited a large scattering of measured data, especially near the bed. It was found that this scattering decreases with distance from the end of the jump. The results of the experimental runs also indicated that, at a distance of about nine times the post-jump current thickness from the end of the jump, the non-dimensional vertical profile of mean velocity has a shape similar to that at the pre-jump section. A new reliable relationship was also proposed for calculating the local velocity inside both the wall and jet regions.
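For orientation, the roughness-dependent conjugate-depth relation derived in the paper is not reproduced here; but in the idealised smooth-bed limit with negligible entrainment and friction, a density jump obeys the classical Bélanger-type relation with gravity replaced by the reduced gravity. The block below states that well-known baseline form, with the notation (h for layer thickness, U for layer velocity) chosen for illustration.

```latex
% Classical smooth-bed conjugate-depth relation for a jump with negligible
% entrainment and friction, written with the upstream densimetric Froude number.
% Notation is illustrative; the paper's roughness-corrected relation is not shown.
\[
  \frac{h_2}{h_1} \;=\; \frac{1}{2}\left(\sqrt{1 + 8\,\mathrm{Fr}_1^{2}} - 1\right),
  \qquad
  \mathrm{Fr}_1 = \frac{U_1}{\sqrt{g' h_1}}, \quad g' = g\,\frac{\Delta\rho}{\rho}.
\]
```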
59.
Time-to-event data have long been important in many applied fields. Many models and analysis methods have been developed for this type of data, in which each sample unit experiences at most a single end-of-life event. In contrast, many applications involve repeated events, where a subject or sampling unit experiences more than one event. There is growing interest in the analysis of recurrent events data, also called repeated events and recurrence data. This type of data arises in many fields. For example, the repair history of manufactured items can be modeled as recurrent events. In medical studies, the times of recurrent disease episodes in patients can also be modeled as recurrent events. In this paper we focus on medical applications (e.g. seizures, heart attacks, cancerous tumors, etc.). However, our proposed methodologies can be applied to other areas as well. For analyzing recurrence data, the first and perhaps most important step is to model the expected number of events, and sometimes this can be facilitated by modeling the cumulative intensity function or its derivative, the intensity rate function. One particular recurrent events scenario involves patients experiencing events according to a common intensity rate, and then a treatment may be applied. Assuming the treatment to be effective, the patients would be expected to follow a different intensity rate after receiving the treatment. Further, the treatment might be effective for a limited amount of time, so that a third rate would govern arrivals of the recurrent events after the effects of the treatment wore out. In this paper we model the intensity rate for such scenarios. In particular, we allow models for the intensity rate, post-treatment, to be piecewise constant. Two estimators of the location of this change are proposed. Properties of the estimators are discussed. An example is studied for illustrative purposes.
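As a concrete illustration of the three-phase, piecewise-constant intensity described above (pre-treatment, during the treatment effect, and after the effect wears off), one hedged way to write the rate is shown below, where τ₁ denotes the treatment time and τ₂ the unknown change point whose estimation the paper addresses. The notation is illustrative, not the authors'.

```latex
% Illustrative three-phase piecewise-constant intensity rate: lambda_1 before
% treatment at tau_1, lambda_2 while the treatment is effective, and lambda_3
% after its effect wears off at the (unknown) change point tau_2.
\[
  \lambda(t) \;=\;
  \begin{cases}
    \lambda_1, & 0 \le t < \tau_1, \\
    \lambda_2, & \tau_1 \le t < \tau_2, \\
    \lambda_3, & t \ge \tau_2,
  \end{cases}
  \qquad
  \Lambda(t) = \int_0^{t} \lambda(u)\,du .
\]
```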
60.
This paper presents an updated review of the petroleum prospectivity of Lebanon. It is based on a re-assessment of the tectono-stratigraphic succession in Lebanon, correlation with nearby countries and the results of a recent offshore seismic survey. A generalized model illustrating potential petroleum system(s) in Lebanon is presented with data on Palaeozoic, Mesozoic and Cenozoic plays. Major lithological units are described with respect to their source, reservoir and cap-rock potential. Based on a general review of previous studies and existing data, Lebanese exploration prospects may comprise on- and offshore as well as coastal (margin) targets. They include potential Triassic reservoirs in onshore central-northern Lebanon including those at the Qartaba structure. Offshore plays are discussed with reference to recent seismic profiles; potential offshore targets comprise Oligo-Miocene reservoirs sealed by Messinian evaporites as well as deeper Mesozoic reservoirs.