101.
Parameterization of the computational domain plays a role in isogeometric analysis analogous to that of mesh generation in finite element analysis. In this paper, we investigate this problem in the 2D case, i.e., how to parameterize a computational domain by a planar B-spline surface from the given CAD objects (four boundary planar B-spline curves). First, two kinds of sufficient conditions for injective B-spline parameterization are derived in terms of the control points. We then show how to find a good parameterization of the computational domain by solving a constrained optimization problem, in which the constraints are the sufficient conditions for injectivity of the planar B-spline parameterization, and the objective is the minimization of quadratic energy functions related to the first and second derivatives of the parameterization. The resulting parameterization has no self-intersections, and its isoparametric net has good uniformity and orthogonality. After introducing a posteriori error estimation for isogeometric analysis, we propose an r-refinement method that optimizes the parameterization by repositioning the inner control points such that the estimated error is minimized. Several examples of the isogeometric heat conduction problem are tested to show the effectiveness of the proposed methods and the impact of the parameterization on the quality of the approximate solution. Comparisons with known exact solutions are also presented.
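The idea of repositioning inner control points while keeping the boundary fixed can be illustrated with a toy sketch: a simple first-derivative (stretch) energy minimized by Jacobi iteration, plus a crude signed-area injectivity indicator on the control net. The paper's actual sufficient conditions and energy terms are richer; this is only a minimal illustration.

```python
import numpy as np

def smooth_control_net(P, iters=500):
    """Reposition the inner control points of a control net P (n x m x 2)
    by minimizing a discrete first-derivative (stretch) energy while
    keeping boundary points fixed. The minimizer of the sum of squared
    first differences places each inner point at the average of its
    four grid neighbours, so plain Jacobi iteration converges to it."""
    P = P.copy()
    for _ in range(iters):
        P[1:-1, 1:-1] = 0.25 * (P[:-2, 1:-1] + P[2:, 1:-1] +
                                P[1:-1, :-2] + P[1:-1, 2:])
    return P

def min_jacobian(P):
    """Crude injectivity indicator: minimum signed area of the quads
    of the control net. A positive value everywhere is a necessary
    hint that the control net does not fold over itself."""
    du = P[1:, :-1] - P[:-1, :-1]
    dv = P[:-1, 1:] - P[:-1, :-1]
    return np.min(du[..., 0] * dv[..., 1] - du[..., 1] * dv[..., 0])
```

Starting from a net with a badly displaced inner point, smoothing restores a fold-free interior without moving the prescribed boundary.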
102.
Resource management is an important consideration for applications that have different non-functional or operational requirements when running in distributed and heterogeneous environments. In this context, it is necessary to provide means to specify the required resource constraints and an infrastructure that can adapt the applications in light of changes in resource availability. We adopted a contract-based approach, called ZeliGrid, to describe and maintain parallel applications that have non-functional requirements in a computing grid context. To form the supporting infrastructure, we designed a software architecture that integrates some of the Globus services, LDAP, and the NWS monitoring services. Modules that map the contract approach onto software artifacts were also integrated into this architecture. This paper addresses the architecture and integration issues of our approach, as well as how we put the pieces together, highlighting deployment and implementation details that must account for diverse aspects such as monitoring, security, and dynamic reconfiguration. Copyright © 2010 John Wiley & Sons, Ltd.
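The core contract-maintenance loop can be pictured as comparing declared resource constraints against the latest monitoring snapshot and triggering reconfiguration when a constraint breaks. The contract format, metric names, and reconfiguration hook below are illustrative assumptions, not ZeliGrid's actual interfaces.

```python
def violated(contract, measured):
    """Return the names of resource constraints (minimum required
    values) not met by the latest monitoring snapshot, e.g. values
    reported by a monitoring service such as NWS."""
    return [name for name, minimum in contract.items()
            if measured.get(name, 0.0) < minimum]

def maintain(contract, measured, reconfigure):
    """One cycle of contract maintenance: if any constraint is
    violated, invoke the adaptation hook with the broken names."""
    broken = violated(contract, measured)
    if broken:
        reconfigure(broken)
    return broken
```

In a real deployment the `reconfigure` callback would stand in for the dynamic-reconfiguration machinery described in the paper.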
103.
Abstract— The TCO requirements provide well-known and recognized quality labels for displays. For these requirements to remain useful, they must be continuously reviewed and updated when necessary. The study described here was performed in response to the market trend of designing flat-panel displays and notebooks with glare panels. Its purpose was to investigate the subjective responses of office workers to display screens of different gloss levels while they worked on different tasks under normal office-lighting conditions. The study consisted of three parts: one in which the users set an acceptable reflection level, one in which they rated their disturbance on a category scale, and one in which the users' visual acuity was tested to determine whether it was affected by glare. The results show that increasing gloss and increasing luminance levels had negative effects on the acceptance of, and the disturbance caused by, reflections. There were statistically significant differences in acceptance and disturbance levels between screens with low gloss and screens with high gloss, which suggests that screens with the highest gloss levels should be avoided. The study did not show an effect on performance based on the acuity testing.
104.
In June 2003, a large-scale injection experiment started at the Continental Deep Drilling site (KTB) in Germany. A tiltmeter array was installed, consisting of five high-resolution ASKANIA-type borehole tiltmeters, each also equipped with a three-dimensional seismometer. Over the following 11 months, 86,000 m³ were injected into the KTB pilot borehole at a depth of 4000 m, at an average rate of approximately 200 l/min. The research objective was to observe and analyze the deformation caused by the injection into the upper crust on the kilometer scale. A new data acquisition system was developed by the GeoForschungsZentrum Potsdam (GFZ) to handle the expected huge amount of seismic and tilt data. Furthermore, it was necessary to develop new preprocessing software for long-period time series, called PREANALYSE, which includes useful functions such as step and spike correction, interpolation, filtering, and spectral analysis. This worldwide-unique installation offers an excellent opportunity to separate injection-induced signals from environmental effects by correlating the data of the five stations with the groundwater table and meteorological data.
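Spike correction of the kind mentioned for the preprocessing software can be sketched with a robust running-median filter: a sample is flagged as a spike when it deviates from the local median by many robust standard deviations and is then replaced by that median. This is a generic textbook despiking scheme, assumed here for illustration, not PREANALYSE's actual algorithm.

```python
import numpy as np

def remove_spikes(x, window=5, k=6.0):
    """Replace isolated spikes in a time series with the local median.
    A sample is a spike if it deviates from the running median by more
    than k robust standard deviations (1.4826 * MAD)."""
    x = np.asarray(x, float).copy()
    half = window // 2
    med = np.array([np.median(x[max(0, i - half):i + half + 1])
                    for i in range(len(x))])
    mad = np.median(np.abs(x - med)) + 1e-12  # guard against zero MAD
    spikes = np.abs(x - med) > k * 1.4826 * mad
    x[spikes] = med[spikes]
    return x
```

Applied to a smooth tilt-like signal with one large outlier, the filter restores the underlying sample while leaving the rest of the record essentially untouched.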
105.
Video microscopy is a widely applied diagnostic for investigating the structure and dynamics of particles in dusty plasmas. Reliable algorithms are required to accurately recover particle positions from the camera images. Here, four different particle-positioning techniques have been tested on artificial and experimental data from dusty-plasma situations. Two methods that rely on pixel-intensity thresholds were found to be strongly affected by pixel-locking errors and by noise. Two other methods, one applying spatial bandpass filters and the other fitting polynomials to the intensity pattern, yield subpixel resolution under various conditions. These two methods have been shown to be ideally suited to recovering particle positions even from the small-scale fluctuations that are used to derive the normal-mode spectra of finite dust clusters.
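The polynomial-fit idea can be reduced to its simplest one-dimensional form: fit a parabola through the three intensity samples around an integer maximum and read off the subpixel offset of the vertex. Applying this per image axis is one standard way to escape the pixel-locking bias of threshold methods; the paper's actual fitting scheme may differ.

```python
def subpixel_peak_1d(I, i):
    """Refine an integer intensity maximum at index i by fitting a
    parabola through the samples (i-1, i, i+1). Returns the subpixel
    offset of the vertex, which lies in [-0.5, 0.5] when i is a
    discrete maximum."""
    denom = I[i - 1] - 2.0 * I[i] + I[i + 1]
    if denom == 0:
        return 0.0  # flat neighbourhood: no refinement possible
    return 0.5 * (I[i - 1] - I[i + 1]) / denom
```

For a Gaussian intensity profile centred at a non-integer position, the refined estimate recovers the true centre to a few hundredths of a pixel.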
106.
This work addresses the soundtrack indexing of multimedia documents. Our purpose is to detect and locate sound units in order to structure the audio data flow of broadcast programs (reports). We present two audio classification tools that we have developed. The first, a speech/music classification tool, is based on three original features: entropy modulation, stationary segment duration (obtained with a Forward–Backward Divergence algorithm), and number of segments. These are merged with the classical 4 Hz modulation energy. The tool performs two classifications (speech/non-speech and music/non-music) and achieves more than 90% accuracy for speech detection and 89% for music detection. The second system, a jingle identification tool, uses a Euclidean distance in the spectral domain to index the audio data flow. Results show that it is efficient: of 132 jingles to recognize, we detected 130. The systems were tested on TV and radio corpora (more than 10 h). They are simple, robust, and can be used on any corpus without training or adaptation.
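The jingle-identification principle (a Euclidean distance in the spectral domain) can be sketched as sliding a jingle's spectral template over the stream's spectrogram and reporting offsets where the distance collapses. Frame sizes, hop, and threshold below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def spectral_frames(x, n_fft=64, hop=32):
    """Magnitude spectra of overlapping frames (a crude spectrogram)."""
    frames = [np.abs(np.fft.rfft(x[s:s + n_fft]))
              for s in range(0, len(x) - n_fft + 1, hop)]
    return np.array(frames)

def find_jingle(stream_spec, jingle_spec, threshold):
    """Slide the jingle's spectral template over the stream and return
    the frame offsets where the mean Euclidean distance between the
    aligned spectrogram blocks falls below the threshold."""
    m = len(jingle_spec)
    hits = []
    for t in range(len(stream_spec) - m + 1):
        d = np.linalg.norm(stream_spec[t:t + m] - jingle_spec) / m
        if d < threshold:
            hits.append(t)
    return hits
```

Embedding a known jingle into a quiet stream, the detector fires exactly at the frame offset where the jingle begins.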
107.
The exponential increase of subjective, user-generated content since the birth of the Social Web has led to the need for automatic text-processing systems able to extract, process, and present relevant knowledge. In this paper, we tackle the opinion retrieval, mining, and summarization task by proposing a unified framework composed of three crucial components (information retrieval, opinion mining, and text summarization) that allow the retrieval, classification, and summarization of subjective information. An extensive analysis is conducted in which different configurations of the framework are suggested and analyzed, in order to determine which is best, and under which conditions. The evaluation carried out and the results obtained show the appropriateness of the individual components, as well as of the framework as a whole. By achieving an improvement of over 10% compared with state-of-the-art approaches in the context of blogs, we conclude that subjective text can be dealt with efficiently by means of the proposed framework.
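The three-stage pipeline (retrieve, classify opinion, summarize) can be caricatured in a few lines: keyword retrieval, a lexicon-based polarity score, and an extractive "summary" that keeps the most strongly opinionated relevant sentence. The tiny lexicon and scoring below are illustrative assumptions and bear no resemblance in sophistication to the paper's components.

```python
# Hypothetical sentiment lexicon, for illustration only.
POS = {"good", "great", "excellent", "love"}
NEG = {"bad", "poor", "terrible", "hate"}

def polarity(sentence):
    """Lexicon-based opinion score: positive minus negative hits."""
    words = sentence.lower().split()
    return sum(w in POS for w in words) - sum(w in NEG for w in words)

def retrieve(docs, query):
    """Toy retrieval stage: keep documents mentioning the query term."""
    return [d for d in docs if query.lower() in d.lower()]

def summarize_opinions(docs, query):
    """Toy summarization stage: return the most strongly opinionated
    sentence among the retrieved documents."""
    sents = [s.strip() for d in retrieve(docs, query)
             for s in d.split(".") if s.strip()]
    return max(sents, key=lambda s: abs(polarity(s)), default="")
```

Even this caricature shows why the stages must cooperate: retrieval bounds what opinion mining sees, and the summary quality depends on both.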
108.
There are two main strategies for solving correspondence problems in computer vision: sparse local feature-based approaches and dense global energy-based methods. While sparse feature-based methods are often used for estimating the fundamental matrix by matching a small set of carefully optimised interest points, dense energy-based methods mark the state of the art in optical flow computation. The goal of our paper is to show that this separation into different application domains is unnecessary and can be bridged in a natural way. As a first contribution, we present a new application of dense optical flow to estimating the fundamental matrix. Comparing our results with those obtained by feature-based techniques, we identify cases in which dense methods have advantages over sparse approaches. Motivated by these promising results, we propose, as a second contribution, a new variational model that recovers the fundamental matrix and the optical flow simultaneously as the minimisers of a single energy functional. Our experiments show that this coupled approach is able to further improve the estimates of both the fundamental matrix and the optical flow. These results demonstrate that dense variational methods can be a serious alternative even in the classical application domains of sparse feature-based approaches.
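The "flow first, fundamental matrix second" direction can be sketched with the classical normalised 8-point algorithm fed by dense correspondences (p2 = p1 + flow). This is the standard linear estimator, not the paper's coupled variational model, and is shown only to make the bridged pipeline concrete.

```python
import numpy as np

def fundamental_from_flow(p1, p2):
    """Normalised 8-point estimate of the fundamental matrix from
    correspondences p1 -> p2 (N x 2 arrays), e.g. sampled from a
    dense optical flow field. Returns F with unit Frobenius norm."""
    def normalise(p):
        # translate to the centroid and scale mean distance to sqrt(2)
        c = p.mean(axis=0)
        s = np.sqrt(2.0) / np.mean(np.linalg.norm(p - c, axis=1))
        T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
        return np.c_[p, np.ones(len(p))] @ T.T, T
    x1, T1 = normalise(p1)
    x2, T2 = normalise(p2)
    # each correspondence contributes one row of the system A f = 0,
    # with A[n, 3i+j] = x2[n,i] * x1[n,j] matching f = F.reshape(9)
    A = np.einsum('ni,nj->nij', x2, x1).reshape(len(p1), 9)
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # enforce the rank-2 constraint, then undo the normalisation
    U, S, Vt = np.linalg.svd(F)
    F = T2.T @ (U @ np.diag([S[0], S[1], 0.0]) @ Vt) @ T1
    return F / np.linalg.norm(F)
```

On noise-free synthetic correspondences from two calibrated views, the epipolar residuals of the estimate vanish to numerical precision.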
109.
Tracking product trajectories poses major challenges for simulation generation and adaptation. Positioning techniques and technologies have become available and affordable enough to be incorporated deeply into workshop operations. We present our two-year effort to develop a general framework for location-aware manufacturing applications. We demonstrate the features of the proposed applications using a case study: a synthetic flexible manufacturing environment with a product-driven policy, which enables the generation of a location data stream of product trajectories over the whole plant. These location data are mined and processed to reproduce the manufacturing system's dynamics in an adaptive simulation scheme. This article proposes an original method for generating simulation models of discrete-event systems, using the product location data from the running system: the data stream of points (product ID, location, and time) is the starting point for the algorithm that generates a queuing-network simulation model.
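The model-generation step can be pictured as mining the (product ID, location, time) stream for two ingredients of a queuing network: routing counts between stations and mean sojourn times per station. This is an illustrative reconstruction of the idea, not the article's exact algorithm.

```python
from collections import defaultdict

def build_queueing_model(events):
    """Derive a queuing-network skeleton from a stream of
    (product_id, location, time) records: counts of transitions
    between locations, and the mean time spent at each location
    (entry-to-next-entry, i.e. queueing plus service)."""
    by_product = defaultdict(list)
    for pid, loc, t in sorted(events, key=lambda e: e[2]):
        by_product[pid].append((loc, t))
    routing = defaultdict(int)
    sojourn = defaultdict(list)
    for visits in by_product.values():
        for (a, ta), (b, tb) in zip(visits, visits[1:]):
            routing[(a, b)] += 1
            sojourn[a].append(tb - ta)
    mean_sojourn = {loc: sum(d) / len(d) for loc, d in sojourn.items()}
    return dict(routing), mean_sojourn
```

The two outputs are exactly what a discrete-event simulator needs to instantiate stations, route entities, and parameterize delays.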
110.
This paper deals with a variant of flowshop scheduling, namely the hybrid (or flexible) flowshop with sequence-dependent setup times. This type of flowshop is frequently used in the batch-production industry, and studying it helps reduce the gap between research and operational use. The scheduling problem is NP-hard, so solutions for large instances rely on non-exact methods. An improved genetic algorithm (GA), based on software-agent design and aimed at minimising the makespan, is presented. The paper proposes using an inherent characteristic of software agents to create a new perspective on GA design. To verify the developed metaheuristic, computational experiments are conducted on a well-known benchmark problem dataset. The experimental results show that the proposed metaheuristic outperforms some well-known methods and the state-of-the-art algorithms on the same benchmark dataset.
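As a baseline for what such a metaheuristic optimises, here is a plain permutation GA (order crossover plus swap mutation) minimising makespan on a simple permutation flowshop. It ignores the hybrid-machine structure and the sequence-dependent setup times for brevity, and is a generic textbook GA, not the agent-based variant proposed in the paper.

```python
import random

def makespan(perm, p):
    """Completion time of the last job on the last machine for a
    permutation flowshop with processing times p[job][machine]."""
    m = len(p[0])
    c = [0.0] * m
    for j in perm:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def ga_flowshop(p, pop_size=30, gens=200, seed=0):
    """Permutation GA: elitist selection, order crossover (a slice of
    one parent, remaining jobs in the other parent's order), and an
    occasional swap mutation."""
    rng = random.Random(seed)
    n = len(p)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: makespan(s, p))
        elite = pop[:pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut1, cut2 = sorted(rng.sample(range(n + 1), 2))
            child = [None] * n
            child[cut1:cut2] = a[cut1:cut2]
            rest = [g for g in b if g not in child]
            holes = [i for i in range(n) if child[i] is None]
            for i, g in zip(holes, rest):
                child[i] = g
            if rng.random() < 0.2:
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda s: makespan(s, p))
```

On a tiny instance the GA matches the optimum found by exhaustive enumeration, which is the kind of sanity check that precedes benchmarking on the standard datasets.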