Search results: 1,056 matches in total; entries 31–40 are shown below.
31.
Neto LG, Roberge D, Sheng Y. Applied Optics, 1995, 34(11): 1944–1950.
Commercial twisted nematic liquid-crystal televisions provide coupled phase and amplitude modulation. We propose a simple wedged shear-plate interferometer for in situ measurement of the phase modulation and the operating curve. For a given operating curve, the coupled-mode modulation holograms are designed with an iterative method. By rotating the polarizer and the analyzer we adjust the modulator until an optimal operating curve is obtained. The resulting phase-mostly holograms yield good-quality reconstructed images with the zero-order spot reduced to a minimum. Experimental results are shown.
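The iterative hologram design mentioned above can be illustrated, under assumptions, by a Gerchberg–Saxton-style loop in which every hologram pixel is constrained to the nearest point of the measured operating curve and the image plane is constrained to the target amplitude. This is a minimal sketch of one common way to realise such an iterative method, not the authors' exact algorithm; `curve` (the allowed coupled amplitude/phase values) and `target` (the desired image amplitude) are hypothetical inputs.

```python
import numpy as np

def design_hologram(target, curve, iterations=100, rng=None):
    """Iterative Fourier-transform design with an operating-curve constraint.

    target : 2-D array of desired image-plane amplitudes (hypothetical).
    curve  : 1-D complex array of transmittances allowed by the modulator.
    """
    rng = np.random.default_rng() if rng is None else rng
    field = np.exp(2j * np.pi * rng.random(target.shape))  # random start phase
    holo = field
    for _ in range(iterations):
        # Hologram-plane constraint: snap each pixel to the nearest
        # point on the operating curve.
        idx = np.argmin(np.abs(field[..., None] - curve), axis=-1)
        holo = curve[idx]
        # Image-plane constraint: keep the phase, impose the target amplitude.
        image = np.fft.fft2(holo)
        field = np.fft.ifft2(target * np.exp(1j * np.angle(image)))
    return holo
```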
32.
The recent trend toward reducing the thickness of metallic sheets used in forming processes strongly increases the likelihood of wrinkling. Thus, in order to obtain defect-free components, predicting this kind of defect becomes extremely important for tool design and for the selection of process parameters. In this study, the sheet metal forming process proposed as a benchmark at the Numisheet 2014 conference is selected to analyse the influence of the tool geometry on wrinkling behaviour, as well as the reliability of the developed numerical model. Side-wall wrinkling during the deep drawing of a cylindrical cup in AA5042 aluminium alloy is investigated through finite element simulation and experimental measurements. The plastic anisotropy of the material is modelled with an advanced yield criterion that goes beyond isotropic (von Mises) behaviour. The results show that the shape of the wrinkles predicted by the numerical model is strongly affected by the finite element mesh used in the blank discretization. Accurate modelling of the plastic anisotropy of the aluminium alloy yields numerical results that are in good agreement with the experiments, particularly regarding the shape and location of the wrinkles. The predicted punch force evolution is strongly influenced by the friction coefficient used in the model. Moreover, the two punch geometries produce drawn cups with different wrinkle waves, differing mainly in amplitude.
33.
The present work addresses the problem of structural damage identification built on the statistical inversion approach. Here, the damage state of the structure is continuously described by a cohesion parameter, which is spatially discretized by the finite element method. The inverse problem of damage identification is then posed as the determination of the posterior probability densities of the nodal cohesion parameters. The Markov chain Monte Carlo method, implemented with the Metropolis–Hastings algorithm, is used to approximate the posterior probabilities by drawing samples from the desired joint posterior probability density function. With this approach, prior information on the sought parameters can be used, and the uncertainty in the known values of the material properties can be quantified in the estimation of the cohesion parameters. The proposed approach is assessed by means of numerical simulations on a simply supported Euler–Bernoulli beam, with damage identification performed from time-domain response data. Different damage scenarios and noise levels are addressed, demonstrating the feasibility of the proposed approach.
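As a rough illustration of the sampling machinery described in this abstract, the sketch below implements a generic random-walk Metropolis–Hastings loop. The `log_likelihood` and `log_prior` callables are placeholders standing in for the finite-element response model and the prior on the nodal cohesion parameters, neither of which is specified here; this is an illustrative sketch, not the authors' implementation.

```python
import numpy as np

def metropolis_hastings(theta0, log_likelihood, log_prior,
                        n_samples=10_000, step=0.05, rng=None):
    """Random-walk Metropolis-Hastings over a parameter vector theta."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    logp = log_likelihood(theta) + log_prior(theta)
    samples = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        # Gaussian random-walk proposal around the current state.
        proposal = theta + step * rng.standard_normal(theta.size)
        logp_new = log_likelihood(proposal) + log_prior(proposal)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.uniform()) < logp_new - logp:
            theta, logp = proposal, logp_new
        samples[i] = theta
    return samples
```

The chain of retained samples approximates the joint posterior of the cohesion parameters, from which marginal densities and credible intervals can be estimated.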
34.
The widespread use of bicarbonate dialysate and reprocessed high-efficiency and "high-flux" dialyzers has raised concerns about the increased risk of reverse-transfer of dialysate contaminants into the blood compartment. To evaluate this concern, the reverse-transfer of bacterial products from contaminated bicarbonate dialysate into the blood compartment was compared during in vitro dialysis with new or reprocessed high-flux polysulfone dialyzers. In vitro dialysis was carried out at 37 °C using a counter-current recirculating-loop dialysis circuit with either new high-flux polysulfone dialyzers or dialyzers reprocessed once or 20 times with formaldehyde (0.75%) and bleach (<1%) in an automated system. Heparinized whole blood from healthy volunteers was circulated through the blood compartment, and bicarbonate dialysate was circulated in the dialysate compartment. The dialysate was challenged sequentially with 1:1000 and 1:100 dilutions of a sterile Pseudomonas aeruginosa culture supernatant (bacterial challenge). Samples were drawn from the blood and dialysate compartments 1 h after each challenge. Peripheral blood mononuclear cells (PBMC) were harvested by Ficoll-Hypaque separation from whole blood in the blood compartment, and a 5 × 10^6 PBMC/mL cell suspension was prepared. Likewise, dialysate samples (0.5 mL) were added to a 0.5 mL suspension of 5 × 10^6 PBMC/mL drawn at baseline. All PBMC suspensions were incubated upright in a humidified atmosphere at 37 °C with 5% CO2 for 24 h, and total interleukin-1α (IL-1α) and tumor necrosis factor-α (TNFα) production (cell-associated and secreted) was measured by radioimmunoassay. Eight experiments were performed for each arm of the study, with the same donor for each arm. One hour after contaminating the dialysate with the 1:1000 dilution of the bacterial challenge, IL-1α production by PBMC harvested from the blood compartment was 160 ± 0, 171 ± 11, and 270 ± 35 pg for new dialyzers, dialyzers reprocessed once, and dialyzers reprocessed 20 times, respectively (P = 0.004). One hour after challenging the dialysate with the 1:100 dilution, IL-1α production was 188 ± 20, 228 ± 35, and 427 ± 67 pg, respectively (P = 0.006). IL-1α production by PBMC from dialyzers reprocessed 20 times was significantly greater than from both new dialyzers and dialyzers reprocessed once; there were no significant differences between new dialyzers and dialyzers reprocessed once. Similarly, after the 1:1000 challenge, TNFα production was 160 ± 0, 160 ± 0, and 213 ± 22 pg, respectively (P = 0.008), and after the 1:100 challenge it was 168 ± 8, 188 ± 20, and 225 ± 32 pg, respectively (P = 0.20). These results demonstrate that reprocessing of high-flux polysulfone dialyzers with bleach increases the risk of reverse-transfer of bacterial products from contaminated dialysate, and this risk appears to increase with the number of reuses. Consequently, units that reprocess membranes with bleach and have suboptimal water quality might subject their patients to a higher risk of cytokine-related morbidity.
35.
Constrained linear regression models for symbolic interval-valued variables
This paper introduces an approach to fitting a constrained linear regression model to interval-valued data. Each example in the learning set is described by a feature vector in which each feature value is an interval. The new approach fits constrained linear regression models to the midpoints and ranges of the interval values assumed by the variables in the learning set. The lower and upper boundaries of the interval value of the dependent variable are then predicted from its midpoint and range, which are estimated by applying the fitted linear regression models to the midpoint and range of each interval value of the independent variables. This method shows the importance of range information for prediction performance, as well as the use of inequality constraints to ensure mathematical coherence between the predicted lower and upper boundaries of the interval. The authors also propose an expression for a goodness-of-fit measure, termed the determination coefficient. The proposed prediction method is assessed by estimating the average behaviour of the root-mean-square error and of the square of the correlation coefficient in a Monte Carlo experiment with different data set configurations. Among other aspects, the synthetic data sets take into account the dependence, or lack thereof, between the midpoint and range of the intervals. The bias produced by the use of inequality constraints on the vector of parameters is also examined in terms of the mean-square error of the parameter estimates. Finally, the approaches proposed in this paper are applied to a real data set and their performances are compared.
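The center-and-range idea can be sketched as follows, assuming the inequality-constrained range model is approximated with non-negative least squares so that the predicted range stays non-negative and the predicted lower bound never exceeds the upper bound. This is an illustrative stand-in for the paper's constrained formulation; the interval-valued arrays `X_low`, `X_up`, `y_low`, `y_up` are hypothetical.

```python
import numpy as np
from scipy.optimize import nnls

def fit_interval_regression(X_low, X_up, y_low, y_up):
    """Fit one model on interval midpoints and one on interval ranges."""
    Xc, Xr = (X_low + X_up) / 2.0, X_up - X_low          # midpoints, ranges
    yc, yr = (y_low + y_up) / 2.0, y_up - y_low
    Xc1 = np.column_stack([np.ones(len(Xc)), Xc])        # add intercept column
    Xr1 = np.column_stack([np.ones(len(Xr)), Xr])
    beta_c, *_ = np.linalg.lstsq(Xc1, yc, rcond=None)    # unconstrained OLS
    beta_r, _ = nnls(Xr1, yr)                            # coefficients >= 0
    return beta_c, beta_r

def predict_interval(beta_c, beta_r, X_low, X_up):
    """Predict the lower and upper bounds of the dependent interval."""
    Xc, Xr = (X_low + X_up) / 2.0, X_up - X_low
    yc = np.column_stack([np.ones(len(Xc)), Xc]) @ beta_c
    yr = np.column_stack([np.ones(len(Xr)), Xr]) @ beta_r  # non-negative range
    return yc - yr / 2.0, yc + yr / 2.0
```

Because the predictor ranges and the fitted range coefficients are both non-negative, the predicted range is non-negative, which is the coherence property the paper's inequality constraints are designed to guarantee.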
36.
This paper deals with the scheduling problem of minimizing the makespan in a permutation flowshop environment with the possibility of outsourcing certain jobs. It addresses this problem through the development of an ant colony optimization (ACO)-based algorithm. The new algorithm, here named flowshop ant colony optimization, combines two ACO heuristics. The results show that this new approach can solve the problem efficiently and within a short computational time.
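For reference, the objective minimized by the ACO heuristics, the makespan of a job permutation in an m-machine permutation flowshop, can be computed with the short sketch below; the processing-time matrix `proc` is hypothetical, and the outsourcing decision discussed in the paper is not modelled.

```python
def makespan(permutation, proc):
    """Makespan of processing jobs in `permutation` order.

    proc[j][k] is the processing time of job j on machine k.
    """
    m = len(proc[0])
    completion = [0.0] * m                      # completion time per machine
    for job in permutation:
        for k in range(m):
            ready = completion[k - 1] if k > 0 else 0.0
            completion[k] = max(completion[k], ready) + proc[job][k]
    return completion[-1]

# Example: 3 jobs on 2 machines.
proc = [[3, 2], [1, 4], [2, 2]]
print(makespan([1, 0, 2], proc))                # -> 9
```

An ACO heuristic would repeatedly build permutations guided by pheromone trails, evaluate them with a function like this one, and reinforce the trails of the best schedules found.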
37.

Context: In software development, testing is an important mechanism both to identify defects and to assure that completed products work as specified. This is common practice in single-system development, and it continues to hold for Software Product Lines (SPL). Even though extensive research has been done in the SPL testing field, it is necessary to assess the current state of research and practice in order to provide practitioners with evidence that enables fostering its further development.

Objective: This paper focuses on testing in SPL and has the following goals: investigate state-of-the-art testing practices, synthesize the available evidence, and identify gaps between required techniques and the existing approaches available in the literature.

Method: A systematic mapping study was conducted with a set of nine research questions, in which 120 studies, dated from 1993 to 2009, were evaluated.

Results: Although several aspects of testing are covered by single-system development approaches, many cannot be directly applied in the SPL context due to specific issues. In addition, particular SPL aspects are not covered by the existing SPL approaches, and when they are covered, the literature gives only brief overviews. This scenario indicates that additional investigation, both empirical and practical, should be performed.

Conclusion: The results help to clarify the needs in SPL testing by identifying points that still require additional investigation, since important aspects concerning particular points of software product lines have not yet been addressed.
38.
39.
In this paper, we propose a multiresolution approach for surface reconstruction from clouds of unorganized points representing an object surface in 3-D space. The proposed method uses a set of mesh operators and simple rules for selective mesh refinement, with a strategy based on Kohonen's self-organizing map (SOM). Basically, a self-adaptive scheme iteratively moves the vertices of an initial simple mesh toward the set of points, ideally the object boundary. Successive refinement and motion of vertices lead to a more detailed surface in a multiresolution, iterative scheme. Reconstruction was tested on several point sets with different shapes and sizes. The results show generated meshes very close to the final object shapes. We include performance measures and discuss robustness.
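A toy version of the SOM-style vertex motion described above is sketched below: each sample point attracts its best-matching mesh vertex and, more weakly, that vertex's neighbours, pulling the mesh toward the point cloud. The mesh connectivity (`neighbours`, an adjacency list) and the point cloud (`points`) are assumed inputs, and the selective refinement rules of the paper are omitted.

```python
import numpy as np

def som_fit(vertices, neighbours, points, epochs=50, lr=0.2, nbr_lr=0.05,
            rng=None):
    """Move mesh vertices toward a point cloud with SOM-style updates."""
    rng = np.random.default_rng() if rng is None else rng
    V = np.asarray(vertices, dtype=float).copy()
    P = np.asarray(points, dtype=float)
    for _ in range(epochs):
        for p in P[rng.permutation(len(P))]:     # visit points in random order
            winner = np.argmin(np.linalg.norm(V - p, axis=1))   # best match
            V[winner] += lr * (p - V[winner])                    # move winner
            for j in neighbours[winner]:                         # neighbours follow
                V[j] += nbr_lr * (p - V[j])
    return V
```

In the full multiresolution scheme, passes like this alternate with selective refinement of the mesh regions that are still far from the data.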
40.
The development of complex information systems calls for conceptual models that describe aspects beyond entities and activities. In particular, recent research has pointed out that conceptual models need to model goals in order to capture the intentions that underlie complex situations within an organisational context. This paper focuses on one class of goals, namely non-functional requirements (NFR), which need to be captured and analysed from the very early phases of the software development process. The paper presents a framework for integrating NFRs into the ER and OO models. This framework has been validated by two case studies, one of which is very large. The results of the case studies suggest that goal modelling during the early phases can lead to a more productive and complete modelling activity.