Full-text access type
Paid full text | 563 articles |
Free | 30 articles |
Subject classification
Electrical engineering | 2 articles |
Chemical industry | 166 articles |
Metalworking | 7 articles |
Machinery and instruments | 10 articles |
Architecture and building science | 86 articles |
Energy and power engineering | 7 articles |
Light industry | 39 articles |
Water conservancy engineering | 2 articles |
Petroleum and natural gas | 2 articles |
Radio electronics | 18 articles |
General industrial technology | 127 articles |
Metallurgical industry | 30 articles |
Automation technology | 97 articles |
Publication year
2024 | 1 article |
2023 | 5 articles |
2022 | 4 articles |
2021 | 15 articles |
2020 | 6 articles |
2019 | 16 articles |
2018 | 14 articles |
2017 | 10 articles |
2016 | 13 articles |
2015 | 29 articles |
2014 | 22 articles |
2013 | 40 articles |
2012 | 34 articles |
2011 | 37 articles |
2010 | 20 articles |
2009 | 26 articles |
2008 | 37 articles |
2007 | 28 articles |
2006 | 29 articles |
2005 | 39 articles |
2004 | 17 articles |
2003 | 14 articles |
2002 | 13 articles |
2001 | 10 articles |
2000 | 10 articles |
1999 | 9 articles |
1998 | 14 articles |
1997 | 12 articles |
1996 | 5 articles |
1995 | 8 articles |
1994 | 5 articles |
1993 | 10 articles |
1992 | 7 articles |
1991 | 2 articles |
1990 | 3 articles |
1989 | 3 articles |
1988 | 4 articles |
1987 | 4 articles |
1986 | 3 articles |
1985 | 1 article |
1984 | 2 articles |
1983 | 1 article |
1982 | 3 articles |
1980 | 1 article |
1979 | 3 articles |
1973 | 1 article |
1972 | 1 article |
1971 | 1 article |
1967 | 1 article |
Sort order: 593 results in total, search took 328 ms
21.
Schnitzler H, Fröhlich U, Boley TK, Clemen AE, Mlynek J, Peters A, Schiller S. Applied Optics, 2002, 41(33): 7000-7005
We present a novel approach for the generation of highly frequency-stable, widely tunable, single-frequency cw UV light that is suitable for high-resolution spectroscopy. Sum-frequency generation (SFG) of two solid-state sources with a single cavity resonant for both fundamental waves is employed. Using a highly stable, narrow-linewidth frequency-doubled cw Nd:YAG laser as a master laser and slaving to it the SFG cavity and the other fundamental wave from a Ti:sapphire laser, we generate UV radiation of 33-mW output power around 313 nm. Alternatively, we use a diode laser instead of the Ti:sapphire laser and produce an output power of 2.1 mW at 313 nm. With both setups we obtain a continuous tunability of >15 GHz, short-term frequency fluctuations in the submegahertz range, a long-term frequency drift below 100 MHz/h, and stable operation for several hours. The theory of optimized doubly resonant SFG is also given.
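The sum-frequency arithmetic behind the 313 nm output can be checked with a minimal sketch. The 532 nm input is the frequency-doubled Nd:YAG wavelength; the ~760 nm Ti:sapphire wavelength is an assumed value chosen to reproduce the quoted output, not a figure taken from the paper.

```python
# Sum-frequency generation combines two photon energies, so the output
# wavelength obeys 1/lambda_SFG = 1/lambda_1 + 1/lambda_2.

def sfg_wavelength(lambda1_nm: float, lambda2_nm: float) -> float:
    """Output wavelength of sum-frequency generation from two fundamentals."""
    return 1.0 / (1.0 / lambda1_nm + 1.0 / lambda2_nm)

# 532 nm (frequency-doubled Nd:YAG) + ~760 nm (assumed Ti:sapphire setting)
print(round(sfg_wavelength(532.0, 760.0), 1))  # ≈ 312.9 nm, i.e. "around 313 nm"
```

Tuning the Ti:sapphire fundamental shifts the UV output accordingly, which is consistent with the wide tunability the abstract reports.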
22.
Prof. Dr. Hans Ulrich Buhl, Dr. Bernd Heinrich, Prof. Dr. Peter Loos, Prof. Dr. Ulrich Frank, Visiting Prof. Daniel L. Moody PhD, Prof. Jeffrey Parsons PhD, Prof. Dr. Michael Rosemann, Prof. Dr. Elmar J. Sinz, Prof. Ron Weber PhD, Achim Kindler, Prof. Dr. Claus Rautenstrauch, Dipl.-Wirt.-Inf. Peter Fettke. WIRTSCHAFTSINFORMATIK, 2005, 47(2): 152-161
23.
Alin Achim, Ercan E. Kuruoğlu, Josiane Zerubia. IEEE Transactions on Image Processing, 2006, 15(9): 2686-2693
Synthetic aperture radar (SAR) images are inherently affected by a signal-dependent noise known as speckle, which is due to the radar wave coherence. In this paper, we propose a novel adaptive despeckling filter and derive a maximum a posteriori (MAP) estimator for the radar cross section (RCS). We first employ a logarithmic transformation to change the multiplicative speckle into additive noise. We model the RCS using the recently introduced heavy-tailed Rayleigh density function, which was derived based on the assumption that the real and imaginary parts of the received complex signal are best described using the alpha-stable family of distributions. We estimate model parameters from noisy observations by means of second-kind statistics theory, which relies on the Mellin transform. Finally, we compare the proposed algorithm with several classical speckle filters applied on actual SAR images. Experimental results show that the homomorphic MAP filter based on the heavy-tailed Rayleigh prior for the RCS is among the best for speckle removal.
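The homomorphic step described in this abstract is simple to sketch: a log transform turns the multiplicative speckle model into an additive one, where additive-noise estimators such as MAP apply. The gamma-distributed speckle and uniform scene values below are illustrative stand-ins, not the heavy-tailed Rayleigh model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(1.0, 10.0, size=1000)            # underlying radar cross section
n = rng.gamma(shape=4.0, scale=0.25, size=1000)  # unit-mean multiplicative speckle
y = x * n                                        # observed speckled intensity

# Log transform: multiplicative model y = x * n becomes additive
# log y = log x + log n, so filters designed for additive noise apply.
log_y = np.log(y)
assert np.allclose(log_y, np.log(x) + np.log(n))
```

After despeckling in the log domain, the estimate is mapped back by exponentiation, which is the usual closing step of a homomorphic filter.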
24.
Breunig M, Lungwitz U, Klar J, Kurtz A, Blunk T, Goepferich A. Journal of Nanoscience and Nanotechnology, 2004, 4(5): 512-520
For non-viral gene delivery, the carriers for DNA transfer into cells must be vastly improved. The branched cationic polymer polyethylenimine has been described as an efficient gene carrier. However, polyethylenimine was demonstrated to mediate substantial cytotoxicity. Therefore, this study is aimed at investigating per-N-methylated polyethylenimine, which is thought to have a much lower cytotoxicity due to its lower charge density. Results from a gel retardation assay and laser light scattering indicated that per-N-methylated polyethylenimine condenses DNA into small and compact nanoparticles with a mean diameter <150 nm. Furthermore, polyplexes of polyethylenimine and per-N-methylated polyethylenimine with DNA had a positive zeta potential, and the polymers protected DNA from nuclease-mediated digestion. The transfection efficiency of polyethylenimine and per-N-methylated polyethylenimine was tested in CHO-K1 cells. Using green fluorescent protein as reporter gene and flow cytometry analysis, we demonstrated that per-N-methylated polyethylenimine has a lower cytotoxicity, but also a significantly lower transfection efficiency. Using propidium iodide staining, we could additionally distinguish between viable and dead cells. At NP ≥ 12, per-N-methylated polyethylenimine showed a much higher cell viability, and the ratio of viable and transfected cells to dead and transfected cells was about 1.5- to 1.7-fold higher than for polyethylenimine. The results of cell viability from flow cytometry analysis were confirmed by the MTS assay. Using the luciferase reporter gene for transfection experiments, the gene expression of per-N-methylated polyethylenimine was lower at NP 6, 12 and 18 as compared to polyethylenimine, but at NP 24 it yielded similar levels.
25.
The race for creating an automated patch clamp has begun. Here, we present a novel technology to produce true gigaseals and whole cell preparations at a high rate. Suspended cells are flushed toward the tip of glass micropipettes. Seal, whole-cell break-in, and pipette/liquid handling are fully automated. Extremely stable seals and access resistance guarantee high recording quality. Data obtained from different cell types sealed inside pipettes show long-term stability, voltage clamp and seal quality, as well as block by compounds in the pM range. A flexible array of independent electrode positions minimizes consumables consumption at maximal throughput. Pulled micropipettes guarantee a proven gigaseal substrate with ultra clean and smooth surface at low cost.
26.
Michel A, Junger A, Benson M, Brammen DG, Hempelmann G, Dudeck J, Marquardt K. Computer Methods and Programs in Biomedicine, 2003, 70(1): 71-79
The major intent of this article was to describe the design principles of the drug-therapy documentation module of the Patient Data Management System (PDMS) ICUData, which has been in routine use at the intensive care unit (ICU) of the Department of Anesthesiology and Intensive Care Medicine at the University Hospital of Giessen, Germany, since February 1999. The new drug management system has been in routine use since March 2000. By 8 January 2001, 1140 patients had been documented using this approach. It could be demonstrated that it was possible to transform the formerly unstructured text-based documentation into a detailed and structured model. One direct benefit is the automatic calculation of fluid balance. Further, detailed statistical analyses of therapeutic behavior in drug administration are now possible.
27.
The long-term variability of the fetal heart rate (FHR) provides valuable information on fetal health status. Routine clinical FHR measurements are usually carried out by means of ultrasound cardiography. Although frequent FHR monitoring is advisable, high-quality ultrasound devices are too expensive to be available for home-care use. Passive and fully non-invasive acoustic recording, called phonocardiography, provides an alternative low-cost measurement method. Unfortunately, the acoustic signal recorded on the maternal abdominal surface is heavily contaminated by noise, so determining the FHR raises serious signal-processing issues. An accurate and robust fetal phonocardiograph has long been a research goal. This paper presents a novel two-channel phonocardiographic device and an advanced signal processing method for determining the FHR. The developed system achieved 83% accuracy compared to simultaneously recorded reference ultrasound measurements.
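A common final step in FHR extraction, regardless of the detection front end, is converting detected heartbeat times into beats per minute. The sketch below uses idealized beat times for a synthetic fetus at an assumed 140 bpm; it is not the authors' signal-processing method, only the interval-to-rate conversion.

```python
import numpy as np

beat_period = 60.0 / 140.0                      # synthetic fetus at 140 bpm
peak_times = np.arange(0.2, 10.0, beat_period)  # idealized detected beat times (s)

intervals = np.diff(peak_times)                 # inter-beat intervals in seconds
fhr_bpm = 60.0 / intervals.mean()               # mean interval -> beats per minute
print(round(fhr_bpm))                           # → 140
```

With real phonocardiogram data the peak times would come from a noise-robust detector, which is where the hard signal-processing work described in the abstract lies.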
28.
Aggregate architectures are full-scale spatial formations made from loose granular matter. Especially if the individual grain is custom-designed, the range of behaviours can be calibrated to match a wide range of architectural and structural performance criteria: the aggregate becomes programmable matter. The relevance of loose granular systems for architecture lies, on the one hand, in their rapid re-configurability, allowing a system not to be destroyed but rather recycled. On the other hand, aggregates per se can be functionally graded, either within one and the same particle type or by mixing different particle geometries. This enables the variation of architectural properties throughout one and the same material system, which is one of the core postulates of current architectural design research. However, very few examples of designed granular matter in architecture exist. The results presented here are thus one of the first coherent bodies of comprehensive research in this field, compiled over a period of five years. Methodologically, aggregate systems challenge conventional architectural design principles: whereas an architect generally defines the local and global geometry of a structure precisely, in a designed granular system one can only calibrate the particle geometry in order to tune the overall behaviour of the aggregate formation. Thus new design methods have been developed throughout the research projects, informed by the related fields of granular physics and behaviour-based robotics. In this context, the article provides an introduction to both designed particle systems and suitable fabrication approaches in an architectural context. Case study projects serve to verify the applicability of the concepts introduced. The research findings are discussed with regard to their practical, methodological and design-theoretical contributions. To conclude, further directions of research are highlighted.
29.
Codifying expert domain knowledge is a difficult and expensive task. To evaluate the quality of the outcome, the same domain expert, or a colleague of similar expertise, is often relied on to evaluate the knowledge-based system directly, or indirectly by preparing appropriate test data. During an incremental knowledge acquisition process, a data stream is available, and the knowledge base is observed and amended by an expert each time it produces an error. Using the record kept of the system's performance, we propose an evaluation process to estimate its effectiveness as it evolves. We instantiate this process for an incremental knowledge acquisition methodology, Ripple Down Rules, and estimate the value added by each knowledge base update. Using these values, decision makers in the organisation employing the knowledge-based information system can apply a cost-benefit analysis to the continuation of the incremental knowledge acquisition process. They can then determine when this process, which involves keeping an expert on-line, should be terminated. As a result, the expert is not kept on-line longer than is absolutely necessary. Hence, a major expense in deploying the information system, the cost of keeping a domain expert on-line, is reduced.
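The evaluation idea above, estimating effectiveness from the error log that incremental knowledge acquisition naturally produces, can be sketched minimally. The error positions below are invented for illustration and are not data from the paper; they simply mimic the typical pattern where errors cluster early, while the knowledge base is small, and thin out as rules are added.

```python
def running_accuracy(n_cases: int, error_positions: list[int]) -> float:
    """Fraction of the first n_cases streamed cases the knowledge base
    handled correctly, computed from the recorded error positions."""
    errors = sum(1 for p in error_positions if p <= n_cases)
    return (n_cases - errors) / n_cases

# Hypothetical performance log: case indices at which the expert had to
# amend the knowledge base (dense early, sparse later).
error_log = [1, 2, 3, 5, 8, 20, 55, 150]

print(running_accuracy(10, error_log))   # early in the stream: 0.5
print(running_accuracy(200, error_log))  # after the KB has matured: 0.96
```

When the marginal accuracy gain per update falls below the cost of keeping the expert on-line, a cost-benefit analysis of the kind the abstract describes would suggest terminating the acquisition process.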
30.