Full-text access type | Articles
Paid full text | 4487
Free | 204
Free (within China) | 10
Subject classification | Articles
Electrical engineering | 54
General | 3
Chemical industry | 951
Metalworking | 86
Machinery and instruments | 67
Building science | 268
Mining engineering | 4
Energy and power engineering | 196
Light industry | 471
Hydraulic engineering | 30
Petroleum and natural gas | 9
Radio and electronics | 384
General industrial technology | 824
Metallurgy | 425
Nuclear technology | 32
Automation technology | 897
Publication year | Articles
2023 | 44
2022 | 66
2021 | 124
2020 | 83
2019 | 99
2018 | 109
2017 | 101
2016 | 138
2015 | 125
2014 | 159
2013 | 284
2012 | 287
2011 | 342
2010 | 296
2009 | 266
2008 | 276
2007 | 238
2006 | 206
2005 | 178
2004 | 155
2003 | 136
2002 | 120
2001 | 59
2000 | 75
1999 | 53
1998 | 66
1997 | 54
1996 | 56
1995 | 43
1994 | 51
1993 | 35
1992 | 27
1991 | 27
1990 | 23
1989 | 31
1988 | 17
1987 | 26
1986 | 21
1985 | 21
1984 | 22
1983 | 19
1982 | 23
1981 | 16
1980 | 14
1979 | 18
1978 | 9
1977 | 15
1976 | 10
1974 | 7
1971 | 6
A total of 4,701 results were found (search time: 15 ms).
1.
Jean‐Marc Aldric, Philippe Thonart 《Journal of chemical technology and biotechnology (Oxford, Oxfordshire : 1986)》2008,83(10):1401-1408
BACKGROUND: In the framework of biological processes used for waste gas treatment, the impact of the inoculum size on the start‐up performance needs to be better evaluated. Moreover, only a few studies have investigated the behaviour of elimination capacity and biomass viability in a two‐phase partitioning bioreactor (TPPB) used for waste gas treatment. Lastly, the impact of ethanol as a co‐substrate remains poorly understood. RESULTS: Firstly, no benefit of inoculation with a high cellular density (>1.5 g L⁻¹) was observed in terms of start‐up performance. Secondly, the TPPB was monitored for 38 days to characterise its behaviour under several operational conditions. The removal efficiency remained above 63% for an inlet concentration of 7 g isopropylbenzene (IPB) m⁻³ and at some time points reached 92% during an intermittent loading phase (10 h day⁻¹), corresponding to a mean elimination capacity of 4 × 10⁻³ g L⁻¹ min⁻¹ (240 g m⁻³ h⁻¹) for a mean IPB inlet load of 6.19 × 10⁻³ g L⁻¹ min⁻¹ (390 g m⁻³ h⁻¹). Under continuous IPB loading, the performance of the TPPB declined, but the period of biomass acclimatisation to this operational condition was shorter than 5 days. The biomass grew to approximately 10 g L⁻¹ but the cellular viability changed greatly during the experiment, suggesting an endorespiration phenomenon in the bioreactor. It was also shown that simultaneous degradation of IPB and ethanol occurred, suggesting that ethanol improves the biodegradation process without causing oxygen depletion. CONCLUSION: A water/silicone oil TPPB with ethanol as co‐substrate allowed the removal of a high inlet load of IPB during an experiment lasting 38 days. Copyright © 2008 Society of Chemical Industry
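The two figures quoted for the mean elimination capacity differ only by a unit conversion; as a quick check (added here, not taken from the source), converting g L⁻¹ min⁻¹ to g m⁻³ h⁻¹ multiplies by 1000 L m⁻³ and 60 min h⁻¹:

\[
4 \times 10^{-3}\ \mathrm{g\,L^{-1}\,min^{-1}} \times 1000\ \mathrm{L\,m^{-3}} \times 60\ \mathrm{min\,h^{-1}} = 240\ \mathrm{g\,m^{-3}\,h^{-1}}.
\]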
2.
Huseyin Selcuk, Jeosadaque J Sene, Marc A Anderson 《Journal of chemical technology and biotechnology (Oxford, Oxfordshire : 1986)》2003,78(9):979-984
This study addresses the removal of humic acid (HA) dissolved in an aqueous medium by a photoelectrocatalytic process. UV254 removal and the degradation of color (Vis400) followed pseudo‐first order kinetics. Rate constants were 1.1 × 10⁻¹ min⁻¹, 8.3 × 10⁻² min⁻¹ and 2.49 × 10⁻² min⁻¹ (R² > 0.97) for UV254 degradation and 1.7 × 10⁻¹ min⁻¹, 6.5 × 10⁻² min⁻¹ and 2.0 × 10⁻² min⁻¹ for color removal from 5 mg dm⁻³, 10 mg dm⁻³ and 25 mg dm⁻³ HA respectively. Following a 2 h irradiation time, 96% of the color, 98% of the humic acid and 85% of the total organic carbon (TOC) was removed from an initial 25 mg dm⁻³ HA solution in the photoanode cell. Photocatalytic removal on the same photoanode was also studied in order to compare the two methods of degradation. Results showed that the photoelectrocatalytic method was much more effective than the photocatalytic method, especially at high pH values and with respect to UV254 removal. The effect of other important reaction variables, e.g. pH, external potential and electrolyte concentration, on the photoelectrocatalytic HA degradation was also studied. Copyright © 2003 Society of Chemical Industry
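Pseudo-first order kinetics imply exponential decay, C(t) = C₀e^(−kt). As a rough consistency check (added here, not from the source), the UV254 rate constant reported for the 25 mg dm⁻³ solution predicts a removal fraction after 2 h of irradiation of

\[
1 - e^{-kt} = 1 - e^{-(2.49 \times 10^{-2}\ \mathrm{min^{-1}})(120\ \mathrm{min})} \approx 0.95,
\]

which is of the same order as the reported 96–98% removal of color and humic acid.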
3.
A family of energy/economic/environmental (E3) models is presented as a mechanism for analysing national policy issues. The family consists of discrete models which are designed to be run in an integrated manner. The outputs of certain models provide the inputs to the next. This structure allows the analyst to readily incorporate an understanding of regional factors such as local energy prices, concerns over air quality, water availability, or attitudes towards construction of new energy facilities, into national assessments of energy policies. This paper reviews the analytic framework within which energy policy issues are currently addressed. The initial family of E3 models is described with the emphasis on the data linkages and feedback which are provided when these models are run sequentially. The ongoing MITRE research programme with the E3 family of models is presented and plans and opportunities for future work are outlined.
4.
This article presents an autonomous guide agent that can observe a community of learners on the web, interpret the learners' inputs, and then assess their sharing. The goal of this agent is to find a reliable helper (tutor or other learner) to assist a learner in solving his task. Despite the growing number of Internet users, the ability to find helpers is still a challenging and important problem. Although helpers could have much useful information about courses to be taught, many learners fail to understand their presentations. For that, the agent must be able to deal autonomously with the following challenges: Do helpers have information that the learners need? Will helpers present information that learners can understand? And can we guarantee that these helpers will collaborate effectively with learners? We have developed a new filtering framework, called a pyramid collaborative filtering model, to whittle the number of helpers down to just one. We have proposed four levels for the pyramid. Moving from one level to another depends on three filtering techniques: domain model filtering, user model filtering, and credibility model filtering. Filtering according to helpers' credibility is a new technique. Our experiments show that this method greatly improves filtering effectiveness. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 1065–1082, 2007.
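The pyramid model described above amounts to a sequence of increasingly selective filters. The following minimal Python sketch illustrates that idea only; the function names (pyramid_filter, domain_match, user_match, credibility_score) are hypothetical placeholders, not the authors' implementation.

```python
def pyramid_filter(learner, helpers, domain_match, user_match, credibility_score):
    """Narrow a pool of candidate helpers down to a single recommendation
    by applying three successive filters, pyramid-style: each level keeps
    only the candidates that pass, so the pool shrinks level by level."""
    # Level 1 -> 2: keep helpers whose domain knowledge covers the learner's task.
    level2 = [h for h in helpers if domain_match(h, learner)]
    # Level 2 -> 3: keep helpers whose presentation fits the learner's profile.
    level3 = [h for h in level2 if user_match(h, learner)]
    # Level 3 -> 4: rank the remaining helpers by credibility and keep the best one.
    return max(level3, key=credibility_score, default=None)
```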
5.
6.
Iris Reinbacher, Marc Benkert, Marc van Kreveld, Joseph S. B. Mitchell, Jack Snoeyink, Alexander Wolff 《Algorithmica》2008,50(3):386-414
In geographic information retrieval, queries often name geographic regions that do not have a well-defined boundary, such as “Southern France.” We provide two algorithmic approaches to the problem of computing reasonable boundaries of such regions based on data points that have evidence indicating that they lie either inside or outside the region. Our problem formulation leads to a number of subproblems related to red-blue point separation and minimum-perimeter polygons, many of which we solve algorithmically. We give experimental results from our implementation and a comparison of the two approaches.
This research is supported by the EU-IST Project No. IST-2001-35047 (SPIRIT) and by grant WO 758/4-2 of the German Research Foundation (DFG).
7.
Ye Chen, Keith W. Hipel, D. Marc Kilgour, Yuming Zhu 《Environmental Modelling & Software》2009,24(5):647-654
Brownfield redevelopment (BR) is an ongoing issue for governments, communities, and consultants around the world. It is also an increasingly popular research topic in several academic fields. Strategic decision support that is now available for BR is surveyed and assessed. Then a dominance-based rough-set approach is developed and used to classify cities facing BR issues according to the level of two characteristics, BR effectiveness and BR future needs. The data for the classification are based on the widely available results of a survey of US cities. The unique features of the method are its reduced requirement for preference information, its ability to handle missing information effectively, and the easily understood linguistic decision rules that it generates, based on a training classification provided by experts. The resulting classification should be a valuable aid to cities and governments as they plan their BR projects and budgets.
8.
The melanoma antigen coded by the MAGE-1 gene was the first tumor antigen described in human cancer. Genetic, biochemical, and "candidate peptide" strategies have been used to identify antigenic peptides presented to T-cells by class I major histocompatibility complex antigens. Antigens have now been characterized in a wide variety of tumor types. Five categories have been described based on expression profile. These antigens are detailed in this review. Among the tumor antigens produced as a result of intratumoral mutations, some are of special interest because of their potentially oncogenic effects. These new data can be expected to lead to the development of novel anticancer treatments based on specific immunotherapy. Pilot clinical studies are ongoing.
9.
The SHARC framework for data quality in Web archiving
Dimitar Denev Arturas Mazeika Marc Spaniol Gerhard Weikum 《The VLDB Journal The International Journal on Very Large Data Bases》2011,20(2):183-207
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit–revisit crawls. Single-visit crawls download every page of a site exactly once in an order that aims to minimize the “blur” in capturing the site. Visit–revisit strategies revisit pages after their initial downloads to check for intermediate changes. The revisiting order aims to maximize the “coherence” of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, “sharp” site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of or predictions for the change rates of individual pages. Our framework includes fairly accurate classifiers for change predictions. All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
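Of the two quality notions, coherence is the simpler one to make concrete. The sketch below is an illustration of the counting idea only (not the SHARC implementation); the page URLs and timestamps are invented for the example.

```python
from datetime import datetime

def coherence(capture_start, capture_end, change_times):
    """Count pages that did not change during the capture interval.

    change_times maps each page URL to the instants at which the page is
    known (or predicted) to have changed. A page counts as coherently
    captured if none of those instants falls inside [capture_start, capture_end].
    """
    unchanged = 0
    for url, times in change_times.items():
        if not any(capture_start <= t <= capture_end for t in times):
            unchanged += 1
    return unchanged

# Hypothetical example: two of the three pages did not change during the crawl.
changes = {
    "https://example.org/": [datetime(2011, 1, 5, 12, 0)],
    "https://example.org/a": [],
    "https://example.org/b": [datetime(2011, 1, 9, 8, 0)],
}
print(coherence(datetime(2011, 1, 1), datetime(2011, 1, 7), changes))  # -> 2
```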
10.
Valette S, Chassery JM, Prost R 《IEEE transactions on visualization and computer graphics》2008,14(2):369-381
In this paper, we propose a generic framework for 3D surface remeshing. Based on a metric-driven Discrete Voronoi Diagram construction, our output is an optimized 3D triangular mesh with a user defined vertex budget. Our approach can deal with a wide range of applications, from high quality mesh generation to shape approximation. By using appropriate metric constraints the method generates isotropic or anisotropic elements. Based on point-sampling, our algorithm combines the robustness and theoretical strength of Delaunay criteria with the efficiency of entirely discrete geometry processing. Besides the general framework described, we show experimental results using isotropic, quadric-enhanced isotropic and anisotropic metrics which prove the efficiency of our method on large meshes, for a low computational cost.
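As a rough, generic illustration of the discrete Voronoi idea underlying such remeshing (a Lloyd-style sketch under an isotropic Euclidean metric, not the authors' algorithm), one can cluster surface sample points by nearest seed and relax each seed toward its cluster centroid; the relaxed seeds then act as a uniform vertex budget for the output mesh.

```python
import numpy as np

def discrete_voronoi_relaxation(points, n_seeds=100, iterations=10, rng=None):
    """Cluster surface samples into discrete Voronoi regions and relax the
    seeds toward the region centroids (isotropic Euclidean metric).

    points: (N, 3) float array of samples on the surface, with N >= n_seeds.
    Returns the relaxed seed positions and the final region label per sample."""
    rng = np.random.default_rng(rng)
    seeds = points[rng.choice(len(points), n_seeds, replace=False)]
    for _ in range(iterations):
        # Assign every sample to its nearest seed (its discrete Voronoi region).
        dists = np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each seed to the centroid of its region (skip empty regions).
        for k in range(n_seeds):
            members = points[labels == k]
            if len(members):
                seeds[k] = members.mean(axis=0)
    # Final assignment after the last relaxation step.
    labels = np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2).argmin(axis=1)
    return seeds, labels
```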