  Paid full text   4595 articles
  Free   205 articles
  Free (domestic)   10 articles
Electrical engineering   54 articles
General   3 articles
Chemical industry   974 articles
Metalworking   88 articles
Machinery and instruments   68 articles
Building science   273 articles
Mining engineering   4 articles
Energy and power   196 articles
Light industry   485 articles
Water resources engineering   31 articles
Petroleum and natural gas   9 articles
Radio and electronics   392 articles
General industrial technology   841 articles
Metallurgical industry   452 articles
Atomic energy technology   33 articles
Automation technology   907 articles
  2023   44 articles
  2022   66 articles
  2021   125 articles
  2020   83 articles
  2019   99 articles
  2018   111 articles
  2017   101 articles
  2016   140 articles
  2015   125 articles
  2014   160 articles
  2013   289 articles
  2012   295 articles
  2011   348 articles
  2010   303 articles
  2009   271 articles
  2008   279 articles
  2007   241 articles
  2006   209 articles
  2005   179 articles
  2004   158 articles
  2003   139 articles
  2002   123 articles
  2001   61 articles
  2000   79 articles
  1999   53 articles
  1998   72 articles
  1997   58 articles
  1996   58 articles
  1995   48 articles
  1994   57 articles
  1993   37 articles
  1992   29 articles
  1991   29 articles
  1990   23 articles
  1989   31 articles
  1988   18 articles
  1987   25 articles
  1986   23 articles
  1985   25 articles
  1984   22 articles
  1983   20 articles
  1982   26 articles
  1981   16 articles
  1980   14 articles
  1979   19 articles
  1978   10 articles
  1977   15 articles
  1976   12 articles
  1974   8 articles
  1973   7 articles
Sort order: 4810 results in total; search took 0 ms
1.
BACKGROUND: In the framework of biological processes used for waste gas treatment, the impact of the inoculum size on start‐up performance needs to be better evaluated. Moreover, only a few studies have investigated the behaviour of elimination capacity and biomass viability in a two‐phase partitioning bioreactor (TPPB) used for waste gas treatment. Lastly, the impact of ethanol as a co‐substrate remains poorly understood. RESULTS: Firstly, no benefit of inoculation with a high cell density (>1.5 g L⁻¹) was observed in terms of start‐up performance. Secondly, the TPPB was monitored for 38 days to characterise its behaviour under several operational conditions. The removal efficiency remained above 63% for an inlet concentration of 7 g isopropylbenzene (IPB) m⁻³ and at some time points reached 92% during an intermittent loading phase (10 h day⁻¹), corresponding to a mean elimination capacity of 4 × 10⁻³ g L⁻¹ min⁻¹ (240 g m⁻³ h⁻¹) for a mean IPB inlet load of 6.19 × 10⁻³ g L⁻¹ min⁻¹ (390 g m⁻³ h⁻¹). Under continuous IPB loading, the performance of the TPPB declined, but the period of biomass acclimatisation to this operational condition was shorter than 5 days. The biomass grew to approximately 10 g L⁻¹, but cell viability changed greatly during the experiment, suggesting an endogenous respiration phenomenon in the bioreactor. It was also shown that simultaneous degradation of IPB and ethanol occurred, suggesting that ethanol improves the biodegradation process without causing oxygen depletion. CONCLUSION: A water/silicone oil TPPB with ethanol as co‐substrate allowed the removal of a high inlet load of IPB during an experiment lasting 38 days. Copyright © 2008 Society of Chemical Industry
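As a quick dimensional check on the figures quoted above (not a result from the paper itself), the elimination capacity expressed per litre of liquid per minute converts to the per-cubic-metre-per-hour form via the factor 1000 L m⁻³ × 60 min h⁻¹:

```latex
4 \times 10^{-3}\ \mathrm{g\,L^{-1}\,min^{-1}}
  \times 1000\ \mathrm{L\,m^{-3}} \times 60\ \mathrm{min\,h^{-1}}
  = 240\ \mathrm{g\,m^{-3}\,h^{-1}}.
```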
2.
Learning long-term dependencies with gradient descent is difficult   (cited 12 times: 0 self-citations, 12 citations by others)
Recurrent neural networks can be used to map input sequences to output sequences, such as for recognition, production or prediction problems. However, practical difficulties have been reported in training recurrent neural networks to perform tasks in which the temporal contingencies present in the input/output sequences span long intervals. We show why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases. These results expose a trade-off between efficient learning by gradient descent and latching on to information for long periods. Based on an understanding of this problem, alternatives to standard gradient descent are considered.
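To make the gradient-decay argument concrete, here is a minimal numerical sketch (not the paper's experiments): backpropagating through T steps of a linear recurrence h_t = W h_{t-1} multiplies the gradient by Wᵀ at every step, so its norm scales like σ_max(W)^T and either vanishes or explodes with the span of the dependency.

```python
import numpy as np

# Minimal illustration (not the paper's experiments): the gradient reaching
# the first time step of a linear recurrence h_t = W h_{t-1} is (W^T) applied
# T times to the gradient at the last step, so its norm scales like
# sigma_max(W) ** T.
rng = np.random.default_rng(0)
n, T = 20, 100
for scale in (0.9, 1.0, 1.1):                       # contractive / critical / expansive
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    W = scale * Q                                   # all singular values equal `scale`
    g = rng.standard_normal(n)                      # gradient arriving at the last step
    for _ in range(T):
        g = W.T @ g                                 # chain rule through one time step
    print(f"scale={scale}: |grad| after {T} steps = {np.linalg.norm(g):.3e}")
```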
3.
This study addresses the removal of humic acid (HA) dissolved in an aqueous medium by a photoelectrocatalytic process. UV254 removal and the degradation of color (Vis400) followed pseudo‐first order kinetics. Rate constants were 1.1 × 10⁻¹ min⁻¹, 8.3 × 10⁻² min⁻¹ and 2.49 × 10⁻² min⁻¹ (R² > 0.97) for UV254 degradation and 1.7 × 10⁻¹ min⁻¹, 6.5 × 10⁻² min⁻¹ and 2.0 × 10⁻² min⁻¹ for color removal from 5 mg dm⁻³, 10 mg dm⁻³ and 25 mg dm⁻³ HA respectively. Following a 2 h irradiation time, 96% of the color, 98% of the humic acid and 85% of the total organic carbon (TOC) were removed from an initial 25 mg dm⁻³ HA solution in the photoanode cell. Photocatalytic removal on the same photoanode was also studied in order to compare the two methods of degradation. Results showed that the photoelectrocatalytic method was much more effective than the photocatalytic method, especially at high pH values and with respect to UV254 removal. The effect of other important reaction variables, e.g. pH, external potential and electrolyte concentration, on the photoelectrocatalytic HA degradation was also studied. Copyright © 2003 Society of Chemical Industry
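For reference, pseudo-first-order kinetics means the remaining concentration decays exponentially, so a rate constant translates directly into a removal fraction at a given time. As an illustrative reading of the constants above (not a result reported in the paper), the UV254 rate constant for the 5 mg dm⁻³ solution implies roughly 96% removal after about 30 min:

```latex
C(t) = C_0\, e^{-kt}, \qquad \text{removal}(t) = 1 - e^{-kt};
\quad
k = 1.1 \times 10^{-1}\ \mathrm{min^{-1}},\ t = 30\ \mathrm{min}
\;\Rightarrow\; 1 - e^{-3.3} \approx 0.96.
```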
4.
A family of energy/economic/environmental (E3) models is presented as a mechanism for analysing national policy issues. The family consists of discrete models which are designed to be run in an integrated manner: the outputs of certain models provide the inputs to the next. This structure allows the analyst to readily incorporate an understanding of regional factors, such as local energy prices, concerns over air quality, water availability, or attitudes towards construction of new energy facilities, into national assessments of energy policies. This paper reviews the analytic framework within which energy policy issues are currently addressed. The initial family of E3 models is described, with emphasis on the data linkages and feedback which are provided when these models are run sequentially. The ongoing MITRE research programme with the E3 family of models is presented, and plans and opportunities for future work are outlined.
5.
This article presents an autonomous guide agent that can observe a community of learners on the web, interpret the learners' inputs, and then assess their sharing. The goal of this agent is to find a reliable helper (a tutor or another learner) to assist a learner in solving his or her task. Despite the growing number of Internet users, the ability to find helpers is still a challenging and important problem. Although helpers may hold much useful information about the courses to be taught, many learners fail to understand their presentations. The agent must therefore be able to deal autonomously with the following challenges: Do helpers have information that the learners need? Will helpers present information that learners can understand? And can we guarantee that these helpers will collaborate effectively with learners? We have developed a new filtering framework, called the pyramid collaborative filtering model, to whittle the number of helpers down to just one. We propose four levels for the pyramid; moving from one level to another depends on three filtering techniques: domain model filtering, user model filtering, and credibility model filtering. The novel technique is filtering according to helpers' credibility. Our experiments show that this method greatly improves filtering effectiveness. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 1065–1082, 2007.
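The pyramid idea can be pictured as a cascade of filters that successively shrinks the candidate pool until one helper remains. The sketch below is a hypothetical illustration with placeholder scores and thresholds, not the authors' actual domain, user, or credibility models.

```python
from dataclasses import dataclass

# Hypothetical sketch of the pyramid: each level keeps only the helpers that
# pass one filter, so the candidate pool narrows toward a single helper.
@dataclass
class Helper:
    name: str
    domain_score: float   # how well the helper covers the learner's topic
    user_match: float     # presentation style vs. the learner's profile
    credibility: float    # past collaboration outcomes

def pyramid_filter(helpers, d_min=0.6, u_min=0.6):
    level1 = [h for h in helpers if h.domain_score >= d_min]   # domain model filter
    level2 = [h for h in level1 if h.user_match >= u_min]      # user model filter
    # credibility model filter: keep the single most credible remaining helper
    return max(level2, key=lambda h: h.credibility, default=None)

candidates = [Helper("tutorA", 0.9, 0.7, 0.8), Helper("peerB", 0.8, 0.9, 0.6)]
print(pyramid_filter(candidates))   # -> tutorA, the most credible survivor
```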
6.
7.
A kinetic study of the one-step conversion of synthesis gas to gasoline on a ZnO–Cr2O3–ZSM-5 catalyst is described. On this catalyst, three reactions are involved in the overall transformation of synthesis gas: methanol synthesis, the conversion of methanol to hydrocarbons, and the water–gas shift reaction. Under the operating conditions selected for the study, it was found that the water–gas shift was at equilibrium and that methanol was completely converted to hydrocarbons. Consequently, it was postulated that the kinetics of the limiting reaction step, methanol synthesis on the ZnO–Cr2O3 component, controls the overall reaction rate. Three kinetic model equations describing the rate of synthesis gas conversion on the bifunctional catalyst were considered to fit the data of the experimental runs performed in a Berty well-mixed reactor. These equations were derived under conditions in which the methanol decomposition term could be neglected. It was also observed that a term involving the fugacity of CO2 was required in the kinetic equations to predict the rate properly. Catalyst deactivation was also taken into account in the analysis.
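For orientation, the three reactions named above can be written schematically as follows (standard stoichiometry, not the paper's fitted kinetic expressions):

```latex
\mathrm{CO} + 2\,\mathrm{H_2} \rightleftharpoons \mathrm{CH_3OH}
  \quad\text{(methanol synthesis, rate-controlling)} \\
\mathrm{CO} + \mathrm{H_2O} \rightleftharpoons \mathrm{CO_2} + \mathrm{H_2}
  \quad\text{(water--gas shift, at equilibrium)} \\
n\,\mathrm{CH_3OH} \longrightarrow (\mathrm{CH_2})_n + n\,\mathrm{H_2O}
  \quad\text{(methanol to hydrocarbons, complete conversion)}
```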
8.
In geographic information retrieval, queries often name geographic regions that do not have a well-defined boundary, such as "Southern France." We provide two algorithmic approaches to the problem of computing reasonable boundaries of such regions, based on data points for which there is evidence that they lie either inside or outside the region. Our problem formulation leads to a number of subproblems related to red–blue point separation and minimum-perimeter polygons, many of which we solve algorithmically. We give experimental results from our implementation and a comparison of the two approaches. This research is supported by the EU-IST Project No. IST-2001-35047 (SPIRIT) and by grant WO 758/4-2 of the German Research Foundation (DFG).
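To illustrate the kind of trade-off the red–blue separation formulation captures, here is a naive baseline (not one of the paper's algorithms): take the convex hull of the "inside" evidence points as a candidate boundary and count how many "outside" points it wrongly encloses. The data here are synthetic placeholders.

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

# Naive baseline: convex hull of "inside" points as the region boundary,
# then count enclosed "outside" points (the separation error).
rng = np.random.default_rng(1)
inside = rng.normal(loc=0.0, scale=1.0, size=(200, 2))    # points with "in region" evidence
outside = rng.normal(loc=3.0, scale=1.0, size=(200, 2))   # points with "outside" evidence

hull = ConvexHull(inside)
tri = Delaunay(inside[hull.vertices])                      # point-in-hull test via triangulation
misclassified = int(np.sum(tri.find_simplex(outside) >= 0))
print(f"hull perimeter = {hull.area:.2f}, outside points enclosed = {misclassified}")
```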
9.
Brownfield redevelopment (BR) is an ongoing issue for governments, communities, and consultants around the world. It is also an increasingly popular research topic in several academic fields. Strategic decision support that is now available for BR is surveyed and assessed. A dominance-based rough-set approach is then developed and used to classify cities facing BR issues according to the levels of two characteristics: BR effectiveness and BR future needs. The data for the classification are based on the widely available results of a survey of US cities. The unique features of the method are its reduced requirement for preference information, its ability to handle missing information effectively, and the easily understood linguistic decision rules that it generates, based on a training classification provided by experts. The resulting classification should be a valuable aid to cities and governments as they plan their BR projects and budgets.
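To show what "easily understood linguistic decision rules" can look like in practice, here is a hypothetical sketch; the rules, levels, and class names are invented for illustration and are not the ones induced in the study.

```python
# Hypothetical dominance-style rules over ordered linguistic levels.
LEVELS = {"low": 0, "medium": 1, "high": 2}

def classify_city(br_effectiveness: str, br_future_needs: str) -> str:
    e, n = LEVELS[br_effectiveness], LEVELS[br_future_needs]
    # "If effectiveness is at least medium and future needs are at most medium,
    #  then the city is at least in the 'doing well' class."
    if e >= LEVELS["medium"] and n <= LEVELS["medium"]:
        return "doing well"
    # "If effectiveness is at most low and future needs are at least high,
    #  then the city is at most in the 'needs support' class."
    if e <= LEVELS["low"] and n >= LEVELS["high"]:
        return "needs support"
    return "intermediate"

print(classify_city("high", "low"))    # -> doing well
print(classify_city("low", "high"))    # -> needs support
```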
10.
The SHARC framework for data quality in Web archiving   (cited 1 time: 0 self-citations, 1 citation by others)
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and the completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit–revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit–revisit strategies revisit pages after their initial downloads to check for intermediate changes; the revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change prediction. All strategies are fully implemented in a testbed and shown to be effective in experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
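Since coherence is defined deterministically above (the number of pages that did not change during the capture), it can be computed directly from per-page change timestamps. The sketch below is an illustration of that definition with invented data structures, not the SHARC implementation or API.

```python
# Coherence of a site capture: count pages with no recorded change inside
# the capture interval. Timestamps and page names are illustrative only.
def coherence(capture_interval, page_change_times):
    """Number of coherently captured (unchanged) pages in the snapshot."""
    start, end = capture_interval
    return sum(
        all(not (start <= t <= end) for t in changes)
        for changes in page_change_times.values()
    )

site_changes = {
    "/index.html": [5.0, 42.0],   # changed before and after the crawl -> coherent
    "/news.html":  [12.0],        # changed during the crawl -> not coherent
    "/about.html": [],            # never changed -> coherent
}
print(coherence((10.0, 20.0), site_changes))   # -> 2 coherently captured pages
```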