Paid full text: 256 articles
Free: 5 articles

By subject (number of articles):
Electrical engineering: 6
Chemical industry: 76
Machinery and instrumentation: 4
Building science: 13
Energy and power: 9
Light industry: 10
Water conservancy engineering: 9
Petroleum and natural gas: 1
Radio and electronics: 27
General industrial technology: 26
Metallurgical industry: 19
Automation technology: 61

By year (number of articles):
2024: 1; 2023: 3; 2022: 7; 2021: 9; 2020: 4; 2019: 8; 2018: 9; 2017: 7; 2016: 12; 2015: 9
2014: 8; 2013: 16; 2012: 12; 2011: 15; 2010: 17; 2009: 11; 2008: 6; 2007: 6; 2006: 6; 2005: 5
2004: 5; 2003: 4; 2002: 2; 2000: 4; 1999: 2; 1998: 6; 1997: 4; 1996: 7; 1995: 3; 1994: 4
1993: 1; 1992: 4; 1990: 2; 1989: 1; 1987: 5; 1985: 2; 1984: 3; 1983: 6; 1982: 4; 1981: 2
1979: 4; 1978: 1; 1977: 6; 1976: 2; 1975: 1; 1974: 1; 1973: 2; 1972: 1; 1965: 1

261 results found; search time: 31 ms.
61.
Wireless Networks - COVID-19 surprised the whole world with its quick and sudden spread. The coronavirus pushed all community sectors: government, industry, academia, and nonprofit organizations to take...
62.
63.
Communicating Finite State Machines (CFSM) lack the high-level syntactic and structural abstractions of Communicating Complex State Machines (CCSM), such as nesting and encapsulation, needed to model the highly complex protocols that are likely to arise in web services environments. The incorporation of these features in a protocol specification model would require the design of a new validation technique to efficiently check for protocol errors, such as deadlocks and non-reachable transitions. A reachability graph is used to represent the execution states of the protocol and to verify their consistency. In this paper, we propose a new validation technique for protocols modeled with complex FSMs, called RLRA (Reverse Leaping Reachability Analysis), which enables the detection of all deadlock errors. It is a backtracking approach: it first identifies an initial set of suspected states, those possibly containing deadlocks, then refines this set to those likely to cause deadlock, and finally backtracks through the graph while checking for errors until the root state of the protocol is reached. Leap graphs are employed to prune the number of execution states examined, and thereby mitigate the combinatorial explosion of the state space. Extensive tests and comparisons were performed, which show the effectiveness of our technique.
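As a rough illustration of this backward style of analysis (not the paper's RLRA algorithm itself, which also refines the suspect set and uses leap graphs to prune the search), the Python sketch below builds a reachability graph for a hypothetical protocol, flags non-final states without outgoing transitions as deadlock suspects, and backtracks from them towards the root. All state names are invented.

```python
from collections import defaultdict, deque

def find_deadlocks(transitions, root, final_states):
    """Flag deadlock suspects (non-final states with no outgoing
    transitions) and backtrack from them to the protocol's root state.

    transitions: dict mapping state -> list of successor states
    root: initial (root) state of the reachability graph
    final_states: set of states where termination is legal
    """
    # Reverse the edge relation so the graph can be walked backwards.
    predecessors = defaultdict(set)
    states = set(transitions) | {s for succ in transitions.values() for s in succ}
    for src, succs in transitions.items():
        for dst in succs:
            predecessors[dst].add(src)

    # Suspected states: no successors and not a legal final state.
    suspects = {s for s in states if not transitions.get(s) and s not in final_states}

    # Backtrack from every suspect, collecting states that can lead to a deadlock.
    leads_to_deadlock = set()
    frontier = deque(suspects)
    while frontier:
        state = frontier.popleft()
        if state in leads_to_deadlock:
            continue
        leads_to_deadlock.add(state)
        frontier.extend(predecessors[state])

    return suspects, root in leads_to_deadlock

# Hypothetical toy protocol graph: s3 is a deadlock, s4 a legal end state.
graph = {"s0": ["s1", "s2"], "s1": ["s3"], "s2": ["s4"], "s3": [], "s4": []}
print(find_deadlocks(graph, "s0", final_states={"s4"}))
```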
64.
According to the Intergovernmental Panel on Climate Change, the buildings sector has the largest mitigation potential for CO2 emissions. Especially in office buildings, where internal heat loads and a relatively high occupant density occur at the same time as solar heat gains, overheating has become a common problem. In Europe the adaptive thermal comfort model according to EN 15251 provides a method to evaluate thermal comfort in naturally ventilated buildings. However, especially in the context of climate change and the occurrence of heat waves within the last decade, the question arises of how thermal comfort can be maintained without additional cooling, especially in warm climates. In this paper a parametric study for a typical cellular naturally ventilated office room has been conducted, using the building simulation software EnergyPlus. It is based on the Mediterranean climate of Athens, Greece. Adaptive thermal comfort is evaluated according to EN 15251. Variations refer to different building design priorities, and they consider the variability of occupant behaviour and internal heat loads by using an ideal and a worst-case scenario. The influence of heat waves is considered by comparing measured temperatures for an average and an exceptionally hot year within the last decade. Since the use of building controls for shading affects thermal as well as visual comfort, daylighting and view are evaluated as well. Conclusions are drawn regarding the influence and interaction of building design, occupants and heat waves on comfort and greenhouse gas emissions in naturally ventilated offices, and the related optimisation potential.
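For context on the comfort criterion used above, the EN 15251 adaptive model compares indoor operative temperature against a band around a comfort temperature derived from the exponentially weighted running mean of the outdoor temperature. The Python sketch below computes that band; the coefficients (0.33, 18.8 °C, alpha = 0.8, category half-widths of 2/3/4 K) follow the commonly cited form of the standard, and the temperature series is invented.

```python
def running_mean_outdoor_temp(daily_means, alpha=0.8):
    """Exponentially weighted running mean of the daily outdoor temperature
    used by the EN 15251 adaptive comfort model (alpha = 0.8 is the value
    usually recommended)."""
    theta_rm = daily_means[0]
    for t in daily_means[1:]:
        theta_rm = (1 - alpha) * t + alpha * theta_rm
    return theta_rm

def adaptive_comfort_band(theta_rm, category="II"):
    """Comfort temperature and acceptable band (deg C) for naturally
    ventilated buildings according to the EN 15251 adaptive model."""
    theta_comf = 0.33 * theta_rm + 18.8
    half_width = {"I": 2.0, "II": 3.0, "III": 4.0}[category]
    return theta_comf - half_width, theta_comf + half_width

# Hypothetical week of daily mean outdoor temperatures for a hot period.
outdoor = [27.0, 28.5, 29.0, 30.5, 31.0, 30.0, 29.5]
theta_rm = running_mean_outdoor_temp(outdoor)
print(adaptive_comfort_band(theta_rm, category="II"))
```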
65.
Three R.E.MO.S. (Remote Environmental MOnitoring System) telemetric networks have been installed in the catchment area of the River Nestos by the research team PERSEAS. The first network has been installed in the Nestos Delta and consists of two Remote Stations (R.S.):
The first one, called R.S. “Nestos”, is located in the Nestos Delta at Chrysoupoli, and
the second one, called R.S. “Agiasma”, is located in the homonymous lagoon.
This paper deals with R.S. “Agiasma”, which operates in the Agiasma Lagoon, an area of great environmental importance in the western part of the River Nestos Delta. The gradients of the monitored water quality and quantity parameters are very important for the ecological preservation of the lagoon. Moreover, this case is an excellent example of how real-time monitoring data can work as an alarm system to prevent environmental hazards. The scientific issues this paper focuses on are:
1. The three years of systematic daily electronic monitoring data (1/1/2000-31/12/2002). The monitored parameters are Water level—H (cm), Salinity—Sal (‰), Redox Potential—RP (mV), Dissolved Oxygen—DO (mg/l), Water Temperature—Tw (°C) and Air Temperature—Ta (°C).
2. The assessment of the water quality and quantity parameters and the aquatic environment of the Agiasma Lagoon.
3. The detection of trends, using the non-parametric Spearman's criterion (a minimal illustration follows this list). This trend analysis proved the existence of trends for the parameters H, Sal and RP.
4. The necessity of real-time monitoring, which can prevent and confront possible natural hazards and disasters and work as an alarm system for the local authorities.
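As a minimal illustration of the trend test mentioned in point 3, the Python sketch below correlates a monitored series with its time index using Spearman's rank correlation (via SciPy). The salinity values are invented and are not the study's data.

```python
import numpy as np
from scipy import stats

def spearman_trend(series, alpha=0.05):
    """Non-parametric trend detection: Spearman rank correlation between
    the observations and their time index. A significant positive
    (negative) rho suggests an upward (downward) monotonic trend."""
    time_index = np.arange(len(series))
    rho, p_value = stats.spearmanr(time_index, series)
    return rho, p_value, p_value < alpha

# Hypothetical daily salinity readings (the real study used 2000-2002 data).
salinity = np.array([21.3, 21.8, 22.0, 22.4, 22.1, 22.9, 23.4, 23.2, 23.8, 24.1])
print(spearman_trend(salinity))
```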
66.
A Product–Service System (PSS) is created by combining a tangible product and an intangible service into one integrated offering. Thus, a PSS can be achieved by a production company adding intangible services to a product using a servitisation strategy, or by a service company adding a tangible product to a service by means of a productisation strategy. The focus of this paper is on the latter. Our work demonstrates a significant gap in the literature in this area. To address this, we adapt an existing PSS conceptual framework as a means to identify the driving and restraining forces considered by a service company as it explored the possibility of pursuing a PSS productisation strategy. The conceptual framework is applied in an exploratory case study with a 3PL service provider. Application of the framework reveals new driving and restraining forces not previously discussed in the literature. Furthermore, it allows a preliminary quantification of the driving and restraining forces using a force field analysis approach. Our work contributes towards the expansion of the empirical knowledge base in the area of PSS.
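As a toy illustration of the force field analysis mentioned above, the sketch below scores driving and restraining forces on a simple weight scale; the forces and weights are hypothetical and are not those identified in the case study.

```python
def force_field_score(driving, restraining):
    """Toy force field analysis: each force is a (name, weight) pair and
    the net score is the sum of driving weights minus restraining weights.
    A positive net score favours pursuing the change (here, productisation)."""
    total_driving = sum(w for _, w in driving)
    total_restraining = sum(w for _, w in restraining)
    return total_driving - total_restraining

# Hypothetical forces with 1-5 weights, purely for illustration.
driving = [("customer demand for bundled offerings", 4), ("revenue diversification", 3)]
restraining = [("capital cost of tangible assets", 4), ("lack of product know-how", 2)]
print(force_field_score(driving, restraining))
```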
67.
Exact Knowledge Hiding through Database Extension (total citations: 1; self-citations: 0; citations by others: 1)
In this paper, we propose a novel, exact border-based approach that provides an optimal solution for the hiding of sensitive frequent itemsets by (i) minimally extending the original database by a synthetically generated database part (the database extension), (ii) formulating the creation of the database extension as a constraint satisfaction problem, (iii) mapping the constraint satisfaction problem to an equivalent binary integer programming problem, (iv) exploiting underutilized synthetic transactions to proportionally increase the support of non-sensitive itemsets, (v) minimally relaxing the constraint satisfaction problem to provide an approximate solution close to the optimal one when an ideal solution does not exist, and (vi) using a partitioning of the universe of the items to increase the efficiency of the proposed hiding algorithm. Extending the original database for sensitive itemset hiding is proved to provide optimal solutions to an extended set of hiding problems compared to previous approaches, and to provide solutions of higher quality. Moreover, the application of binary integer programming enables the simultaneous hiding of the sensitive itemsets and thus allows for the identification of globally optimal solutions.
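The paper formulates the extension as a constraint satisfaction problem mapped to binary integer programming; as a much simpler illustration of the underlying counting intuition only, the Python sketch below estimates how many synthetic transactions (none of which contain a sensitive itemset) must be appended so that each sensitive itemset's relative support drops strictly below the mining threshold. The database size, support counts and threshold are invented.

```python
import math

def extension_size(db_size, support_counts, sigma):
    """Smallest number of synthetic transactions Q to append so that every
    sensitive itemset's relative support falls strictly below sigma.
    support_counts maps a sensitive itemset (frozenset) to its support
    count in the original database of db_size transactions."""
    needed = 0
    for itemset, count in support_counts.items():
        q = max(0, math.ceil(count / sigma) - db_size)
        while count / (db_size + q) >= sigma:  # guard against float rounding
            q += 1
        needed = max(needed, q)
    return needed

# Hypothetical database of 100 transactions and two sensitive itemsets.
counts = {frozenset({"a", "b"}): 30, frozenset({"c"}): 12}
print(extension_size(100, counts, sigma=0.10))  # -> 201 synthetic transactions
```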
68.
Effective and efficient classification on a search-engine model (total citations: 5; self-citations: 5; citations by others: 0)
Traditional document classification frameworks, which apply the learned classifier to each document in a corpus one by one, are infeasible for extremely large document corpora, like the Web or large corporate intranets. We consider the classification problem on a corpus that has been processed primarily for the purpose of searching, and thus our access to documents is solely through the inverted index of a large-scale search engine. Our main goal is to build the “best” short query that characterizes a document class using operators normally available within search engines. We show that surprisingly good classification accuracy can be achieved on average over multiple classes by queries with as few as 10 terms. As part of our study, we enhance some of the feature-selection techniques found in the literature by forcing the inclusion of terms that are negatively correlated with the target class and by making use of term correlations; we show that both of these techniques can offer significant advantages. Moreover, we show that optimizing the efficiency of query execution by careful selection of terms can further reduce the query costs. More precisely, we show that on our set-up the best 10-term query can achieve 93% of the accuracy of the best SVM classifier (14,000 terms), and if we are willing to tolerate a reduction to 89% of the best SVM's accuracy, we can build a 10-term query that can be executed more than twice as fast as the best 10-term query.
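As a rough sketch of the idea of characterising a class with a short search query, the Python code below ranks terms by a simple frequency-ratio score, keeps the most positively correlated terms and prepends a few negatively correlated ones with a '-' operator. This is not the feature-selection machinery evaluated in the paper, and the toy corpus is invented.

```python
from collections import Counter

def short_query(docs, labels, n_pos=8, n_neg=2):
    """Build a short keyword query that characterises the positive class:
    pick terms over-represented in positive documents and prepend a few
    strongly negative terms with a '-' operator (as supported by most
    search engines). The score here is a simple frequency ratio, not the
    exact feature-selection measures studied in the paper."""
    pos_tf, neg_tf = Counter(), Counter()
    for doc, label in zip(docs, labels):
        (pos_tf if label else neg_tf).update(set(doc.lower().split()))
    n_pos_docs = sum(labels) or 1
    n_neg_docs = (len(labels) - sum(labels)) or 1

    def score(term):  # positive -> typical of the class, negative -> anti-correlated
        return pos_tf[term] / n_pos_docs - neg_tf[term] / n_neg_docs

    ranked = sorted(set(pos_tf) | set(neg_tf), key=score, reverse=True)
    positives = ranked[:n_pos]
    negatives = [t for t in reversed(ranked) if score(t) < 0][:n_neg]
    return " ".join(positives + ["-" + t for t in negatives])

# Hypothetical toy corpus; label 1 = "programming" class.
docs = ["python code compiler bug", "java compiler bytecode",
        "football match score", "basketball game score"]
labels = [1, 1, 0, 0]
print(short_query(docs, labels, n_pos=3, n_neg=1))
```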
69.
70.
In this study of the coagulation operation, a comparison was made between the optimum jar-test values for pH, coagulant and coagulant aid obtained from a traditional method (an adjusted one-factor-at-a-time (OFAT) method) and those obtained with a central composite design (the standard design of response surface methodology (RSM)). Alum (coagulant) and polymer (coagulant aid) were used to treat a water source with very low pH and a high aluminium concentration at the Sri-Gading water treatment plant (WTP), Malaysia. The optimum conditions for these factors were chosen when the final turbidity, pH after coagulation and residual aluminium were within 0-5 NTU, 6.5-7.5 and 0-0.20 mg/l respectively. Traditional and RSM jar tests were conducted to find their respective optimum coagulation conditions. It was observed that the optimum dose for alum obtained through the traditional method was 12 mg/l, while the value for polymer was held constant at 0.020 mg/l. Through RSM optimization, the optimum dose for alum was 7 mg/l and for polymer 0.004 mg/l. The optimum pH for the coagulation operation obtained through both the traditional method and RSM was 7.6. The final turbidity, pH after coagulation and residual aluminium recorded were all within acceptable limits. The RSM method was demonstrated to be an appropriate approach for the optimization and was validated by a further test.
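As a minimal sketch of the RSM workflow referred to above, the Python code below generates the coded points of a two-factor central composite design and fits a quadratic response surface by least squares. The jar-test responses are invented and the fitted coefficients are purely illustrative.

```python
import numpy as np
from itertools import product

def central_composite_design(k=2, alpha=None):
    """Coded points of a central composite design: two-level factorial
    corners, axial (star) points at +/- alpha and a centre point.
    The default alpha = sqrt(k) equals sqrt(2) for k = 2, the rotatable value."""
    alpha = alpha if alpha is not None else np.sqrt(k)
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    axials = np.vstack([a * np.eye(k)[i] for i in range(k) for a in (-alpha, alpha)])
    centre = np.zeros((1, k))
    return np.vstack([corners, axials, centre])

def fit_quadratic_surface(X, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# Hypothetical coded factors: x1 = alum dose, x2 = pH; y = residual turbidity (NTU).
X = central_composite_design(k=2)
y = np.array([6.1, 4.9, 5.4, 4.0, 6.5, 3.8, 5.9, 4.3, 3.5])  # invented jar-test results
print(fit_quadratic_surface(X, y).round(3))
```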