1.
BACKGROUND: Two peat biofilters were used to remove toluene from air for one year. One biofilter was fed pure toluene and the other a 1:1 (by weight) ethyl acetate:toluene mixture. RESULTS: The biofilters were operated under continuous loading: the toluene inlet load (IL) at which 80% removal occurred was 116 g m⁻³ h⁻¹ at a gas residence time of 57 s. A maximum elimination capacity of 360 g m⁻³ h⁻¹ was obtained at an IL of 745 g m⁻³ h⁻¹. Toluene elimination was inhibited by the presence of ethyl acetate. Intermittent loading, with pollutants supplied for 16 h/day, 5 days/week, did not significantly affect the removal efficiency (RE). Biomass was fully reactivated within 2 h after overnight closures, but 6 h were required to recover RE after weekend closures. Live cell density remained relatively constant over the operational period, while the dead cell fraction increased. Finally, a 15 day starvation period was applied and operation was then restarted. Performance was restored after a re-acclimatisation period similar to that following weekend closures, and a reduction in the dead cell fraction was observed. CONCLUSION: This study demonstrates the capacity of the system to handle intermittent loading conditions common in industrial practice, including long-term starvation. Copyright © 2008 Society of Chemical Industry
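The performance figures quoted above (inlet load, elimination capacity, removal efficiency, residence time) follow the standard biofiltration definitions. A minimal sketch of those definitions, with hypothetical inlet/outlet concentrations rather than the study's measured data:

```python
def biofilter_metrics(c_in, c_out, flow, volume):
    """Standard biofiltration performance metrics.

    c_in, c_out : pollutant concentration in inlet/outlet gas (g m^-3)
    flow        : gas flow rate (m^3 h^-1)
    volume      : packed-bed volume (m^3)
    """
    inlet_load = c_in * flow / volume                # IL, g m^-3 h^-1
    elimination = (c_in - c_out) * flow / volume     # EC, g m^-3 h^-1
    removal_eff = 100.0 * (c_in - c_out) / c_in      # RE, %
    residence_s = 3600.0 * volume / flow             # empty-bed residence time, s
    return inlet_load, elimination, removal_eff, residence_s

# Hypothetical example: 2.0 g/m3 toluene in, 0.4 g/m3 out,
# 10 m3/h of air through a 0.1 m3 peat bed.
il, ec, re, rt = biofilter_metrics(2.0, 0.4, 10.0, 0.1)
print(il, ec, re, rt)
```

Note that EC equals IL times RE/100, which is why the maximum EC (360 g m⁻³ h⁻¹) occurs at a much higher IL (745 g m⁻³ h⁻¹) where removal is far below 80%.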
2.
The wet air oxidation of phenol over a commercial active carbon catalyst was studied in a trickle bed reactor (TBR) over temperature and oxygen partial pressure ranges of 120–160 °C and 0.1–0.2 MPa, respectively. The performance of the active carbon was compared in terms of phenol and COD destruction. The weight change of the active carbon due to reaction was also measured. Finally, oxic phenol adsorption isotherms were assessed in batch conditions at 25, 125 and 160 °C. In order to use the conversion data obtained from the TBR for a kinetic study, special care was taken to verify kinetic control in the TBR experiments. Several kinetic models, including power-law and Langmuir–Hinshelwood expressions, were considered to describe the catalytic oxidation of phenol over active carbon. A simple power-law model, first order in both phenol and oxygen concentration, predicted the experimental data satisfactorily, not only over the entire range of operating conditions studied but also outside its validity range. Copyright © 2005 Society of Chemical Industry
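A power-law rate that is first order in both phenol and oxygen, with Arrhenius temperature dependence, can be sketched as follows. The pre-exponential factor and activation energy below are placeholders, not the parameters fitted in the study:

```python
import math

R_GAS = 8.314  # gas constant, J mol^-1 K^-1

def phenol_oxidation_rate(c_phenol, c_oxygen, temp_k, k0=1.0e6, ea=60_000.0):
    """Power-law rate r = k(T) * C_phenol * C_O2, first order in each species.

    c_phenol, c_oxygen : concentrations (mol m^-3)
    temp_k             : temperature (K)
    k0, ea             : placeholder Arrhenius parameters (not from the paper)
    """
    k = k0 * math.exp(-ea / (R_GAS * temp_k))  # Arrhenius rate constant
    return k * c_phenol * c_oxygen

# The rate rises several-fold across the 120-160 C window studied:
r_low = phenol_oxidation_rate(1.0, 0.5, 393.15)   # 120 C
r_high = phenol_oxidation_rate(1.0, 0.5, 433.15)  # 160 C
print(r_low, r_high)
```

With these placeholder parameters the example only illustrates the model's structure: linearity in each concentration and exponential sensitivity to temperature.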
3.
Information diffusion in large-scale networks has been studied to identify influential users. Influence has been targeted as a key feature either for reaching large populations or for swaying public opinion. Through micro-blogs such as Twitter, global influencers have been identified and ranked by message propagation (retweets). In this paper, a new application is presented that first finds and then classifies local influence on Twitter: who has influenced you and who has been influenced by you. Until now, the social structures around tweets' original authors, whose tweets have been either retweeted or marked as favourites, have been unobservable. Through this application, these structures can be discovered; they reveal the existence of communities formed by users of similar profile (connected among themselves) interrelated with other communities of similar-profile users.
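The local-influence idea can be illustrated with a toy retweet log: for a given user, count who they retweeted (accounts that influenced them) and who retweeted them (accounts they influenced). The usernames and log format are invented for the example, not taken from the paper's data:

```python
from collections import Counter

def local_influence(user, retweet_log):
    """Given (retweeter, original_author) pairs, return two Counters:
    authors who influenced `user`, and users influenced by `user`."""
    influenced_me = Counter(a for rt, a in retweet_log if rt == user)
    i_influenced = Counter(rt for rt, a in retweet_log if a == user)
    return influenced_me, i_influenced

# Toy log of (retweeter, original_author) events.
log = [("ana", "bob"), ("ana", "bob"), ("carl", "ana"),
       ("dave", "ana"), ("bob", "carl")]
inf_me, me_inf = local_influence("ana", log)
print(inf_me)  # ana retweeted bob twice
print(me_inf)  # carl and dave each retweeted ana once
```

A favourites log could be folded in the same way; communities would then be sought among users whose influence profiles overlap.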
4.
5.
End-effectors are usually associated with the extremities of limbs, and their reliable detection enables robust body tracking as well as accurate pose estimation. Recent innovation in depth cameras has restated the pose estimation problem. We focus on the information provided by these sensors, for which we borrow the name 2.5D data from the graphics community. In this paper we propose a human pose estimation algorithm based on topological propagation. Geometric Deformable Models are used to carry out this propagation, implemented according to the Narrow Band Level Set approach. A variant of the latter method is proposed, including a density restriction that helps preserve the topological properties of the object under analysis. Principal end-effectors are extracted from a directed graph weighted with geodesic distances, which also provides a skeletal-like structure describing human pose. An evaluation against reference methods yields promising results. The proposed solution allows frame-wise end-effector detection, with no temporal tracking involved, and may be generalised to the tracking of objects beyond the human body.
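The final extraction step relies on geodesic distances over a weighted graph: points farthest (in geodesic terms) from the body centre are end-effector candidates. A minimal sketch using Dijkstra's algorithm on an invented toy "skeleton" graph (the node names and weights are illustrative, not the paper's representation):

```python
import heapq

def geodesic_distances(graph, source):
    """Dijkstra shortest-path distances on a weighted graph given as
    {node: [(neighbour, weight), ...]}."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy skeleton: torso connected to limbs of different geodesic lengths.
skeleton = {
    "torso": [("shoulder", 1.0), ("hip", 1.0)],
    "shoulder": [("torso", 1.0), ("hand", 2.0)],
    "hip": [("torso", 1.0), ("foot", 2.5)],
    "hand": [("shoulder", 2.0)],
    "foot": [("hip", 2.5)],
}
dist = geodesic_distances(skeleton, "torso")
end_effector = max(dist, key=dist.get)
print(end_effector, dist[end_effector])  # foot 3.5
```

On real 2.5D data the graph nodes would be surface points and the weights local Euclidean steps, so geodesic maxima land on hands, feet and head.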
6.
We introduce WSimply, a new framework for modelling and solving Weighted Constraint Satisfaction Problems (WCSPs) using Satisfiability Modulo Theories (SMT) technology. In contrast to other well-known approaches, which are designed for extensional representation of goods or no-goods and offer few declarative facilities, our approach follows an intensional and declarative syntax style. In addition, our language has built-in support for meta-constraints, such as priority and homogeneity, which allow the user to easily specify rich requirements on the desired solutions, such as preferences and fairness. We propose two alternative strategies for solving these WCSP instances using SMT. The first is reformulation into Weighted SMT (WSMT) and the application of satisfiability-test-based algorithms from recent contributions in the Weighted Maximum Satisfiability field. The second is reformulation in an operations-research style, involving an optimisation variable or objective function, and the application of optimising SMT solvers. We present experimental results on two well-known problems, the Nurse Rostering Problem (NRP) and a variant of the Balanced Academic Curriculum Problem (BACP), and provide some insight into the impact of adding meta-constraints on solution quality and solving time.
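WSimply itself compiles to SMT, but the weighted-constraint semantics being optimised, satisfy all hard constraints while minimising the total weight of violated soft constraints, can be shown with a tiny brute-force solver over Boolean variables. The instance below is invented for illustration and is not WSimply syntax:

```python
from itertools import product

def solve_weighted(variables, hard, soft):
    """Minimise the total weight of violated soft constraints subject to the
    hard constraints. Constraints are predicates over an assignment dict;
    soft constraints are (weight, predicate) pairs."""
    best, best_cost = None, float("inf")
    for values in product([False, True], repeat=len(variables)):
        a = dict(zip(variables, values))
        if not all(c(a) for c in hard):
            continue  # hard constraints must hold
        cost = sum(w for w, c in soft if not c(a))
        if cost < best_cost:
            best, best_cost = a, cost
    return best, best_cost

# Toy instance: x or y must hold (hard); prefer not-x (weight 3), y (weight 1).
best, cost = solve_weighted(
    ["x", "y"],
    hard=[lambda a: a["x"] or a["y"]],
    soft=[(3, lambda a: not a["x"]), (1, lambda a: a["y"])],
)
print(best, cost)  # {'x': False, 'y': True} 0
```

The WSMT route encodes exactly this objective as weighted clauses; the operations-research route instead introduces an explicit cost variable and asks an optimising SMT solver to minimise it.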
7.
The first models for optimising inventory management costs have undergone few changes since they were developed at the beginning of the last century. Only with the passage of time have new scenarios appeared, with the introduction of new production systems and, consequently, new strategies in the logistics chain. In this article, we analyse and propose a revision of the basic economic order quantity inventory model, first defined by Harris in 1913, for a scenario in which the owner of the stock receives a bonus or reward each time he replenishes his stock. This situation arises when the supplier receives a benefit (which he then shares with the customer) when managing stock replenishment. An array of nested models is shown to illustrate this scenario, from which the constraints of previous scenarios have been removed. The model provides insights into the negotiation of batch size between supplier and buyer in a win-win environment, in the specific situation in which the supplier gives the buyer a bonus at each stock replenishment. © 2011 Wiley Periodicals, Inc.
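One simple way to see the effect of such a bonus, an assumption for illustration, not necessarily the paper's formulation, is to treat a per-replenishment bonus R as a rebate on the fixed ordering cost S in the classical Harris EOQ, giving Q* = sqrt(2D(S − R)/H). The numbers below are hypothetical:

```python
import math

def eoq_with_bonus(demand, order_cost, holding_cost, bonus=0.0):
    """EOQ when the buyer receives `bonus` at each replenishment,
    modelled as a rebate on the fixed ordering cost.

    demand       : annual demand D (units/year)
    order_cost   : fixed cost per order S
    holding_cost : holding cost per unit per year H
    """
    effective = order_cost - bonus
    if effective <= 0:
        raise ValueError("bonus >= ordering cost: EOQ trade-off disappears")
    return math.sqrt(2.0 * demand * effective / holding_cost)

# Hypothetical data: D = 1200 units/yr, S = 50 per order, H = 6 per unit-yr.
q_classic = eoq_with_bonus(1200, 50.0, 6.0)            # Harris (1913) EOQ
q_bonus = eoq_with_bonus(1200, 50.0, 6.0, bonus=14.0)  # bonus shrinks the batch
print(q_classic, q_bonus)
```

Under this reading, the bonus lowers the effective ordering cost, so the buyer optimally orders smaller batches more often, which is precisely the lever available in the supplier-buyer batch-size negotiation the article discusses.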
8.
This research introduces a new optimality criterion for motion planning of wheeled mobile robots, based on a cost index that assesses the nearness to singularity of the forward and inverse kinematic models. Slip motions, infinite estimation error and impossible control actions are avoided by escaping from singularities; in addition, high amplification of wheel velocity errors and high wheel velocities are avoided by moving far from the singularity. The proposed cost index can be used directly to complement path-planning and motion-planning techniques (e.g. tree graphs, roadmaps, etc.) in order to select the optimal collision-free path or trajectory among several possible solutions. To illustrate the proposed approach, an industrial forklift, equivalent to a tricycle-like mobile robot, is considered in a simulated environment. In particular, several results are validated for the proposed optimality criterion and extensively compared with those obtained from classical optimality criteria such as shortest path, minimum time and minimum energy.
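A cost index of this kind, nearness to singularity of a kinematic model, is commonly built from the conditioning of the model's Jacobian. A sketch for a 2×2 Jacobian using the inverse condition number (1 = perfectly conditioned, 0 = singular); the matrix entries are illustrative and not the forklift model from the paper:

```python
import math

def inverse_condition_2x2(j11, j12, j21, j22):
    """Inverse condition number sigma_min / sigma_max of a 2x2 Jacobian.
    Returns 1.0 for a perfectly conditioned model, 0.0 at a singularity."""
    # Singular values from the eigenvalues of J^T J (closed form for 2x2).
    a = j11**2 + j21**2
    b = j11 * j12 + j21 * j22
    c = j12**2 + j22**2
    mean, radius = (a + c) / 2.0, math.hypot((a - c) / 2.0, b)
    s_max = math.sqrt(mean + radius)
    s_min = math.sqrt(max(mean - radius, 0.0))
    return s_min / s_max if s_max > 0 else 0.0

print(inverse_condition_2x2(1, 0, 0, 1))  # identity: 1.0
print(inverse_condition_2x2(1, 2, 2, 4))  # rank-1, singular: 0.0
```

A planner can then penalise configurations where this index approaches zero, which is where wheel-velocity errors are most amplified.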
9.
We present the design of a predictive load shedding scheme for a network monitoring platform that supports multiple, competing traffic queries. The proposed scheme can anticipate overload situations and minimise their impact on the accuracy of the traffic queries. The main novelty of our approach is that it treats queries as black boxes, with arbitrary (and highly variable) input traffic and processing cost. Our system requires only a high-level specification of each query's accuracy requirements to guide the load shedding procedure, and ensures a fair allocation of computing resources to queries in a non-cooperative environment. We present an implementation of our load shedding scheme in an existing network monitoring system and evaluate it with a diverse set of traffic queries. Our results show that, with the load shedding mechanism in place, the monitoring system preserves query accuracy within predefined error bounds even during extreme overload.
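The predict-then-shed loop can be sketched as a per-query cost predictor (an EWMA over recent measurement cycles is one common choice) plus a sampling-rate computation against a cycle budget. The smoothing factor, budget and query names below are arbitrary assumptions, not the paper's actual predictor:

```python
def shedding_rate(cost_history, budget, alpha=0.3):
    """Predict each query's next-cycle cost with an EWMA over past cycles,
    then return the fraction of traffic to KEEP so that the total predicted
    cost fits the budget (uniform sampling across queries)."""
    predictions = {}
    for query, costs in cost_history.items():
        ewma = costs[0]
        for c in costs[1:]:
            ewma = alpha * c + (1 - alpha) * ewma  # exponentially weighted mean
        predictions[query] = ewma
    total = sum(predictions.values())
    keep = min(1.0, budget / total) if total > 0 else 1.0
    return predictions, keep

# Hypothetical per-cycle CPU costs for two black-box queries.
history = {"flows": [10, 12, 11], "top_ports": [30, 45, 60]}
pred, keep = shedding_rate(history, budget=40.0)
print(pred, keep)  # keep < 1.0: overload anticipated, shed traffic
```

Treating queries as black boxes means only these observed costs, not query internals, drive the decision; fairness here comes from applying the same sampling rate to every query.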
10.
Statistical disclosure control (also known as privacy-preserving data mining) of microdata is about releasing data sets containing the answers of individual respondents, protected in such a way that (i) the respondents corresponding to the released records cannot be re-identified and (ii) the released data stay analytically useful. Usually, the protected data set is generated either by masking (i.e. perturbing) the original data or by generating synthetic (i.e. simulated) data that preserve some pre-selected statistics of the original data. Masked data may approximately preserve a broad range of distributional characteristics, although very few of them (if any) are preserved exactly; synthetic data, on the other hand, exactly preserve the pre-selected statistics and may seem less disclosive than masked data, but they preserve no statistics other than those pre-selected. Hybrid data, obtained by mixing the original data with synthetic data, have been proposed in the literature to combine the strengths of masked and synthetic data. We show how to obtain hybrid data easily by combining microaggregation with any synthetic data generator. We show that numerical hybrid data exactly preserving the means and covariances of the original data, and approximately preserving other statistics as well as some subdomain analyses, can be obtained as a particular case with a very simple parameterisation. The new method is competitive with both the hybrid-data literature and plain multivariate microaggregation.
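Microaggregation, the building block combined with the synthetic generator here, replaces each group of at least k similar records by the group centroid, which preserves the overall mean exactly. A minimal univariate sketch (the multivariate case used in the paper groups full records instead of single values):

```python
def microaggregate(values, k=3):
    """Univariate microaggregation: sort, split into groups of k consecutive
    values (the last group absorbs any remainder so no group is smaller
    than k), and replace each value by its group mean."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    groups = [order[i:i + k] for i in range(0, len(order), k)]
    if len(groups) > 1 and len(groups[-1]) < k:
        groups[-2] += groups.pop()  # fold a short tail into the previous group
    out = [0.0] * len(values)
    for group in groups:
        mean = sum(values[i] for i in group) / len(group)
        for i in group:
            out[i] = mean
    return out

data = [4.0, 1.0, 9.0, 2.0, 8.0, 3.0]
masked = microaggregate(data, k=3)
print(masked)  # [7.0, 2.0, 7.0, 2.0, 7.0, 2.0]
print(sum(masked) / len(masked) == sum(data) / len(data))  # mean preserved
```

Each value is now shared by at least k records (k-anonymity for that attribute); a hybrid record could then mix such microaggregated values with synthetically generated ones.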