61.
Cognition, Technology & Work - Rapid technological innovations are constantly increasing the complexity and automation of work lines, pushing human operators to use diverse...
62.
The aim of dose-ranging phase I (resp. phase II) clinical trials is to rapidly identify the maximum tolerated dose (MTD) (resp. minimal effective dose (MED)) of a new drug or combination. For the conduct and analysis of such trials, Bayesian approaches such as the Continual Reassessment Method (CRM) have been proposed, based on a sequential design and analysis up to a completed, fixed sample size. To optimize sample sizes, Zohar and Chevret proposed stopping rules (Stat. Med. 20 (2001) 2827) whose computation is not provided by available software. In this paper we present user-friendly software for the design and analysis of these Bayesian Phase I (resp. phase II) dose-ranging Clinical Trials (BPCT). It allows the CRM to be carried out, with or without stopping rules, from the planning of the trial, with a choice of model parameterization based on its operating characteristics, through the sequential conduct and analysis of the trial, to estimation at stopping of the MTD (resp. MED) of the new drug or combination.
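As a rough illustration of the CRM machinery the abstract refers to (not the BPCT software itself), here is a minimal sketch of one Bayesian update step for a one-parameter power-model CRM; the skeleton probabilities, prior scale, target toxicity rate and patient history below are invented for the example.

```python
import numpy as np

def crm_next_dose(skeleton, doses_given, tox_outcomes, target=0.25):
    """One Bayesian update of a one-parameter power-model CRM:
    P(toxicity at dose i) = skeleton[i] ** exp(a), prior a ~ N(0, 1.34^2).
    Returns the index of the dose whose posterior mean toxicity
    probability is closest to the target rate."""
    a = np.linspace(-4.0, 4.0, 2001)                 # grid over the parameter
    post = np.exp(-a**2 / (2 * 1.34**2))             # unnormalized prior
    for d, y in zip(doses_given, tox_outcomes):      # y = 1 if toxicity seen
        p = skeleton[d] ** np.exp(a)
        post *= p if y else 1.0 - p                  # multiply in likelihood
    post /= post.sum()                               # normalize on the grid
    p_hat = [(skeleton[i] ** np.exp(a) * post).sum()
             for i in range(len(skeleton))]
    return int(np.argmin(np.abs(np.array(p_hat) - target)))

# Invented history: 4 dose levels, 3 patients treated, target DLT rate 25%.
print(crm_next_dose([0.05, 0.12, 0.25, 0.40], [0, 1, 1], [0, 0, 1]))
```

Each new patient's outcome re-enters the history and the recommendation is recomputed, which is the sequential conduct the abstract describes; the stopping rules decide when this loop may terminate early.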
63.
Digital Elevation Models (DEMs) are used to compute the hydro-geomorphological variables required by distributed hydrological models. However, the resolution of the most precise DEMs is too fine to run these models over regional watersheds, so DEMs need to be aggregated to coarser resolutions, which affects both the representation of the land surface and the hydrological simulations. In the present paper, six algorithms (mean, median, mode, nearest neighbour, maximum and minimum) are used to aggregate the Shuttle Radar Topography Mission (SRTM) DEM from 3″ (90 m) to 5′ (10 km) in order to simulate the water balance of the Lake Chad basin (2.5 Mkm²). Each of these methods is assessed with respect to selected hydro-geomorphological properties that influence Terrestrial Hydrology Model with Biogeochemistry (THMB) simulations, namely the drainage network, the Lake Chad bottom topography and the floodplain extent.

The results show that the mean and median methods produce a smoother representation of the topography. This smoothing removes the depressions governing the floodplain dynamics (floodplain area < 5,000 km²), but it also eliminates the spikes and wells responsible for deviations in the drainage network. By contrast, the other aggregation methods yield a rougher relief representation that enables the simulation of a larger floodplain area (> 14,000 km² with the maximum or nearest neighbour) but produces anomalies in the drainage network. An aggregation procedure based on a variographic analysis of the SRTM data is therefore suggested. It consists of preliminary filtering of the 3″ DEM to smooth spikes and wells, followed by resampling to 5′ via the nearest neighbour method so as to preserve the representation of depressions. With the resulting DEM, the drainage network, the Lake Chad bathymetric curves and the simulated floodplain hydrology are consistent with observations (3% underestimation of simulated evaporation volumes).
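The aggregation step itself is block resampling of the DEM grid. The sketch below shows how the six block statistics compared in the paper could be applied to a DEM array; the synthetic tile, the aggregation factor of 100 (3″ → 5′ = 300″) and the centre-cell choice for "nearest neighbour" are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from scipy import stats

def aggregate_dem(dem, factor, method="mean"):
    """Aggregate a 2-D DEM array by an integer factor using one of the
    six block statistics compared in the paper."""
    h = (dem.shape[0] // factor) * factor              # crop to full blocks
    w = (dem.shape[1] // factor) * factor
    blocks = (dem[:h, :w]
              .reshape(h // factor, factor, w // factor, factor)
              .swapaxes(1, 2)
              .reshape(h // factor, w // factor, factor * factor))
    if method == "mean":
        return blocks.mean(axis=-1)
    if method == "median":
        return np.median(blocks, axis=-1)
    if method == "mode":
        return stats.mode(blocks, axis=-1).mode        # most frequent value
    if method == "nearest":                            # centre cell of block
        return blocks[:, :, (factor // 2) * factor + factor // 2]
    if method == "max":
        return blocks.max(axis=-1)
    if method == "min":
        return blocks.min(axis=-1)
    raise ValueError(f"unknown method: {method}")

rng = np.random.default_rng(0)
dem = rng.normal(300.0, 50.0, size=(900, 900))         # synthetic 3" tile
coarse = aggregate_dem(dem, 100, "median")             # 3" -> 5' (factor 100)
print(coarse.shape)                                    # (9, 9)
```

The paper's finding maps directly onto this choice of `method`: mean/median smooth away depressions, max/min/nearest preserve roughness at the cost of drainage-network artefacts.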
64.
For robots to succeed in complex missions, they must be reliable in the face of subsystem failures and environmental challenges. In this paper, we focus on autonomous underwater vehicle (AUV) autonomy as it pertains to self-perception and health monitoring, and we argue that automatic classification of state-sensor data represents an important enabling capability. We apply an online Bayesian nonparametric topic modeling technique to AUV sensor data in order to automatically characterize its performance patterns, then demonstrate how, in combination with operator-supplied semantic labels, these patterns can be used for fault detection and diagnosis by means of a nearest-neighbor classifier. The method is evaluated using data collected by the Monterey Bay Aquarium Research Institute's Tethys long-range AUV in three separate field deployments. Our results show that the proposed method is able to accurately identify and characterize patterns that correspond to various states of the AUV, and classify faults at a high rate of correct detection with a very low false detection rate.
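The diagnosis stage pairs learned topic mixtures with operator-supplied labels through a nearest-neighbor rule. Below is a minimal sketch of that classification step; the Hellinger distance, the label names and the topic mixtures are illustrative choices, not necessarily those of the paper.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two topic-proportion vectors."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def classify(sample_topics, labeled_topics, labels):
    """1-nearest-neighbor diagnosis: assign the operator-supplied label
    of the closest labeled topic mixture."""
    d = [hellinger(sample_topics, t) for t in labeled_topics]
    return labels[int(np.argmin(d))]

# Invented topic mixtures over 4 learned performance patterns.
labeled = np.array([[0.70, 0.20, 0.05, 0.05],   # operator-labeled "nominal"
                    [0.10, 0.10, 0.70, 0.10]])  # operator-labeled fault
labels = ["nominal", "thruster_fault"]
print(classify(np.array([0.15, 0.05, 0.65, 0.15]), labeled, labels))
```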
65.
Progressive collapse is a failure mode of great concern for tall buildings, and is also typical of building demolitions. The most infamous paradigm is the collapse of the World Trade Center towers. After reviewing the mechanics of their collapse, the motion during the crushing of one floor (or group of floors) and its energetics are analyzed, and a dynamic one-dimensional continuum model of progressive collapse is developed. Rather than using classical homogenization, it is found more effective to characterize the continuum by an energetically equivalent snap-through. The collapse, in which two phases (crush-down followed by crush-up) must be distinguished, is described in each phase by a nonlinear second-order differential equation for the propagation of the crushing front of a compacted block of accreting mass. Expressions for consistent energy potentials are formulated and an exact analytical solution of a special case is given. It is shown that progressive collapse will be triggered if the total (internal) energy loss during the crushing of one story (equal to the energy dissipated by the complete crushing and compaction of one story, minus the loss of gravity potential during the crushing of that story) exceeds the kinetic energy imparted to that story. Regardless of the load capacity of the columns, there is no way to deny the inevitability of progressive collapse driven by gravity alone once this criterion is satisfied (for the World Trade Center it is satisfied with an order-of-magnitude margin). The parameters are the compaction ratio of a crushed story, the fraction of mass ejected outside the tower perimeter, and the energy dissipation per unit height. The last is the most important, yet the hardest to predict theoretically. It is argued that, using inverse analysis, one could identify these parameters from a precise record of the motion of the floors of a collapsing building. Due to the shroud of dust and smoke, the videos of the World Trade Center collapse are of only limited use. It is proposed to obtain such records by monitoring (with millisecond accuracy) the precise time history of displacements in different modes of building demolitions. The monitoring could be accomplished by real-time telemetry from sacrificial accelerometers, or by high-speed optical camera. The resulting information on energy absorption capability would be valuable for the rating of various structural systems and for inferring their collapse mode under extreme fire, internal explosion, external blast, impact or other kinds of terrorist attack, as well as earthquakes and foundation movements.
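As a toy illustration of the crush-down phase, the sketch below integrates a simplified variable-mass equation of motion, d/dt[m(z)·ż] = m(z)·g − F_c, with uniform mass per unit height and a constant crushing resistance. The compaction ratio, mass ejection and the paper's exact resisting-force term are deliberately omitted, and all parameter values are invented, so the numbers say nothing about any real building.

```python
# Illustrative parameters (invented, not from the paper)
H  = 400.0          # building height [m]
mu = 1.0e6          # mass per unit height [kg/m]
Fc = 1.0e8          # constant crushing resistance [N]
g  = 9.81           # gravity [m/s^2]
dt = 1.0e-3         # time step [s]

# Crush-down: z = position of the crushing front, m(z) = mu * z = mass
# accreted by (and riding on) the front as it advances.
z, v = 12.0, 8.0    # initial crushed height [m] and front velocity [m/s]
while z < H:
    m = mu * z
    p = m * v + (m * g - Fc) * dt   # d/dt[m(z) v] = m(z) g - Fc
    z += v * dt
    v = p / (mu * z)                # momentum carried by the grown mass
    if v <= 0.0:                    # front arrested: the energy criterion
        break                       # for progressive collapse is not met

print(f"front at z = {z:.1f} m, velocity = {v:.1f} m/s")
```

The arrest check mirrors the abstract's criterion: if the resisting dissipation outweighs the gravity work plus the kinetic energy delivered to a story, the front stops; otherwise gravity alone drives the front down the full height.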
66.
This paper presents two online identification algorithms for finite impulse response (FIR) systems using binary measurements of both the input and the output. The algorithms are based on the least mean square (LMS) technique and on estimation of the correlation functions of the input and output from the binary data. The second algorithm is a simplified version of the first for the case of a white-noise input. Convergence and variance analyses are provided, and a numerical example demonstrates the effectiveness of the proposed algorithms.
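To give a feel for the correlation-based ingredient, the sketch below recovers the input-output cross-correlation from 1-bit (sign) measurements via the Gaussian arcsine law and reads off the FIR coefficients for a white Gaussian input. This is a generic batch illustration, not the paper's online LMS recursion, and the scale correction at the end uses the unquantized signals purely for display.

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.array([1.0, 0.5, -0.3, 0.1])        # "true" FIR system (invented)
N = 200_000
u = rng.standard_normal(N)                  # white Gaussian input
y = np.convolve(u, h)[:N] + 0.1 * rng.standard_normal(N)

su, sy = np.sign(u), np.sign(y)             # the only data actually "measured"

# Gaussian arcsine law: E[sign(u) sign(y)] = (2/pi) * arcsin(rho).
# With a white input, rho_k = h_k * sigma_u / sigma_y, so the sign
# cross-correlation identifies h up to the scale sigma_y / sigma_u.
rho = np.array([np.sin(np.pi / 2 * np.mean(su[:N - k] * sy[k:]))
                for k in range(len(h))])
h_hat = rho * np.std(y) / np.std(u)         # scale fixed from the unquantized
                                            # signals, for display only
print(np.round(h_hat, 2))                   # close to [1.0, 0.5, -0.3, 0.1]
```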
67.
Finding a dominating set of minimum cardinality is an NP-hard graph problem, even when the graph is bipartite. In this paper we are interested in solving the problem on graphs having a large independent set. Given a graph G with an independent set of size z, we show that the problem can be solved in time O(2^(n−z)), where n is the number of vertices of G. As a consequence, our algorithm is able to solve the dominating set problem on bipartite graphs in time O(2^(n/2)). Another implication is an algorithm for general graphs whose running time is O(1.7088^n).
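A minimal sketch of the decomposition underlying such algorithms: fix an independent set I, enumerate the part of the dominating set outside I, force the undominated vertices of I into the set (since I has no internal edges, nothing else can cover them), then complete the cover from I. The completion step here is naive brute force, so this sketch illustrates the structure only and does not attain the stated O(2^(n−z)) bound.

```python
from itertools import combinations

def closed_nbhd(G, v):
    return G[v] | {v}

def dominated(G, D):
    """Union of the closed neighbourhoods of D."""
    return set().union(*(closed_nbhd(G, v) for v in D)) if D else set()

def smallest_cover(G, I, missing):
    """Smallest T subset of I dominating `missing`: naive enumeration
    (the paper replaces this completion step with a faster routine)."""
    for k in range(len(I) + 1):
        for T in map(set, combinations(sorted(I), k)):
            if missing <= dominated(G, T):
                return T
    return None

def min_dominating_set(G, I):
    """G: dict vertex -> set of neighbours; I: an independent set of G.
    Enumerate S (the part of the dominating set outside I) over all
    subsets of V minus I (2^(n-z) choices), then complete from I."""
    V, best = set(G), set(G)
    rest = sorted(V - I)
    for r in range(len(rest) + 1):
        for S in map(set, combinations(rest, r)):
            forced = I - dominated(G, S)         # undominated I-vertices
            missing = (V - I) - dominated(G, S | forced)
            T = smallest_cover(G, I, missing)
            if T is not None and len(S | forced | T) < len(best):
                best = S | forced | T
    return best

# Example: path a - b - c - d, with independent set I = {a, c}.
G = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(min_dominating_set(G, {"a", "c"}))         # a minimum set of size 2
```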
68.
Health experts are worried about the increase in the number of overweight children and the decrease in activity levels among this age group. This project explores the possibilities of using interactive toys and social interaction to encourage children to become more physically active. To arrive at the final concept, research was conducted into the factors that encourage and discourage physically active play in children. Based on that knowledge, four key elements were used to develop the product: fantasy, social interaction, surmounting physical and cultural barriers, and inspirational factors. The project resulted in a cuddly toy that stimulates young children aged 4–8 to care for it through their own physical actions. Initial limited tests indicated that children appreciated and understood the toy's key elements, and suggested that interactive toys and social interaction could be used to change behaviour with regard to physical activity.
69.
We extend the notion of randomness (in the version introduced by Schnorr) to computable probability spaces and compare it to a dynamical notion of randomness: typicality. Roughly, a point is typical for a given dynamics if it follows the statistical behavior of the system (Birkhoff's pointwise ergodic theorem). We prove that a point is Schnorr random if and only if it is typical for every mixing computable dynamics. To prove the result we develop some tools for the theory of computable probability spaces (for example, morphisms) that are expected to have other applications.
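For reference, the statistical behavior invoked here is that of Birkhoff's pointwise ergodic theorem: for an ergodic measure-preserving transformation, the time average of an observable along the orbit of almost every point equals its space average. Typicality asks that the specific point x satisfy this convergence:

```latex
% Birkhoff's pointwise ergodic theorem (ergodic case): T measure-preserving
% and ergodic on (X, \mu), f \in L^1(\mu); then for \mu-almost every x,
\lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} f\!\left(T^k x\right)
    = \int_X f \, d\mu .
```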
70.