A total of 1,063 results were found (search time 15 ms); results 71–80 are shown below.
71.
We extend the notion of randomness (in the version introduced by Schnorr) to computable probability spaces and compare it to a dynamical notion of randomness: typicality. Roughly, a point is typical for a given dynamics if it follows the statistical behavior of the system, in the sense of Birkhoff's pointwise ergodic theorem. We prove that a point is Schnorr random if and only if it is typical for every mixing computable dynamics. To prove the result we develop some tools for the theory of computable probability spaces (for example, morphisms) that are expected to have other applications.
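As a reference for what "following the statistical behavior of the system" means here, a standard formulation of Birkhoff typicality is sketched below; the notation (T for the transformation, μ for the invariant measure, f for an observable) is ours for illustration and is not taken from the abstract.

```latex
% Birkhoff typicality (standard formulation; notation is illustrative):
% a point x is typical for the dynamics T on (X, \mu) if, for every suitable
% observable f, its time averages converge to the space average of f.
\[
  \lim_{n \to \infty} \frac{1}{n} \sum_{i=0}^{n-1} f\!\left(T^{i} x\right)
  \;=\; \int_{X} f \, d\mu .
\]
```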
72.
Progressive collapse is a failure mode of great concern for tall buildings, and is also typical of building demolitions. The most infamous paradigm is the collapse of the World Trade Center towers. After reviewing the mechanics of their collapse, the motion during the crushing of one floor (or group of floors) and its energetics are analyzed, and a dynamic one-dimensional continuum model of progressive collapse is developed. Rather than using classical homogenization, it is found more effective to characterize the continuum by an energetically equivalent snap-through. The collapse, in which two phases must be distinguished (crush-down followed by crush-up), is described in each phase by a nonlinear second-order differential equation for the propagation of the crushing front of a compacted block of accreting mass. Expressions for consistent energy potentials are formulated and an exact analytical solution of a special case is given. It is shown that progressive collapse will be triggered if the kinetic energy imparted to a story exceeds the total (internal) energy loss during the crushing of that story (equal to the energy dissipated by the complete crushing and compaction of the story, minus the loss of gravity potential during its crushing). Regardless of the load capacity of the columns, there is no way to deny the inevitability of progressive collapse driven by gravity alone if this criterion is satisfied (for the World Trade Center it is satisfied with an order-of-magnitude margin). The parameters are the compaction ratio of a crushed story, the fraction of mass ejected outside the tower perimeter, and the energy dissipation per unit height. The last is the most important, yet the hardest to predict theoretically. It is argued that, using inverse analysis, one could identify these parameters from a precise record of the motion of floors of a collapsing building. Due to a shroud of dust and smoke, the videos of the World Trade Center are only of limited use. It is proposed to obtain such records by monitoring (with millisecond accuracy) the precise time history of displacements in different modes of building demolitions. The monitoring could be accomplished by real-time telemetry from sacrificial accelerometers, or by high-speed optical camera. The resulting information on energy absorption capability would be valuable for the rating of various structural systems and for inferring their collapse mode under extreme fire, internal explosion, external blast, impact or other kinds of terrorist attack, as well as earthquake and foundation movements.
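To make the energy bookkeeping concrete, here is a minimal sketch of the per-story trigger criterion described above; it is not the paper's continuum (crush-front) model, and the function name, discrete story-by-story simplification, and all numerical values are hypothetical illustrations.

```python
def story_collapse_triggered(kinetic_energy_j: float,
                             dissipation_j: float,
                             gravity_release_j: float) -> bool:
    """Per-story energy check sketched from the abstract (values hypothetical).

    kinetic_energy_j   -- kinetic energy imparted to the story by the falling block (J)
    dissipation_j      -- energy dissipated by complete crushing and compaction of the story (J)
    gravity_release_j  -- loss of gravity potential during crushing of that story (J)
    """
    net_energy_demand = dissipation_j - gravity_release_j
    # If gravity releases more energy than crushing dissipates, the demand is
    # negative and collapse is self-sustaining; otherwise the impacting kinetic
    # energy must cover the deficit for the crushing front to keep propagating.
    return kinetic_energy_j > net_energy_demand


if __name__ == "__main__":
    # Purely illustrative numbers, not data from the paper.
    print(story_collapse_triggered(kinetic_energy_j=4.0e9,
                                   dissipation_j=1.0e9,
                                   gravity_release_j=0.6e9))
```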
73.
Cognition, Technology & Work - Rapid technological innovations are constantly influencing the complexification and automatization of work lines, pushing human operators to use diverse...
74.
The aim of dose-ranging phase I (resp. phase II) clinical trials is to rapidly identify the maximum tolerated dose (MTD) (resp. minimal effective dose (MED)) of a new drug or combination. For the conduct and analysis of such trials, Bayesian approaches such as the Continual Reassessment Method (CRM) have been proposed, based on a sequential design and analysis up to a completed fixed sample size. To optimize sample sizes, Zohar and Chevret proposed stopping rules (Stat. Med. 20 (2001) 2827), whose computation is not provided by available software. We present in this paper user-friendly software for the design and analysis of these Bayesian Phase I (resp. phase II) dose-ranging Clinical Trials (BPCT). It allows the CRM to be carried out, with or without stopping rules, from the planning of the trial (with a choice of model parameterization based on its operating characteristics) through the sequential conduct and analysis of the trial, up to estimation at stopping of the MTD (resp. MED) of the new drug or combination.
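For readers unfamiliar with the CRM, the sketch below shows a generic single-parameter dose-allocation step of the kind such software automates; it is not the BPCT program itself, and the skeleton probabilities, target toxicity rate, grid prior, and function name are all illustrative assumptions.

```python
import numpy as np

def crm_next_dose(n_tox, n_pat, skeleton, target=0.30):
    """One Continual Reassessment Method update (generic sketch, not BPCT).

    n_tox, n_pat -- per-dose counts of toxicities and treated patients
    skeleton     -- prior guesses of the toxicity probability at each dose
    target       -- targeted toxicity probability (illustrative value)
    """
    n_tox, n_pat = np.asarray(n_tox, float), np.asarray(n_pat, float)
    skeleton = np.asarray(skeleton, float)

    # Discrete grid approximation of a N(0, 1.34) prior on the model parameter a,
    # with the power dose-toxicity model p_i(a) = skeleton_i ** exp(a).
    a_grid = np.linspace(-4.0, 4.0, 801)
    prior = np.exp(-a_grid**2 / (2 * 1.34))
    prior /= prior.sum()

    # Binomial likelihood of the observed toxicity data for each grid value of a.
    probs = skeleton[None, :] ** np.exp(a_grid)[:, None]          # (grid, doses)
    loglik = (n_tox[None, :] * np.log(probs)
              + (n_pat - n_tox)[None, :] * np.log(1.0 - probs)).sum(axis=1)
    post = prior * np.exp(loglik - loglik.max())
    post /= post.sum()

    # Posterior mean toxicity at each dose; recommend the dose closest to target.
    post_tox = post @ probs
    return int(np.argmin(np.abs(post_tox - target))), post_tox


if __name__ == "__main__":
    # Illustrative data: 5 dose levels, a few patients treated so far.
    dose, tox = crm_next_dose(n_tox=[0, 0, 1, 2, 0],
                              n_pat=[3, 3, 3, 3, 0],
                              skeleton=[0.05, 0.10, 0.20, 0.30, 0.50])
    print("recommended dose index:", dose)
```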
75.
Distributed as an open-source library since 2013, real-time appearance-based mapping (RTAB-Map) started as an appearance-based loop closure detection approach with memory management to deal with large-scale and long-term online operation. It then grew to implement simultaneous localization and mapping (SLAM) on various robots and mobile platforms. As each application brings its own set of constraints on sensors, processing capabilities, and locomotion, it raises the question of which SLAM approach is the most appropriate to use in terms of cost, accuracy, computation power, and ease of integration. Since most SLAM approaches are either visual- or lidar-based, comparison is difficult. Therefore, we decided to extend RTAB-Map to support both visual and lidar SLAM, providing in one package a tool that allows users to implement and compare a variety of 3D and 2D solutions for a wide range of applications with different robots and sensors. This paper presents this extended version of RTAB-Map and its use in comparing, both quantitatively and qualitatively, a large selection of popular real-world datasets (e.g., KITTI, EuRoC, TUM RGB-D, MIT Stata Center on the PR2 robot), outlining the strengths and limitations of visual and lidar SLAM configurations from a practical perspective for autonomous navigation applications.
76.
77.
Pancreatic beta-cells have a crucial role in the regulation of blood glucose homeostasis through the production and secretion of insulin. In type 1 diabetes (T1D), an autoimmune reaction against the beta-cells, together with the presence of inflammatory cytokines and ROS in the islets, leads to beta-cell dysfunction and death. This review gives an overview of proteomic studies that have led to a better understanding of beta-cell functioning in T1D. Protein profiling of isolated islets and beta-cell lines in health and T1D contributed to the unraveling of pathways involved in cytokine-induced cell death. In addition, by studying the serological proteome of T1D patients, new biomarkers and beta-cell autoantigens were discovered, which may improve screening tests and follow-up of T1D development. Interestingly, an important role for PTMs was demonstrated in the generation of beta-cell autoantigens. To conclude, proteomic techniques are of indispensable value for improving knowledge of beta-cell function in T1D and for the search for therapeutic targets.
78.
79.
We present a simple and effective algorithm to transfer deformation between surface meshes with multiple components. The algorithm automatically computes spatial relationships between components of the target object, builds correspondences between source and target, and finally transfers deformation of the source onto the target while preserving cohesion between the target's components. We demonstrate the versatility of our approach on various complex models.
80.
In a previous paper we presented a way to measure the rheological properties of complex fluids on a microfluidic chip (Guillot et al., Langmuir 22:6438, 2006). The principle of our method is to use parallel flows between two immiscible fluids as a pressure sensor. In such a flow, both fluids flow side by side, and the width occupied by each fluid stream depends only on the two flow rates and the two viscosities. We use this property to measure the viscosity of one fluid knowing the viscosity of the other, both flow rates, and the relative size of both streams in a cross-section. We showed that using a less viscous fluid as the reference fluid allows a mean shear rate with a low standard deviation to be defined in the other fluid. This method allows us to measure the flow curve of a fluid with less than 250 μL of sample. In this paper we implement this principle in a fully automated setup which controls the flow rates, analyzes the picture, and calculates the mean shear rate and the viscosity of the studied fluid. We present results obtained for Newtonian fluids and complex fluids using this setup, and we compare our data with cone-and-plate rheometer measurements. By adding a mixing stage to the fluidic network, we show how this setup can be used to characterize in a continuous way the evolution of the rheological properties as a function of the formulation composition. We illustrate this by measuring the rheological curve of four formulations of polyethylene oxide solution with only 1.3 mL of concentrated polyethylene oxide solution. This method could be very useful in screening processes where the viscosity range and the response of the fluid to an applied stress must be evaluated.
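As a rough illustration of the comparator principle (not the authors' full analysis, which accounts for the channel's actual aspect ratio), the snippet below uses a thin-gap approximation in which both streams share the same pressure gradient, so the unknown viscosity follows from the reference viscosity, the flow-rate ratio, and the measured stream widths; the formula, variable names, and the 6Q/(w*h^2) shear-rate estimate are simplifying assumptions for this sketch.

```python
def coflow_viscosity(eta_ref, q_ref, q_sample, w_ref, w_sample, gap_h):
    """Estimate sample viscosity from a two-stream co-flow (thin-gap sketch).

    Assumes a wide, shallow channel where both streams see the same pressure
    gradient, so  Q_i ~ (w_i * h**3 / (12 * eta_i)) * dP/dx  for each stream.
    All names and the closing shear-rate estimate are illustrative assumptions.

    eta_ref           -- viscosity of the reference fluid (Pa*s)
    q_ref, q_sample   -- flow rates of reference and sample streams (m^3/s)
    w_ref, w_sample   -- measured stream widths in the imaged cross-section (m)
    gap_h             -- channel depth (m)
    """
    # Equal pressure gradient in both streams implies
    # eta_sample / eta_ref = (w_sample / w_ref) * (q_ref / q_sample).
    eta_sample = eta_ref * (w_sample / w_ref) * (q_ref / q_sample)

    # Characteristic (wall) shear rate of plane Poiseuille flow in the sample
    # stream, used here only as an order-of-magnitude estimate.
    shear_rate = 6.0 * q_sample / (w_sample * gap_h**2)
    return eta_sample, shear_rate


if __name__ == "__main__":
    # Illustrative numbers only (water-like reference, ~100 x 20 um channel).
    eta, gdot = coflow_viscosity(eta_ref=1.0e-3,
                                 q_ref=2.0e-11, q_sample=1.0e-11,
                                 w_ref=40e-6, w_sample=60e-6, gap_h=20e-6)
    print(f"sample viscosity ~ {eta:.2e} Pa*s at shear rate ~ {gdot:.0f} 1/s")
```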