Search results: 3783 records found (search time: 31 ms)
91.
The aims were to evaluate the inter-method reliability of a registration sheet for patient handling tasks, to study the day-to-day variation of musculoskeletal complaints (MSC) and to examine whether patient handling tasks and psychosocial factors were associated with MSC. Nurses (n = 148) completed logbooks for three consecutive working days followed by a day off. Low back pain (LBP), neck/shoulder pain (NSP), knee pain (KP), psychosocial factors (time pressure, stress, conscience about the quality of work) and patient transfer and care tasks were reported. The logbook was reliable for both transfer and care tasks. The number of nurses reporting MSC and the level of pain increased significantly during the three working days (15%-30% and 17%-37%, respectively) and decreased on the day off. Stress and transfer tasks were associated with LBP, and transfer tasks were associated with KP. Our results confirm a relationship between work factors and MSC and indicate that logbooks could be one way to obtain a better understanding of the complex interaction of various nursing working conditions in relation to MSC.
92.
Design and validation of structures against blast loads are important for modern society in order to protect and secure its citizens. Since it is a challenge to validate and optimise protective structures against blast loads using full-scale experimental tests, we have to turn our attention towards advanced numerical tools like the finite element method. Several different finite element techniques can be used to describe the response of structures to blast loads. Some of these are: (1) a pure Lagrangian formulation, (2) an initial Eulerian simulation (to determine the load) followed by a Lagrangian simulation (for the structural response) and (3) a hybrid technique that combines the advantages of Eulerian and Lagrangian methods to obtain full coupling between the blast waves and the deformation of the structure. Ideally, all blast simulations should be carried out using the fully coupled Eulerian–Lagrangian approach, but this may not be practical, as the computational time increases considerably when going from a pure Lagrangian to a fully coupled Eulerian–Lagrangian simulation. A major goal in this study is to investigate whether a pure Lagrangian formulation can be applied to determine the structural response in a specified blast load problem, or whether more advanced approaches, such as the fully coupled Eulerian–Lagrangian approach, are required for reliable results. This is done by conducting numerical simulations of an unprotected 20 ft ISO container exposed to a blast load of 4000 kg TNT at 120 m standoff distance using the three different approaches presented above. To validate and discuss the results, the simulated response of the container is compared to available data from a full-scale blast test under such conditions.
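For context on the load side of such analyses: in a pure Lagrangian formulation the blast load is typically applied as a prescribed pressure history. A common idealization for that history, not taken from this particular study, is the Friedlander waveform; a minimal sketch with illustrative parameter names:

```python
import math

def friedlander_overpressure(t, p_peak, t_pos, decay=1.0):
    """Idealized Friedlander overpressure history:
    p(t) = p_peak * (1 - t/t_pos) * exp(-decay * t / t_pos)
    for 0 <= t <= t_pos, else 0 (positive phase only; the negative
    phase is ignored in this sketch). Parameter values are illustrative."""
    if t < 0 or t > t_pos:
        return 0.0
    return p_peak * (1.0 - t / t_pos) * math.exp(-decay * t / t_pos)
```

Such a history would be sampled per time step and applied as a surface pressure on the exposed faces of the Lagrangian mesh.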
93.
Iterative Feedback Tuning constitutes an attractive control-loop tuning method for processes in the absence of an accurate process model. It is a purely data-driven approach aiming at optimizing the closed-loop performance. The standard formulation ensures an unbiased estimate of the gradient of the loop-performance cost function with respect to the control parameters; this gradient is important in a search algorithm. The extension presented in this paper further ensures informative data to improve the convergence properties of the method and hence reduce the total number of required plant experiments, especially when tuning for disturbance rejection. Informative data are achieved through application of an external probing signal in the tuning algorithm. The probing signal is designed via a constrained optimization which utilizes an approximate black-box model of the process. This model estimate is further used to guarantee nominal stability and to improve the parameter update using a line search algorithm for determining the iteration step size. The proposed algorithm is compared to the classical formulation in a simulation study of a disturbance rejection problem. This type of problem is notoriously difficult for Iterative Feedback Tuning due to the lack of excitation from the reference.
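The core of such methods is a gradient-based update of the controller parameters against a closed-loop cost. The sketch below is only an illustration of that loop: the plant, the proportional gain `k_p` and all constants are hypothetical, and a finite-difference gradient stands in for the unbiased gradient estimate that Iterative Feedback Tuning obtains from dedicated plant experiments:

```python
def closed_loop_cost(k_p, n_steps=50):
    """Average squared tracking error for a hypothetical first-order plant
    y[t+1] = 0.9*y[t] + 0.1*u[t] under proportional control u = k_p * e,
    with a unit step reference."""
    y, cost = 0.0, 0.0
    for _ in range(n_steps):
        e = 1.0 - y
        u = k_p * e
        y = 0.9 * y + 0.1 * u
        cost += e * e
    return cost / n_steps

def tune_gain(k_p=0.5, step=0.5, iters=30, eps=1e-4):
    """Gradient descent on the closed-loop cost. In IFT the gradient comes
    from plant experiments; here a central finite difference stands in."""
    for _ in range(iters):
        grad = (closed_loop_cost(k_p + eps) - closed_loop_cost(k_p - eps)) / (2 * eps)
        k_p -= step * grad
    return k_p
```

On this toy plant the tuned gain yields a strictly lower cost than the initial one, mirroring the descent behaviour the paper's line search is designed to improve.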
94.
A systematic literature search was carried out to investigate the relationship between quick returns (i.e. 11.0 hours or less between two consecutive shifts) and outcome measures of health, sleep, functional ability and work–life balance. A total of 22 studies published in 21 articles were included. Three types of quick returns were differentiated (from evening to morning/day, night to evening, morning/day to night shifts) where sleep duration and sleepiness appeared to be differently affected depending on which shifts the quick returns occurred between. There were some indications of detrimental effects of quick returns on proximate problems (e.g. sleep, sleepiness and fatigue), although the evidence of associations with more chronic outcome measures (physical and mental health and work–life balance) was inconclusive.

Practitioner Summary: Modern societies are dependent on people working shifts. This study systematically reviews the literature on the consequences of quick returns (11.0 hours or less between two shifts). Quick returns have detrimental effects on acute outcomes such as sleep, sleepiness and fatigue. However, the evidence regarding effects on chronic health is inconclusive.

95.
In radiotherapy, tumors are irradiated with a high dose, while surrounding healthy tissues are spared. To quantify the probability that a tumor is effectively treated with a given dose, statistical models have been built and employed in clinical research. These are called tumor control probability (TCP) models. Recently, TCP models have started incorporating additional information from imaging modalities. In this way, patient-specific properties of tumor tissues are included, improving the radiobiological accuracy of the models. Yet the employed imaging modalities are subject to uncertainties with significant impact on the modeling outcome, and the models are sensitive to a number of parameter assumptions. Currently, uncertainty and parameter sensitivity are not incorporated in the analysis, due to time and resource constraints. To this end, we propose a visual tool that enables clinical researchers working on TCP modeling to explore the information provided by their models, to discover new knowledge and to confirm or generate hypotheses within their data. Our approach incorporates four main components: (1) it supports the exploration of uncertainty and its effect on TCP models; (2) it facilitates parameter sensitivity analysis of common assumptions; (3) it enables the identification of inter-patient response variability; (4) it allows starting the analysis from the desired treatment outcome, to identify treatment strategies that achieve it. We conducted an evaluation with nine clinical researchers. All participants agreed that the proposed visual tool provides better understanding and new opportunities for the exploration and analysis of TCP modeling.
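The abstract does not specify which TCP formulation the tool uses; a widely used baseline is the Poisson model with linear-quadratic cell survival. A minimal sketch, with all parameter values purely illustrative:

```python
import math

def poisson_lq_tcp(dose_per_fraction, n_fractions, n_clonogens, alpha, beta):
    """Poisson TCP with linear-quadratic (LQ) cell survival:
    SF = exp(-n * (alpha*d + beta*d^2)) per-cell surviving fraction,
    TCP = exp(-N0 * SF) probability that no clonogen survives.
    alpha, beta and n_clonogens are illustrative model parameters."""
    d, n = float(dose_per_fraction), int(n_fractions)
    surviving_fraction = math.exp(-n * (alpha * d + beta * d * d))
    return math.exp(-n_clonogens * surviving_fraction)
```

The parameter sensitivity the paper addresses is visible directly here: small changes in `alpha` or `n_clonogens` shift the TCP curve substantially, which is exactly why uncertainty exploration matters.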
96.
Since today’s real-world graphs, such as social network graphs, are evolving all the time, it is of great importance to perform graph computation and analysis on these dynamic graphs. Because many applications, such as social network link analysis in the presence of inactive users, need to handle failed links or nodes, decremental computation and maintenance for graphs is considered a challenging problem. Shortest path computation is one of the most fundamental operations for managing and analyzing large graphs. A number of indexing methods have been proposed to answer distance queries in static graphs. Unfortunately, there is little work on answering such queries for dynamic graphs. In this paper, we focus on the problem of computing the shortest path distance in dynamic graphs, particularly under decremental updates (i.e., edge deletions). We propose maintenance algorithms based on distance labeling, which can handle decremental updates efficiently. By exploiting properties of distance labeling in the original graphs, we are able to efficiently maintain distance labeling for the new graphs. We experimentally evaluate our algorithms using eleven real-world large graphs and confirm the effectiveness and efficiency of our approach. More specifically, our method can speed up index re-computation by up to an order of magnitude compared with the state-of-the-art method, Pruned Landmark Labeling (PLL).
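Distance labeling of the kind used by PLL answers a query by scanning the hub labels of the two endpoints: d(u, v) = min over common hubs h of d(u, h) + d(h, v). A minimal query sketch, assuming `labels` maps each vertex to a dict from hub to distance (the data structure here is illustrative, not the paper's implementation):

```python
def query_distance(labels, u, v):
    """2-hop labeling distance query: minimize d(u,h) + d(h,v) over
    hubs h present in both labels. Returns inf if no hub is shared."""
    lu, lv = labels[u], labels[v]
    if len(lu) > len(lv):        # scan the smaller label
        lu, lv = lv, lu
    best = float("inf")
    for hub, du in lu.items():
        dv = lv.get(hub)
        if dv is not None:
            best = min(best, du + dv)
    return best
```

The maintenance problem the paper studies is keeping these label dictionaries correct when edges are deleted, instead of rebuilding the whole index.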
97.
ABSTRACT

Automated detection and recognition of faces have been implemented in a broad range of media environments. Following that development, this article analyses the inference of emotions from facial expressions by computer-based systems, critically investigating the use of theories of basic emotions. We explore in depth the company Affectiva’s attempts to translate, represent and schematize human emotions, as they raise a variety of problems and issues of uncertainty. We analyse the uncertainties concerning the processing of the human face ‘as image’ due to issues of temporality and static images, as well as polyphony and modulations of the spectrum of expressions. One of our key arguments concerns the temporal character of human emotions, and we address how algorithmically regulated protocols of discretization may be said to prompt specific patterns of emotional responses and expressions based on an ideal of eliminating uncertainty. Through discussions of art pieces by Lauren McCarthy and Kyle McDonald, we question what happens when the protocols of computer systems start to perform aspects of emotional labour for us, making judgments by predicting adequate emotional responses based on the use of the strict metrics criticized in this article.
98.
The direct observation of cells over time using time-lapse microscopy can provide deep insights into many important biological processes. Reliable analyses of motility, proliferation, invasive potential or mortality of cells are essential to many studies involving live cell imaging and can aid in biomarker discovery and diagnostic decisions. Given the vast amount of image and time-series data produced by modern microscopes, automated analysis is key to capitalising on the potential of time-lapse imaging devices. To provide fast and reproducible analyses of multiple aspects of cell behaviour, we developed TimeLapseAnalyzer. Apart from general-purpose image enhancement and segmentation procedures, this extensible, self-contained, modular cross-platform package provides dedicated modalities for fast and reliable analysis of multi-target cell tracking, scratch wound healing analysis, cell counting and tube formation analysis in high-throughput screening of live-cell experiments. TimeLapseAnalyzer (MATLAB, open source) is freely available at http://www.informatik.uni-ulm.de/ni/mitarbeiter/HKestler/tla.
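TimeLapseAnalyzer itself is a MATLAB package; as a generic, language-neutral illustration of the cell-counting step it mentions, counting cells on a binary segmentation mask reduces to counting connected foreground components. A toy sketch (real pipelines add segmentation, size filtering and tracking on top):

```python
from collections import deque

def count_cells(mask):
    """Count connected foreground regions (4-connectivity) in a binary
    mask given as a list of rows of 0/1 values. A stand-in for the
    cell-counting stage of a time-lapse analysis pipeline."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                      # new component found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count
```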
99.
We review a number of formal verification techniques supported by STeP, the Stanford Temporal Prover, describing how the tool can be used to verify properties of several versions of the Bakery algorithm for mutual exclusion. We verify the classic two-process algorithm and simple variants, as well as an atomic parameterized version. The methods used include deductive verification rules, verification diagrams, automatic invariant generation, and finite-state model checking and abstraction.
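The finite-state model checking the abstract mentions can be pictured with a toy explicit-state check (this is not STeP, and the protocol below is a simplified two-process Bakery with atomic ticket assignment and bounded tickets, chosen only to keep the state space finite):

```python
from collections import deque

def bakery_mutex_holds(ticket_cap=4):
    """Explore all reachable states of a simplified two-process Bakery
    protocol and check the mutual-exclusion invariant.
    pc values: 0 = non-critical, 1 = waiting with ticket, 2 = critical.
    State tuple: (pc0, ticket0, pc1, ticket1)."""
    start = (0, 0, 0, 0)
    seen = {start}
    frontier = deque([start])
    while frontier:
        state = frontier.popleft()
        if state[0] == 2 and state[2] == 2:
            return False  # invariant violated: both in critical section
        for i in (0, 1):
            pc, mine = state[2 * i], state[2 * i + 1]
            other = state[2 * (1 - i) + 1]
            if pc == 0:   # request entry: take a ticket above all held ones
                ticket = max(state[1], state[3]) + 1
                if ticket > ticket_cap:
                    continue  # prune to keep the sketch's state space finite
                nxt = (1, ticket)
            elif pc == 1:  # enter if the other is idle or we have priority
                if other == 0 or (mine, i) < (other, 1 - i):
                    nxt = (2, mine)
                else:
                    continue
            else:          # pc == 2: leave and release the ticket
                nxt = (0, 0)
            succ = list(state)
            succ[2 * i], succ[2 * i + 1] = nxt
            succ = tuple(succ)
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return True
```

Ticket order breaks ties by process index, `(ticket, pid)`, just as in the Bakery algorithm; exhaustive exploration confirms no reachable state puts both processes in the critical section.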
100.
Legal reasoning with subjective logic
Judges and jurors must make decisions in an environment of ignorance and uncertainty, for example by hearing statements of possibly unreliable or dishonest witnesses, assessing possibly doubtful or irrelevant evidence, and enduring attempts by the opponents to manipulate the judge's and the jurors' perceptions and feelings. Three important aspects of decision making in this environment are the quantification of sufficient proof, the weighing of pieces of evidence, and the relevancy of evidence. This paper proposes a mathematical framework for dealing with the first two aspects, namely the quantification of proof and the weighing of evidence. Our approach is based on subjective logic, which is an extension of standard logic and probability theory in which the notion of probability is extended by including degrees of uncertainty. Subjective logic is a framework for modelling human reasoning, and we show how it can be applied to legal reasoning.
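In subjective logic, a binomial opinion carries belief, disbelief and uncertainty masses summing to one, plus a base rate. A minimal sketch of that representation, including cumulative (consensus) fusion as one standard way to combine two witnesses' opinions; the class and function names are illustrative:

```python
class Opinion:
    """Binomial subjective-logic opinion: belief b, disbelief d,
    uncertainty u (with b + d + u = 1) and base rate a."""
    def __init__(self, belief, disbelief, uncertainty, base_rate=0.5):
        assert abs(belief + disbelief + uncertainty - 1.0) < 1e-6
        self.b, self.d, self.u, self.a = belief, disbelief, uncertainty, base_rate

    def expected_probability(self):
        # projected probability: E = b + a * u
        return self.b + self.a * self.u

def fuse(w1, w2):
    """Cumulative (consensus) fusion of two opinions; assumes the
    opinions are not both dogmatic (u1 + u2 > u1 * u2)."""
    k = w1.u + w2.u - w1.u * w2.u
    b = (w1.b * w2.u + w2.b * w1.u) / k
    d = (w1.d * w2.u + w2.d * w1.u) / k
    u = (w1.u * w2.u) / k
    return Opinion(b, d, u, w1.a)
```

Fusing two independent agreeing opinions reduces the uncertainty mass, which models how corroborating testimony strengthens a case without ever collapsing to full certainty.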

Copyright©北京勤云科技发展有限公司  京ICP备09084417号