61.
In this paper we study the coordination of Emergency Medical Services (EMS) for patients with acute myocardial infarction with ST-segment elevation (STEMI), a health problem with high associated mortality. The gold-standard treatment for STEMI is angioplasty, which requires a catheterization lab and a highly qualified cardiology team. It should be performed as soon as possible, since any delay to treatment worsens the patient's prognosis. Delay is reduced through EMS coordination, which is especially important when multiple patients present simultaneously. Today this process follows the First-Come-First-Served (FCFS) principle and depends heavily on human control and phone communication, making it prone to human error and delay. The objective is therefore to automate EMS coordination while minimizing the time from symptom onset to reperfusion, and thus to lower the mortality and morbidity resulting from this disease. We present a multi-agent decision-support system for the distributed coordination of EMS, focusing on urgent out-of-hospital STEMI patients awaiting angioplasty. The system is also applicable to emergency patients of any pathology needing pre-hospital acute medical care and urgent hospital treatment. The assignment of patients to ambulances and to angioplasty-enabled hospitals with cardiology teams is performed via a three-level optimization model. At each level, we find a globally efficient solution by a modification of the distributed relaxation method for the assignment problem known as the auction algorithm. The efficiency of the proposed model is demonstrated by simulation experiments.
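A minimal sketch of the auction mechanism referred to above, assuming a square benefit matrix and a single assignment level; the paper's three-level model and its medical payoff functions are not reproduced here:

```python
def auction_assignment(value, eps=0.01):
    """Assign each agent (row) to one task (column) via an auction-style bidding loop.

    value[i][j] is the benefit of assigning agent i to task j (square matrix assumed).
    Unassigned agents bid for their best task at current prices, raising the price
    by (best payoff - second-best payoff + eps), possibly evicting the holder.
    """
    n = len(value)
    prices = [0.0] * n
    owner = [None] * n        # owner[j] = agent currently holding task j
    assigned = [None] * n     # assigned[i] = task currently held by agent i
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        # net payoff of each task for agent i at current prices
        payoff = [value[i][j] - prices[j] for j in range(n)]
        best = max(range(n), key=lambda j: payoff[j])
        second = max((payoff[j] for j in range(n) if j != best), default=payoff[best])
        # bidding increment keeps prices rising, guaranteeing termination for eps > 0
        prices[best] += payoff[best] - second + eps
        prev = owner[best]
        if prev is not None:
            assigned[prev] = None
            unassigned.append(prev)
        owner[best] = i
        assigned[i] = best
    return assigned, prices
```

With `value = [[10, 5], [3, 9]]`, agent 0 wins task 0 and agent 1 wins task 1, matching the optimal assignment.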
62.
63.
64.
The rapid increase of renewable energy sources has made coordinated control of distributed, intermittent generation units an increasingly demanding task. Matching demand and supply is particularly challenging in islanded microgrids. In this study, we demonstrate a mixed-integer quadratic programming (MIQP) method to achieve efficient use of the sources within an islanded microgrid. A single objective function involving the fuel consumption of a diesel generator, degradation of a lithium-ion battery energy storage system, carbon emissions, load shifting, and curtailment of the renewable sources is constructed, and an optimal operating point is sought using the MIQP approach. A systematic and extensive methodology for building the objective function is given in a sequential and explicit manner, with emphasis on a novel model-based battery-aging formulation. The performance of the designed system, together with a sensitivity analysis of the resulting battery dispatch, diesel generator usage, and storage aging against a range of optimization parameters, is presented using real-world specifications of Semakau Island, an island in the vicinity of Singapore.
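The paper's MIQP cannot be reconstructed from the abstract, but its structure (a discrete generator-commitment decision plus a continuous battery decision under quadratic costs) can be illustrated with a toy single-period dispatch. All names and coefficients below are invented for illustration, not taken from the paper:

```python
def dispatch(load, pv, gen_levels, batt_max, fuel_a=0.08, fuel_b=2.0, deg=0.05):
    """Toy single-period microgrid dispatch in the spirit of an MIQP.

    A discrete diesel set-point g (the integer part of the decision) is chosen
    from gen_levels; the battery covers the residual load - pv - g (continuous
    part). Cost = quadratic fuel curve + quadratic battery-degradation penalty.
    Returns (cost, generator output, battery power) of the cheapest feasible choice.
    """
    best = None
    for g in gen_levels:                 # enumerate the discrete decisions
        batt = load - pv - g             # battery discharges (+) or charges (-)
        if abs(batt) > batt_max:
            continue                     # infeasible commitment, skip it
        fuel = fuel_a * g**2 + fuel_b * g if g > 0 else 0.0
        degradation = deg * batt**2      # quadratic aging proxy
        cost = fuel + degradation
        if best is None or cost < best[0]:
            best = (cost, g, batt)
    return best
```

For a 100 kW load, 40 kW of PV, and a 50 kW battery limit, the cheapest feasible point runs the diesel at 20 kW and discharges the battery at 40 kW; a real MIQP solver replaces this brute-force loop but optimizes the same kind of objective.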
65.
In this paper, we explore the use of system-specific static analyses of code to improve software quality in specific software systems. Specialized analyses, tailored to a particular system, make it possible to exploit system and domain knowledge that is not available to more generic analyses. Furthermore, analyses can be selected and/or developed to best meet the challenges and specific issues of the system at hand. As a result, such analyses can complement more generic code-analysis tools, because they are likely to have a greater impact on (business) concerns such as improving certain software quality attributes and reducing certain classes of failures. We present a case study of a large industrial embedded system, giving examples of the kinds of analyses that could be realized and demonstrating the feasibility of implementing them. We synthesize lessons learned from our case study and provide recommendations on how to realize system-specific analyses and how to get them adopted by industry.
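As an illustration of what a system-specific analysis might look like (the paper's actual analyses are not described in the abstract), here is a hypothetical rule for an imagined embedded code base, checked with Python's `ast` module: functions named `isr_*` (interrupt service routines) must not call blocking primitives.

```python
import ast

# Hypothetical system rule: inside any function whose name starts with "isr_",
# calls to blocking primitives such as sleep() or acquire() are forbidden.
BLOCKING = {"sleep", "acquire"}

def check_isr_blocking(source):
    """Return (function name, line number) pairs where an ISR calls a blocking primitive."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef) and node.name.startswith("isr_"):
            for call in ast.walk(node):
                if (isinstance(call, ast.Call)
                        and isinstance(call.func, ast.Name)
                        and call.func.id in BLOCKING):
                    findings.append((node.name, call.lineno))
    return findings
```

A generic linter cannot know that `isr_*` marks interrupt context in this system; that is exactly the system-specific knowledge the paper argues such analyses should exploit.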
66.
Corrosion mechanisms between MgO refractory substrates and FeNi slags were investigated. The FeNi slags considered were a simple synthetic slag mixed from specific oxides and a real slag from a ferroalloy producer. MgO refractory substrates carrying specimens of FeNi slag were heated in a hot-stage microscope at 10 K/min from room temperature to three temperatures: 1573 K, 1723 K, and 1923 K (1300 °C, 1450 °C, and 1650 °C). The experiments were carried out under a controlled gas atmosphere simulating the relevant process conditions. The corrosion mechanisms of each system were followed by scanning electron microscopy. The results showed that slag corrosion dominates, with pronounced partial dissolution of refractory fines forming Mg-silicates of the forsterite type. It was also observed that iron oxide present in the slag diffused into the coarse refractory grains, forming magnesiowustite. Finally, the results were compared with predictions from the FACTSAGE software to understand the corrosion mechanisms and to draw implications for improving refractory performance and lifetime.
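The abstract does not give reaction equations; the standard stoichiometry consistent with the phases it names (forsterite from refractory fines plus slag silica, and magnesiowustite from FeO dissolving into the MgO grains) would be:

```latex
2\,\mathrm{MgO} + \mathrm{SiO_2} \longrightarrow \mathrm{Mg_2SiO_4}\quad(\text{forsterite})
\mathrm{MgO} + x\,\mathrm{FeO} \longrightarrow (\mathrm{Mg},\mathrm{Fe})\mathrm{O}\quad(\text{magnesiowustite solid solution})
```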
67.
Metallurgical and Materials Transactions B - Understanding the effects of impurities, segregation, undercooling, and solidification velocity is necessary to reconstruct prehistoric As-Cu alloy...
68.
The development of creep prediction models has been a field of extensive research, and many different models have been proposed. This paper presents a method for evaluating the prediction quality of creep models against specific experimental data. Within the scope of this paper, the models according to Bockhold and to Heidolf are examined. First, the parameters of the models are identified with respect to existing experimental data, using a sampling-based approach to Bayesian updating developed by Bažant and Chern. Extending the method of Bažant and Chern, the uncertainty arising from inaccurate measurement data is taken into account in the definition of the likelihood function within the updating algorithm: the more inaccurate the measurements are, the more uncertain the estimated model parameters and model prognoses become. The identification is performed for different short- and long-term creep tests. The intention is not to validate these models exhaustively, but to evaluate their prognoses for the individually tested creep behavior. The results show that the identifiability of the models' parameters differs between the two models, and consequently their prognoses differ in their uncertainties. Second, the models are evaluated using two different strategies: stochastic model selection according to MacKay, Beck, and Yuen based on the Ockham factor, and a comparison of uncertainties taking into account both parameter and model uncertainties. The results of the evaluation differ across the experimental tests. The Heidolf model is more flexible and fits the data better; however, compared with the Bockhold model, it fails to predict long-term creep deformations reliably from short-term measurements alone. Comparing the evaluation methods, the analysis of the uncertainties of the creep prognosis proves to be more stable than stochastic model selection.
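The likelihood construction described above, in which measurement inaccuracy widens the posterior, can be sketched with a grid-based Bayesian update on a toy creep law; the symbols and the creep law itself are illustrative, not the paper's formulation:

```python
import math

def posterior_grid(param_grid, prior, data, model, sigma_model, sigma_meas):
    """Grid-based Bayesian update for one model parameter theta.

    The Gaussian likelihood variance combines model error and measurement error,
    so noisier measurements (larger sigma_meas) yield a flatter posterior,
    i.e. more uncertain parameters and prognoses.
    data is a list of (time, observed strain) pairs.
    """
    var = sigma_model**2 + sigma_meas**2
    post = []
    for theta, p in zip(param_grid, prior):
        loglik = sum(-(y - model(theta, t))**2 / (2 * var) for t, y in data)
        post.append(p * math.exp(loglik))
    z = sum(post)
    return [w / z for w in post]

# Toy creep law (illustrative only): strain = theta * log(1 + t)
model = lambda theta, t: theta * math.log(1 + t)
```

On synthetic data generated with theta = 1, the posterior concentrates on 1.0 when the measurement noise is small and visibly flattens when it is large, which is the behavior the modified likelihood is designed to capture.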
69.
Nanoparticles represent highly promising platforms for the development of imaging and therapeutic agents, including those that can either be detected via more than one imaging technique (multi-modal imaging agents) or used for both diagnosis and therapy (theranostics). A major obstacle to their medical application and translation to the clinic, however, is that many accumulate in the liver and spleen as a result of opsonization and scavenging by the mononuclear phagocyte system. This focused review summarizes recent efforts to develop zwitterionic coatings to counter this issue and render nanoparticles more biocompatible. Such coatings have been found to greatly reduce the rate and/or extent of non-specific adsorption of proteins and lipids to the nanoparticle surface, thereby inhibiting production of the "biomolecular corona" that is proposed to be a universal feature of nanoparticles within a biological environment. Additionally, in vivo studies have demonstrated that larger nanoparticles with a zwitterionic coating have extended circulatory lifetimes, while those with hydrodynamic diameters of ≤5 nm exhibit small-molecule-like pharmacokinetics, remaining sufficiently small to pass through the fenestrae and slit pores during glomerular filtration within the kidneys, enabling efficient excretion via the urine. The larger particles represent ideal candidates for use as blood pool imaging agents, whilst the small ones provide a highly promising platform for the future development of theranostics with reduced side-effect profiles and superior dose delivery and image contrast capabilities.
70.
Systems for distributed event processing have recently gained increasing attention in a broad range of application domains. This raises the demand for methods to adapt the system design to application-specific needs. Our approach considers (1) trade-offs regarding the hardware infrastructure and (2) trade-offs in the software design. For the underlying model, we categorize events along the dimensions of temporal complexity and physical distribution; this categorization drives trade-offs in the infrastructure design. The presented model supports design decisions depending on application-specific event properties and design goals.
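The two categorization dimensions can be sketched as follows; the concrete category values and the placement rules are invented for illustration, since the abstract names only the dimensions:

```python
from dataclasses import dataclass

@dataclass
class EventClass:
    temporal: str      # "simple" (single event) or "composite" (temporal pattern)
    distribution: str  # "local" (one source) or "distributed" (many sources)

def placement(ec):
    """Suggest where to evaluate an event class: cheap stateless filtering at the
    edge when possible, central correlation when state from many sources or
    temporal windows is required, and an intermediate node otherwise."""
    if ec.temporal == "simple" and ec.distribution == "local":
        return "edge"        # stateless filter next to the source
    if ec.temporal == "composite" and ec.distribution == "distributed":
        return "central"     # needs global state and time windows
    return "regional"        # intermediate aggregation node
```

The point of such a mapping is that the event category, not the application as a whole, determines where processing resources are spent, which is the infrastructure trade-off the model is meant to expose.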