Search results: 355 records in total (334 subscription full-text, 20 free, 1 domestic open-access).

By subject: Chemical Industry 80; Metalworking 9; Machinery & Instrumentation 9; Building Science 36; Energy & Power 10; Light Industry 55; Radio & Electronics 16; General Industrial Technology 64; Metallurgical Industry 11; Automation Technology 65.

By year: 2024: 2; 2023: 17; 2022: 9; 2021: 20; 2020: 5; 2019: 13; 2018: 11; 2017: 11; 2016: 14; 2015: 22; 2014: 13; 2013: 28; 2012: 29; 2011: 34; 2010: 25; 2009: 22; 2008: 18; 2007: 14; 2006: 8; 2005: 9; 2004: 5; 2003: 8; 2002: 4; 2001: 3; 2000: 3; 1999: 1; 1998: 5; 1996: 1; 1995: 1.
1.
Marrone Mauricio, Lemke Sascha, Kolbe Lutz M. 《Scientometrics》2022, 127(7): 3857-3878
Computer-assisted methods and tools can help researchers automate the coding process of literature reviews and accelerate the literature review process. However, existing...
2.
The influence of two peroxides (peroxydicarbonate and dilauroyl peroxide) at various concentrations (10–200 mmol/kg PP), and their effectiveness in introducing long-chain branches (LCB), was investigated. The effects of single versus double extrusion steps on the resulting properties were also studied. Experiments were carried out in a single-screw extruder at 180°C for the first extrusion step (modification) and at 240°C for the second extrusion step (processing simulation). Melt flow rate and dynamic rheological properties were measured at 230°C. Extensional rheology measurements served for the definitive identification of long-chain branched polypropylene (LCB-PP). The mechanical properties were examined via tensile and impact tensile tests. In summary, LCB (increased melt strength) was observed via extensional rheology for all modified specimens, and the mechanical properties of the modified samples were maintained or even improved. In particular, samples containing dilauroyl peroxide displayed excellent mechanical properties in this study.
3.
The injection molding of micro-structures is a promising mass-production method for a broad range of materials. However, the replication quality of these structures depends significantly on the heat flow during the filling stage. In this paper, the filling and heat transfer of v-groove and random structures below 5 μm are investigated with the help of an AFM (atomic force microscope) and thermocouples. A numerical model is developed to predict the filling of surface structures during the filling and packing stages. The model relies on simple fully developed flow models that take the power-law material behavior into account. This permits investigation of how several processing parameters affect the polymer flow in the surface structures. The mold wall temperature, which has a significant effect on the polymer flow, is varied using a variothermal mold temperature control system to validate the proposed model.
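For context, the power-law (Ostwald–de Waele) material model referenced above relates shear viscosity to shear rate; in its standard form (stated here for orientation, not as this paper's exact implementation):

\[
\eta(\dot{\gamma}) = K\,\dot{\gamma}^{\,n-1},
\]

where \(K\) is the consistency index and \(n\) the power-law index, with \(n < 1\) describing the shear-thinning behavior typical of polymer melts.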
4.
JavaScript provides the technological foundation of Web 2.0 applications. AJAX (Asynchronous JavaScript And XML) applications have received widespread attention as a new way to develop highly interactive web applications. Breaking with the complete-page-reload paradigm of traditional web applications, AJAX applications rival desktop applications in their look and feel. But AJAX places a high burden on web developers, requiring extensive JavaScript knowledge as well as familiarity with other advanced client-side technologies. In this paper, we introduce a technique that allows a developer to implement an application in Java or any .NET language and then automatically cross-compile it to an AJAX-enabled web application.
5.
This paper studies a family of optimization problems where a set of items, each requiring a possibly different amount of resource, must be assigned to different slots for which the price of the resource can vary. The objective is then to assign items such that the overall resource cost is minimized. Such problems arise commonly in domains such as production scheduling in the presence of fluctuating renewable energy costs, or in variants of the Travelling Salesman Problem. In Constraint Programming, this can be naturally modeled in two ways: (a) with a sum of element constraints; (b) with a MinimumAssignment constraint. Unfortunately, the sum of element constraints achieves only weak filtering, and the MinimumAssignment constraint does not scale well on large instances. This work proposes a third approach by introducing the ResourceCostAllDifferent constraint and an associated incremental and scalable filtering algorithm, running in \(\mathcal{O}(n \cdot m)\) time, where n is the number of unbound variables and m is the maximum domain size of the unbound variables. Its goal is to compute the total cost in a scalable manner while accounting for the fact that all assignments must be different. We first evaluate the efficiency of the new filtering on a real industrial problem and then on the Product Matrix Travelling Salesman Problem, a special case of the Asymmetric Travelling Salesman Problem. The study shows experimentally that our approach generally outperforms both the decomposition and the MinimumAssignment approaches for the problems we considered.
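To make the underlying structure concrete: with item resource requirements r_i and slot prices p_j, the cost of placing item i in slot j is r_i·p_j, and the all-different requirement turns the relaxation into a classic minimum-cost assignment problem. The sketch below (hypothetical data, SciPy's generic assignment solver rather than the paper's ResourceCostAllDifferent propagator) illustrates that baseline.

```python
# Baseline for the allocation problem described above: items with resource
# requirements are assigned to distinct slots with per-unit prices, minimizing
# total resource cost. This is NOT the ResourceCostAllDifferent propagator;
# it only shows the assignment structure that the constraint reasons about.
import numpy as np
from scipy.optimize import linear_sum_assignment

resource = np.array([3.0, 1.0, 2.0])      # resource needed by each item (hypothetical)
price = np.array([5.0, 2.0, 4.0, 1.0])    # per-unit resource price of each slot (hypothetical)

# cost[i, j] = cost of putting item i into slot j
cost = np.outer(resource, price)

rows, cols = linear_sum_assignment(cost)  # optimal all-different assignment
print(list(zip(rows, cols)), cost[rows, cols].sum())
```

The constraint described in the paper aims to achieve comparable pruning incrementally inside a CP search, rather than re-solving the full assignment problem at every node.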
6.
OBJECTIVE: An evaluation study was conducted to answer the question of which system properties of night vision enhancement systems (NVESs) provide a benefit for drivers without increasing their workload. BACKGROUND: Different infrared sensor, image processing, and display technologies can be integrated into an NVES to support nighttime driving. Because each of these components has its specific strengths and weaknesses, careful testing is required to determine their best combination. METHOD: Six prototypical systems were assessed in two steps. First, a heuristic evaluation with experts from ergonomics, perception, and traffic psychology was conducted; it produced a broad overview of possible effects of system properties on driving. Based on these results, an experimental field study with 15 experienced drivers was performed. Criteria used to evaluate the development potential of the six prototypes were the usability dimensions of effectiveness, efficiency, and user satisfaction (International Organization for Standardization, 1998). RESULTS: Results showed that the intelligibility of information, the ease with which obstacles could be located in the environment, and the position of the display presenting the system's output were of crucial importance for the usability of the NVES and its acceptance. CONCLUSION: All relevant requirements are best met by NVESs that are positioned at an unobtrusive location and are equipped with functions for the automatic identification of objects and for event-based warnings. APPLICATION: These design recommendations and the presented evaluation approach can be directly incorporated into the development process of future NVESs.
7.
Paul G, Wischniewski S 《Ergonomics》2012, 55(9): 1115-1118
Digital human models (DHM) have evolved into useful tools for ergonomic workplace design and product development and are found in various industries and in education. The DHM systems that dominate the market were developed for specific purposes and differ significantly, which is reflected not only in incompatible results of DHM simulations but also in misunderstandings of how DHM simulations relate to real-world problems. While DHM developers are restricted by uncertainty about user needs and the lack of model-data-related standards, users are confined to one specific product and cannot exchange results or upgrade to another DHM system, as their previous results would be rendered worthless. Furthermore, the origin and validity of anthropometric and biomechanical data are not transparent to the user. The lack of standardisation in DHM systems has become a major roadblock to further system development, affecting all stakeholders in the DHM industry. Evidently, a framework for standardising digital human models is necessary to overcome current obstructions. Practitioner Summary: This short communication addresses a standardisation issue for digital human models which has been discussed at the International Ergonomics Association Technical Committee for Human Simulation and Virtual Environments. It is the outcome of a workshop at the DHM 2011 symposium in Lyon, which concluded with steps towards DHM standardisation that need to be taken.
8.
Investigating the dynamical and physical properties of cosmic dust can reveal a great deal of information about both the dust and its many sources. Over recent years, several spacecraft (e.g., Cassini, Stardust, Galileo, and Ulysses) have successfully characterised interstellar, interplanetary, and circumplanetary dust using a variety of techniques, including in situ analyses and sample return. Charge, mass, and velocity measurements of the dust are performed either directly (induced charge signals) or indirectly (mass and velocity from impact ionisation signals or crater morphology) and constrain the dynamical parameters of the dust grains. Dust compositional information may be obtained via either time-of-flight mass spectrometry of the impact plasma or direct sample return. The accurate and reliable interpretation of collected spacecraft data requires a comprehensive programme of terrestrial instrument calibration. This process involves accelerating suitable solar system analogue dust particles to hypervelocity speeds in the laboratory, an activity performed at the Max Planck Institut für Kernphysik in Heidelberg, Germany. Here, a 2 MV Van de Graaff accelerator electrostatically accelerates charged micron and submicron-sized dust particles to speeds up to 80 km s⁻¹. Recent advances in dust production and processing have allowed solar system analogue dust particles (silicates and other minerals) to be coated with a thin conductive shell, enabling them to be charged and accelerated. Refinements and upgrades to the beam line instrumentation and electronics now allow for the reliable selection of particles at velocities of 1–80 km s⁻¹ and with diameters of between 0.05 μm and 5 μm. This ability to select particles for subsequent impact studies based on their charges, masses, or velocities is provided by a particle selection unit (PSU). The PSU contains a field programmable gate array, capable of monitoring in real time the particles' speeds and charges, and is controlled remotely by a custom, platform-independent software package. The new control instrumentation and electronics, together with the wide range of accelerable particle types, allow the controlled investigation of hypervelocity impact phenomena across a hitherto unobtainable range of impact parameters.
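For orientation, the speed a grain reaches in such an electrostatic accelerator follows from energy conservation (a standard relation, not a result specific to this facility):

\[
qU = \tfrac{1}{2}\,m v^{2} \quad\Rightarrow\quad v = \sqrt{\frac{2\,q\,U}{m}},
\]

where \(q\) is the grain charge, \(m\) its mass, and \(U\) the accelerating potential (here up to about 2 MV); the smallest, most highly charged grains therefore attain the highest speeds, consistent with the 1–80 km s⁻¹ range quoted above.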
9.
In this paper we study the coordination of Emergency Medical Service (EMS) for patients with acute myocardial infarction with ST-segment elevation (STEMI). This is a health problem with high associated mortality. The “gold standard” treatment for STEMI is angioplasty, which requires a catheterization lab and a highly qualified cardiology team. It should be performed as soon as possible, since any delay to treatment worsens the patient’s prognosis. This delay is reduced through the coordination of EMS, which is especially important when multiple patients present simultaneously. Nowadays, this process is based on the First-Come-First-Served (FCFS) principle and depends heavily on human control and phone communication, which are prone to human error and delays. The objective is, therefore, to automate EMS coordination while minimizing the time from symptom onset to reperfusion and thus to lower the mortality and morbidity resulting from this disease. In this paper, we present a multi-agent decision-support system for the distributed coordination of EMS, focusing on urgent out-of-hospital STEMI patients awaiting angioplasty. The system is also applicable to emergency patients of any pathology needing pre-hospital acute medical care and urgent hospital treatment. The assignment of patients to ambulances and to angioplasty-enabled hospitals with cardiology teams is performed via a three-level optimization model. At each level, we find a globally efficient solution by a modification of the distributed relaxation method for the assignment problem known as the auction algorithm. The efficiency of the proposed model is demonstrated by simulation experiments.
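As a point of reference, the auction algorithm mentioned above is the Bertsekas-style distributed relaxation method for the assignment problem. The sketch below is a minimal single-process version for a square maximisation instance (an EMS setting would use, e.g., negated delays as values); the function name, data, and epsilon choice are illustrative assumptions, not the paper's three-level implementation.

```python
def auction_assignment(value, eps=None):
    """Minimal Bertsekas-style auction for the square maximisation assignment
    problem: value[i][j] is the benefit of giving object j to bidder i.
    Returns assignment[i] = object assigned to bidder i."""
    n = len(value)
    eps = eps if eps is not None else 1.0 / (n + 1)  # eps < 1/n gives optimality for integer values
    prices = [0.0] * n          # current price of each object
    owner = [None] * n          # owner[j] = bidder currently holding object j
    assignment = [None] * n     # assignment[i] = object held by bidder i
    unassigned = list(range(n))

    while unassigned:
        i = unassigned.pop()
        # Net payoff of every object for bidder i at the current prices.
        payoff = [value[i][j] - prices[j] for j in range(n)]
        best = max(range(n), key=payoff.__getitem__)
        best_payoff = payoff[best]
        second_payoff = max((payoff[j] for j in range(n) if j != best),
                            default=best_payoff)
        # Bid: raise the best object's price by the bidding increment.
        prices[best] += best_payoff - second_payoff + eps
        # The previous owner (if any) is outbid and re-enters the queue.
        if owner[best] is not None:
            assignment[owner[best]] = None
            unassigned.append(owner[best])
        owner[best] = i
        assignment[i] = best
    return assignment


# Tiny illustrative example (hypothetical benefits); the optimum assigns bidder i to object i.
print(auction_assignment([[10, 3, 1],
                          [8, 9, 2],
                          [4, 7, 6]]))
```

The distributed appeal of this scheme is that each bidder only needs the current prices to compute its bid, which is what makes it a natural fit for multi-agent coordination.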
10.
The near-future penetration of plug-in electric vehicles (PEVs) is expected to be large enough to have a significant impact on the power grid. If PEVs were allowed to charge simultaneously at their maximum power rate, the distribution grid would face serious stability problems. Therefore, mechanisms are needed to coordinate the PEVs that charge simultaneously. In this paper we propose an allocation mechanism that aims to balance allocative efficiency and fairness, giving preferential treatment to the PEVs that place a high valuation on the available power while guaranteeing a fair share of this power to all the PEVs.
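One way to picture such an efficiency/fairness trade-off (purely an illustrative rule, not the mechanism proposed in the paper): give every PEV an equal guaranteed base share and split the remaining capacity in proportion to reported valuations, capped at each vehicle's maximum charging rate. All names and numbers below are hypothetical.

```python
def allocate_power(capacity, max_rates, valuations):
    """Split `capacity` kW among PEVs: an equal guaranteed base share first, then
    the leftover in proportion to valuations, never exceeding a PEV's max rate."""
    n = len(max_rates)
    alloc = [min(capacity / n, r) for r in max_rates]   # guaranteed fair share
    leftover = capacity - sum(alloc)
    while leftover > 1e-9:
        headroom = [i for i in range(n) if alloc[i] < max_rates[i] - 1e-12]
        total_val = sum(valuations[i] for i in headroom)
        if not headroom or total_val <= 0:
            break                                       # nothing left to distribute
        distributed = 0.0
        for i in headroom:
            extra = min(leftover * valuations[i] / total_val,
                        max_rates[i] - alloc[i])
            alloc[i] += extra
            distributed += extra
        leftover -= distributed
        if distributed < 1e-12:
            break
    return alloc


# Hypothetical feeder with 30 kW to share among three PEVs:
print(allocate_power(30.0, max_rates=[10.0, 22.0, 7.0], valuations=[1.0, 3.0, 2.0]))
```

The base share guarantees fairness; the valuation-weighted remainder is what gives higher-valuation PEVs their preferential treatment.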