Full text (subscription)   1834 articles
Free   193 articles
Free (domestic)   6 articles
Electrical engineering   39 articles
General   11 articles
Chemical industry   466 articles
Metalworking   55 articles
Machinery and instrumentation   31 articles
Building science   152 articles
Mining engineering   2 articles
Energy and power engineering   51 articles
Light industry   66 articles
Hydraulic engineering   6 articles
Radio and electronics   189 articles
General industrial technology   412 articles
Metallurgy   100 articles
Nuclear technology   9 articles
Automation   444 articles
By publication year (number of articles): 2024: 6; 2023: 51; 2022: 63; 2021: 100; 2020: 86; 2019: 79; 2018: 85; 2017: 70; 2016: 99; 2015: 98; 2014: 115; 2013: 132; 2012: 145; 2011: 160; 2010: 134; 2009: 93; 2008: 98; 2007: 88; 2006: 75; 2005: 50; 2004: 29; 2003: 14; 2002: 19; 2001: 15; 2000: 8; 1999: 12; 1998: 23; 1997: 15; 1996: 9; 1995: 7; 1994: 7; 1993: 3; 1991: 4; 1987: 1; 1986: 5; 1985: 2; 1983: 2; 1982: 1; 1981: 4; 1980: 3; 1979: 2; 1978: 2; 1977: 2; 1976: 6; 1975: 4; 1974: 1; 1973: 1; 1968: 1; 1966: 1; 1960: 1.
2033 search results in total (search time: 15 ms)
41.
The incident indirect light over a range of image pixels is often coherent. Two common approaches to exploit this inter-pixel coherence to improve rendering performance are Irradiance Caching and Radiance Caching. Both compute incident indirect light only for a small subset of pixels (the cache), and later interpolate between pixels. Irradiance Caching uses scalar values that can be interpolated efficiently, but cannot account for shading variations caused by normal and reflectance variation between cache items. Radiance Caching maintains directional information, e.g., to allow highlights between cache items, but at the cost of storing and evaluating a Spherical Harmonics (SH) function per pixel. The arithmetic and bandwidth cost for this evaluation is linear in the number of coefficients and can be substantial. In this paper, we propose a method to replace it by an efficient per-cache-item pre-filtering based on MIP maps — as previously done for environment maps — leading to a single constant-time lookup per pixel. Additionally, per-cache-item geometry statistics stored in distance-MIP maps are used to improve the quality of each pixel's lookup. Our approximate interactive global illumination approach is an order of magnitude faster than Radiance Caching with Phong BRDFs and can be combined with Monte Carlo ray tracing, Point-based Global Illumination or Instant Radiosity.
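To make the constant-time lookup concrete, the following Python/NumPy sketch shows the general pattern of pre-filtering a cached incident-radiance map into a MIP pyramid and answering each per-pixel query with a single fetch. The pyramid construction, the glossiness-based level selection, and all names are illustrative assumptions, not the authors' implementation, which additionally uses distance-MIP maps of geometry statistics.

import numpy as np

def build_mip_pyramid(radiance_map):
    # Average-downsample a square, power-of-two 2D radiance map into a MIP pyramid.
    levels = [radiance_map.astype(float)]
    while levels[-1].shape[0] > 1:
        prev = levels[-1]
        h, w = prev.shape[0] // 2, prev.shape[1] // 2
        levels.append(prev.reshape(h, 2, w, 2).mean(axis=(1, 3)))
    return levels

def prefiltered_lookup(pyramid, u, v, glossiness):
    # One constant-time fetch per pixel: a glossier BRDF reads a finer (less blurred) level.
    max_level = len(pyramid) - 1
    level = int(round((1.0 - glossiness) * max_level))  # hypothetical level-selection heuristic
    tex = pyramid[level]
    x = min(int(u * tex.shape[1]), tex.shape[1] - 1)
    y = min(int(v * tex.shape[0]), tex.shape[0] - 1)
    return tex[y, x]

# Example: a 16x16 cached incident-radiance map, then one lookup per shaded pixel.
cached_radiance = np.random.rand(16, 16)
pyramid = build_mip_pyramid(cached_radiance)
print(prefiltered_lookup(pyramid, 0.3, 0.7, glossiness=0.8))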
42.
This article presents a case study on retrospective verification of the Linux Virtual File System (VFS), aimed at checking violations of API usage rules and memory properties. Since VFS maintains dynamic data structures and is written in a mixture of C and inlined assembly, modern software model checkers cannot be applied. Our case study centres on our novel automated software verification tool, the SOCA Verifier, which symbolically executes and analyses compiled code. We describe how this verifier deals with complex features such as memory access, pointer aliasing and computed jumps in the VFS implementation, while reducing manual modelling to a minimum. Our results show that the SOCA Verifier is capable of analysing the complex Linux VFS implementation reliably and efficiently, thereby going beyond traditional testing tools and into niches that current software model checkers do not reach. This testifies to the SOCA Verifier's suitability as an effective and efficient bug-finding tool during the development of operating system components.
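As an illustration of the kind of API-usage rule such a verifier checks along each execution path, the sketch below (Python) encodes a simple acquire/release pairing rule as a small state machine and runs it over one event trace. The rule, the event names (iget/iput), and the trace are hypothetical examples for exposition; they are not the SOCA Verifier's internal model, which operates on symbolically executed compiled code.

class PairingRule:
    # Tracks acquire/release-style API calls observed along a single path.
    def __init__(self, acquire, release):
        self.acquire, self.release = acquire, release
        self.depth = 0
        self.violation = None

    def observe(self, call):
        if call == self.acquire:
            self.depth += 1
        elif call == self.release:
            if self.depth == 0:
                self.violation = "release without matching acquire"
            else:
                self.depth -= 1

    def finish(self):
        # At the end of the path, any unreleased resource is a rule violation.
        if self.violation is None and self.depth > 0:
            self.violation = "path ends with unreleased resource"
        return self.violation

rule = PairingRule("iget", "iput")
for event in ["iget", "lookup", "iget", "iput"]:   # one hypothetical execution path
    rule.observe(event)
print(rule.finish())   # -> "path ends with unreleased resource"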
43.
Music is a fundamental part of most cultures. Controlling music playback has commonly been used to demonstrate new interaction techniques and algorithms, and in particular to demonstrate and evaluate gesture recognition algorithms. Previous work, however, used gestures that were defined based on intuition, the developers' preferences, and the respective algorithm's capabilities. In this paper we propose a refined process for deriving gestures from constant user feedback, in which every result and design decision is validated in the subsequent step of the process, so that comprehensive feedback can be collected from each of the conducted user studies. Following this process, we develop a set of free-hand gestures for controlling music playback. The situational context is analysed to shape the usage scenario and derive an initial set of necessary functions. In a successive user study, this set of functions is validated and proposals for gestures are collected from participants for each function. Two gesture sets containing static and dynamic gestures are derived and analysed in a comparative evaluation, which shows the suitability of the identified gestures and allows further refinement. Our results indicate that the proposed process, with its validation of each design decision, improves the final results. By using the process to identify gestures for controlling music playback, we not only show that the refined process can be applied successfully, but also provide a consistent gesture set that can serve as a realistic benchmark for gesture recognition algorithms.
44.
We provide a Mathematica code for decomposing strongly correlated quantum states described by a first-quantized, analytical wave function into many-body Fock states. Within them, the single-particle occupations refer to the subset of Fock–Darwin functions with no nodes. Such states, commonly appearing in two-dimensional systems subjected to gauge fields, were first discussed in the context of quantum Hall physics and are nowadays very relevant in the field of ultracold quantum gases. As important examples, we explicitly apply our decomposition scheme to the prominent Laughlin and Pfaffian states. This allows for easily calculating the overlap of arbitrary states with these highly correlated test states, and thus provides a useful tool to classify correlated quantum systems. Furthermore, we can directly read off the angular momentum distribution of a state from its decomposition. Finally, we make use of our code to calculate the normalization factors for Laughlin's famous quasi-particle/quasi-hole excitations, from which we gain insight into the intriguing fractional behavior of these excitations.
Program summary:
Program title: Strongdeco
Catalogue identifier: AELA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 5475
No. of bytes in distributed program, including test data, etc.: 31 071
Distribution format: tar.gz
Programming language: Mathematica
Computer: Any computer on which Mathematica can be installed
Operating system: Linux, Windows, Mac
Classification: 2.9
Nature of problem: Analysis of strongly correlated quantum states.
Solution method: The program makes use of the tools developed in Mathematica to deal with multivariate polynomials to decompose analytical strongly correlated states of bosons and fermions into a standard many-body basis. Operations with polynomials, determinants and permanents are the basic tools.
Running time: The distributed notebook takes a couple of minutes to run.
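For reference, the $\nu = 1/m$ Laughlin state that such a decomposition targets can be written in its standard first-quantized, symmetric-gauge form (a textbook expression, not taken from the program itself):

\[
\Psi_m(z_1,\dots,z_N) \;=\; \prod_{1\le i<j\le N} (z_i - z_j)^m \,
\exp\!\Big(-\frac{1}{4\ell_B^2}\sum_{k=1}^{N} |z_k|^2\Big),
\qquad z_k = x_k + \mathrm{i}\,y_k,
\]

where $\ell_B$ is the magnetic length. Expanding the Jastrow polynomial $\prod_{i<j}(z_i - z_j)^m$ into monomials $\prod_k z_k^{n_k}$ and collecting terms with the same set of exponents yields the amplitudes of the Fock states built from the nodeless lowest-Landau-level (Fock–Darwin-type) orbitals $\phi_n(z) \propto z^{n}\, e^{-|z|^2/(4\ell_B^2)}$, up to normalization and the choice of length unit. Every term carries the same total angular momentum $L = m\,N(N-1)/2$, which is the quantity one reads off from the angular momentum distribution of the decomposition.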
45.
This study evaluates the implementation of physical coordination training (PCT) and cognitive behavioural training (CBTr) interventions in a randomised controlled trial at nine cleaners' workplaces. Female cleaners (n = 294) were randomised into a PCT, a CBTr or a reference (REF) group. Both 12-week interventions were performed in groups guided by an instructor. Records were kept on intervention dose (adherence), unanticipated events at the workplace (context), and quality of intervention delivery (fidelity). Participant adherence was 37% in the PCT and 49% in the CBTr intervention. Optimal implementation was reached by only 6% in the PCT and 42% in the CBTr group. Analysis of the barriers to successful implementation indicated that the intervention process is sensitive to unanticipated events. In order to succeed in improving the health of high-risk populations such as cleaners, and to correctly interpret intervention effects, more research on implementation is needed. Trial registration: ISRCTN96241850. PRACTITIONER SUMMARY: Both physical coordination training and cognitive behavioural training are potentially effective workplace interventions among low-educated job groups with high physical work demands. However, thorough consideration should be given to feasibility in the design of interventions. The optimal intervention should be tailored to closely match the implementation context and be robust and flexible to minimise susceptibility to changes in work organisation.
46.
47.
48.
Investigating the dynamical and physical properties of cosmic dust can reveal a great deal of information about both the dust and its many sources. Over recent years, several spacecraft (e.g., Cassini, Stardust, Galileo, and Ulysses) have successfully characterised interstellar, interplanetary, and circumplanetary dust using a variety of techniques, including in situ analyses and sample return. Charge, mass, and velocity measurements of the dust are performed either directly (induced charge signals) or indirectly (mass and velocity from impact ionisation signals or crater morphology) and constrain the dynamical parameters of the dust grains. Dust compositional information may be obtained via either time-of-flight mass spectrometry of the impact plasma or direct sample return. The accurate and reliable interpretation of collected spacecraft data requires a comprehensive programme of terrestrial instrument calibration. This process involves accelerating suitable solar system analogue dust particles to hypervelocity speeds in the laboratory, an activity performed at the Max Planck Institut für Kernphysik in Heidelberg, Germany. Here, a 2 MV Van de Graaff accelerator electrostatically accelerates charged micron and submicron-sized dust particles to speeds up to 80 km s⁻¹. Recent advances in dust production and processing have allowed solar system analogue dust particles (silicates and other minerals) to be coated with a thin conductive shell, enabling them to be charged and accelerated. Refinements and upgrades to the beam line instrumentation and electronics now allow for the reliable selection of particles at velocities of 1–80 km s⁻¹ and with diameters of between 0.05 μm and 5 μm. This ability to select particles for subsequent impact studies based on their charges, masses, or velocities is provided by a particle selection unit (PSU). The PSU contains a field programmable gate array, capable of monitoring in real time the particles' speeds and charges, and is controlled remotely by a custom, platform-independent software package. The new control instrumentation and electronics, together with the wide range of accelerable particle types, allow the controlled investigation of hypervelocity impact phenomena across a hitherto unobtainable range of impact parameters.
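The attainable speeds follow from energy conservation in an electrostatic accelerator (a back-of-the-envelope relation, not a figure quoted in the article): a grain of charge $q$ and mass $m$ falling through a potential $U$ reaches

\[
\tfrac{1}{2} m v^2 = qU \quad\Longrightarrow\quad v = \sqrt{\frac{2qU}{m}} .
\]

With $U = 2\,\mathrm{MV}$, reaching $v = 80\,\mathrm{km\,s^{-1}}$ requires a charge-to-mass ratio of roughly $q/m = v^2/(2U) \approx (8\times 10^{4})^2 / (4\times 10^{6}) \approx 1.6\times 10^{3}\,\mathrm{C\,kg^{-1}}$, which is why only the smallest, most highly charged grains reach the top of the quoted velocity range.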
49.
50.
Blockchain technology is destined to revolutionise supply chain processes. At the same time, governmental and regulatory policies are forcing firms to adjust their supply chains in response to environmental concerns. The objective of this study is therefore to develop a distributed ledger-based blockchain approach for monitoring supply chain performance and optimising both emission levels and operational costs in a synchronised fashion, producing a better outcome for the supply chain. We propose the blockchain approach for different production allocation problems within a multi-echelon supply chain (MESC) under a carbon taxation policy. As such, we couple recent advances in digitalisation of operations with increasingly stringent regulatory environmental policies. Specifically, with lead time considerations under emission rate constraints (imposed by a carbon taxation policy), we simultaneously consider the production, distribution and inventory control decisions in a production allocation-based MESC problem. The problem is then formulated as a Mixed Integer Non-Linear Programming (MINLP) model. We show that the distributed ledger-based blockchain approach minimises both total cost and carbon emissions. We then validate the feasibility of the proposed approach by comparing the results with a non-dominated sorting genetic algorithm (NSGA-II). The findings provide support for policymakers and supply chain executives alike.
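As a schematic illustration of how a carbon tax enters such a model (a generic single-objective form under stated assumptions, not the authors' exact MINLP), the production-allocation decisions $x$ might be chosen to

\[
\min_{x \in \mathcal{X}} \; C_{\mathrm{prod}}(x) + C_{\mathrm{trans}}(x) + C_{\mathrm{inv}}(x) + \tau\, E(x)
\quad \text{subject to} \quad E(x) \le E_{\max},
\]

where $\tau$ is the carbon tax rate, $E(x)$ the emissions induced by the production, distribution and inventory decisions, $E_{\max}$ the emission-rate cap, and $\mathcal{X}$ the capacity, lead-time and flow-balance constraints; non-linearities in the cost and emission functions are what make the formulation an MINLP.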