Search results: 2,530 articles found.
91.
The accuracy of optical flow estimation algorithms has been improving steadily as evidenced by results on the Middlebury optical flow benchmark. The typical formulation, however, has changed little since the work of Horn and Schunck. We attempt to uncover what has made recent advances possible through a thorough analysis of how the objective function, the optimization method, and modern implementation practices influence accuracy. We discover that “classical” flow formulations perform surprisingly well when combined with modern optimization and implementation techniques. One key implementation detail is the median filtering of intermediate flow fields during optimization. While this improves the robustness of classical methods, it actually leads to higher energy solutions, meaning that these methods are not optimizing the original objective function. To understand the principles behind this phenomenon, we derive a new objective function that formalizes the median filtering heuristic. This objective function includes a non-local smoothness term that robustly integrates flow estimates over large spatial neighborhoods. By modifying this new term to include information about flow and image boundaries we develop a method that can better preserve motion details. To take advantage of the trend towards video in wide-screen format, we further introduce an asymmetric pyramid downsampling scheme that enables the estimation of longer range horizontal motions. The methods are evaluated on the Middlebury, MPI Sintel, and KITTI datasets using the same parameter settings.
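The median-filtering heuristic described in the abstract, applied to intermediate flow fields between optimization iterations, can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code; the function name and window size are assumptions for the example.

```python
import numpy as np

def median_filter_flow(flow, size=3):
    """Apply a size x size median filter to one flow component
    (u or v), as done between warping iterations in classical
    coarse-to-fine optical flow. Borders use edge padding."""
    pad = size // 2
    padded = np.pad(flow, pad, mode="edge")
    out = np.empty_like(flow)
    h, w = flow.shape
    for i in range(h):
        for j in range(w):
            # Median over the local neighborhood suppresses outliers
            # in the flow estimate at the cost of higher energy under
            # the original objective, as the paper observes.
            out[i, j] = np.median(padded[i:i + size, j:j + size])
    return out
```

In practice each flow component (horizontal and vertical) would be filtered independently after every warping step of the pyramid.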
92.

Implementing information and communications technology (ICT) at scale requires evaluation processes to capture the impact on users as well as the infrastructure into which it is being introduced. For older adults living with cognitive impairment, this requires evaluation that can accommodate different levels of cognitive impairment, alongside input from family and formal caregivers, plus stakeholder organisations. The European Horizon 2020 project INdependent LIving support Functions for the Elderly (IN LIFE) set out to integrate 17 technologies into a single digital platform for older people living with cognitive impairment plus their families, care providers and stakeholders. The IN LIFE evaluation took place across six national pilot sites to examine a number of variables, including impact on the users, user acceptance of the individual services and the overall platform, plus the economic case for the IN LIFE platform. The results confirmed the interest in, and need for, ICT among older adults, family caregivers, formal caregivers and stakeholders. Relative to the baseline, quality of life improved and cognition stabilised; however, there was an overall reluctance to pay for the platform. The findings provide insights into existing barriers and challenges for adoption of ICT for older people living with cognitive impairment.

93.
Processor Design in 3D Die-Stacking Technologies
Three-dimensional integration is an emerging fabrication technology that vertically stacks multiple integrated chips. The benefits include an increase in device density; much greater flexibility in routing signals, power, and clock; the ability to integrate disparate technologies; and the potential for new 3D circuit and microarchitecture organizations. This article provides a technical introduction to the technology and its impact on processor design. Although our discussions here primarily focus on high-performance processor design, most of the observations and conclusions apply to other microprocessor market segments.
94.
The need for thermophysical properties of components and their mixtures has grown as computer simulation of processes has developed and expanded. Although equations of state require fewer input data, they are not yet generally applicable to all types of systems. Accordingly, in many cases, the liquid activity models are still very much required. A long-time disadvantage of the liquid activity method, for systems containing supercritical components, is overcome if the Henry constant is utilized. A van Laar-type interpolative equation provides the Henry constant in liquid mixtures from the values in the pure liquid components. The addition of a ternary interaction in addition to the usual binary ones provides improved MVL prediction of phase equilibria, especially VLLE involving three phases. Examination of the consistency of thermal properties is made feasible with the aid of a generalized reduced Frost-Kalkwarf vapor pressure equation. It is useful also for extending and supplementing sparse data and for predicting properties from the structure and boiling point. Possible trends in properties needed and their availability to simulators are discussed in view of available computer facilities.
Invited paper presented at the Ninth Symposium on Thermophysical Properties, June 24–27, 1985, Boulder, Colorado, U.S.A.
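The interpolation idea described above can be illustrated with a minimal sketch. The functional form below (mole-fraction-weighted interpolation of the log Henry constants plus optional van Laar-type binary interaction terms) is an assumed simplification for illustration, not the paper's exact equation; the parameter dictionary `a` is hypothetical.

```python
import math

def henry_mixed(x, lnH_pure, a=None):
    """Estimate ln(Henry constant) of a supercritical solute in a
    mixed solvent from its values in the pure solvents.
    x        : solvent mole fractions
    lnH_pure : ln(Henry constant) in each pure solvent
    a        : optional binary interaction parameters a[(i, j)]
    With a=None this reduces to log-linear (geometric-mean)
    interpolation between the pure-solvent values."""
    lnH = sum(xi * li for xi, li in zip(x, lnH_pure))
    if a:
        n = len(x)
        for i in range(n):
            for j in range(i + 1, n):
                # van Laar-type symmetric correction term
                lnH += a.get((i, j), 0.0) * x[i] * x[j]
    return lnH
```

For an equimolar binary solvent with Henry constants 100 and 400 (arbitrary units), the uncorrected form returns the geometric mean, 200.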
95.
A high-resolution hard x-ray microscope is described. This system is capable of detecting line features as small as 0.6 µm in width, and resolving line pairs 1.2-µm wide and 1.2-µm apart. Three types of two-dimensional image detectors are discussed and compared for use with hard x rays in high resolution. Principles of x-ray image magnification are discussed based on x-ray optics and diffraction physics. Examples of applications are shown in microradiography with fiber reinforced composite materials (SiC in Ti3Al Nb) and in diffraction imaging (topography) with device patterns on a silicon single crystal. High-resolution tomography has now become a reality.
96.
97.
A ragged array is an irregularly shaped data structure that is an extremely convenient and natural means of implementing storage schemes that exploit the symmetry and sparsity of the different stiffness matrices involved in the finite-element method. Ragged arrays have the potential for improving the programmer’s productivity as well as enhancing code maintainability. Additionally, no performance degradation was detected when ragged arrays were used; the performance of the Gauss elimination procedure, implemented in C++ using ragged arrays, was comparable to the performance of the same procedure implemented in FORTRAN using traditional data structures.
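The ragged-array storage scheme the abstract describes can be sketched as follows. The paper's implementation is in C++; Python is used here only for brevity, and the function names are ours. Row i of the ragged array holds columns 0..i of the lower triangle of a symmetric stiffness matrix, roughly halving storage versus a full square array.

```python
def make_symmetric_ragged(n):
    """Lower triangle of an n x n symmetric matrix as a ragged array:
    row i has length i + 1, so total storage is n*(n+1)/2 entries."""
    return [[0.0] * (i + 1) for i in range(n)]

def sym_get(a, i, j):
    # Exploit symmetry: entry (i, j) and (j, i) share one slot,
    # always stored in the lower triangle.
    return a[i][j] if j <= i else a[j][i]

def sym_set(a, i, j, value):
    if j <= i:
        a[i][j] = value
    else:
        a[j][i] = value
```

A banded or skyline stiffness matrix can be handled the same way by giving each row only as many entries as its bandwidth requires, which is where the irregular ("ragged") shape pays off.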
98.
The genes of the trithorax group (trxG) in Drosophila melanogaster are required to maintain the pattern of homeotic gene expression that is established early in embryogenesis by the transient expression of the segmentation genes. The precise role of each of the diverse trxG members and the functional relationships among them are not well understood. Here, we report on the isolation of the trxG gene moira (mor) and its molecular characterization. mor encodes a fruit fly homolog of the human and yeast chromatin-remodeling factors BAF170, BAF155, and SWI3. mor is widely expressed throughout development, and its 170-kDa protein product is present in many embryonic tissues. In vitro, MOR can bind to itself and it interacts with Brahma (BRM), an SWI2-SNF2 homolog, with which it is associated in embryonic nuclear extracts. The leucine zipper motif of MOR is likely to participate in self-oligomerization; the equally conserved SANT domain, for which no function is known, may be required for optimal binding to BRM. MOR thus joins BRM and Snf5-related 1 (SNR1), two known Drosophila SWI-SNF subunits that act as positive regulators of the homeotic genes. These observations provide a molecular explanation for the phenotypic and genetic relationships among several of the trxG genes by suggesting that they encode evolutionarily conserved components of a chromatin-remodeling complex.
99.
Validation studies are a crucial requirement before implementation of new genetic typing systems for clinical diagnostics or forensic identity. Two different fluorescence-based multiplex DNA profiling systems composed of amelogenin, HumD21S11 and HumFGA (referred to as multiplex 1A), and HumD3S1358, HumD21S11 and HumFGA (multiplex 1B) have been evaluated for use in forensic identification using the Applied Biosystems Model 373A and Prism 377 DNA Sequencers, respectively. Experiments were aimed at defining the limit of target DNA required for reliable profiling, the level of degradation that would still permit amplification of the short tandem repeat (STR) loci examined, and the robustness of each locus in the multiplexes after samples were exposed to environmental insults. In addition, the specificity of the multiplexes was demonstrated using nonhuman DNAs. Forensically relevant samples such as cigarette butts, chewing gum, fingernails and envelope flaps were processed using both an organic extraction procedure and a QIAamp protocol. DNAs and resultant multiplex STR profiles were compared. The validation of the triplex STR systems was extended to include over 140 nonprobative casework specimens and was followed with a close monitoring of initial casework (over 300 exhibits). Our results document the robustness of these multiplex STR profiling systems which, when combined with other multiplex systems, could provide a power of discrimination of approximately 0.9999.
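The quoted combined power of discrimination of approximately 0.9999 follows from the standard rule that the matching probabilities of independent loci multiply, so the combined PD is one minus the product of the per-locus matching probabilities. A minimal sketch, with illustrative values:

```python
from functools import reduce

def combined_discrimination(pd_values):
    """Combined power of discrimination across independent loci.
    Each locus has matching probability PM_i = 1 - PD_i; for
    independent loci these multiply, so
    PD_combined = 1 - product(1 - PD_i)."""
    pm = reduce(lambda acc, pd: acc * (1.0 - pd), pd_values, 1.0)
    return 1.0 - pm
```

For example, two loci each with PD = 0.9 combine to 0.99, and a handful of reasonably discriminating loci quickly push the combined figure past 0.9999, which is why triplexes are combined with other multiplex systems in practice.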
100.
The objective of this study was to validate retrospective caregiver interviews for diagnosing major causes of severe neonatal illness and death. A convenience sample of 149 infants aged < 28 days with one or more suspected diagnoses of interest (low birthweight/severe malnutrition, preterm birth, birth asphyxia, birth trauma, neonatal tetanus, pneumonia, meningitis, septicaemia, diarrhoea, congenital malformation or injury) was taken from patients admitted to two hospitals in Dhaka, Bangladesh. Study paediatricians performed a standardised history and physical examination and ordered laboratory and radiographic tests according to study criteria. With a median interval of 64.5 days after death or hospital discharge, caregivers of 118 (79%) infants were interviewed about their child's illness. Using reference diagnoses based on predefined clinical and laboratory criteria, the sensitivity and specificity of particular combinations of signs (algorithms) reported by the caregivers were ascertained. Sufficient numbers of children with five reference standard diagnoses were studied to validate caregiver reports. Algorithms with sensitivity and specificity > 80% were identified for neonatal tetanus, low birthweight/severe malnutrition and preterm delivery. Algorithms with specificities > 80% for birth asphyxia and pneumonia had sensitivities < 70%, or alternatively had high sensitivity with lower specificity. In settings with limited access to medical care, retrospective caregiver interviews provide a valid means of diagnosing several of the most common causes of severe neonatal illness and death.
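The sensitivity and specificity figures cited for the caregiver-report algorithms are the usual confusion-matrix ratios against the reference diagnoses. A minimal sketch, with illustrative counts:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of infants with the
    reference diagnosis whose caregiver-reported sign combination
    was positive.
    Specificity = TN / (TN + FP): fraction of infants without the
    diagnosis whom the algorithm correctly excluded."""
    return tp / (tp + fn), tn / (tn + fp)
```

An algorithm that captures 80 of 100 true cases and excludes 90 of 100 non-cases would score 0.80 sensitivity and 0.90 specificity, meeting the > 80% specificity threshold used in the study but falling at the margin on sensitivity.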
Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · 京ICP备09084417号