  Fee-based full text: 2078 articles
  Free: 223 articles
  Free for domestic users: 2 articles
Electrical engineering: 22 articles
Chemical industry: 591 articles
Metalworking: 60 articles
Machinery and instrumentation: 71 articles
Building science: 88 articles
Mining engineering: 2 articles
Energy and power engineering: 90 articles
Light industry: 393 articles
Water conservancy engineering: 20 articles
Petroleum and natural gas: 15 articles
Radio and electronics: 177 articles
General industrial technology: 354 articles
Metallurgical industry: 168 articles
Atomic energy technology: 16 articles
Automation technology: 236 articles
  2024: 9 articles
  2023: 25 articles
  2022: 95 articles
  2021: 116 articles
  2020: 90 articles
  2019: 83 articles
  2018: 109 articles
  2017: 111 articles
  2016: 107 articles
  2015: 98 articles
  2014: 99 articles
  2013: 160 articles
  2012: 156 articles
  2011: 142 articles
  2010: 118 articles
  2009: 112 articles
  2008: 104 articles
  2007: 69 articles
  2006: 57 articles
  2005: 57 articles
  2004: 43 articles
  2003: 28 articles
  2002: 29 articles
  2001: 24 articles
  2000: 19 articles
  1999: 19 articles
  1998: 51 articles
  1997: 37 articles
  1996: 28 articles
  1995: 22 articles
  1994: 10 articles
  1993: 9 articles
  1992: 4 articles
  1991: 10 articles
  1990: 7 articles
  1989: 5 articles
  1988: 2 articles
  1987: 3 articles
  1986: 6 articles
  1985: 6 articles
  1984: 2 articles
  1983: 1 article
  1982: 1 article
  1981: 5 articles
  1980: 1 article
  1977: 3 articles
  1976: 8 articles
  1974: 1 article
  1966: 1 article
  1965: 1 article
Sort order: 2303 query results in total; search time 15 ms
41.
The international planning competition (IPC) is an important driver for planning research. The general goals of the IPC include pushing the state of the art in planning technology by posing new scientific challenges, encouraging direct comparison of planning systems and techniques, developing and improving a common planning domain definition language, and designing new planning domains and problems for the research community. This paper focuses on the deterministic part of the fifth international planning competition (IPC5), presenting the language and benchmark domains that we developed for the competition, as well as a detailed experimental evaluation of the deterministic planners that entered IPC5, which helps to understand the state of the art in the field. We present an extension of PDDL, called PDDL3, allowing the user to express strong and soft constraints on the structure of the desired plans, as well as strong and soft problem goals. We discuss the expressive power of the new language, focusing on the restricted version that was used in IPC5, for which we give some basic results about its compilability into PDDL2. Moreover, we study the relative performance of the IPC5 planners in terms of solved problems, CPU time, and plan quality; we analyse their behaviour with respect to the winners of the previous competition; and we evaluate them in terms of their capability of dealing with soft goals and constraints, and of finding good-quality plans in general. Overall, the results indicate significant progress in the field, but they also reveal that some important issues remain open and require further research, such as dealing with strong constraints and computing high-quality plans in metric-time domains and domains involving soft goals or constraints.
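To make the role of soft goals in plan quality concrete, here is a small, hypothetical Python sketch (not the IPC5 evaluation code, and the preference names and weights are invented): a plan metric that adds weighted penalties for violated preferences to the plan's action cost, in the spirit of PDDL3-style metrics.

```python
# Hypothetical illustration of a PDDL3-style plan metric: total action cost
# plus a weighted penalty for every violated soft goal/constraint ("preference").
# Preference names, weights, and costs below are invented for the example.

def plan_quality(action_costs, violated, weights):
    """action_costs: per-action costs of the plan.
    violated: dict preference_name -> True/False (was the preference violated?).
    weights: dict preference_name -> penalty weight from the problem's metric."""
    base_cost = sum(action_costs)
    penalty = sum(weights[p] for p, is_violated in violated.items() if is_violated)
    return base_cost + penalty

# Example: a 3-action plan that satisfies one soft goal and violates another.
print(plan_quality([1.0, 2.0, 1.5],
                   {"deliver-on-time": False, "avoid-toll-road": True},
                   {"deliver-on-time": 10.0, "avoid-toll-road": 3.0}))  # -> 7.5
```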
42.
The emergence of gestural interaction devices has prompted various studies on multimodal human–computer interaction aimed at improving user experience (UX). However, there is a knowledge gap regarding the use of these devices to enhance learning. We present an exploratory study which analysed the UX with a multimodal immersive videogame prototype, based on a Portuguese historical/cultural episode. Evaluation tests took place in high school environments and at public videogaming events. Two users were present simultaneously in the same virtual reality (VR) environment: one as the helmsman aboard Vasco da Gama’s fifteenth-century Portuguese ship and the other as the mythical Adamastor stone giant at the Cape of Good Hope. The helmsman player wore a VR headset to explore the environment, whereas the giant player used body motion to control the giant and observed the results on a screen, with no headset. This allowed a preliminary characterisation of UX, identifying challenges and the potential use of these devices in multi-user virtual learning contexts. We also discuss the combined use of such devices towards the future development of similar systems, and its implications for learning improvement through multimodal human–computer interaction.
43.
Cloud computing systems handle large volumes of data by using almost unlimited computational resources, while spatial data warehouses (SDWs) are multidimensional databases that store huge volumes of both spatial data and conventional data. Cloud computing environments have been considered adequate to host voluminous databases, process analytical workloads and deliver database as a service, while spatial online analytical processing (spatial OLAP) queries issued over SDWs are intrinsically analytical. However, hosting an SDW in the cloud and processing spatial OLAP queries over such a database impose novel obstacles. In this article, we introduce novel concepts such as cloud SDW and spatial OLAP as a service, and afterwards detail the design of novel schemas for cloud SDW and for spatial OLAP query processing over cloud SDWs. Furthermore, we evaluate the performance of processing spatial OLAP queries in cloud SDWs using our own query processor aided by a cloud spatial index. Moreover, we describe the cloud spatial bitmap index, designed to improve the performance of processing spatial OLAP queries in cloud SDWs, and assess it through an experimental evaluation. The results of our experiments revealed that this index reduced query response times by 58.20% to 98.89%.
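As a rough illustration of why a bitmap index speeds up such region-restricted aggregations, the following Python sketch pre-computes one bit vector per spatial region and answers a query by scanning only the rows whose bit is set. It is a toy under our own assumptions (the schema, region names, and measures are invented), not the authors' cloud spatial bitmap index.

```python
# Toy sketch of a bitmap-style index over a fact table keyed by spatial region.
# Schema and data are invented for illustration only.

from collections import defaultdict

fact_rows = [
    {"region": "north", "sales": 120.0},
    {"region": "south", "sales": 80.0},
    {"region": "north", "sales": 45.0},
    {"region": "east",  "sales": 60.0},
]

# Build one bitmap (a Python int used as a bit vector) per region value.
bitmaps = defaultdict(int)
for i, row in enumerate(fact_rows):
    bitmaps[row["region"]] |= 1 << i

def sum_sales(regions):
    """Aggregate only the rows selected by OR-ing the requested regions' bitmaps."""
    mask = 0
    for r in regions:
        mask |= bitmaps.get(r, 0)
    return sum(row["sales"] for i, row in enumerate(fact_rows) if mask & (1 << i))

print(sum_sales({"north", "east"}))  # -> 225.0
```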
44.
Shuhong Gao (2003) [6] proposed an efficient algorithm to factor a bivariate polynomial f over a field F. This algorithm is based on a simple partial differential equation and depends on a crucial fact: the dimension of the polynomial solution space G associated with this differential equation is equal to the number r of absolutely irreducible factors of f. However, this holds only when the characteristic of F is either zero or sufficiently large in terms of the degree of f. In this paper we characterize a vector subspace of G whose dimension is r, regardless of the characteristic of F, and for which the properties of Gao’s construction hold. Moreover, we identify a second vector subspace of G that leads to an analogous theory for the rational factorization of f.
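For context, the differential equation at the heart of Gao's method is commonly written as below; this is recalled from the literature as a hedged reminder rather than quoted from the paper, with g and h the unknown polynomials subject to the usual degree bounds, and G the space of admissible g mentioned in the abstract.

```latex
\frac{\partial}{\partial y}\!\left(\frac{g}{f}\right)
  \;=\;
\frac{\partial}{\partial x}\!\left(\frac{h}{f}\right)
\quad\Longleftrightarrow\quad
f\left(\frac{\partial g}{\partial y}-\frac{\partial h}{\partial x}\right)
  \;=\;
g\,\frac{\partial f}{\partial y}-h\,\frac{\partial f}{\partial x},
```

and, when the characteristic of F is zero or sufficiently large, \dim G = r, the number of absolutely irreducible factors of f.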
45.
46.
The code STATFLUX, implementing a new and simple statistical procedure for the calculation of transfer coefficients in radionuclide transport to animals and plants, is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. Flow parameters were estimated by employing two different least-squares procedures, the Derivative and Gauss-Marquardt methods, with the available experimental data of radionuclide concentrations as the input functions of time. The solution of the inverse problem, which relates a given set of flow parameters to the time evolution of the concentration functions, is achieved via a Monte Carlo simulation procedure.

Program summary

Title of program: STATFLUX
Catalogue identifier: ADYS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYS_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computer for which the program is designed and others on which it has been tested: Microcomputer with Intel Pentium III, 3.0 GHz
Installation: Laboratory of Linear Accelerator, Department of Experimental Physics, University of São Paulo, Brazil
Operating system: Windows 2000 and Windows XP
Programming language used: Fortran-77 as implemented in Microsoft Fortran 4.0. NOTE: Microsoft Fortran includes non-standard features which are used in this program. Standard Fortran compilers such as g77, f77, ifort and NAG95 are not able to compile the code, and therefore it has not been possible for the CPC Program Library to test the program.
Memory required to execute with typical data: 8 Mbytes of RAM and 100 MB of hard disk space
No. of bits in a word: 16
No. of lines in distributed program, including test data, etc.: 6912
No. of bytes in distributed program, including test data, etc.: 229 541
Distribution format: tar.gz
Nature of the physical problem: The investigation of transport mechanisms for radioactive substances through environmental pathways is very important for the radiological protection of populations. One such pathway, associated with the food chain, is the grass-animal-man sequence. The distribution of trace elements in humans and laboratory animals has been intensively studied over the past 60 years [R.C. Pendlenton, C.W. Mays, R.D. Lloyd, A.L. Brooks, Differential accumulation of iodine-131 from local fallout in people and milk, Health Phys. 9 (1963) 1253-1262]. In addition, investigations on the incidence of cancer in humans, and a possible causal relationship to radioactive fallout, have been undertaken [E.S. Weiss, M.L. Rallison, W.T. London, W.T. Carlyle Thompson, Thyroid nodularity in southwestern Utah school children exposed to fallout radiation, Amer. J. Public Health 61 (1971) 241-249; M.L. Rallison, B.M. Dobyns, F.R. Keating, J.E. Rall, F.H. Tyler, Thyroid diseases in children, Amer. J. Med. 56 (1974) 457-463; J.L. Lyon, M.R. Klauber, J.W. Gardner, K.S. Udall, Childhood leukemia associated with fallout from nuclear testing, N. Engl. J. Med. 300 (1979) 397-402]. Of the pathways by which radionuclides enter the human (or animal) body, ingestion is the most important because it is closely related to life-long alimentary (or dietary) habits. Those radionuclides which are able to enter living cells by either metabolic or other processes give rise to localized doses which can be very high. The evaluation of these internally localized doses is of paramount importance for the assessment of radiobiological risks and for radiological protection. The time behavior of trace concentrations in organs is the principal input for the prediction of internal doses after acute or chronic exposure. The General Multiple-Compartment Model (GMCM) is a powerful and widely accepted method for biokinetic studies, which allows the calculation of the concentration of trace elements in organs as a function of time when the flow parameters of the model are known. However, few biokinetics data exist in the literature, and the determination of flow and transfer parameters by statistical fitting for each system is an open problem.
Restriction on the complexity of the problem: This version of the code works with the constant-volume approximation, which is valid for many situations where the biological half-life of a trace is shorter than the volume rise time. Another restriction is related to the central-flux model: the code assumes that there exists one central compartment (e.g., blood) that connects the flow with all other compartments, and flow between the other compartments is not included.
Typical running time: Depends on the chosen calculation. Using the Derivative Method the time is very short (a few minutes) for any number of compartments considered. When the Gauss-Marquardt iterative method is used, the calculation time can be approximately 5-6 hours when ∼15 compartments are considered.
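As a hedged reminder of the kind of system the GMCM describes, a generic linear multiple-compartment model (written here in standard notation from the abstract's description, not copied from the STATFLUX source; C_i are compartment concentrations and k_ij the flow parameters) takes the form:

```latex
\frac{dC_i(t)}{dt} \;=\; \sum_{j \neq i} k_{ji}\,C_j(t) \;-\; \sum_{j \neq i} k_{ij}\,C_i(t),
\qquad i = 1,\dots,N,
```

with the central-flux restriction corresponding to k_{ij} = 0 unless compartment i or j is the central (blood) compartment.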
47.
Recently, a number of empirical studies have compared the performance of PCA and ICA as feature extraction methods in appearance-based object recognition systems, with mixed and seemingly contradictory results. In this paper, we briefly describe the connection between the two methods and argue that whitened PCA may yield identical results to ICA in some cases. Furthermore, we describe the specific situations in which ICA might significantly improve on PCA.
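A quick way to probe this claim empirically is sketched below, assuming scikit-learn is available: it checks that FastICA features span the same linear subspace as whitened PCA features on synthetic data. This is a toy check under our own assumptions, not the experimental protocol of the cited recognition studies.

```python
# Toy comparison of whitened PCA and FastICA feature subspaces on random data.
# If ICA amounts to a rotation of the whitened PCA features, both feature sets
# span the same subspace and the residual below is near zero.

import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.RandomState(0)
X = rng.rand(200, 20)                 # 200 synthetic "images", 20 features each

k = 5
Y_pca = PCA(n_components=k, whiten=True).fit_transform(X)
Y_ica = FastICA(n_components=k, random_state=0).fit_transform(X)

# Project the ICA features onto the whitened-PCA feature subspace and measure
# the relative residual of that projection.
proj, *_ = np.linalg.lstsq(Y_pca, Y_ica, rcond=None)
residual = np.linalg.norm(Y_ica - Y_pca @ proj) / np.linalg.norm(Y_ica)
print(f"relative residual of ICA features in whitened-PCA subspace: {residual:.2e}")
```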
48.
The aim of this study is to present histological and immunofluorescence analyses of the renal parenchyma of agoutis affected by gentamicin-induced renal disease after the infusion of bone marrow mononuclear cells (BMMC) stained with Hoechst®. Nine male agoutis were divided into three groups: a test group (TG) with gentamicin-induced renal disease (n = 3), a cell therapy group (CTG) with gentamicin-induced renal disease and BMMC infusion (n = 3), and a control group (CG) without renal disease and with BMMC infusion (n = 3). TG and CTG were submitted to the renal disease induction protocol, consisting of weekly applications of gentamicin sulfate for 4 months. CG and CTG received 1 × 10⁸ BMMC stained with Hoechst and were euthanized for kidney examination 21 days after the BMMC injection, when samples were collected for histological and immunofluorescence analyses. Histological analysis demonstrated interstitial kidney lesions similar to those of the human disease, such as tubular necrosis, glomerular destruction, tubular atrophy, fibrotic areas, and collagen deposition. We conclude that the histological analysis supports the use of agoutis as a model of gentamicin-induced kidney disease, and that the immunofluorescence analysis suggests a significant migration of BMMC to sites of renal injury in the CTG. Microsc. Res. Tech., 2012. © 2011 Wiley Periodicals, Inc.
49.
Atomic broadcast is a fundamental problem of distributed systems: it requires that messages be delivered in the same order to their destination processes. This paper describes a solution to this problem in asynchronous distributed systems in which processes can crash and recover. A consensus-based solution to the atomic broadcast problem was designed by Chandra and Toueg for asynchronous distributed systems in which crashed processes do not recover. We extend this approach: our protocol transforms any consensus protocol suited to the crash-recovery model into an atomic broadcast protocol suited to the same model. We show that atomic broadcast can be implemented with few additional log operations beyond those required by the consensus. The paper also discusses how additional log operations can improve the protocol in terms of faster recovery and better throughput. To illustrate the use of the protocol, the paper also describes a solution to the replica management problem in asynchronous distributed systems in which processes can crash and recover. The proposed technique builds a bridge between established results on weighted voting and recent results on the consensus problem.
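The consensus-to-atomic-broadcast reduction can be sketched roughly as follows. This is a simplified, single-node illustration: `consensus(instance, proposal)` and `log_write` are hypothetical stand-ins for the real primitives, and the actual crash-recovery protocol involves considerably more bookkeeping.

```python
# Simplified sketch of consensus-based atomic broadcast (Chandra-Toueg style):
# undelivered messages are proposed to successive consensus instances, and each
# decided batch is delivered in the same deterministic order at every process.
# consensus() and log_write() below are hypothetical placeholders.

import json

def log_write(state_file, state):
    """Persist protocol state so a crashed process can recover and continue."""
    with open(state_file, "w") as f:
        json.dump(state, f)

def atomic_broadcast_loop(pending, consensus, state_file="abcast.log"):
    """pending: set of received-but-undelivered messages.
    consensus: function (instance_number, proposal) -> decided list of messages."""
    delivered, instance = [], 0
    while pending:
        decision = consensus(instance, sorted(pending))   # same decision everywhere
        for msg in decision:                              # deliver in decided order
            if msg not in delivered:
                delivered.append(msg)
                pending.discard(msg)
        # Log only what is needed to resume after a crash; keeping such forced
        # writes rare is the kind of saving the paper's protocol aims for.
        log_write(state_file, {"instance": instance, "delivered": delivered})
        instance += 1
    return delivered

# Single-process demo with a trivial "consensus" that just accepts the proposal.
print(atomic_broadcast_loop({"m2", "m1", "m3"}, lambda k, prop: prop))
```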
50.
Format dependence implies that assessment of the same subjective probability distribution produces different conclusions about over- or underconfidence depending on the assessment format. In two experiments, the authors demonstrate that the overconfidence bias that occurs when participants produce intervals for an uncertain quantity is almost abolished when they evaluate the probability that the same intervals include the quantity. The authors successfully apply a method for adaptive adjustment of probability intervals as a debiasing tool and discuss a tentative explanation in terms of a naive sampling model. According to this view, people report their experiences accurately, but they are naive in that they treat both sample proportion and sample dispersion as unbiased estimators, yielding small bias in probability evaluation but strong bias in interval production. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
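The asymmetry between interval production and evaluation can be illustrated with a small simulation, written here under our own simplifying assumptions (normal population, intervals built naively from a small sample's mean and SD, nominal 90% coverage); it is not the authors' naive sampling model itself.

```python
# Illustrative simulation: intervals produced from a small sample's mean and SD,
# treating the sample SD as if it were the population SD, capture a new draw
# less often than the nominal 90%, mimicking overconfidence in interval
# production. All modelling choices here (normal population, n=5, z=1.645) are ours.

import random, statistics

random.seed(1)
NOMINAL_Z, N, TRIALS = 1.645, 5, 20000   # 90% nominal coverage, tiny samples

hits = 0
for _ in range(TRIALS):
    sample = [random.gauss(0.0, 1.0) for _ in range(N)]
    m, s = statistics.mean(sample), statistics.stdev(sample)
    target = random.gauss(0.0, 1.0)       # the new quantity the interval should capture
    if m - NOMINAL_Z * s <= target <= m + NOMINAL_Z * s:
        hits += 1

print(f"nominal coverage 90%, observed {100 * hits / TRIALS:.1f}%")  # well below 90%
```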