991.
In a twin study using direct behavioral observation of parent–child interaction, as well as ratings and experimental measures, the question of the differential treatment by parents of monozygotic (MZ) and dizygotic (DZ) twins was investigated. Data were obtained from 17 MZ and 29 DZ male twin pairs and 44 male singletons, all aged 2½ yrs. Four separate approaches, taken together, led to the conclusions that (a) parents do treat MZ twins more alike than DZ twins in some respects; but (b) they do not introduce systematically greater similarity of treatment for MZ twins in actions which they initiate themselves; and (c) the greater homogeneity of treatment of MZ twins, where it occurs, is in line with their actual, rather than their perceived, zygosity. In other words, parents respond to, rather than create, differences between the twins. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
992.
The authors propose a new approach for tracking the deformation of the left-ventricular (LV) myocardium from two-dimensional (2-D) magnetic resonance (MR) phase contrast velocity fields. The use of phase contrast MR velocity data in cardiac motion problems has been introduced by others (N.J. Pelc et al., 1991) and shown to be potentially useful for tracking discrete tissue elements and, therefore, characterizing LV motion. However, the authors show here that these velocity data: 1) are extremely noisy near the LV borders; and 2) cannot alone be used to estimate the motion and deformation of the entire myocardium, due to noise in the velocity fields. In this new approach, the authors use the natural spatial constraints of the endocardial and epicardial contours, detected semiautomatically in each image frame, to help remove noisy velocity vectors at the LV contours. The information from both the boundaries and the phase contrast velocity data is then integrated into a deforming mesh that is placed over the myocardium at one time frame and then tracked over the entire cardiac cycle. The deformation is guided by a Kalman filter that provides a compromise between 1) believing the dense-field velocity and contour data when they are crisp and coherent in a local spatial and temporal sense and 2) employing a temporally smooth cyclic model of cardiac motion when contour and velocity data are not trustworthy. The Kalman filter is particularly well suited to this task, as it produces an optimal estimate of the left ventricle's kinematics (in the sense that the error is statistically minimized) given incomplete, noise-corrupted data and a basic dynamical model of the left ventricle. The method has been evaluated with simulated data; the average error between tracked nodes and their theoretical positions was 1.8% of the total path length. The algorithm has also been evaluated with phantom data; the average error was 4.4% of the total path length. Initial tests with phantoms show that the new approach offers small but concrete improvements over previous techniques that used primarily phase contrast velocity data alone, and the authors expect these improvements to be amplified greatly as they move to direct comparisons on in vivo and three-dimensional (3-D) datasets.
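The filtering compromise described in this abstract can be made concrete with a minimal sketch (an illustration under assumed parameters, not the authors' implementation): a constant-velocity state model stands in for the smooth cyclic cardiac motion model, and the observation noise variance r is inflated wherever the phase contrast velocity measurement is untrustworthy, such as near the LV borders.

```python
import numpy as np

# State x = [position, velocity] for one tracked mesh node; F is a
# constant-velocity motion model standing in for the paper's smooth
# cyclic cardiac model, and H picks out velocity (the phase contrast
# measurement). All values below are assumed for illustration.
dt = 0.033                      # frame interval (s), assumed
F = np.array([[1.0, dt],
              [0.0, 1.0]])      # motion model
H = np.array([[0.0, 1.0]])      # we observe velocity only
Q = np.eye(2) * 1e-4            # process noise (trust in the model)

def kalman_step(x, P, z, r):
    """One predict/update cycle. A large observation variance r (e.g.
    near the LV borders, where velocity data are noisy) makes the
    filter lean on the motion model instead of the measurement."""
    x = F @ x                   # predict with the motion model
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + r         # innovation covariance
    K = P @ H.T / S             # Kalman gain
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for z, r in [(1.0, 0.1), (0.9, 0.1), (5.0, 10.0)]:  # last frame: distrusted
    x, P = kalman_step(x, P, z, r)
print(x)                        # filtered position and velocity
```

Raising r in the final step makes the filter fall back on the motion model, which is the behavior the abstract describes for noisy border data.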
993.
An approach to analyzing and quantifying the shape characteristics of the endocardial contour of the left ventricle of the heart is described. The computation begins by finding the local curvature differences between the contour under consideration and the mean normal contour at each of 100 equidistant points. The weighted square of these differences, summed over a set of points, is shown to be the regional or global bending energy required to deform the mean normal contour to the characteristic shape of the analyzed contour. Resampling, smoothing, and curvature computation issues are considered for the image-derived digital contours that are used in the analysis. Experiments were performed on artificial contour data and data derived from contrast ventriculographic (CV) studies of humans. It is also shown that the method has been adapted to measure endocardial shape from equilibrium radionuclide angiocardiography.
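As a sketch of the measure described above (illustrative code, not the paper's implementation), the bending energy is the weighted sum of squared differences between the local curvature of the analyzed contour and that of the mean normal contour, evaluated at equidistant sample points:

```python
import numpy as np

def curvature(x, y):
    """Discrete curvature of a sampled contour (plain finite
    differences; a real implementation would smooth the contour
    first, as the abstract notes)."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

def bending_energy(kappa, kappa_mean, weights=None):
    """Weighted sum of squared curvature differences; summing over a
    subset of the points gives a regional rather than global energy."""
    w = np.ones_like(kappa) if weights is None else weights
    return np.sum(w * (kappa - kappa_mean) ** 2)

# Example: a unit circle compared against a slightly elliptical
# "mean normal" contour, both sampled at 100 equidistant points
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
k_obs = curvature(np.cos(t), np.sin(t))
k_ref = curvature(1.1 * np.cos(t), np.sin(t))
print(bending_energy(k_obs, k_ref))
```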
994.
In this paper we consider the problem of deadlock-free routing in arbitrary parallel and distributed computers. We focus on asynchronous routing algorithms which continuously receive new packets to route and which do not discard packets that encounter congestion. Specifically, we examine what we call the deadlock-free routing (DFR) problem. The input to the DFR problem consists of an arbitrary network and an arbitrary set of paths in the network. The output consists of a routing algorithm, which is a list of the buffers used along each of the paths. The routing algorithm is required to be free from deadlock, and the goal is to minimize the number of buffers required in any one node. We study the DFR problem by converting it into an equivalent problem which we call the flattest common supersequence (FCS) problem. The input to the FCS problem consists of a set of sequences and the output consists of a single sequence that contains all of the input sequences as (possibly noncontiguous) subsequences. The goal of the FCS problem is to minimize the maximum frequency of any symbol in the output sequence. We present three main results. First, we prove that the decision version of the FCS problem is NP-complete and has no polynomial-time approximation scheme unless P = NP. An alternative proof is presented which shows that, unlike the shortest common supersequence (SCS) problem, the FCS problem is still NP-complete for two input sequences. This implies that approximation algorithms for FCS based on an exact pairwise merge are not possible. Next, we propose and experimentally evaluate a range of heuristics for FCS. Our experimental results show that one of these heuristics performs very well over a wide range of inputs. Lastly, we prove that this heuristic is in fact optimal for certain restricted classes of inputs. Online publication November 27, 2000.
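To make the FCS objective concrete, here is a toy greedy heuristic (our illustration; not necessarily one of the heuristics evaluated in the paper): at each step, emit the symbol that is the next unmatched symbol of the largest number of input sequences. The Counter printed at the end shows the per-symbol frequencies whose maximum FCS seeks to minimize.

```python
from collections import Counter

def greedy_fcs(sequences):
    """Toy greedy common-supersequence builder: every input sequence
    ends up as a (possibly noncontiguous) subsequence of the output."""
    pos = [0] * len(sequences)          # cursor into each input sequence
    out = []
    while any(p < len(s) for p, s in zip(pos, sequences)):
        # Candidate symbols: the next unmatched symbol of each sequence
        heads = Counter(s[p] for p, s in zip(pos, sequences) if p < len(s))
        sym = heads.most_common(1)[0][0]  # advance the most sequences
        out.append(sym)
        pos = [p + 1 if p < len(s) and s[p] == sym else p
               for p, s in zip(pos, sequences)]
    return "".join(out)

sup = greedy_fcs(["abcb", "bca", "acb"])
print(sup, Counter(sup))  # e.g. "abcba" with max symbol frequency 2
```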
995.
Humans and robots need to exchange information if the objective is to achieve a task collaboratively. Two questions are considered in this paper: what and when to communicate. To answer these questions, we developed a human–robot communication framework which makes use of common probabilistic robotics representations. The data stored in the representation determines what to communicate, and probabilistic inference mechanisms determine when to communicate. One application domain of the framework is collaborative human–robot decision making: robots use decision theory to select actions based on perceptual information gathered from their sensors and human operators. In this paper, operators are regarded as remotely located, valuable information sources which need to be managed carefully. Robots decide when to query operators using Value-of-Information theory, i.e. humans are only queried if the expected benefit of their observation exceeds the cost of obtaining it. This can be seen as a mechanism for adjustable autonomy whereby adjustments are triggered at run-time based on the uncertainty in the robots’ beliefs related to their task. This semi-autonomous system is demonstrated using a navigation task and evaluated by a user study. Participants navigated a robot in simulation using the proposed system and via classical teleoperation. Results show that our system has a number of advantages over teleoperation with respect to performance, operator workload, usability, and the users’ perception of the robot. We also show that despite these advantages, teleoperation may still be a preferable driving mode depending on the mission priorities.
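The query rule described above reduces to a simple expected-utility comparison. The following is a self-contained sketch (all numbers, state spaces, and function names are illustrative assumptions, not the authors' system): the robot computes the expected gain from hearing the operator's answer and queries only if that gain exceeds the query cost.

```python
def best_eu(belief, action_values):
    """Expected utility of the best action: belief[s] is P(state s),
    action_values[a][s] is the utility of action a in state s."""
    return max(sum(p * v for p, v in zip(belief, vals))
               for vals in action_values)

def posterior(belief, likelihood_given_state):
    """Bayes update of the belief for one possible operator answer;
    also returns the marginal probability of that answer."""
    joint = [p * l for p, l in zip(belief, likelihood_given_state)]
    z = sum(joint)
    return [j / z for j in joint], z

def should_query(belief, action_values, answer_likelihoods, query_cost):
    """Query iff the expected post-answer utility minus the current
    best utility (the value of information) exceeds the query cost."""
    voi = -best_eu(belief, action_values)
    for lik in answer_likelihoods:          # one row per possible answer
        post, p_answer = posterior(belief, lik)
        voi += p_answer * best_eu(post, action_values)
    return voi > query_cost

# Two states (e.g. "path clear" / "path blocked"), two actions
belief = [0.6, 0.4]
action_values = [[10, -20],   # drive through
                 [0, 0]]      # take a safe detour
answer_likelihoods = [[0.9, 0.2],   # P(operator says "clear" | state)
                      [0.1, 0.8]]   # P(operator says "blocked" | state)
print(should_query(belief, action_values, answer_likelihoods, 1.0))  # True
```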
996.
Despite the many potential benefits to its users, social networking appears to provide a rich setting for criminal activities and other misdeeds. In this paper we consider whether the risks of social networking are unique and novel to this context. Having considered the nature and range of applications to which social networks may be applied, we conclude that there are no exploits or fundamental threats inherent to the social networking setting. Rather, this communicative and social context acts as an enabler for existing, long-established and well-recognised exploits and activities.
997.
Public sector organizations (city authorities) have begun to explore ways to exploit big data to provide smarter solutions for cities. The way organizations learn to use new forms of technology has been widely researched. However, many public sector organizations have found themselves in new territory in trying to deploy and integrate this new form of technology (big data) with another fast-moving and relatively new concept (smart city). This paper is a cross-sectional scoping study—from two UK smart city initiatives—on the learning processes experienced by elite (top management) stakeholders in the advent and adoption of these two novel concepts. The findings are an experiential narrative account of learning to exploit big data to address issues by developing solutions through smart city initiatives. The findings revealed a set of moves in relation to the exploration and exploitation of big data through smart city initiatives: (a) knowledge finding; (b) knowledge reframing; (c) inter-organization collaborations; and (d) ex-post evaluations. Even though this is a time-sensitive scoping study, it gives an account of the current state of play in the use of big data in public sector organizations for creating smarter cities. This study has implications for practitioners in the smart city domain and contributes to academia by operationalizing and adapting Crossan et al.'s (Acad Manag Rev 24(3): 522–537, 1999) 4I model of organizational learning.
998.
Urban poverty is a complex socio-economic problem. The expected doubling of the urban population relative to rural areas by 2050, without a corresponding growth in economies and infrastructure, will worsen the problem, especially in emerging economies. Poor urban residents face rising unemployment and underemployment, constrained access to financial services, market exploitation, poor housing, crime, unsatisfactory health services, and scant education opportunities. Several players have attempted to address these problems through information and communication technologies. This paper isolates a few of these attempts to determine critical success factors on the economic empowerment front.
999.
The Cerrados of central Brazil have undergone profound landscape transformation in recent decades due to agricultural expansion, and this transformation remains poorly assessed. The present research investigates the spatial-temporal rates and patterns of land-use and land-cover (LULC) changes in one of the main areas of agricultural production in Mato Grosso State (Brazil), the region of Primavera do Leste. To quantify the different aspects of LULC changes (e.g. rates, types, and spatial patterns) in this region, we applied a post-classification change detection method, complemented with landscape metrics, for three dates (1985, 1995, and 2005). LULC maps were obtained from an object-based classification approach, using the nearest neighbour (NN) classifier and a multi-source data set for image object classification (e.g. seasonal Thematic Mapper (TM) bands, a digital elevation model (DEM), and a Moderate Resolution Imaging Spectroradiometer (MODIS)-derived index), strategically chosen to increase class separability. The results provided an improved mapping of the conversion of Cerrado natural vegetation into crops and pasture once the auxiliary data were incorporated into the classification data set. Moreover, image segmentation was crucial for LULC map quality, in particular because of crop size and shape. The changes detected point towards increasing loss and fragmentation of natural vegetation and high rates of crop expansion. Between 1985 and 2005, approximately 42% (6491 km²) of the Cerrados in the study area were converted to agricultural land uses. In addition, it was verified that cultivated areas are encroaching into fragile environments such as wetlands, which indicates the intense pressure of agricultural expansion on the environment.
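The post-classification change detection step mentioned above can be illustrated in a few lines (a toy sketch, not the paper's processing chain): classify each date independently, then cross-tabulate the two label maps into a from-to change matrix from which conversion fractions such as the 42% figure are read off.

```python
import numpy as np

classes = ["cerrado", "crop", "pasture"]

# Toy label maps for two dates (values index into `classes`)
map_t1 = np.array([[0, 0, 1],
                   [0, 2, 1],
                   [0, 0, 0]])
map_t2 = np.array([[0, 1, 1],
                   [1, 2, 1],
                   [0, 1, 0]])

# From-to matrix: rows = class at t1, cols = class at t2
n = len(classes)
change = np.zeros((n, n), dtype=int)
np.add.at(change, (map_t1.ravel(), map_t2.ravel()), 1)

# Fraction of natural vegetation converted to agricultural uses
cerrado_pixels = change[0].sum()
converted = change[0, 1:].sum() / cerrado_pixels
print(change)
print(f"cerrado converted: {converted:.0%}")  # 50% in this toy example
```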
1000.
A satellite data set for tropical forest area change assessment
A database of largely cloud-free (less than 2.5% of all sites have more than 5% cloud cover), geo-referenced 20 km × 20 km sample sites of 30 m resolution optical satellite imagery has been prepared for the 1990 and 2000 epochs. It spans the tropics with a systematic sample located at the degree confluence points of the geographic grid. The resulting 4016 sample pairs are to be used to measure changes in the area of forest cover between the two epochs. The primary data source was the National Aeronautics and Space Administration's (NASA's) global land survey (GLS) data sets. Visual screening of GLS images at all 4016 confluence points for each date identified 2868 suitable pairs for which no better alternatives exist (71.6% of the sample). Better alternatives could be found for 26.6% of the sample, substituting cloudy or missing GLS data sets at one or the other epoch or both (GLS-1990 or GLS-2000). Gaps were filled from the United States Geological Survey (USGS) Landsat archives (1070 samples), data from other Landsat archives (53 samples), or alternatives to Landsat, that is, 15 samples from Satellite Pour l'Observation de la Terre (SPOT). This increased the effective number of sample pairs to 3945, representing 98% of all target samples. No suitable image pairs were found for 71 confluence points; these were not randomly distributed but mostly concentrated in the Congo basin, where around 15% of the region remains un-sampled. Variations in date of image acquisition and geometric fidelity are documented. The results highlight the importance of combining systematic data-processing schemes with targeted image acquisition and archiving strategies for global-scale applications such as deforestation monitoring, and show that by replacing cloudy or missing GLS data with alternative imagery, the overall coverage of the sample sites within the ecological zones ‘Tropical rainforest’ and ‘Tropical mountain system’ can be improved by 16%.
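For illustration only, the systematic sampling design reads as follows in code (the latitude band is an assumption, and the masking to land within the tropical ecological zones mentioned above, which reduces the candidates to the 4016 actual sites, is not shown):

```python
# Candidate sample sites: one 20 km x 20 km box centred on every
# degree confluence point in an assumed tropical band (30S-30N);
# restricting to land in the tropical ecozones (not shown) would
# reduce these candidates to the 4016 sites used in the paper.
sites = [(lat, lon)
         for lat in range(-30, 31)      # confluence latitudes
         for lon in range(-180, 180)]   # confluence longitudes
print(len(sites))  # 61 * 360 = 21960 candidate confluence points
```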