5836 query results found; search took 31 ms.
71.
Floridi and Taddeo propose a condition of ‘zero semantic commitment’ for solutions to the grounding problem, together with a solution of their own. I argue briefly that their condition cannot be fulfilled, not even by their own solution. After a look at Luc Steels's very different competing suggestion, I suggest that we need to rethink what the problem is and what role the ‘goals’ of a system play in formulating it. On the basis of a proper (syntactic) understanding of computing, I come to the conclusion that the only sensible grounding problem is how we can explain and reproduce the behavioural ability and function of meaning in artificial computational agents.
72.
73.
In this paper, we consider a large-scale evacuation problem after a major disaster. The evacuation is assumed to occur by means of a fleet of buses, leading to the problem of scheduling the evacuation operations by buses (the bus evacuation problem, BEP). We propose time-indexed formulations as well as heuristic algorithms such as greedy algorithms and a matheuristic. The matheuristic uses the time-indexed formulation to improve the best solution obtained by the greedy heuristics. In computational experiments, we analyze and evaluate the efficiency of the proposed solution algorithms.
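The greedy idea can be illustrated on a heavily simplified BEP variant. The sketch below is hypothetical (the paper's own formulation is time-indexed and richer): buses shuttle evacuees from pickup points to a single shelter, and each freed-up bus is dispatched to the pickup point with the most evacuees still waiting.

```python
# Hypothetical greedy heuristic for a simplified bus evacuation problem (BEP):
# dispatch the earliest-available bus to the pickup point with the largest
# remaining demand, and report the makespan (time the last trip finishes).
import heapq

def greedy_bep(demand, travel, capacity, n_buses):
    """demand: evacuees waiting at each pickup point;
    travel: round-trip time from the shelter to each pickup point;
    capacity: evacuees carried per trip; returns the evacuation makespan."""
    remaining = list(demand)
    buses = [0.0] * n_buses            # time each bus becomes free at the shelter
    heapq.heapify(buses)
    makespan = 0.0
    while any(r > 0 for r in remaining):
        free_at = heapq.heappop(buses)                 # earliest available bus
        p = max(range(len(remaining)), key=lambda i: remaining[i])
        remaining[p] = max(0, remaining[p] - capacity)  # one busload picked up
        done = free_at + travel[p]                      # one round trip
        makespan = max(makespan, done)
        heapq.heappush(buses, done)
    return makespan

print(greedy_bep(demand=[30, 20], travel=[10.0, 6.0], capacity=10, n_buses=2))
```

A matheuristic in the paper's spirit would then feed such a greedy solution to an exact time-indexed model as a warm start and let the solver improve it.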
74.
There has been a growing interest in applying human computation – particularly crowdsourcing techniques – to assist in the solution of multimedia, image processing, and computer vision problems which are still too difficult to solve using fully automatic algorithms, and yet relatively easy for humans. In this paper we focus on a specific problem – object segmentation within color images – and compare different solutions which combine color image segmentation algorithms with human efforts, either in the form of an explicit interactive segmentation task or through an implicit collection of valuable human traces with a game. We use Click’n’Cut, a friendly, web-based, interactive segmentation tool that allows segmentation tasks to be assigned to many users, and Ask’nSeek, a game with a purpose designed for object detection and segmentation. The two main contributions of this paper are: (i) We use the results of Click’n’Cut campaigns with different groups of users to examine and quantify the crowdsourcing loss incurred when an interactive segmentation task is assigned to paid crowd-workers, comparing their results to the ones obtained when computer vision experts are asked to perform the same tasks. (ii) Since interactive segmentation tasks are inherently tedious and prone to fatigue, we compare the quality of the results obtained with Click’n’Cut with the ones obtained using a (fun, interactive, and potentially less tedious) game designed for the same purpose. We call this contribution the assessment of the gamification loss, since it refers to how much quality of segmentation results may be lost when we switch to a game-based approach to the same task. 
We demonstrate that the crowdsourcing loss is significant when using all the data points from workers, but decreases substantially (and becomes comparable to the quality of expert users performing similar tasks) after a modest amount of data analysis and filtering out users whose data are clearly not useful. We also show that – on the other hand – the gamification loss is significantly more severe: the quality of the results drops roughly by half when switching from a focused (yet tedious) task to a more fun and relaxed game environment.
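A minimal sketch of how such losses can be quantified, under the common assumption that segmentation quality is scored with the Jaccard index against a reference mask (the scores below are made-up illustrative numbers, not the paper's data):

```python
# Illustrative only: measure "crowdsourcing loss" and "gamification loss" as
# the relative drop in mean Jaccard score versus an expert baseline.
# All numeric values here are hypothetical.

def jaccard(mask_a, mask_b):
    """Jaccard index between two binary masks given as sets of pixel indices."""
    if not mask_a and not mask_b:
        return 1.0
    return len(mask_a & mask_b) / len(mask_a | mask_b)

def mean_loss(expert_scores, condition_scores):
    """Relative drop in mean segmentation quality versus the expert baseline."""
    e = sum(expert_scores) / len(expert_scores)
    c = sum(condition_scores) / len(condition_scores)
    return (e - c) / e

expert  = [0.92, 0.90, 0.94]   # hypothetical per-image Jaccard scores
workers = [0.88, 0.85, 0.90]   # paid crowd-workers (after filtering)
gamers  = [0.45, 0.50, 0.48]   # implicit traces from the game

print(f"crowdsourcing loss: {mean_loss(expert, workers):.2f}")
print(f"gamification loss:  {mean_loss(expert, gamers):.2f}")
```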
75.
76.
Type 2 diabetes mellitus (T2DM) is an important risk factor for cardiovascular disease (CVD)—the leading cause of death in the United States. Yet not all subjects with T2DM are at equal risk for CVD complications; the challenge lies in identifying those at greatest risk. Therapies directed toward treating conventional risk factors have failed to significantly reduce this residual risk in T2DM patients, so newer targets and markers are needed for the development and testing of novel therapies. Herein we review two complementary MS-based approaches—mass spectrometric immunoassay (MSIA) and MS/MS in multiple-reaction-monitoring (MRM) mode—for the analysis of plasma proteins and PTMs of relevance to T2DM and CVD. Together, these complementary approaches allow for high-throughput monitoring of many PTMs and the absolute quantification of proteins near the low picomolar range. In this review article, we discuss the clinical relevance of the high density lipoprotein (HDL) proteome and Apolipoprotein A-I PTMs to T2DM and CVD, and provide illustrative MSIA and MRM data on HDL proteins from T2DM patients as examples of how these MS approaches can be applied to gain new insight regarding cardiovascular risk factors. Also discussed are the reproducibility, interpretation, and limitations of each technique, with an emphasis on their capacity to facilitate the translation of new biomarkers into clinical practice.
77.
METIS-II was an EU-FET MT project running from October 2004 to September 2007, which aimed at translating free text input without resorting to parallel corpora. The idea was to use “basic” linguistic tools and representations and to link them with patterns and statistics from the monolingual target-language corpus. The METIS-II project had four partners, translating from their “home” languages Greek, Dutch, German, and Spanish into English. The paper outlines the basic ideas of the project, their implementation, the resources used, and the results obtained. It also gives examples of how METIS-II has continued beyond its lifetime and the original scope of the project. On the basis of the results and experiences obtained, we believe that the approach is promising and offers potential for development in various directions.
78.
Chen Wenxiong (陈文雄), Programmer (《程序员》), 2009, (9): 90–93
What rules should govern which advertisement is assigned to which keyword in order to maximize the day's revenue? This problem can be abstracted as the online weighted bipartite maximum matching problem.
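A minimal sketch of the greedy baseline for this abstraction (advertiser names and bids below are hypothetical): keywords arrive one at a time, and each is irrevocably matched to the highest-bidding advertiser not yet used, here with unit capacity per advertiser.

```python
# Greedy online weighted bipartite matching for ad allocation (sketch).
# Keywords arrive online; each advertiser can be matched at most once.

def online_greedy_match(arrivals, bids):
    """arrivals: keywords in arrival order;
    bids: dict keyword -> {advertiser: bid};
    returns (matching, total revenue)."""
    matched = {}          # keyword -> advertiser
    used = set()          # advertisers already matched
    revenue = 0.0
    for kw in arrivals:
        candidates = {a: b for a, b in bids.get(kw, {}).items() if a not in used}
        if not candidates:
            continue      # decisions are irrevocable: the keyword stays unmatched
        best = max(candidates, key=candidates.get)
        matched[kw] = best
        used.add(best)
        revenue += candidates[best]
    return matched, revenue

bids = {
    "shoes":   {"adA": 3.0, "adB": 2.0},
    "boots":   {"adA": 5.0},
    "sandals": {"adB": 1.0},
}
m, rev = online_greedy_match(["shoes", "boots", "sandals"], bids)
print(m, rev)
```

The example also shows why greedy is suboptimal: it earns 3.0 + 1.0 = 4.0, while the offline optimum (shoes→adB, boots→adA) earns 7.0. Practical ad-allocation rules therefore refine greedy with budgets and discounting, in the spirit of algorithms such as MSVV.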
79.
In this paper, we propose a distributed congestion-aware channel assignment (DCACA) algorithm for multi-channel wireless mesh networks (MC–WMNs). The frequency channels are assigned according to congestion measures which indicate the congestion status at each link. Depending on the selected congestion measure (e.g., queueing delay, packet loss probability, or differential backlog), various design objectives can be achieved. Our proposed distributed algorithm is simple to implement, as it only requires each node to perform a local search. Unlike most previous channel assignment schemes, our algorithm assigns not only the non-overlapped (i.e., orthogonal) frequency channels but also the partially-overlapped channels. In this regard, we introduce channel overlapping and mutual interference matrices which model the frequency overlapping among different channels. Simulation results show that in the presence of elastic traffic sources (e.g., TCP Vegas or TCP Reno), our proposed DCACA algorithm increases the aggregate throughput and decreases the average packet round-trip time compared with the previously proposed Load-Aware channel assignment algorithm. Furthermore, in a congested IEEE 802.11b network setting, compared with the use of three non-overlapped channels, the aggregate network throughput can be increased by a further 25% and the average round-trip time reduced by more than one half when all 11 partially-overlapped channels are used.
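The local-search idea with a channel-overlap matrix can be sketched as follows. This is a hypothetical toy, not the paper's DCACA algorithm: each link re-selects the channel minimizing its congestion-weighted interference from neighboring links, where the overlap matrix gives 1.0 for co-channel operation and fractional values for partially-overlapped channels.

```python
# Toy local search over channels using a channel-overlap matrix.
# overlap[c1][c2] models how strongly channels c1 and c2 interfere.

def interference(link, channel, assignment, congestion, overlap, neighbors):
    """Congestion-weighted interference seen by `link` on `channel`."""
    return sum(congestion[n] * overlap[channel][assignment[n]]
               for n in neighbors[link])

def local_search_step(assignment, congestion, overlap, neighbors):
    """One pass: every link greedily moves to its locally best channel."""
    n_channels = len(overlap)
    for link in assignment:
        best = min(range(n_channels),
                   key=lambda c: interference(link, c, assignment,
                                              congestion, overlap, neighbors))
        assignment[link] = best
    return assignment

# 3 channels; adjacent channels partially overlap (illustrative values).
overlap = [[1.0, 0.4, 0.0],
           [0.4, 1.0, 0.4],
           [0.0, 0.4, 1.0]]
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}   # every link interferes with every other
congestion = {0: 5.0, 1: 3.0, 2: 1.0}           # e.g., queue backlogs per link
assignment = {0: 0, 1: 0, 2: 0}                 # start all co-channel
print(local_search_step(assignment, congestion, overlap, neighbors))
```

In one pass the most congested link escapes to the far channel and the others spread out, which is the qualitative effect the paper exploits when it opens up all 11 partially-overlapped 802.11b channels instead of only the 3 orthogonal ones.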
80.
The aim of this study was to reproduce the delayed (secondary) cerebral energy failure previously described in birth-asphyxiated newborn infants and to investigate relationships between primary insult severity and the extent of the delayed energy failure. Phosphorus (31P) magnetic resonance spectroscopy (MRS) at 7 T was used to study the brains of 12 newborn piglets during an acute, reversible, cerebral hypoxic-ischemic episode which continued until nucleotide triphosphates (NTP) were depleted. After reperfusion and reoxygenation, spectroscopy was continued for 48 h. High-energy metabolite concentrations returned to near normal levels after the insult, but later fell as delayed energy failure developed. The time integral of NTP depletion during the primary insult correlated strongly with the minimum [phosphocreatine (PCr)]/[inorganic orthophosphate (Pi)] observed 24–48 h after the insult (linear regression: slope = −8.04 h⁻¹; ordinate intercept = 1.23; r = 0.92; P < 0.0001). This model is currently being used to investigate the therapeutic potential of various cerebroprotective strategies, including hypothermia.
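The reported regression can be turned into a simple predictor of secondary energy failure severity. The slope and intercept are taken from the abstract; the integral values fed in below are purely illustrative.

```python
# Linear model from the abstract: min [PCr]/[Pi] at 24-48 h as a function of
# the time integral of NTP depletion (in hours) during the primary insult.
SLOPE = -8.04       # h^-1, from the reported regression
INTERCEPT = 1.23    # ordinate intercept, from the reported regression

def predicted_min_pcr_pi(ntp_depletion_integral_h):
    """Predicted minimum [PCr]/[Pi] ratio for a given NTP-depletion integral."""
    return INTERCEPT + SLOPE * ntp_depletion_integral_h

for x in (0.0, 0.05, 0.10):   # illustrative integral values, in hours
    print(f"integral {x:.2f} h -> predicted min PCr/Pi {predicted_min_pcr_pi(x):.2f}")
```

The negative slope captures the paper's central finding: the longer and deeper the NTP depletion during the primary insult, the lower the PCr/Pi ratio during delayed energy failure.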
Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号