31.
This paper presents the Clearing Fund Protocol, a three-layer protocol designed to schedule soft real-time sets of precedence-related tasks with shared resources. These task sets are processed in an open, dynamic environment: open because new applications may enter the system at any time, and dynamic because schedulability is tested on-line as tasks request admission. Top-down, the three layers are the Clearing Fund, Bandwidth Inheritance, and two versions of the Constant Bandwidth Server algorithm. Bandwidth Inheritance applies a priority-inheritance mechanism to the Constant Bandwidth Server, but a serious drawback is its unfairness: a task executing in a server can steal the bandwidth of another server without paying any penalty. The main idea of the Clearing Fund Algorithm is to keep track of the processor-time debts contracted by lower-priority tasks that block higher-priority ones and execute in the higher-priority servers after inheriting the higher priority. The proposed algorithm reduces the undesirable effects of these priority inversions because the blocked task can finish its execution in its own server or in the server of the blocking task, whichever has the nearest deadline; when required, debts are repaid in that way. Inheritors are therefore debtors. Moreover, at certain instants all existing debts may be waived and the servers reset, giving the system a clean restart. In simulations, the Clearing Fund Protocol performed clearly better than Bandwidth Inheritance, the protocol it sets out to improve.
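The debt-tracking idea can be pictured with a short sketch. The names and structure below (Server, ClearingFund, record_inheritance) are illustrative assumptions, not the authors' implementation: a debt is logged whenever a blocking task consumes another server's budget, and the blocked task later resumes in whichever of the two servers has the earlier deadline.

```python
# Hypothetical sketch of the debt-tracking idea behind the Clearing Fund Protocol;
# names and structure are illustrative assumptions, not the authors' implementation.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    deadline: float          # current absolute deadline of the server
    remaining_budget: float  # processor time still available in the current period

class ClearingFund:
    def __init__(self):
        # debts[(debtor, creditor)] = processor time owed by the debtor's server
        self.debts: dict[tuple[str, str], float] = {}

    def record_inheritance(self, blocker: Server, blocked: Server, exec_time: float) -> None:
        """A blocking (lower-priority) task ran for exec_time inside the blocked
        task's server after inheriting its priority: charge that time as a debt."""
        blocked.remaining_budget -= exec_time
        key = (blocker.name, blocked.name)
        self.debts[key] = self.debts.get(key, 0.0) + exec_time

    def server_for_blocked_task(self, own: Server, blocker: Server) -> Server:
        """The previously blocked task may finish in its own server or in the
        blocker's server (repaying the debt): pick the earlier deadline."""
        return own if own.deadline <= blocker.deadline else blocker

    def clear(self, servers: list[Server], full_budget: float) -> None:
        """The periodic 'clearing': waive all debts and reset every server."""
        self.debts.clear()
        for s in servers:
            s.remaining_budget = full_budget
```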
32.
A cartographic-oriented model uses algebraic map operations to perform spatial analysis of medical data relative to the human body. A prototype system uses 3D visualization techniques to deliver the analysis results, and this prototype implementation suggests that the model could provide the basis for a medical application tool offering new insight into the information.
33.
Zambia Consolidated Copper Mines Ltd. (ZCCM) is planning a substantial increase in ore production in several of their underground mines on the Zambian Copperbelt over the next 10 years. The future production strategy is based on development of productive and economic mining methods through the application of mechanization and backfilling. Mechanization is designed to provide the production capability and the backfilling is designed to reduce water inflow into the mines. A similar trend can be seen in world-wide changes in mining methods from open stoping and sub-level caving to cut-and-fill stoping. Backfill is being employed worldwide, including in Australia, Canada, Sweden, Latin America, Zambia, and the U.S.A. Plans for backfill mining methods are underway for future operations in Chile, Canada, Zambia, and Mexico. The principal reasons for these changes in mining methods are twofold:
  • Increased ore recovery, and
  • Decreased environmental impact.
The main difference in the environmental impacts between mining with sub-level caving or open stoping and mining with backfilling methods is the reduction in subsidence or the potential for subsidence. Backfilling reduces ground movements in the rock overlying and adjacent to mine openings as well as subsidence at the surface. Reduced ground movement decreases the number and size of fracture-controlled hydraulic flow paths into a mine and, thereby, the impact of mining on surface and ground water resources. This paper deals with: 1) the impacts caused by open stoping and sub-level caving in comparison to backfilling methods; 2) the approximate impact of backfill on dewatering strategies; and 3) the environmental benefits of backfill mining. The differences in mine drainage strategies are supported by case histories from various mines.
34.
Ribeiro AB, Caleya RF, Santos JL. Applied Optics, 1995, 34(28): 6481–6488.
The progressive ladder topology is studied with respect to its power-budget properties and coupler tailoring. Optimization criteria are addressed for lossless and real systems, and their basic characteristics are compared with those of other topologies. Numerical results are presented, and an experiment is described for the case in which the network supports interferometric and intensity (with referencing) fiber-optic sensors.
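As a toy illustration of coupler tailoring for an equalized power budget, the sketch below computes tap ratios that give every sensor the same power on a single lossless feed bus; this is a simplified, hypothetical calculation, not the ladder (feed-and-return) analysis carried out in the paper.

```python
# Hypothetical power-budget toy: choose coupler tap ratios so that N sensors fed
# from one lossless bus each receive the same optical power. Not the paper's model.
def equal_power_taps(n_sensors: int) -> list[float]:
    """k_j = 1 / (N - j) for j = 0..N-1: each coupler diverts an equal 1/N share."""
    return [1.0 / (n_sensors - j) for j in range(n_sensors)]

def delivered_powers(taps: list[float]) -> list[float]:
    powers, remaining = [], 1.0
    for k in taps:
        powers.append(remaining * k)   # power diverted to this sensor
        remaining *= (1.0 - k)         # power continuing along the bus
    return powers

taps = equal_power_taps(4)
print(taps)                    # [0.25, 0.333..., 0.5, 1.0]
print(delivered_powers(taps))  # every sensor receives 0.25
```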
35.
BACKGROUND: The high incidence of locoregional recurrences and distant metastases after curative surgery for gastric cancer calls for improved locoregional control and systemic adjuvant treatment. METHODS: In a randomized clinical trial of adjuvant FAM2 chemotherapy, the quality of surgery was evaluated by comparing surgical and pathology data. Univariate and multivariate analyses were performed to evaluate the effect of prognostic factors on survival and time to recurrence in relation to patient, tumor, and therapy characteristics. RESULTS: Of 314 patients randomized from 28 European institutions, 159 comprised the control group and 155 the FAM2 group. After a median follow-up of 80 months, no statistically significant difference in survival was found. For time to recurrence, however, treated patients had a significant advantage over controls (p = 0.02). In univariate analysis, statistically significant differences in survival and time to progression emerged for T, N, disease stage, and "adequacy" of surgery. The multivariate analysis retained preoperative Hb level, T, N, and "adequacy" of surgery for survival time, and T, N, "adequacy" of surgery, and adjuvant chemotherapy for recurrence time. CONCLUSIONS: Disease stage is the most important prognostic factor, and "adequate" surgery has an important effect. Adjuvant FAM2 delayed recurrence but did not influence overall survival.
36.
Phosphate- and silicate-based glasses were added to hydroxyapatite in order to improve its mechanical properties and to fabricate composites with different degrees of bioactivity. Density measurements and scanning electron microscopy showed that strong chemical bonding was obtained between hydroxyapatite and the phosphate-based glasses, leading to samples approaching theoretical density. Bioglass® additions led to the formation of a complex calcium phosphate silicate which hampered the reinforcement process. The fracture toughness of the hydroxyapatite-glass composites was in the 1.1–1.2 MPa·m^1/2 range, double that determined for sintered hydroxyapatite. A 2 μm thick apatite layer was observed on the surface of the hydroxyapatite-glass composites after 48 h of immersion in simulated human blood plasma, whereas only a few apatite crystals were detected on sintered hydroxyapatite after 7 days of immersion. From these results we anticipate that the composites may show a higher rate of bone bonding, leading to enhanced bioactivity.
37.
The classification task is usually handled with flat, batch learners, under the assumptions that the problem is stationary and that there are no relations between class labels. Several real-world problems do not satisfy these premises: data may have labels organized hierarchically and arrive in a streaming fashion, meaning that their behavior can drift over time. Existing studies on hierarchical classification do not consider data streams as input, so the data are assumed to be stationary and handled by batch learners; the same can be said about work on streaming data, where hierarchical classification is overlooked. Studies in each area individually are promising, yet they do not tackle the intersection of the two. This study analyzes the main characteristics of the state-of-the-art work on hierarchical classification for streaming data with respect to five aspects: (i) problems tackled, (ii) datasets, (iii) algorithms, (iv) evaluation metrics, and (v) research gaps in the area. We performed a systematic literature review of primary studies and retrieved 3,722 papers, of which 42 were identified as relevant and used to answer the aforementioned research questions. We found that the problems handled by hierarchical classification of data streams mainly concern the classification of images, human activities, texts, and audio; the datasets are mostly purpose-built or synthetic; the algorithms and evaluation metrics are well-known techniques or based on them; and the research gaps relate to dynamic contexts, data complexity, and computational resource constraints. We also provide implications for future research and experiments that consider the characteristics shared between hierarchical classification and data stream classification.
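As a minimal, hypothetical sketch of how the two settings combine, the snippet below wires one local model per parent node of the label hierarchy and updates it one example at a time, as a stream requires; the per-node "model" here is just a running majority count standing in for a real incremental learner such as a Hoeffding tree.

```python
# Illustrative sketch only (not from the surveyed papers): local classifier per
# parent node of the label hierarchy, trained incrementally as examples stream in.
from collections import Counter, defaultdict

class MajorityNode:
    """Placeholder incremental model: predicts the most frequent child seen so far.
    A real system would plug an incremental learner (e.g. a Hoeffding tree) in here."""
    def __init__(self):
        self.counts = Counter()

    def learn_one(self, child_label):
        self.counts[child_label] += 1

    def predict_one(self):
        return self.counts.most_common(1)[0][0] if self.counts else None

class HierarchicalStreamClassifier:
    def __init__(self, hierarchy):
        self.hierarchy = hierarchy              # parent -> list of children
        self.nodes = defaultdict(MajorityNode)  # one local model per parent node

    def learn_one(self, x, label_path):
        # label_path is the path from the root, e.g. ["root", "animal", "dog"]
        for parent, child in zip(label_path, label_path[1:]):
            self.nodes[parent].learn_one(child)

    def predict_one(self, x):
        node, path = "root", []
        while self.hierarchy.get(node):         # descend until a leaf is reached
            pred = self.nodes[node].predict_one()
            if pred is None:
                break
            path.append(pred)
            node = pred
        return path

hierarchy = {"root": ["animal", "vehicle"], "animal": ["dog", "cat"], "vehicle": ["car"]}
clf = HierarchicalStreamClassifier(hierarchy)
clf.learn_one({"f1": 0.3}, ["root", "animal", "dog"])
print(clf.predict_one({"f1": 0.5}))             # ['animal', 'dog']
```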
38.
Jürgen Abel. Software, 2010, 40(9): 751–777.
The lossless Burrows–Wheeler compression algorithm has received considerable attention over recent years for both its simplicity and its effectiveness. It is based on a permutation of the input sequence, the Burrows–Wheeler transformation (BWT), which groups symbols with a similar context close together. In the original version, this permutation was followed by a Move-To-Front transformation and a final entropy-coding stage. Later versions used different algorithms after the BWT, since these subsequent stages have a significant influence on the compression rate. This paper describes different algorithms and improvements for these post-BWT stages, including a new context-based approach. Compression rates are reported together with compression and decompression times on the Calgary corpus, the Canterbury corpus, the large Canterbury corpus, and the Lukas 2D 16-bit medical image corpus.
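A toy version of the first two stages mentioned in the abstract (a naive BWT followed by Move-To-Front) is sketched below; it is meant only to show why the transform helps, not to reflect the paper's optimized post-BWT stages.

```python
# Toy BWT + Move-To-Front pipeline. Naive O(n^2 log n) rotation sort for clarity;
# real compressors build the transform with suffix arrays.
def bwt(s: str, sentinel: str = "\x00") -> str:
    s += sentinel                                   # unique, smallest end marker
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)    # last column of sorted rotations

def move_to_front(data: str) -> list[int]:
    table = sorted(set(data))                       # initial symbol table
    out = []
    for ch in data:
        idx = table.index(ch)
        out.append(idx)
        table.insert(0, table.pop(idx))             # move the symbol to the front
    return out

text = "banana"
transformed = bwt(text)
ranks = move_to_front(transformed)
print(transformed.replace("\x00", "$"), ranks)      # grouped symbols yield small ranks
```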
39.
This paper proposes a new approach for the segmentation of both the near-end and far-end intima-media regions of the common carotid artery in ultrasound images. The method requires minimal user interaction and is able to segment the near-end wall in arteries with large, hypoechogenic, and irregular plaques, conditions usually not considered in previous work because of the increased segmentation difficulty.
40.
Autonomous robots are leaving the laboratories to master new outdoor applications, and walking robots in particular have already shown their potential advantages in these environments, especially on natural terrain. Gait generation is the key to successfully negotiating natural terrain with legged robots; however, most of the algorithms devised for hexapods have been tested only under laboratory conditions. This paper presents the development of crab and turning gaits for hexapod robots on natural terrain characterized by uneven ground and forbidden zones. The gaits we have developed rely on two empirical rules from which three control modules are derived; these have been tested both in simulation and in experiments. The geometrical model of the SILO-6 walking robot was used for the simulations, while the real SILO-6 walking robot was used in the experiments. This robot was built as a mobile platform for a sensory system to detect and locate antipersonnel landmines in humanitarian demining missions.
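For orientation only, the sketch below shows a basic alternating-tripod phase generator with a crab-direction parameter; it is a hypothetical baseline, not the empirical rules or the three control modules developed for SILO-6.

```python
# Hypothetical baseline gait generator; not the SILO-6 controller from the paper.
import math

LEGS = ["L1", "R2", "L3", "R1", "L2", "R3"]     # two tripods: {L1,R2,L3} and {R1,L2,R3}
TRIPOD_A = {"L1", "R2", "L3"}

def tripod_phases(step: int) -> dict[str, str]:
    """Alternating tripod gait: the two leg sets swap swing/support roles every step."""
    a_swings = (step % 2 == 0)
    return {leg: "swing" if (leg in TRIPOD_A) == a_swings else "support" for leg in LEGS}

def support_foot_shift(crab_angle_deg: float, stride: float) -> tuple[float, float]:
    """Displacement applied to the supporting feet (opposite to body motion) when
    walking along a crab angle measured from the body's longitudinal axis."""
    a = math.radians(crab_angle_deg)
    return (-stride * math.cos(a), -stride * math.sin(a))

for step in range(2):
    print(step, tripod_phases(step), support_foot_shift(30.0, 0.12))
```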