5,315 search results (search time: 15 ms)
101.
The increasing demand for executing large-scale Cloud workflow applications, which require a robust and elastic computing infrastructure, usually leads to the use of high-performance Grid computing clusters. Because the owners of Cloud applications expect the Grid environment to fulfill the requested Quality of Service (QoS), an adaptive scheduling mechanism is needed that can distribute a large number of related tasks with different computational and communication demands across multi-cluster Grid computing environments. The main contribution of this paper is to address the problem of scheduling large-scale Cloud workflow applications onto a multi-cluster Grid environment subject to the QoS constraints declared by the application's owner. Heterogeneity of resource types (service types) is one of the most important issues that significantly affect workflow scheduling in a Grid environment. On the other hand, a Cloud application workflow usually consists of different tasks that need different resource types to complete, which we call heterogeneity in the workflow. The main idea underlying all the algorithms and techniques introduced in this paper is to match the heterogeneity in the Cloud application's workflow to the heterogeneity in the Grid clusters. To achieve this objective, a new bi-level advance reservation strategy is introduced, based on the idea of first performing global scheduling and then conducting local scheduling.
Global scheduling is responsible for dynamically partitioning the received DAG into multiple sub-workflows, which is realized by two collaborating algorithms: (1) the Critical Path Extraction (CPE) algorithm, which proposes a new dynamic overall task criticality strategy based on the DAG's specification and the QoS status of the requested resource types to determine the criticality of each task; and (2) the DAG Partitioning (DAGP) algorithm, which introduces a novel dynamic score-based approach to extracting sub-workflows along critical paths, using a new Fuzzy Qualitative Value Calculation System to evaluate the environment. Local scheduling is responsible for scheduling tasks on suitable resources using a new Multi-Criteria Advance Reservation (MCAR) algorithm, which simultaneously meets the high reliability and QoS expectations for scheduling distributed Cloud-based applications. We used simulation to evaluate the performance of the proposed mechanism against four well-known approaches. The results show that the proposed algorithm outperforms the other approaches on various QoS-related measures.
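The critical-path step at the heart of the CPE algorithm can be illustrated with a classic longest-path computation over a weighted task DAG. This is a minimal, generic sketch: the task names, costs, and plain topological-order relaxation are illustrative stand-ins, not the paper's dynamic QoS-aware criticality values.

```python
from collections import defaultdict, deque

def critical_path(tasks, edges):
    """Longest (critical) path through a DAG of weighted tasks.

    tasks: {task: execution_cost}; edges: [(u, v), ...] meaning u precedes v.
    Returns (path_length, path_as_list). Textbook version only; the paper's
    CPE additionally folds requested-resource QoS status into each weight.
    """
    succ = defaultdict(list)
    indeg = {t: 0 for t in tasks}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    # Kahn's algorithm: process tasks in topological order
    order, q = [], deque(t for t in tasks if indeg[t] == 0)
    while q:
        u = q.popleft()
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    # dist[t] = longest finish time of any path ending at t
    dist = {t: tasks[t] for t in tasks}
    prev = {t: None for t in tasks}
    for u in order:
        for v in succ[u]:
            if dist[u] + tasks[v] > dist[v]:
                dist[v] = dist[u] + tasks[v]
                prev[v] = u
    end = max(dist, key=dist.get)
    length, path, node = dist[end], [], end
    while node is not None:
        path.append(node)
        node = prev[node]
    return length, path[::-1]
```

In a bi-level scheme like the one described, the extracted path would seed one sub-workflow, after which the DAGP step would re-score the remaining tasks.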
102.
In multiview 3D TV, a pair of corresponding pixels in adjacent 2D views contributes to the reconstruction of voxels (3D pixels) in the 3D scene. We analyze this reconstruction process and determine the optimal pixel aspect ratio, with which the estimated object position can be improved under specific imaging or viewing configurations and constraints. Through mathematical modeling, we derive optimal solutions for two general stereo configurations: parallel and with vergence. We show theoretically that, for a given total resolution, a finer horizontal resolution than the usual uniform pixel distribution generally provides a better 3D visual experience in both configurations; the optimal value may vary with the configuration parameters. We validate our theoretical results through subjective studies using a set of simulated non-square discretized red–blue stereo pairs and show that human observers indeed have a better 3D viewing experience with an optimized than with a non-optimized representation of 3D models.
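The intuition for the parallel configuration can be illustrated with the standard first-order depth-quantization relation for parallel-axis stereo, Z = fB/d. All numbers below (focal length, baseline, resolution budget) are made up for illustration and are not taken from the paper's model:

```python
def depth_quantization_error(z, focal_px, baseline, pixel_width):
    """First-order depth error for parallel-axis stereo.

    From Z = f*B/d we get |dZ/dd| = Z^2 / (f*B), so one disparity
    quantization step of `pixel_width` (in the same units as focal_px)
    maps to roughly Z^2 * pixel_width / (f*B) of depth uncertainty.
    Illustrative numbers only -- not the paper's derivation.
    """
    return z * z * pixel_width / (focal_px * baseline)

# Fixed budget of 1920x1080 pixels: reallocating to a 2x finer
# horizontal grid (3840x540) halves the pixel width and therefore
# halves the depth quantization step at any fixed depth.
uniform = depth_quantization_error(2.0, focal_px=1000.0, baseline=0.06, pixel_width=1.0)
finer = depth_quantization_error(2.0, focal_px=1000.0, baseline=0.06, pixel_width=0.5)
```

Since disparity is horizontal, only the horizontal pixel pitch enters the depth error, which is why a non-square pixel can trade vertical for horizontal resolution.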
103.
The core recognition procedure in modern HMM-based continuous speech recognition systems is the Viterbi algorithm, which uses dynamic programming to find the best acoustic sequence for the input speech within the search space. In this paper, dynamic programming is replaced by a search method based on particle swarm optimization. The key idea is to generate the initial population of particles as speech segmentation vectors; the particles then move toward the best segmentation through an update rule applied over the iterations. We introduce a new particle representation and recognition process that is consistent with the nature of continuous speech recognition. The idea was tested on bi-phone recognition and continuous speech recognition benchmarks, and the results show that the proposed search method approaches the performance of the Viterbi segmentation algorithm, albeit with a slight degradation in accuracy.
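A generic version of PSO over segmentation vectors can be sketched as follows. The fitness function (within-segment variance of a 1-D signal) and all constants are illustrative stand-ins for the paper's HMM acoustic scores:

```python
import random

def pso_segment(signal, n_segments, n_particles=12, iters=60, seed=0):
    """Toy PSO search for segment boundaries of a 1-D signal.

    A particle is a vector of interior boundaries; fitness is the total
    within-segment squared deviation (lower is better). Generic PSO
    sketch only -- not the paper's HMM-state formulation.
    """
    rng = random.Random(seed)
    n = len(signal)
    ndim = n_segments - 1

    def repair(pos):
        # keep boundaries integer, distinct, sorted, strictly inside (0, n)
        b = sorted({min(n - 1, max(1, int(round(p)))) for p in pos})
        while len(b) < ndim:
            b.append(rng.randrange(1, n))
            b = sorted(set(b))
        return b[:ndim]

    def fitness(bounds):
        cuts = [0] + list(bounds) + [n]
        cost = 0.0
        for a, b in zip(cuts, cuts[1:]):
            seg = signal[a:b]
            m = sum(seg) / len(seg)
            cost += sum((x - m) ** 2 for x in seg)
        return cost

    swarm = [[rng.uniform(1, n - 1) for _ in range(ndim)] for _ in range(n_particles)]
    vel = [[0.0] * ndim for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    pcost = [fitness(repair(p)) for p in swarm]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]

    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(ndim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive + social terms (standard PSO update)
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - p[d])
                             + 1.5 * r2 * (gbest[d] - p[d]))
                p[d] += vel[i][d]
            c = fitness(repair(p))
            if c < pcost[i]:
                pbest[i], pcost[i] = p[:], c
                if c < gcost:
                    gbest, gcost = p[:], c
    return repair(gbest), gcost
```

In the paper's setting the fitness would instead be an acoustic likelihood of the hypothesized segmentation under the HMMs, but the update mechanics are the same.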
104.
Protein–protein interactions (PPIs) play a fundamental role in various biological functions; thus, detecting PPI sites is essential for understanding diseases and developing new drugs. PPI prediction is of particular relevance for the development of drugs employing targeted protein degradation, as their efficacy relies on the formation of a stable ternary complex involving two proteins. However, experimental methods to detect PPI sites are both costly and time-intensive. In recent years, machine learning-based methods have been developed as screening tools. While they are computationally more efficient than traditional docking methods and thus allow rapid execution, these tools have so far primarily been based on sequence information, and they are therefore limited in their ability to address spatial requirements. In addition, they have to date not been applied to targeted protein degradation. Here, we present a new deep learning architecture based on the concept of graph representation learning that can predict interaction sites and interactions of proteins based on their surface representations. We demonstrate that our model reaches state-of-the-art performance using AUROC scores on the established MaSIF dataset. We furthermore introduce a new dataset with more diverse protein interactions and show that our model generalizes well to this new data. These generalization capabilities allow our model to predict the PPIs relevant for targeted protein degradation, which we show by demonstrating the high accuracy of our model for PPI prediction on the available ternary complex data. Our results suggest that PPI prediction models can be a valuable tool for screening protein pairs while developing new drugs for targeted protein degradation.
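The AUROC metric used for the comparison can be computed from per-point interface scores with the rank-based (Mann-Whitney U) formulation; a minimal sketch, independent of any particular model:

```python
def auroc(scores, labels):
    """AUROC via the rank (Mann-Whitney U) formulation.

    scores: predicted interface probabilities per surface point;
    labels: 1 for interface, 0 for non-interface. Equals the probability
    that a random positive outranks a random negative (ties count half).
    """
    pairs = sorted(zip(scores, labels))
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    # sum of positives' ranks, averaging ranks within tied scores
    rank_sum, i = 0.0, 0
    while i < len(pairs):
        j = i
        while j < len(pairs) and pairs[j][0] == pairs[i][0]:
            j += 1
        avg_rank = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        rank_sum += avg_rank * sum(lab for _, lab in pairs[i:j])
        i = j
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

Because it is rank-based, AUROC is insensitive to the absolute calibration of the predicted probabilities, which makes it a common choice for comparing interface predictors.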
105.
Although yeast are generally non-haemolytic, we have found that addition of alcohol vapour confers haemolytic properties on many strains of yeast and other fungi. We have called this phenomenon 'microbial alcohol-conferred haemolysis' (MACH). MACH is species- and strain-specific: whereas all six Candida tropicalis strains tested were haemolytic in the presence of ethanol, none among 10 C. glabrata strains tested exhibited this phenomenon. Among 27 C. albicans strains and 11 Saccharomyces cerevisiae strains tested, ethanol-mediated haemolysis was observed in 11 and 4 strains, respectively. Haemolysis is also dependent on the alcohol moiety: n-butanol and n-pentanol could also confer haemolysis, whereas methanol and 2-propanol did not. Haemolysis was found to be dependent on initial oxidation of the alcohol. Reduced haemolysis was observed in specific alcohol dehydrogenase mutants of both Aspergillus nidulans and S. cerevisiae. MACH was not observed during anaerobic growth, and was reduced in the presence of pararosaniline, an aldehyde scavenger. Results suggest that initial oxidation of the alcohol to the corresponding aldehyde is an essential step in the observed phenomenon.
106.
Objective: We systematically reviewed available randomized clinical trials (RCTs) to elucidate the overall effects of synbiotic supplementation in patients with nonalcoholic fatty liver disease (NAFLD).

Methods: PubMed, Scopus, ISI Web of Science and Google Scholar were searched up to December 2017. All RCTs using synbiotic supplements to treat NAFLD were included in this systematic review and meta-analysis. Mean differences (MD) were pooled using a random-effects model.

Results: Eleven eligible datasets from seven RCTs were identified for the present meta-analysis. Our results showed that synbiotic supplementation can decrease body weight, fasting blood sugar, insulin, low-density lipoprotein cholesterol, total cholesterol, triglyceride, high-sensitivity C-reactive protein, tumor necrosis factor alpha, alanine transaminase and aspartate transaminase levels among patients with NAFLD. In contrast, synbiotics did not have favorable effects on body mass index (BMI), waist circumference, homeostasis model assessment for insulin resistance (HOMA-IR), or high-density lipoprotein cholesterol (HDL) levels compared with the placebo group.

Conclusion: The current study revealed that synbiotic supplementation has favorable effects on inflammatory factors, liver enzymes, and some anthropometric indices, lipid profiles and glucose homeostasis parameters in patients with NAFLD.
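The random-effects pooling of mean differences mentioned in the methods is commonly done with the DerSimonian-Laird estimator; a minimal sketch with made-up study values, not data from the review:

```python
import math

def pool_mean_difference(effects, variances):
    """DerSimonian-Laird random-effects pooling of mean differences.

    effects: per-study mean differences; variances: their squared
    standard errors. Returns (pooled MD, its standard error, tau^2,
    the between-study variance estimate). Illustrative sketch only.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic around the fixed-effect MD
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # re-weight each study by total (within + between) variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

When the studies disagree more than chance allows (large Q), tau-squared grows and the pooled estimate down-weights precise outlier studies, which is the reason a random-effects model is preferred for clinically heterogeneous trials.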
107.
108.
109.
Fluid flow and mixing of molten steel in a twin-slab-strand continuous casting tundish were investigated using a mixing model under non-isothermal conditions. This model led to a set of ordinary differential equations that were solved with a Runge-Kutta algorithm. Steady-state water modeling was carried out under non-isothermal conditions, and experimental data obtained from the water model were used to calibrate the mixing model. Owing to the presence of mixed convection under non-isothermal conditions, a channelizing flow is created in the fluid inside the tundish. The mixing model is capable of predicting residence time distribution (RTD) curves for different cases under non-isothermal conditions, and the relationship between the RTD parameters and the tundish Richardson number (Tu) was obtained for various cases. The results show that the RTD parameters are completely different under isothermal and non-isothermal conditions: comparison of the RTD curves indicates that the extent of mixing in the tundish is lower under non-isothermal conditions than under isothermal conditions.
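The kind of ODE-based mixing model described here can be illustrated with a textbook tanks-in-series surrogate integrated by classical fourth-order Runge-Kutta. The tank count, residence time, and step size below are made-up illustrative values, not the calibrated tundish model:

```python
def rtd_tanks_in_series(n_tanks=3, mean_res_time=60.0, dt=0.05, t_end=300.0):
    """RTD of an N-tanks-in-series mixing model, integrated with RK4.

    A unit tracer pulse enters tank 1 at t=0; the exit concentration of
    the last tank traces the RTD curve E(t). A generic textbook surrogate
    for the paper's calibrated tundish model.
    """
    tau = mean_res_time / n_tanks  # residence time per tank

    def deriv(c):
        d = [0.0] * n_tanks
        d[0] = -c[0] / tau  # no inflow into tank 1 after the pulse
        for i in range(1, n_tanks):
            d[i] = (c[i - 1] - c[i]) / tau
        return d

    c = [0.0] * n_tanks
    c[0] = 1.0 / tau  # unit pulse, normalized so that E(t) integrates to 1
    times, e_curve = [0.0], [c[-1]]
    t = 0.0
    while t < t_end:
        # classical RK4 step
        k1 = deriv(c)
        k2 = deriv([ci + 0.5 * dt * ki for ci, ki in zip(c, k1)])
        k3 = deriv([ci + 0.5 * dt * ki for ci, ki in zip(c, k2)])
        k4 = deriv([ci + dt * ki for ci, ki in zip(c, k3)])
        c = [ci + dt * (a + 2 * b + 2 * g + h) / 6
             for ci, a, b, g, h in zip(c, k1, k2, k3, k4)]
        t += dt
        times.append(t)
        e_curve.append(c[-1])
    return times, e_curve
```

The RTD parameters discussed in the abstract (peak time, spread, dead volume) would then be read off this E(t) curve; in the paper the tank arrangement and parameters are instead fitted to the water-model data.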
110.
This research establishes a methodological framework for quantifying community resilience based on fluctuations in a population's activity during a natural disaster. Visits to points-of-interest (POIs) over time serve as a proxy for activities, capturing the combined effects of perturbations in lifestyles, the built environment, and the status of businesses. This study used digital trace data on unique visits to POIs in the Houston metropolitan area during Hurricane Harvey in 2017. Resilience metrics in the form of systemic impact, duration of impact, and general resilience (GR) values were examined for the region, along with their spatial distributions. The results show that certain categories, such as religious organizations and building material and supplies dealers, had better resilience metrics: low systemic impact, short duration of impact, and high GR. Other categories, such as medical facilities and entertainment, had worse resilience metrics: high systemic impact, long duration of impact, and low GR. Spatial analyses revealed that areas of the community with lower resilience metrics also experienced extensive flooding. This insight demonstrates the validity of the proposed approach for quantifying and analysing community resilience patterns using digital trace/location-intelligence data on population activities. While this study focused on the Houston metropolitan area and analysed only one natural hazard, the same approach could be applied to other communities and disaster contexts. Such resilience metrics bring valuable insight into prioritizing resource allocation in the recovery process.
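One plausible reading of the three metrics, sketched on a toy daily-visit series; the exact definitions used in the study may differ:

```python
def resilience_metrics(baseline, visits, threshold=0.95):
    """Toy resilience metrics from a daily POI-visit series.

    baseline: pre-event mean daily visits; visits: daily counts through
    the disruption window. Systemic impact = deepest relative drop;
    duration = days spent below `threshold` * baseline; general
    resilience = retained fraction of baseline activity over the window.
    Illustrative definitions only -- the paper's may differ in detail.
    """
    rel = [v / baseline for v in visits]
    systemic_impact = 1.0 - min(rel)
    duration = sum(1 for r in rel if r < threshold)
    general_resilience = sum(min(r, 1.0) for r in rel) / len(rel)
    return systemic_impact, duration, general_resilience
```

Under this reading, a resilient POI category is one whose visit series dips shallowly (low systemic impact), recovers quickly (short duration), and loses little cumulative activity (high GR).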
Copyright©北京勤云科技发展有限公司  京ICP备09084417号