4,893 results found (search time: 15 ms)
101.
The aim of this paper is to deal with an output controllability problem: driving the state of a distributed parabolic system to a state lying between two prescribed functions on a boundary subregion of the system's evolution domain, with minimum-energy control. Two necessary conditions are derived. The first is formulated in terms of the subdifferential associated with a minimized functional; the second as a system of equations for the arguments of the Lagrangian. Numerical illustrations show the efficiency of the second approach and lead to some conjectures. Recommended by Editorial Board member Fumitoshi Matsuno under the direction of Editor Jae Weon Choi. Zerrik El Hassan is a Professor at the University Moulay Ismail of Meknes, Morocco. He was an Assistant Professor in the Faculty of Sciences of Meknes and a researcher at the University of Perpignan (France). He received his doctorat d'état in regional systems analysis (1993) at the University Mohammed V of Rabat, Morocco. Professor Zerrik has written many papers and books in the area of systems analysis and control. He is now the head of the MACS (Modeling, Analysis and Control of Systems) research team at the University Moulay Ismail of Meknes, Morocco. Ghafrani Fatima is a researcher in the MACS team at the University Moulay Ismail of Meknes, Morocco. She has written many papers in the area of systems analysis and control.
102.
Powdered black pepper from Egyptian markets was irradiated with recommended doses of gamma rays (5.0 and 10.0 kGy) and with microwaves for different periods (20, 40 and 75 s) to improve its hygienic quality. The most common bacterial isolates belonged to three genera, Bacillus, Clostridium and Micrococcus (7.5 × 10⁶), whereas the predominant fungi (7.8 × 10⁴) were Aspergillus species: A. glaucus, A. flavus, A. niger and A. ochraceus. The gamma irradiation doses used (5.0 and 10 kGy) were sufficient to decrease spore-forming bacteria (SFB) and to inhibit the fungal flora and coliforms contaminating the black pepper powder. Microwave treatments for 40 s and 75 s were equally effective, whereas treatment for 20 s was less so. GLC analysis showed 31 peaks; 19 compounds were identified as monoterpene hydrocarbons (56.21%), the major ones being phellandrene and limonene. Sesquiterpenes were also present, mainly caryophyllene (3.69%), as well as oxygenated compounds such as terpenol, geraniol, Me-chavicol, eugenol and anisol. Gamma irradiation at 5 kGy and 10 kGy decreased the number of identified compounds from 21 (86.58% concentration) in untreated pepper to 16 (59.22%) and 15 (54.06%), respectively. In comparison, microwave treatments, particularly for 40 s and 75 s, increased the concentration of the same compounds. The results indicate that microwave treatment under these conditions is a safe and suitable decontamination technique for black pepper that does not cause the great loss of flavour compounds seen with the recommended doses of gamma irradiation.
103.

Background

The use of crowdsourcing in a pedagogically supported form to partner with learners in developing novel content is emerging as a viable approach for engaging students in higher-order learning at scale. However, how students behave in this form of crowdsourcing, referred to as learnersourcing, is still insufficiently explored.

Objectives

To contribute to filling this gap, this study explores how students engage with learnersourcing tasks across a range of course and assessment designs.

Methods

We conducted an exploratory study on trace data of 1279 students across three courses, originating from the use of a learnersourcing environment under different assessment designs. We employed a new methodology from the learning analytics (LA) field that aims to represent students' behaviour through two theoretically-derived latent constructs: learning tactics and the learning strategies built upon them.

Results

The study's results demonstrate that students use different tactics and strategies, highlight the association of learnersourcing contexts with the identified learning tactics and strategies, indicate a significant association between the strategies and performance, and contribute to the employed method's generalisability by applying it to a new context.

Implications

This study provides an example of how learning analytics methods can be employed towards the development of effective learnersourcing systems and, more broadly, technological educational solutions that support learner-centred and data-driven learning at scale. Findings should inform best practices for integrating learnersourcing activities into course design and shed light on the relevance of tactics and strategies to support teachers in making informed pedagogical decisions.
104.

In this article, we present a new set of hybrid polynomials and their corresponding moments, with a view to using them for the localization, compression and reconstruction of 2D and 3D images. These polynomials are formed from the Hahn and Krawtchouk polynomials. Their computation is successfully stabilized using modified recurrence relations with respect to the order n and the variable x, together with the symmetry property. The hybrid polynomials are generated in two forms: the first comprises the separable discrete orthogonal Krawtchouk–Hahn (DKHP) and Hahn–Krawtchouk (DHKP) polynomials, generated as products of the discrete orthogonal Hahn and Krawtchouk polynomials; the second is the squared equivalent of the first, consisting of the squared discrete Krawtchouk–Hahn (SKHP) and squared discrete Hahn–Krawtchouk (SHKP) polynomials. The experimental results clearly show the efficiency of the hybrid moments in terms of the localization property and computation time for 2D and 3D images compared with other types of moments; encouraging results are also obtained for reconstruction quality and compression, despite the superiority of classical polynomials there.

105.

The edge computing model offers an attractive platform for supporting scientific and real-time workflow-based applications at the edge of the network. However, scientific workflow scheduling and execution still face challenges such as response-time management and latency; in particular, the acquisition delay of servers deployed at the network edge must be handled to reduce the overall completion time of a workflow. Previous studies show that existing scheduling methods consider the static performance of the server and ignore the impact of resource acquisition delay when scheduling workflow tasks. Our proposed method uses a meta-heuristic algorithm to schedule scientific workflows and minimize the overall completion time by properly managing acquisition and transmission delays. We carry out extensive experiments and evaluations based on commercial clouds and various scientific workflow templates. The proposed method performs approximately 7.7% better than the baseline algorithms, particularly in the success rate achieved under overall deadline constraints.

106.
The Journal of Supercomputing - This paper designs and develops a computational intelligence-based framework using a convolutional neural network (CNN) and a genetic algorithm (GA) to detect COVID-19...
107.
Data available in software engineering for many applications contains variability, and it is not possible to say in advance which variables help the prediction. Most work on software defect prediction focuses on selecting the best prediction technique; for this purpose, deep learning and ensemble models have shown promising results. In contrast, very little research deals with cleaning the training data and selecting the best parameter values from it. Data available for training a model may have high variability, and this variability can decrease model accuracy. To deal with this problem we used the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) to select the best variables for training the model. A simple ANN with one input layer, one output layer and two hidden layers was used for training instead of a very deep and complex model. First, the variables were narrowed down using correlation values; then subsets for all possible variable combinations were formed. Finally, an artificial neural network (ANN) model was trained for each subset, and the best model was selected on the basis of the smallest AIC and BIC values. It was found that the combination of only two variables, ns and entropy, is best for software defect prediction, as it gives the minimum AIC and BIC values, while nm and npt is the worst combination, giving the maximum AIC and BIC values.
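The best-subset search the abstract describes can be sketched as follows. This is a simplified illustration, not the paper's code: an ordinary least-squares fit stands in for the paper's two-hidden-layer ANN, and the synthetic data and variable indices are hypothetical.

```python
import itertools
import math
import numpy as np

def aic_bic(y, y_pred, k):
    """AIC and BIC from the residual sum of squares of a fitted model
    with k estimated parameters (Gaussian-error form)."""
    n = len(y)
    rss = float(np.sum((y - y_pred) ** 2))
    aic = n * math.log(rss / n) + 2 * k
    bic = n * math.log(rss / n) + k * math.log(n)
    return aic, bic

# Synthetic data: only variables 0 and 2 actually drive the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Enumerate every variable subset, fit a model, keep the smallest AIC.
best = None
for r in range(1, X.shape[1] + 1):
    for cols in itertools.combinations(range(X.shape[1]), r):
        coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        aic, bic = aic_bic(y, X[:, cols] @ coef, k=len(cols) + 1)
        if best is None or aic < best[0]:
            best = (aic, bic, cols)

print(best[2])  # the AIC-selected variable subset
```

The penalty terms (2k for AIC, k·ln n for BIC) are what stop the search from always preferring the largest subset, which is the role they play in the paper's variable selection.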
108.

With the development of online social networking applications, microblogs have become a necessary online communication medium in daily life. Users are interested in obtaining personalized recommendations related to their tastes and needs. In some microblog systems tags are not available, or their use is rare; user-specified social relations are also extremely rare. Hence, sparsity is a problem in microblog systems. To address this problem, we propose a new framework called Pblog to alleviate sparsity. Pblog identifies users' interests via their microblogs and social relations and computes implicit similarity among users using a new algorithm; the experimental results indicate that this algorithm improves the results. In online social networks such as Twitter, the number of microblogs in the system is high and constantly increasing, so providing personalized recommendations to target users requires considerable time. To address this problem, the Pblog framework groups similar users using the analytic hierarchy process (AHP) method, then prunes the microblogs of the target user's group and recommends those with higher ratings to the target user. In the experiments, the Pblog framework was compared with several other frameworks, all run on two datasets, Twitter and Tumblr. Based on these comparisons, the Pblog framework provides more appropriate recommendations to the target user than previous frameworks.

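The AHP method named in the abstract above reduces to extracting priority weights from a pairwise-comparison matrix and checking its consistency. A minimal sketch, assuming a standard Saaty-scale matrix for three criteria; the matrix entries are hypothetical, not taken from the paper:

```python
import numpy as np

# Pairwise-comparison matrix on the 1-9 Saaty scale (reciprocal by construction):
# criterion 0 is moderately preferred to 1, strongly preferred to 2.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority weights = principal eigenvector of A, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
i = np.argmax(np.real(vals))
w = np.real(vecs[:, i])
w = w / w.sum()

# Consistency ratio CR = (lambda_max - n) / ((n - 1) * RI);
# RI = 0.58 is the standard random index for n = 3. CR < 0.1 is acceptable.
lam = float(np.real(vals[i]))
n = A.shape[0]
cr = (lam - n) / ((n - 1) * 0.58)

print(np.round(w, 3), round(cr, 3))
```

The resulting weight vector ranks the criteria (here criterion 0 dominates), which is the kind of ranking a framework like Pblog could use to group or prioritize similar users.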
109.
In this study, we demonstrate Zn₁₋ₓFeₓS (x = 0.0, 0.25, 0.50, 0.75, and 1.0) device applications by reporting electronic, magnetic, and optical properties, computed with the Wien2k software using density functional theory (DFT). The modified Becke–Johnson (mBJ) potential has been applied to accurately determine the material band gap. The presence of half-metallic ferromagnetism (HMF) is demonstrated. Moreover, the observed ferromagnetism is justified in terms of various splitting energies and the exchange constants. The Fe magnetic moment decreases from 4.0 μB due to the strong p–d hybridization. A complete set of optical parameters is also presented. The variation in the calculated static dielectric constant due to Fe doping is inversely related to the band gap, which verifies Penn's model. Moreover, the band gap of ZnS is tunable by Fe doping from the ultraviolet to the visible region, indicating that the materials are appropriate for optoelectronic devices.
110.
Six Sigma is a quality philosophy and methodology that aims to achieve operational excellence and delighted customers. The cost of poor quality depends on the sigma quality level and its corresponding failure rate. Six Sigma provides a well-defined target of 3.4 defects per million. This failure rate is commonly evaluated under the assumption that the process is normally distributed and its specifications are two-sided; however, these assumptions may lead to quality-improvement strategies based on inaccurate evaluations of quality costs and profits. This paper defines the relationship between failure rate and sigma quality level for inverse Gaussian processes. The inverse Gaussian distribution has considerable application in describing cycle times, product life, employee service times, and so on. We show that for these processes, attaining the Six Sigma target failure rate requires greater quality effort than for normal processes. A generic model is presented to characterise cycle times in manufacturing systems: the asymptotic production is described by a drifted Brownian motion, and the cycle time is evaluated using the first-passage-time theory of a Wiener process to a boundary. The proposed method estimates the effort required to reach Six Sigma goals.
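The 3.4 defects-per-million target quoted above comes from the conventional normal-distribution calculation with the customary 1.5-sigma long-term mean shift. A minimal sketch of that baseline calculation (the normal case the paper contrasts with, not its inverse Gaussian model):

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a one-sided specification
    at sigma_level, allowing the conventional 1.5-sigma long-term
    mean shift. Uses the normal tail: P(Z > z) = erfc(z / sqrt(2)) / 2."""
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2.0))
    return tail * 1e6

print(round(dpmo(6.0), 1))  # → 3.4, the classic Six Sigma target
print(round(dpmo(3.0)))     # a 3-sigma process fails far more often
```

Replacing the normal tail probability with an inverse Gaussian tail, as the paper does, yields a larger failure rate at the same nominal sigma level, which is why those processes need greater effort to hit the 3.4 DPMO target.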

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号