1.
Recently, Nowotarski et al. (2013) found that wavelet-based models for the long-term seasonal component (LTSC) are not only better at extracting the LTSC from a series of spot electricity prices but also significantly more accurate at forecasting these prices up to a year ahead than the commonly used monthly-dummies and sine-based models. However, a clear disadvantage of the wavelet-based approach is its increased complexity compared to the other two classes of LTSC models, and the resulting need for dedicated numerical software, which may not be readily available to practitioners in their work environments. To address this problem, we propose here a much simpler, yet equally powerful, method for identifying the LTSC in electricity spot price series. It makes use of the Hodrick–Prescott (HP) filter, a widely recognized tool in macroeconomics.
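The HP filter itself is simple to implement: the trend τ minimizes ||y − τ||² + λ||Dτ||², where D is the second-difference operator, which has the closed form τ = (I + λDᵀD)⁻¹y. A minimal dense-matrix sketch follows (the λ values and series are illustrative; production code would use a sparse solver or a library implementation such as the one in statsmodels):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Return the HP trend of series y: the minimizer of
    sum (y - tau)^2 + lam * sum (second differences of tau)^2,
    i.e. tau = (I + lam * D'D)^{-1} y with D the second-difference operator."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Build the (n-2) x n second-difference matrix D.
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

A perfectly linear series has zero second differences, so the filter returns it unchanged; for noisy data the extracted trend has strictly smaller curvature than the input.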
2.
Materials Letters, 2007, 61(8–9): 1881–1884
The organic ultraviolet (UV) absorbents cinnamic acid (CA) and p-methoxycinnamic acid (PMOCA) were intercalated into Zn2Al layered double hydroxides (Zn2Al-LDHs) by a co-precipitation reaction, yielding the organic–inorganic nanocomposites Zn2Al-LDH/CA and Zn2Al-LDH/PMOCA. The samples showed excellent UV absorption ability, and their catalytic activity for the air oxidation of castor oil greatly decreased when the organic UV absorbents were intercalated into the layers of the Zn2Al-LDHs. These studies suggest that Zn2Al-LDH/organic UV absorbent nanocomposites could be used as safe sunscreen materials.
3.
Subjective sentence identification aims to find opinion-bearing sentences in a text collection. Based on probabilistic topic models, this paper proposes a topic-incorporating model for subjective sentence identification. The model identifies sentence subjectivity by taking topic factors into account, while simultaneously mining the latent subjective topics in the collection. The proposed model is a weakly supervised generative model: it requires no large labeled corpus for training, only a small domain-independent subjectivity lexicon to modify the model's prior. Experiments show that the model effectively improves the recall and F-measure of sentence identification, and that the extracted subjective topics carry strong semantic information.
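The role of the subjectivity lexicon can be illustrated with a deliberately simplified stand-in (this is not the paper's weakly supervised generative model; the lexicon, tokenization, and threshold below are invented for illustration):

```python
def subjectivity_score(tokens, subj_lexicon, threshold=0.1):
    """Score a sentence by the fraction of its tokens found in a small
    domain-independent subjectivity lexicon; flag it subjective when
    the fraction reaches the threshold."""
    hits = sum(1 for t in tokens if t in subj_lexicon)
    score = hits / max(len(tokens), 1)
    return score, score >= threshold
```

In the paper's model the lexicon shifts the prior of a generative topic model rather than thresholding directly, but the intuition is the same: a handful of domain-independent cue words is enough weak supervision to separate subjective from objective sentences.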
4.
To address congestion at the campus network egress, this paper proposes a user-interest modeling method based on behavior analysis. Users' degree of interest in different classes of applications is measured and computed from their engagement time. Building on these interest degrees, bandwidth management policies are studied in combination with channel management techniques to realize dynamic bandwidth management. Deployment results show that the method effectively improves users' satisfaction with the network.
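A minimal sketch of the interest-degree idea, assuming interest is simply each application class's share of total engagement time and egress bandwidth is split proportionally (the class names and numbers are invented; the paper's channel-management layer is omitted):

```python
def bandwidth_shares(engagement_seconds, total_bandwidth):
    """Interest degree of each application class = its share of total
    engagement time; egress bandwidth is then allocated in proportion."""
    total = sum(engagement_seconds.values())
    return {app: total_bandwidth * t / total
            for app, t in engagement_seconds.items()}
```

For example, splitting a 100 Mbit/s egress across classes with 300 s, 600 s, and 100 s of engagement yields 30, 60, and 10 Mbit/s respectively; recomputing the shares periodically gives the dynamic behavior described in the abstract.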
5.
Automated trust negotiation establishes mutual trust between two strangers by iteratively requesting and disclosing digital credentials. Existing research on negotiation strategies for automated trust negotiation suffers from a number of problems. This paper proposes an automated trust negotiation model based on expectation factors, which adopts the MCD strategy. By analyzing the expectation factor of each credential, the model finds a successful negotiation path that discloses and requests a minimal set of credentials when a feasible negotiation exists, and detects and terminates the negotiation as early as possible when it cannot succeed. It guarantees that no irrelevant credentials are disclosed during negotiation and that neither party's access control policies need to be exposed. Experiments also verify that the MCD strategy is complete.
6.
Fuzzy local maximal marginal embedding for feature extraction
In graph-based linear dimensionality reduction algorithms, it is crucial to construct a neighbor graph that correctly reflects the relationships between samples. This paper presents an improved algorithm called fuzzy local maximal marginal embedding (FLMME) for linear dimensionality reduction. What distinguishes it from existing graph-based algorithms is that FLMME constructs two novel fuzzy gradual graphs, which pull near-neighbor samples of the same class closer together and push far-neighbor samples on the margin between different classes farther apart when they are projected into the feature subspace. Through the fuzzy gradual graphs, the FLMME algorithm is less sensitive to sample variations caused by varying illumination, expression, viewing conditions and shape. The proposed FLMME algorithm is evaluated through experiments on the Wine database, the Yale and ORL face image databases, and the USPS handwritten digit database. The results show that FLMME outperforms PCA, LDA, LPP and local maximal marginal embedding.
7.
To enable the immediate and efficient dispatch of relief to victims of disaster, this study proposes a greedy-search-based multi-objective genetic algorithm capable of regulating the distribution of available resources and automatically generating a variety of feasible emergency logistics schedules for decision-makers. The proposed algorithm dynamically adjusts distribution schedules from various supply points according to the requirements at demand points in order to minimize unsatisfied demand for resources, time to delivery, and transportation costs. The algorithm was applied to the case of the Chi–Chi earthquake in Taiwan to verify its performance. Simulation results demonstrate that, under conditions of a limited/unlimited number of available vehicles, the proposed algorithm outperforms the MOGA and the standard greedy algorithm in time to delivery by an average of 63.57% and 46.15%, respectively, based on 10,000 iterations.
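The study's full method is a multi-objective genetic algorithm seeded by greedy search; as a rough illustration of the greedy component alone, the following single-objective sketch (the supply points, demand points, and arc costs are invented) ships along the cheapest remaining supply-to-demand arcs until demand is met or supplies run out:

```python
def greedy_dispatch(supply, demand, cost):
    """Ship along the cheapest remaining supply->demand arc until
    supplies or demands run out.  supply/demand map node -> units;
    cost maps (supply_node, demand_node) -> transport cost per arc."""
    supply, demand = dict(supply), dict(demand)
    shipments = {}
    for s, d in sorted(cost, key=cost.get):  # cheapest arcs first
        qty = min(supply[s], demand[d])
        if qty > 0:
            shipments[(s, d)] = qty
            supply[s] -= qty
            demand[d] -= qty
    unmet = sum(demand.values())  # residual unsatisfied demand
    return shipments, unmet
```

In the paper this greedy allocation serves only as a seed; the genetic algorithm then trades off unmet demand, delivery time, and transport cost across whole schedules rather than single arcs.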
8.
The viability of networked communities depends on the creation and disclosure of user-generated content and the frequency of user visitation (Facebook 10-K Annual Report, 2012). However, little is known about how to align the interests of users and social networking sites. In this study, we draw upon the principal-agent perspective to extend Pavlou et al.'s (2007) uncertainty mitigation model of online exchange relationships and propose an empirically tested model for aligning the incentives of the principal (user) and the agent (service provider). As suggested by Pavlou et al., we incorporated a multi-dimensional measure of trust: trust of provider and trust of members. The proposed model is empirically tested with survey data from 305 adults aged 20–55. The results support our model, delineating how real individuals with bounded rationality actually make decisions about information disclosure under uncertainty in the social networking site context. We find little to no relationship between online privacy concerns and information disclosure on online social network sites. Perceived benefits provide the linkage between the incentives of principal (user) and agent (provider), while usage intensity demonstrated the most significant impact on information disclosure. We argue that this phenomenon may be explained through Communication Privacy Management Theory. The present study enhances our understanding of agency theory and human judgment theory in the context of social media. Practical implications for understanding and facilitating online social exchange relationships are also discussed.
9.
In this paper, we consider interactive fuzzy programming for multi-level 0–1 programming problems involving random variable coefficients in both objective functions and constraints. Following the probability maximization model together with the concept of chance constraints, the formulated stochastic multi-level 0–1 programming problems are transformed into deterministic ones. Taking into account the vagueness of the decision makers' judgments, we present interactive fuzzy programming. In the proposed interactive method, after determining the fuzzy goals of the decision makers at all levels, a satisfactory solution is derived efficiently by updating the satisfactory levels of the decision makers with consideration of the overall satisfactory balance among all levels. For solving the transformed deterministic problems efficiently, we also introduce a novel tabu search for general 0–1 programming problems. A numerical example of a three-level 0–1 programming problem illustrates the proposed method.
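The abstract does not specify the tabu search itself; a generic bit-flip tabu search on a small 0–1 knapsack instance (the problem data, tenure, and iteration budget are invented) conveys the basic mechanics of tabu tenure and the aspiration criterion:

```python
def tabu_search_01(values, weights, capacity, iters=100, tenure=5):
    """Bit-flip tabu search for: max sum(values[i]*x[i])
    subject to sum(weights[i]*x[i]) <= capacity, x binary."""
    n = len(values)

    def score(sol):
        w = sum(wi for wi, xi in zip(weights, sol) if xi)
        # Infeasible solutions score below any feasible one.
        return -1 if w > capacity else sum(vi for vi, xi in zip(values, sol) if xi)

    cur = [0] * n
    best, best_val = cur[:], score(cur)
    tabu = {}  # flipped bit -> iteration until which re-flipping stays tabu
    for it in range(iters):
        cand, cand_val, cand_i = None, None, None
        for i in range(n):
            nb = cur[:]
            nb[i] ^= 1
            v = score(nb)
            # Skip tabu moves unless they beat the best found (aspiration).
            if tabu.get(i, -1) >= it and v <= best_val:
                continue
            if cand_val is None or v > cand_val:
                cand, cand_val, cand_i = nb, v, i
        if cand is None:
            break  # every neighbor is tabu
        cur = cand
        tabu[cand_i] = it + tenure
        if cand_val > best_val:
            best, best_val = cand[:], cand_val
    return best, best_val
```

Note that the search is allowed to move through infeasible solutions (score −1); together with the tabu list this lets it escape local optima that a pure greedy ascent would get stuck in.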
10.
Partitioning the universe of discourse and determining intervals that contain useful temporal information and offer better interpretability are critical for forecasting with fuzzy time series. In the existing literature, researchers seldom consider the effect of the time variable when partitioning the universe of discourse; as a result, the resulting temporal intervals lack interpretability. In this paper, we take temporal information into account to partition the universe of discourse into intervals of unequal length, which improves forecasting quality. First, the time variable is involved in partitioning the universe through Gath–Geva clustering-based time series segmentation to obtain prototypes of the data; suitable intervals are then determined from the prototypes by means of information granules. An effective method of partitioning and determining intervals is proposed, and we show that these intervals carry well-defined semantics. To verify the effectiveness of the approach, we apply the proposed method to forecast student enrollments at the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index. The experimental results show that partitioning with temporal information can greatly improve forecasting accuracy. Furthermore, the proposed method is not sensitive to its parameters.
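The prototype-driven, unequal-length partitioning can be sketched with a much simpler stand-in (plain 1-D k-means with quantile initialization replaces the paper's Gath–Geva clustering and information granules; the series and k are invented): cut points fall midway between sorted prototypes, so dense regions of the data get narrower intervals.

```python
import numpy as np

def prototype_intervals(series, k=3, iters=50):
    """Partition the universe of discourse into k unequal-length
    intervals whose cut points lie midway between 1-D k-means
    prototypes (deterministic quantile initialization)."""
    x = np.asarray(series, dtype=float)
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))  # initial prototypes
    for _ in range(iters):
        # Assign each point to its nearest prototype, then recenter.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            pts = x[labels == j]
            if len(pts):
                centers[j] = pts.mean()
    centers = np.sort(centers)
    cuts = (centers[:-1] + centers[1:]) / 2.0
    # Interval boundaries: [min, cut_1, ..., cut_{k-1}, max]
    return [float(x.min()), *map(float, cuts), float(x.max())]
```

On data clustered around 0, 10, and 20, the three resulting intervals split roughly at 5 and 15, i.e. the boundaries track where the data actually lives rather than dividing the range into equal thirds.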
Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司). 京ICP备09084417号