81.
The SHARC framework for data quality in Web archiving   (Cited: 1; self-citations: 0; citations by others: 1)
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but politeness etiquette and the completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing data quality in Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit-revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit-revisit strategies revisit pages after their initial downloads to check for intermediate changes. The revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change prediction.
All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
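The coherence measure described above (the number of pages that provably did not change during the capture) can be sketched as follows. The `PageCapture` structure and the example change logs are illustrative assumptions, not SHARC's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class PageCapture:
    url: str
    download_time: float  # when the crawler fetched this page
    change_times: list = field(default_factory=list)  # observed change instants

def coherence(capture, ref_time):
    """Count pages whose content did not change between their download
    instant and the reference instant ref_time (deterministic coherence)."""
    coherent = 0
    for page in capture:
        lo, hi = sorted((page.download_time, ref_time))
        if not any(lo < t < hi for t in page.change_times):
            coherent += 1
    return coherent

site = [
    PageCapture("/index", 0.0, [2.5]),  # changed mid-capture -> incoherent
    PageCapture("/about", 1.0, []),     # never changed -> coherent
    PageCapture("/news",  2.0, [0.5]),  # changed before download -> coherent
]
print(coherence(site, ref_time=3.0))  # -> 2
```

A revisit strategy would, in these terms, schedule second downloads so as to push as many pages as possible into the "no change in the interval" case.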
82.
Modeling spatially distributed phenomena in terms of their controlling factors is a recurring problem in geoscience. Most efforts concentrate on predicting the value of a response variable in terms of controlling variables, either through a physical model or a regression model. However, many geospatial systems comprise complex, nonlinear, and spatially non-uniform relationships, making it difficult to even formulate a viable model. This paper focuses on spatial partitioning of controlling variables that are attributed to a particular range of a response variable. Thus, the presented method surveys spatially distributed relationships between predictors and response. The method is based on the association analysis technique of identifying emerging patterns, which is extended in order to be applied more effectively to geospatial data sets. The outcome of the method is a list of spatial footprints, each characterized by a unique "controlling pattern"—a list of specific values of predictors that locally correlate with a specified value of the response variable. Mapping the controlling footprints reveals geographic regionalization of the relationship between predictors and response. The data mining underpinnings of the method are given, and its application to a real-world problem is demonstrated using an expository example focused on determining the variety of environmental associations of high vegetation density across the continental United States.
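The emerging-pattern idea behind the method can be illustrated with a minimal sketch: find predictor-value patterns whose support among cells with the target response value exceeds a growth threshold relative to the background. The cell representation, growth threshold, and example predictors below are assumptions for illustration, not the paper's implementation:

```python
from itertools import combinations

def emerging_patterns(cells, target, min_growth=2.0, max_len=2):
    """cells: dicts of discretized predictor values plus a 'response' key.
    Returns patterns (frozensets of (predictor, value) items) whose support
    among target-response cells is at least min_growth times their support
    in the remaining (background) cells."""
    pos = [c for c in cells if c["response"] == target]
    neg = [c for c in cells if c["response"] != target]

    def support(pattern, group):
        if not group:
            return 0.0
        return sum(all(c.get(k) == v for k, v in pattern) for c in group) / len(group)

    items = {(k, v) for c in pos for k, v in c.items() if k != "response"}
    found = {}
    for n in range(1, max_len + 1):
        for pattern in combinations(sorted(items), n):
            sp, sn = support(pattern, pos), support(pattern, neg)
            if sp > 0 and (sn == 0 or sp / sn >= min_growth):
                found[frozenset(pattern)] = (sp, sn)
    return found

# Toy grid cells (hypothetical predictors for a vegetation-density example)
cells = [
    {"precip": "high", "soil": "loam", "response": "high"},
    {"precip": "high", "soil": "clay", "response": "high"},
    {"precip": "low",  "soil": "loam", "response": "low"},
    {"precip": "low",  "soil": "clay", "response": "low"},
]
patterns = emerging_patterns(cells, "high")
```

The spatial footprint of a pattern would then simply be the set of cells that match it, which is what gets mapped to reveal regionalization.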
83.
In interactive theorem proving practice, a significant amount of time is spent on unsuccessful proof attempts of wrong conjectures. An automatic method that reveals them by generating finite counterexamples would offer extremely valuable support for a proof engineer, saving time and effort. In practice, such counterexamples tend to be small, so usually there is no need to search for big instances. Most definitions of functions or predicates on infinite structures do not preserve the semantics if a transition to arbitrary finite substructures is made. We propose constraints which guarantee a correct axiomatization on finite structures and present an approach which uses the Alloy Analyzer to generate finite instances of theories in the theorem prover KIV. It is evaluated on the library of basic data types as well as on some challenging case studies in KIV. The technique is implemented using the Kodkod constraint solver, a successor of Alloy.
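The core idea, searching small finite instances for a counterexample to a wrong conjecture, can be sketched without Alloy or KIV as brute-force enumeration (a toy stand-in for what Kodkod does symbolically; the list-reversal conjecture is an invented example):

```python
from itertools import product

def find_counterexample(conjecture, domain, max_size):
    """Exhaustively enumerate tuples over `domain` up to length max_size
    and return the first instance falsifying the conjecture, or None.
    Mirrors (very crudely) bounded model finding: small scopes first."""
    for size in range(max_size + 1):
        for xs in product(domain, repeat=size):
            if not conjecture(list(xs)):
                return list(xs)
    return None

# Wrong conjecture: every list equals its own reverse.
cex = find_counterexample(lambda xs: xs == xs[::-1], domain=[0, 1], max_size=3)
print(cex)  # -> [0, 1]
```

Because counterexamples to wrong conjectures tend to be small, as the abstract notes, even a tiny scope like this often suffices to expose the error.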
84.
This paper introduces a system for real-time physiological measurement, analysis, and metaphorical visualization within a virtual environment (VE). Our goal is to develop a method that allows humans to unconsciously relate to parts of an environment more strongly than to others, purely induced by their own physiological responses to the virtual reality (VR) displays. In particular, we exploit heart rate, respiration, and galvanic skin response in order to control the behavior of virtual characters in the VE. Such unconscious processes may become a useful tool for storytelling, or may assist in guiding participants through a sequence of tasks in order to make the application more interesting, e.g., in rehabilitation. We claim that anchoring subjective bodily states to a VR can enhance a person's sense of realism of the VR and ultimately create a stronger relationship between humans and the VR.
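A minimal sketch of such a control loop, mapping heart rate, respiration, and galvanic skin response to a character's behavior. The baselines, normalization, and thresholds below are invented for illustration and are not the paper's actual model:

```python
def arousal_score(heart_rate, respiration_rate, gsr,
                  baselines=(70.0, 15.0, 2.0)):
    """Crude arousal estimate: mean relative deviation of each signal from
    a resting baseline (units: bpm, breaths/min, microsiemens)."""
    signals = (heart_rate, respiration_rate, gsr)
    return sum((s - b) / b for s, b in zip(signals, baselines)) / len(signals)

def character_behavior(score):
    """Map the participant's arousal level to a virtual character's reaction."""
    if score > 0.5:
        return "approach"  # character is drawn to a highly aroused participant
    if score > 0.1:
        return "glance"
    return "ignore"

print(character_behavior(arousal_score(95, 22, 3.5)))  # -> approach
```

In a real system the raw signals would arrive as streams and need filtering and per-subject calibration before any such mapping is applied.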
86.
Micro injection molded polymeric parts coated with functional thin films/layers show promising applications in the microsystems area. However, the unfavorable and unavoidable weld-line defect in micro injection molded parts leads to detrimental mechanical and surface properties. The possibility of using functional thin films to enhance micro injection molded weld lines was investigated. Two typical coating materials (aluminum and titanium) with various film thicknesses (400, 600, 800 nm) were deposited on one side of micro injection molded weld-line tensile samples via the physical vapor deposition (PVD) method. The coated micro weld-line samples were characterized by tensile tests. The results show that PVD films of aluminum and titanium can reinforce the strength and stiffness of micro injection molded weld lines, even at thin thickness levels. However, as the film thickness increases, the weaker adhesion between the metallic films and the polymer reduces the PVD films' enhancing effect on micro weld-line mechanical properties, owing to degradation of the polymer caused by longer exposure to high temperature.
87.
The injection molding of micro-structures is a promising mass-production method for a broad range of materials. However, the replication quality of these structures depends significantly on the heat flow during the filling stage. In this paper, the filling and heat transfer of v-groove and random structures below 5 μm is investigated with the help of an AFM (atomic force microscope) and thermocouples. A numerical model is developed to predict the filling of surface structures during the filling and packing stages. The model uses simple fully developed flow solutions that take the power-law material model into account. This permits investigation of how several processing parameters affect the polymer flow in the surface structures. The mold wall temperature, which has significant effects on the polymer flow, is varied using a variothermal mold temperature control system to validate the proposed model.
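The fully developed slit-flow solution for a power-law fluid that such a model builds on can be written down directly; this is the textbook velocity profile for pressure-driven flow between parallel plates, not the paper's full numerical model, and the parameter values below are invented for the check:

```python
def powerlaw_velocity(y, half_gap, pressure_gradient, K, n):
    """Fully developed velocity u(y) of a power-law fluid
    (viscosity eta = K * shear_rate**(n - 1)) between parallel plates at
    y = +/- half_gap, driven by pressure gradient G = -dp/dx:

        u(y) = n/(n+1) * (G/K)**(1/n) * (h**((n+1)/n) - |y|**((n+1)/n))
    """
    h, G = half_gap, pressure_gradient
    e = (n + 1.0) / n
    return n / (n + 1.0) * (G / K) ** (1.0 / n) * (h ** e - abs(y) ** e)

# Newtonian sanity check (n = 1): centreline velocity reduces to G*h^2/(2*K)
u0 = powerlaw_velocity(0.0, half_gap=1e-3, pressure_gradient=1e8, K=1e3, n=1.0)
print(u0)  # -> 0.05  (m/s)
```

A shear-thinning melt (n < 1) flattens this profile, which is one reason the power-law exponent matters for how far the melt penetrates into sub-micrometre surface structures.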
88.
Context: A number of approaches have been proposed for the general problem of software component evaluation and selection. Most approaches come from the field of Component-Based Software Development (CBSD), tackle the problem of commercial-off-the-shelf component selection, and use goal-oriented requirements modelling and multi-criteria decision making techniques. Evaluation of the suitability of components is carried out largely manually and partly relies on subjective judgement. However, in dynamic, distributed environments with high demands for transparent selection processes leading to trustworthy, auditable decisions, subjective judgements and vendor claims are not considered sufficient. Furthermore, continuous monitoring and re-evaluation of components after integration is sometimes needed.

Objective: This paper describes how an evidence-based approach to component evaluation can improve repeatability and reproducibility of component selection under the following conditions: (1) functional homogeneity of candidate components and (2) a high number of components and selection problem instances.

Method: Our evaluation and selection method and tool empirically evaluate candidate components in controlled experiments by applying automated measurements. By analysing the differences to system-development-oriented scenarios, the paper shows how the process of utility analysis can be tailored to fit the problem space, and describes a method geared towards automated evaluation in an empirical setting. We describe tool support and a framework for automated measurements. We further present a taxonomy of decision criteria for the described scenario and discuss the data collection means needed for each category of criteria.

Results: To evaluate our approach, we discuss a series of case studies in the area of digital preservation. We analyse the criteria defined in these case studies, classify them according to the taxonomy, and discuss the quantitative coverage of automated measurements.

Conclusion: The results of the analysis show that an automated measurement, evaluation and selection framework is necessary and feasible to ensure trusted and repeatable decisions.
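The utility-analysis step, scoring components against weighted, normalized criteria measurements, can be sketched as follows. The criteria names, weights, and normalizers are hypothetical examples, not the paper's taxonomy:

```python
def utility(component, criteria):
    """Weighted-sum utility of one component; criteria maps each criterion
    name to (weight, normalizer), where the normalizer maps the raw
    automated measurement to a [0, 1] score."""
    return sum(w * norm(component[name]) for name, (w, norm) in criteria.items())

def select(components, criteria):
    """Rank candidate components by measured utility (highest first)."""
    return sorted(components, key=lambda c: utility(c, criteria), reverse=True)

criteria = {  # weight, normalizer -- hypothetical measurement scales
    "throughput_mb_s": (0.5, lambda v: min(v / 100.0, 1.0)),
    "error_rate":      (0.3, lambda v: 1.0 - v),
    "memory_mb":       (0.2, lambda v: max(1.0 - v / 1024.0, 0.0)),
}
candidates = [
    {"name": "conv-a", "throughput_mb_s": 80, "error_rate": 0.02, "memory_mb": 256},
    {"name": "conv-b", "throughput_mb_s": 40, "error_rate": 0.00, "memory_mb": 128},
]
best = select(candidates, criteria)[0]
print(best["name"])  # -> conv-a
```

The point of the evidence-based approach is that every raw value fed into such a scoring function comes from an automated, repeatable measurement rather than a vendor claim, so re-running the selection yields the same ranking.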