21.
Blogs can be used as a conduit for customer opinions and, in so doing, build communities around products. We attempt to realise this vision by building blogs out of product catalogues. Unfortunately, the immaturity of blog engines makes this endeavour risky. This paper presents a model-driven approach to address this drawback. This implies the introduction of (meta)models: the catalogue model, based on the standard Open Catalog Format, and blog models, which elaborate on the use of blogs as conduits for virtual communities. Blog models are ultimately realised through blog engines. Specifically, we focus on two types of engines: a hosted blog platform and a standalone blog platform, both in Blojsom. However, the lack of standards in a broad and constantly evolving blog-engine space hinders both the portability and the maintainability of the solution. Hence, we resort to the notion of an "abstract platform" as a way to abstract away from the peculiarities of specific blog engines. Additionally, the paper measures the reuse gains brought by MDE in comparison with the manual coding of blogs.
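As an illustration of the model-driven idea sketched above, the following toy transformation generates blog-post stubs from a product catalogue. It is only a minimal sketch in Python; the catalogue fields and the template are illustrative assumptions, not the Open Catalog Format metamodel or the Blojsom artefacts used in the paper.

```python
# A toy "catalogue model": in the paper this would conform to the Open Catalog Format;
# here it is just a list of dictionaries with illustrative fields.
catalogue = [
    {"id": "p-001", "name": "Trail Backpack 30L", "category": "outdoor", "price": 59.90},
    {"id": "p-002", "name": "Thermal Mug", "category": "kitchen", "price": 14.50},
]

POST_TEMPLATE = """Title: {name}
Category: {category}
Tags: product, {category}, {id}

Tell us what you think about {name} (listed at {price:.2f} EUR).
Comments below become part of the product's community thread.
"""

def catalogue_to_posts(entries):
    """Model-to-text transformation: one blog-post stub per catalogue entry."""
    return {entry["id"]: POST_TEMPLATE.format(**entry) for entry in entries}

for post_id, body in catalogue_to_posts(catalogue).items():
    print(f"--- {post_id} ---\n{body}")
```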
22.
In software process improvement (SPI), few small organizations use models to guide the management and deployment of their improvement initiatives. This is largely because many of these models consider neither the special characteristics of small businesses nor appropriate strategies for deploying an SPI initiative in this type of organization. Moreover, the models that direct improvement implementation in small settings do not present an explicit process with which to organize and guide the work of the employees involved in implementing the improvement opportunities. In this paper we propose a lightweight process that takes into account appropriate strategies for this type of organization. Our proposal, known as a "Lightweight process to incorporate improvements", follows the philosophy of the Scrum agile method and aims to give detailed guidelines for managing and carrying out the incorporation of improvement opportunities into processes and putting them into practice in small companies. We have applied the proposed process in two small companies using the case study research method, and the initial results indicate that it is indeed suitable for small businesses.
23.
Over the last 30 years, several dynamic memory managers (DMMs) have been proposed. Such DMMs include first fit, best fit, segregated fit and buddy systems. Since the performance, memory usage and energy consumption of each DMM differ, software engineers often face difficult choices in selecting the most suitable approach for their applications. This issue has a special impact in the field of portable consumer embedded systems, which must execute a limited set of multimedia applications (e.g., 3D games, video players, signal processing software, etc.) that demand high performance and extensive memory usage at low energy consumption. Recently, we developed a novel methodology based on genetic programming to automatically design custom DMMs, optimizing performance, memory usage and energy consumption. However, although this process is automatic and faster than state-of-the-art optimizations, it demands intensive computation, making it time-consuming. Parallel processing can therefore be very useful, allowing more solutions to be explored in the same time and new algorithms to be implemented. In this paper we present a novel parallel evolutionary algorithm for DMM optimization in embedded systems, based on the Discrete Event System Specification (DEVS) formalism over a Service Oriented Architecture (SOA) framework. Parallelism significantly improves the performance of the sequential exploration algorithm. On the one hand, when the number of generations is the same in both approaches, our parallel optimization framework reaches a speed-up of 86.40× compared with other state-of-the-art approaches. On the other hand, it improves the global quality (i.e., level of performance, low memory usage and low energy consumption) of the final DMM by 36.36% with respect to two well-known general-purpose DMMs and two state-of-the-art optimization methodologies.
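The sketch below shows the general pattern of parallelising fitness evaluation in an evolutionary search, not the DEVS/SOA framework described above; the genome encoding and the evaluate_dmm cost function are hypothetical placeholders for a real DMM simulation against an allocation trace.

```python
import random
from multiprocessing import Pool

def evaluate_dmm(genome):
    """Hypothetical fitness: a weighted cost combining performance, memory usage and
    energy. A real evaluation would simulate the candidate DMM on an allocation trace."""
    perf = sum(genome) % 7 + 1
    mem = len(set(genome)) + 1
    energy = sum(g * g for g in genome) % 11 + 1
    return 0.4 * perf + 0.3 * mem + 0.3 * energy      # lower is better

def random_genome(length=8):
    # e.g. one allocator policy (0..3) chosen per block-size class
    return [random.randint(0, 3) for _ in range(length)]

def mutate(genome, rate=0.2):
    return [random.randint(0, 3) if random.random() < rate else g for g in genome]

def evolve(pop_size=40, generations=20):
    population = [random_genome() for _ in range(pop_size)]
    with Pool() as pool:                               # parallel fitness evaluation
        for _ in range(generations):
            fitness = pool.map(evaluate_dmm, population)
            ranked = [g for _, g in sorted(zip(fitness, population), key=lambda t: t[0])]
            parents = ranked[: pop_size // 2]          # truncation selection
            children = [mutate(random.choice(parents)) for _ in range(pop_size // 2)]
            population = parents + children
        final_fitness = pool.map(evaluate_dmm, population)
    return min(zip(final_fitness, population), key=lambda t: t[0])

if __name__ == "__main__":
    best_cost, best_genome = evolve()
    print("best candidate DMM:", best_genome, "cost:", best_cost)
```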
24.
Recently, some studies have linked the computational power of abstract computing systems based on multiset rewriting to models of Petri nets, and the computational power of these nets to their topology. In turn, the computational power of these abstract computing devices can be understood by just looking at their topology, that is, at the information flow. Here we continue this line of research by introducing J languages and proving that they can be accepted by place/transition systems whose underlying net is composed only of joins. Moreover, we study how J languages relate to other families of formal languages.
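For readers unfamiliar with the structural notion: a join is simply a transition with more than one input place. The toy place/transition simulator below only illustrates how such a transition fires; it is not the J-language construction studied in the paper.

```python
from collections import Counter

class PTNet:
    """Toy place/transition net: each transition maps an input-place multiset to an
    output-place multiset; the marking counts tokens per place."""
    def __init__(self, transitions, marking):
        self.transitions = transitions          # name -> (inputs Counter, outputs Counter)
        self.marking = Counter(marking)

    def enabled(self, name):
        ins, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in ins.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        ins, outs = self.transitions[name]
        self.marking -= ins                     # consume input tokens
        self.marking += outs                    # produce output tokens

# A single join: transition t consumes one token from p1 and one from p2, producing one in p3.
net = PTNet(
    transitions={"t": (Counter({"p1": 1, "p2": 1}), Counter({"p3": 1}))},
    marking={"p1": 1, "p2": 1},
)
net.fire("t")
print(dict(net.marking))   # {'p3': 1}
```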
25.
26.
The position of mobile users has become highly important information in pervasive computing environments. Indoor localization systems based on Wi-Fi signal strength fingerprinting techniques are widely used in office buildings with an existing Wi-Fi infrastructure. Our previous work proposed a solution based on exploiting FM signals to deal with environments not covered by Wi-Fi or covered by only a single Wi-Fi access point. However, a general problem of indoor wireless positioning systems is signal degradation due to environmental factors affecting signal propagation. Therefore, in order to maintain a desirable level of localization accuracy, it becomes necessary to perform periodic calibrations of the system, which is either time consuming or requires dedicated equipment and expert knowledge. In this paper, we present a comparison of FM versus Wi-Fi positioning systems and a combination of both systems, exploiting their strengths for indoor positioning. We also address the problem of recalibration by introducing the novel concept of spontaneous recalibration and demonstrate it using the FM localization system. Finally, results related to device orientation and localization accuracy are discussed.
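As background, here is a minimal sketch of signal-strength fingerprinting with k-nearest neighbours. It is not the FM/Wi-Fi system described above; the radio map, positions and Euclidean distance metric are illustrative assumptions.

```python
import numpy as np

def knn_locate(fingerprints, locations, observed_rss, k=3):
    """Estimate a position by averaging the locations of the k radio-map entries
    whose stored RSS vectors are closest (Euclidean distance) to the observation."""
    fingerprints = np.asarray(fingerprints, dtype=float)   # shape: (n_points, n_transmitters)
    locations = np.asarray(locations, dtype=float)         # shape: (n_points, 2)
    d = np.linalg.norm(fingerprints - np.asarray(observed_rss, dtype=float), axis=1)
    nearest = np.argsort(d)[:k]
    return locations[nearest].mean(axis=0)

# Radio map: RSS (dBm) from three transmitters measured at known (x, y) positions.
radio_map = [[-40, -70, -80], [-55, -60, -75], [-70, -50, -65], [-80, -45, -55]]
positions = [[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]]

print(knn_locate(radio_map, positions, observed_rss=[-52, -63, -77], k=2))
```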
27.
Web-based learning environments are becoming increasingly popular in higher education. One of the most important web-learning resources is the virtual laboratory (VL), which gives students an easy way to train and learn over the Internet. Moreover, on-line collaborative communication is a practical way to transmit knowledge and experience from the teacher to students, overcoming physical distance and isolation. Considering these facts, the authors have developed a new dynamic collaborative e-learning system which combines the main advantages of virtual laboratories and collaborative learning practices. In this system, the virtual laboratories are based on Java applets with embedded simulations developed in Easy Java Simulations (EJS), an open-source tool for teachers that does not require complex programming skills. The collaborative e-learning is based on real-time synchronized communication among these Java applets. This original approach therefore provides a new tool which integrates virtual laboratories inside a synchronous collaborative e-learning framework. This paper describes the main features of this system and its successful application in a distance education environment involving several Spanish universities.
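The real system synchronizes EJS-based Java applets. Purely as an illustration of the broadcast pattern behind such synchronization, here is a minimal in-process sketch (in Python, to keep the examples in this listing consistent) in which every state update published by one client is pushed to all other connected clients.

```python
from typing import Callable, Dict

class SyncHub:
    """Minimal broadcast hub: clients register a callback and publish state updates;
    every update is forwarded to all other registered clients."""
    def __init__(self):
        self.clients: Dict[str, Callable[[str, dict], None]] = {}

    def join(self, client_id: str, on_update: Callable[[str, dict], None]):
        self.clients[client_id] = on_update

    def publish(self, sender_id: str, state: dict):
        for client_id, callback in self.clients.items():
            if client_id != sender_id:          # do not echo back to the sender
                callback(sender_id, state)

hub = SyncHub()
hub.join("teacher", lambda who, st: print(f"teacher sees {st} from {who}"))
hub.join("student1", lambda who, st: print(f"student1 sees {st} from {who}"))
hub.join("student2", lambda who, st: print(f"student2 sees {st} from {who}"))

# The teacher changes a simulation parameter; both students receive the new state.
hub.publish("teacher", {"pendulum_length": 1.2, "time": 0.0})
```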
28.
Photographic supra-projection is a forensic process that aims to identify a missing person from a photograph and a recovered skull. One of the crucial tasks throughout this process is craniofacial superimposition, which tries to find a good fit between a 3D model of the skull and the 2D photograph of the face. This stage is usually carried out manually by forensic anthropologists; it is thus very time consuming and presents several difficulties. In this paper, we aim to demonstrate that real-coded evolutionary algorithms are suitable approaches for tackling craniofacial superimposition. To do so, we first formulate this complex task in forensic identification as a numerical optimization problem. Then, we adapt three different evolutionary algorithms to solve it: two variants of a real-coded genetic algorithm and the state-of-the-art evolution strategy CMA-ES. We also consider an existing binary-coded genetic algorithm as a baseline. Results on several superimposition problems from real-world identification cases solved by the Physical Anthropology lab at the University of Granada (Spain) are used to test our proposals.
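To illustrate how superimposition can be cast as a numerical optimization problem, the sketch below recovers a 2D similarity transform between two landmark sets with a simple (1+1) evolution strategy. The real problem optimizes a full 3D transform plus camera parameters, and the paper uses real-coded GAs and CMA-ES; this is only a simplified, assumed formulation.

```python
import numpy as np

def transform(params, pts):
    """Apply a 2D similarity transform (scale s, rotation theta, translation tx, ty)."""
    s, theta, tx, ty = params
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return s * pts @ R.T + np.array([tx, ty])

def fitness(params, skull_pts, face_pts):
    """Mean squared distance between transformed skull landmarks and face landmarks."""
    return np.mean(np.sum((transform(params, skull_pts) - face_pts) ** 2, axis=1))

def one_plus_one_es(skull_pts, face_pts, iters=5000, sigma=0.5, seed=0):
    rng = np.random.default_rng(seed)
    best = np.array([1.0, 0.0, 0.0, 0.0])          # start from the identity transform
    best_f = fitness(best, skull_pts, face_pts)
    for _ in range(iters):
        cand = best + sigma * rng.normal(size=4)   # Gaussian mutation of all parameters
        f = fitness(cand, skull_pts, face_pts)
        if f < best_f:                              # greedy (1+1) selection
            best, best_f = cand, f
            sigma *= 1.1                            # crude step-size adaptation
        else:
            sigma *= 0.98
    return best, best_f

skull = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 3.0]])         # toy skull landmarks
face = transform([1.3, 0.4, 5.0, -2.0], skull)                 # synthetic "photo" landmarks
params, err = one_plus_one_es(skull, face)
print("recovered transform:", np.round(params, 3), "error:", err)
```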
29.
Agricultural production is highly dependent on climate variability in many parts of the world. In particular, drought may severely reduce crop yields, potentially affecting food availability at local, regional, and global scales. The Food and Agriculture Organization of the United Nations (FAO) operates the Global Information and Early Warning System (GIEWS), which monitors global food supply and demand. One of the key challenges is to obtain synoptic information about drought-affected agricultural zones on a recurrent and timely basis. This is needed to quickly identify areas requiring immediate attention. The Agricultural Stress Index System (ASIS), based on imagery from the Advanced Very High Resolution Radiometer (AVHRR) sensors on board the National Oceanic and Atmospheric Administration (NOAA) and Meteorological Operational Satellite (METOP) satellites, was specifically developed to meet this need. The system is based on a methodology developed by Rojas, Vrieling, and Rembold for the African continent. This approach has been modified and adapted to the global scale to produce an agricultural stress index (ASI) representing, per administrative unit, the percentage of cropland (or pasture) area affected by drought over the growing season. The vegetation health index (VHI), based on normalized difference vegetation index (NDVI) and temperature anomalies, is used as the drought indicator. A fused time series of AVHRR data from METOP and NOAA was used to produce a consistent time series of VHI at 1 km resolution. Global phenology maps, indicating the number of growing seasons and their start and end dates, were derived from a multi-annual image set of SPOT-Vegetation (1999–2011). The VHI time series and phenology maps were then combined to produce the ASI from 1984 to the present. This allowed the suitability of ASIS for identifying drought to be evaluated against historical reports and ancillary data. As a result of this analysis, ASIS was positively evaluated as a support for the FAO early warning system.
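A minimal numpy sketch of the standard VCI/TCI/VHI formulation and of an ASI-style summary (percentage of cropland pixels with VHI below a threshold) follows; the 0.5 weighting, the threshold of 35 and the toy inputs are illustrative assumptions, not the exact FAO processing chain.

```python
import numpy as np

def vci(ndvi, ndvi_min, ndvi_max):
    """Vegetation Condition Index: NDVI scaled by its historical min/max per pixel."""
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(bt, bt_min, bt_max):
    """Temperature Condition Index: inverted scaling, since hotter than usual means stress."""
    return 100.0 * (bt_max - bt) / (bt_max - bt_min)

def vhi(vci_arr, tci_arr, alpha=0.5):
    """Vegetation Health Index as a weighted combination of VCI and TCI."""
    return alpha * vci_arr + (1.0 - alpha) * tci_arr

def asi(vhi_arr, cropland_mask, threshold=35.0):
    """Share (%) of cropland pixels whose VHI indicates drought (below the threshold)."""
    crop_vhi = vhi_arr[cropland_mask]
    return 100.0 * np.count_nonzero(crop_vhi < threshold) / crop_vhi.size

# Toy 2x3 scene: current NDVI / brightness temperature plus per-pixel historical extremes.
ndvi     = np.array([[0.30, 0.55, 0.20], [0.60, 0.25, 0.40]])
ndvi_min = np.full_like(ndvi, 0.10); ndvi_max = np.full_like(ndvi, 0.80)
bt       = np.array([[305.0, 298.0, 310.0], [296.0, 308.0, 301.0]])
bt_min   = np.full_like(bt, 290.0); bt_max = np.full_like(bt, 315.0)
cropland = np.array([[True, True, True], [True, False, True]])

v = vhi(vci(ndvi, ndvi_min, ndvi_max), tci(bt, bt_min, bt_max))
print("ASI (% of cropland under stress):", round(asi(v, cropland), 1))
```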
30.
Existing empirical studies on test-driven development (TDD) report different conclusions about its effects on quality and productivity. Very few of those studies are experiments conducted with software professionals in industry. We aim to analyse the effects of TDD on the external quality of the work done and the productivity of developers in an industrial setting. We conducted an experiment with 24 professionals from three different sites of a software organization. We chose a repeated-measures design and asked subjects to apply TDD and incremental test-last development (ITLD) to two simple tasks and a realistic application of close-to-real-life complexity. To analyse our findings, we applied a repeated-measures general linear model procedure and a linear mixed effects procedure. We did not observe a statistically significant difference between the quality of the work done by subjects under the two treatments. We observed that subjects are more productive when they apply TDD to a simple task compared to ITLD, but productivity drops significantly when applying TDD to a complex brownfield task. Thus, task complexity significantly obscured the effect of TDD. Further evidence is necessary to conclude whether TDD is better or worse than ITLD in terms of external quality and productivity in an industrial setting. We found that experimental factors such as the selection of tasks can dominate the findings in TDD studies.
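As an illustration of the kind of linear mixed effects analysis mentioned above, the sketch below fits a random-intercept-per-subject model with statsmodels; the data frame, column names and toy values are hypothetical, not the experiment's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per subject x task, with the treatment applied
# (TDD or ITLD), the task's complexity, and the measured productivity score.
data = pd.DataFrame({
    "subject":      ["s1", "s1", "s2", "s2", "s3", "s3", "s4", "s4"],
    "treatment":    ["TDD", "ITLD", "TDD", "ITLD", "ITLD", "TDD", "TDD", "ITLD"],
    "complexity":   ["simple", "complex", "complex", "simple", "simple", "complex", "simple", "complex"],
    "productivity": [0.82, 0.61, 0.44, 0.70, 0.75, 0.40, 0.88, 0.52],
})

# A random intercept per subject accounts for repeated measures on the same developer;
# the treatment x complexity interaction probes whether task complexity masks the TDD effect.
model = smf.mixedlm("productivity ~ treatment * complexity", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```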