279 results found (search time: 15 ms)
3.
Reuse of software components, either closed or open source, is considered one of the most important best practices in software engineering, since it reduces development cost and improves software quality. However, since reused components are (by definition) generic, they need to be customized and integrated into a specific system before they can be useful. Since this integration is system-specific, the integration effort is non-negligible and increases maintenance costs, especially if more than one component needs to be integrated. This paper performs an empirical study of multi-component integration in the context of three successful open source distributions (Debian, Ubuntu and FreeBSD). Such distributions integrate thousands of open source components with an operating system kernel to deliver a coherent software product to millions of users worldwide. We empirically identified seven major integration activities performed by the maintainers of these distributions, documented how these activities are being performed by the maintainers, then evaluated and refined the identified activities with input from six maintainers of the three studied distributions. The documented activities provide a common vocabulary for component integration in open source distributions and outline a roadmap for future research on software integration.
5.
Bug fixing accounts for a large amount of the software maintenance resources. Generally, bugs are reported, fixed, verified and closed. However, in some cases bugs have to be re-opened. Re-opened bugs increase maintenance costs, degrade the overall user-perceived quality of the software and lead to unnecessary rework by busy practitioners. In this paper, we study and predict re-opened bugs through a case study on three large open source projects, namely Eclipse, Apache and OpenOffice. We structure our study along four dimensions: (1) the work habits dimension (e.g., the weekday on which the bug was initially closed), (2) the bug report dimension (e.g., the component in which the bug was found), (3) the bug fix dimension (e.g., the amount of time it took to perform the initial fix) and (4) the team dimension (e.g., the experience of the bug fixer). We build decision trees using the aforementioned factors that aim to predict re-opened bugs. We perform top node analysis to determine which factors are the most important indicators of whether or not a bug will be re-opened. Our study shows that the comment text and last status of the bug when it is initially closed are the most important factors related to whether or not a bug will be re-opened. Using a combination of these dimensions, we can build explainable prediction models that can achieve a precision between 52.1 and 78.6 % and a recall in the range of 70.5 to 94.1 % when predicting whether a bug will be re-opened. We find that the factors that best indicate which bugs might be re-opened vary based on the project. The comment text is the most important factor for the Eclipse and OpenOffice projects, while the last status is the most important one for Apache. These factors should be closely examined in order to reduce maintenance cost due to re-opened bugs.
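The pipeline this abstract describes, training a decision tree on bug-report factors and reporting precision and recall for the re-opened class, can be sketched roughly as follows. This is a minimal illustration on synthetic data only: the feature names and the label rule are invented stand-ins, not the study's actual factors, datasets, or models.

```python
# Sketch only (not the paper's pipeline): a decision tree over synthetic
# bug-report features, scored with precision/recall on a held-out split.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
n = 1000
# Hypothetical factors loosely inspired by the four dimensions:
X = np.column_stack([
    rng.exponential(24, n),    # hours to perform the initial fix
    rng.integers(0, 200, n),   # fixer experience (prior fixes)
    rng.integers(0, 7, n),     # weekday the bug was initially closed
    rng.exponential(300, n),   # length of the comment text (chars)
])
# Invented label rule: long comments and quick fixes re-open more often.
p = 1.0 / (1.0 + np.exp(-(0.003 * X[:, 3] - 0.02 * X[:, 0] - 0.5)))
y = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
pred = tree.predict(X_te)
print("precision:", precision_score(y_te, pred))
print("recall:   ", recall_score(y_te, pred))
```

A top-node analysis like the paper's would then inspect which feature the tree splits on first (`tree.tree_.feature[0]`), since the root split is the single most discriminative factor.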
6.
Novel sintering methods have emerged in recent years that have raised great interest in the scientific community. Relying on electric field effects, high heating rates, the use of mechanical pressure, or hydrothermal conditions, they offer fundamental advantages over conventional sintering routes, such as minimizing energy consumption and enhancing process efficiency. This perspective aims to explain these effects in a general way and to present the status quo of using them for the processing of high-performing ceramic materials. In detail, this work focuses on flash sintering, ultrafast high-temperature sintering, spark plasma sintering, cold sintering, and photonic sintering methods based on different light sources. The specificities, potentials, and limitations of each method are compared, especially in light of possible industrialization.
7.
This study explores how distributing the controls of a video game among multiple players affects the sociality and engagement experienced in game play. A video game was developed in which the distribution of game controls among the players could be varied, thereby affecting the abilities of the individual players to control the game. An experiment was set up in which eight groups of three players were asked to play the video game while the distribution of the game controls was increased in three steps. After each playing session, the players’ experiences of sociality and engagement were assessed using questionnaires. The results showed that distributing game control among the players increased the level of experienced sociality and reduced the level of experienced control. The game in which the controls were partly distributed led to the highest levels of experienced engagement, because the game allowed social play while still giving the players a sense of autonomy. The implications for interaction design are discussed.
8.
Big data is being implemented with success in the private sector and science. Yet the public sector seems to be falling behind, despite the potential value of big data for government. Government organizations do recognize the opportunities of big data but seem uncertain about whether they are ready for the introduction of big data, and whether they are adequately equipped to use big data. This paper addresses those uncertainties. It presents an assessment framework for evaluating public organizations’ big data readiness. Doing so demystifies the concept of big data, as it is expressed in terms of specific and measurable organizational characteristics. The framework was tested by applying it to organizations in the Dutch public sector. The results suggest that organizations may be technically capable of using big data, but they will not significantly gain from these activities if the applications do not fit their organizations and main statutory tasks. The framework proved helpful in pointing out areas where public sector organizations could improve, providing guidance on how government can become more big data ready in the future.
9.
Diagnosing cardiovascular system (CVS) diseases from clinically measured data is difficult, due to the complexity of the hemodynamic and autonomic nervous system (ANS) interactions. Physiological models could describe these interactions to enable simulation of a variety of diseases, and could be combined with parameter estimation algorithms to help clinicians diagnose CVS dysfunctions. This paper presents modifications to an existing CVS model to include a minimal physiological model of ANS activation. A minimal model is used so as to minimise the number of parameters required to specify ANS activation, enabling the effects of each parameter on hemodynamics to be easily understood. The combined CVS and ANS model is verified by simulating a variety of CVS diseases, and comparing simulation results with common physiological understanding of ANS function and the characteristic hemodynamics seen in these diseases. The model of ANS activation is required to simulate hemodynamic effects such as increased cardiac output in septic shock, elevated pulmonary artery pressure in left ventricular infarction, and elevated filling pressures in pericardial tamponade. This is the first known example of a minimal CVS model that includes a generic model of ANS activation and is shown to simulate diseases from throughout the CVS.
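The role an ANS-activation term plays in such a model, a reflex feedback that drives hemodynamic variables back toward a set point, can be illustrated with a toy lumped-parameter loop. This is a deliberately simplified sketch with invented parameters, not the paper's CVS model: a single windkessel-style pressure balance plus a baroreflex-like term that raises heart rate when arterial pressure falls below a set point.

```python
# Toy illustration (NOT the paper's model): arterial pressure P and heart
# rate HR coupled by a baroreflex-like ANS feedback, integrated with Euler
# steps. All parameter values are invented for the sketch.
def simulate(t_end=60.0, dt=0.01, p_set=93.0, gain=1.5):
    R = 1.12     # peripheral resistance (mmHg·s/mL)
    C = 1.5      # arterial compliance (mL/mmHg)
    sv = 70.0    # stroke volume (mL per beat)
    P, HR = 80.0, 60.0  # initial pressure (mmHg) and heart rate (bpm)
    for _ in range(int(t_end / dt)):
        co = sv * HR / 60.0        # cardiac output (mL/s)
        dP = (co - P / R) / C      # windkessel-style pressure balance
        dHR = gain * (p_set - P)   # ANS activation: reflex heart-rate drive
        P += dP * dt
        HR += dHR * dt
    return P, HR

P, HR = simulate()
print(f"steady arterial pressure ~ {P:.1f} mmHg, heart rate ~ {HR:.1f} bpm")
```

The feedback settles pressure at the set point (about 93 mmHg, with heart rate near 71 bpm here); removing the `gain * (p_set - P)` term leaves pressure wherever the passive hemodynamics put it, which is the kind of effect the abstract says cannot be reproduced without modeling ANS activation.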
10.
This paper introduces the third generation of Pleated Pneumatic Artificial Muscles (PPAM), which has been developed to simplify production compared with the first and second prototypes. This type of artificial muscle was developed to overcome the dry friction and material deformation present in the widely used McKibben muscle. The essence of the PPAM is its pleated membrane structure, which enables the muscle to work at low pressures and at large contractions. In order to validate the new PPAM generation, it has been compared with the mathematical model and the previous generation. The new production process and the use of new materials introduce improvements such as a 55% reduction in the actuator’s weight, a higher reliability, and a 75% reduction in the production time, and PPAMs can now be produced in all sizes from 4 to 50 cm. This opens the possibility of commercializing this type of muscle so that others can implement it. Furthermore, a comparison with experiments between PPAM and Festo McKibben muscles is discussed. Small PPAMs present similar force ranges and larger contractions than commercially available McKibben-like muscles. The use of series arrangements of PPAMs allows for large strokes and relatively small diameters at the same time and, since PPAM 3.0 is much more lightweight than the common McKibben models made by Festo, it presents better force-to-mass and energy-to-mass ratios than the Festo models.