91.
About 20 years ago, Markus and Robey noted that most research on IT impacts had been guided by deterministic perspectives and had neglected to use an emergent perspective, which could account for contradictory findings. They further observed that most research in this area had been carried out using variance theories at the expense of process theories. Finally, they suggested that more emphasis on multilevel theory building would likely improve empirical reliability. In this paper, we reiterate the observations and suggestions made by Markus and Robey on the causal structure of IT impact theories and analyze empirical research published in four major IS journals, Management Information Systems Quarterly (MISQ), Information Systems Research (ISR), the European Journal of Information Systems (EJIS), and Information and Organization (I&O), to assess compliance with those recommendations. Our final sample consisted of 161 theory-driven articles, accounting for approximately 21% of all the empirical articles published in these journals. Our results first reveal that 91% of the studies in MISQ, ISR, and EJIS focused on deterministic theories, while 63% of those in I&O adopted an emergent perspective. Furthermore, 91% of the articles in MISQ, ISR, and EJIS adopted a variance model, compared with 71% of those in I&O, which applied a process model. Lastly, mixed levels of analysis were found in 14% of all the surveyed articles. Implications of these findings for future research are discussed.
92.
The purpose of autonomic networking is to manage the business and technical complexity of networked components and systems. However, existing network management data has no link to business concepts. This makes it very difficult to ensure that services offered by the network are meeting business objectives. This paper describes a novel context-aware policy model that uses a combination of modeled and ontological data to determine the current context, which policies are applicable to that context, and what services and resources should be offered to which users and applications.
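The abstract gives no implementation details; purely as a hedged illustration of the kind of context-to-policy selection it describes, the sketch below shows how policies could be selected once the current context has been determined. The class names, the rule, and the context facts are hypothetical and are not taken from the paper.

```python
# Minimal sketch of context-aware policy selection (hypothetical names and rule,
# not the model defined in the paper).
from dataclasses import dataclass, field

@dataclass
class Context:
    facts: dict  # e.g. merged modeled and ontological data about the network

@dataclass
class Policy:
    name: str
    applies_when: callable                      # predicate over a Context
    offered_services: list = field(default_factory=list)

def applicable_policies(context: Context, policies: list) -> list:
    """Select the policies whose condition holds in the current context."""
    return [p for p in policies if p.applies_when(context)]

congested = Policy(
    "degrade-bulk-traffic",
    applies_when=lambda c: c.facts.get("link_utilisation", 0) > 0.9,
    offered_services=["best-effort bulk transfer"],
)
print([p.name for p in applicable_policies(Context({"link_utilisation": 0.95}), [congested])])
```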
John Strassner is the director of autonomic research in the Telecommunications Systems & Software Group at the Waterford Institute of Technology and a Visiting Professor at POSTECH. His research interests are in autonomic systems, policy-based management, machine learning, and semantic reasoning. He is the Chairman of the Autonomic Communications Forum and the past chair of the TMF's NGOSS SID, metamodel, and policy working groups. He has authored two books, written chapters for five other books, and co-edited five journals on network and service management and autonomics. John is the recipient of the Daniel A. Stokesbury memorial award for excellence in network management and has authored 211 refereed journal papers and publications.

Sven van der Meer received his M.Sc. in computer science and his Dr.-Ing. from Technical University Berlin (TUB), Germany, in 1996 and 2002. Since November 2002, Sven has been a research fellow at the Telecommunications Software & Systems Group at the Waterford Institute of Technology. Since October 2004 he has been Senior Investigator of the Competence Centre for Communication Infrastructure Management at TSSG, is involved in the Architecture and Information Modelling teams in the TMF, and has served as editor for the Technological Neutral Architecture and Contracts specifications within the TM Forum.

Declan O'Sullivan is the director of the Knowledge and Data Engineering (KDEG) research group in Trinity College Dublin (TCD). His research interests are in the use of semantic-driven approaches for network and service management, in particular to enable semantic interoperability. He is currently a Principal Investigator in the SFI-funded research project investigating Federated Autonomic Management Environments (FAME). O'Sullivan has a Ph.D. and an M.Sc. in computer science from TCD.

Simon Dobson is a co-founder of the Systems Research Group at UCD Dublin. His research centers on adaptive pervasive computing and novel programming techniques. He is on the editorial boards of the Journal of Network and Systems Management and the International Journal of Autonomous and Adaptive Communications Systems, and participates in a number of EU strategic workshops and working groups. He is National Director and vice-president of the European Research Consortium for Informatics and Mathematics, a board member of the Autonomic Communication Forum, and a member of the IBEC/ICT Ireland standing committee on academic/industrial research and development. He holds a BSc and a DPhil in computer science, is a Chartered Fellow of the British Computer Society, a Chartered Engineer, and a member of the IEEE and ACM.
93.
Practical Interdomain Routing Security
This article reviews risks and vulnerabilities in interdomain routing and best practices that can have near-term benefits for routing security. It includes examples of routing failures and common attacks on routers, and countermeasures to reduce router vulnerabilities.
94.
This paper explores how research teams in Intel's Digital Health Group are using ethnography to identify 'designable moments': spaces, times, objects, issues and practices which suggest opportunities for appropriate interventions. It argues that technology innovation should aim to incorporate the views, experiences and practices of users from the start of the design process to support independent living and develop culturally sensitive enhancements that contribute towards wellbeing and a life of quality for local older populations.
95.
As more interactive surfaces enter public life, casual interactions from passersby are bound to increase. Most of these users can be expected to carry a mobile phone or PDA, which nowadays offers significant computing capabilities of its own. This offers new possibilities for interaction between these users' private displays and large public ones. In this paper, we present a system that supports such casual interactions. We first explore a method to track mobile phones that are placed on a horizontal interactive surface by examining the shadows they cast on the surface. This approach detects the presence of a mobile device, as opposed to any other opaque object, through the signal strength emitted by the built-in Bluetooth transceiver, without requiring any modifications to the device's software or hardware. We then investigate interaction between a Sudoku game running in parallel on the public display and on mobile devices carried by passing users. Mobile users can join a running game by placing their devices on a designated area; the only requirement is that the device is in discoverable Bluetooth mode. After a specific device has been recognized, client software is sent to the device, which then enables the user to interact with the running game. Finally, we present the results of a study we conducted to determine the effectiveness and intrusiveness of interactions between users on the tabletop and users with mobile devices.
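The abstract does not specify an implementation, but the discovery step it relies on (a device in discoverable Bluetooth mode being found without any changes to the phone) can be sketched roughly as follows. The sketch assumes the PyBluez library and does not reproduce the paper's signal-strength measurement or shadow tracking.

```python
# Minimal sketch (assuming PyBluez) of Bluetooth device discovery: find devices
# that are currently in discoverable mode near the tabletop. Matching a discovered
# device to a detected shadow, and the signal-strength check, are not shown here.
import bluetooth

def find_discoverable_devices(scan_seconds=8):
    """Return a list of (address, name) pairs for devices in discoverable mode."""
    return bluetooth.discover_devices(duration=scan_seconds, lookup_names=True)

if __name__ == "__main__":
    for address, name in find_discoverable_devices():
        print(f"candidate device on or near the surface: {name} ({address})")
```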
96.
We examined the high precision deposition of toner and polymer microparticles with a typical size of approximately 10 μm on electrode arrays with electrodes of 100 μm and below using custom-made microelectronic chips. Selective desorption of redundant particles was employed to obtain a given particle pattern from preadsorbed particle layers. Microparticle desorption was regulated by dielectrophoretic attracting forces generated by individual pixel electrodes, tangential detaching forces of an air flow, and adhesion forces on the microchip surface. A theoretical consideration of the acting forces showed that without pixel voltage, the tangential force applied for particle detachment exceeded the particle adhesion force. When the pixel voltage was switched on, however, the sum of attracting forces was larger than the tangential detaching force, which was crucial for desorption efficiency. In our experiments, appropriately large dielectrophoretic forces were achieved by applying high voltages of up to 100 V on the pixel electrodes. In addition, electrode geometries on the chip's surface as well as particle size influenced the desorption quality. We further demonstrated the compatibility of this procedure to complementary metal oxide semiconductor chip technology, which should allow for an easy technical implementation with respect to high-resolution microparticle deposition.
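The force balance described in the abstract can be written compactly; the symbols below are introduced here for illustration and are not taken from the paper.

```latex
% Selective desorption condition (symbols introduced for illustration):
%   F_t   : tangential detaching force of the air flow
%   F_adh : adhesion force on the chip surface
%   F_DEP : dielectrophoretic attracting force of an addressed pixel
\begin{align}
  F_{\mathrm{t}} &> F_{\mathrm{adh}}
    && \text{pixel voltage off: the particle is removed} \\
  F_{\mathrm{DEP}} + F_{\mathrm{adh}} &> F_{\mathrm{t}}
    && \text{pixel voltage on: the particle is retained}
\end{align}
```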
97.
Weighted Max-SAT is the optimization version of SAT, and many important problems can be naturally encoded as such. Solving weighted Max-SAT is an important problem from both a theoretical and a practical point of view. In recent years, there has been considerable interest in finding efficient solving techniques. Most of this work focuses on the computation of good-quality lower bounds to be used within a branch-and-bound DPLL-like algorithm. Most often, these lower bounds are described in a procedural way, which makes it difficult to see the logic behind them. In this paper we introduce an original framework for Max-SAT that stresses the parallelism with classical SAT. We then extend the two basic SAT solving techniques, search and inference, and show that many algorithmic tricks used in state-of-the-art Max-SAT solvers can be expressed in logical terms in a unified manner using our framework. We also introduce an original search algorithm that performs a restricted amount of weighted resolution at each visited node. We empirically compare our algorithm with a variety of solving alternatives on several benchmarks. Our experiments, which constitute, to the best of our knowledge, the most comprehensive Max-SAT evaluation ever reported, demonstrate the practical usability of our approach.
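As a point of reference for the search side of the framework, a minimal branch-and-bound solver for weighted Max-SAT might look like the sketch below. It uses only the trivial lower bound (the weight of clauses already falsified), whereas the paper's contribution is precisely the stronger, resolution-based bounds and their logical formulation, which are not reproduced here.

```python
# Minimal branch-and-bound sketch for weighted Max-SAT. Clauses are lists of
# non-zero integer literals (DIMACS style) with positive weights; the goal is
# to minimize the total weight of falsified clauses.

def solve(clauses, weights, num_vars):
    """Return the minimum total weight of falsified clauses."""
    best = sum(weights) + 1  # trivial upper bound: everything falsified, plus one

    def falsified_weight(assignment):
        total = 0
        for clause, w in zip(clauses, weights):
            # a clause is falsified only if every literal is assigned and false
            if all(abs(l) in assignment and assignment[abs(l)] != (l > 0) for l in clause):
                total += w
        return total

    def branch(var, assignment):
        nonlocal best
        lb = falsified_weight(assignment)   # cost incurred so far (a valid lower bound)
        if lb >= best:
            return                          # prune: cannot improve the incumbent
        if var > num_vars:
            best = lb                       # complete assignment: update incumbent
            return
        for value in (True, False):
            assignment[var] = value
            branch(var + 1, assignment)
            del assignment[var]

    branch(1, {})
    return best

# Example: unit clauses x1 and not-x1 with weights 3 and 1; the optimum falsifies weight 1.
print(solve([[1], [-1]], [3, 1], 1))  # prints 1
```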
98.
The aim of this study was to reproduce the delayed (secondary) cerebral energy failure previously described in birth-asphyxiated newborn infants and to investigate relationships between primary insult severity and the extent of the delayed energy failure. Phosphorus (31P) magnetic resonance spectroscopy (MRS) at 7 T was used to study the brains of 12 newborn piglets during an acute, reversible, cerebral hypoxic-ischemic episode which continued until nucleotide triphosphates (NTP) were depleted. After reperfusion and reoxygenation, spectroscopy was continued for 48 h. High-energy metabolite concentrations returned to near-normal levels after the insult, but later they fell as delayed energy failure developed. The time integral of NTP depletion during the primary insult correlated strongly with the minimum [phosphocreatine (PCr)]/[inorganic orthophosphate (Pi)] observed 24–48 h after the insult (linear regression: slope = −8.04 h⁻¹, ordinate intercept = 1.23, r = 0.92, P < 0.0001). This model is currently being used to investigate the therapeutic potential of various cerebroprotective strategies, including hypothermia.
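Written out as an equation (with variable names introduced here for illustration, and assuming the NTP depletion integral is expressed in hours), the reported regression is:

```latex
% x : time integral of NTP depletion during the primary insult (in hours)
% y : minimum [PCr]/[Pi] observed 24-48 h after the insult
\[
  y \;\approx\; 1.23 \;-\; 8.04\,\mathrm{h}^{-1}\, x,
  \qquad r = 0.92,\; P < 0.0001 .
\]
```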
99.
100.

IT systems pervade our society more and more, and we have become heavily dependent on them. At the same time, these systems are increasingly targeted in cyberattacks, making us vulnerable. Those responsible for enterprise IT and cybersecurity face the problem of selecting techniques that raise the level of security: they need to decide which mechanisms provide the most effective defense with limited resources. In essence, the risks need to be assessed to determine the best cost-to-benefit ratio. One way to achieve this is through threat modeling; however, threat modeling is not commonly used in the enterprise IT risk domain, and the existing threat modeling methods have shortcomings. This paper introduces a metamodel-based approach named Yet Another Cybersecurity Risk Assessment Framework (Yacraf). Yacraf aims to enable comprehensive risk assessment for organizations with more decision support. The paper includes a risk calculation formalization and an example showing how an organization can use and benefit from Yacraf.
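Yacraf's own risk calculation formalization is not given in the abstract; purely as a generic illustration of the cost-to-benefit comparison it mentions, the sketch below uses a standard annualized-loss-expectancy style calculation. Every name and number is hypothetical.

```python
# Generic illustration of comparing a defense's cost against the risk it reduces.
# This is NOT Yacraf's formalization; it is the standard annualized-loss-expectancy
# idea, with entirely hypothetical names and numbers.
from dataclasses import dataclass

@dataclass
class Threat:
    name: str
    annual_rate: float        # expected occurrences per year
    loss_per_event: float     # expected loss per occurrence

@dataclass
class Control:
    name: str
    annual_cost: float
    rate_reduction: float     # fraction by which the control reduces the rate

def annual_loss(threat: Threat) -> float:
    return threat.annual_rate * threat.loss_per_event

def net_benefit(threat: Threat, control: Control) -> float:
    """Risk reduction bought by the control, minus its cost."""
    avoided = annual_loss(threat) * control.rate_reduction
    return avoided - control.annual_cost

phishing = Threat("credential phishing", annual_rate=4.0, loss_per_event=50_000)
mfa = Control("multi-factor authentication", annual_cost=30_000, rate_reduction=0.8)
print(f"net benefit of {mfa.name}: {net_benefit(phishing, mfa):,.0f} per year")
```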
