A total of 2,376 results matched the query.
1.
Experimental workflow and technical progress of on-site geological evaluation for shale oil exploration wells   (Total citations: 1; self-citations: 0; citations by others: 1)
During the drilling and completion of shale oil exploration wells, the oil-bearing properties of shale and the mobility of shale oil must be evaluated rapidly and in a timely manner. Existing on-site evaluation techniques, however, were developed mainly for conventional sandstone reservoirs and cannot meet the need to characterize shale heterogeneity, so a dedicated sequence of experimental techniques is urgently required. Based on the practical requirements of rapid on-site geological evaluation of shale oil exploration wells and the technical specifications of the available instruments, this study proposes the on-site experimental items, the sampling and preservation procedures, and the experimental workflow, and carries out on-site methodological experiments and application research. A magnetic-fluid variable-density apparatus for measuring total rock volume was developed, solving the problem that friable, easily deformed shale cannot be plugged for timely acquisition of physical-property parameters, and establishing a physical-property analysis technique for shale samples of irregular shape. Sample preparation by sealed crushing under liquid-nitrogen freezing avoids the loss of light hydrocarbons, making rock pyrolysis data more realistic. The links between on-site test items, and between on-site and laboratory analyses, were optimized, making the experimental support more efficient. The new techniques have been applied with notable success in the Qianjiang Formation of the Jianghan Basin, the Shahejie Formation of the Jiyang Depression, and the seventh member of the Yanchang Formation of the Ordos Basin, providing a scientific basis for rapid geological evaluation in continental shale oil exploration. It should also be noted that on-site experimental techniques for shale oil exploration in China remain incomplete and lack corresponding standards; the next research direction should be to build a standardized system of methods on the basis of improved on-site experimental and application techniques, and to establish a technical platform for rapid geological evaluation in shale oil exploration.
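Once the magnetic-fluid device yields the bulk (total) volume of an irregular shale sample, porosity follows from the standard rock-physics definition. A minimal sketch, assuming grain volume comes from a separate measurement such as gas pycnometry; all numbers are hypothetical and not from the paper.

```python
def porosity(bulk_volume_cm3: float, grain_volume_cm3: float) -> float:
    """Fractional porosity = pore volume / bulk volume."""
    if bulk_volume_cm3 <= 0 or grain_volume_cm3 > bulk_volume_cm3:
        raise ValueError("grain volume must be positive and <= bulk volume")
    return (bulk_volume_cm3 - grain_volume_cm3) / bulk_volume_cm3

# A hypothetical irregular shale chip: 2.50 cm3 bulk, 2.30 cm3 grain volume.
print(round(porosity(2.50, 2.30), 3))  # 0.08
```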
2.
Both the planning and design phases of large infrastructure projects require analysis, modelling, visualization, and numerical analysis. To perform these tasks, different tools such as Building Information Modelling (BIM) and numerical analysis software are commonly employed. However, in current tunnel engineering practice there are no systematic solutions for the exchange between design and analysis models, and these tasks usually involve manual, error-prone model generation, setup, and update. In this paper, focusing on tunnelling engineering, we demonstrate a systematic and versatile approach to efficiently generate a tunnel design and analyse the lining in different practical scenarios. To this end, a BIM-based approach is developed that connects a user-friendly, industry-standard BIM software with effective simulation tools for high-performance computing. A fully automated design-through-analysis workflow for segmented tunnel lining is developed, based on a fully parametric design model and an isogeometric analysis (IGA) software, connected through an interface implemented as a Revit plugin. The IGA-Revit interface implements a reconstruction algorithm based on a sweeping technique to construct trivariate NURBS lining-segment geometry, which avoids having to deal with trimmed geometries.
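The core of the sweeping idea can be sketched without the authors' plugin: a 2D cross-section control net (u, v) is translated along the tunnel axis (w) to produce a trivariate control grid. This is only an illustration of the sweep; real NURBS data (weights, knot vectors, orientation frames) are omitted and all points are hypothetical.

```python
def sweep(section, path):
    """Return grid[w][u][v]: each cross-section point offset by path point w."""
    return [
        [[(x + px, y + py, pz) for (x, y) in row] for row in section]
        for (px, py, pz) in path
    ]

# A 2x2 square cross-section swept along three axial stations.
section = [[(0.0, 0.0), (1.0, 0.0)], [(0.0, 1.0), (1.0, 1.0)]]
path = [(0.0, 0.0, 0.0), (0.0, 0.0, 5.0), (0.0, 0.0, 10.0)]
grid = sweep(section, path)
print(len(grid), len(grid[0]), len(grid[0][0]))  # 3 2 2
print(grid[2][1][1])  # (1.0, 1.0, 10.0)
```

Because the solid is built directly as a structured trivariate grid, no trimming of surfaces is ever needed, which is the benefit the abstract refers to.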
3.
Workflow management technologies have dramatically improved their deployment architectures and systems along with the evolution and proliferation of cloud distributed computing environments. Such cloud environments provide a suitable distributed computing paradigm for deploying very large-scale workflow processes and applications with scalable on-demand services. In this paper, we focus on the distribution paradigm and deployment formalism for very large-scale workflow applications deployed and enacted across multiple, heterogeneous cloud computing environments. We propose a formal approach to fragment very large-scale workflow processes and their applications both vertically and horizontally, and to deploy the resulting fragments over three types of cloud deployment models and architectures. To concretize the formal approach, we first devise a series of operational situations for fragmenting workflows into cloud workflow process and application components and deploying them onto the three types of cloud deployment models and architectures. We call this concrete approach the deployment-driven fragmentation mechanism, intended as an implementation component for cloud workflow management systems. We believe the proposed fragmentation formalisms provide a theoretical basis for designing and implementing very large-scale, maximally distributed workflow processes and applications deployed on cloud deployment models and architectural computing environments.
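The two fragmentation directions can be illustrated with a toy example (not the paper's formalism): vertical fragmentation groups activities by stage, horizontal fragmentation groups them by parallel branch, and each fragment is then mapped to a cloud deployment model. The workflow, attribute names, and deployment mapping below are all hypothetical.

```python
workflow = {
    "prepare":  {"stage": 0, "branch": "main"},
    "analyzeA": {"stage": 1, "branch": "A"},
    "analyzeB": {"stage": 1, "branch": "B"},
    "merge":    {"stage": 2, "branch": "main"},
}

def fragment(wf, key):
    """Group activities by a fragmentation key ('stage' or 'branch')."""
    out = {}
    for name, attrs in wf.items():
        out.setdefault(attrs[key], []).append(name)
    return out

vertical = fragment(workflow, "stage")     # stage-wise (vertical) fragments
horizontal = fragment(workflow, "branch")  # branch-wise (horizontal) fragments

# Hypothetical mapping of horizontal fragments onto deployment models.
deployment = {"main": "private", "A": "public", "B": "hybrid"}
print(horizontal["A"])  # ['analyzeA']
print(vertical[1])      # ['analyzeA', 'analyzeB']
```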
4.
Knitted composites are textile composite materials consisting of knitted textile reinforcement and a polymer matrix. They exhibit great design flexibility by allowing the customization of shapes, textures, and material properties. These features facilitate the optimization of buildings' material systems and the creation of lightweight buildings with high material efficiency. To achieve such a lightweight, material-efficient building structure with knitted composites, this research investigates the material properties of knitted composites and proposes a design process for building-scale knitted composite systems. In the material study, this research examines selected mechanical properties of the material and the effects of additional design elements. In the design exploration, it develops a design workflow for the structural form, element arrangement, and knit distribution of the material system at the macro-, meso-, and microscales. The MeiTing project serves as a proof of concept and of the design workflow.
5.
Within the overall framework of the smart grid dispatching and control system, and driven by the requirements of dispatching operation management, a multi-level, horizontal-and-vertical interconnection framework for the operation management system is designed and developed, using power dispatching workflows as the process description standard, E-language files as the data exchange carrier, and message mail as the data exchange tool. With this framework, vertical business processes across dispatching organizations are connected and horizontal businesses across security zones are integrated, achieving complete, closed-loop, integrated operation of business processes, as well as horizontal data sharing with the overhaul management system.
6.
Next-generation sequencing (NGS) is a cost-effective technology capable of screening several genes simultaneously; however, its application in a clinical context requires an established workflow to acquire reliable sequencing results. Here, we report an optimized NGS workflow analyzing 22 lung cancer-related genes to sequence critical samples such as DNA from formalin-fixed, paraffin-embedded (FFPE) blocks and circulating free DNA (cfDNA). Snap-frozen and matched FFPE gDNA from 12 non-small cell lung cancer (NSCLC) patients, whose gDNA fragmentation status was previously evaluated using a multiplex PCR-based quality control, were successfully sequenced with Ion Torrent PGM™. The robust bioinformatic pipeline allowed us to correctly call both Single Nucleotide Variants (SNVs) and indels with a detection limit of 5%, achieving 100% specificity and 96% sensitivity. This workflow was also validated in 13 FFPE NSCLC biopsies. Furthermore, a specific protocol for low-input gDNA capable of producing good sequencing data with high coverage, high uniformity, and a low error rate was also optimized. In conclusion, we demonstrate the feasibility of obtaining gDNA from FFPE samples suitable for NGS by performing appropriate quality controls. The optimized workflow, capable of screening low-input gDNA, highlights NGS as a potential tool in the detection, disease monitoring, and treatment of NSCLC.
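The sensitivity and specificity figures reported for the variant-calling pipeline follow the standard confusion-matrix definitions. The counts below are hypothetical, chosen only to reproduce the 96% / 100% values quoted in the abstract.

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: detected variants / all true variants."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: clean positions / all non-variant positions."""
    return tn / (tn + fp)

# Hypothetical confusion counts: 48 true variants called, 2 missed,
# 500 true-negative positions, 0 false positives.
print(round(sensitivity(48, 2), 2))   # 0.96
print(round(specificity(500, 0), 2))  # 1.0
```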
7.
Web Service Business Process Execution Language (WS‐BPEL) is one of the most popular service‐oriented workflow languages. The unique features of WS‐BPEL applications (e.g. dead path elimination semantics and the correlation mechanism) pose significant problems for test case generation, especially in unit testing. Existing studies mainly assume that each path in the control flow graph of a WS‐BPEL application is feasible, which often yields imprecise test cases or complicates testing results. The current study tackles this problem using satisfiability modulo theory (SMT) solvers. First, a new coverage criterion is proposed to measure the quality of test sets for testing WS‐BPEL applications. Second, decomposition algorithms are presented to obtain test paths that meet the proposed coverage criterion. Finally, this paper symbolically encodes each test path as a set of constraints capturing the unique features of WS‐BPEL. These constraints are solved, and the test cases (test paths and test data) are obtained with the help of SMT solvers to test WS‐BPEL applications effectively. Experiments are conducted using our approach and other typical approaches (e.g. the message‐sequence‐generation‐based approach and the concurrent path analysis approach) on 10 WS‐BPEL applications. Experimental results demonstrate that the test cases generated by our approach avoid instantiating idle instances and expose more faults. Copyright © 2015 John Wiley & Sons, Ltd.
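The paper solves path constraints with an SMT solver; as a self-contained stand-in, the sketch below brute-forces a small integer domain to decide whether a symbolic test path is feasible, which is the same question an SMT query answers far more efficiently. Variable names and constraints are hypothetical, not taken from any WS-BPEL application.

```python
from itertools import product

def feasible(variables, constraints, domain=range(-5, 6)):
    """Return a satisfying assignment for the path constraints, or None."""
    for values in product(domain, repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(c(env) for c in constraints):
            return env
    return None

# Path 1: feasible (x > 0 and x + y == 3 has solutions in the domain),
# so concrete test data can be extracted from the returned assignment.
print(feasible(["x", "y"], [lambda e: e["x"] > 0,
                            lambda e: e["x"] + e["y"] == 3]) is not None)  # True
# Path 2: infeasible (x cannot be both negative and positive); no test
# data is generated for it -- this is how infeasible paths are pruned.
print(feasible(["x"], [lambda e: e["x"] < 0, lambda e: e["x"] > 0]))  # None
```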
8.
A workflow decomposes a business process into ordered steps and assigns human resources to execute them. Resource allocation is subject to access-control constraints and disrupted by resource exceptions, giving rise to satisfiability and robustness problems. Robustness verification in turn depends on the satisfiability decision, which is currently made by finding a single satisfying solution. This paper proposes an alternative: deciding satisfiability by counting the number of solutions. In particular, through a polynomial counting reduction to the #SAT problem, for which solvers are available, a satisfiability-counting algorithm is given for mutual-exclusion and binding constraints. Experiments show that, compared with the satisfiability-solving algorithm with the lowest known time complexity, the counting algorithm significantly improves practical decision efficiency and the scale of problems it can handle.
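The paper reduces the counting problem to #SAT; the sketch below instead counts satisfying user-to-step assignments by direct enumeration, which is practical only for tiny instances but shows what is being counted. The users, steps, and constraints are hypothetical: a mutual-exclusion (separation-of-duty) constraint requires different users on two steps, and a binding constraint requires the same user.

```python
from itertools import product

users = ["u1", "u2", "u3"]
steps = ["s1", "s2", "s3"]
mutual_exclusion = [("s1", "s2")]  # different users must perform s1 and s2
binding = [("s2", "s3")]           # the same user must perform s2 and s3

def count_solutions():
    """Count complete assignments satisfying all constraints."""
    count = 0
    for assign in product(users, repeat=len(steps)):
        plan = dict(zip(steps, assign))
        if all(plan[a] != plan[b] for a, b in mutual_exclusion) and \
           all(plan[a] == plan[b] for a, b in binding):
            count += 1
    return count

n = count_solutions()
print(n)       # 6: 3 choices for the bound pair s2=s3, then 2 for s1
print(n > 0)   # satisfiable iff at least one valid plan exists
```

A nonzero count decides satisfiability, and the count itself feeds the robustness analysis: it measures how many fallback plans survive when a user becomes unavailable.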
9.
The handling of containers in port logistics consists of several activities, including discharging, loading, gate-in, and gate-out. These activities are carried out using various equipment, including quay cranes, yard cranes, trucks, and other related machinery. The high inter-dependency of activities and equipment on various factors often puts successive activities off schedule in real time, leading to undesirable downtime and activity delays; a late container process, in other words, can negatively affect the scheduling of the following ones. The purpose of this study is to analyze lateness probability using a Bayesian network that considers the various factors involved in container handling. We propose a method to generate a Bayesian network from a process model discovered from event logs in port information systems. With the network, we can infer each activity's lateness probability and, in turn, provide port managers with recommendations for improving existing activities.
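A minimal two-node illustration of the inference involved (the paper's networks are mined from port event logs and are far richer): the marginal probability that an activity is late, given a hypothetical conditional probability table keyed on whether its predecessor was late. All probabilities are invented for the example.

```python
p_pred_late = 0.2                       # P(predecessor late) -- hypothetical
p_late_given = {True: 0.7, False: 0.1}  # P(activity late | predecessor state)

def marginal_lateness():
    """Sum over predecessor states: P(late) = sum_s P(late | s) * P(s)."""
    return (p_late_given[True] * p_pred_late
            + p_late_given[False] * (1 - p_pred_late))

print(round(marginal_lateness(), 2))  # 0.22
```

An activity whose marginal lateness probability is high is exactly the kind of bottleneck the recommendations to port managers would target.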
10.
Data-intensive workflows are typically both compute- and data-intensive, generating large volumes of data during execution. Some of this data should therefore be saved to avoid the expensive re-execution of tasks in case of exceptions. However, cloud-based data storage services come at some expense. In this paper, we introduce a risk evaluation model tailored to workflow structure that measures and achieves the trade-off between the overhead of backup storage and the cost of data regeneration after failures, making service selection and execution more efficient and robust. The proposed method computes and compares the potential loss with and without data backup to balance the overhead of intermediate-dataset backup against task re-execution after exceptions. We also design a utility function based on the model and apply a genetic algorithm to find an optimized schedule. The results show that the robustness of the schedule is increased while the possible risk of failure is minimized, especially when the volume of generated data is not large in comparison with the input.
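The trade-off the risk model captures can be sketched as a simple expected-loss comparison (all numbers hypothetical, not the paper's utility function): back up an intermediate dataset only when the storage overhead is lower than the expected cost of regenerating it after a failure.

```python
def should_backup(storage_cost: float, regen_cost: float,
                  p_failure: float) -> bool:
    """Back up iff storage cost < expected re-execution loss."""
    expected_loss = p_failure * regen_cost
    return storage_cost < expected_loss

# Cheap storage, expensive regeneration, modest failure rate: back it up.
print(should_backup(storage_cost=2.0, regen_cost=100.0, p_failure=0.1))   # True
# Expensive-to-store but cheap-to-recompute dataset: skip the backup.
print(should_backup(storage_cost=50.0, regen_cost=100.0, p_failure=0.1))  # False
```

In the paper this per-dataset decision is embedded in a utility function over the whole schedule and searched with a genetic algorithm rather than decided greedily.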