51.
The petrochemical industry is one of the major sectors contributing to the worldwide economy, and digital transformation is urgently needed to enhance its core competence. In general, ethylene, propylene, and butadiene, which are associated with synthetic chemicals, are the industry's main raw materials, accounting for roughly 70–80% of its cost structure. In particular, butadiene is a key material for producing synthetic rubber and is used in several daily commodities. However, the price of butadiene fluctuates with demand–supply mismatches and with international economic and political events. This study proposes a two-stage data science framework to predict the weekly price of butadiene and optimize the procurement decision. The first stage develops several price prediction models using comprehensive information, including contract price, supply rate, demand rate, and upstream and downstream information. The second stage applies the analytic hierarchy process (AHP) and reinforcement learning to derive an optimal procurement policy and reduce the total procurement cost. An empirical study validates the proposed framework; the results show improved price forecast accuracy and reduced raw material procurement cost.
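A reinforcement learning procurement policy of the kind described can be sketched with a toy tabular Q-learning loop over discretized price levels. The state discretization, reward shaping, and price series below are illustrative assumptions, not the paper's actual formulation:

```python
import random

def train_procurement_policy(prices, episodes=500, alpha=0.1, gamma=0.9, seed=0):
    """Toy Q-learning: state = price bucket, actions = 0 (wait) or 1 (buy).
    Buying is rewarded when the price is below the series mean (assumed
    reward shaping for illustration only)."""
    random.seed(seed)
    lo, hi = min(prices), max(prices)
    n_buckets = 5
    bucket = lambda p: min(int((p - lo) / (hi - lo + 1e-9) * n_buckets), n_buckets - 1)
    mean = sum(prices) / len(prices)
    q = {(s, a): 0.0 for s in range(n_buckets) for a in (0, 1)}
    for _ in range(episodes):
        for t in range(len(prices) - 1):
            s, s2 = bucket(prices[t]), bucket(prices[t + 1])
            a = random.choice((0, 1))  # uniform exploration
            r = (mean - prices[t]) if a == 1 else 0.0
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, 0)], q[(s2, 1)]) - q[(s, a)])
    # greedy policy: best action per price bucket
    return {s: max((0, 1), key=lambda a: q[(s, a)]) for s in range(n_buckets)}

policy = train_procurement_policy([100, 90, 80, 95, 110, 120, 85, 78, 105, 130])
```

Under this toy reward, the learned policy buys in the lowest price buckets and waits in the highest ones.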
52.
Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis techniques due to its outstanding capabilities in identifying, assessing, and eliminating potential failure modes in a wide range of industrial applications. It provides a comprehensive view for investigating potential failures, causes, and effects in designs, products, and processes. However, traditional FMEA is extensively criticized for its defects in determining the criteria weights, identifying the risk priority of failure modes, and handling uncertainty during the risk evaluation. To resolve these problems, this study proposes a novel fuzzy rough number extended multi-criteria group decision-making (FR-MCGDM) strategy that determines a more rational ranking of failure modes by integrating the fuzzy rough number, AHP (analytic hierarchy process), and VIKOR (Serbian: VIseKriterijumska Optimizacija I Kompromisno Resenje). First, a fuzzy rough number is introduced to characterize experts' judgments, aggregate group risk assessments, and tackle the uncertainty and subjectivity in the risk evaluation. A fuzzy rough number enhanced AHP is then presented to determine the criteria weights, and a fuzzy rough number enhanced VIKOR is proposed to rank the failure modes. A practical case study of a check valve validates the applicability of the proposed FMEA. Comparative studies demonstrate the efficacy of the proposed FR-MCGDM, with remarkable advantages in handling uncertainty and subjectivity during failure mode evaluation.
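The VIKOR step can be illustrated with a minimal crisp (non-fuzzy) implementation. The decision matrix and weights below are made-up examples; in the paper's fuzzy-rough extension these crisp scores are replaced by aggregated fuzzy rough numbers:

```python
def vikor(matrix, weights, v=0.5):
    """Classical crisp VIKOR: lower Q means a better alternative.
    matrix[i][j] = score of alternative i on (benefit) criterion j."""
    m, n = len(matrix), len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(n)]
    worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []  # group utility and individual regret
    for row in matrix:
        d = [weights[j] * (best[j] - row[j]) / ((best[j] - worst[j]) or 1)
             for j in range(n)]
        S.append(sum(d))
        R.append(max(d))
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    return [v * (S[i] - s_min) / ((s_max - s_min) or 1)
            + (1 - v) * (R[i] - r_min) / ((r_max - r_min) or 1)
            for i in range(m)]

# Hypothetical scores for three failure modes on three risk criteria
Q = vikor([[7, 8, 9], [8, 7, 6], [9, 9, 5]], [0.5, 0.3, 0.2])
ranking = sorted(range(len(Q)), key=lambda i: Q[i])  # best alternative first
```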
53.
Dynamic personalized order demand and uncertain manufacturing resource availability have become research hotspots in intelligent resource optimization and allocation. The data generated by the manufacturing industry are rapidly expanding; such data are multi-source, heterogeneous, and multi-scale. Transforming these data into knowledge to optimize the allocation between personalized orders and manufacturing resources is an effective strategy for improving the cognitive intelligent production level of enterprises. However, the manufacturing processes involved in resource allocation are diverse, the data are governed by many rules and constraints, and the relationships among the data are complicated. A unified approach is lacking for modeling this information and generating industrial knowledge by mining semantic information from massive manufacturing data. The research challenge is how to fully integrate the complex workshop resource data and mine the implicit semantic information to form a viable knowledge-driven resource allocation optimization method that can efficiently provide the engineering information needed for resource allocation. This research presents a unified knowledge graph-driven production resource allocation approach that allows fast resource allocation decision-making for given order-insertion tasks, subject to the resource machining information and the device evaluation strategy. A workshop resource knowledge graph (WRKG) model is presented to integrate the engineering semantic information in the machining workshop, and a distributed knowledge representation learning algorithm is developed to mine implicit resource information and update the WRKG in real time. Moreover, a three-stage resource allocation optimization method supported by the WRKG is proposed to output the device sets needed for a specific task. A case study of a manufacturing resource allocation task in an aerospace enterprise demonstrates the feasibility of the proposed approach.
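At its simplest, a workshop resource knowledge graph can be pictured as a set of subject-predicate-object triples that the allocation logic queries. The device names, predicates, and matching rule below are hypothetical illustrations, not the paper's WRKG schema:

```python
# Hypothetical WRKG fragment as (subject, predicate, object) triples
wrkg = [
    ("lathe_01", "hasCapability", "turning"),
    ("lathe_01", "status", "idle"),
    ("mill_02", "hasCapability", "milling"),
    ("mill_02", "status", "busy"),
    ("lathe_03", "hasCapability", "turning"),
    ("lathe_03", "status", "idle"),
]

def query(graph, predicate, obj):
    """Return all subjects linked to `obj` via `predicate`."""
    return {s for s, p, o in graph if p == predicate and o == obj}

# Candidate device set for an inserted turning task: capable AND idle
candidates = query(wrkg, "hasCapability", "turning") & query(wrkg, "status", "idle")
```

A real system would replace the linear scan with an indexed triple store and layer the paper's three-stage optimization on top of such candidate sets.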
54.
Optimal channel assignment (CA) in multi-radio wireless mesh networks is an NP-hard problem whose solutions usually leave several links interfering. Most of these solutions consider overall throughput as the main optimization objective. However, other objectives have to be considered in order to provide better-quality wireless connections to non-stationary users. In this paper, we propose a multi-objective optimization model that, besides maximizing throughput, improves the fairness and handoff experience of mesh clients. In this model, we use Jain's index to maximize user fairness, and we allow same-channel assignments for links involved in the same high-handoff traffic, thus reducing handoff-triggered re-routing, which is characterized by high latency. We then propose centralized variable neighborhood search and Tabu search heuristics to efficiently solve our model as an offline CA process. Moreover, to adapt to the traffic dynamics caused especially by user handoffs, we propose an online CA scheme that carefully re-assigns channels to interfaces so as to continuously minimize the re-routing overhead and latency during user handoffs; we further improve this online scheme with load balancing. Simulation results show the good performance of our approach in terms of delay, loss rate, overall throughput, and fairness. In particular, the results for our online handoff-aware CA show that handoffs not involving path re-routing are effective in decreasing delay, especially when load balancing is used.
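Jain's index, used here as the fairness objective, is straightforward to compute; a minimal sketch over per-client throughputs:

```python
def jain_index(throughputs):
    """Jain's fairness index: equals 1.0 when all clients receive equal
    throughput, and approaches 1/n when one client takes everything."""
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

fair = jain_index([10, 10, 10, 10])   # perfectly fair allocation
skewed = jain_index([40, 0, 0, 0])    # one client monopolizes the capacity
```

With four clients, the skewed allocation hits the lower bound 1/n = 0.25.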
55.
As a drink, coffee is one of the most in-demand products worldwide; as an agricultural product, it requires non-destructive tools for monitoring and control. To create a non-destructive method that can be used in the field, a system was developed to find and classify six types of vegetative structures on coffee branches: leaves, stems, flowers, unripe fruits, semi-ripe fruits, and ripe fruits. Videos of 12 coffee branches were obtained under field conditions using the rear camera of a mobile device. Approximately 90 frames, those carrying the most scene information, were selected from each video. Next, a three-dimensional (3D) reconstruction of each branch was generated using the Structure from Motion (SfM) and Patch-based Multi-view Stereo (PMVS) techniques. All acquired images were manually registered, and a ground-truth point cloud was generated for each branch. The generated point clouds were filtered with a statistical outlier filter to eliminate noise introduced by the 3D reconstruction process. The deepest points were considered scene background and were removed with a band-pass filter. The point clouds were then sub-sampled with a VoxelGrid filter to halve the number of points and thus reduce the computation time of the subsequent processes. Various two-dimensional (2D) and 3D features were extracted from the point clouds: 11 based on the RGB, Lab, Luv, YCbCr, and HSV color spaces, four based on curvatures, and two based on the shape and curvedness indexes. A Support Vector Machine (SVM) was trained on these features, using eight branches for training and four for validation. Experimental results showed a precision of 0.82 and a recall of 0.79 when classifying the vegetative structures. The proposed system is economical, as only a mobile device is needed to acquire the data; the remaining processing is performed offline. In addition, the system was not affected by changes in lighting conditions when recording videos on a coffee plantation.
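The VoxelGrid sub-sampling step can be sketched as hashing points into cubic cells and keeping one centroid per cell. The leaf size and points below are arbitrary examples; the actual PCL VoxelGrid filter offers more options:

```python
from collections import defaultdict

def voxel_downsample(points, leaf):
    """VoxelGrid-style sub-sampling: replace all points falling into the
    same cubic voxel of side `leaf` with their centroid."""
    cells = defaultdict(list)
    for x, y, z in points:
        cells[(int(x // leaf), int(y // leaf), int(z // leaf))].append((x, y, z))
    # one centroid per occupied voxel
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) for pts in cells.values()]

cloud = [(0.1, 0.1, 0.1), (0.2, 0.15, 0.05), (1.4, 0.1, 0.1)]
reduced = voxel_downsample(cloud, leaf=0.5)
```

Here the first two points share a voxel and collapse to one centroid, so three points reduce to two.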
56.
Product design involves a computer-aided design (CAD) model with its design (dimensional) parameters. A generative design (GD) system can then be used to generate new designs by modifying these parameters, and such a system needs to determine the visual validity of a design obtained after parametric modification. In this context, this paper introduces an approach for learning the visual (i.e., design) constraints of a CAD model, represented using B-spline surfaces, from user feedback. A deformation technique for B-spline surfaces (utilizing modification and limit curves) is first introduced, which involves a few design (deformation) parameters. Via a generative learning process, the proposed system, SplineLearner, generates random designs, which are shown to users for visual-validity classification. In a machine learning step, a mathematical model is computed that predicts whether a design is valid. This model is also integrated into SplineLearner (after some user interactions) to prevent imbalance between the numbers of valid and invalid designs. As a proof of concept, B-spline surface models of car body parts (hood, roof, side, and trunk) are used, and two user studies demonstrate the efficacy of the proposed method.
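For background, a point on a uniform cubic B-spline segment is a fixed blend of four control points. The sketch below is standard B-spline evaluation, not the paper's specific deformation technique with modification and limit curves:

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Evaluate a uniform cubic B-spline segment at t in [0, 1].
    Control points are coordinate tuples; the four basis weights sum to 1."""
    b0 = (1 - t) ** 3 / 6
    b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6
    b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6
    b3 = t ** 3 / 6
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

mid = cubic_bspline_point((0.0, 0.0), (1.0, 0.0), (2.0, 1.0), (3.0, 1.0), 0.5)
```

Deforming a design then amounts to moving control points; the learned constraint model classifies whether the resulting shape still looks valid.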
57.
This paper analyzes the effectiveness of maritime safety control from the perspective of the level of safety along the Yangtze River, with special consideration of navigational environments. The influencing variables of maritime safety are reviewed, including ship condition, the maritime regulatory system, human reliability, and the navigational environment. Because the first three variables are generally assumed to be at the same level of safety, this paper focuses on the impact of navigational environments on the level of safety in different waterways. An improved data envelopment analysis (DEA) model is proposed that treats navigational environment factors as inputs and ship accident data as outputs. Moreover, because the traditional DEA model cannot provide an overall ranking of decision making units (DMUs), spatial sequential frontiers and grey relational analysis are incorporated into the DEA model to enable a refined assessment. The empirical results show that the proposed model solves the missing-information problem of prior models and evaluates the level of safety with better accuracy. The results of the proposed DEA model are further compared with those of an evidential reasoning (ER) method, which has been widely used for level-of-safety evaluation. A sensitivity analysis is also conducted to better understand the relationship between variations in navigational environments and the level of safety; it shows that the level of safety varies with traffic flow, indicating that appropriate traffic control measures should be adopted for different waterways to improve their safety. This paper presents a practical method for conducting maritime level-of-safety assessments under dynamic navigational environments.
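The grey relational analysis used to refine the DMU ranking can be sketched as follows. The series values and the distinguishing coefficient rho = 0.5 are illustrative assumptions, not the paper's data:

```python
def grey_relational_grades(reference, alternatives, rho=0.5):
    """Grey relational grade of each alternative series against a
    reference series; inputs are assumed pre-normalized to [0, 1].
    Higher grade means closer to the reference (better rank)."""
    deltas = [[abs(r - x) for r, x in zip(reference, alt)] for alt in alternatives]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    # average the grey relational coefficients over the series
    return [sum((d_min + rho * d_max) / (d + rho * d_max) for d in row) / len(row)
            for row in deltas]

# Two hypothetical waterways compared against an ideal reference series
grades = grey_relational_grades([1.0, 1.0, 1.0],
                                [[0.9, 0.8, 1.0], [0.4, 0.5, 0.3]])
```

The first alternative tracks the reference closely and receives the higher grade, so it would rank first among the DMUs.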
58.
To clarify the strength improvement mechanism of gap-graded blended cements containing a high proportion of supplementary cementitious materials, the phase composition of hardened gap-graded blended cement pastes was quantified and compared with those of a Portland cement paste and a reference blended cement paste prepared by co-grinding. The results show that the gap-graded blended cement pastes, containing only 25% cement clinker by mass, have amounts of gel products and porosities comparable to those of the Portland cement paste at all tested ages. For the gap-graded blended cement pastes, about 40% of the total gel products can be attributed to the hydration of fine blast furnace slag (BFS), and the main un-hydrated component is coarse fly ash, corresponding to the un-hydrated cement clinker in the Portland cement paste. Furthermore, pore size refinement is much more pronounced in the gap-graded blended cement pastes, owing to the high initial packing density of the cement paste (grain size refinement) and the significant hydration of BFS.
59.
Virtualization can provide significant benefits in data centers by enabling dynamic virtual machine resizing and migration to eliminate hotspots. We present Sandpiper, a system that automates the tasks of monitoring and detecting hotspots, determining a new mapping of physical to virtual resources, resizing virtual machines to their new allocations, and initiating any necessary migrations. Sandpiper implements a black-box approach that is fully OS- and application-agnostic, as well as a gray-box approach that exploits OS- and application-level statistics. We implement our techniques in Xen and conduct a detailed evaluation using a mix of CPU-, network-, and memory-intensive applications. Our results show that Sandpiper can resolve single-server hotspots within 20 s and scales well to larger data center environments. We also show that the gray-box approach helps Sandpiper make more informed decisions, particularly in response to memory pressure.
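A black-box style hotspot check can be approximated by flagging only sustained threshold violations. The k-of-n rule and the numbers below are a simplified sketch in the spirit of Sandpiper, not its actual algorithm:

```python
from collections import deque

class HotspotDetector:
    """Flag a hotspot only when utilization exceeds `threshold` in at
    least k of the last n observations, filtering transient spikes."""
    def __init__(self, threshold=0.75, n=5, k=3):
        self.threshold, self.n, self.k = threshold, n, k
        self.window = deque(maxlen=n)

    def observe(self, utilization):
        """Record one utilization sample; return True if a hotspot is flagged."""
        self.window.append(utilization)
        over = sum(1 for u in self.window if u > self.threshold)
        return len(self.window) == self.n and over >= self.k

det = HotspotDetector()
readings = [0.5, 0.9, 0.6, 0.8, 0.85, 0.9]
alerts = [det.observe(u) for u in readings]
```

Once a hotspot is flagged, a controller would pick a migration or resizing that relieves the overloaded host.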
60.
In this study, we present a thermodynamic feasibility analysis of a two-step hydrogen chloride cycle for sustainable hydrogen production. An exergy approach, in addition to the conventional energy approach, is used to study the performance of the cycle. A solid oxide membrane is employed for the gas-phase electrolysis of hydrogen chloride, and the temperature change between the cycle steps is eliminated for better thermal management. Moreover, a parametric study is conducted to observe how the cycle varies with parameters such as operating temperature, current density, and hydrogen production rate. The calculated results show that the cycle can produce 1 kg/s of hydrogen while consuming 335.8 MW of electricity and 29.2 MW of thermal energy. Additionally, two different definitions of energy and exergy efficiency are introduced to investigate the difference between actual and ideal (theoretical) cycle performance. The proposed cycle can be effectively used to produce hydrogen using concentrated solar energy or nuclear waste heat at high temperatures.
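The reported operating point implies an overall energy efficiency of roughly one third, assuming hydrogen's lower heating value of about 120 MJ/kg (an assumption for this back-of-envelope check; the paper defines its own efficiency measures):

```python
# Back-of-envelope energy efficiency from the reported figures:
# 1 kg/s H2 output for 335.8 MW electric + 29.2 MW thermal input.
h2_rate_kg_s = 1.0
electric_mw = 335.8
thermal_mw = 29.2
lhv_mj_per_kg = 120.0  # assumed lower heating value of hydrogen

# MJ/s out divided by MW in (both are MW), dimensionless efficiency
eta_energy = (h2_rate_kg_s * lhv_mj_per_kg) / (electric_mw + thermal_mw)
```

This gives roughly 33% on an energy basis; an exergy-based efficiency would weight the 29.2 MW of heat by its temperature-dependent exergy content instead.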