  Full text (fee-based)   1469 articles
  Free   118 articles
  Domestic free   39 articles
Electrical engineering   32 articles
Technical theory   2 articles
General   64 articles
Chemical industry   59 articles
Metalworking   53 articles
Machinery & instrumentation   76 articles
Architecture & building science   215 articles
Mining engineering   42 articles
Energy & power   17 articles
Light industry   35 articles
Water conservancy engineering   35 articles
Petroleum & natural gas   29 articles
Weapons industry   1 article
Radio & electronics   209 articles
General industrial technology   74 articles
Metallurgical industry   121 articles
Nuclear technology   3 articles
Automation technology   559 articles
  2024   6 articles
  2023   36 articles
  2022   38 articles
  2021   97 articles
  2020   121 articles
  2019   96 articles
  2018   88 articles
  2017   119 articles
  2016   135 articles
  2015   97 articles
  2014   150 articles
  2013   69 articles
  2012   55 articles
  2011   68 articles
  2010   49 articles
  2009   32 articles
  2008   27 articles
  2007   68 articles
  2006   51 articles
  2005   41 articles
  2004   33 articles
  2003   34 articles
  2002   30 articles
  2001   27 articles
  2000   14 articles
  1999   7 articles
  1998   13 articles
  1997   8 articles
  1996   3 articles
  1995   3 articles
  1994   3 articles
  1992   1 article
  1991   3 articles
  1990   1 article
  1987   1 article
  1986   2 articles
Sort order: 1626 results found (search time: 0 ms)
11.
Data compression is a key technique for processing big data efficiently. On the Phytium processor platform, zlib-based data compression is usually implemented in software, which can no longer meet requirements when data volumes are large and real-time constraints are strict. To address this problem, by studying the zlib programming library and taking the characteristics of the Phytium processor into account, we designed and implemented a hardware data-compression driver for the Phytium platform. The design introduces a bidirectional DMA transfer technique and a command-ring mechanism based on coherent memory, further improving the efficiency of hardware-based data compression. Experiments confirm the effectiveness of the hardware data-compression improvements on the Phytium platform.
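The software baseline this work accelerates can be sketched with Python's standard `zlib` binding. This is only an illustration of the CPU-bound path; the actual driver work described in the abstract is done at the kernel level against Phytium compression hardware.

```python
import time
import zlib


def compress_benchmark(data: bytes, level: int = 6):
    """Software (CPU-bound) zlib compression -- the baseline that the
    hardware driver in the abstract is designed to replace."""
    t0 = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - t0
    ratio = len(data) / max(len(compressed), 1)
    return compressed, ratio, elapsed


payload = b"big data " * 100_000
out, ratio, secs = compress_benchmark(payload)
assert zlib.decompress(out) == payload  # the round trip is lossless
```

Timing this loop over large payloads is how one would quantify the software bottleneck before offloading to hardware.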
12.
Online news has become one of the major channels through which Internet users get their news. News websites are overwhelmed daily with articles: huge numbers of online news articles are generated and updated every day, and processing and analyzing this large corpus is an important challenge. This challenge needs to be tackled with big data techniques that process large volumes of data within limited run times. Also, since we are heading into a social-media data explosion, techniques such as text mining and social network analysis need to be taken seriously into consideration. In this work we focus on one of the most common daily activities: web news reading. News websites produce thousands of articles covering a wide spectrum of topics or categories, which can be considered a big data problem. To extract useful information, these articles need to be processed with big data techniques. In this context, we present an approach for classifying huge numbers of news articles into various categories (topic areas) based on the text content of the articles. Since these categories are constantly updated with new articles, our approach is based on Evolving Fuzzy Systems (EFS). An EFS can update in real time the model that describes a category according to changes in the content of the corresponding articles. The novelty of the proposed system lies in the treatment of the web news articles and in the implementation and tuning of the EFS for this task. Our proposal not only classifies news articles but also creates human-interpretable models of the different categories. The approach has been successfully tested on real online news.
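The core pattern here, category models that are updated incrementally as new articles arrive, can be illustrated with a toy stand-in. The sketch below is a plain incremental naive-Bayes-style classifier, not an Evolving Fuzzy System, and the categories and articles are invented for illustration:

```python
import math
from collections import Counter, defaultdict


class OnlineNewsClassifier:
    """Toy incremental text classifier: per-category word counts are
    updated one article at a time, mimicking how the abstract's evolving
    category models track drifting content."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # category -> word counts
        self.doc_counts = Counter()              # category -> article count
        self.vocab = set()                       # global vocabulary

    def update(self, category, text):
        words = text.lower().split()
        self.doc_counts[category] += 1
        self.word_counts[category].update(words)
        self.vocab.update(words)

    def classify(self, text):
        words = text.lower().split()
        total_docs = sum(self.doc_counts.values())

        def score(cat):
            counts = self.word_counts[cat]
            n = sum(counts.values())
            prior = math.log(self.doc_counts[cat] / total_docs)
            # Laplace-smoothed log-likelihood over a shared vocabulary
            return prior + sum(
                math.log((counts[w] + 1) / (n + len(self.vocab)))
                for w in words
            )

        return max(self.doc_counts, key=score)


clf = OnlineNewsClassifier()
clf.update("sports", "team wins match goal score")
clf.update("economy", "market stocks inflation rates")
clf.update("sports", "player transfer league season")
label = clf.classify("late goal decides the match")
```

An EFS would additionally adapt its rule structure, not just its counts, but the update-then-classify loop is the same shape.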
13.
陶皖, 杨磊. 《数字社区&智能家居》, 2013, (10): 6340-6342, 6347
Big data is characterized by huge data volumes and diverse data forms. The big data era provides rich information resources for education and learning, but it also poses challenges for education models and talent cultivation. This paper first describes the characteristics of the big data era and their impact on university talent cultivation, analyzes the demands that big data places on information systems and on the corresponding talent, and, drawing on teaching practice, studies a talent-cultivation model for the information systems major in the big data context.
14.
Equivalent electric circuit modeling of PV devices is widely used to predict PV electrical performance. The first task in using the model to calculate the electrical characteristics of a PV device is to find the model parameters that represent the device. In the present work, parameter estimation using various evolutionary algorithms is presented and compared. The constraint set on the estimation process is that only data directly available in module datasheets can be used to estimate the parameters. The accuracy of the electrical model using the estimated parameters is then compared with several electrical models reported in the literature for various PV cell technologies.
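The estimation task can be sketched as fitting a diode-model curve to I-V points with an evolutionary optimizer. The sketch below uses a deliberately simplified single-diode model (series and shunt resistance omitted), synthetic "datasheet" points, and a minimal hand-rolled differential-evolution loop; the specific model form, bounds, and DE settings are illustrative assumptions, not the paper's.

```python
import math
import random

V_T = 0.02585  # thermal voltage at ~300 K, in volts


def diode_current(v, i_ph, log_i0, n):
    """Simplified single-diode PV model: photocurrent minus diode loss.
    I0 is searched in log10 space because it spans many decades."""
    return i_ph - 10.0 ** log_i0 * (math.exp(v / (n * V_T)) - 1.0)


# Synthetic I-V points generated from known "true" parameters.
TRUE = (5.0, -9.0, 1.3)  # (Iph [A], log10 I0, ideality factor n)
volts = [i * 0.035 for i in range(20)]
amps = [diode_current(v, *TRUE) for v in volts]


def cost(p):
    return sum((diode_current(v, *p) - i) ** 2 for v, i in zip(volts, amps))


def differential_evolution(bounds, pop_size=30, gens=200, f=0.7, cr=0.9):
    """Minimal DE/rand/1/bin optimizer (illustrative, not a library)."""
    rng = random.Random(0)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [cost(p) for p in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [
                min(max(a[d] + f * (b[d] - c[d]), bounds[d][0]), bounds[d][1])
                if rng.random() < cr else pop[i][d]
                for d in range(len(bounds))
            ]
            c_trial = cost(trial)
            if c_trial <= costs[i]:  # greedy selection
                pop[i], costs[i] = trial, c_trial
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]


params, err = differential_evolution([(1, 10), (-12, -6), (1, 2)])
```

Note that Iph is directly pinned by the short-circuit point (I at V = 0), which is why datasheet-only estimation is feasible for part of the parameter set.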
15.
This paper compares the aesthetic characteristics of Lingnan gardens and Jiangnan gardens from five aspects: environment, function, character, art, and culture. It distills the distinct individual features of the two traditions and argues that gardens are carriers of culture.
16.
In this work, we design a multisensory IoT-based online vitals monitor (hereinafter referred to as the VITALS) to sense four bedside physiological parameters: pulse (heart) rate, body temperature, blood pressure, and peripheral oxygen saturation. The proposed system constantly transfers these signals to an analytics system, which aids in earlier diagnostics as well as monitoring after recovery. The core hardware of the VITALS includes commercial off-the-shelf sensing devices and medical equipment, a powerful microcontroller, a reliable wireless communication module, and a big data analytics system. It extracts human vital signs at a pre-programmed interval of 30 min and sends them to the big data analytics system through the WiFi module for further analysis. We use Apache Kafka to gather live data streams from connected sensors; Apache Spark to categorize the patient vitals and notify medical professionals when abnormalities in physiological parameters are identified; the Hadoop Distributed File System (HDFS) to archive data streams for further analysis and long-term storage; and Spark SQL, Hive, and Matplotlib to help caregivers access and visualize the collected data streams and understand the health status of individuals. In addition, we develop a mobile application that sends statistical graphs to doctors and patients so they can monitor health conditions remotely. The proposed system was deployed with three patients for 7 days to check the effectiveness of the sensing, data processing, and data transmission mechanisms. To validate its accuracy, we compare the data values collected from the installed sensors with readouts from a commercial healthcare monitor, the Welch Allyn® Spot Check. Our proposed system provides improved care solutions, especially for those whose access to care services is limited.
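The abnormality check applied in the Spark stage can be sketched as a simple range test per vital sign. The threshold values below are hypothetical placeholders for illustration only; real clinical limits must come from medical guidance, not this sketch.

```python
from dataclasses import dataclass

# Hypothetical alert thresholds (lower, upper) -- illustration only.
LIMITS = {
    "pulse_bpm": (50, 120),
    "temp_c": (35.0, 38.5),
    "systolic_mmhg": (90, 160),
    "spo2_pct": (92, 100),
}


@dataclass
class VitalsSample:
    pulse_bpm: float
    temp_c: float
    systolic_mmhg: float
    spo2_pct: float


def abnormal_vitals(sample: VitalsSample) -> list:
    """Return the names of vitals outside their configured range -- the
    kind of check run on each 30-min sample before notifying staff."""
    flags = []
    for name, (lo, hi) in LIMITS.items():
        value = getattr(sample, name)
        if not lo <= value <= hi:
            flags.append(name)
    return flags


# Elevated pulse and low oxygen saturation both trigger flags.
alerts = abnormal_vitals(VitalsSample(135, 36.8, 150, 89))
```

In the described pipeline this predicate would run inside the Spark stream job, with flagged samples routed to the notification path and all samples archived to HDFS.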
17.
In occupational safety and health, big data and analytics show promise for the prediction and prevention of workplace injuries. Advances in computing power and analytical methods have allowed companies to reveal insights from "big" data that previously would have gone undetected. Despite this promise, occupational safety has lagged behind other industries, such as supply chain management and healthcare, in exploiting the potential of analytics, and much of the data collected by organizations goes unanalyzed. The purpose of the present paper is to argue for the broader application of establishment-level safety analytics. This is accomplished by defining the terms, describing previous research, outlining the necessary components, and describing knowledge gaps and future directions. The knowledge gaps and future research directions for establishment-level analytics are categorized into readiness for analytics, analytics methods, technology integration, data culture, and impact of analytics.
18.
The introduction of digital twins is expected to fundamentally change the technology in transportation systems, as they are a compelling concept for monitoring the entire life cycle of a transport system. The advent of widespread information technology, particularly the availability of real-time traffic data, provides the foundation for supplementing the predominant (offline) microscopic simulation approaches with actual data to create a detailed real-time digital representation of physical traffic. However, the use of actual traffic data in real-time motorway analysis has not yet been explored: supporting models are lacking, and the applicability of real-time data in the context of microscopic simulation has yet to be recognized. This article therefore focuses on microscopic motorway simulation with real-time data integration during system run-time. We propose a novel paradigm in motorway traffic modeling and demonstrate it using the continuously synchronized digital twin model of the Geneva motorway (DT-GM). We analyze the application of the microscopic simulator SUMO to modeling and simulating on-the-fly synchronized digital replicas of real traffic, using fine-grained actual traffic data streams from motorway traffic counters as input to DT-GM. The detailed methodological process of developing DT-GM is presented, highlighting the calibration features of SUMO that enable continuous, dynamic calibration of running simulation scenarios. The actual traffic data are fused directly into the running DT-GM every minute, so that DT-GM is continuously calibrated as its physical counterpart changes. Accordingly, DT-GM raises the technological dimension of motorway traffic simulation to a new level by enabling simulation-based control optimization during system run-time, which was previously unattainable. It thus forms the foundation for the further evolution of real-time predictive analytics in support of safety-critical decisions in traffic management. Simulation results provide a solid basis for future real-time analysis of an extended Swiss motorway network.
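The per-minute synchronization step can be sketched as a proportional correction that nudges the simulated inflow toward the detector measurement. This is a deliberately simplified stand-in for SUMO's calibrator mechanism; the gain value and the detector readings below are illustrative assumptions, not values from the article.

```python
def calibration_flow(measured_veh_per_min, simulated_veh_per_min, gain=0.5):
    """One synchronization step: move the simulated inflow part-way
    toward the detector measurement (proportional correction; the gain
    is a tuning assumption, not a value from the paper)."""
    error = measured_veh_per_min - simulated_veh_per_min
    return max(0.0, simulated_veh_per_min + gain * error)


# Replay three minutes of hypothetical detector readings against a
# simulation that started with too little traffic.
sim_flow = 20.0
for measured in [30.0, 28.0, 26.0]:
    sim_flow = calibration_flow(measured, sim_flow)
```

In the real DT-GM the correction is applied inside the running SUMO scenario each minute, so the digital twin tracks the physical motorway rather than drifting from it.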
19.
In the era of big data, traditional regression models cannot deal with uncertain big data efficiently and accurately. To make up for this deficiency, this paper proposes a quantum fuzzy regression model, which uses fuzzy theory to describe the uncertainty in big data sets and uses quantum computing to exponentially improve the efficiency of data-set preprocessing and parameter estimation. Data envelopment analysis (DEA) is used to calculate the degree of importance of each data point, while the Harrow-Hassidim-Lloyd (HHL) algorithm and quantum swap circuits are used to improve the efficiency of high-dimensional matrix computation. Applying the quantum fuzzy regression model to small-scale financial data shows that its accuracy is greatly improved compared with the quantum regression model. Moreover, owing to the introduction of quantum computing, the speed of handling high-dimensional data matrices is exponentially improved compared with the classical fuzzy regression model. The proposed model combines the advantages of fuzzy theory and quantum computing: it can efficiently compute with high-dimensional data matrices and complete parameter estimation using quantum computing while retaining the uncertainty present in big data. It is thus a new model for efficient and accurate big data processing in uncertain environments.
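The classical core of this idea, regression in which each point carries an importance weight, can be illustrated without any quantum machinery. The sketch below is a weighted least-squares line fit where the weights stand in for the DEA-derived degrees of importance; the data, weights, and closed-form solver are all illustrative, and the quantum (HHL-based) speedup itself is not reproduced here.

```python
def fuzzy_weighted_fit(points, weights):
    """Closed-form weighted least-squares fit of y = slope*x + intercept,
    where each point's weight encodes how much it should be trusted
    (a classical stand-in for the paper's DEA importance degrees)."""
    sw = sum(weights)
    sx = sum(w * x for (x, _), w in zip(points, weights))
    sy = sum(w * y for (_, y), w in zip(points, weights))
    sxx = sum(w * x * x for (x, _), w in zip(points, weights))
    sxy = sum(w * x * y for (x, y), w in zip(points, weights))
    slope = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    intercept = (sy - slope * sx) / sw
    return slope, intercept


# Synthetic data near y = 2x + 1; the last point is an unreliable outlier.
pts = [(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.2), (4, 40.0)]
w = [1.0, 1.0, 1.0, 1.0, 0.01]  # low weight downplays the uncertain point
slope, intercept = fuzzy_weighted_fit(pts, w)
```

With uniform weights the outlier drags the slope far from 2; downweighting it recovers a fit close to the underlying trend, which is the behavior fuzzy importance degrees buy in the full model.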
20.
This study presents an overview and a short critical review of patented antibiofilm technologies. Patent information was used to determine scenarios that could be used by decision-makers or business intelligence. The study found that academia, mainly in the USA, has filed the most patents since 1997. Based on an S-curve analysis, this technology sector is still emerging. The technological specializations of the 10 most prominent patent applicants and the state of the art in antibiofilm patents are presented. The high dispersion of patent applicants, the presence of universities among the most active applicants, and the content of the analyzed patents are good indicators that the developed technologies remain close to the academic level; considerable effort is still needed to bring them to market. N-based polymers, amino acids and peptides, P and S compounds, chelating agents and organometallic complexes, nanoparticles, and composites are claimed to be active against biofilm formation. Such compounds are presented in this work, and only in rare cases are they described as compositions ready for use as marketable products.
Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号