Similar Literature
20 similar documents found (search time: 0 ms)
1.
    
The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services from a shared pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications, and its integration enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can create risks to availability, security, performance, confidentiality, and privacy. A key reason for the failure of cloud- and IoT-enabled smart city applications is improper security practice at the early stages of development. This article proposes a framework for collecting security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture comprises privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework, and a hybrid survey is used to identify and evaluate the significant challenges.

2.
    
In Next Generation Radio Networks (NGRN), there will be extremely massive connectivity with Heterogeneous Internet of Things (HetIoT) devices. Millimeter-Wave (mmWave) communication will become a core technology to increase the capacity of Radio Networks (RN) and enable Multiple-Input Multiple-Output (MIMO) Radio Remote Head (RRH) technology. However, key challenges in unfair radio resource handling remain unsolved when massive numbers of requests occur concurrently. Imbalanced resource utilization is one of the main issues; it occurs when the closest RRH is overloaded with an excessive number of requests. To handle this issue effectively, a Machine Learning (ML) algorithm plays an important role in steering the requests of massive IoT devices to RRHs according to their capacity conditions. This paper proposes dynamic RRH gateway steering based on a lightweight supervised learning algorithm, K-Nearest Neighbor (KNN), to improve the communication Quality of Service (QoS) in real-time IoT networks. KNN classifies user requests and recommends the optimal RRHs, i.e., those that preserve more power. The experimental dataset was generated in software, and the simulation results show that the proposed scheme remarkably outperforms conventional methods in terms of multiple significant QoS parameters, including communication reliability, latency, and throughput.
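As a rough, self-contained illustration of the KNN-based steering idea (not the authors' implementation), the sketch below trains scikit-learn's KNN classifier on synthetic request records; the features, the toy labeling rule, and the RRH count are assumptions made for the example.

```python
# Hypothetical sketch: KNN-based selection of an RRH for incoming IoT requests.
# Features, labels, and the cost rule are illustrative, not the paper's dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
n_requests, n_rrh = 2000, 4
dist = rng.random((n_requests, n_rrh))     # distance of each request to each RRH
load = rng.random((n_requests, n_rrh))     # current load ratio of each RRH
X = np.hstack([dist, load])                # one feature vector per request
y = np.argmin(dist + 2.0 * load, axis=1)   # toy "optimal RRH" label (low load wins)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5)  # lightweight supervised model
knn.fit(X_tr, y_tr)
print("held-out accuracy:", round(knn.score(X_te, y_te), 3))

# Steering decision for one new request (4 distances followed by 4 load ratios).
new_request = np.array([[0.9, 0.2, 0.8, 0.7, 0.1, 0.9, 0.3, 0.15]])
print("recommended RRH:", knn.predict(new_request)[0])
```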

3.
    
Malicious software (malware) is one of the main cyber threats that organizations and Internet users currently face. Malware is software developed by cybercriminals for damaging purposes, such as corrupting systems and data and stealing sensitive data. The damage caused by malware is increasing substantially every day, so there is a need to detect malware efficiently and automatically and to remove threats from systems quickly. Although there are various approaches to tackling malware, its prevalence and stealthiness necessitate an effective method for the detection and prevention of malware attacks. The deep learning-based approach has recently gained attention as a method that detects malware effectively. In this paper, a novel approach based on deep learning for detecting malware is proposed. The approach deploys novel feature selection, feature correlation, and feature representations to significantly reduce the feature space. It has been evaluated on a Microsoft malware prediction dataset containing 21,736 malware samples from 9 malware families, where it achieved 96.01% accuracy and outperformed existing malware detection techniques.
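The pipeline below is a generic stand-in for the "shrink the feature space, then classify" idea, not the paper's actual selection or network design; the synthetic data, the mutual-information selector, and the small multilayer perceptron are all assumptions.

```python
# Generic sketch: reduce a large feature space, then classify into malware families.
# The selector, classifier sizes, and synthetic data are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for extracted malware features (9 families -> 9 classes).
X, y = make_classification(n_samples=2000, n_features=200, n_informative=30,
                           n_classes=9, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    SelectKBest(mutual_info_classif, k=40),                # shrink the feature space
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0),
)
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 3))
```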

4.
    
Supply Chain Finance (SCF) is important for improving the effectiveness of supply chain capital operations and reducing the overall management cost of a supply chain. In recent years, with the deep integration of the supply chain with the Internet, Big Data, Artificial Intelligence, the Internet of Things, Blockchain, and related technologies, the efficiency of supply chain financial services can be greatly improved by building more customized risk pricing models and conducting more rigorous investment decision-making processes. However, with the rapid development of new technologies, SCF data volumes have increased massively, and new financial fraud behaviors or patterns are scattered ever more covertly among normal ones. A lack of capability to handle these big data volumes and to mitigate financial fraud can lead to huge losses in supply chains. In this article, a distributed big data mining approach is proposed for financial fraud detection in a supply chain. It implements the distributed deep learning model of a Convolutional Neural Network (CNN) on the big data infrastructure of Apache Spark and Hadoop, speeding up the processing of the large dataset in parallel and reducing the processing time significantly. By training and testing on the continually updated SCF dataset, the approach can intelligently and automatically classify the massive data samples and discover fraudulent financing behaviors, so as to enhance financial fraud detection with high precision and recall and reduce fraud losses in a supply chain.
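The sketch below covers only the CNN classification component on a single node, with synthetic transaction features standing in for SCF records; distributing the training over Spark and Hadoop, as the article describes, is outside the scope of this example.

```python
# Single-node sketch of the CNN classification step only; the Spark/Hadoop
# distribution layer is not modeled. Data, feature count, and layer sizes are
# illustrative assumptions rather than the article's configuration.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n_samples, n_features = 5000, 32
X = rng.normal(size=(n_samples, n_features, 1)).astype("float32")  # 1D "signal" per record
y = rng.integers(0, 2, size=n_samples)                             # 1 = fraudulent financing

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, kernel_size=3, activation="relu",
                           input_shape=(n_features, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])
model.fit(X, y, epochs=3, batch_size=128, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))      # [loss, precision, recall]
```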

5.
    
In this paper, we provide a new approach to data encryption using generalized inverses. Encryption is based on the weighted Moore–Penrose inverse A†(M,N) of an n×8 constant matrix. The 8×8 square Hermitian positive definite matrix N is the key. The proposed solution represents a very strong key, since the number of different positive definite matrices of order 8 is huge. We have run the NIST (National Institute of Standards and Technology) quality assurance tests for a randomly generated Hermitian matrix (a total of 10 different tests, plus additional analysis with approximate entropy and random excursions). From this additional testing of the quality of the generated random matrix, we conclude that the results of our analysis satisfy the defined strict requirements. The proposed MP encryption method can be applied effectively to the encryption and decryption of images in multi-party communications. In the experimental part of the paper, we compare encryption methods by means of machine learning: the algorithms are compared through the classification results achieved over the cipher classes. In this comparative analysis, we report classification results for the Advanced Encryption Standard (AES) algorithm and for the proposed encryption method based on the Moore–Penrose inverse.
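As a minimal numerical sketch (under the standard definition A†(M,N) = N^(-1/2)(M^(1/2) A N^(-1/2))† M^(1/2), and not the authors' full cipher), the code below computes a weighted Moore–Penrose inverse with NumPy/SciPy and uses it to recover one data block; the matrix sizes and the real symmetric positive definite weights are illustrative assumptions.

```python
# Minimal sketch of the linear-algebra core: a weighted Moore–Penrose inverse and a
# round trip on a single block. This is not the authors' complete encryption scheme.
import numpy as np
from scipy.linalg import sqrtm

def weighted_pinv(A, M, N):
    """A_{M,N}^+ = N^{-1/2} (M^{1/2} A N^{-1/2})^+ M^{1/2} for positive definite M, N."""
    M_h = np.real(sqrtm(M))           # square roots of positive definite weights
    N_h = np.real(sqrtm(N))
    return np.linalg.inv(N_h) @ np.linalg.pinv(M_h @ A @ np.linalg.inv(N_h)) @ M_h

rng = np.random.default_rng(0)
n = 12
A = rng.normal(size=(n, 8))                    # n x 8 constant matrix
B = rng.normal(size=(8, 8))
N = B @ B.T + 8 * np.eye(8)                    # 8 x 8 positive definite "key"
C = rng.normal(size=(n, n))
M = C @ C.T + n * np.eye(n)                    # positive definite weight on the other side

A_plus = weighted_pinv(A, M, N)
x = rng.normal(size=8)                         # one plaintext block
y = A @ x                                      # transformed ("encrypted") block
x_rec = A_plus @ y                             # recovery via the weighted inverse
print(np.allclose(x, x_rec))                   # True: A has full column rank
```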

6.
    
COVID-19 turned out to be an infectious and life-threatening viral disease, and its swift and overwhelming spread has become one of the greatest challenges for the world. As yet, no satisfactory vaccine or medication has been developed that could guarantee its mitigation, though several efforts and trials are underway. Countries around the globe are striving to overcome the spread of COVID-19 while seeking ways for early detection and timely treatment. In this regard, healthcare experts, researchers, and scientists have delved into the investigation of existing as well as new technologies. The situation demands the development of a clinical decision support system that equips medical staff with ways to detect this disease in a timely manner. The state-of-the-art research in Artificial Intelligence (AI), Machine Learning (ML), and cloud computing has encouraged healthcare experts to find effective detection schemes. This study aims to provide a comprehensive review of the role of AI and ML in investigating prediction techniques for COVID-19. A mathematical model has been formulated to analyze and detect its potential threat. The proposed model is a cloud-based smart detection algorithm using a support vector machine (CSDC-SVM) with cross-fold validation testing. The experimental results achieved an accuracy of 98.4% with a 15-fold cross-validation strategy. The comparison with similar state-of-the-art methods reveals that the proposed CSDC-SVM model possesses better accuracy and efficiency.
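The snippet below sketches only the SVM-with-15-fold-cross-validation step on a synthetic stand-in for clinical feature vectors; the cloud-side orchestration of CSDC-SVM and the real dataset are not modeled.

```python
# Hedged sketch: RBF-kernel SVM evaluated with 15-fold cross-validation on synthetic
# data. The kernel choice, class balance, and features are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic binary-labelled clinical feature vectors (1 = suspected COVID-19 case).
X, y = make_classification(n_samples=1500, n_features=20, n_informative=10,
                           weights=[0.7, 0.3], random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
cv = StratifiedKFold(n_splits=15, shuffle=True, random_state=0)   # 15-fold validation
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"mean 15-fold accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```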

7.
Distance learning is gradually relocating from Internet-based e-learning platforms to a mobile cloud-based environment. The cloud is a technology that allows the delivery of teaching material, shared learning experiences, and exchanges of knowledge with great proficiency. In this paper, we design a mobile cloud-based learning platform adapted to the needs of practice-oriented education, such as the teaching of various sports. The platform enables sport students not only to learn skills but also to view and analyze learning outcomes anytime and anywhere. It is just as easy for teachers because, first, they can share the online learning activities with students and, second, they can provide instant assessments of outcomes. The paper explains how the environment can help sport teaching in higher education and how students can use it to improve skills. A specially developed table tennis course employing the cloud environment is used to investigate, via a questionnaire, what students think about it. Their responses, compiled from the questionnaire and follow-up interviews, are statistically analyzed. The results indicate that, overall, experience with the environment agrees with the top four reasons that Jelavic (2014, “Can E-learning be Useful for Sports?”, eLearning Industry, http://elearningindustry.com/can-e-learning-be-useful-for-sports) lists in support of the role of e-learning in sport education, and reveals that it is viewed as a fit and useful platform for the purpose.

8.
    
The Internet of Medical Things (IoMT) offers an infrastructure made of smart medical equipment and software applications for healthcare services. Through the Internet, the IoMT is capable of providing remote medical diagnosis and timely health services. Patients can use their smart devices to create, store, and share their electronic health records (EHR) with a variety of medical personnel, including doctors and nurses. However, unless the underlying communication within the IoMT is secured, malicious users can intercept, modify, and even delete the sensitive EHR data of patients. Patients also lose full control of their EHR, since most healthcare services within the IoMT are constructed on a centralized platform outsourced to the cloud. It is therefore appealing to design a decentralized, auditable, and secure EHR system that guarantees absolute access control for patients while ensuring privacy and security. Using the features of blockchain, including decentralization, auditability, and immutability, we propose a secure EHR framework that is mainly maintained by the medical centers. In this framework, the patients' EHR data are encrypted and stored on the servers of medical institutions, while the corresponding hash values are kept on the blockchain. We make use of security primitives to offer authentication, integrity, and confidentiality of EHR data, while access control and immutability are guaranteed by the blockchain technology. The security analysis and performance evaluation of the proposed framework confirm its efficiency.
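A toy Python sketch of the storage pattern described here (encrypted EHR off-chain, only its hash on-chain): the Fernet key handling, the record contents, and the in-memory "chain" are illustrative stand-ins, not the paper's protocol.

```python
# Toy sketch: EHR ciphertext stays off-chain at the medical institution; only a
# SHA-256 digest is appended to a simple block structure for later audits.
import hashlib, json, time
from cryptography.fernet import Fernet

key = Fernet.generate_key()                     # institution-held symmetric key
cipher = Fernet(key)

ehr_record = json.dumps({"patient": "P-001", "note": "routine check-up"}).encode()
ciphertext = cipher.encrypt(ehr_record)         # stored on the institution's server
ehr_hash = hashlib.sha256(ciphertext).hexdigest()

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "prev_hash": "0" * 64, "ehr_hash": None, "ts": 0.0}]  # genesis
chain.append({"index": 1, "prev_hash": block_hash(chain[-1]),
              "ehr_hash": ehr_hash, "ts": time.time()})

# Audit: recompute the hash of the stored ciphertext and compare with the chain.
assert hashlib.sha256(ciphertext).hexdigest() == chain[-1]["ehr_hash"]
print("EHR integrity verified against the chain")
```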

9.
    
The Internet of Things (IoT) has numerous applications in every domain, e.g., providing intelligent services to sustainable smart cities. The next generation of IoT networks is expected to be densely deployed in resource-constrained and lossy environments. The densely deployed nodes produce radically heterogeneous traffic patterns, which cause congestion and collisions in the network. At the medium access control (MAC) layer, mitigating channel collisions is still one of the main challenges of future IoT networks. Similarly, the standardized network layer uses a ranking mechanism based on hop counts and expected transmission counts (ETX), which often does not adapt to the dynamic and lossy environment and thus degrades performance. The ranking mechanism also requires large control overheads to update rank information. Resource-constrained IoT devices operating in a low-power and lossy network (LLN) environment need an efficient solution to these problems. Reinforcement learning (RL) algorithms such as Q-learning have recently been utilized to solve learning problems in LLN devices such as sensors. Thus, in this paper, an RL-based optimization of dense LLN IoT devices with heavy heterogeneous traffic is devised. The proposed protocol learns collision information from the MAC layer and makes intelligent decisions at the network layer; it also enhances the operation of the trickle timer algorithm. A Q-learning model is employed to adaptively learn the channel collision probability and network-layer ranking states with an accumulated reward function. Based on a simulation using Contiki 3.0 Cooja, the proposed intelligent scheme achieves a lower packet loss ratio, improves throughput, produces lower control overheads, and consumes less energy than other state-of-the-art mechanisms.
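The fragment below is only a schematic of the Q-learning decision loop (collision-level states, next-hop actions, delivery-based rewards); the state space, reward shape, and toy environment are assumptions, not the protocol's Contiki implementation.

```python
# Minimal tabular Q-learning sketch of the decision idea: learn a next-hop choice
# from collision feedback. States, actions, and rewards are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_states = 4      # e.g., discretized channel-collision levels observed at the MAC layer
n_actions = 3     # e.g., candidate parents / next-hop rankings at the network layer
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1

def step(state, action):
    """Toy environment: higher collision states penalize 'greedy' actions more."""
    collision = rng.random() < 0.2 * state + 0.1 * action
    reward = -1.0 if collision else 1.0            # packet lost vs. delivered
    next_state = min(n_states - 1, state + 1) if collision else max(0, state - 1)
    return reward, next_state

state = 0
for _ in range(5000):
    action = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[state]))
    reward, next_state = step(state, action)
    # Standard Q-learning update with the accumulated (discounted) reward.
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print(np.round(Q, 2))   # learned preference for low-collision actions per state
```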

10.
    
Cyberattacks are gradually becoming more sophisticated, requiring effective intrusion detection systems (IDSs) for monitoring computer resources and creating reports on anomalous or suspicious actions. With the popularity of Internet of Things (IoT) technology, the security of IoT networks is becoming a vital problem. Because of the huge number and varied kinds of IoT devices, protecting an IoT framework with a typical IDS can be a challenging task. Typical IDSs have restrictions when deployed in IoT networks because of resource constraints and complexity. Therefore, this paper presents a new Blockchain Assisted Intrusion Detection System using Differential Flower Pollination with Deep Learning (BAIDS-DFPDL) model for the IoT environment. The presented BAIDS-DFPDL model mainly focuses on the identification and classification of intrusions in the IoT environment. To accomplish this, it employs blockchain (BC) technology for effective and secure data transmission among the agents. Besides, the BAIDS-DFPDL model designs a Differential Flower Pollination based feature selection (DFPFS) technique to select features. Finally, sailfish optimization (SFO) with a Restricted Boltzmann Machine (RBM) model is applied for effective recognition of intrusions. The simulation results on a benchmark dataset exhibit the enhanced performance of the BAIDS-DFPDL model over other models in the recognition of intrusions.

11.
    
Networks serve a significant function in everyday life, and cybersecurity has therefore developed into a critical field of study. The intrusion detection system (IDS) is becoming an essential information protection strategy that tracks the state of the software and hardware operating on the network. Notwithstanding years of growth, current intrusion detection systems still suffer from limited detection precision, growing false alarm levels, and difficulty identifying suspicious activities. In order to address the above-mentioned issues, several researchers have concentrated on designing intrusion detection systems that rely on machine learning approaches. Machine learning models can accurately identify the underlying variations between regular and irregular information with high efficiency. Artificial intelligence, particularly machine learning methods, can be used to develop an intelligent intrusion detection framework. To achieve this objective, we propose in this article an intrusion detection system based on a Deep Extreme Learning Machine (DELM), which first assesses the security features to establish their prominence and then constructs an adaptive intrusion detection system focused on the important features. We then investigated the viability of the suggested DELM-based intrusion detection system by conducting dataset assessments and evaluating performance factors to validate the system's reliability. The experimental results illustrate that the suggested framework outclasses traditional algorithms. In fact, the suggested framework is not only of interest to scientific research but also of functional importance.
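To make the core mechanism concrete, here is a plain single-hidden-layer extreme learning machine on synthetic data: random, untrained input weights and output weights solved by least squares. The deep/stacked variant and the feature-assessment step of DELM are not reproduced, and the data is a placeholder.

```python
# Sketch of a plain single-hidden-layer extreme learning machine (ELM) on toy data,
# showing the core mechanism behind DELM; the paper's deep variant is not reproduced.
import numpy as np
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=30, n_informative=15,
                           random_state=0)
Y = np.eye(2)[y]                                  # one-hot targets

n_hidden = 200
W = rng.normal(size=(X.shape[1], n_hidden))       # random input weights (never trained)
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                            # hidden-layer activations
beta = np.linalg.pinv(H) @ Y                      # output weights by least squares

pred = np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
print("training accuracy:", round((pred == y).mean(), 3))
```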

12.
13.
    
The fast-paced growth of artificial intelligence applications provides unparalleled opportunities to improve the efficiency of various systems. The transportation sector, for example, faces many obstacles in implementing and integrating different vehicular and environmental aspects worldwide. Traffic congestion is among the major issues in this regard and demands serious attention due to the rapid growth in the number of vehicles on the road. To address this overwhelming problem, this article proposes a cloud-based intelligent road traffic congestion prediction model empowered with a hybrid neuro-fuzzy approach. The aim of the study is to reduce the delay in the queues that vehicles experience at different road junctions across the city. The proposed model is also intended to help automated traffic control systems by minimizing congestion, particularly in a smart city environment where observational data are obtained from various Internet of Things (IoT) sensors embedded along the road. After due preprocessing on the cloud server, the proposed approach feeds this data to a neuro-fuzzy engine. Consequently, it achieves a high level of accuracy by means of intelligent decision making with a minimum error rate. Simulation results show an accuracy of 98.72% for the proposed model during the validation phase, in contrast to the highest accuracies achieved by state-of-the-art techniques in the literature of 90.6%, 95.84%, 97.56%, and 98.03%. In the training phase, the proposed scheme exhibits 99.214% accuracy. The proposed prediction model is a potential contribution towards the smart city environment.
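For intuition, the sketch below implements a generic Mamdani-style fuzzy inference that turns two hypothetical sensor readings (vehicle density and average speed) into a congestion score; the memberships and rules are invented for illustration, and the neural tuning that makes the authors' engine "neuro-fuzzy" is omitted.

```python
# Generic Mamdani-style fuzzy inference sketch for a congestion score; membership
# functions and rules are illustrative, and the neural tuning stage is omitted.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def predict_congestion(density, speed):
    """density in vehicles/km (0-100), speed in km/h (0-100) -> congestion 0-100."""
    # Fuzzify the crisp sensor readings.
    d_low, d_high = tri(density, 0, 0, 60), tri(density, 40, 100, 100)
    s_low, s_high = tri(speed, 0, 0, 60), tri(speed, 40, 100, 100)

    # Rule base (AND = min): high density & low speed -> high congestion, etc.
    w_high = min(d_high, s_low)
    w_med = max(min(d_high, s_high), min(d_low, s_low))
    w_low = min(d_low, s_high)

    # Weighted-average (centroid-like) defuzzification over output prototypes.
    levels = np.array([90.0, 50.0, 10.0])          # high / medium / low congestion
    weights = np.array([w_high, w_med, w_low])
    return float((weights @ levels) / (weights.sum() + 1e-9))

print(predict_congestion(density=85, speed=15))    # rush-hour-like reading -> high
print(predict_congestion(density=20, speed=80))    # free-flow reading -> low
```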

14.
15.
    
Among major food production sectors, world aquaculture shows the highest growth rate, providing more than 50% of the global seafood market. However, water pollution in fish farming ponds is regarded as the leading cause of fish death and financial losses in the market. Here, an Internet of Things system based on a cubic multidimensional integrated circuit (MD-IC) is demonstrated for water and food security applications in fish farming ponds. Both faces of the silicon substrate are used for thin-film-based device fabrication. The devices are interconnected via through-silicon vias, resulting in a bifacial complementary metal-oxide-semiconductor-compatible electronics system. The demonstrated cubic MD-IC is a complete, small, and lightweight system that can be easily deployed by farmers with no need for specialists. The system integrates on its outer sides simultaneous air and water quality monitoring devices (temperature, electrical conductivity, ammonia, and pH sensors), solar cells for energy harvesting, and an antenna for real-time data transfer, while data-management circuitry and a solid-state battery are integrated on its internal faces. Microfluidic cooling technology is used for thermal management in the MD-IC. Finally, a biofriendly polymeric encapsulation is used to waterproof the embedded electronics, improve the mechanical robustness, and allow the system to float on the surface of the water.

16.
    
Engineering, 2017, 3(4): 460–466
Under intense environmental pressure, the global energy sector is promoting the integration of renewable energy into interconnected energy systems. The demand-side management (DSM) of energy systems has drawn considerable industrial and academic attention in attempts to create new flexibility for responding to variations in renewable energy inputs to the system. However, many DSM concepts are still in the experimental demonstration phase. One of the obstacles to DSM adoption is that the current information infrastructure was mainly designed for centralized systems and does not meet DSM requirements. To overcome this barrier, this paper proposes a novel information infrastructure named the Internet of Energy Things (IoET) in order to make DSM practicable by basing it on the latest wireless communication technology: the low-power wide-area network (LPWAN). The primary advantage of LPWAN over general packet radio service (GPRS) and area Internet of Things (IoT) technology is its wide-area coverage, which comes with minimal power consumption and maintenance costs. Against this background, this paper briefly reviews the representative LPWAN technologies of narrow-band Internet of Things (NB-IoT) and Long Range (LoRa), and compares them with GPRS and area IoT technology. Next, a wireless-to-cloud architecture is proposed for the IoET based on the main technical features of LPWAN. Finally, this paper looks forward to the potential of the IoET in various DSM application scenarios.

17.
    
Nowadays, the Internet of Things (IoT) has penetrated all facets of human life, while IoT devices remain heavily prone to cyberattacks. It has become important to develop an accurate system that can detect malicious attacks in IoT environments in order to mitigate security risks. The botnet is one of the most dreadful malicious entities and has affected many users over the past few decades. It is challenging to recognize a botnet since it has excellent carrying and hiding capacities. Various approaches have been employed to identify the source of a botnet at earlier stages, and Machine Learning (ML) and Deep Learning (DL) techniques have been developed under heavy influence from botnet detection methodology. In spite of this, detecting botnets at early stages remains a challenging task due to the low number of features accessible from botnet datasets. The current study devises an IoT and cloud-assisted Botnet Detection and Classification model utilizing the Rat Swarm Optimizer with Deep Learning (BDC-RSODL). The presented BDC-RSODL model includes a series of processes: pre-processing, feature subset selection, classification, and parameter tuning. Initially, the network data is pre-processed to make it compatible with further processing. Besides, the RSO algorithm is exploited for effective selection of a subset of features. Additionally, the Long Short-Term Memory (LSTM) algorithm is utilized for both identification and classification of botnets. Finally, the Sine Cosine Algorithm (SCA) is executed to fine-tune the hyperparameters of the LSTM model. In order to validate the promising performance of the BDC-RSODL system, a comprehensive comparison analysis was conducted, and the obtained results confirmed the supremacy of the BDC-RSODL model over recent approaches.
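The block below sketches only the LSTM classification step on randomly generated, windowed flow features; the RSO-based feature selection and the SCA hyperparameter tuning described above are not reproduced, and all shapes and data are assumptions.

```python
# Sketch of the LSTM classification step only; RSO feature selection and SCA tuning
# are not modeled. Shapes and data are synthetic stand-ins for windowed flow features.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n_samples, timesteps, n_features = 4000, 10, 16
X = rng.normal(size=(n_samples, timesteps, n_features)).astype("float32")
y = rng.integers(0, 2, size=n_samples)          # 1 = botnet traffic, 0 = benign

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(timesteps, n_features)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))          # [loss, accuracy]
```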

18.
    
Generally, the risks associated with malicious threats are increasing for the Internet of Things (IoT) and its related applications due to dependency on the Internet and the minimal resource availability of IoT devices. Thus, anomaly-based intrusion detection models for IoT networks are vital. Distinct detection methodologies need to be developed for the Industrial Internet of Things (IIoT) network, as threat detection is a significant expectation of stakeholders. Machine learning approaches are considered to be evolving techniques that learn with experience, and such approaches have resulted in superior performance in various applications, such as pattern recognition, outlier analysis, and speech recognition. Traditional techniques and tools are not adequate to secure IIoT networks due to the use of various protocols in industrial systems and restricted possibilities for upgrades. In this paper, the objective is to develop a two-phase anomaly detection model to enhance the reliability of an IIoT network. In the first phase, SVM and Naïve Bayes are integrated using an ensemble blending technique. K-fold cross-validation is performed while training the data with different training and testing ratios to obtain optimized training and test sets. Ensemble blending uses a random forest technique to predict class labels. An Artificial Neural Network (ANN) classifier that uses the Adam optimizer to achieve better accuracy is also used for prediction. In the second phase, both the ANN and random forest results are fed to the model's classification unit, and the highest accuracy value is considered the final result. The proposed model is tested on standard IoT attack datasets, such as WUSTL_IIOT-2018, N_BaIoT, and Bot_IoT. The highest accuracy obtained is 99%. A comparative analysis of the proposed model against state-of-the-art ensemble techniques demonstrates the superiority of the results. The results also demonstrate that the proposed model outperforms traditional techniques and thus improves the reliability of an IIoT network.
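A rough scikit-learn approximation of the first-phase ensemble follows: SVM and Naïve Bayes as base learners whose outputs are combined by a random forest, with internal k-fold splitting. The stacking API stands in for the paper's blending procedure, and the synthetic data is a placeholder for the IIoT attack datasets.

```python
# Rough sketch of the first-phase ensemble: SVM + Naive Bayes base learners, random
# forest combiner, internal k-fold CV. Data and parameters are illustrative stand-ins.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_classification(n_samples=3000, n_features=25, n_informative=12,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

ensemble = StackingClassifier(
    estimators=[("svm", SVC(probability=True)), ("nb", GaussianNB())],
    final_estimator=RandomForestClassifier(n_estimators=100, random_state=0),
    cv=5,                       # k-fold split used to build the blending features
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", round(ensemble.score(X_te, y_te), 3))
```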

19.
    
The heterogeneous nodes in the Internet of Things (IoT) are relatively weak in computing power and storage capacity, so traditional network security algorithms are not suitable for the IoT. Once these nodes alternate between normal and anomalous behavior, it is difficult for the network system to identify and isolate them in a short time, and the data transmission accuracy and the integrity of the network functions are negatively affected. Based on the characteristics of the IoT, a lightweight local outlier factor detection method is used for node detection. To further determine whether a node is anomalous, the time-varying behavior of the nodes is considered in this research, and a time series method is used so that the system responds effectively, within a short period of time, to the randomness and selectivity of anomalously behaving nodes. Simulation results show that the proposed method can improve the accuracy of the data transmitted by the network and achieve better performance.
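As a small illustration of the local-outlier-factor screening step only (the time series follow-up stage is not shown), the snippet below flags suspicious nodes from hypothetical per-node behavior metrics using scikit-learn's LocalOutlierFactor.

```python
# Illustrative sketch of local-outlier-factor node screening; the node metrics are
# synthetic assumptions and the paper's time-series stage is not reproduced.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# Per-node behavior features, e.g., [packet rate, forwarding ratio, energy drain].
normal_nodes = rng.normal(loc=[50, 0.95, 1.0], scale=[5, 0.02, 0.1], size=(95, 3))
anomalous_nodes = rng.normal(loc=[120, 0.60, 2.5], scale=[10, 0.05, 0.3], size=(5, 3))
X = np.vstack([normal_nodes, anomalous_nodes])

lof = LocalOutlierFactor(n_neighbors=20, contamination=0.05)
labels = lof.fit_predict(X)                 # -1 marks suspected anomaly nodes
print("flagged node indices:", np.where(labels == -1)[0])
# Flagged nodes would then be tracked over time before being isolated.
```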

20.
    
Nowadays, smart wearable devices that record human physiological data in real time are widely used in the Social Internet of Things (IoT). To protect the data privacy of smart devices, researchers are paying more attention to federated learning. Although the data leakage problem is thereby somewhat solved, a new challenge has emerged: asynchronous federated learning shortens the convergence time, but it suffers from time delay and data heterogeneity problems, both of which harm accuracy. To overcome these issues, we propose an asynchronous federated learning scheme based on double compensation to address the time delay and data heterogeneity problems. The scheme improves on the Delay Compensated Asynchronous Stochastic Gradient Descent (DC-ASGD) algorithm, which uses a second-order Taylor expansion as the delay compensation, and adds the FedProx operator to the objective function as the heterogeneity compensation. Besides, the proposed scheme motivates the federated learning process by adjusting the importance of the participants and the central server. We conduct multiple sets of experiments in both conventional and heterogeneous scenarios. The experimental results show that our scheme improves the accuracy by about 5% while keeping the complexity constant, and numerical experiments show that it converges more smoothly during training and adapts better to heterogeneous environments. The proposed double-compensation-based federated learning scheme is highly accurate, flexible in terms of participants, and smooths the training process. Hence it is deemed suitable for data privacy protection on smart wearable devices.
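To make the two compensation terms concrete, here is a schematic single update on a toy least-squares objective: a FedProx-style proximal pull during local steps and a DC-ASGD-style second-order correction applied to the stale gradient at the server. The model, learning rates, and scheduling are illustrative assumptions rather than the paper's scheme.

```python
# Schematic of the two compensations on a toy quadratic task; client weighting,
# scheduling, and the wearable-device model are simplifications, not the scheme.
import numpy as np

rng = np.random.default_rng(0)
dim, mu, lam, lr = 5, 0.1, 0.5, 0.05
w_server = np.zeros(dim)

A = rng.normal(size=(20, dim))                 # one client's local data
b = rng.normal(size=20)

def local_gradient(w):
    """Gradient of the client's local loss 0.5 * ||A w - b||^2."""
    return A.T @ (A @ w - b)

# The client downloads the model, which becomes stale while the server moves on.
w_stale = w_server.copy()
for _ in range(3):
    w_server -= lr * rng.normal(scale=0.1, size=dim)   # updates from other clients

# Client side: local steps with a FedProx proximal pull toward the downloaded model,
# which limits client drift under heterogeneous (non-IID) data.
w_local = w_stale.copy()
for _ in range(5):
    g = local_gradient(w_local) + mu * (w_local - w_stale)
    w_local -= lr * g

# Server side: the reported gradient was computed against a stale model, so apply a
# DC-ASGD-style compensation g + lam * g*g*(w_now - w_stale) before updating.
g_reported = local_gradient(w_local)
g_comp = g_reported + lam * g_reported * g_reported * (w_server - w_stale)
w_server -= lr * g_comp
print(np.round(w_server, 3))
```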

