This paper defines the concept of an agent-based time delay margin and computes its value in multi-agent systems under event-triggered control. The agent-based time delay margin, which specifies the time delay tolerance of each agent for ensuring consensus in event-triggered controlled multi-agent systems, can be considered complementary to the concept of the (network) time delay margin previously introduced in the literature. An event-triggered control method for achieving consensus in multi-agent systems with time delay is considered, and it is shown that Zeno behavior is excluded under this method. Then, for a multi-agent system controlled by the considered event-triggered method, the agent-based time delay margin in the presence of a fixed network delay is defined, and an algorithm for computing its value for each agent is proposed. Numerical simulation results are provided to verify the theoretical results.
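The event-triggered idea summarized above can be illustrated with a minimal sketch: agents broadcast their state only when a measurement error exceeds a state-dependent threshold, rather than continuously. The trigger rule, threshold constant `sigma`, and the absence of the paper's network delay are all simplifying assumptions here, not the paper's actual scheme.

```python
import numpy as np

def simulate_event_triggered_consensus(x0, adjacency, sigma=0.1, dt=0.01, steps=2000):
    """Single-integrator agents under a simple event-triggered consensus rule.
    Each agent re-broadcasts its state only when its error since the last
    broadcast exceeds a threshold. Illustrative sketch only; the paper's
    trigger condition and delay model differ."""
    x = np.array(x0, dtype=float)      # true states
    x_hat = x.copy()                   # last broadcast states
    L = np.diag(adjacency.sum(axis=1)) - adjacency  # graph Laplacian
    events = 0
    for _ in range(steps):
        u = -L @ x_hat                 # control uses broadcast states only
        x = x + dt * u
        err = np.abs(x - x_hat)
        thresh = sigma * np.abs(L @ x_hat) + 1e-4   # trigger threshold
        fired = err > thresh
        x_hat[fired] = x[fired]        # event: re-broadcast current state
        events += int(fired.sum())
    return x, events

# Four agents on a ring graph
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x_final, n_events = simulate_event_triggered_consensus([1.0, -2.0, 3.0, 0.5], A)
```

Because the threshold has a small positive floor, inter-event times stay bounded away from zero in this sketch, which is the intuition behind the Zeno-exclusion argument; events fire far less often than the 8000 control updates.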
Big data in the modern era involves massive datasets with complex and varied structures. These attributes hinder the analysis and storage needed to generate useful results, and privacy and security are major concerns in large-scale data analysis. In this paper, our foremost priority is the computing technologies surrounding big data: the Internet of Things (IoT), cloud computing, blockchain, and fog computing. Cloud computing provides on-demand services to customers while optimizing cost; AWS, Azure, and Google Cloud are the major cloud providers today. Fog computing extends cloud computing systems by delivering services at the edge of the network. The Internet of Things draws on multiple technologies to deliver advanced services across varied application domains. Blockchain is a distributed ledger that supports many applications, ranging from cryptocurrency to smart contracts. The aim of this paper is to present a critical analysis and review of existing large-scale data systems. We address existing threats to the security of such systems and scrutinize security attacks on computing systems based on cloud, blockchain, IoT, and fog. The paper illustrates different threat behaviours and their impacts on these complementary computing technologies, presents an analysis of cloud-based technologies, and discusses their defense mechanisms along with the security issues of mobile healthcare.
Wireless Networks - Inter-satellite data transmission links are crucial for providing global inter-connectivity. We report the design and investigation of high data rate inter-satellite...
The Journal of Supercomputing - Power consumption is likely to remain a significant concern for exascale performance in the foreseeable future. In addition, graphics processing units (GPUs) have...
The World Wide Web (WWW) comprises a wide range of information and operates mainly on keyword matching, which often hinders accurate information retrieval. Automatic query expansion is one of the primary methods in information retrieval; it handles the vocabulary mismatch problem that retrieval systems face when locating an appropriate document from keywords. This paper proposes a hybrid COOT-based Cat and Mouse Optimization (CMO) algorithm, named hybrid COOT-CMO, for selecting optimal candidate terms in the automatic query expansion process. To improve the accuracy of the CMO algorithm, its parameters are tuned with the help of the Coot algorithm. The best expanded query is identified from the available expanded query sets, also known as candidate query pools; all feasible combinations in this pool are obtained from the top retrieved documents. Benchmark datasets such as the GOV2 Test Collection, the Cranfield Collections, and the NTCIR Test Collection are used to assess the performance of the proposed hybrid COOT-CMO method for automatic query expansion. The proposed method surpasses existing state-of-the-art techniques on several performance measures, including F-score, precision, and mean average precision (MAP).
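The candidate-pool step described above, gathering expansion terms from top-retrieved documents, can be sketched with a simple pseudo-relevance-feedback scorer. The tf-idf-style scoring rule below is an illustrative assumption; the paper ranks and selects candidates with the hybrid COOT-CMO metaheuristic, not this heuristic.

```python
import math
from collections import Counter

def expansion_candidates(query_terms, top_docs, k=5):
    """Score candidate expansion terms drawn from top-retrieved documents
    (each doc is a list of tokens) using a tf-idf-style heuristic.
    Sketch of the candidate-pool step only, not the COOT-CMO selection."""
    query = set(query_terms)
    tf, df = Counter(), Counter()
    for doc in top_docs:
        for t in doc:
            tf[t] += 1          # term frequency across the pool
        for t in set(doc):
            df[t] += 1          # document frequency within the pool
    n = len(top_docs)
    scores = {}
    for t, f in tf.items():
        if t in query:
            continue            # only new terms can expand the query
        idf = math.log((n + 1) / (df[t] + 0.5))
        scores[t] = f * idf
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:k]]
```

In a full pipeline, the optimizer would search over combinations of these candidates (the candidate query pool) and keep the combination that maximizes a retrieval-quality objective such as MAP.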
While the internet has many positive impacts on society, it also has negative components. Pornography, accessible to everyone through online platforms, induces psychological and health-related issues among people of all ages. Although difficult, detecting pornography automatically is an important step in identifying porn and adult content in a video. In this paper, an architecture is proposed that yielded high scores for both training and testing. The dataset was produced from 190 videos, comprising more than 19 hours of footage. The main sources of content were YouTube, movies, torrents, and websites hosting both pornographic and non-pornographic material. The videos covered different ethnicities and skin colors, which helps the models generalize to any kind of video. VGG16, Inception V3, and ResNet-50 models were initially trained to detect pornographic images but failed to achieve high testing accuracy, with accuracies of 0.49, 0.49, and 0.78, respectively. Finally, using transfer learning, a convolutional neural network was designed that yielded an accuracy of 0.98.
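The transfer-learning pattern used above, freeze a pretrained feature extractor and train only a small classification head, can be sketched without any deep-learning framework. Here a fixed random projection stands in for the frozen convolutional backbone and logistic regression for the head; the data, dimensions, and "backbone" are all assumptions for illustration, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Frozen backbone": a fixed ReLU random projection standing in for
# pretrained conv features. It is never updated during training.
W_backbone = rng.normal(size=(64, 32)) / 8.0

def features(x):
    return np.maximum(x @ W_backbone, 0.0)

def train_head(X, y, lr=0.1, epochs=500):
    """Train only the classification head (logistic regression) on the
    frozen features, the essence of transfer learning."""
    F = features(X)
    w = np.zeros(F.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(F @ w + b)))   # sigmoid predictions
        g = p - y                                # logistic-loss gradient
        w -= lr * F.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

# Synthetic two-class data standing in for frame features.
X = rng.normal(size=(200, 64))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w, b = train_head(X, y)
p = 1.0 / (1.0 + np.exp(-(features(X) @ w + b)))
acc = float(((p > 0.5) == y).mean())
```

The same split, frozen weights plus a small trainable head, is what frameworks implement when a VGG16- or ResNet-style backbone is reused with `trainable = False` and only the final dense layers are fit to the new dataset.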
Data available in software engineering applications often contains variability, and it is not obvious which variables help in prediction. Most work in software defect prediction focuses on selecting the best prediction technique; for this purpose, deep learning and ensemble models have shown promising results. In contrast, few studies deal with cleaning the training data and selecting the best parameter values from it. Data available for training may have high variability, and this variability can decrease model accuracy. To address this problem, we used the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) to select the best variables for training the model. A simple ANN with one input layer, one output layer, and two hidden layers was used instead of a very deep and complex model. First, the variables were narrowed down using correlation values; then subsets for all possible variable combinations were formed. Finally, an artificial neural network (ANN) model was trained for each subset, and the best model was selected on the basis of the smallest AIC and BIC values. It was found that the combination of only two variables, ns and entropy, is best for software defect prediction, as it gives the minimum AIC and BIC values, while nm and npt is the worst combination, giving the maximum AIC and BIC values.
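The subset-selection loop described above can be sketched as exhaustive search over variable combinations scored by AIC and BIC. For a self-contained example, an ordinary least-squares fit with Gaussian likelihood stands in for the paper's small ANN, and the variable names and data are hypothetical.

```python
import itertools
import math
import numpy as np

def aic_bic_linear(X, y):
    """Gaussian AIC/BIC for an OLS fit (a stand-in for the paper's ANN)."""
    n, k = X.shape
    Xa = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xa, y, rcond=None)
    rss = float(((y - Xa @ beta) ** 2).sum())
    p = k + 2  # slopes + intercept + noise variance
    ll = -0.5 * n * (math.log(2 * math.pi * rss / n) + 1.0)
    return 2 * p - 2 * ll, p * math.log(n) - 2 * ll

def best_subset(X, y, names):
    """Try every non-empty variable subset; keep the one with the smallest
    combined AIC + BIC (one simple way to honor both criteria at once)."""
    best = None
    for r in range(1, len(names) + 1):
        for idx in itertools.combinations(range(len(names)), r):
            aic, bic = aic_bic_linear(X[:, idx], y)
            if best is None or aic + bic < best[0]:
                best = (aic + bic, [names[i] for i in idx])
    return best[1]
```

With m candidate variables the search visits 2^m - 1 subsets, which is why the paper first prunes variables by correlation before enumerating combinations.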
The effect of the initial annealing temperature on the evolution of microstructure and microhardness in high-purity OFHC Cu is investigated after processing by high-pressure torsion (HPT). Disks of Cu are annealed for 1 h at two temperatures, 400 and 800 °C, and then processed by HPT at room temperature under a pressure of 6.0 GPa for 1/4, 1/2, 1, 5, and 10 turns. Samples are stored for 6 months after HPT processing to examine self-annealing effects. Electron backscatter diffraction (EBSD) measurements are recorded for each disk at three positions: center, mid-radius, and near the edge. Microhardness measurements are also recorded along the diameter of each disk. Both materials show rapid hardening and then strain softening in the very early stages of straining due to self-annealing, with a clear delay in the onset of softening in the material initially annealed at 800 °C. This delay is attributed to the relatively larger initial grain size compared with the material initially annealed at 400 °C. The final microstructures consist of homogeneous fine grains with average sizes of ≈0.28 and ≈0.34 µm for the materials initially annealed at 400 and 800 °C, respectively. A new model is proposed to describe the hardness evolution during HPT in high-purity OFHC Cu.
In this paper, a new approach is proposed to investigate stability in the family of fractional-order linear time-invariant systems with order between 1 and 2. The proposed method relies on finding a linear ordinary system that possesses the same stability property as the fractional-order system. In this way, instead of performing the stability analysis on the fractional-order system, the analysis is transferred to the domain of ordinary systems, which is well established and well understood. As a useful consequence, two general tests for robust stability of ordinary systems are extended to fractional-order systems.
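As background (not the paper's construction), the standard eigenvalue criterion for commensurate fractional-order LTI systems, often attributed to Matignon, states the condition that any equivalent ordinary-system test must reproduce:

```latex
% For the commensurate fractional-order system
%   D^{\alpha} x(t) = A\,x(t), \qquad 0 < \alpha < 2,
% asymptotic stability holds if and only if every eigenvalue of A satisfies
\left|\arg\bigl(\lambda_i(A)\bigr)\right| > \alpha\,\frac{\pi}{2},
\qquad i = 1, \dots, n .
```

For $1 < \alpha < 2$ this stability sector is narrower than the open left half-plane used for ordinary systems ($\alpha = 1$), which is why a direct reuse of classical tests fails and a stability-equivalent ordinary system is useful.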