The use of RESTful Web services has gained momentum in the development of distributed applications based on traditional Web standards such as HTTP. In particular, these services can integrate easily into various applications, such as mashups. Composing RESTful services into Web-scale workflows requires a lightweight composition language that's capable of describing both the control and data flow that constitute a workflow. The authors address these issues with Bite, a lightweight and extensible composition language that enables the creation of Web-scale workflows and uses RESTful services as its main composable entities.
Artificial Intelligence Review - Visual object tracking has become one of the most active research topics in computer vision, and it has been applied in several commercial...
Skin lesions have become a critical illness worldwide, and earlier identification of skin lesions from dermoscopic images can raise the survival rate. Classifying skin lesions from dermoscopic images is a tedious task, and the accuracy of classification is improved by the use of deep learning models. Recently, convolutional neural networks (CNNs) have become established in this domain; their feature-extraction techniques are well established and lead to enhanced classification. With this motivation, this study focuses on the design of artificial intelligence (AI) based solutions, particularly deep learning (DL) algorithms, to distinguish malignant skin lesions from benign lesions in dermoscopic images. It presents an automated skin lesion detection and classification technique, named OSSAE-BPNN, that combines an optimized stacked sparse autoencoder (OSSAE) based feature extractor with a backpropagation neural network (BPNN). The proposed technique uses a multi-level thresholding based segmentation method to detect the affected lesion region; the OSSAE feature extractor and BPNN classifier are then employed for skin lesion diagnosis. Moreover, the parameters of the SSAE model are tuned with the sea gull optimization (SGO) algorithm. To showcase the enhanced outcomes of the OSSAE-BPNN model, a comprehensive experimental analysis is performed on a benchmark dataset; the experimental findings demonstrate that the OSSAE-BPNN approach outperforms current strategies on several assessment metrics.
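The multi-level thresholding segmentation step can be sketched as a two-threshold Otsu search over an intensity histogram. This is a minimal sketch, not the paper's implementation: the bin count, the synthetic three-cluster "dermoscopic" intensities, and the assumption that the darkest class corresponds to the lesion are all illustrative.

```python
import numpy as np

def multilevel_threshold(image, n_bins=64):
    """Two-threshold Otsu: split pixel intensities into three classes
    by maximizing the between-class variance over all threshold pairs."""
    hist, edges = np.histogram(image, bins=n_bins, range=(0.0, 1.0))
    p = hist / hist.sum()                      # probability per bin
    centers = (edges[:-1] + edges[1:]) / 2
    mu_T = (p * centers).sum()                 # global mean intensity
    best, best_var = (0.0, 0.0), -1.0
    for t1 in range(1, n_bins - 1):
        for t2 in range(t1 + 1, n_bins):
            var = 0.0
            for lo, hi in ((0, t1), (t1, t2), (t2, n_bins)):
                w = p[lo:hi].sum()             # class weight
                if w > 0:
                    mu = (p[lo:hi] * centers[lo:hi]).sum() / w
                    var += w * (mu - mu_T) ** 2
            if var > best_var:
                best_var, best = var, (centers[t1], centers[t2])
    return best  # (lower threshold, upper threshold)

# Toy image: dark lesion (0.2), mid-tone skin (0.5), bright region (0.9)
rng = np.random.default_rng(0)
img = np.concatenate([np.full(100, 0.2), np.full(100, 0.5), np.full(100, 0.9)])
img = img + rng.normal(0, 0.02, img.shape)
t_lo, t_hi = multilevel_threshold(img)
lesion_mask = img < t_lo                       # darkest class = lesion region
print(round(t_lo, 2), round(t_hi, 2))
```

The exhaustive pair search is quadratic in the number of bins, which is acceptable for a coarse histogram; real pipelines typically use an optimized multi-Otsu routine.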
In the modern era, big data means massive datasets with complex and varied structures. These attributes pose obstacles to analyzing and storing the data to produce useful results, and privacy and security are major concerns in the domain of large-scale data analysis. In this paper, our foremost priority is the computing technologies that center on big data: the IoT (Internet of Things), Cloud Computing, Blockchain, and fog computing. Among these, Cloud Computing provides on-demand services to customers while optimizing cost; AWS, Azure, and Google Cloud are the major cloud providers today. Fog computing extends cloud computing systems by bringing services to the edges of the network, and the Internet of Things puts this into effect in collaboration with multiple technologies, given its significance in varied application domains. Blockchain is a distributed ledger that supports many applications, ranging from crypto-currency to smart contracts. The purpose of this paper is to present a critical analysis and review of these technologies in the context of existing large-scale data systems. We review existing criticism, address current threats to the security of large-scale data systems, and scrutinize security attacks on computing systems based on Cloud, Blockchain, IoT, and fog. The paper illustrates the different threat behaviours and their impacts on these complementary computing technologies. The authors also offer a concise analysis of cloud-based technologies and discuss their defense mechanisms and the security issues of mobile healthcare.
To design an efficient product family, designers have to anticipate the production process and, more generally, the supply chain costs. This is a difficult problem, and designers often propose a solution that is only subsequently evaluated in terms of logistical costs. This paper presents a design problem in which the product and the supply chain are designed at the same time. It consists in selecting a set of modules that will be manufactured at distant facilities and then shipped to a plant close to the market for final, customized assembly under time constraints. The goal is to obtain the bill of materials for all the items in the product family, each of which is made up of a set of modules, and to specify the location where each module will be built, so as to minimize the total production cost of the supply chain. For small instances, the study both analyzes the impact of fixed and variable costs on the optimal solutions and compares an integrated approach, which minimizes the total cost in a single model, with a two-phase approach in which the product-design and module-allocation decisions are made separately.
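The module-allocation subproblem can be sketched by exhaustive search on a toy instance. The cost figures below, the two-facility structure, and the rule that a facility's fixed cost is paid once if it builds anything are illustrative assumptions, not the paper's model (which also covers product-family design and time constraints).

```python
from itertools import product

# Hypothetical data: 3 modules, 2 distant facilities.
# var_cost[m][f]: production + shipping cost of module m at facility f
var_cost = [[4, 6],
            [7, 3],
            [5, 5]]
fixed_cost = [10, 8]          # one-off cost of opening each facility

def best_allocation(var_cost, fixed_cost):
    """Exhaustively assign every module to a facility, paying each
    facility's fixed cost once if it builds at least one module."""
    n_fac = len(fixed_cost)
    best = (float("inf"), None)
    for assign in product(range(n_fac), repeat=len(var_cost)):
        opened = set(assign)                    # facilities actually used
        cost = (sum(fixed_cost[f] for f in opened)
                + sum(var_cost[m][f] for m, f in enumerate(assign)))
        best = min(best, (cost, assign))
    return best

cost, assign = best_allocation(var_cost, fixed_cost)
print(cost, assign)
```

Note how the fixed costs couple the per-module decisions: module 0 is cheapest at facility 0, yet the optimum concentrates all modules at facility 1 to avoid opening both sites. Realistic instances would use a MILP solver rather than enumeration.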
Using the partitioned matrix approach, a parallel hardware architecture for a parametric (Bayes) classifier is designed. The architecture consists of simple, regularly structured processing elements operating in parallel, making the proposed design suitable for VLSI implementation. A comparative analysis shows that the approach is more efficient and can significantly reduce the cost of implementing the classifier while maintaining high speed.
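The discriminant such a parametric (Bayes) classifier evaluates can be sketched in NumPy. The paper's contribution is computing the quadratic form via partitioned matrices in hardware; the sketch below just evaluates it directly, on synthetic class parameters that are invented for illustration.

```python
import numpy as np

def bayes_discriminant(x, mean, cov, prior):
    """Gaussian (Bayes) discriminant g_i(x) = ln p(x|w_i) + ln P(w_i).
    The quadratic form (x-m)^T C^{-1} (x-m) is the matrix product a
    hardware implementation would partition across processing elements."""
    d = x - mean
    inv = np.linalg.inv(cov)
    return (-0.5 * d @ inv @ d
            - 0.5 * np.log(np.linalg.det(cov))
            + np.log(prior))

# Two synthetic classes with known (assumed) parameters
means = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
covs = [np.eye(2), np.eye(2) * 2.0]
priors = [0.5, 0.5]

x = np.array([2.8, 3.1])       # sample near the second class mean
scores = [bayes_discriminant(x, m, c, p)
          for m, c, p in zip(means, covs, priors)]
label = int(np.argmax(scores))
print(label)
```

Classification picks the class with the largest discriminant; since each discriminant is dominated by a matrix-vector product, the per-class evaluations map naturally onto parallel processing elements.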
Two computer programs were developed: program I to optimise insulin treatment using six injections per day, and program II to convert the resulting insulin profiles into less frequent injections of mixtures of regular and NPH insulin. The first program, running on an HP 41 CV pocket computer, uses iterative adjustments during the day and on subsequent days to determine the optimal timing and dosage of insulin. Six self-monitored glucose values at 3 h intervals, the insulin doses, and the effects of insulin on plasma glucose are stored for the calculations. The calculated insulin doses were delivered by Optipen as five s.c. injections of regular insulin and one bedtime injection of NPH insulin. After 5 days, the optimised individual insulin profiles with six daily injections were processed by program II, which applies the pharmacokinetics of regular and NPH insulin to suggest a more conventional insulin therapy with one, two, three, or four daily injections of regular insulin, NPH insulin, or varying mixtures of both. The procedure was well tolerated by eight out-patients with insulin-dependent diabetes mellitus. Insulin therapy with two or three injections, fitted by the second program and selected according to the quality score, produced plasma glucose profiles as satisfactory as those obtained with six injections. The system allows insulin therapy to be fine-tuned to out-patients with their individual diets and physical activities.
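The abstract does not state program I's actual update rule, so the sketch below is only a hedged illustration of the iterative idea: nudge each of the six doses in proportion to how far its glucose reading sits from target. The linear response model, the TARGET and SENSITIVITY constants, and the baseline glucose profile are all invented assumptions.

```python
# Hypothetical sketch of iterative day-by-day dose adjustment.
TARGET = 5.5          # mmol/L target for each 3-h self-monitored value
SENSITIVITY = 1.5     # assumed glucose drop (mmol/L) per extra insulin unit

def adjust_doses(doses, glucose, gain=0.5):
    """One daily iteration: nudge dose i toward correcting reading i."""
    return [max(0.0, d + gain * (g - TARGET) / SENSITIVITY)
            for d, g in zip(doses, glucose)]

def simulate_day(doses, baseline):
    """Toy model: each reading = untreated baseline minus insulin effect."""
    return [b - SENSITIVITY * d for b, d in zip(baseline, doses)]

baseline = [14.0, 12.0, 13.0, 11.0, 12.5, 10.0]   # untreated 3-h profile
doses = [0.0] * 6
for _ in range(30):                               # iterate over days
    glucose = simulate_day(doses, baseline)
    doses = adjust_doses(doses, glucose)

final = simulate_day(doses, baseline)
print([round(g, 1) for g in final])
```

With a gain below 1 the update is a contraction under this linear model, so the six doses converge to values that bring every reading to target; a real regimen would of course also respect pharmacokinetics and safety limits, as program II does.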
This paper investigates the utilization of wavelet filters via multistage convolution with Reverse Biorthogonal Wavelets (RBW) in the high- and low-pass frequency bands of a speech signal. The speech signal is decomposed into two frequency bands, high and low, and the noise is then removed in each band individually, in separate stages, via wavelet filters. This approach provides better outcomes because it does not cut away speech information, as occurs when utilizing conventional thresholding. We tested the proposed method with several noise probability distribution functions, and subjective evaluation was combined with objective evaluation for a thorough assessment. The method is simple but yields surprisingly high-quality results, showing superiority over the Donoho and Johnstone thresholding method and the Birge-Massart thresholding strategy.
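The band-wise idea can be sketched with a one-level Haar split standing in for the reverse biorthogonal filters, with a different soft-threshold applied in each band. The Haar substitution, the threshold values, and the toy sine "speech" signal are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def haar_split(x):
    """One-level Haar DWT: low-pass (approximation) and high-pass (detail)."""
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def haar_merge(low, high):
    """Inverse one-level Haar DWT."""
    x = np.empty(2 * low.size)
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

def soft(c, t):
    """Soft-threshold coefficients c at level t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(0)
n = 1024
clean = np.sin(2 * np.pi * 5 * np.arange(n) / n)   # stand-in "speech" tone
noisy = clean + rng.normal(0, 0.3, n)

low, high = haar_split(noisy)
# Per-band thresholds: gentle in the low band (carries most of the
# signal), aggressive in the high band (mostly noise at this scale).
denoised = haar_merge(soft(low, 0.05), soft(high, 0.6))

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
print(err_denoised < err_noisy)
```

Treating the bands separately is the point: a single global threshold strong enough to clean the high band would also erase low-band signal detail, which is the information loss the paper attributes to conventional thresholding.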
The novelty of the controlled diffusion solidification (CDS) process is the mixing of two precursor alloys with different thermal masses to obtain the resultant desired alloy, which is subsequently cast into a near-net-shaped product. The critical event in the CDS process is the ability to generate a favorable environment during the mixing of the two precursor alloys to enable a well-distributed and copious nucleation event of the primary Al phase, leading to a nondendritic morphology in the cast part. The turbulence dissipation energy, coupled with the undercooling of the higher-temperature precursor alloy, enables the copious nucleation events, which are well distributed in the resultant mixture.