The Journal of Supercomputing - Currently, all online social networks (OSNs) are considered to follow a power-law distribution. In this paper, the degree distribution for multiple OSNs has been...
Floods are common and recurring natural hazards whose damage is destructive to society. Several regions of the world with different climatic conditions face floods of different magnitudes. Here we estimate flood susceptibility using artificial neural network (ANN), deep learning neural network (DLNN) and Deep Boost (DB) algorithms. We also attempt to estimate the future rainfall scenario using a general circulation model (GCM) and its ensemble. The representative concentration pathway (RCP) scenarios are employed to estimate future rainfall in a more authentic way. All models were validated using several indices, and the results show that the DB model is optimal compared with the other models. According to the DB model, the spatial coverage of the very low, low, moderate, high and very high flood-prone regions is 68.20%, 9.48%, 5.64%, 7.34% and 9.33%, respectively. The approach and results of this research would be beneficial for decision-making in managing this natural hazard more efficiently.
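The susceptibility-class shares reported for the DB model can be tabulated and sanity-checked directly from the abstract; the snippet below only uses the percentages stated above (rounding leaves the total at 99.99%):

```python
# Spatial coverage of flood-susceptibility classes reported for the DB model.
db_coverage = {
    "very low": 68.20,
    "low": 9.48,
    "moderate": 5.64,
    "high": 7.34,
    "very high": 9.33,
}

# The class shares should cover the whole study area; per-class rounding
# in the source leaves ~0.01% slack.
total = sum(db_coverage.values())
assert abs(total - 100.0) < 0.1

# Fraction of the study area in the two most flood-prone classes.
high_risk = db_coverage["high"] + db_coverage["very high"]
print(f"total = {total:.2f}%, high + very high = {high_risk:.2f}%")
```

This makes the headline finding explicit: although most of the area (68.20%) is classed as very low susceptibility, roughly one sixth of it falls in the high or very high classes.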
A series of studies of top European and other firms has revealed patterns of design management associated with commercial success. Firms that invest resources and professional expertise in product and industrial design in traditional and new industries have been commercially more successful than firms that pay less attention to these aspects of design. As an industry matures there is a shift in emphasis from design associated with technological innovation, to designs supporting technical improvements, and then to supporting user needs, fashion and product variants. These issues are illustrated through the history of the evolution of the bicycle.
BACKGROUND: The enterococci have become important nosocomial pathogens. They can cause infections at multiple sites, and enterococcal bacteremia is increasingly associated with a high mortality rate. Previous studies of enterococcal bacteremia have shown varied results. This retrospective study was performed to establish the significance of enterococci as nosocomial pathogens in this hospital, to characterize their clinical pictures, and to identify risk factors for mortality. METHODS: There were 208 cases of enterococcal bacteremia from 1988 to 1992. Twenty-seven cases had no medical charts, precluding evaluation; the remaining 181 cases were analysed. RESULTS: One hundred and eighteen episodes were nosocomial infections. Polymicrobial bacteremia occurred in 68.5% of the patients, and the most common co-isolate was Pseudomonas aeruginosa. Underlying diseases were present in 78.5% of patients, with malignancies the most common underlying problem. The portal of entry could be identified in 69.6 percent of patients, with the gastrointestinal tract the most common source. Antimicrobial susceptibility testing showed a high gentamicin resistance rate (89.5%), while about 80 percent of isolates remained susceptible to ampicillin. The group that received specific antibiotic therapy for enterococcus showed lower mortality (36.4% versus 47.6%). Only one case had infective endocarditis. Forty-nine patients suffered from septic shock, which caused 30 deaths. In total, 75 patients died during hospitalization. Besides sepsis, the other major cause of death was the underlying diseases themselves. CONCLUSIONS: Enterococci have undoubtedly become important nosocomial pathogens, and enterococcal bacteremia was associated with high mortality, especially in elderly patients with underlying diseases such as malignancy or diabetes.
When clinically dealing with sepsis originating from the gastrointestinal or biliary tract, especially when previous cephalosporin therapy has produced no response, the possibility of enterococcal bacteremia should always be considered.
Sintering and grain growth of nano-crystalline undoped ZnO have been studied in detail over a wide range of temperatures and holding times. Below 800 °C, sintering beyond 70% of theoretical density is not observed, irrespective of particle size. At 900 °C for 6 h, the nano-crystalline sample sinters to 99% of theoretical density, whereas the as-received sample reaches 93% of theoretical density. At 1300 °C or higher, however, densification is much faster and after a few hours becomes independent of holding time. Grain growth studies reveal similar saturation behavior with holding time. The average saturated grain size is found to be ∼1.5 and ∼2.2 μm at 800 and 900 °C, respectively, while at 1300 °C or higher it is between 12 and 13 μm.
This paper uses three recently generated southern African satellite burned area products for the month of September 2000 in a sensitivity study of regional biomass burning emissions for a number of trace gases and particulates. Differences in the extent and location of areas burned among products generated from Moderate Resolution Imaging Spectroradiometer (MODIS), Systeme Pour l'Observation de la Terre (SPOT-VEGETATION), and Along Track Scanning Radiometer (ATSR-2) data are significant and result in different emissions estimates for woodland and grassland land cover types. Due to the different emission profiles in woodlands and grasslands, favoring relatively more products of incomplete combustion in woodlands compared with products of complete combustion in grasslands in the late dry season, these changes are not proportional to the differences in the burned area amounts. The importance of accurate burned area information not just in terms of the total area but also in terms of its spatial distribution becomes apparent from our modeling results. This paper highlights the urgent need for satellite data producers to provide accuracy assessments associated with satellite-derived products. Preferably, these accuracy data will be spatially explicit, or defined in a way that can be applied in a spatially explicit modeling context, to enable emissions uncertainties to be defined with respect to different landscape units in support of greenhouse gas emissions reporting.
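Emission estimates of this kind conventionally follow the standard bottom-up form: emitted mass = burned area × fuel load × combustion completeness × emission factor. The sketch below illustrates why burned-area errors propagate differently by land cover; all numerical values are hypothetical placeholders, not results from this study.

```python
# Standard bottom-up biomass-burning emission estimate:
#   emission = burned_area * fuel_load * combustion_completeness * emission_factor
# All parameter values below are hypothetical illustrations only.

def emissions_g(burned_area_m2, fuel_load_g_m2,
                combustion_completeness, emission_factor_g_kg):
    """Emitted mass of a trace gas in grams."""
    dry_matter_burned_kg = (burned_area_m2 * fuel_load_g_m2
                            * combustion_completeness) / 1000.0
    return dry_matter_burned_kg * emission_factor_g_kg

# Woodlands favor products of incomplete combustion (e.g. CO) relative to
# grasslands in the late dry season, so the same burned-area discrepancy
# between satellite products shifts the CO total non-proportionally.
co_woodland  = emissions_g(1e6, 400.0, 0.4, 65.0)  # hypothetical woodland CO
co_grassland = emissions_g(1e6, 250.0, 0.9, 60.0)  # hypothetical grassland CO
print(co_woodland, co_grassland)
```

Because the fuel load, combustion completeness, and emission factor differ between the two cover types, attributing the same burned pixel to woodland versus grassland yields different emissions, which is why the spatial distribution of burned area matters, not just its total.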
We are interested in deciding if a given nonassociative polynomial h is an identity for a variety of nonassociative algebras. We present an algorithm for constructing a certain homomorphic image of a free nonassociative algebra which is sufficient to answer the question. The algorithm resembles dynamic programming in that the algebra is built by constructing a sequence of subspaces; the basis of each subspace is determined by the basis of previous subspaces. The number of arithmetic operations required to construct the algebra is bounded by a polynomial in the degree of h and the dimension of the resulting algebra.
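As a toy illustration of the underlying question (not the paper's algorithm), one can test whether a candidate multilinear identity holds in a concrete nonassociative algebra by evaluating it on all tuples of basis elements. Here the algebra is R^3 under the cross product, which satisfies anticommutativity and the Jacobi identity but is not associative:

```python
# Toy illustration (not the paper's construction): checking a candidate
# identity in a concrete nonassociative algebra, namely R^3 with the
# cross product. A multilinear identity holds in the algebra iff it
# vanishes on every tuple of basis vectors.
import itertools

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def add(u, v):
    return tuple(x + y for x, y in zip(u, v))

basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

def holds_on_basis(identity, arity):
    """True iff the multilinear identity vanishes on all basis tuples."""
    return all(identity(*args) == (0, 0, 0)
               for args in itertools.product(basis, repeat=arity))

# Jacobi identity: (x*y)*z + (y*z)*x + (z*x)*y = 0  -- holds for the cross product.
jacobi = lambda x, y, z: add(add(cross(cross(x, y), z),
                                 cross(cross(y, z), x)),
                             cross(cross(z, x), y))

# Associator: (x*y)*z - x*(y*z)  -- does not vanish, so the algebra is nonassociative.
assoc = lambda x, y, z: add(cross(cross(x, y), z),
                            tuple(-c for c in cross(x, cross(y, z))))

print(holds_on_basis(jacobi, 3), holds_on_basis(assoc, 3))
```

The paper's contribution is to make this kind of check feasible for a whole variety rather than a single fixed algebra, by building a suitable finite-dimensional homomorphic image of the free algebra subspace by subspace.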