Similar literature
 20 similar documents found (search time: 46 ms)
1.
Artificial intelligence methods are today extensively used in many areas and are known as powerful tools for solving engineering problems with uncertainties. The purpose of this study was to develop a model, using artificial intelligence methods, for estimating the air-demand ratio in venturi weirs. For this aim, Adaptive Network-based Fuzzy Inference System (ANFIS) and Artificial Neural Network (ANN) methods were used. The test results revealed that the ANFIS model predicted the measured values with higher accuracy than the ANN model. Average correlation coefficients (R²) for the ANFIS models were 0.9623 for β = 0.75 and 0.9666 for β = 0.50. The extremely good agreement between the predicted and measured values confirms that the ANFIS model can be successfully used to predict the air-demand ratio in venturi weirs.
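Not from the paper, but as a quick illustration of the reported metric: an R² value of the kind quoted above can be reproduced for any pair of measured/predicted series with a short coefficient-of-determination routine (the sample data below are invented):

```python
def r_squared(measured, predicted):
    """Coefficient of determination between measured values and model output."""
    n = len(measured)
    mean_m = sum(measured) / n
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    return 1.0 - ss_res / ss_tot

# Hypothetical air-demand ratios: measured vs. model-predicted.
measured = [0.42, 0.55, 0.61, 0.70, 0.78]
predicted = [0.44, 0.53, 0.63, 0.69, 0.79]
print(round(r_squared(measured, predicted), 3))
```

Note that some papers instead report the squared Pearson correlation as "R²"; the two agree for an unbiased linear fit.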

2.
An accurate contour estimation plays a significant role in the classification and estimation of the shape, size, and position of thyroid nodules. This helps to reduce the number of false positives and improves the accurate detection and efficient diagnosis of thyroid nodules. This paper introduces an automated delineation method that integrates spatial information with neutrosophic clustering and level sets for accurate and effective segmentation of thyroid nodules in ultrasound images. The proposed delineation method, named Spatial Neutrosophic Distance Regularized Level Set (SNDRLS), is based on Neutrosophic L-Means (NLM) clustering, which incorporates spatial information for level-set evolution. The SNDRLS takes as input a rough estimation of the region of interest (ROI) provided by Spatial NLM (SNLM) clustering for precise delineation of one or more nodules. The performance of the proposed method is compared with level set, NLM clustering, Active Contour Without Edges (ACWE), Fuzzy C-Means (FCM) clustering and neutrosophic-based watershed segmentation methods using the same image dataset. To validate the SNDRLS method, manual demarcations from three expert radiologists are employed as ground truth. The SNDRLS yields the closest boundaries to the ground truth compared to the other methods, as revealed by six assessment measures (true positive rate 95.45 ± 3.5%, false positive rate 7.32 ± 5.3%, overlap 93.15 ± 5.2%, mean absolute distance 1.8 ± 1.4 pixels, Hausdorff distance 0.7 ± 0.4 pixels and Dice metric 94.25 ± 4.6%). The experimental results show that the SNDRLS is able to delineate multiple nodules in thyroid ultrasound images accurately and effectively. The proposed method achieves automated nodule boundaries even for low-contrast, blurred, and noisy thyroid ultrasound images without any human intervention. Additionally, the SNDRLS has the ability to determine the controlling parameters adaptively from SNLM clustering.

3.
Applied Ergonomics, 2011, 42(1): 71-75
The amount of sleep obtained between shifts is influenced by numerous factors including the length of work and rest periods, the timing of the rest period relative to the endogenous circadian cycle and personal choices about the use of non-work time. The current study utilised a real-world live-in mining environment to examine the amount of sleep obtained when access to normal domestic, family and social activities was restricted. Participants were 29 mining operators (26 male, average age 37.4 ± 6.8 years) who recorded sleep, work and fatigue information and wore an activity monitor for a cycle of seven day shifts and seven night shifts (both 12 h) followed by either seven or fourteen days off. During the two weeks of work participants lived on-site. Total sleep time was significantly less (p < 0.01) while on-site on both day (6.1 ± 1.0 h) and night shifts (5.7 ± 1.5 h) than days off (7.4 ± 1.4 h). Further, night shift sleep was significantly shorter than day-shift sleep (p < 0.01). Assessment of subjective fatigue ratings showed that the sleep associated with both days off and night shifts had a greater recovery value than sleep associated with day shifts (p < 0.01). While on-site, participants obtained only 6 h of sleep indicating that the absence of competing domestic, family and social activities did not convert to more sleep. Factors including shift start times and circadian influences appear to have been more important.

4.
Staphylococcus aureus sortase A is an attractive target in Gram-positive bacteria that plays a crucial role in anchoring surface proteins to the peptidoglycan of the bacterial cell wall. Inhibiting sortase A is an elementary and essential step in preventing pathogenesis. In this context, in silico virtual screening of an in-house database was performed using a ligand-based pharmacophore model as a filter. The developed pharmacophore model AAHR 11 consists of two acceptor, one hydrophobic and one ring-aromatic feature. The top-ranked molecule, KKR1, was docked into the active site of the target and, after analysis, was optimized based on its binding-pose orientation. The upgraded version of KKR1, named KKR2, showed an improved docking score, better binding interactions and the best fit in the binding pocket. KKR1 and KKR2 were further validated using 100 ns molecular dynamics studies. Both KKR1 and KKR2 contain an indole-thiazolidine moiety and were synthesized. The disk diffusion assay gave good initial results (zones of inhibition for KKR1 and KKR2 were 24 and 38 mm at 10 μg/mL; that of ampicillin was 22 mm at 10 μg/mL) and the calculated MICs (KKR1 5.56 ± 0.28 μg/mL, KKR2 1.32 ± 0.12 μg/mL, ampicillin 8 ± 1.1 μg/mL) were in good agreement with the standard drug ampicillin. KKR1 showed an IC50 of 1.23 ± 0.14 μM, whereas the optimized lead molecule KKR2 showed an IC50 of 0.008 ± 0.07 μM. The in silico results were validated by the in vitro studies, demonstrating that indole-thiazolidine molecules would be useful for future development as lead molecules against S. aureus sortase A.

5.
Leishmaniasis is a neglected tropical disease caused by several species of Leishmania. Being an opportunistic lipid-scavenging pathogen, Leishmania relies extensively on lipid metabolism, especially for host–pathogen interaction, utilizing host lipids for energy and virulence. A rational approach is to target the lipid metabolism of the pathogen, focusing on lipid-catabolizing lipases. The LdLip3 lipase is considered a drug target as it is constitutively expressed in both the promastigote and amastigote forms. Since the LdLip3 structure is not known, we modeled its three-dimensional structure to implement a structure-based drug discovery approach. Similarity-based virtual screening was carried out to identify potential inhibitors using the NCI diversity set from the ZINC database, including natural products. Combining computational and experimental approaches, four anti-leishmanial agents were discovered. The screened molecules ZINC01821375, ZINC04008765, ZINC06117316 and ZINC12653571 had anti-leishmanial activity with IC50 values (% viable promastigotes vs. concentration) of 5.2 ± 1.8 μM, 13.1 ± 2.6 μM, 9.4 ± 2.6 μM and 17.3 ± 3.1 μM, respectively. The molecules showed negligible toxicity toward mouse macrophages. Based on contact footprinting analysis, new molecules were designed with better predicted free energies of binding than the discovered anti-leishmanial agents. Further validation of the therapeutic utility of the discovered molecules can be carried out by the research community to combat leishmaniasis.

6.
In this paper, a quadtree-based mesh generation method is described that creates guaranteed-quality, geometry-adapted all-quadrilateral (all-quad) meshes with feature preservation for arbitrary planar domains. Given a point cloud, our method generates all-quad meshes with these points as vertices and all angles within [45°, 135°]. For given planar curves, the quadtree-based spatial decomposition is governed by the curvature of the boundaries and narrow regions. 2-refinement templates are chosen for local mesh refinement without creating any hanging nodes. A buffer zone is created by removing elements around the boundary. To guarantee the mesh quality, the angles facing the boundary are improved via template implementation, and two buffer layers are inserted in the buffer zone. It is proved that all the elements of the final mesh are quads with angles between 45° − ε and 135° + ε (ε ≤ 5°), with the exception of badly shaped elements that may be required by sharp angles in the input geometry. We also prove that the scaled Jacobians defined by two edge vectors are in the range [sin(45° − ε), sin 90°], or [0.64, 1.0]. Furthermore, sharp features and narrow regions are detected and preserved automatically. Boundary-layer meshes are generated by splitting elements of the second buffer layer. We have applied our algorithm to a set of complicated geometries, including the Lake Superior map and an airfoil with multiple components.
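The scaled-Jacobian bound quoted above ([sin(45° − ε), sin 90°] ≈ [0.64, 1.0]) is simply the sine of the corner angle formed by the two edge vectors. A minimal sketch (ours, not the paper's code):

```python
import math

def scaled_jacobian(e1, e2):
    """Scaled Jacobian of a quad corner defined by two 2-D edge vectors:
    the cross product normalized by the edge lengths, i.e. the sine of
    the angle between the edges."""
    cross = e1[0] * e2[1] - e1[1] * e2[0]
    return cross / (math.hypot(*e1) * math.hypot(*e2))

print(scaled_jacobian((1.0, 0.0), (0.0, 1.0)))  # right angle -> 1.0
print(scaled_jacobian((1.0, 0.0), (1.0, 1.0)))  # 45-degree corner -> sin 45
```

A 40° corner (45° minus the maximal ε = 5°) gives sin 40° ≈ 0.64, matching the paper's lower bound.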

7.
This study utilized an external logger system for onsite measurements of the computer activities of two professional groups—twelve university administrators and twelve computer-aided design (CAD) draftsmen. Computer use of each participant was recorded for 10 consecutive days—an average of 7.9 ± 1.8 workdays and 7.8 ± 1.5 workdays for administrators and draftsmen, respectively. Quantitative parameters computed from the recorded data were daily dynamic duration (DD) and static duration, daily keystrokes, mouse clicks, wheel scrolling counts, mouse movement and dragged distance, average typing and clicking rates, and average time holding down keys and mouse buttons. Significant group differences existed in the number of daily keystrokes (p < 0.0005) and mouse clicks (p < 0.0005), mouse distance moved (p < 0.0005), typing rate (p < 0.0001), daily mouse DD (p < 0.0001), and keyboard DD (p < 0.005). Both groups had significantly longer mouse DD than keyboard DD (p < 0.0001). Statistical analysis indicates that the duration of computer use for different computer tasks cannot be represented by a single formula with the same set of quantitative parameters as those associated with mouse and keyboard activities. The results of this study demonstrate that computer exposure during different tasks cannot be estimated solely by computer use duration. Quantification of onsite computer activities is necessary when determining the computer-associated risk of musculoskeletal disorders. Other significant findings are discussed.

8.
Applied Ergonomics, 2011, 42(1): 91-97
The purpose of this study was to assess the sleep quality and comfort of participants diagnosed with low back pain and stiffness following sleep on individually prescribed mattresses based on dominant sleeping positions. Subjects consisted of 27 patients (females, n = 14; males, n = 13; age 44.8 yrs ± SD 14.6, weight 174 lb ± SD 39.6, height 68.3 in. ± SD 3.7) referred by chiropractic physicians for the study. For the baseline (pretest) data, subjects recorded back and shoulder discomfort, sleep quality and comfort on visual analog scales (VAS) for 21 days while sleeping in their own beds. Subsequently, participants' beds were replaced by medium-firm mattresses specifically layered with foam and latex based on each participant's reported prominent sleeping position, and they again rated their sleep comfort and quality daily for the following 12 weeks. Analysis yielded significant differences between pre- and post-test means for all variables, and for back pain we found significant (p < 0.01) differences between the first post-test mean and those of week 4 and weeks 8–12, indicating progressive improvement in both back pain and stiffness while sleeping on the new mattresses. Additionally, the number of days per week of experiencing poor sleep and physical discomfort decreased significantly. It was concluded that sleep surfaces are related to sleep discomfort and that it is indeed possible to reduce pain and discomfort and to increase sleep quality in those with chronic back pain by replacing mattresses based on sleeping position.

9.
Context: Defect prediction research mostly focuses on optimizing the performance of models constructed for isolated projects (i.e. within-project (WP)) through retrospective analyses. On the other hand, recent studies try to utilize data across projects (i.e. cross-project (CP)) for building defect prediction models for new projects. There are no cases where combinations of within- and cross-project (i.e. mixed) data are used together. Objective: Our goal is to investigate the merits of using mixed project data for binary defect prediction. Specifically, we want to check whether it is feasible, in terms of defect detection performance, to use data from other projects (i) when there is an existing within-project history and (ii) when within-project data are limited. Method: We use data from 73 versions of 41 publicly available projects. We simulate the two above-mentioned cases and compare the performance of naive Bayes classifiers using within-project data vs. mixed project data. Results: For the first case, we find that the performance of mixed project predictors significantly improves over full within-project predictors (p-value < 0.001), although the effect size is small (Hedges' g = 0.25). For the second case, we found that mixed project predictors are comparable to full within-project predictors while using only 10% of the available within-project data (p-value = 0.002, g = 0.17). Conclusion: We conclude that the extra effort associated with collecting data from other projects is not justified, in terms of practical performance improvement, when there is already an established within-project defect predictor using the full project history. However, when project history is limited, e.g. in early phases of development, mixed project predictions are justifiable as they perform as well as full within-project models.
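The study's classifier is naive Bayes. As a hedged sketch of how a mixed-project (WP + CP) predictor of this kind can be trained, here is a minimal Gaussian naive Bayes on invented metric rows (the feature names and values below are hypothetical, not the study's dataset):

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes for binary defect labels (0/1)."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for row, label in zip(X, y):
            groups[label].append(row)
        self.stats, self.priors = {}, {}
        for label, rows in groups.items():
            n = len(rows)
            self.priors[label] = n / len(X)
            cols = list(zip(*rows))
            # Per-feature (mean, variance); floor variance to avoid div-by-zero.
            self.stats[label] = [
                (sum(c) / n, max(sum((v - sum(c) / n) ** 2 for v in c) / n, 1e-9))
                for c in cols
            ]
        return self

    def predict(self, row):
        best, best_lp = None, -math.inf
        for label, stats in self.stats.items():
            lp = math.log(self.priors[label])
            for x, (mu, var) in zip(row, stats):
                lp += -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

# Hypothetical metric rows [lines_of_code, complexity]; WP and CP data mixed.
wp_X = [[120, 4], [300, 9], [90, 3], [450, 14]]
wp_y = [0, 1, 0, 1]
cp_X = [[200, 5], [500, 16]]
cp_y = [0, 1]
model = GaussianNB().fit(wp_X + cp_X, wp_y + cp_y)
print(model.predict([420, 13]))
```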

10.
The development of a thermal switch based on arrays of liquid–metal micro-droplets is presented. Prototype thermal switches are assembled from a silicon substrate on which is deposited an array of 1600 30-μm liquid–metal micro-droplets. The liquid–metal micro-droplet array makes and breaks contact with a second bare silicon substrate. A gap between the two silicon substrates is filled with either air at 760 Torr, air at 0.5 Torr or xenon at 760 Torr. Heat transfer and thermal resistance across the thermal switches are measured for "on" (make contact) and "off" (break contact) conditions using guard-heated calorimetry. The figure of merit for a thermal switch, the ratio of "off" state thermal resistance to "on" state thermal resistance, Roff/Ron, is 129 ± 43 for a xenon-filled thermal switch that opens 100 μm and 60 ± 17 for a 0.5 Torr air-filled thermal switch that opens 25 μm. These thermal resistance ratios are shown to be markedly higher than values of Roff/Ron for a thermal switch based on contact between polished silicon surfaces. Transient temperature measurements for the liquid–metal micro-droplet switches indicate thermal switching times of less than 100 ms. Switch lifetimes are found to exceed one million cycles.

11.
Tri-o-thymotide (I) has been used as an electroactive material in a PVC (poly(vinyl chloride)) matrix for the fabrication of a chromium(III)-selective sensor. The membrane containing tri-o-thymotide, sodium tetraphenyl borate (NaTPB), dibutyl phthalate (DBP) and PVC in the optimum ratio 5:1:75:100 (w/w) exhibits a working concentration range of 4.0 × 10⁻⁶ to 1.0 × 10⁻¹ M with a Nernstian slope of 20.0 ± 0.1 mV/decade of activity in the pH range 2.8–5.1. The detection limit of this sensor is 2.0 × 10⁻⁷ M. The electrode exhibits a fast response time of 15 s, shows good selectivity towards Cr³⁺ over a number of mono-, bi- and trivalent cations, and can also be used in partially non-aqueous media (up to 15%, v/v). The assembly has been successfully used as an indicator electrode in the potentiometric titration of chromium(III) against EDTA and to determine Cr(III) quantitatively in electroplating industry waste samples.
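The reported slope of 20.0 mV/decade can be checked against the theoretical Nernstian slope 2.303RT/(zF) for a trivalent ion (this check is ours, not from the paper):

```python
# Theoretical Nernstian slope, s = 2.303*R*T/(z*F), in mV per decade of activity.
R, F = 8.314, 96485.0  # gas constant J/(mol*K), Faraday constant C/mol

def nernst_slope_mV(z, temp_c=25.0):
    """Slope of an ideal ion-selective electrode for an ion of charge z."""
    T = temp_c + 273.15
    return 2.303 * R * T / (z * F) * 1000.0

print(round(nernst_slope_mV(3), 1))  # trivalent ion such as Cr3+, ~19.7 mV/decade
print(round(nernst_slope_mV(1), 1))  # monovalent ion, the familiar ~59.2 mV/decade
```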

12.
Intensive care is one of the most important components of the modern medical system. Healthcare professionals need to utilize intensive care resources effectively. Mortality prediction models help physicians decide which patients require intensive care the most and which do not. The Simplified Acute Physiology System 2nd version (SAPS II) is one of the most popular mortality scoring systems currently available. This study retrospectively collected data on 496 patients admitted to intensive care units from year 2000 to 2001. The average patient age was 59.96 ± 1.83 years old and 23.8% of patients died before discharge. We used these data as training data and constructed an exponential Bayesian mortality prediction model by combining BSM (Bayesian statistical model) and GA (genetic algorithm). The optimal weights and the parameters were determined with GA. Furthermore, we prospectively collected data on 142 patients for testing the new model. The average patient age for this group was 57.80 ± 3.33 years old and 21.8% patients died before discharge. The mortality prediction power of the new model was better than SAPS II (p < 0.001). The new model combining BSM and GA can manage both binary data and continuous data. The mortality rate is predicted to be high if the patient’s Glasgow coma score is less than 5.  相似文献   

13.
Application of the sustainability concept to environmental projects implies that at least three feature categories (i.e., economic, social, and environmental) must be taken into account by applying a participative multi-criterion analysis (MCA). However, MCA results depend crucially on the methodology applied to estimate the relative criterion weights. By using a logically consistent set of data and methods (i.e., linear regression [LR], factor analysis [FA], the revised Simos procedure [RSP], and the analytical hierarchy process [AHP]), the present study revealed that mistakes from using one weight-estimation method rather than an alternative are non-significant in terms of satisfaction of specified acceptable standards (i.e., a risk of up to 1% of erroneously rejecting an option), but significant for comparisons between options (i.e., a risk of up to 11% of choosing a worse option by rejecting a better option). In particular, the risks of these mistakes are larger if both differences in statistical or computational algorithms and in data sets are involved (e.g., LR vs. AHP). In addition, the present study revealed that the choice of weight-estimation methods should depend on the estimated and normalised score differences for the economic, social, and environmental features. However, on average, some pairs of weight-estimation methods are more similar (e.g., AHP vs. RSP and LR vs. AHP are the most and the least similar, respectively), and some single weight-estimation methods are more reliable (i.e., FA > RSP > AHP > LR).
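As context for the AHP weight-estimation method compared above: in AHP, criterion weights are the principal eigenvector of a pairwise comparison matrix. A minimal sketch via power iteration, using an invented, perfectly consistent 3 × 3 matrix (not the study's data):

```python
def ahp_weights(M, iters=100):
    """Principal-eigenvector criterion weights from a pairwise comparison
    matrix, computed by power iteration and normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical comparisons: economic vs. social vs. environmental criteria.
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
print([round(x, 3) for x in ahp_weights(M)])  # weights 4/7, 2/7, 1/7
```

For an inconsistent matrix the same routine still converges to the principal eigenvector; the consistency ratio would then be checked separately.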

14.
The implicit Colebrook–White equation has been widely used to estimate the friction factor for turbulent fluid flow in rough pipes. In this paper, a state-of-the-art review of the most currently available explicit alternatives to the Colebrook–White equation is presented. An extensive comparison test was established on a 20 × 500 grid for a wide range of relative roughness (ε/D) and Reynolds number (R) values (1 × 10⁻⁶ ≤ ε/D ≤ 5 × 10⁻²; 4 × 10³ ≤ R ≤ 10⁸), covering a large portion of the turbulent flow zone in Moody's diagram. Based on a comprehensive error analysis, the points at which the maximum absolute and the maximum relative errors occur, for each pair of ε/D and R values, are observed. Most of these approximations provide friction factor estimates characterized by a mean absolute error of 5 × 10⁻⁴ and a maximum absolute error of 4 × 10⁻³, with a mean relative error of 1.3% and a maximum relative error of 5.8%, over the entire range of ε/D and R values. For practical purposes, the complete results for the maximum and mean relative errors versus the 20 sets of ε/D values are also shown in two comparative figures. The examination of the error properties of these approximations gives one an opportunity to evaluate practically the most accurate formula among all the previous explicit models, showing in this way its great flexibility for estimating the turbulent flow friction factor. Comparative analysis of the mean relative error profile revealed that the classification of the six best-fitted equations examined was in good agreement with the best model selection criterion claimed in the recent literature, for all performed simulations.
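The paper does not name its candidate formulas here; as one well-known example of the genre, the explicit Swamee–Jain approximation can be checked against the implicit Colebrook–White equation solved by fixed-point iteration (our sketch, not the paper's code):

```python
import math

def colebrook(rel_rough, Re, iters=50):
    """Implicit Colebrook-White friction factor via fixed-point iteration:
    1/sqrt(f) = -2 log10(eps/D/3.7 + 2.51/(Re sqrt(f)))."""
    f = 0.02  # reasonable turbulent starting guess
    for _ in range(iters):
        f = (-2.0 * math.log10(rel_rough / 3.7 + 2.51 / (Re * math.sqrt(f)))) ** -2
    return f

def swamee_jain(rel_rough, Re):
    """Explicit Swamee-Jain approximation to the Colebrook-White equation."""
    return 0.25 / math.log10(rel_rough / 3.7 + 5.74 / Re ** 0.9) ** 2

f_exact = colebrook(1e-4, 1e6)
f_appr = swamee_jain(1e-4, 1e6)
print(abs(f_appr - f_exact) / f_exact < 0.01)  # within ~1% at this point
```

The sub-1% relative error at this sample point is consistent with the error magnitudes the review reports for the better explicit approximations.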

15.
Objective: The purpose of this study was to assess associations between depression and problematic internet use (PIU) among female college students, and to determine whether Internet use time moderates this relationship. Method: This cross-sectional survey included 265 female college students from four U.S. universities. Students completed the Patient Health Questionnaire-9 (PHQ-9), the Problematic and Risky Internet Use Screening Scale (PRIUSS) and self-reported daily Internet use. Analyses included multivariate analysis of variance and Poisson regression. Results: Participants reported a mean age of 20.2 years (SD = 1.7) and were 84.9% Caucasian. The mean PHQ-9 score was 5.4 (SD = 4.6); the mean PRIUSS score was 16.4 (SD = 11.1). Participants' risk for PIU increased by 27% with each additional 30 min spent online using a computer (RR = 1.27, 95% CI: 1.14–1.42, p < .0001). Risk for PIU was significantly increased among those who met criteria for severe depression (RR = 8.16, 95% CI: 4.27–15.6, p < .0001). The PHQ-9 items describing trouble concentrating, psychomotor dysregulation and suicidal ideation were most strongly associated with PIU risk. Conclusions: The positive relationship between depression and PIU among female college students supports screening for both conditions, particularly among students reporting these particular depression symptoms.

16.
The development of a new amperometric biosensor for oxalate utilising two enzymes, oxalate oxidase (OXO) and horseradish peroxidase (HRP), incorporated into a carbon paste electrode modified with silica gel coated with titanium oxide containing toluidine blue, is described. OXO was immobilised on the titanium oxide-modified silica gel surface using glutaraldehyde for crosslinking. HRP was immobilised by covalent binding with carbodiimide on graphite powder. The biosensor showed good performance, with a linear response range between 0.1 and 2.0 mmol l−1 of oxalate fit by the equation i = 0.33(±0.04) + 2.29(±0.04)[oxalate], where i is the current in μA and [oxalate] is the oxalate concentration in mmol l−1, with a correlation coefficient of 0.998 for n = 20. The biosensor could be used for 80 determinations when stored in succinate buffer at pH 3.8 in a refrigerator. The response time was about 0.5 s. The detection limit, taken as three times the noise, was 0.09 mmol l−1 of oxalate. The time for oxalate determination in spinach samples decreased by 3 days when this biosensor was used, compared to the AOAC method.
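The calibration line i = 0.33 + 2.29[oxalate] is an ordinary least-squares fit; a minimal sketch on invented calibration points chosen to echo the reported coefficients (not the paper's measurements):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope b and intercept a for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical calibration points: oxalate (mmol/L) vs. current (uA).
conc = [0.1, 0.5, 1.0, 1.5, 2.0]
curr = [0.56, 1.48, 2.62, 3.77, 4.91]
a, b = linear_fit(conc, curr)
print(round(a, 2), round(b, 2))
```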

17.
The construction of symmetric and symplectic exponentially fitted modified Runge–Kutta–Nyström (SSEFRKN) methods is considered. Based on the symmetry, symplecticity, and exponential-fitting conditions, new explicit modified RKN integrators with the FSAL property are obtained. The new integrators exactly integrate differential systems whose solutions can be expressed as linear combinations of functions from the set {exp(±iωt)}, ω > 0, i² = −1, or equivalently from the set {cos(ωt), sin(ωt)}. The phase properties of the new integrators are examined and their periodicity regions are obtained. Numerical experiments are included to show the high efficiency and competitiveness of the new SSEFRKN methods compared with some highly efficient non-symmetric symplectic EFRKN methods in the literature.

18.
Aboveground dry biomass was estimated for the 1.3 M km² forested area south of the treeline in the eastern Canadian province of Québec by combining data from airborne and spaceborne LiDAR, a Landsat ETM+ land cover map, a Shuttle Radar Topography Mission (SRTM) digital elevation model, ground inventory plots, and vegetation zone maps. Plot-level biomass was calculated using allometric relationships between tree attributes and biomass. A small-footprint portable laser profiler was then flown over these inventory plots to develop a generic airborne LiDAR-based biomass equation (R² = 0.65, n = 207). The same airborne LiDAR system was flown along four portions of orbits of the ICESat Geoscience Laser Altimeter System (GLAS). A square-root-transformed equation was developed to predict airborne profiling LiDAR estimates of aboveground dry biomass from GLAS waveform parameters combined with an SRTM slope index (R² = 0.59, n = 1325). Using the 104,044 quality-filtered GLAS pulses obtained during autumn 2003 from 97 orbits over the study area, we then predicted aboveground dry biomass for the main vegetation areas of Québec as well as for the entire province south of the treeline. Including cover-type covariances both within and between GLAS orbits increased the standard errors of the estimates by two to five times at the vegetation zone level and as much as threefold at the provincial level. Aboveground biomass for the whole study area averaged 39.0 ± 2.2 (standard error) Mg ha⁻¹ and totalled 4.9 ± 0.3 Pg. The biomass distribution was 12.6% northern hardwoods, 12.6% northern mixedwood, 38.4% commercial boreal, 13% non-commercial boreal, 14.2% taiga, and 9.2% treed tundra. Non-commercial forests represented 36% of the estimated aboveground biomass, highlighting the importance of remote northern forests to C sequestration. This study has shown that space-based forest inventories of northern forests could be an efficient way of estimating the amount, distribution, and uncertainty of aboveground biomass and carbon stocks at large spatial scales.

19.
The utilization of mathematical and computational tools in pollutant assessment frameworks has become increasingly valuable due to their capability to interpret integrated variable measurements. Artificial neural networks (ANNs) are considered dependable and inexpensive techniques for data interpretation and prediction. The self-organizing map (SOM) is an unsupervised ANN used for data training to classify and effectively recognize patterns embedded in the input data space. Application of SOM–ANNs is useful for recognizing spatial patterns in contaminated zones, integrating chemical, physical, ecotoxicological and toxicokinetic variables in the identification of pollution sources and of similarities in sample quality. Water (n = 11), soil (n = 38) and sediment (n = 54) samples from four areas in the Niger Delta (Nigeria) were classified based on their chemical, toxicological and physical variables by applying the SOM. The results obtained in this study provide a valuable assessment using the SOM's visualization capabilities, highlight priority zones that might require additional investigation, and provide a productive pathway for effective decision making and remedial actions.
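As a hedged illustration of the SOM technique described above (not the study's implementation; the sample vectors below are invented), a minimal one-dimensional SOM can be trained as follows:

```python
import math
import random

def train_som(data, n_units=4, epochs=200, lr0=0.5, sigma0=2.0):
    """Minimal 1-D self-organizing map: each unit holds a weight vector;
    the best-matching unit (BMU) and its neighbours move toward each sample,
    with learning rate and neighbourhood width decaying over the epochs."""
    random.seed(0)
    dim = len(data[0])
    units = [[random.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.5
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                units[i] = [u + lr * h * (v - u) for u, v in zip(units[i], x)]
    return units

# Hypothetical normalised sample vectors, e.g. [metal concentration, toxicity].
samples = [[0.1, 0.2], [0.15, 0.25], [0.8, 0.9], [0.85, 0.95]]
units = train_som(samples)

def bmu_of(x):
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))

# Samples from the two clusters should map to different units.
print(bmu_of([0.12, 0.22]) != bmu_of([0.82, 0.92]))
```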

20.