Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
The antineoplastic drug Adriamycin is administered by short-term i.v. infusion. The drug is noted for its acute and chronic cardiotoxicity in addition to its toxicity on the bone marrow and gut. A physiological flow model and computer program, which contains a subprogram for continuous infusion of a drug, were used to simulate the distribution of Adriamycin during infusion in heart, bone marrow, gut, kidney, muscle, skin, liver and bile. Simulations were carried out for 1, 2, 5, 10 and 20 min of infusion. Computations were made for the concentration of drug in these organs as a function of time over a total period of 48 min. Simulations showed that the heart and kidney would contain high concentrations of drug during infusion, which rapidly declined upon cessation of infusion. Bone marrow and gut showed a less rapid accumulation and decline. Muscle and skin showed an even slower accumulation and decline. Simulations also showed the effect of altering the rate of infusion on the concentration in the liver and excretion in the bile.
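The rise-during-infusion, decline-after-cessation behaviour described above can be sketched with a deliberately minimal model. This is not the authors' multi-organ physiological flow model; it is a hedged one-compartment sketch with zero-order infusion and first-order elimination, and every parameter value below is an illustrative assumption.

```python
# Hedged sketch (not the authors' physiological flow model): one-compartment
# kinetics with zero-order infusion and first-order elimination, showing the
# concentration rise during infusion and decline after it stops.
# All parameter values are illustrative assumptions.

def simulate_infusion(dose=10.0, t_inf=5.0, ke=0.2, volume=5.0,
                      t_total=48.0, dt=0.01):
    """Euler integration of dC/dt = R0/V - ke*C (R0 = 0 after t_inf)."""
    r0 = dose / t_inf              # zero-order infusion rate during t_inf
    times, concs = [0.0], [0.0]
    c, t = 0.0, 0.0
    while t < t_total:
        rate_in = r0 / volume if t < t_inf else 0.0
        c += (rate_in - ke * c) * dt
        t += dt
        times.append(t)
        concs.append(c)
    return times, concs

times, concs = simulate_infusion()
peak_idx = concs.index(max(concs))
# The concentration peaks at the end of infusion, then declines.
print(round(times[peak_idx], 1))  # 5.0
```

Swapping `t_inf` between 1 and 20 min reproduces the qualitative effect of infusion rate that the simulations explore.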

2.
On small sample size problems such as appearance-based recognition, empirical studies have shown that ICA projections have a trivial effect on improving recognition performance over whitened PCA. However, what causes the ineffectiveness of ICA is still an open question. In this study, we find that this small sample size problem of ICA is caused by a special distributional phenomenon of the high-dimensional whitened data: all data points are similarly distant from, and nearly perpendicular to, each other. In this situation, ICA algorithms tend to extract the independent features simply by projections that isolate single or very few samples and congregate all other samples around the origin, without regard for the clustering structure. Our comparative study further shows that the ICA projections usually produce misleading features, whose generalization ability is generally worse than that of features derived by random projections. Thus, further selection of the ICA features is possibly meaningless. To address the difficulty in pursuing low-dimensional features, we introduce a locality pursuit approach which applies locality preserving projections in the high-dimensional whitened space. Experimental results show that locality pursuit performs better than ICA and other conventional approaches, such as Eigenfaces, Laplacianfaces, and Fisherfaces.
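The distributional phenomenon the abstract points to can be reproduced in a few lines. The following is a hedged illustration, not the paper's experiments: independent random points in a high-dimensional space are nearly equidistant from the origin and nearly perpendicular to each other, which is the geometry the study links to ICA's failure on small-sample whitened data. The dimension and sample counts are arbitrary choices.

```python
# Hedged illustration (not the paper's experiments): random high-dimensional
# points are nearly perpendicular to each other -- the distributional
# phenomenon of high-dimensional whitened data described in the abstract.
import math
import random

random.seed(0)
dim, n = 2000, 10   # dim >> n, the small-sample-size regime

def unit_gaussian_vector(d):
    v = [random.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

points = [unit_gaussian_vector(dim) for _ in range(n)]

# Pairwise cosines between distinct points concentrate near 0 (perpendicular).
cosines = [abs(sum(a * b for a, b in zip(points[i], points[j])))
           for i in range(n) for j in range(i + 1, n)]
print(max(cosines))  # small when dim >> n
```

With `dim = 2000` the largest pairwise cosine over all 45 pairs stays well below 0.15, i.e. every pair of points is within a few degrees of perpendicular.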

3.
《电子技术应用》2017,(7):74-77
To address the large measurement error, low precision, and high power consumption of traditional cloud water content sensors, a cloud liquid water content sensor carried on a radiosonde was designed. The new sensor system comprises a probe, an embedded processing circuit, a high-precision signal-detection circuit, and a data-transmission circuit. Computational fluid dynamics simulation and numerical calculation of the sensor verified the feasibility of its design. A fuzzy adaptive PID algorithm controls the temperature of the sensor's surface probe, improving system stability. A high-precision measurement-and-control circuit based on a 24-bit ADC improves the system's measurement accuracy. Experimental results show that at a cloud water content of 1.0 g/m3, the sensor's average power consumption is 1.63 W. Compared with traditional hot-wire sensors, this new cloud water content sensor offers higher precision and lower power consumption.

4.
The effect of non-orthogonality of an entangled non-orthogonal state-based quantum channel is investigated in detail in the context of the teleportation of a qubit. Specifically, average fidelity, minimum fidelity and minimum assured fidelity (MASFI) are obtained for teleportation of a single-qubit state using all the Bell-type entangled non-orthogonal states known as quasi-Bell states. Using the Horodecki criterion, it is shown that the teleportation scheme obtained by replacing the quantum channel (Bell state) of the usual teleportation scheme by a quasi-Bell state is optimal. Further, the performance of various quasi-Bell states as teleportation channels is compared in an ideal situation (i.e., in the absence of noise) and under different noise models (e.g., amplitude and phase damping channels). It is observed that the best choice of quasi-Bell state depends on the amount of non-orthogonality, in both the noisy and the noiseless cases. A specific quasi-Bell state, which was found to be maximally entangled under ideal conditions, is shown to be less efficient as a teleportation channel than other quasi-Bell states in particular cases when subjected to noisy channels. It has also been observed that the average fidelity usually falls with an increase in the number of qubits exposed to noisy channels (viz., Alice's, Bob's and the to-be-teleported qubits), but the converse may be observed in some particular cases.

5.
Natural processes, such as dust storms and sea salt spray, and anthropogenic activities, such as the burning of fossil fuels and biomass, introduce aerosols into the atmosphere. Their concentration, geographic distribution and particle size have significant climatic consequences. Aerosol transport processes, from landmasses to oceans, are poorly understood because of inadequate in-situ observations. This study reports the results of spectral aerosol optical depth (AOD) measurements using a five-channel (380, 440, 500, 675 and 870 nm) handheld MICROTOPS Sun photometer during a sea-truth data collection campaign conducted in the central Bay of Bengal (BOB) during the northeastern monsoon period (10 November to 13 December 2007). For the entire cruise period, the mean values of the daily average AODs at 500 nm and 870 nm were 0.39 ± 0.065 and 0.22 ± 0.047, respectively; the mean value of the Angstrom exponent (α) was 1.23 ± 0.2 and the turbidity parameter (β) was 0.183 ± 0.044. A smaller α value together with a larger β value suggests the presence of an abundance of smaller aerosol particles near the coast. An air-mass back-trajectory analysis was undertaken to identify the potential source regions of the aerosols. Analysis of the results demonstrated the effect of the aerosol transport and source regions on the spectral behaviour of the AODs. In-situ measured AOD (550 nm) and α (550 nm, 865 nm) values were further compared with Moderate Resolution Imaging Spectroradiometer (MODIS)-derived parameters. The in-situ and MODIS-derived AOD values were found to be in good agreement, with a coefficient of determination (R²) of 0.78 and a standard error of 0.05, while the R² for α was 0.68 with a standard error of 0.14.
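The Angstrom quantities reported above come from the standard power law τ(λ) = β·λ^(-α), so a two-wavelength estimate can be sketched directly. Note the inputs below are the cruise-mean AODs from the abstract used purely as an illustration; the paper's α = 1.23 is a mean of daily values, so it need not equal this single two-point estimate.

```python
# Two-wavelength Angstrom exponent and turbidity from the power law
# tau(lambda) = beta * lambda**(-alpha), with lambda in micrometres.
import math

def angstrom_exponent(tau1, lam1_um, tau2, lam2_um):
    """alpha = -ln(tau1/tau2) / ln(lam1/lam2)."""
    return -math.log(tau1 / tau2) / math.log(lam1_um / lam2_um)

def turbidity(tau, lam_um, alpha):
    """beta = tau * lambda**alpha (AOD extrapolated to 1 micrometre)."""
    return tau * lam_um ** alpha

# Cruise-mean AODs at 500 nm and 870 nm, taken from the abstract as
# illustrative inputs only.
alpha = angstrom_exponent(0.39, 0.500, 0.22, 0.870)
beta = turbidity(0.39, 0.500, alpha)
print(round(alpha, 2), round(beta, 2))
```

The two-point estimate (α ≈ 1.0, β ≈ 0.19) lands near the reported means, as expected for averaged data.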

6.
As micro-total analysis systems (μ-TAS) become more extensively used, techniques for fabricating microchannels, microactuators, and measuring systems are becoming increasingly important. This study describes a novel method for fabricating a closed microchannel that can be used to measure the pico-scale flow rate of a liquid solution with an accuracy better than 1 pL/s. A flow rate of 9 pL/s was calculated from the measurements. Without any high-temperature or high-voltage processing, the microchannel can be integrated into a complementary metal–oxide–semiconductor (CMOS) microfluidic system, and the fabrication process involves only standard CMOS processes and common materials. The batch fabrication of a single integrated circuit (IC) chip is essential to reaching the goal of a system in one chip.

7.
8.
Flood disasters are the most common natural risk and tremendous efforts are spent to improve their simulation and management. However, simulation-based investigation of actions that can be taken in case of flood emergencies is rarely done. This is in part due to the lack of a comprehensive framework which integrates and facilitates these efforts. In this paper, we tackle several problems which are related to steering a flood simulation. One issue is related to uncertainty. We need to account for uncertain knowledge about the environment, such as levee-breach locations. Furthermore, the steering process has to reveal how these uncertainties in the boundary conditions affect the confidence in the simulation outcome. Another important problem is that the simulation setup is often hidden in a black-box. We expose system internals and show that simulation steering can be comprehensible at the same time. This is important because the domain expert needs to be able to modify the simulation setup in order to include local knowledge and experience. In the proposed solution, users steer parameter studies through the World Lines interface to account for input uncertainties. The transport of steering information to the underlying data-flow components is handled by a novel meta-flow. The meta-flow is an extension to a standard data-flow network, comprising additional nodes and ropes to abstract parameter control. The meta-flow has a visual representation to inform the user about which control operations happen. Finally, we present the idea to use the data-flow diagram itself for visualizing steering information and simulation results. We discuss a case-study in collaboration with a domain expert who proposes different actions to protect a virtual city from imminent flooding. The key to choosing the best response strategy is the ability to compare different regions of the parameter space while retaining an understanding of what is happening inside the data-flow system.

9.
This paper compares the performance of PID controllers designed for the joints of a Gryphon robot using four hybrid evolutionary algorithms. We try to shorten the transient state of the step response. To this end, a function of several specifications of the step response (i.e. overshoot, settling time, rise time and steady-state error) is defined. We minimize this function using four approaches, namely particle swarm optimization (PSO), the queen-bee (QB) algorithm, and the genetic algorithm (GA), all hybridized with the Nelder–Mead (NM) algorithm, as well as shuffled complex evolution (SCE), and the results are compared. To design a PID controller, several methods should be tested and the best one selected based on the results obtained. In our case, the queen-bee algorithm for joints 1, 2, and 4, the genetic algorithm for joint 3, and the shuffled complex evolution method for joint 5 produce the best results.
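The kind of step-response cost function the paper minimizes can be sketched concretely. The following is a hedged toy, not the Gryphon joint model: the plant is an illustrative first-order-lag-plus-integrator, the PID gains are arbitrary, and the cost simply sums overshoot, settling time, rise time and a weighted steady-state error, as the abstract describes.

```python
# Hedged sketch (gains and plant are illustrative, not the Gryphon joints):
# a step-response cost built from overshoot, settling time, rise time and
# steady-state error -- the kind of objective the paper minimizes with
# PSO/QB/GA/SCE hybrids.

def step_response(kp, ki, kd, t_end=10.0, dt=0.001):
    """PID on a toy plant v' = u - v, x' = v (unit-step setpoint)."""
    x = v = integ = 0.0
    prev_err = 1.0
    ys = []
    for _ in range(int(t_end / dt)):
        err = 1.0 - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integ + kd * deriv
        v += (u - v) * dt
        x += v * dt
        ys.append(x)
    return ys, dt

def cost(ys, dt):
    overshoot = max(0.0, max(ys) - 1.0)
    sse = abs(1.0 - ys[-1])                     # steady-state error
    settle = 0.0                                # last exit from a 2% band
    for i, y in enumerate(ys):
        if abs(y - 1.0) > 0.02:
            settle = i * dt
    rise = next((i * dt for i, y in enumerate(ys) if y >= 0.9), float("inf"))
    return overshoot + settle + rise + 10.0 * sse

ys_a, dt = step_response(2.0, 0.5, 1.0)    # moderate gains
ys_b, _ = step_response(20.0, 0.5, 1.0)    # aggressive proportional gain
c_a, c_b = cost(ys_a, dt), cost(ys_b, dt)
```

Any of the four optimizers in the paper would then search the (kp, ki, kd) space for the gain set minimizing `cost`.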

10.
A new methodology, namely an artificial neural network (ANN) approach, was proposed for modeling and predicting the flow behavior of polyethylene melt through the nanochannels of nanoporous alumina templates. The wetting length of the nanochannels was determined to be a function of time, temperature, nanochannel diameter, and the surface properties of the inner wall of the nanochannels. An ANN was designed to forecast the relationship between the wetting length as the output parameter and the aforementioned parameters as input variables. It was demonstrated that the ANN method is capable of modeling this phenomenon with high accuracy. The designed ANN was then employed to obtain the wetting length of the nanochannels for cases not covered by the wetting experiments. The results were then analyzed statistically to identify the effect of each independent variable, namely time, temperature, nanochannel diameter, and the surface properties of the inner wall of the nanochannels, as well as their combinations, on the wetting length. Interesting results were attained and discussed.

11.
This paper presents a computational study of the separated flow in a planar asymmetric diffuser. The steady RANS equations for turbulent incompressible fluid flow and six turbulence closures are used. The commercial code FLUENT 6.3.26 was used to solve the set of governing equations with the various turbulence models. Five of the turbulence models are available directly in the code, while the v2-f turbulence model was implemented via user-defined scalars (UDS) and user-defined functions (UDF). A series of computational analyses was performed to assess the performance of the turbulence models at different grid densities. The results show that the standard k–ω, SST k–ω and v2-f models clearly performed better than the other models when an adverse pressure gradient was present. The RSM model shows acceptable agreement with the velocity and turbulent kinetic energy profiles, but it failed to predict the locations of the separation and reattachment points. The standard k–ε and the low-Re k–ε models delivered very poor results.

12.
The design of a human–computer interactive system can be unacceptable for a range of reasons. User performance concerns, for example the likelihood of user errors and time needed for a user to complete tasks, are important areas of consideration. For safety-critical systems it is vital that tools are available to support the analysis of such properties before expensive design commitment has been made. In this work, we give a unified formal verification framework for integrating two kinds of analysis: (1) predicting bounds for task-completion times via exhaustive state-space exploration, and (2) detecting user-error related design issues. The framework is based on a generic model of cognitively plausible behaviour that captures assumptions about cognitive behaviour decided through a process of interdisciplinary negotiation. Assumptions made in an analysis, including those relating to the performance consequences of users recovering from likely errors, are also investigated in this framework. We further present a novel way of exploring the consequences of cognitive mismatches, on both correctness and performance grounds. We illustrate our analysis approach with a realistic medical device scenario: programming an infusion pump. We explore an initial pump design and then two variations based on features found in real designs, illustrating how the approach identifies both timing and human error issues.
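The idea of bounding task-completion time by exhaustive state-space exploration can be illustrated on a toy interaction model. The model below is invented for the sketch, not the paper's infusion-pump model: states are interface modes, edges carry assumed action durations, and a shortest-path search over the whole graph gives the fastest plausible completion time.

```python
# Hedged toy sketch (the interaction model and durations are invented, not the
# paper's infusion-pump model): exhaustive exploration of a state graph whose
# edges carry action durations, bounding task-completion time from below.
import heapq

# state -> list of (action_duration_seconds, next_state); illustrative values
MODEL = {
    "idle": [(1.0, "menu")],
    "menu": [(0.5, "set_rate"), (2.0, "idle")],
    "set_rate": [(1.5, "confirm"), (0.5, "menu")],
    "confirm": [(0.5, "running")],
    "running": [],
}

def min_completion_time(model, start, goal):
    """Dijkstra over the state space: fastest plausible completion time."""
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, s = heapq.heappop(queue)
        if s == goal:
            return d
        if d > dist.get(s, float("inf")):
            continue
        for duration, nxt in model[s]:
            nd = d + duration
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(queue, (nd, nxt))
    return float("inf")

print(min_completion_time(MODEL, "idle", "running"))  # 3.5
```

A full analysis in the paper's spirit would also explore error-and-recovery paths (e.g. falling back from `set_rate` to `menu`), which lengthen the bound.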

13.
The hybridization of flow and image analysis in the prescreening of gynecological smears combines the advantages of both methods. Taking the Heidelberg flow analysis system (HEIFAS) as an example, the prerequisites for a successful combination of the two systems are presented: (a) preservation of cell morphology; (b) avoidance of preferential cell loss; (c) the ability to restain sorted cells; and (d) recognition of false alarms at high resolution. Two TV-based imaging instruments have been applied to cells sorted by two-parameter flow cytometry. Both instruments, operating in the absorption mode (FAZYTAN) and in the fluorescence mode (LEYTAS), allow the detection of false alarms raised in flow. This preliminary study shows that discrimination between false alarms and truly suspicious cells is possible.

14.
15.
This study investigated the effects of virtual displays on direct-interaction performance metrics such as accuracy, task completion time, and comfort. Eighteen participants performed tapping (pointing) tasks in the coronal plane by directly reaching for targets at three egocentric distances, with three indices of difficulty at each distance. Position data and the severity of cybersickness symptoms were collected with a motion system and a symptom questionnaire, respectively. The results indicated that accuracy was higher with the stereoscopic widescreen display than with the head-mounted display. However, no significant differences in task completion time, throughput, or cybersickness were observed between the two VR displays. In addition, increasing the egocentric distance improved accuracy and lengthened the task completion time, whereas increasing the task difficulty lengthened the task completion time but did not affect accuracy. These findings are informative for users choosing between the two virtual reality displays. Generally, the stereoscopic widescreen display can be recommended for tasks requiring high accuracy across egocentric distances in the coronal plane. Furthermore, developers may refer to these findings when designing interfaces that allow a more natural way of interacting for users.
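The "indices of difficulty" and "throughput" in the study follow the standard Fitts'-law formulation, which can be sketched directly. The target distance, width, and movement time below are illustrative values, not the paper's task geometry (which the abstract does not give).

```python
# Fitts' law quantities behind "index of difficulty" and "throughput" in
# pointing studies. D, W and movement time below are illustrative values only.
import math

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput in bits/s: ID divided by mean movement time."""
    return index_of_difficulty(distance, width) / movement_time_s

ID = index_of_difficulty(300.0, 20.0)          # e.g. 300 mm reach, 20 mm target
tp = throughput(300.0, 20.0, 0.8)              # e.g. 0.8 s mean movement time
print(ID, tp)  # 4.0 5.0
```

Holding ID fixed while varying D and W together is how such studies separate egocentric distance from task difficulty.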

16.

Waterflooding is a critically important process in the life of an oil field: it sweeps previously unrecovered oil between injection and production wells and maintains reservoir pressure above the bubble-point pressure to prevent gas evolving from the oil phase. This is a critical reservoir management practice for optimum recovery from oil reservoirs. Optimizing water-injection volumes and well locations are both critical reservoir-engineering problems, since injection capacity may be limited by geographic location and facility constraints. Characterization of the reservoir connectivity between injection and production wells can greatly contribute to the optimization process. In this study, it is proposed to use computationally efficient methods to gain a better understanding of reservoir flow dynamics in a waterflooding operation by characterizing the reservoir connectivity between injection and production wells. First, as an important class of artificial intelligence methods, artificial neural networks are used as a fully data-driven modeling approach. As an additional powerful method that draws an analogy between source/sink terms in oil reservoirs and electrical conductors, capacitance–resistance models are also used as a reduced-physics-driven modeling approach. After establishing each method's applicability to characterizing interwell connectivity, a comparative study is carried out to determine the strengths and weaknesses of each approach in terms of accuracy, data requirements, expertise requirements, training algorithm and processing times.
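The capacitance–resistance idea can be sketched in its simplest (single-tank) form. This is a hedged sketch of the generic CRM update, not the study's fitted model: the connectivity weights `f` and time constant `tau` below are illustrative, not derived from any field data.

```python
# Hedged sketch of a basic capacitance-resistance model (CRM) update, tank
# form: producer rate decays exponentially and is supported by a weighted sum
# of injection rates. Weights f and time constant tau are illustrative only.
import math

def crm_production(q0, injections, f, tau, dt=1.0):
    """q(t_k) = q(t_{k-1})*e^(-dt/tau) + (1 - e^(-dt/tau)) * sum_i f_i*I_i."""
    decay = math.exp(-dt / tau)
    q = q0
    series = []
    for step in injections:          # step: injector rates at time t_k
        support = sum(fi * ii for fi, ii in zip(f, step))
        q = q * decay + (1.0 - decay) * support
        series.append(q)
    return series

# Two injectors supporting one producer at constant rates (illustrative).
inj = [[100.0, 50.0]] * 60
rates = crm_production(q0=0.0, injections=inj, f=[0.4, 0.3], tau=5.0)
print(round(rates[-1], 1))  # approaches 0.4*100 + 0.3*50 = 55.0
```

Fitting `f` and `tau` to observed injection/production histories is what yields the interwell-connectivity characterization the study compares against the ANN approach.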


17.
This paper describes a comparative study of a multidimensional visualisation technique and multivariate statistical process control (MSPC) for the analysis of process historical data. The visualisation technique uses parallel coordinates, which present multidimensional data in two dimensions and allow the identification of clusters and outliers, and can therefore be used to detect abnormal events. The study is based on a database covering 527 days of operation of an industrial wastewater treatment plant. It was found that both the visualisation technique and MSPC based on the T² chart captured the same 17 days as "clearly abnormal" and another eight days as "likely abnormal". Pattern recognition using K-means clustering, applied to the same data in the literature, identified 14 of the 17 "clearly abnormal" days.
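The statistic behind the MSPC chart mentioned above is Hotelling's T². The following is a hedged sketch on invented two-variable data, not the plant's 527-day record: it scores one observation against an in-control mean and covariance, so that a multivariate outlier gets a large T² even when each variable alone looks unremarkable.

```python
# Hedged sketch (toy 2-variable data, not the plant record): Hotelling's
# T^2 = (x - mu)^T S^{-1} (x - mu), the statistic behind the MSPC T^2 chart.

def mean(xs):
    return sum(xs) / len(xs)

def t_squared(x, mu, s_inv):
    """T^2 for a 2-d observation given the inverse covariance matrix."""
    d = [x[0] - mu[0], x[1] - mu[1]]
    return (d[0] * (s_inv[0][0] * d[0] + s_inv[0][1] * d[1])
            + d[1] * (s_inv[1][0] * d[0] + s_inv[1][1] * d[1]))

# In-control data: two correlated process variables (illustrative values).
data = [(1.0, 2.0), (1.2, 2.1), (0.9, 1.9), (1.1, 2.2), (1.0, 2.0)]
mu = (mean([p[0] for p in data]), mean([p[1] for p in data]))

# Sample covariance and its inverse (2x2, done by hand for the sketch).
n = len(data)
sxx = sum((p[0] - mu[0]) ** 2 for p in data) / (n - 1)
syy = sum((p[1] - mu[1]) ** 2 for p in data) / (n - 1)
sxy = sum((p[0] - mu[0]) * (p[1] - mu[1]) for p in data) / (n - 1)
det = sxx * syy - sxy * sxy
s_inv = [[syy / det, -sxy / det], [-sxy / det, sxx / det]]

normal = t_squared((1.05, 2.05), mu, s_inv)
abnormal = t_squared((2.0, 1.0), mu, s_inv)
print(normal < abnormal)  # the outlier scores far higher
```

Days whose T² exceeds a control limit are the ones flagged "clearly abnormal" on the chart.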

18.
This paper describes a method to detect knee stress using liquid crystal thermography and presents the results of a case study in which the system was applied to two carpet installers. The method involves placing heat-sensitive sheets of film on the knees of workers at various intervals during the work day. The thermographic sheets react to variations in heat by changing colour. The measurements are taken with the worker's knee positioned in an illuminated, enclosed box. Once the patch stabilizes, the exhibited colours are recorded with an 8 mm video camera. The colour pattern, ranging from brown to blue, provides a thermal record of what is believed to be knee stress resulting from installing carpet. The thermographic records are stored in computer memory for subsequent analysis using an AT&T TARGA 16 video board. Custom software allows computation of the area of each distinct colour pattern as a percentage of total patch size. These records provide a characterization of knee response (inflammation) resulting from the biomechanical load sustained by the knee during the carpet installation task.
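The per-colour area computation described above reduces to counting labelled pixels. The sketch below uses hypothetical pixel labels, not TARGA frame data, to show each colour's area as a percentage of total patch size.

```python
# Hedged sketch (hypothetical pixel labels, not TARGA frame data): each
# colour's area as a percentage of total patch size, as the custom analysis
# software in the study computes for the thermographic patches.
from collections import Counter

def colour_percentages(pixels):
    """Map each colour label to its share of the patch, in percent."""
    counts = Counter(pixels)
    total = len(pixels)
    return {colour: 100.0 * n / total for colour, n in counts.items()}

# Toy 20-pixel patch; warmer colours indicate higher knee-surface temperature.
patch = ["brown"] * 8 + ["green"] * 6 + ["blue"] * 6
shares = colour_percentages(patch)
print(shares["brown"])  # 40.0
```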

19.
Sequential diagnostic strategy (SDS) is widely used in engineering systems for fault isolation. To find source faults efficiently, an optimized SDS selects the most useful tests and schedules them in an optimized sequence. In this paper, a multiple-objective mathematical model for the SDS optimization problem in large-scale engineering systems is established and, correspondingly, a quantum-inspired genetic algorithm (QGA) specifically targeted at this SDS optimization problem is developed. The QGA uses the probability-amplitude form of the quantum bit to encode each possible diagnostic strategy extracted from the fault–test dependency matrix, and then goes through an evolutionary process to find the optimal strategy under the dual objectives of expected testing cost and the number of contributing tests. Crossover and mutation operations are combined with quantum encoding to expand the diversity of the population within a small population size and to increase the possibility of obtaining the global optimum. A case of a control moment gyro system from engineering practice is used to verify the effectiveness of the algorithm, and a comparative study with two conventional intelligent optimization algorithms proposed for this problem, PSO and the genetic algorithm, is presented to reveal its advantages.
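The probability-amplitude encoding at the core of a QGA can be sketched independently of the paper's full algorithm. In the generic scheme below (a hedged sketch, not the paper's implementation), each gene is a pair (α, β) with α² + β² = 1; "observing" the chromosome collapses each qubit to 0 or 1 with probability β² for 1, yielding a binary test-selection string, and a rotation gate nudges the amplitudes toward better solutions.

```python
# Hedged sketch (not the paper's full QGA): quantum-bit probability-amplitude
# encoding, observation, and a rotation-gate update. Each gene (alpha, beta)
# satisfies alpha^2 + beta^2 = 1; P(bit = 1) = beta^2.
import math
import random

random.seed(1)

def new_chromosome(n_tests):
    """Start in equal superposition: alpha = beta = 1/sqrt(2)."""
    a = 1.0 / math.sqrt(2.0)
    return [(a, a) for _ in range(n_tests)]

def observe(chromosome):
    """Collapse each qubit to a test-selection bit."""
    return [1 if random.random() < beta * beta else 0
            for _, beta in chromosome]

def rotate(gene, toward_one, delta=0.05):
    """Rotation gate: nudge amplitudes toward (or away from) bit = 1."""
    alpha, beta = gene
    theta = math.atan2(beta, alpha) + (delta if toward_one else -delta)
    return (math.cos(theta), math.sin(theta))

chrom = new_chromosome(8)   # 8 candidate tests (illustrative size)
bits = observe(chrom)
chrom = [rotate(g, b == 1) for g, b in zip(chrom, bits)]
```

A fitness function over `bits` (expected testing cost plus number of selected tests) would then drive the rotation direction each generation, with crossover and mutation applied to the amplitude pairs as the paper describes.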

20.
An authentication system using a chair with sensors attached is described. Pressure distribution (hipprint) measured by network-connected sensors on the chair is used for identifying the person sitting on the chair. Hipprint information is not sufficient for maintaining a high level of security but is sufficient for providing personalized services such as automatic log-in at home or in a small office. In experiments, we obtained correct identification rates of 99.6% for five people and 93.2% for ten people. A false rejection rate of 9.2% and a false acceptance rate of 1.9% were achieved using another group of 20 people. The results also showed that changes in hipprints can be used to estimate what the person sitting on the chair is doing, for example, using a mouse or leaning back.
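The identification-with-rejection setup that produces the FRR/FAR figures above can be sketched with a nearest-template classifier. The pressure vectors and threshold below are made up for the sketch, not the chair's sensor data or the paper's actual classifier.

```python
# Hedged sketch (made-up pressure vectors, not the chair's sensor data):
# nearest-template identification over pressure-distribution vectors, with a
# rejection threshold so unknown sitters can be turned away (the FRR/FAR
# trade-off reported in the abstract comes from tuning such a threshold).
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(sample, templates, threshold):
    """Return the closest enrolled user, or None if no template is near enough."""
    best_user, best_d = None, float("inf")
    for user, template in templates.items():
        d = dist(sample, template)
        if d < best_d:
            best_user, best_d = user, d
    return best_user if best_d <= threshold else None

# Enrolled hipprint templates (4 hypothetical sensor readings per user).
templates = {"alice": [5.0, 1.0, 4.8, 1.2], "bob": [2.0, 6.0, 2.2, 5.8]}

print(identify([4.9, 1.1, 4.7, 1.3], templates, threshold=1.0))  # alice
print(identify([9.0, 9.0, 9.0, 9.0], templates, threshold=1.0))  # None
```

Raising the threshold lowers the false rejection rate at the cost of a higher false acceptance rate, which is the trade-off the 20-person evaluation measures.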
