BACKGROUND: Hypertension and hypercholesterolemia are frequently associated, leading to considerable cardiovascular risk. METHODS: An open, parallel, randomized study was performed in which the effects of doxazosin, an alpha-adrenergic blocker, and enalapril, an angiotensin-converting enzyme inhibitor, were compared in 70 patients with essential hypertension and plasma cholesterol levels greater than 240 mg/dl. Following 2-4 weeks of placebo administration, the patients were randomly assigned to one of the two drugs. When required, doses were increased and hydrochlorothiazide was added until a blood pressure lower than 160/95 mmHg was achieved. After this period the patients were observed for a minimum of 8 weeks. The mean length of the study was 22 weeks. RESULTS: Both drugs significantly reduced blood pressure without modifying heart rate. Doxazosin tended to favorably modify the plasma lipid profile, while enalapril significantly reduced cholesterol, lipid, and high-density lipoprotein (HDL) levels. At the end of the study the HDL/total cholesterol ratio had increased by 8.6% in patients treated with doxazosin and decreased by 5.5% in those receiving enalapril (p < 0.05). CONCLUSIONS: Although both doxazosin and enalapril are potent antihypertensive drugs, the effects on plasma lipids obtained with doxazosin indicate that a reduction in cardiovascular risk was achieved with this drug in the patients included in this study.
A new method of implementing efficient FIR filters is presented. It approximates an equiripple FIR impulse response by a rounding operation and applies the derived impulse response through a simple recursive equation. The technique is extremely efficient for lowpass, highpass, bandpass, and bandstop filters with sharp transitions and low edge frequencies.
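As a rough illustration (not the paper's exact algorithm; the quantization step and test signal below are arbitrary), rounding the coefficients makes the impulse response piecewise constant, so the filter can be realized as a sparse first-difference filter followed by a simple accumulator recursion:

```python
def round_taps(h, step):
    """Quantize FIR coefficients to a coarse grid of size `step`."""
    return [step * round(c / step) for c in h]

def fir_direct(h, x):
    """Reference direct-form FIR: y[n] = sum_k h[k] * x[n-k]."""
    return [sum(h[k] * x[n - k] for k in range(len(h)) if n - k >= 0)
            for n in range(len(x))]

def fir_recursive(h, x):
    """Recursive realization: filter with the sparse first difference
    g[k] = h[k] - h[k-1], then integrate with y[n] = y[n-1] + v[n].
    For a rounded (piecewise-constant) h, most g[k] vanish, so only
    a few multiplications remain per output sample."""
    g = [h[k] - (h[k - 1] if k > 0 else 0.0) for k in range(len(h))]
    g.append(-h[-1])                                  # terminate the running sum
    sparse = [(k, gk) for k, gk in enumerate(g) if gk != 0.0]
    y, acc = [], 0.0
    for n in range(len(x)):
        acc += sum(gk * x[n - k] for k, gk in sparse if n - k >= 0)
        y.append(acc)
    return y
```

Only the nonzero difference taps require multiplications, which is where the efficiency comes from for designs with sharp transitions and low edge frequencies.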
The binding thermodynamics of the HIV-1 protease inhibitor acetyl pepstatin and the substrate Val-Ser-Gln-Asn-Tyr-Pro-Ile-Val-Gln, corresponding to one of the cleavage sites in the gag and gag-pol polyproteins, have been measured by direct microcalorimetric analysis. The results indicate that the binding of the peptide substrate or peptide inhibitor is entropically driven; i.e., it is characterized by an unfavorable enthalpy and a favorable entropy change, in agreement with a structure-based thermodynamic analysis based upon an empirical parameterization of the energetics. Dissection of the binding enthalpy indicates that the intrinsic interactions are favorable and that the unfavorable enthalpy originates from the energy cost of rearranging the flap region in the protease molecule. In addition, the binding is coupled to a negative heat capacity change. The dominant binding force is the increase in solvent entropy that accompanies the burial of a significant hydrophobic surface. Comparison of the binding energetics obtained for the substrate with those obtained for synthetic nonpeptide inhibitors indicates that the major difference is in the magnitude of the conformational entropy change. In solution, the peptide substrate has a higher flexibility than the synthetic inhibitors and therefore suffers a higher conformational entropy loss upon binding. This higher entropy loss accounts for the lower binding affinity of the substrate. On the other hand, due to its higher flexibility, the peptide substrate is better able to adapt to backbone rearrangements or subtle conformational changes induced by mutations in the protease. The synthetic inhibitors are less flexible, and their capacity to adapt is more restricted. The expected result is a more pronounced effect of mutations on the binding affinity of the synthetic inhibitors.
On the basis of the thermodynamic differences in the mode of binding of the substrate and the synthetic inhibitors, a key factor in understanding resistance appears to be the relative balance of the different forces that contribute to the binding free energy and, in particular, the balance between conformational and solvation entropy.
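The decomposition discussed above rests on the standard relations ΔG = ΔH − TΔS and ΔG = RT ln Kd. The sketch below uses illustrative numbers (not the measured values from this study) to show quantitatively what "entropically driven" means:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def binding_free_energy(dH, dS, T=298.15):
    """Gibbs free energy of binding: dG = dH - T*dS (J/mol)."""
    return dH - T * dS

def dissociation_constant(dG, T=298.15):
    """Kd from dG = R*T*ln(Kd); dG in J/mol, Kd in mol/L."""
    return math.exp(dG / (R * T))

# Entropically driven binding (illustrative numbers only):
# a small unfavorable enthalpy is outweighed by a large favorable entropy.
dG = binding_free_energy(dH=5_000.0, dS=134.0)   # dH > 0 and dS > 0
Kd = dissociation_constant(dG)                   # sub-micromolar affinity
```

Despite the positive (unfavorable) enthalpy, the −TΔS term dominates and the overall ΔG is negative, so binding is favorable.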
Gas-phase selective oxidation of toluene has been carried out on vanadium oxide systems (5–20 wt.% V2O5, equivalent to 0.4–1.7 theoretical monolayers) supported on TiO2–sepiolite (with a titania loading around the theoretical monolayer, 12 wt.%) and on sepiolite. The influence of both the vanadia loading and the support on the catalytic behaviour of the supported vanadium systems was studied. Reducibility was examined by H2 temperature-programmed reduction (TPR), and the acid and basic/redox sites of the solids were characterized from the conversion results of the 2-propanol test reaction. Benzaldehyde, benzoic acid and several coupling products were the main products detected, with selectivity towards benzaldehyde and benzoic acid exceeding 50% at a total conversion of around 10%. The activity and the selectivity to these products were higher for the vanadium systems on the mixed support than for those on sepiolite, and increased notably in both series with increasing vanadium loading. The better catalytic behaviour of the vanadium systems on the mixed support, which also exhibited the highest density of reducible sites (as well as the highest reducibility) and of the sites responsible for propanone formation, can be attributed not only to the different balance of vanadia species on the two supports (monomeric + oligomeric/polymeric), but also to other factors such as the nature of the support and, specifically, its chemical composition.
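The reported figures combine the standard conversion, selectivity, and yield definitions; a minimal sketch with illustrative numbers (not data from the study):

```python
def conversion(n_fed, n_unreacted):
    """Fraction of the fed reactant (here toluene) that reacted."""
    return (n_fed - n_unreacted) / n_fed

def selectivity(n_to_product, n_converted):
    """Fraction of the converted reactant ending up in a given product."""
    return n_to_product / n_converted

def product_yield(conv, sel):
    """Overall yield = conversion * selectivity."""
    return conv * sel
```

For example, 10% total conversion with 50% combined selectivity to benzaldehyde plus benzoic acid corresponds to a 5% overall yield of those products.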
A Program for Outliers Elimination in Multidimensional Space (POEMS), which allows the user to eliminate outliers from training spaces as a prior step to any statistical study, is presented. Although the program can be applied to any scientific field, the characteristics of POEMS make it particularly suitable for 'series design' in Quantitative Structure-Activity Relationship (QSAR) studies.
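The abstract does not specify POEMS's elimination criterion; one simple, assumed distance-to-centroid rule for multidimensional training points can be sketched as:

```python
import math

def remove_outliers(points, k=3.0):
    """Drop points whose Euclidean distance to the centroid exceeds the
    mean distance by more than k standard deviations. This is one common
    criterion chosen for illustration; POEMS's actual rule may differ."""
    dim = len(points[0])
    centroid = [sum(p[i] for p in points) / len(points) for i in range(dim)]
    dists = [math.dist(p, centroid) for p in points]
    mean = sum(dists) / len(dists)
    std = (sum((d - mean) ** 2 for d in dists) / len(dists)) ** 0.5
    return [p for p, d in zip(points, dists) if d <= mean + k * std]
```

A smaller `k` gives a more aggressive filter; in series design one would typically inspect the flagged compounds before discarding them.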
In this paper we propose a genetic algorithm (GA) for solving the DNA fragment assembly problem in a computational grid. The algorithm, named GrEA, is a steady-state GA that uses a panmictic population and is based on computing parallel function evaluations in an asynchronous way. We have implemented GrEA on top of the Condor system and used it to solve the DNA assembly problem. This is an NP-hard combinatorial optimization problem that is growing in importance and complexity as more research centers become involved in sequencing new genomes. While previous work on this problem has usually faced instances 30 K base pairs (bps) long, we have tackled here a 77 K bps instance to show how a grid system can move research forward. After analyzing the basic grid algorithm, we have studied the use of an improvement method to further enhance its scalability. Then, using a grid composed of up to 150 computers, we have achieved time reductions from tens of days down to a few hours and obtained near-optimal solutions when solving the 77 K bps instance (773 fragments). We conclude that our proposal is a promising approach both to taking advantage of a grid system to solve large DNA fragment assembly instances and to learning more about grid metaheuristics as a new class of algorithms for really challenging problems.
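A much-simplified, serial sketch of the underlying steady-state GA follows (overlap-based fitness over a permutation of fragments; GrEA's actual operators, asynchronous evaluation, and Condor-based distribution are not reproduced, and the random-immigrant step is our own addition for diversification):

```python
import random

def overlap(a, b):
    """Length of the longest suffix of `a` that is a prefix of `b`."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a[-k:] == b[:k]:
            return k
    return 0

def fitness(order, frags):
    """Sum of overlaps between consecutive fragments in the layout."""
    return sum(overlap(frags[order[i]], frags[order[i + 1]])
               for i in range(len(order) - 1))

def steady_state_ga(frags, pop_size=30, iters=2000, seed=1):
    rng = random.Random(seed)
    n = len(frags)
    # single shared (panmictic) population of random permutations
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    fit = [fitness(ind, frags) for ind in pop]
    for _ in range(iters):
        # binary tournament selection
        i, j = rng.randrange(pop_size), rng.randrange(pop_size)
        parent = pop[i] if fit[i] >= fit[j] else pop[j]
        if rng.random() < 0.2:
            child = rng.sample(range(n), n)            # random immigrant
        else:
            child = parent[:]
            a, b = rng.randrange(n), rng.randrange(n)
            child[a], child[b] = child[b], child[a]    # swap mutation
        cf = fitness(child, frags)
        w = min(range(pop_size), key=fit.__getitem__)  # steady state: replace worst
        if cf > fit[w]:
            pop[w], fit[w] = child, cf
    best = max(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

In a grid setting, the expensive `fitness` evaluations of candidate children would be farmed out to worker nodes and collected asynchronously, which is what makes the steady-state scheme a natural fit.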
Due to the increasing size and complexity of computer systems, reducing the overhead of fault tolerance techniques has become important in recent years. One such technique is checkpointing, which saves a snapshot with the information that has been computed up to a specific moment, suspending the execution of the application and consuming I/O resources and network bandwidth. Characterizing the files that are generated when checkpointing a parallel application is useful to determine the resources consumed and their impact on the I/O system. It is also important to characterize the application being checkpointed, and one of these characteristics is whether the application performs I/O. In this paper, we present a model of checkpoint behavior for parallel applications that perform I/O; the model depends on the application and on other factors such as the number of processes, the mapping of processes and the type of I/O used. These characteristics also influence scalability, the resources consumed and their impact on the I/O system. Our model describes the behavior of the checkpoint size based on the characteristics of the system and the type (or model) of I/O used, such as the number of I/O aggregator processes, the buffer size used by the two-phase I/O optimization technique and the components of collective file I/O operations. The BT benchmark and FLASH I/O are analyzed under different configurations of aggregator processes and buffer size to illustrate our approach. The model can be useful when selecting the checkpoint configuration most appropriate to the application's characteristics and the resources available. The user can thus know how much storage space the checkpoint consumes and how much the application consumes, in order to establish policies that help improve the distribution of resources.
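A toy additive version of such a size model (illustrative only; the paper's actual model is derived from system and I/O characteristics such as the two-phase technique's internals) might look like:

```python
def checkpoint_size_mb(n_procs, proc_image_mb, n_aggregators, buffer_mb):
    """Toy additive checkpoint-size model (illustrative assumption, not the
    paper's fitted model): every process saves its memory image, and each
    I/O aggregator additionally holds the two-phase collective-I/O buffer
    at checkpoint time."""
    return n_procs * proc_image_mb + n_aggregators * buffer_mb

# e.g. 64 processes of 100 MB each, 4 aggregators with 32 MB buffers
total = checkpoint_size_mb(64, 100, 4, 32)
```

Even this crude model captures the qualitative point of the abstract: the aggregator count and buffer size of the collective I/O layer show up directly in the checkpoint footprint.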
Time series prediction is a complex problem that consists of forecasting the future behavior of a set of data using only the previous data. The main difficulty is that most time series representing real phenomena include local behaviors that cannot be modelled by global approaches. This work presents a new procedure able to find predictable local behaviors and thus attain a better level of overall prediction. The new method is based on dividing the input space into Voronoi regions by means of Evolution Strategies. Our method has been tested on different time series domains: one represents the water demand in a water tank over a long period of time, and the other two are well-known examples of a chaotic time series (Mackey-Glass) and a natural-phenomenon time series (Sunspot). Results show that, in most cases, the proposed algorithm obtains better results than other commonly used algorithms.
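The prediction step over a fixed Voronoi partition can be sketched as follows (a per-region mean predictor stands in for the paper's local models, and the Evolution Strategy that places the prototypes is not shown):

```python
import math

def nearest_region(x, prototypes):
    """Index of the Voronoi region (nearest prototype) that contains x."""
    return min(range(len(prototypes)),
               key=lambda i: math.dist(x, prototypes[i]))

def fit_local_means(X, y, prototypes):
    """Train one predictor per region: here simply the mean target of the
    training points falling in that region (a stand-in local model)."""
    sums = [0.0] * len(prototypes)
    counts = [0] * len(prototypes)
    for xi, yi in zip(X, y):
        r = nearest_region(xi, prototypes)
        sums[r] += yi
        counts[r] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

def predict(x, prototypes, means):
    """Predict with the local model of the region that x falls into."""
    return means[nearest_region(x, prototypes)]
```

In the full method, an Evolution Strategy would adjust the prototype positions so that each resulting region groups inputs with a common, predictable local behavior.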
Data envelopment analysis (DEA) is a performance measurement tool that was initially developed without consideration of the decision maker (DM)'s preference structures. Since then, a wide literature has incorporated value judgements into DEA, such as goal and target setting models. However, most of these models require prior judgements on target or weight setting. This paper establishes an equivalence model between DEA and multiple objective linear programming (MOLP) and shows how a DEA problem can be solved interactively, without any prior judgements, by transforming it into an MOLP formulation. Various interactive multiobjective models are used to solve DEA problems with the aid of PROMOIN, an interactive multiobjective programming software tool. The DM can then search along the efficient frontier to locate the most preferred solution, where resource allocation and target levels based on the DM's value judgements can be set. An application to the efficiency analysis of retail banks in the UK is examined. Comparisons of the results among the interactive MOLP methods are investigated, and recommendations are made on which method may best fit the data set and the DM's preferences.
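For intuition, in the single-input, single-output case the CCR efficiency score reduces to a normalized output/input ratio; the general multi-input, multi-output case discussed above requires solving a linear program per decision-making unit:

```python
def dea_efficiency(inputs, outputs):
    """CCR efficiency scores for the single-input, single-output case,
    where the DEA linear program reduces to output/input ratios
    normalized by the best-performing unit."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

Units scoring 1.0 lie on the efficient frontier; in the interactive MOLP setting, the DM would then search along that frontier for the most preferred target levels.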