20 similar documents found; search took 15 ms.
1.
2.
3.
《Drug development and industrial pharmacy》2013,39(8):889-904
Attempts to establish and utilize in vitro/in vivo correlations for the assessment of extended-release (ER) solid oral dosage forms were reemphasized at a recent International Congress. In 1988 the United States Pharmacopeia's (USP) Subcommittee on Biopharmaceutics proposed three levels of such correlations, A, B, and C, in decreasing order of importance. The highest order, Level A, is assumed when the complete drug serum/plasma concentration versus time profile is successfully predicted from dissolution data. This report describes the successful establishment of Level A correlations for two different ER oral dosage forms of theophylline using the "biorelevant" technique first proposed by Leeson et al. in 1985. Dissolution studies were undertaken on the two formulations, namely Theodur® 300 mg tablets and Retafyllin 300 mg tablets, using USP Apparatus 2 (paddle) in buffered media over the pH range 3.0 to 7.5. These data were subsequently used to simulate in vivo profiles, which, under specific dissolution conditions, correlated extremely well with the in vivo data following administration of the respective dosage forms to healthy human volunteers.
4.
This paper discusses the current level of the road safety problems of cycling and cyclists, why cyclists run relatively high risks, and why cyclists may be considered 'vulnerable road users'. It is based on peer-reviewed research, which gives some idea of how to reduce the number of cyclist casualties. However, this research is rather limited, and the results cannot (easily) be transferred from one setting or country to another: generalization of results should only be done with the utmost care, if at all. Interventions to reduce cyclist casualties worldwide seem to be of an incidental nature; that is to say, they are implemented in a rather isolated way. In a Safe System approach, such as the Dutch Sustainable Safety vision, the inherent risks of traffic are dealt with in a systematic, proactive way. We illustrate how this approach is especially effective for vulnerable road users such as cyclists. Finally, the paper addresses the question of whether it is possible to make more cycling good for road safety. We conclude that when the number of cyclists increases, the number of fatalities may increase, but will not necessarily do so; the outcome depends on specific conditions. There is strong evidence that well-designed bicycle facilities (physically separated networks) reduce risks for cyclists and therefore have an impact on the net safety result, for example if car-kilometres are substituted by bicycle-kilometres. Policies to support cycling should incorporate these findings in order to make more cycling good for road safety.
5.
Hauer E 《Accident; analysis and prevention》2008,40(4):1634-1635
When a road safety study is contemplated, one has to establish how many accidents are needed to reach conclusions with a given level of confidence. Later, when the results are in, one has to be explicit about the confidence with which conclusions are stated. The purpose of this note is to describe a back-of-the-envelope way of answering such questions with a precision that is sufficient for practical purposes.
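The note's own formulas are not reproduced in this abstract, but the usual back-of-the-envelope logic rests on accident counts being roughly Poisson, so a count of n accidents has relative standard error 1/sqrt(n). A minimal sketch of that arithmetic (function names are mine, not Hauer's):

```python
import math

def accidents_needed(rel_se):
    """Accidents required for a Poisson count to have the given
    relative standard error (sd / mean = 1 / sqrt(n))."""
    return math.ceil(1.0 / rel_se ** 2)

def achieved_rel_se(n):
    """Relative standard error of an observed count of n accidents."""
    return 1.0 / math.sqrt(n)
```

For example, a 10% relative standard error needs about 100 accidents, and observing 400 accidents brings it down to 5%.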
6.
7.
In this article we present a precise definition of the notion "own-group preference" and characterize all functions capable
of correctly measuring it. Examples of such functions are provided. The weighted Lorenz curve and the theory developed for
it will be our main tools for reaching this goal. We further correct our earlier articles on this subject. In the context
of own-language preference, Bookstein and Yitzhaki proposed the logarithm of the odds-ratio as an acceptable measure of own-group
preference. We now present a general framework within which the concept of own-group preference, and its opposite, namely
own-group aversion, can be precisely pinpointed. This framework is derived form inequality theory and is based on the use
of the weighted Lorenz curve. The concept of own-group preference is an interesting notion with applications in different
fields such as sociology, political sciences, economics, management science and of course, the information sciences. Some
examples are provided.
This revised version was published online in June 2006 with corrections to the Cover Date. 相似文献
8.
Mathematical models based on average or steady-state conditions are insufficient when dealing with dynamic situations faced by production–inventory systems in current business environments. Therefore, the use of mathematical tools based on control theory to handle time-varying phenomena has been reinvigorated in order to accommodate these new needs. Given the variety of research approaches in the field over several decades, there is a need to provide a review of this work. This review identifies some major research efforts for applying control-theoretic methods to production–inventory systems. It is shown that, in general, control theory is applied to reduce inventory variation, reduce demand amplification, and optimize ordering rules. Control-theory tools applied include block diagram algebra, Mason's gain formula, Bode plots, the Laplace transform, the Z transform, and optimal control. Basic approaches are classified within stochastic control theory and deterministic control theory. Two important issues are then identified within the deterministic models. First, separate efforts to integrate systems horizontally (i.e. supply chain) or vertically (i.e. hierarchical approach) are identified. Second, none of the reviewed models implemented a systematic way to calculate all the required model parameters. Some authors presented suggestions to optimize some parameters, but no reference was found that tried to obtain these parameter values from a real system.
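To illustrate the kind of ordering rule these control-theoretic models analyse, here is a hypothetical proportional controller: each period it orders current demand plus a fraction alpha of the inventory deficit, with a one-period replenishment lag. It is a toy sketch of the general idea, not a model from the reviewed literature:

```python
def simulate(demand, target=20.0, alpha=0.5):
    """Order each period's demand plus a fraction alpha of the
    inventory deficit; orders arrive with a one-period lag.
    Returns the sequence of orders placed."""
    inventory = target
    pipeline = demand[0]          # assume steady state before t = 0
    orders = []
    for d in demand:
        inventory += pipeline - d             # receive last order, ship demand
        order = d + alpha * (target - inventory)
        pipeline = order
        orders.append(order)
    return orders
```

A step in demand from 10 to 14 produces an order peak above 14 (demand amplification); a smaller alpha damps the peak at the cost of slower inventory recovery, which is exactly the trade-off the reviewed ordering-rule optimizations address.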
9.
Feed-in tariffs (FITs) are among the most favoured policies with which to drive the deployment of renewable energy. This paper offers insights into quantifying dynamic FITs to realise the expected installed capacity target with minimum policy cost under uncertainties of renewable intermittence and technology learning. We incorporate real options and use stochastic dynamic programming to model the strategic behaviour between policy-maker and investor, and extend the one-time investment decision described by Farrell et al. [2017. 'Specifying an Efficient Renewable Energy Feed-in Tariff.' The Energy Journal 38: 53–75] to multiple-period decisions. An approach that combines binary tree scenario generation and a least squares Monte Carlo method is used to numerically identify the optimal FIT plan in practice. China's offshore wind power investment is used as a case study to investigate the relationships among the optimal dynamic FIT level, the total policy cost, the expected capacity target, and the learning effect. The simulation results demonstrate that our proposed dynamic FITs can track the changes in technology learning well, and that they can avoid the inefficiency of fixed FITs in stimulating technology adoption in the initial periods, along with overpayment by the policy-maker.
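The paper's FIT model itself is not reproduced here, but the least squares Monte Carlo machinery it relies on is the standard Longstaff–Schwartz algorithm: simulate paths forward, then work backward, regressing continuation values on the state to decide when to exercise. A self-contained sketch on the textbook example (an American put under geometric Brownian motion; all parameter values below are illustrative, not the paper's):

```python
import numpy as np

def american_put_lsm(s0=36.0, strike=40.0, r=0.06, sigma=0.2, T=1.0,
                     steps=50, paths=20000, seed=0):
    """Longstaff-Schwartz least squares Monte Carlo price of an
    American put under geometric Brownian motion."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    # simulate GBM paths; column k holds prices at time (k + 1) * dt
    z = rng.standard_normal((paths, steps))
    s = s0 * np.exp(np.cumsum((r - 0.5 * sigma ** 2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    cash = np.maximum(strike - s[:, -1], 0.0)  # exercise value at expiry
    for t in range(steps - 2, -1, -1):
        cash *= np.exp(-r * dt)                # discount one step back
        itm = strike - s[:, t] > 0.0           # regress in-the-money paths only
        if itm.sum() < 3:
            continue
        x = s[itm, t]
        coef = np.polyfit(x, cash[itm], 2)     # quadratic continuation-value fit
        continuation = np.polyval(coef, x)
        exercise = strike - x
        stop = exercise > continuation         # exercise beats estimated holding
        idx = np.nonzero(itm)[0][stop]
        cash[idx] = exercise[stop]
    return float(np.exp(-r * dt) * cash.mean())
```

The same backward-regression idea carries over to multiple-period policy decisions: the "exercise" branch becomes the policy action taken in a period, and the regression estimates the value of waiting.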
10.
Although much concern over type I errors has permeated psychology for decades, there is far less concern over type II errors. In fact, type II errors constitute a serious problem in safety research that can result in accidents and fatalities, because researchers fail to reject the null hypothesis due to arbitrary probability thresholds. The purpose of this paper is to reveal how often type II errors occur and the effect they have on applied ergonomics research. Computer simulations using population parameters reveal that type II errors happen quite often, particularly with effect sizes between 0.2 and 1.2. A utility analysis also reveals that the cost of type II errors to society is much greater than it needs to be. Solutions for avoiding type II errors are discussed.
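The paper's simulations can be approximated in miniature: draw samples under a true nonzero effect, apply a fixed significance threshold, and count how often the false null survives. A sketch assuming a two-sided one-sample z-test with known unit variance (the paper's own simulation design and parameters are not reproduced here):

```python
import random
import statistics

def type2_rate(effect_size, n=20, trials=4000, seed=1):
    """Estimate by simulation the type II error rate of a two-sided
    one-sample z-test with known sigma = 1 and alpha = 0.05."""
    rng = random.Random(seed)
    crit = 1.96                      # two-sided 5% critical value
    misses = 0
    for _ in range(trials):
        sample = [rng.gauss(effect_size, 1.0) for _ in range(n)]
        z = statistics.mean(sample) * n ** 0.5
        if abs(z) < crit:            # the (false) null survives
            misses += 1
    return misses / trials
```

With n = 20, an effect size of 0.5 is missed roughly 40% of the time, which illustrates the paper's point that moderate effects in the 0.2–1.2 range routinely slip past conventional thresholds.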
11.
It is widely feared that a novel, highly pathogenic, human transmissible influenza virus may evolve that could cause the next global pandemic. Mitigating the spread of such an influenza pandemic would require not only the timely administration of antiviral drugs to those infected, but also the implementation of suitable intervention policies for stunting the spread of the virus. Towards this end, mathematical modelling and simulation studies are crucial as they allow us to evaluate the predicted effectiveness of the various intervention policies before enforcing them. Diagnosis plays a vital role in the overall pandemic management framework by detecting and distinguishing the pathogenic strain from the less threatening seasonal strains and other influenza-like illnesses. This allows treatment and intervention to be deployed effectively, given limited antiviral supplies and other resources. However, the time required to design a fast and accurate testkit for novel strains may limit the role of diagnosis. Herein, we aim to investigate the cost and effectiveness of different diagnostic methods using a stochastic agent-based city-scale model, and then address the issue of whether conventional testing approaches, when used with appropriate intervention policies, can be as effective as fast testkits in containing a pandemic outbreak. We found that for mitigation purposes, fast and accurate testkits are not necessary as long as sufficient medication is given, and are generally recommended only when used with extensive contact tracing and prophylaxis. Additionally, in the event of insufficient medication and fast testkits, the use of slower, conventional testkits together with proper isolation policies while waiting for the diagnostic results can be an equally effective substitute.
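The interplay between diagnosis delay and containment can be caricatured with a branching process in which each case transmits until it is diagnosed and isolated. This toy model is far simpler than the paper's stochastic city-scale agent-based simulator, and all rates below are invented for illustration:

```python
import random

def outbreak_size(diagnosis_delay, p_per_day=0.8, trials=60,
                  max_cases=2000, seed=7):
    """Mean final outbreak size when each case transmits with daily
    probability p_per_day until diagnosed and isolated after
    diagnosis_delay days. A toy branching process, not the paper's
    agent-based model."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        active, total = 1, 1
        while active and total < max_cases:
            new = 0
            for _ in range(active):
                # daily transmission chances during the infectious window
                for _ in range(diagnosis_delay):
                    if rng.random() < p_per_day:
                        new += 1
            active, total = new, total + new
        totals.append(total)
    return sum(totals) / trials
```

A one-day delay keeps the effective reproduction number below 1 and outbreaks die out, while a three-day delay pushes it well above 1; this is the qualitative mechanism by which isolation policies can compensate for slower test kits.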
12.
Jenkins T Bovi A Edwards R 《Philosophical transactions. Series A, Mathematical, physical, and engineering sciences》2011,369(1942):1826-1839
Depletion of oil reserves and the associated effects on climate change have prompted a re-examination of the use of plant biomass as a sustainable source of organic carbon for the large-scale production of chemicals and materials. While initial emphasis has been placed on biofuel production from edible plant sugars, the drive to reduce the competition between crop usage for food and non-food applications has prompted massive research efforts to access the less digestible saccharides in cell walls (lignocellulosics). This in turn has prompted an examination of the use of other plant-derived metabolites for the production of chemicals spanning the high-value speciality sectors through to platform intermediates required for bulk production. The associated science of biorefining, whereby all plant biomass can be used efficiently to derive such chemicals, is now rapidly developing around the world. However, it is clear that the heterogeneity and distribution of organic carbon between valuable products and waste streams are suboptimal. As an alternative, we now propose the use of synthetic biology approaches to 're-construct' plant feedstocks for optimal processing of biomass for non-food applications. Promising themes identified include re-engineering polysaccharides, deriving artificial organelles, and the reprogramming of plant signalling and secondary metabolism.
13.
14.
Inventor disambiguation is an increasingly important issue for users of patent data. We propose and test a number of refinements to the Massacrator algorithm, originally proposed by Lissoni et al. (The KEINS database on academic inventors: methodology and contents, 2006) and now applied to APE-INV, a free-access database funded by the European Science Foundation. Following Raffo and Lhuillery (Res Policy 38:1617–1627, 2009), we describe disambiguation as a three-step process: cleaning & parsing, matching, and filtering. By means of sensitivity analysis based on Monte Carlo simulations, we show how various filtering criteria can be manipulated in order to obtain optimal combinations of precision and recall (type I and type II errors). We also show how these different combinations generate different results for applications to studies of inventors' productivity, mobility, and networking, and discuss data-quality problems related to linguistic issues. The filtering criteria based on information about inventors' addresses are sensitive to data quality, while those based on information about co-inventorship networks are always effective. Details on data access and data-quality improvement via feedback collection are also discussed.
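The three-step pipeline (cleaning & parsing, matching, filtering) can be sketched in a few lines. The cleaning rule and the exact-match criterion below are deliberately naive stand-ins for the Massacrator algorithm's actual criteria, chosen only to show how precision and recall are scored against a gold standard:

```python
import re

def clean(name):
    """Cleaning & parsing: lowercase, drop punctuation, squeeze spaces."""
    name = re.sub(r"[^a-z ]", "", name.lower())
    return re.sub(r" +", " ", name).strip()

def match(records):
    """Matching: pair up record ids whose cleaned names coincide."""
    groups = {}
    for rid, name in records:
        groups.setdefault(clean(name), []).append(rid)
    pairs = set()
    for ids in groups.values():
        pairs |= {(a, b) for a in ids for b in ids if a < b}
    return pairs

def precision_recall(predicted, gold):
    """Filtering criteria are tuned to trade precision (type I errors)
    against recall (type II errors) relative to a gold standard."""
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 1.0
    recall = tp / len(gold) if gold else 1.0
    return precision, recall
```

Tightening the match criterion raises precision and lowers recall, and vice versa; the sensitivity analysis in the paper explores exactly this frontier for the real filtering criteria.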
15.
With the ever-expanding number of manufactured nanomaterials (MNMs) under development, there is a vital need for nanotoxicology studies that test the potential for MNMs to cause harm to health. An extensive body of work in cell cultures and animal models is vital to understanding the physicochemical characteristics of MNMs and the biological mechanisms that underlie any detrimental actions to cells and organs. In human subjects, exposure monitoring is combined with measurement of selected health parameters in small panel studies, especially in occupational settings. However, the availability of further in vivo human data would greatly assist the risk assessment of MNMs. Here, the potential for controlled inhalation exposures of MNMs in human subjects is discussed. Controlled exposures to carbon, gold, aluminum, and zinc nanoparticles in humans have already set a precedent demonstrating the feasibility of this approach. These studies have provided considerable insight into the potential (or not) of nanoparticles to induce inflammation, alter lung function, affect the vasculature, reach the systemic circulation, and accumulate in other organs. The need for further controlled exposures of MNMs in human volunteers, to establish no-effect limits and biological mechanisms and to provide vital data for the risk assessment of MNMs, is advocated.
16.
Cottingham K 《Analytical chemistry》2005,77(9):197A-200A
17.
18.
Jiameng Liu Linghao He Shuangrun Zhao Lijun Hu Sizhuan Li Zhihong Zhang Miao Du 《Small (Weinheim an der Bergstrasse, Germany)》2023,19(42):2302600
An n-n type heterojunction comprising Cu–N and B–N dual active sites is synthesized via in situ growth of a conductive metal–organic framework (MOF), [Cu3(HITP)2] (HITP = 2,3,6,7,10,11-hexaiminotriphenylene), on hexagonal boron nitride (h-BN) nanosheets (hereafter denoted as Cu3(HITP)2@h-BN) for the electrocatalytic nitrogen reduction reaction (eNRR). The optimized Cu3(HITP)2@h-BN shows outstanding eNRR performance, with an NH3 production rate of 146.2 µg h−1 mgcat−1 and a Faraday efficiency of 42.5%, owing to its high porosity, abundant oxygen vacancies, and Cu–N/B–N dual active sites. The construction of the n-n heterojunction efficiently modulates the density of states of the active metal sites toward the Fermi level, facilitating charge transfer at the interface between the catalyst and reactant intermediates. Additionally, the pathway of NH3 production catalyzed by the Cu3(HITP)2@h-BN heterojunction is illustrated by in situ FT-IR spectroscopy and density functional theory calculations. This work presents an alternative approach to designing advanced electrocatalysts based on conductive MOFs.
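The reported Faraday efficiency follows from the standard electron count for N2 reduction (three electrons per NH3 molecule): FE = 3·F·n(NH3)/Q. A small sketch of that arithmetic, using only the stoichiometry and physical constants (no values from the paper beyond the definition):

```python
FARADAY = 96485.0        # Faraday constant, C per mol of electrons
M_NH3 = 17.031           # molar mass of NH3, g per mol

def faraday_efficiency(mass_nh3_g, charge_c):
    """Fraction of the passed charge consumed by the 3-electron
    reduction of N2 to NH3: FE = 3 * F * n(NH3) / Q."""
    moles_nh3 = mass_nh3_g / M_NH3
    return 3.0 * FARADAY * moles_nh3 / charge_c
```

As a sanity check, producing one mole of NH3 with exactly three moles of electrons of charge gives FE = 1.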
19.
20.
Kinetic theory, with the assumption of equipartition of granular energy, suggests that the pressure and viscosity of a granular mixture vary monotonically with the mass ratio. Our simulation results show a non-monotonic behaviour that can be explained qualitatively by a simple model allowing for non-equipartition of granular energy between species of different mass.