Acute lung injury (ALI) afflicts approximately 200,000 patients annually and carries a 40% mortality rate. The COVID-19 pandemic has markedly increased the incidence of ALI. The pathogenesis of ALI involves tissue damage from invading microbes and, in severe cases, overexpression of inflammatory cytokines such as tumor necrosis factor-α (TNF-α) and interleukin-1β (IL-1β). This study aimed to develop a therapy that normalizes the excess production of inflammatory cytokines and promotes tissue repair in lipopolysaccharide (LPS)-induced ALI. Based on our previous studies, we tested insulin-like growth factor I (IGF-I) and BTP-2 as therapies. IGF-I was selected because we and others have shown that elevated inflammatory cytokines suppress the expression of growth hormone receptors in the liver, leading to a decrease in circulating IGF-I. IGF-I is a growth factor that increases vascular protection, enhances tissue repair, and decreases pro-inflammatory cytokines. It is also required for the production of anti-inflammatory 1,25-dihydroxyvitamin D. BTP-2, an inhibitor of cytosolic calcium entry, was used to suppress the LPS-induced increase in cytosolic calcium, which otherwise leads to an increase in pro-inflammatory cytokines. We showed that LPS increased the expression of primary inflammatory mediators such as toll-like receptor-4 (TLR-4), IL-1β, interleukin-17 (IL-17), TNF-α, and interferon-γ (IFN-γ), all of which were normalized in the lungs by the IGF-I + BTP-2 dual therapy, along with improvement in vascular gene expression markers. The histologic lung injury score was markedly elevated by LPS and reduced to normal by the combination therapy. In conclusion, the LPS-induced increases in inflammatory cytokines, vascular injury, and lung injury were all improved by the IGF-I + BTP-2 combination therapy.
Software development processes have been evolving from rigid, pre-specified, and sequential to incremental and iterative. This evolution has been dictated by the need to accommodate evolving user requirements and to reduce the delay between a design decision and feedback from users. Formal verification techniques, however, have largely ignored this evolution; even where they have made enormous improvements and found significant use in practice, as in the case of model checking, they have remained confined to the niche of safety-critical systems. Model checking verifies whether a system's model \(\mathcal{M}\) satisfies a set of requirements, formalized as a set of logic properties \(\Phi\). Current model-checking approaches, however, implicitly rely on the assumption that both the complete model \(\mathcal{M}\) and the whole set of properties \(\Phi\) are fully specified when verification takes place. Very often, however, \(\mathcal{M}\) is subject to change because its development is iterative and its definition evolves through stages of incompleteness, in which alternative design decisions are explored, typically to evaluate quality trade-offs. Evolving system specifications of this kind call for novel verification approaches that tolerate incompleteness and support incremental analysis of alternative designs for certain functionalities. This is exactly the focus of this paper, which develops an incremental model-checking approach for evolving Statecharts. Statecharts were chosen both because they are increasingly used in practice and because they natively support model refinement.
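The core model-checking question of whether a model \(\mathcal{M}\) satisfies a property \(\Phi\) can be illustrated in its simplest form: an invariant checked by explicit-state reachability. The toy transition system and state names below are a hypothetical example, not the paper's Statecharts machinery:

```python
from collections import deque

# Toy transition system: states and their successor map
# (illustrative example only, not taken from the paper).
transitions = {
    "idle":  ["busy"],
    "busy":  ["idle", "error"],
    "error": ["error"],
}
initial = "idle"

def check_invariant(ok):
    """Breadth-first reachability check of an invariant property:
    does every reachable state satisfy ok(state)?
    Returns a counterexample path if not, else None."""
    frontier = deque([[initial]])
    visited = {initial}
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if not ok(state):
            return path  # counterexample trace from the initial state
        for nxt in transitions[state]:
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None
```

For instance, `check_invariant(lambda s: s != "error")` returns the counterexample trace `["idle", "busy", "error"]`, while a property that holds everywhere returns `None`. Real model checkers additionally handle temporal operators and symbolic state spaces, which this sketch omits.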
The temperature dependence of the diffusion coefficient of ethanol-soluble substances from ground cloves (particle size 250 μm) during extraction was estimated by fitting batch extraction data at several temperatures (27.8, 40, 50, and 60°C) to a previously developed mass transfer model. The model was based on spherical geometry of particles. Nonlinear regression analysis was used to develop an equation that describes the diffusivity as a function of temperature. The temperature dependence of D_A was of the Arrhenius type.
Importance sampling is a technique commonly used to speed up Monte Carlo simulation of rare events. However, little is known about the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviations analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. We impose strict deadlines on a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadlines for different loads and deadlines. Finally, we show that the probability of total population overflow can be affected by the deadline values, service rates, and arrival rates.
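As a minimal illustration of importance sampling via a change of measure, the sketch below estimates a Gaussian tail probability by sampling from a shifted distribution and reweighting each sample by the likelihood ratio. It is a textbook toy, not the Jackson-network estimator analysed in the article:

```python
import numpy as np

rng = np.random.default_rng(0)
a = 4.0        # tail threshold: estimate P(Z > 4) for Z ~ N(0, 1)
n = 100_000

# Naive Monte Carlo almost never sees {Z > 4} (p ≈ 3.2e-5 needs ~30k
# samples per hit). Instead, sample from the tilted measure N(a, 1)
# and reweight by the likelihood ratio
#   phi(z) / phi(z - a) = exp(a^2/2 - a z).
z = rng.normal(loc=a, scale=1.0, size=n)
weights = np.exp(a * a / 2.0 - a * z)
p_hat = np.mean((z > a) * weights)
```

Under the shifted measure roughly half the samples land in the rare region, so the estimator's relative error is orders of magnitude smaller than naive simulation at the same budget; the hard part in queueing networks, as the article notes, is choosing a change of measure that retains this property.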
We introduce N‐PolyVector fields, a generalization of N‐RoSy fields for which the vectors are neither necessarily orthogonal nor rotationally symmetric. We formally define a novel representation for N‐PolyVectors as the root sets of complex polynomials and analyze their topological and geometric properties. A smooth N‐PolyVector field can be efficiently generated by solving a sparse linear system without integer variables. We exploit the flexibility of N‐PolyVector fields to design conjugate vector fields, offering an intuitive tool to generate planar quadrilateral meshes.
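The root-set representation can be sketched numerically: identifying the 2D tangent plane with the complex plane, a set of N tangent vectors is stored as the coefficients of the monic degree-N polynomial whose roots they are, and recovered by root finding. The root values below are illustrative, not taken from the paper:

```python
import numpy as np

# A 4-PolyVector at one point: four tangent vectors as complex numbers.
# Note they are neither orthogonal nor rotationally symmetric in general;
# these particular values are illustrative only.
vectors = np.array([1 + 0j, -1 + 0j, 0 + 2j, 0 - 2j])

# Representation: coefficients of the monic polynomial with those roots.
# Here (z^2 - 1)(z^2 + 4) = z^4 + 3z^2 - 4.
coeffs = np.poly(vectors)

# Recovery: the vectors are the root set of the polynomial, so they are
# stored as an unordered set, which is what makes the representation
# smooth across a mesh without integer matching variables.
recovered = np.roots(coeffs)
```

Because the coefficients are symmetric functions of the roots, any permutation of the vectors maps to the same coefficient vector, which is the key to optimizing over such fields with a sparse linear system.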
This paper presents an empirical study of control logic specifications used to document industrial control logic code in manufacturing applications. More than one hundred input/output-related property specifications from ten different reusable function blocks were investigated. The main purpose of the study was to provide an understanding of how such specifications are expressed by industrial practitioners, in order to develop new tools and methods for specifying control logic software, as well as to evaluate existing ones. In this paper, the studied specifications are used to evaluate linear temporal logic in general and the specification language ST-LTL, tailored for function blocks, in particular. The study shows that most specifications are expressed as implications between input and output conditions that should always be fulfilled. Many of these implications are complex, since the input and output conditions may be mixed and may involve sequences, timers, and non-Boolean variables. Using ST-LTL it was possible to represent all implications in this study. The few non-implication specifications could be specified in ST-LTL as well, after being altered to suit the specification language. The paper demonstrates some advantages of ST-LTL over standard linear temporal logic and discusses possible improvements, such as support for automatic rewriting of complex specifications.
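The dominant specification shape found in the study, an implication between input and output conditions that should always hold, corresponds to the temporal formula G(antecedent → consequent). The sketch below evaluates that semantics over a finite trace of scan-cycle snapshots; the signal names are hypothetical and this is not ST-LTL syntax itself:

```python
# Each scan cycle is a snapshot of function-block signals
# (signal names are invented for illustration).
trace = [
    {"start": False, "motor_on": False},
    {"start": True,  "motor_on": True},
    {"start": True,  "motor_on": True},
    {"start": False, "motor_on": False},
]

def always_implies(trace, antecedent, consequent):
    """Evaluate the invariant implication G(antecedent -> consequent)
    over a finite trace: in every snapshot where the antecedent holds,
    the consequent must hold too."""
    return all((not antecedent(s)) or consequent(s) for s in trace)
```

A snapshot with `start` true but `motor_on` false anywhere in the trace makes the property fail; the more complex implications in the study would additionally need sequence and timer operators on top of this core.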
Statistical selectivity models were developed for four different Fischer–Tropsch synthesis product ranges, namely methane (CH4), light olefins (C2=–C4=), light paraffins (C2–C4), and long-chain hydrocarbons (C5+), based on experimental data obtained over thirteen γ-Al2O3-supported cobalt-based catalysts with different cobalt particle and pore sizes. The input variables were the cobalt metal particle size and the catalyst pore size. Cubic and quadratic polynomial equations were fitted to the experimental data; the mathematical models were then subjected to model reduction to enhance model adequacy, which was assessed through ANOVA. Multi-objective optimization revealed that the maximum C5+ selectivity (84.150%) could be achieved at a cobalt particle size of 14.764 nm and a pore size of 23.129 nm, while keeping the selectivity to the other hydrocarbon products at a minimum.
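The quadratic response-surface step can be sketched as follows, using synthetic data in place of the experimental measurements; the coefficients and the resulting optimum are illustrative only:

```python
import numpy as np

# Synthetic (particle size, pore size) -> C5+ selectivity data, built from
# an assumed quadratic surface with its maximum near (15 nm, 23 nm).
rng = np.random.default_rng(1)
x1 = rng.uniform(5, 25, 30)    # cobalt particle size, nm
x2 = rng.uniform(8, 30, 30)    # pore size, nm
y = 80 - 0.05 * (x1 - 15)**2 - 0.03 * (x2 - 23)**2 + rng.normal(0, 0.1, 30)

# Full quadratic response surface: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface: solve grad = 0, i.e.
#   b1 + 2*b3*x1 + b5*x2 = 0
#   b2 + 2*b4*x2 + b5*x1 = 0
A = np.array([[2 * beta[3], beta[5]],
              [beta[5],     2 * beta[4]]])
x_opt = np.linalg.solve(A, -beta[1:3])
```

In a real workflow, each term's significance would be tested via ANOVA and insignificant terms dropped (the model reduction mentioned above) before locating the optimum.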