5 results found (search time: 5 ms)
1.
Pinto Ferreira M, DeLucia R, Luiz Aizenstein M, Glezer I, Scavone C. Canadian Metallurgical Quarterly. 1998;105(6-7):549-560
The modulatory actions of dopamine (DA) and fencamfamine (FCF) on Na,K-ATPase and Mg-ATPase activity were evaluated in rat striatum. DA and FCF induced a decrease in Na,K-ATPase activity without affecting Mg-ATPase activity. The effect of FCF was dose-dependent from 10 to 100 microM, with an IC50 of 4.7 x 10(-5) M. Furthermore, the effect of FCF (100 microM) in increasing cyclic AMP levels, but not cyclic GMP levels, was nonadditive with that of DA (10 microM), which is consistent with a common site of action. 8-Bromo-cyclic AMP also induced a specific reduction in Na,K-ATPase activity. The reduction of Na,K-ATPase activity induced by FCF (100 microM) was blocked by either SCH 23390 or sulpiride, which are D1 and D2 receptor antagonists, respectively. The decrease in striatal Na,K-ATPase activity induced by FCF was blocked by KT 5720, a selective inhibitor of cyclic AMP-dependent protein kinase (PKA), but not by KT 5823, a selective inhibitor of cyclic GMP-dependent protein kinase (PKG). By themselves, neither KT 5720 nor KT 5823 produced any change in Na,K-ATPase or Mg-ATPase activity. These data suggest that FCF reduces Na,K-ATPase activity through cyclic AMP-dependent changes in protein phosphorylation via a PKA mechanism.
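The reported IC50 of 4.7 x 10(-5) M (47 microM) is the concentration at which FCF produces half of its maximal inhibition. How such a value is typically extracted from dose-response measurements can be sketched with a four-parameter Hill fit; the data points below are illustrative placeholders, not values from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dose-response data: enzyme activity as % of control at
# increasing FCF concentrations (microM). Illustrative values only.
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
activity = np.array([99.0, 97.0, 88.0, 70.0, 45.0, 28.0])

def hill(c, bottom, top, ic50, h):
    """Four-parameter logistic (Hill) inhibition curve."""
    return bottom + (top - bottom) / (1.0 + (c / ic50) ** h)

# Fit the curve; p0 is a rough initial guess for (bottom, top, IC50, slope).
params, _ = curve_fit(hill, conc, activity, p0=[20.0, 100.0, 50.0, 1.0])
bottom, top, ic50, h = params
print(f"Estimated IC50 = {ic50:.1f} microM")
```

The IC50 is read off as a fitted parameter rather than interpolated from raw points, which makes it robust to the particular concentrations sampled.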
2.
We investigate the complexity of learning in the well-studied model in which the learning algorithm may ask membership and equivalence queries. While complexity-theoretic techniques have previously been used to prove hardness results in various learning models, these techniques typically are not strong enough to use when a learning algorithm may make membership queries. We develop a general technique for proving hardness results for learning with membership and equivalence queries (and for more general query models). We apply the technique to show that, assuming NP != co-NP, no polynomial-time membership and (proper) equivalence query algorithms exist for exactly learning read-thrice DNF formulas, unions of halfspaces over the Boolean domain, or several other related classes. Our hardness results are representation dependent, and do not preclude the existence of representation-independent algorithms. The general technique introduces the representation problem for a class F of representations (e.g., formulas), which is naturally associated with the learning problem for F. This problem is related to the structural question of how to characterize the functions representable by formulas in F, and is a generalization of standard complexity problems such as Satisfiability. While in general the representation problem lies in Sigma_2^p, we present a theorem demonstrating that for "reasonable" classes F, the existence of a polynomial-time membership and equivalence query algorithm for exactly learning F implies that the representation problem for F is in fact in co-NP. The theorem is applied to prove hardness results such as those mentioned above, by showing that the representation problem for specific classes of formulas is NP-hard.
Received: December 6, 1994
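The membership/equivalence query model the abstract studies can be made concrete with a toy class for which exact learning is easy: monotone conjunctions over n Boolean variables. The oracle and target below are illustrative, not from the paper; the point is only to show what the two query types look like:

```python
from itertools import product

N = 5
TARGET = frozenset({0, 3})  # hidden monotone conjunction: x0 AND x3 (illustrative)

def member(x):
    """Membership query: does the hidden formula accept assignment x?"""
    return all(x[i] for i in TARGET)

def learn_conjunction(n):
    """Exactly identify a monotone conjunction using n membership queries:
    flip each bit of the all-ones assignment to 0; the relevant variables
    are exactly those whose flip turns the answer to 'reject'."""
    relevant = set()
    for i in range(n):
        probe = [1] * n
        probe[i] = 0
        if not member(probe):
            relevant.add(i)
    return frozenset(relevant)

hyp = learn_conjunction(N)
# An equivalence query would compare hyp to the target; here we simulate
# it by brute force over all 2^n assignments.
assert all(member(x) == all(x[i] for i in hyp) for x in product([0, 1], repeat=N))
print(sorted(hyp))  # [0, 3]
```

The paper's hardness results say that for richer classes such as read-thrice DNF, no analogously efficient query strategy exists (under the stated complexity assumption), even though this simple class admits one.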
3.
Bressler I, Ben Bashat D, Buchsweiler Y, Aizenstein O, Limon D, Bokestein F, Blumenthal DT, Nevo U, Artzi M. Magma (New York, N.Y.). 2023;36(1):33-42
Magnetic Resonance Materials in Physics, Biology and Medicine - Treatment response assessment in patients with high-grade gliomas (HGG) is heavily dependent on changes in lesion size on MRI....
4.
Generalized tensor-based morphometry of HIV/AIDS using multivariate statistics on deformation tensors (cited 3 times: 0 self-citations, 3 by others)
Lepore N, Brun C, Chou YY, Chiang MC, Dutton RA, Hayashi KM, Luders E, Lopez OL, Aizenstein HJ, Toga AW, Becker JT, Thompson PM. IEEE Transactions on Medical Imaging. 2008;27(1):129-141
This paper investigates the performance of a new multivariate method for tensor-based morphometry (TBM). Statistics on Riemannian manifolds are developed that exploit the full information in deformation tensor fields. In TBM, multiple brain images are warped to a common neuroanatomical template via 3-D nonlinear registration; the resulting deformation fields are analyzed statistically to identify group differences in anatomy. Rather than study the Jacobian determinant (volume expansion factor) of these deformations, as is common, we retain the full deformation tensors and apply a manifold version of Hotelling's T^2 test to them in a log-Euclidean domain. In 2-D and 3-D magnetic resonance imaging (MRI) data from 26 HIV/AIDS patients and 14 matched healthy subjects, we compared multivariate tensor analysis against univariate tests of simpler tensor-derived indices: the Jacobian determinant, the trace, geodesic anisotropy, and eigenvalues of the deformation tensor, and the angle of rotation of its eigenvectors. Multivariate tests on the full tensor manifold detected consistent but more extensive patterns of structural abnormalities. Their improved power was established by analyzing cumulative p-value plots using false discovery rate (FDR) methods, appropriately controlling for false positives. This increased detection sensitivity may empower drug trials and large-scale studies of disease that use tensor-based morphometry.
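The core computation the abstract describes, mapping symmetric positive-definite deformation tensors into the log-Euclidean domain and running a two-sample Hotelling's T^2 test on the vectorized logs, can be sketched at a single voxel. The synthetic 2x2 tensors and group sizes below are illustrative stand-ins, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def matrix_log_spd(m):
    """Matrix logarithm of a symmetric positive-definite tensor via
    eigendecomposition: the log-Euclidean mapping."""
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.log(w)) @ v.T

def vectorize_sym(m):
    """Unique entries of a symmetric 2x2 tensor: (m00, m11, m01)."""
    return np.array([m[0, 0], m[1, 1], m[0, 1]])

def random_spd(scale):
    """Random SPD tensor near scale * identity (synthetic data)."""
    a = rng.normal(scale=0.1, size=(2, 2))
    return scale * np.eye(2) + a @ a.T

def hotelling_t2(x, y):
    """Two-sample Hotelling's T^2 on feature vectors (rows = subjects),
    using the pooled covariance of the two groups."""
    nx, ny = len(x), len(y)
    d = x.mean(axis=0) - y.mean(axis=0)
    s = ((nx - 1) * np.cov(x.T) + (ny - 1) * np.cov(y.T)) / (nx + ny - 2)
    return (nx * ny) / (nx + ny) * d @ np.linalg.solve(s, d)

# One voxel's deformation tensors: "patients" show mild volume expansion
# relative to "controls" (illustrative; group sizes mirror the abstract).
patients = np.array([vectorize_sym(matrix_log_spd(random_spd(1.3))) for _ in range(26)])
controls = np.array([vectorize_sym(matrix_log_spd(random_spd(1.0))) for _ in range(14)])
print(f"T^2 = {hotelling_t2(patients, controls):.2f}")
```

Because the full log-tensor is tested rather than a scalar summary like the Jacobian determinant, differences in shape and orientation, not just volume, contribute to the statistic, which is the source of the increased sensitivity the abstract reports.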
5.
We present two related results about the learnability of disjunctive normal form (DNF) formulas. First we show that a common approach for learning arbitrary DNF formulas requires exponential time. We then contrast this with a polynomial-time algorithm for learning most (rather than all) DNF formulas. A natural approach for learning Boolean functions involves greedily collecting the prime implicants of the hidden function. In a seminal paper of learning theory, Valiant demonstrated the efficacy of this approach for learning monotone DNF, and suggested this approach for learning DNF. Here we show that no algorithm using such an approach can learn DNF in polynomial time. We show this by constructing a counterexample DNF formula which would force such an algorithm to take exponential time. This counterexample seems to capture much of what makes DNF hard to learn, and is thus useful to consider when evaluating the run-time of a proposed DNF learning algorithm. This hardness result, as well as other hardness results for learning DNF, relies on the construction of particular hard-to-learn formulas, formulas that appear to be relatively rare. This raises the question of whether most DNF formulas are learnable. For certain natural definitions of most DNF formulas, we answer this question affirmatively.
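The greedy prime-implicant approach the abstract attributes to Valiant is easy to see in the monotone case, where it does work: shrink each positive example to a prime implicant by turning off bits while a membership oracle still accepts. The target formula below is an illustrative stand-in:

```python
from itertools import product

N = 4
# Hidden monotone DNF (illustrative): (x0 AND x1) OR (x2 AND x3)
TERMS = [frozenset({0, 1}), frozenset({2, 3})]

def member(x):
    """Membership oracle for the hidden monotone DNF."""
    return any(all(x[i] for i in t) for t in TERMS)

def shrink_to_prime_implicant(x):
    """Greedily turn off 1-bits while the example stays positive; for a
    monotone target, the surviving 1-bits form a prime implicant."""
    x = list(x)
    for i in range(len(x)):
        if x[i]:
            x[i] = 0
            if not member(x):
                x[i] = 1  # this bit was necessary; restore it
    return frozenset(i for i, b in enumerate(x) if b)

# Collect prime implicants from every positive example (brute-force here;
# a learner would draw examples instead of enumerating them).
learned = {shrink_to_prime_implicant(x)
           for x in product([0, 1], repeat=N) if member(x)}
print(sorted(sorted(t) for t in learned))  # [[0, 1], [2, 3]]
```

The paper's negative result is that for general (non-monotone) DNF, a counterexample formula can force any algorithm built around this kind of prime-implicant collection to enumerate exponentially many implicants.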