Lipogenesis is the process by which fatty acids are synthesized. In metabolic syndrome, an insulin-resistant state, high plasma levels of free fatty acids (FFA), and hyperglycemia may all contribute to the lipogenic process. The aim of the present study was to investigate the effects of oral administration of metformin on the expression of lipogenic genes and the glycemic profile in mice fed a low-carbohydrate high-fat diet, by evaluating their metabolic profile. Male Swiss mice were divided into 4 groups (N = 7) fed the following diets: standard (ST), standard plus metformin (ST + MET), low-carbohydrate high-fat diet (LCHFD), and low-carbohydrate high-fat diet plus metformin (LCHFD + MET) (100 mg kg⁻¹ diet). Food intake, body weight, and blood parameters, such as glucose tolerance, insulin sensitivity, glucose, HDL-c, total cholesterol, triglycerides, AST, and ALT levels, were assessed. Histological analyses were performed on hematoxylin and eosin-stained specimens of epididymal adipose tissue. The expression levels of peroxisome proliferator-activated receptor γ (PPARγ), sterol regulatory element-binding protein 1 (SREBP1), fatty acid synthase (FAS), and acetyl-CoA carboxylase (ACC) were assessed by RT-PCR. This study showed that metformin decreased adipocyte area, body weight, and food consumption in obese animals when compared to the standard group. Furthermore, the expression of lipogenic markers in adipose tissue was diminished in obese animals treated with metformin. These data show that oral administration of metformin improved glucose and lipid metabolic parameters in white adipose tissue by reducing the expression of lipogenesis markers, suggesting an important clinical application of MET in treating obesity-related diseases in metabolic syndrome.
Engineering with Computers - In this article, a methodology based on the Discrete Element Method (DEM) and the Finite Element Method (FEM) combined with a modified Approximate Periodic Boundary Condition...
This paper presents a PVS development of relevant results of the theory of rings. The PVS theory includes complete proofs of the three classical isomorphism theorems for rings, and characterizations of principal, prime and maximal ideals. Algebraic concepts and properties are specified and formalized as generally as possible allowing in this manner their application to other algebraic structures. The development provides the required elements to formalize important algebraic theorems. In particular, the paper presents the formalization of the general algebraic-theoretical version of the Chinese remainder theorem (CRT) for the theory of rings, as given in abstract algebra textbooks, proved as a consequence of the first isomorphism theorem. Also, the PVS theory includes a formalization of the number-theoretical version of CRT for the structure of integers, which is the version of CRT found in formalizations. CRT for integers is obtained as a consequence of the general version of CRT for the theory of rings.
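The number-theoretical version of CRT for integers described above has a direct computational counterpart. The sketch below is ours, written in Python rather than PVS, and is only an illustration of the statement being formalized: given pairwise-coprime moduli, a system of congruences has a unique solution modulo their product.

```python
from math import gcd

def crt(residues, moduli):
    """Solve x ≡ r_i (mod m_i) for pairwise-coprime moduli m_i.

    Returns the unique solution modulo the product of the moduli.
    """
    assert all(gcd(a, b) == 1
               for i, a in enumerate(moduli)
               for b in moduli[i + 1:]), "moduli must be pairwise coprime"
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        # pow with exponent -1 computes the modular inverse (Python >= 3.8);
        # it exists because Mi and m are coprime.
        x += r * Mi * pow(Mi, -1, m)
    return x % M

print(crt([2, 3, 2], [3, 5, 7]))  # -> 23, since 23 ≡ 2 (mod 3), 3 (mod 5), 2 (mod 7)
```

The general ring-theoretic version proved in the PVS development subsumes this integer case.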
Interior gateway routing protocols like Open Shortest Path First (OSPF) and Distributed Exponentially Weighted Flow Splitting (DEFT) send flow through forward links toward the destination node. OSPF routes only on shortest-weight paths, whereas DEFT sends flow on all forward links, but with an exponential penalty on longer paths. Finding suitable weights for these protocols is known as the weight setting problem (WSP). In this paper, we present a biased random-key genetic algorithm for WSP under both protocols. The algorithm uses dynamic flow and dynamic shortest-path computations. We report computational experiments showing that DEFT achieves less network congestion than OSPF, though at the cost of larger delays.
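DEFT's exponential penalty can be illustrated with a small sketch. Assuming the shortest-path distances d[v] from each node to the destination are already known, the fraction of flow a node sends on each forward link is proportional to exp(-gap), where the gap measures how far the link deviates from a shortest path. The function and variable names below are ours, not the paper's:

```python
import math

def deft_split(dist_to_dest, weights, u):
    """Fraction of flow node u sends on each of its forward links.

    dist_to_dest: shortest-path distance d[v] from each node to the
    destination under the given weights; weights: {(u, v): w}.
    A link (u, v) is 'forward' when d[v] < d[u]; its share is
    proportional to exp(-gap) with gap = w + d[v] - d[u], so links on
    shortest paths (gap = 0) receive the largest share.
    """
    d = dist_to_dest
    fwd = {v: w for (a, v), w in weights.items()
           if a == u and d[v] < d[u]}
    pen = {v: math.exp(-(w + d[v] - d[u])) for v, w in fwd.items()}
    total = sum(pen.values())
    return {v: p / total for v, p in pen.items()}
```

With all penalties concentrated on zero-gap links this degenerates to OSPF-style shortest-path routing; the exponential smoothing is what spreads flow and lowers congestion.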
The safe belief semantics uses intermediate logics to define an extension of answer sets to all propositional formulas, but only considering one kind of negation. In this work we extend safe beliefs by adding the strong negation connective. The main feature of our extension is that strong negation can occur before any formula, and not only at the atomic level. We give results concerning the relation between strong negation extensions of intermediate logics and safe beliefs, and consider the way in which strong negation can be eliminated from any formula while preserving its semantics. We also propose two new notions of equivalence: substitution equivalence and contextualized equivalence. We prove that they are both more general than strong equivalence and, for propositional formulas where strong negation may occur at the non-atomic level, substitution equivalence captures a notion of equivalence that cannot be captured by strong equivalence alone.
Large-scale similarity search engines are complex systems devised to process unstructured data like images and videos. These systems are deployed on clusters of distributed processors connected through high-speed networks. To process a new query, a distance function is evaluated between the query and the objects stored in the database. This process relies on a metric-space index distributed among the processors. In this paper, we propose a cache-based strategy devised to reduce the number of computations required to retrieve the top-k object results for user queries by using pre-computed information. Our proposal executes an approximate similarity-search algorithm that takes advantage of the links between objects stored in the cache memory. These links form a graph of similarity among pre-computed queries. Compared to previous methods in the literature, the proposed approach reduces the number of distance evaluations by up to 60%.
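The graph-of-cached-queries idea can be sketched as a greedy best-first search that walks the similarity graph toward the query while counting distance evaluations. This is an illustrative assumption of ours, not the paper's actual algorithm; all names are hypothetical:

```python
import heapq

def approx_knn(query, cache_graph, objects, dist, k=3, start=0, budget=50):
    """Greedy best-first search over a similarity graph of cached queries.

    cache_graph: {obj_id: [neighbour ids]}, built from links among
    pre-computed query results. The search expands the closest known
    node first and stops after `budget` distance evaluations, instead
    of evaluating the distance to every object in the database.
    """
    seen = {start}
    frontier = [(dist(query, objects[start]), start)]
    best = []                      # max-heap of the k closest, via negation
    evals = 1
    while frontier and evals < budget:
        d, node = heapq.heappop(frontier)
        heapq.heappush(best, (-d, node))
        if len(best) > k:
            heapq.heappop(best)    # discard the farthest of the k+1
        for nb in cache_graph.get(node, []):
            if nb not in seen:
                seen.add(nb)
                evals += 1
                heapq.heappush(frontier, (dist(query, objects[nb]), nb))
    return sorted((-d, n) for d, n in best), evals
```

Because the walk follows similarity links, most of the database is never touched, which is the source of the reported savings in distance evaluations.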
The abundant computing resources in current organizations provide new opportunities for executing parallel scientific applications on otherwise idle resources. The Enterprise Desktop Grid Computing (EDGC) paradigm addresses the potential for harvesting the idle computing resources of an organization’s desktop PCs to support the execution of the company’s large-scale applications. In these environments, the accuracy of response-time predictions is essential for effective metascheduling that maximizes resource usage without harming the performance of the parallel and local applications. However, this accuracy is a major challenge due to the heterogeneity and non-dedicated nature of EDGC resources. In this paper, two new prediction techniques are presented based on the state of resources. A thorough analysis by linear regression demonstrated that the proposed techniques capture the real behavior of parallel applications better than other common techniques in the literature. Moreover, it is possible to reduce deviations with proper modeling of prediction errors; thus, a Self-adjustable Correction method (SAC) for detecting and correcting prediction deviations was proposed, with the ability to adapt to changes in load conditions. An extensive evaluation in a real environment was conducted to validate the SAC method. The results show that the use of SAC increases the accuracy of response-time predictions by 35%. The cost of predictions with self-correction and their accuracy in a real environment were analyzed using a combination of the proposed techniques. The results demonstrate that the cost of predictions is negligible and that the combined use of the prediction techniques is preferable.
We estimate the success probability of quantum protocols composed of Clifford operations in the presence of Pauli errors. Our method is derived from the fault-point formalism previously used to determine the success rate of low-distance error correction codes. Here we apply it to a wider range of quantum protocols and identify circuit structures that allow for efficient calculation of the exact success probability and even the final distribution of output states. As examples, we apply our method to the Bernstein–Vazirani algorithm and the Steane [[7,1,3]] quantum error correction code and compare the results to Monte Carlo simulations. 相似文献
This paper presents a novel automatic framework to perform 3D face recognition. The proposed method uses a Simulated Annealing-based approach (SA) for range image registration with the Surface Interpenetration Measure (SIM) as similarity measure, in order to match two face images. The authentication score is obtained by combining the SIM values corresponding to the matching of four different face regions: circular and elliptical areas around the nose, the forehead, and the entire face region. Then, a modified SA approach is proposed that takes advantage of invariant face regions to better handle facial expressions. Comprehensive experiments were performed on the FRGC v2 database, the largest available database of 3D face images, composed of 4,007 images with different facial expressions. The experiments simulated both verification and identification systems, and the results were compared to those reported by state-of-the-art works. By using all of the images in the database, a verification rate of 96.5 percent was achieved at a False Acceptance Rate (FAR) of 0.1 percent. In the identification scenario, a rank-one accuracy of 98.4 percent was achieved. To the best of our knowledge, this is the highest rank-one score ever achieved for the FRGC v2 database when compared to results published in the literature.
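At the core of such a registration pipeline is a standard annealing loop. The sketch below is a generic minimizer and a simplification of ours, not the paper's modified SA: the SIM-based cost over the six rigid-body parameters, and the perturbation of those parameters, are left as user-supplied callables.

```python
import math
import random

def simulated_annealing(cost, neighbour, x0,
                        t0=1.0, cooling=0.95, steps=500, seed=0):
    """Minimize `cost` by simulated annealing.

    In a registration setting, `cost` would score a candidate rigid
    alignment (e.g. the negated SIM) and `neighbour` would perturb the
    transform slightly. Worse moves are accepted with probability
    exp(-delta / T), and T decays geometrically each step.
    """
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy          # accept the move
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling               # cool the temperature
    return best, fbest

# Toy usage: minimize a 1-D quadratic with Gaussian perturbations.
best, fbest = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbour=lambda x, rng: x + rng.gauss(0.0, 0.5),
    x0=0.0)
```

The early high-temperature phase lets the search escape poor initial alignments, which is why SA is attractive for registration under expression changes.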