Consideration is given to the buoyancy effects on fully developed gaseous slip flow in a vertical rectangular microduct. Two different cases of thermal boundary conditions are considered, namely uniform temperature at two facing duct walls with different temperatures and adiabatic remaining walls (case A), and uniform heat flux at two walls and uniform temperature at the other walls (case B). The rarefaction effects are treated using first-order slip boundary conditions. By means of the finite Fourier transform method, analytical solutions are obtained for the velocity and temperature distributions as well as the Poiseuille number. Furthermore, the threshold value of the mixed convection parameter at which flow reversal begins is evaluated. The results show that the Poiseuille number of case A is an increasing function of the mixed convection parameter and a decreasing function of the channel aspect ratio, whereas its dependence on the Knudsen number is not monotonic. For case B, the Poiseuille number decreases with increasing mixed convection parameter, Knudsen number, and channel aspect ratio.
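For context, the first-order slip conditions invoked in this abstract conventionally take the Maxwell velocity-slip and Smoluchowski temperature-jump forms shown below; the paper's exact coefficients and nondimensionalization may differ:

```latex
u_s - u_w = \frac{2-\sigma_v}{\sigma_v}\,\lambda\,\left.\frac{\partial u}{\partial n}\right|_w ,
\qquad
T_s - T_w = \frac{2-\sigma_T}{\sigma_T}\,\frac{2\gamma}{\gamma+1}\,\frac{\lambda}{\Pr}\,\left.\frac{\partial T}{\partial n}\right|_w
```

Here \(\sigma_v\) and \(\sigma_T\) are the momentum and thermal accommodation coefficients, \(\lambda\) the molecular mean free path, and \(n\) the wall-normal direction; in nondimensional form the mean free path enters through the Knudsen number \( \mathrm{Kn} = \lambda/D_h \), the rarefaction parameter the abstract's results are reported against.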
Diagnosis, detection, and classification of tumors in brain MRI images are important because misdiagnosis can lead to death. This paper proposes a method that diagnoses brain tumors in MRI images and classifies them into five categories using a Convolutional Neural Network (CNN). The proposed network uses a Convolutional Auto-Encoder Neural Network (CANN) to extract and learn deep features of the input images. Deep features extracted from each level are combined to form more discriminative features and improve the results. To classify brain tumors into three categories (Meningioma, Glioma, and Pituitary), the proposed method was applied to the Cheng dataset and reached a considerable accuracy of 99.3%. To diagnose and grade Glioma tumors, the proposed method was applied to the IXI and BraTS 2017 datasets; and to classify brain images into six classes (Meningioma, Pituitary, Astrocytoma, High-Grade Glioma, Low-Grade Glioma, and Normal, i.e., no tumor), all of the datasets (IXI, BraTS 2017, Cheng, and Hazrat-e-Rassol) were used by the proposed network, which reached accuracies of 99.1% and 98.5%, respectively.
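The core idea of combining deep features from each encoder level can be sketched with a minimal numpy toy (purely illustrative: the convolution kernels, pooling, and per-level global-average summaries here are assumptions, not the paper's architecture):

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D convolution on a single-channel image (illustrative, unoptimized)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def pool2(x):
    """2x2 max pooling (truncates odd trailing rows/columns)."""
    h, w = x.shape
    return x[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def multilevel_features(img, kernels):
    """Run a small encoder and keep one summary statistic per level,
    then concatenate them -- mimicking the idea of combining deep
    features across encoder levels before classification."""
    feats = []
    x = img
    for k in kernels:
        x = np.maximum(conv2d(x, k), 0)  # convolution + ReLU
        x = pool2(x)                     # spatial downsampling
        feats.append(x.mean())           # global-average summary of this level
    return np.array(feats)

rng = np.random.default_rng(0)
img = rng.random((32, 32))                              # stand-in for an MRI slice
kernels = [rng.standard_normal((3, 3)) for _ in range(3)]
fv = multilevel_features(img, kernels)
print(fv.shape)  # one combined feature vector, one entry per encoder level
```

In the actual CANN the per-level features would be full learned feature maps fed to a classifier head, but the concatenate-across-levels pattern is the same.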
A highly porous, free-standing co-poly(vinylidene fluoride)/modacrylic/SiO2 nanofibrous membrane was developed using an electrically assisted solution blow spinning method. The performance and potential of the membrane as a lithium-ion battery separator were investigated. The addition of modacrylic enhanced the solution spinnability, resulting in defect-free membranes, and also improved the dimensional and thermal stability, while the addition of hydrophilic SiO2 nanoparticles enhanced both the mechanical properties and the ionic conductivity. Combustion tests showed that the presence of modacrylic provides a flame-retarding property relative to a set of other polymeric membranes. Electrochemical performance results showed that the developed membrane can increase the battery capacity compared with a commercial separator.
Magnetic Resonance Materials in Physics, Biology and Medicine - The success of parallel Magnetic Resonance Imaging algorithms like SENSitivity Encoding (SENSE) depends on an accurate estimation of...
This work aims to develop a non-destructive tool for evaluating bonded plastic joints. The paper examines infrared thermographic imaging in transmission and reflection modes and validates the feasibility of the thermal NDT approach for this application. Results demonstrate good estimation performance for adhesion integrity, uniformity, and bond strength using a transmission-mode application of infrared thermography. In addition, results from a pulsed infrared thermographic application using a modified dynamic infrared tomography scheme show good performance in mapping adhesion-layer thickness and detecting delaminations.
Many recent software engineering papers have examined duplicate issue reports. Thus far, duplicate reports have been considered a hindrance to developers and a drain on their resources. As a result, prior research in this area focuses on proposing automated approaches to accurately identify duplicate reports. However, no studies exist that attempt to quantify the actual effort spent on identifying duplicate issue reports. In this paper, we empirically examine the effort needed to manually identify duplicate reports in four open source projects, i.e., Firefox, SeaMonkey, Bugzilla, and Eclipse-Platform. Our results show that: (i) more than 50% of the duplicate reports are identified within half a day, and most are identified without any discussion and with the involvement of very few people; (ii) a classification model built using a set of factors extracted from duplicate issue reports classifies duplicates according to the effort needed to identify them with a precision of 0.60 to 0.77, a recall of 0.23 to 0.96, and an ROC area of 0.68 to 0.80; and (iii) factors that capture developer awareness of the duplicate issue's peers (i.e., other duplicates of that issue) and the textual similarity of a new report to prior reports are the most influential factors in our models. Our findings highlight the need for effort-aware evaluation of approaches that identify duplicate issue reports, since identifying a considerable portion of duplicate reports (over 50%) appears to be a relatively trivial task for developers. To better assist developers, research on identifying duplicate issue reports should put greater emphasis on the effort-consuming duplicates.
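The evaluation metrics this abstract reports (precision, recall, ROC area) can be computed from a classifier's scores as below; the toy labels and threshold are invented for illustration, and the AUC is computed with the rank-sum formulation (no tie handling, unlike production libraries):

```python
import numpy as np

def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels in {0, 1}."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def roc_auc(y_true, scores):
    """ROC area via the Mann-Whitney rank-sum identity (assumes no tied scores)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y_true == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# toy data: 1 = "effort-consuming duplicate", scores from a hypothetical model
y_true = np.array([1, 1, 0, 0, 1, 0])
scores = np.array([0.9, 0.7, 0.4, 0.2, 0.6, 0.8])
y_pred = (scores >= 0.5).astype(int)

p, r = precision_recall(y_true, y_pred)
auc = roc_auc(y_true, scores)
print(round(p, 2), round(r, 2), round(auc, 2))  # -> 0.75 1.0 0.78
```

Reporting all three together matters here because, as the abstract notes, recall varies far more across projects (0.23 to 0.96) than precision does.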
Reuse of software components, either closed or open source, is considered one of the most important best practices in software engineering, since it reduces development cost and improves software quality. However, since reused components are (by definition) generic, they need to be customized and integrated into a specific system before they can be useful. Since this integration is system-specific, the integration effort is non-negligible and increases maintenance costs, especially if more than one component needs to be integrated. This paper performs an empirical study of multi-component integration in the context of three successful open source distributions (Debian, Ubuntu, and FreeBSD). Such distributions integrate thousands of open source components with an operating system kernel to deliver a coherent software product to millions of users worldwide. We empirically identified seven major integration activities performed by the maintainers of these distributions, documented how these activities are carried out, then evaluated and refined the identified activities with input from six maintainers of the three studied distributions. The documented activities provide a common vocabulary for component integration in open source distributions and outline a roadmap for future research on software integration.
Applying model predictive control (MPC) to cases such as complicated process dynamics and/or rapid sampling leads to poorly conditioned numerical solutions and a heavy computational load. Furthermore, there is always a mismatch between a model and the real process it describes. Therefore, to overcome these difficulties, this paper designs a robust MPC using the Laguerre orthonormal basis, which speeds up convergence while lowering the computational burden by adding an extra tuning parameter "a" to the MPC. In addition, the Kalman state estimator is included in the prediction model, so the MPC design incorporates the Kalman estimator parameters as well as the estimation error, which helps the controller react faster to unmeasured disturbances. Tuning the parameters of the Kalman estimator as well as the MPC is another contribution of this paper, and it guarantees the robustness of the system against model mismatch and measurement noise. The sensitivity function at low frequency is minimized to tune the MPC parameters, since a lower sensitivity magnitude at low frequency yields better command tracking and disturbance rejection. The integral absolute error (IAE) and the peak of the sensitivity function are used as constraints in the optimization procedure to ensure the stability and robustness of the controlled process. The performance of the controller is examined on a tank level process and a paper machine process.
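The Laguerre parameterization this abstract refers to can be sketched as follows. This uses the standard discrete Laguerre state recursion (as in Wang's Laguerre-based MPC formulation), not the paper's exact controller; the pole `a` below is the extra tuning parameter the abstract mentions:

```python
import numpy as np

def laguerre_basis(a, N, steps):
    """Discrete Laguerre functions l_1..l_N over `steps` samples, generated
    by the state recursion L(k+1) = A_l @ L(k) with pole 0 <= a < 1.
    In Laguerre-based MPC the future control trajectory is expanded on this
    basis, so only N coefficients (not a long control horizon) are optimized."""
    beta = 1.0 - a * a
    A = np.zeros((N, N))
    L0 = np.zeros(N)
    for i in range(N):
        L0[i] = np.sqrt(beta) * (-a) ** i   # initial state L(0)
        A[i, i] = a                          # pole on the diagonal
        for j in range(i):
            A[i, j] = beta * (-a) ** (i - j - 1)  # lower-triangular coupling
    out = np.zeros((steps, N))
    L = L0.copy()
    for k in range(steps):
        out[k] = L
        L = A @ L
    return out

# a closer to 1 stretches the basis over a longer horizon (slower decay)
Lk = laguerre_basis(a=0.5, N=3, steps=200)
G = Lk.T @ Lk  # Gram matrix: approximately the identity (orthonormal basis)
print(np.round(G, 3))
```

The orthonormality shown by the Gram matrix is what keeps the reduced optimization well conditioned, which is the numerical benefit the abstract claims for the Laguerre formulation.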