We propose novel techniques to find the optimal location, size, and power factor of distributed generation (DG) to achieve the maximum loss reduction in distribution networks. The optimal DG location and size are determined simultaneously using the energy-loss-curves technique for a pre-selected power factor that gives the best DG operation. Based on the network's total load demand, four DG sizes are selected; they are used to form energy loss curves for each bus and then to determine the optimal DG options. The study shows that, with energy loss minimization as the objective function, the time-varying load demand significantly affects the sizing of DG resources in distribution networks, whereas taking power loss as the objective function leads to inconsistent interpretation of loss reduction and other calculations. The devised technique was tested on two distribution systems of varying size and complexity and validated by comparison with the exhaustive iterative method (EIM) and recently published results. The results show that the proposed technique can reach an optimal solution with less computation.
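As a rough illustration of the energy-loss-curves idea (the single-branch loss model, the load profile and the candidate size fractions below are invented for this sketch, not taken from the paper), one can accumulate I²R losses over a load profile for each bus and candidate DG size, then pick the minimum:

```python
# Hypothetical sketch: build an energy loss curve per bus over four candidate
# DG sizes and select the (bus, size) pair with the lowest energy loss.

def energy_loss(r_upstream, hourly_load, dg_size, v=1.0):
    """Toy branch-loss model: loss_t = R * ((P_t - DG) / V)^2, summed over hours."""
    return sum(r_upstream * ((p - dg_size) / v) ** 2 for p in hourly_load)

def best_dg_option(bus_resistances, hourly_load, total_demand):
    # Four candidate sizes, e.g. 25%..100% of total demand, as in the abstract.
    sizes = [f * total_demand for f in (0.25, 0.50, 0.75, 1.00)]
    curves = {bus: [energy_loss(r, hourly_load, s) for s in sizes]
              for bus, r in bus_resistances.items()}
    bus, losses = min(curves.items(), key=lambda kv: min(kv[1]))
    size = sizes[losses.index(min(losses))]
    return bus, size, min(losses)
```

Note that with a time-varying profile the winning candidate size tracks the demand shape (here the size nearest the mean demand) rather than the peak, which echoes the abstract's point about energy loss versus power loss objectives.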
Bug fixing accounts for a large share of software maintenance resources. Generally, bugs are reported, fixed, verified and closed. In some cases, however, bugs have to be re-opened. Re-opened bugs increase maintenance costs, degrade the overall user-perceived quality of the software and lead to unnecessary rework by busy practitioners. In this paper, we study and predict re-opened bugs through a case study on three large open source projects: Eclipse, Apache and OpenOffice. We structure our study along four dimensions: (1) the work habits dimension (e.g., the weekday on which the bug was initially closed), (2) the bug report dimension (e.g., the component in which the bug was found), (3) the bug fix dimension (e.g., the amount of time it took to perform the initial fix) and (4) the team dimension (e.g., the experience of the bug fixer). We build decision trees using the aforementioned factors to predict re-opened bugs, and perform top node analysis to determine which factors are the most important indicators of whether or not a bug will be re-opened. Our study shows that the comment text and the last status of the bug when it is initially closed are the most important such factors. Using a combination of these dimensions, we can build explainable prediction models that achieve a precision between 52.1% and 78.6% and a recall between 70.5% and 94.1% when predicting whether a bug will be re-opened. We find that the factors that best indicate which bugs might be re-opened vary by project: the comment text is the most important factor for the Eclipse and OpenOffice projects, while the last status is the most important one for Apache. These factors should be closely examined in order to reduce the maintenance cost due to re-opened bugs.
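A minimal stand-in for the "top node analysis" idea: the factor a decision tree would place at its root is the one with the highest information gain on the re-opened label. The toy bug records and factor values below are made up for illustration; the paper works on real Eclipse, Apache and OpenOffice reports.

```python
# Rank candidate factors by the information gain a decision tree root would use.
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, factor):
    """Entropy reduction of the re-opened labels when splitting on `factor`."""
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[factor], []).append(y)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Toy bug reports: last status before closing vs. weekday of closing.
bugs = [{"last_status": "VERIFIED", "weekday": "Mon"},
        {"last_status": "VERIFIED", "weekday": "Fri"},
        {"last_status": "RESOLVED", "weekday": "Mon"},
        {"last_status": "RESOLVED", "weekday": "Fri"}]
reopened = [False, False, True, True]

ranked = sorted(["last_status", "weekday"],
                key=lambda f: information_gain(bugs, reopened, f), reverse=True)
```

In this contrived sample the last status perfectly separates the classes (gain 1 bit) while the weekday carries none, so `ranked` puts `last_status` first.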
The position control system of an electro-hydraulic actuator system (EHAS) is investigated in this paper. The EHAS model is developed taking into account the nonlinearities of the system: friction and internal leakage. A variable load that simulates a realistic load on a robotic excavator is taken as the trajectory reference. A control strategy is proposed in which a fuzzy logic controller (FLC) has its parameters optimized using particle swarm optimization (PSO). The scaling factors of the fuzzy inference system are tuned to the values that yield the best system performance. The simulation results show that the FLC is able to track the trajectory reference accurately for a range of orifice openings. Beyond that range, the orifice opening may introduce chattering, which the FLC alone cannot overcome; the PSO-optimized FLC reduces the chattering significantly. This result justifies the implementation of the proposed method in the position control of EHAS.
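A minimal sketch of the tuning loop, assuming the FLC exposes two scaling factors (call them Ke and Ku) and that closed-loop tracking error can be evaluated as a scalar cost. The EHAS simulation is replaced here by a quadratic surrogate cost with an assumed optimum at (2.0, 0.5); the PSO itself is a standard global-best variant, not necessarily the paper's exact configuration.

```python
# Minimal global-best PSO tuning two controller scaling factors (Ke, Ku).
import random

def tracking_cost(ke, ku):
    # Surrogate for the integral-of-squared-error of the FLC loop (assumption).
    return (ke - 2.0) ** 2 + (ku - 0.5) ** 2

def pso(cost, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(0, 5), rng.uniform(0, 5)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # personal bests
    gbest = min(pbest, key=lambda p: cost(*p))[:]     # global best
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - p[d])
                             + c2 * rng.random() * (gbest[d] - p[d]))
                p[d] += vel[i][d]
            if cost(*p) < cost(*pbest[i]):
                pbest[i] = p[:]
            if cost(*p) < cost(*gbest):
                gbest = p[:]
    return gbest

ke, ku = pso(tracking_cost)
```

In the real setting, `tracking_cost` would run the nonlinear EHAS simulation once per particle evaluation, which is where the method's computational expense lies.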
In smart environments, pervasive computing contributes to improving daily life for dependent people by providing personalized services. Nevertheless, those environments do not guarantee a satisfactory level of user privacy protection or of trust between communicating entities. In this study, we propose a trust evaluation model based on the user's past and present behaviour. This model is combined with a lightweight authenticated key agreement protocol (Elliptic Curve-based Simple Authentication Key Agreement). The aim is to enable the communicating entities to establish a level of trust and then succeed in mutual authentication using a scheme suitable for low-resource devices in smart environments. An innovation of our trust model is that it uses an accurate approach to calculate trust in different situations and includes a human-based feedback feature, namely user rating. Finally, we implemented and tested our scheme on Android mobile phones in a smart environment dedicated to people with disabilities.
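A hedged sketch of the trust idea only: combine past behaviour, present behaviour and the human feedback signal (user rating) into one score in [0, 1] that gates the key agreement step. The weights, threshold and linear update rule are illustrative assumptions, not the paper's formulas, and the elliptic-curve protocol itself is not reproduced here.

```python
# Illustrative trust evaluation gating a (not shown) ECC key agreement.

def trust_score(past_trust, present_behaviour, user_rating,
                w_past=0.5, w_present=0.3, w_rating=0.2):
    """All inputs normalised to [0, 1]; weights are assumed and sum to 1."""
    score = (w_past * past_trust
             + w_present * present_behaviour
             + w_rating * user_rating)
    return max(0.0, min(1.0, score))

def may_authenticate(entity_trust, threshold=0.6):
    """Only entities above the (assumed) trust threshold proceed to key agreement."""
    return entity_trust >= threshold
```

The design point worth noting is that the rating term lets human feedback pull trust down even when automated behavioural indicators look normal.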
The aim of this paper is to deal with an output controllability problem. It consists of driving the state of a distributed parabolic system toward a state lying between two prescribed functions on a boundary subregion of the system's evolution domain, with minimum-energy control. Two necessary conditions are derived. The first is formulated in terms of the subdifferential associated with a minimized functional; the second as a system of equations for the arguments of the Lagrange systems. Numerical illustrations show the efficiency of the second approach and lead to some conjectures.
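In generic notation (our symbols, not necessarily the authors'), the constrained minimum-energy problem sketched above reads:

```latex
\min_{u \in L^2(0,T;U)} \; J(u) = \int_0^T \lVert u(t) \rVert_U^2 \, dt
\quad \text{subject to} \quad
\alpha(\xi) \le \bigl(\chi_\Gamma \, y_u(T)\bigr)(\xi) \le \beta(\xi), \quad \xi \in \Gamma,
```

where $y_u$ is the state of the parabolic system under control $u$, $\Gamma$ is the boundary subregion, $\chi_\Gamma$ the restriction to $\Gamma$, and $\alpha$, $\beta$ the two prescribed functions.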
Recommended by Editorial Board member Fumitoshi Matsuno under the direction of Editor Jae Weon Choi.
Zerrik El Hassan is a Professor at the University Moulay Ismail of Meknes, Morocco. He was an Assistant Professor in the Faculty of Sciences of Meknes and a researcher at the University of Perpignan (France). He received his doctorat d'état in systems regional analysis (1993) at the University Mohammed V of Rabat, Morocco. Professor Zerrik has written many papers and books in the area of systems analysis and control. He is now the Head of the research team MACS (Modeling, Analysis and Control of Systems) at the University Moulay Ismail of Meknes, Morocco.
Ghafrani Fatima is a Researcher with the MACS team at the University Moulay Ismail of Meknes, Morocco. She has written many papers in the area of systems analysis and control.
Background
The use of crowdsourcing in a pedagogically supported form to partner with learners in developing novel content is emerging as a viable approach for engaging students in higher-order learning at scale. However, how students behave in this form of crowdsourcing, referred to as learnersourcing, is still insufficiently explored.
Objectives
To contribute to filling this gap, this study explores how students engage with learnersourcing tasks across a range of course and assessment designs.
Methods
We conducted an exploratory study on trace data of 1279 students across three courses, originating from the use of a learnersourcing environment under different assessment designs. We employed a new methodology from the learning analytics (LA) field that aims to represent students' behaviour through two theoretically-derived latent constructs: learning tactics and the learning strategies built upon them.
Results
The study's results demonstrate that students use different tactics and strategies, highlight the association of learnersourcing contexts with the identified learning tactics and strategies, indicate a significant association between the strategies and performance, and contribute to the generalisability of the employed method by applying it to a new context.
Implications
This study provides an example of how learning analytics methods can be employed towards the development of effective learnersourcing systems and, more broadly, technological educational solutions that support learner-centred and data-driven learning at scale. Findings should inform best practices for integrating learnersourcing activities into course design and shed light on the relevance of tactics and strategies to support teachers in making informed pedagogical decisions.
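A much-simplified stand-in for the tactic-mining step described above: represent each student's trace as action-frequency proportions and group similar profiles with a single nearest-centroid pass. The action names, traces and seed centroids are invented for illustration; the study's actual method works on action sequences and is considerably richer.

```python
# Group students' learnersourcing traces into coarse "tactics" by profile.
from collections import Counter

ACTIONS = ("create", "answer", "rate")

def profile(trace):
    """Proportion of each action type in a student's trace."""
    counts = Counter(trace)
    return tuple(counts[a] / len(trace) for a in ACTIONS)

def assign(profiles, centroids):
    """Nearest-centroid label for each profile (squared Euclidean distance)."""
    def dist(p, c):
        return sum((x - y) ** 2 for x, y in zip(p, c))
    return [min(range(len(centroids)), key=lambda k: dist(p, centroids[k]))
            for p in profiles]

traces = [["create", "create", "answer"],         # content-creation oriented
          ["rate", "rate", "answer", "rate"],     # evaluation oriented
          ["create", "answer", "create"]]
centroids = [(0.6, 0.3, 0.1), (0.1, 0.3, 0.6)]    # assumed seed tactics
labels = assign([profile(t) for t in traces], centroids)
```

Once such labels exist per session, a student's "strategy" can be summarised as the sequence of tactic labels across sessions, which is the level at which the study relates behaviour to performance.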
In this article, we present a new set of hybrid polynomials, and their corresponding moments, with a view to using them for the localization, compression and reconstruction of 2D and 3D images. These polynomials are formed from the Hahn and Krawtchouk polynomials. Their computation is stabilized using modified recurrence relations with respect to the order n and the variable x, together with the symmetry property. The hybrid polynomials are generated in two forms. The first comprises the separable discrete orthogonal Krawtchouk–Hahn (DKHP) and Hahn–Krawtchouk (DHKP) polynomials, generated as products of the discrete orthogonal Hahn and Krawtchouk polynomials. The second form is the squared equivalent of the first: it consists of the squared discrete Krawtchouk–Hahn (SKHP) and squared discrete Hahn–Krawtchouk (SHKP) polynomials. The experimental results clearly show the efficiency of the hybrid moments in terms of localization and computation time for 2D and 3D images compared with other types of moments; encouraging results are also obtained for reconstruction quality and compression, despite the superiority of classical polynomials there.
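One building block can be made concrete: the Krawtchouk polynomials K_n(x; p, N) evaluated with their standard three-term recurrence in n, and a separable 2D basis value formed as a product, mirroring how DKHP/DHKP are built (the Hahn factor is handled analogously and is omitted here; the parameter defaults are arbitrary).

```python
# Krawtchouk polynomials via the three-term recurrence in the order n:
# p(N-n) K_{n+1}(x) = [p(N-n) + n(1-p) - x] K_n(x) - n(1-p) K_{n-1}(x),
# with K_0(x) = 1 and K_1(x) = 1 - x / (pN).

def krawtchouk(n, x, p=0.5, N=8):
    """K_n(x; p, N) for 0 <= n < N, computed iteratively."""
    k_prev, k = 1.0, 1.0 - x / (p * N)   # K_0, K_1
    if n == 0:
        return k_prev
    for m in range(1, n):
        k_next = ((p * (N - m) + m * (1 - p) - x) * k
                  - m * (1 - p) * k_prev) / (p * (N - m))
        k_prev, k = k, k_next
    return k

def separable_basis(n, m, x, y, p=0.5, N=8):
    # 2D separable value, as in product-form hybrid bases (sketch only;
    # the paper pairs Krawtchouk with Hahn rather than with itself).
    return krawtchouk(n, x, p, N) * krawtchouk(m, y, p, N)
```

Running the recurrence in n (rather than expanding hypergeometric sums) is what keeps the evaluation numerically stable for higher orders, which is the stabilization concern the abstract raises.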
The edge computing model offers an attractive platform for supporting scientific and real-time workflow-based applications at the edge of the network. However, scientific workflow scheduling and execution still face challenges such as response time management and latency. These challenges require dealing with the acquisition delay of servers deployed at the edge of the network in order to reduce the overall completion time of a workflow. Previous studies show that existing scheduling methods consider the static performance of the server and ignore the impact of resource acquisition delay when scheduling workflow tasks. We propose a meta-heuristic algorithm that schedules scientific workflows and minimizes the overall completion time by properly managing acquisition and transmission delays. We carry out extensive experiments and evaluations based on commercial clouds and various scientific workflow templates. The proposed method performs approximately 7.7% better than the baseline algorithms, particularly in the success rate of meeting the overall deadline constraint.
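To make the role of acquisition delay concrete, here is an illustrative scheduler (not the paper's meta-heuristic): a chain of workflow tasks is placed on edge servers, where a server only becomes usable after its acquisition delay and moving data between different servers adds a transmission delay. A greedy earliest-finish rule stands in for the optimizer; all numbers are invented.

```python
# Delay-aware greedy scheduling of a task chain onto edge servers.

def schedule_chain(tasks, servers, transmit=2.0):
    """tasks: list of runtimes; servers: {name: (acquisition_delay, speed)}."""
    ready = {s: acq for s, (acq, _) in servers.items()}  # earliest usable time
    prev_server, prev_finish = None, 0.0
    plan = []
    for runtime in tasks:
        best = None
        for s, (_, speed) in servers.items():
            # Pay transmission delay only when switching servers mid-chain.
            extra = transmit if prev_server is not None and s != prev_server else 0.0
            start = max(ready[s], prev_finish + extra)
            finish = start + runtime / speed
            if best is None or finish < best[2]:
                best = (s, start, finish)
        s, start, finish = best
        ready[s] = finish
        prev_server, prev_finish = s, finish
        plan.append((s, start, finish))
    return plan

# A fast server with a long acquisition delay loses to a slow, already-acquired one.
plan = schedule_chain([4.0, 4.0], {"fast": (10.0, 2.0), "slow": (0.0, 1.0)})
```

Here the nominally faster server is never chosen because its 10-unit acquisition delay dominates, which is exactly the effect the abstract says static-performance schedulers miss.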
The Journal of Supercomputing - This paper designs and develops a computational intelligence-based framework using a convolutional neural network (CNN) and a genetic algorithm (GA) to detect COVID-19...