Power plants in Kuwait use gas turbines (GT) for only a few hours at a time to produce power at peak load. Peak load occurs in the summer due to the air-conditioning load. As an example, the average number of operating hours for six gas turbines in the Doha East power plant was 16 in the summer of 2001. There is little concern about the efficiency of these GT since they work for a very short time during the year. However, a recent increase in desalted seawater demand suggests using these GT to operate reverse osmosis (RO) desalting systems all year round. The summer outside design temperature in Kuwait for air-conditioning calculations is 48°C dry-bulb temperature (DBT) and 28°C wet-bulb temperature (WBT), but the ambient temperature can easily reach 60°C. Gas turbine power output and efficiency are drastically reduced by the increase in temperature of the air entering the gas turbine's compressor, especially during harsh Kuwaiti summer conditions. Thus, it is essential to investigate cooling of the air intake to the GT compressor. The performance of a typical GT unit and its ability to produce desalted water by an RO desalting system at different ambient temperatures are presented. The cooling capacities needed for the intake air to the GT compressor were calculated for evaporative cooling, single and multiple mechanical vapor compression cycles, and combined indirect evaporative cooling with a refrigeration system. The improvements in power output and efficiency due to cooling the GT intake air, and the resulting increase in desalted water production, are also presented.
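The evaporative-cooling option mentioned above can be sized with the standard saturation-effectiveness relation for a direct evaporative cooler. A minimal sketch, assuming a typical media-pad effectiveness of 0.85 (a value not given in the abstract):

```python
# Direct evaporative cooling: supply air temperature from saturation effectiveness.
# T_supply = T_db - eff * (T_db - T_wb); eff of ~0.80-0.90 is typical for media pads.

def evap_cooler_outlet(t_db, t_wb, effectiveness=0.85):
    """Dry-bulb temperature (degC) of air leaving a direct evaporative cooler."""
    return t_db - effectiveness * (t_db - t_wb)

# Kuwait summer design point from the text: 48 degC DBT, 28 degC WBT.
t_out = evap_cooler_outlet(48.0, 28.0)
print(f"Compressor intake after evaporative cooling: {t_out:.1f} degC")
```

With these assumed numbers the intake air drops by roughly 17°C, which is the mechanism behind the power-output recovery the abstract describes; the achievable drop is bounded by the wet-bulb temperature, which is why mechanical vapor compression cycles are also considered.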
Security threats are crucial challenges that deter mixed reality (MR) communication in medical telepresence. This research aims to improve security by reducing the chances of various attacks occurring during real-time data transmission in surgical telepresence, while also reducing the running time of the cryptographic algorithm and preserving the quality of the transmitted media. The proposed model combines an enhanced RC6 algorithm with RC4: dynamic keys generated from the RC6 algorithm are mixed with RC4 to create a dynamic S-box and permutation table, preventing various known attacks during real-time data transmission. For every new session a new key is created, preventing an attacker from reusing a previous key. The results obtained from the proposed system show better performance compared to the state of the art: resistance to the tested attacks is measured through entropy, the Peak Signal-to-Noise Ratio (PSNR) of the encrypted image is lower than in the state of the art, and the structural similarity index (SSIM) is closer to zero. The execution time of the algorithm is decreased by an average of 20%. The proposed system focuses on preventing brute-force attacks during surgical telepresence data transmission. The paper proposes a framework that enhances the security of data transmission during surgeries with acceptable performance.
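The dynamic, key-dependent S-box described above is in the spirit of RC4's key-scheduling algorithm (KSA). A minimal sketch of deriving a fresh 256-entry S-box per session; the session-key derivation shown is a hypothetical stand-in for the paper's RC6-based key generation:

```python
import hashlib

def rc4_ksa_sbox(key: bytes) -> list:
    """RC4 key scheduling: derive a key-dependent permutation of 0..255 (an S-box)."""
    s = list(range(256))
    j = 0
    for i in range(256):
        j = (j + s[i] + key[i % len(key)]) % 256
        s[i], s[j] = s[j], s[i]  # swap driven by the key bytes
    return s

# Hypothetical per-session key derivation (the paper uses RC6-generated dynamic keys):
session_key = hashlib.sha256(b"session-001" + b"master-secret").digest()
sbox = rc4_ksa_sbox(session_key)
```

Because the S-box is recomputed from a fresh key each session, material captured from one session gives an attacker no reusable table for the next, which is the property the abstract claims against replay and brute-force attacks.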
Titanium aluminides have become promising materials for high-temperature applications, but the relatively poor oxidation resistance and elevated-temperature strength of these alloys limit their application to temperatures lower than 1000 °C. Niobium addition improves the properties of titanium aluminide. However, the mechanical, metallurgical, and corrosion properties of Ti-Al-Nb may be improved by treatment with a laser beam. Consequently, the present study examines the properties of Ti-15Al-20Nb alloy subjected to the Nd:YAG laser melting process. Hardness in the surface region increases to twice the base material hardness, and corrosion resistance improves considerably after laser treatment.
Many recent software engineering papers have examined duplicate issue reports. Thus far, duplicate reports have been considered a hindrance to developers and a drain on their resources. As a result, prior research in this area focuses on proposing automated approaches to accurately identify duplicate reports. However, no studies exist that attempt to quantify the actual effort that is spent on identifying duplicate issue reports. In this paper, we empirically examine the effort that is needed for manually identifying duplicate reports in four open source projects, i.e., Firefox, SeaMonkey, Bugzilla and Eclipse-Platform. Our results show that: (i) more than 50% of the duplicate reports are identified within half a day, and most of the duplicate reports are identified without any discussion and with the involvement of very few people; (ii) a classification model built using a set of factors extracted from duplicate issue reports classifies duplicates according to the effort needed to identify them with a precision of 0.60 to 0.77, a recall of 0.23 to 0.96, and an ROC area of 0.68 to 0.80; and (iii) factors that capture the developer awareness of the duplicate issue's peers (i.e., other duplicates of that issue) and the textual similarity of a new report to prior reports are the most influential factors in our models. Our findings highlight the need for effort-aware evaluation of approaches that identify duplicate issue reports, since the identification of a considerable number of duplicate reports (over 50%) appears to be a relatively trivial task for developers. To better assist developers, research on identifying duplicate issue reports should put greater emphasis on assisting developers with the effort-consuming duplicate issues.
The goal of this paper is to describe a novel fault tolerant tracking control (FTTC) strategy based on robust fault estimation and compensation of simultaneous actuator and sensor faults. Within the framework of fault tolerant control (FTC) the challenge is to develop an FTTC design strategy for nonlinear systems to tolerate simultaneous actuator and sensor faults that have bounded first time derivatives. The main contribution of this paper is the proposal of a new architecture based on a combination of actuator and sensor Takagi-Sugeno (T-S) proportional state estimators augmented with proportional and integral feedback (PPI) fault estimators together with a T-S dynamic output feedback control (TSDOFC) capable of time-varying reference tracking. Within this architecture the design freedom for each of the T-S estimators and the control system are available separately with an important consequence on robust L2 norm fault estimation and robust L2 norm closed-loop tracking performance. The FTTC strategy is illustrated using a nonlinear inverted pendulum example with time-varying tracking of a moving linear position reference.
Changes in the operational environment of the process industry, such as decreasing selling prices, increased competition between companies and new legislation, set requirements for the performance and effectiveness of industrial production lines and processes. As the basis of this study, a life cycle profit (LCP) model of a pulp process was constructed using different kinds of process information, including chemical consumption and the production levels of material and energy flows in unit processes. However, not all the information needed to create a relevant LCP model was directly provided by the plant's information systems. In this study, neural networks were used to model the pulp bleaching process, fill in missing information, and create estimators for alkaline chemical consumption. A data-based modelling approach was applied using an example in which the factors affecting sodium hydroxide consumption in the bleaching stage were determined. The results showed that raw process data can be refined into valuable new information using computational methods, thereby improving the accuracy of life cycle profit models.
The World Wide Web (WWW) comprises a wide range of information, and it mainly operates on the principle of keyword matching, which often reduces the accuracy of information retrieval. Automatic query expansion is one of the primary methods in information retrieval; it handles the vocabulary mismatch problem often faced by information retrieval systems when retrieving an appropriate document from keywords. This paper proposes a novel hybrid COOT-based Cat and Mouse Optimization (CMO) algorithm, named hybrid COOT-CMO, for the selection of optimal candidate terms in the automatic query expansion process. To improve the accuracy of the CMO algorithm, its parameters are tuned with the help of the Coot algorithm. The best suitable expanded query is identified from the available expanded query sets, also known as candidate query pools. All feasible combinations in this candidate query pool are obtained from the top retrieved documents. Benchmark datasets such as the GOV2 Test Collection, the Cranfield Collections, and the NTCIR Test Collection are used to assess the performance of the proposed hybrid COOT-CMO method for automatic query expansion. The proposed method surpasses existing state-of-the-art techniques on many performance measures, including F-score, precision, and mean average precision (MAP).
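The candidate query pool drawn from top-retrieved documents can be illustrated with a simple pseudo-relevance-feedback term ranking. This is a generic stand-in, not the COOT-CMO fitness function, and the documents and names below are purely illustrative:

```python
from collections import Counter

def candidate_terms(top_docs, query_terms, k=3):
    """Rank candidate expansion terms by their frequency in the top-retrieved
    documents -- a simple stand-in for an optimizer's term-selection step."""
    counts = Counter()
    for doc in top_docs:
        for term in doc.lower().split():
            if term not in query_terms:
                counts[term] += 1
    return [t for t, _ in counts.most_common(k)]

# Toy top-retrieved documents for the query "solar energy":
docs = ["solar panel efficiency rating",
        "solar cell efficiency measurement",
        "panel efficiency under heat"]
expanded = set("solar energy".split()) | set(candidate_terms(docs, {"solar", "energy"}))
```

A metaheuristic such as COOT-CMO would replace the frequency count with a learned fitness over combinations from this pool, but the pool-construction step sketched here is the common starting point.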
In this article, a new population-based algorithm for real-parameter global optimization is presented, denoted self-organizing centroids optimization (SOC-opt). The proposed method uses a stochastic approach based on the sequential learning paradigm for self-organizing maps (SOMs). A modified version of the SOM is proposed in which each cell contains an individual that performs a search for a locally optimal solution while being influenced by the search for a global optimum. The movement of the individuals in the search space is based on a discrete-time dynamic filter, and various choices of this filter are possible to obtain different dynamics of the centroids. In this way, a general framework is defined in which well-known algorithms represent particular cases. The proposed algorithm is validated on a set of problems, including non-separable problems, and compared with state-of-the-art algorithms for global optimization.
With the rapid growth rate of COVID-19 cases, the healthcare systems of several developed countries have reached the point of collapse. An important and critical step in fighting COVID-19 is effective screening of infected patients, so that positive patients can be treated and isolated. A chest radiology image-based diagnosis scheme may have several benefits over the traditional approach. The success of artificial intelligence (AI) based techniques in automated diagnosis in the healthcare sector, together with the rapid increase in COVID-19 cases, has created a need for AI-based automated diagnosis and recognition systems. This study develops an Intelligent Firefly Algorithm Deep Transfer Learning Based COVID-19 Monitoring System (IFFA-DTLMS). The proposed IFFA-DTLMS model aims mainly at identifying and categorizing the occurrence of COVID-19 on chest radiographs. To attain this, the presented IFFA-DTLMS model primarily applies a densely connected network (DenseNet121) model to generate a collection of feature vectors. In addition, the firefly algorithm (FFA) is applied for hyperparameter optimization of the DenseNet121 model. Moreover, an autoencoder-long short term memory (AE-LSTM) model is exploited for the classification and identification of COVID-19. To ensure the enhanced performance of the IFFA-DTLMS model, wide-ranging experiments were performed and the results were reviewed under distinct aspects. The experimental results confirm the improvement of the IFFA-DTLMS model over recent approaches.
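The firefly algorithm used above for hyperparameter tuning follows a standard attraction rule: brighter (lower-objective) fireflies attract dimmer ones, with attraction decaying with distance. A toy sketch on a continuous test function, not the paper's exact IFFA variant; all parameter values below are illustrative defaults:

```python
import math, random

def firefly_minimize(obj, dim=2, n=12, iters=60,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=1):
    """Minimal firefly algorithm for continuous minimization.
    Each dimmer firefly moves toward every brighter one, plus a small random walk."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        fit = [obj(x) for x in pop]
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:  # firefly j is brighter: attract i toward it
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attraction decays with distance
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    fit[i] = obj(pop[i])
        alpha *= 0.97  # gradually cool the random walk
    best = min(pop, key=obj)
    return best, obj(best)

best, val = firefly_minimize(lambda x: sum(v * v for v in x))
```

In the paper's setting, the "position" of a firefly would encode DenseNet121 hyperparameters and the objective would be validation loss; the sphere function here simply makes the mechanics visible.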
We study the equilibrium behavior of informed traders interacting with market scoring rule (MSR) market makers. One attractive feature of MSR is that it is myopically incentive compatible: it is optimal for traders to report their true beliefs about the likelihood of an event outcome provided that they ignore the impact of their reports on the profit they might garner from future trades. In this paper, we analyze non-myopic strategies and examine what information structures lead to truthful betting by traders. Specifically, we analyze the behavior of risk-neutral traders with incomplete information playing in a dynamic game. We consider finite-stage and infinite-stage game models. For each model, we study the logarithmic market scoring rule (LMSR) with two different information structures: conditionally independent signals and (unconditionally) independent signals. In the finite-stage model, when signals of traders are independent conditional on the state of the world, truthful betting is a Perfect Bayesian Equilibrium (PBE). Moreover, it is the unique Weak Perfect Bayesian Equilibrium (WPBE) of the game. In contrast, when signals of traders are unconditionally independent, truthful betting is not a WPBE. In the infinite-stage model with unconditionally independent signals, there does not exist an equilibrium in which all information is revealed in a finite amount of time. We propose a simple discounted market scoring rule that reduces the opportunity for bluffing strategies. We show that in any WPBE for the infinite-stage market with discounting, the market price converges to the fully-revealing price, and the rate of convergence can be bounded in terms of the discounting parameter.
When signals are conditionally independent, truthful betting is the unique WPBE for the infinite-stage market with and without discounting.
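The logarithmic market scoring rule analyzed above has a standard closed form: the market maker's cost function is C(q) = b · log Σᵢ exp(qᵢ/b), instantaneous prices are its gradient, and a trade moving the outstanding shares from q to q' costs C(q') − C(q). A minimal sketch, with the liquidity parameter b chosen arbitrarily:

```python
import math

def lmsr_cost(q, b=100.0):
    """LMSR cost function C(q) = b * log(sum_i exp(q_i / b)), computed stably."""
    m = max(q)
    return m + b * math.log(sum(math.exp((qi - m) / b) for qi in q))

def lmsr_price(q, b=100.0):
    """Instantaneous prices p_i = exp(q_i/b) / sum_j exp(q_j/b); they sum to 1."""
    m = max(q)
    e = [math.exp((qi - m) / b) for qi in q]
    s = sum(e)
    return [x / s for x in e]

# A trader who moves the outstanding shares from q0 to q1 pays C(q1) - C(q0).
q0 = [0.0, 0.0]           # symmetric market: both outcomes priced at 0.5
q1 = [20.0, 0.0]          # buy 20 shares of outcome 0
payment = lmsr_cost(q1) - lmsr_cost(q0)
```

A truthful trader moves q until the price vector equals their posterior belief; the non-myopic deviations studied in the paper arise precisely because this payment structure lets a trader profit later from information they withhold now.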