Iron oxide films were made by chemical vapour deposition followed by a post-deposition annealing treatment. Optical and d.c. electrical measurements probed the Fe2O3 → Fe3O4 transition. The transition could be understood as a thermally activated process, with an activation energy equal to the band gap of Fe2O3. A.c. electrical data gave evidence against the transition being percolative.
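A thermally activated process of this kind follows an Arrhenius law, r = r0·exp(-Ea/kT). The sketch below assumes that form with a hypothetical activation energy of 2.1 eV (a band-gap value often quoted for Fe2O3; the abstract does not give the measured number):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def activated_rate(t_kelvin, e_a_ev, prefactor=1.0):
    """Rate of a thermally activated process: r = r0 * exp(-Ea / kT)."""
    return prefactor * math.exp(-e_a_ev / (K_B * t_kelvin))

# Hypothetical band-gap-sized activation energy of 2.1 eV (illustrative only).
r_600 = activated_rate(600.0, 2.1)
r_700 = activated_rate(700.0, 2.1)
print(r_700 / r_600)  # the rate rises steeply with temperature
```

Plotting ln(r) against 1/T would give a straight line whose slope yields Ea, which is how such an activation energy is typically extracted.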
Xylose fermentation by Pichia stipitis was examined using a two-stage batch process. The cells were first grown aerobically on D-xylose (5 g/L), after which additional xylose (10 g/L) was added and fermented under anaerobic conditions (T = 30°C). The optimum pH for fermentation at constant pH was found to be 4.5 (maximum specific ethanol production rate 0.21 g/(g h)). Forced square-wave cycling of the pH between 4 and 5, and between 3.5 and 5.5 (cycle time 30 min), during the fermentation stage resulted in a fermentation rate lower than the maximum rate, but with unchanged ethanol yields.
Change impact analysis is a change management activity that has previously been studied mainly from a technical perspective. For example, much work focuses on methods for determining the impact of a change. In this paper, we present results from a study on the role of impact analysis in the change management process. In the study, impact analysis issues were prioritised with respect to criticality by software professionals from an organisational perspective and a self-perspective. The software professionals belonged to three organisational levels: operative, tactical and strategic. Qualitative and statistical analyses with respect to differences between perspectives as well as levels are presented. The results show that important issues for a particular level are tightly related to how the level is defined. Similarly, issues important from an organisational perspective are more holistic than those important from a self-perspective. However, our data indicate that the self-perspective colours the organisational perspective, meaning that personal opinions and attitudes cannot easily be disregarded. In comparing the perspectives and the levels, we visualise the differences in a way that allows us to discuss two classes of issues: high-priority and medium-priority. The most important issues from this point of view concern fundamental aspects of impact analysis and its execution.
Tin oxide thin films were deposited by reactive radio-frequency magnetron sputtering onto In2O3:Sn-coated and bare glass substrates. Optical constants in the 300–2500-nm wavelength range were determined by a combination of variable-angle spectroscopic ellipsometry and spectrophotometric transmittance measurements. Surface roughness was modeled from optical measurements and compared with atomic-force microscopy. The two techniques gave consistent results. The fit between experimental optical data and model results could be significantly improved when it was assumed that the refractive index of the Sn oxide varied across the film thickness. Varying the oxygen partial pressure during deposition made it possible to obtain films whose complex refractive index changed at the transition from SnO to SnO2. An addition of hydrogen gas during sputtering led to lower optical constants in the full spectral range in connection with a blueshift of the bandgap. Electrochemical intercalation of lithium ions into the Sn oxide films raised their refractive index and enhanced their refractive-index gradient.
Software requirements are often formulated on different levels and are hence difficult to compare with each other. To address this issue, a model that allows for placing requirements on different abstraction levels has been developed. The model supports both abstraction and refinement of requirements, so that requirements can be compared both with each other and with product strategies. Comparison between requirements allows for prioritization of requirements, which in many cases is impossible if the requirements are described on different abstraction levels. Comparison with product strategies enables early and systematic acceptance or dismissal of requirements, minimizing the risk of overloading. This paper presents an industrial evaluation of the model. It has been evaluated in two different companies, and the experiences and findings are presented. It is concluded that the requirements abstraction model provides helpful improvements to the industrial requirements engineering process.
This article describes an evaluation of six different methods for prioritizing software requirements. Based on the quality requirements for a telephony system, the authors individually used all six methods on separate occasions to prioritize the requirements. The methods were then characterized according to a number of criteria from a user's perspective. We found the analytic hierarchy process to be the most promising method, although it may be problematic to scale up. In an industrial follow-up study we used the analytic hierarchy process to further investigate its applicability. We found that the process is demanding but worth the effort because of its ability to provide reliable results, promote knowledge transfer and create consensus among project members.
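The core of the analytic hierarchy process is deriving a priority vector from a matrix of pairwise importance judgements, conventionally via the principal eigenvector. A minimal sketch, with hypothetical comparison values for three requirements (not taken from the study):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority vector from a pairwise comparison matrix, computed as the
    normalized principal eigenvector (the standard AHP prescription)."""
    mat = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(mat)
    k = np.argmax(vals.real)          # Perron (largest real) eigenvalue
    w = np.abs(vecs[:, k].real)       # eigenvector sign is arbitrary
    return w / w.sum()

# Hypothetical judgements: R1 is 3x as important as R2 and 5x as
# important as R3; R2 is 2x as important as R3.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
print(ahp_priorities(A))  # weights sum to 1, R1 ranked highest
```

The scale-up concern the abstract mentions is visible here: n requirements require n(n-1)/2 pairwise judgements, which grows quadratically.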
Network Function Virtualization (NFV) has been identified as a way to revamp the provisioning of next-generation network services. This new paradigm allows cloud and network/service providers to compose their network services, also known as service function chains (SFCs), in an agile way, since the software of a network function is decoupled from the legacy hardware. To reap the benefits of this new technology, novel mechanisms are needed that help cloud and network/service providers deploy increasingly complex virtual network services seamlessly and efficiently. Existing state-of-the-art techniques often rely on Integer Linear Programming, heuristics/metaheuristics, and greedy methods to deploy service function chains. Although reasonable and acceptable, these techniques still suffer from two key limitations: convergence time and scalability. To this end, we propose RAFALE, a suite of solution techniques that tames this complexity by leveraging the concept of similarity from machine learning and the skip-gram modeling framework. To the best of our knowledge, RAFALE is the first method to detect similarity between virtual network services: it measures how similar a new incoming virtual network service request is to the already-deployed services, so that the same or closely similar provisioning techniques can be reused from previous deployment experience. Experimental results show that RAFALE greatly reduces the convergence time needed for provisioning virtual network services and can scale to 100 virtual network functions per virtual network service, compared with the state of the art.
The experimental results show that RAFALE fulfills the promises of NFV: it decreases the time and complexity of managing and deploying virtual services, and it provides an agile, fast, and scalable solution for deploying new service requests by skipping one or more service provisioning steps (i.e., detecting and resolving conflicts among policies, placement, and chaining) while still satisfying the validated NFV policies.
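The similarity-based reuse idea can be sketched as a nearest-neighbour lookup over service embeddings. In the sketch below the embeddings and service names are made up; in the described approach they would come from a skip-gram model trained over sequences of virtual network functions:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar_deployment(new_request, deployed):
    """Return the id of the already-deployed service whose embedding is
    closest to the new request, so its provisioning plan can be reused."""
    return max(deployed, key=lambda sid: cosine_similarity(new_request, deployed[sid]))

# Hypothetical 4-d embeddings of two already-deployed service chains.
deployed = {"svc-A": [0.9, 0.1, 0.0, 0.2],
            "svc-B": [0.1, 0.8, 0.3, 0.0]}
new_req = [0.85, 0.15, 0.05, 0.1]
print(most_similar_deployment(new_req, deployed))  # → svc-A
```

A sufficiently close match is what allows provisioning steps (conflict detection, placement, chaining) to be skipped and replayed from the matched deployment.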
The demand for biofuels and biochemicals is expected to increase in the future, which will in turn increase the demand for biomass feedstock. Large gasification plants fueled with biomass feedstock are likely to be a key enabling technology in a resource‐efficient, bio‐based economy. Furthermore, the costs for producing biofuels and biochemicals in such plants could potentially be decreased by utilizing inexpensive low‐grade residual biomass as feedstock. This study investigates the use of shredded tree bark as a feedstock for the production of biomethane in the GoBiGas demonstration plant in Gothenburg, Sweden, based on a 32 MWth industrial dual fluidized bed gasification unit. The plant was operated with bark feedstock for 12 000 hours during the period 2014 to 2018. Data from the measurement campaign were processed using a stochastic approach to establish the plant's mass and energy balances, which were then compared with operation of the plant with wood pellets. For this comparison, an extrapolation algorithm was developed to predict plant performance using bark dried to the same moisture content as wood pellets, i.e., 8% w.b. Plant operation with bark feedstock was evaluated for operability, efficiency, and feedstock‐related cost. The gas quality achieved during the test period was similar to that obtained for operation with wood pellets. Furthermore, no significant ash sintering or agglomeration problems were observed over more than 750 hours of operation. The calculated biomass‐to‐biomethane efficiency is 43% to 47% (lower heating value basis) for operation with wet bark. However, the predicted biomass‐to‐biomethane efficiency can be increased to 55% to 65% for operation with bark feedstock dried to 8% moisture content, with corresponding feedstock costs in the range of 24.2 to 32.7 EUR/MWh, i.e., a cost reduction of about 40% compared with wood pellets.
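The biomass-to-biomethane efficiency quoted above is simply the energy in the product gas divided by the energy in the feedstock, both on a lower-heating-value basis. A minimal sketch with hypothetical figures chosen to fall inside the abstract's wet-bark range (the real numbers come from the plant's mass and energy balances):

```python
def biomass_to_biomethane_efficiency(ch4_power_mw, feedstock_power_mw):
    """LHV-basis conversion efficiency: biomethane energy output
    divided by biomass energy input, both as thermal power in MW."""
    return ch4_power_mw / feedstock_power_mw

# Hypothetical: a 32 MWth feed producing 14.4 MW of biomethane -> 45%,
# within the 43-47% range reported for wet bark.
print(biomass_to_biomethane_efficiency(14.4, 32.0))  # 0.45
```

Drying the bark raises this ratio mainly because less of the input energy is spent evaporating feedstock moisture.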
The city of Lahti, Finland, has developed a unique policy of combining city strategy work with strategic master planning in an iterative process. It thereby offers insights for research on strategic spatial planning, exemplifying how institutional frameworks of statutory planning can be utilized as resources in strategic planning. Three lessons from the Lahti case are drawn: (1) utilize the moments of opportunity in the institutional environment of statutory planning, (2) shift the focus from the level of 'strategic plans' to the policy level of strategy work, (3) develop strategic planning as a platform for diverse 'languages'.