Carbon thermograms, which classify carbon aerosol according to its volatility, were obtained for fine-particle samples from an isolated highway vehicle source and a vehicle-dominated ambient site. The thermograms from the sites were compared after scaling by the carbon monoxide concentration. The high- and low-volatility carbon fractions in the ambient sample agreed to within 10% of the corresponding fractions in the highway vehicle sample. Excess carbon in the range of intermediate volatility comprised 15 to 19% of the ambient carbon mass and is attributed to aerosols from secondary processes and nonvehicular primary sources. When lead was used as a tracer to scale the thermograms, the high- and low-volatility ambient carbon fractions were underestimated by a factor of 2. The low-volatility fraction (“black carbon”) present in the atmospheric sample was evolved at lower temperatures than the equivalent fraction in the isolated highway vehicle sample. This creates an ambiguity in defining the low-volatility fraction, which is a problem if black carbon is used as a tracer. The scaling technique described in this work avoids the problem because it does not require an estimation of the low-volatility carbon fraction.
As part of the California Regional PM2.5 and PM10 Air Quality Study (CRPAQS), particle size distributions were measured simultaneously at two sites: the city of Fresno and the agricultural site of Angiola. Reported here are data obtained by scanning mobility analysis over the size range from 10 nm to 400 nm for the intensive study period from December 1, 2000 through February 6, 2001. These high time resolution data show variability in the character of the distributions, as well as in the total number concentrations. The most pronounced feature of the data set is a consistent nighttime maximum in particle number concentration, with a modal diameter near 80 nm, during the evening hours at the urban Fresno site. Although these maxima are correlated with CO, NO, and black carbon, the particle size is larger than the 30–40 nm modal diameters observed for traffic aerosols during the commute hours, and is attributed to a non-vehicle source. At the agricultural site, the morning maximum in particle number concentration coincides with the maximum in NO concentration, but often precedes the morning maximum in black carbon. Values for the geometric mean particle diameter varied from day to day, but were correlated between the two sites, with somewhat larger particle sizes at Angiola during periods of stagnation.
Two chemically synthesized flavin derivatives, 8‐trifluoromethyl‐ and 8‐bromoriboflavin (8‐CF3RF and 8‐BrRF), were photochemically characterized in H2O and studied spectroscopically after incorporation into the LOV domain of the blue light photoreceptor YtvA from Bacillus subtilis. The spectroscopic studies were paralleled by high‐level quantum chemical calculations. In solution, 8‐BrRF showed a remarkably high triplet quantum yield (0.97, parent compound riboflavin, RF: 0.6) and a small fluorescence quantum yield (0.07, RF: 0.27). For 8‐CF3RF, the triplet yield was 0.12, and the fluorescence quantum yield was 0.7. The high triplet yield of 8‐BrRF is due to the bromine heavy atom effect causing a stronger spin–orbit coupling. Theoretical calculations reveal that the decreased triplet yield of 8‐CF3RF is due to a smaller charge transfer and a less favorable energetic position of T2, required for intersystem crossing from S1 to T1, as an effect of the electron‐withdrawing CF3 group. The reconstitution of the LOV domain with the new flavins resulted in the typical LOV photochemistry, consisting of triplet state formation and covalent binding of the chromophore, followed by a thermal recovery of the parent state, albeit with different kinetics and photophysical properties.
Renewal point processes show up in many different fields of science and engineering. In some cases the renewal points become the only observable parts of an anticipated hidden random variation of some physical quantity. The hypothesis might be that a hidden random process originating from zero or some other low value only becomes visible at the time of first crossing of some given value level, and that the process is restarted from scratch immediately after the level crossing. It might then be of interest to reveal the defining properties of this hidden process from a sample of observed first-passage times. In this paper the hidden process is first modeled as a non-stationary Ornstein–Uhlenbeck (OU) process with unknown parameters that must be estimated solely from the information contained in a sample of first-passage times. The estimation method is a direct application of the Fortet integral equation of the OU process. A non-stationary Feller process is considered subsequently. Like the OU process, the Feller process has a known transition probability distribution that allows the formulation of the integral equation. The described integral-equation estimation method also provides a subjective graphical test of the applicability of the OU process or the Feller process when applied to a reasonably large sample of observed first-passage data.
These non-stationary processes have several applications in biomedical research, for example as idealized models of the neuron membrane potential. When the potential reaches a certain threshold the neuron fires, whereupon the potential drops to a fixed initial value, from where it continuously builds up again until the next firing. Also in civil engineering there are hidden random phenomena, such as internal cracking or corrosion, that after some random time break through to the material surface and become observable. However, as a model of physical phenomena the OU process has the defect of being unbounded on the negative side. This defect is not present for the Feller process, which may therefore provide a useful modeling alternative to the OU process.
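The renewal setup described in this abstract, a hidden process restarted from a base level after each threshold crossing, with only the crossing times observable, can be sketched with a simple Euler–Maruyama simulation of an OU process. The parameter values and function names below are illustrative assumptions for the sketch, not quantities taken from the paper; the paper's actual estimation method (the Fortet integral equation) is not implemented here.

```python
import numpy as np

def ou_first_passage_times(theta, mu, sigma, x0, threshold,
                           dt=1e-3, n_samples=200, seed=0):
    """Simulate first-passage times through `threshold` for an OU process
    dX = theta*(mu - X) dt + sigma dW, restarted from x0 after each crossing.
    All parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    sqrt_dt = np.sqrt(dt)
    times = []
    for _ in range(n_samples):
        x, t = x0, 0.0
        # Euler-Maruyama steps until the first up-crossing of the threshold
        while x < threshold:
            x += theta * (mu - x) * dt + sigma * sqrt_dt * rng.standard_normal()
            t += dt
        times.append(t)  # only the first-passage time is "observed"
    return np.array(times)

# Sub-threshold regime (mu < threshold), so crossings are noise-driven,
# mimicking the hidden build-up-and-fire behavior described above.
fpt = ou_first_passage_times(theta=1.0, mu=0.8, sigma=0.5,
                             x0=0.0, threshold=1.0)
print(f"mean first-passage time: {fpt.mean():.3f}")
```

A sample generated this way plays the role of the "observed first-passage data" in the abstract; an estimation procedure would then try to recover theta, mu, and sigma from these times alone.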
The Eurasian round goby (Neogobius melanostomus) invaded the freshwater North American Great Lakes in ~1990 via accidental introduction from ballast water discharge. Its genotypes in the Great Lakes were traced to estuaries in the northern Black Sea, where the round goby flourishes in a variety of salinities up to 22 parts per thousand (ppt). To prevent further introductions, U.S. and Canadian Coast Guard regulations now require that vessels exchange ballast water at sea before entering the Great Lakes. Since the salinity tolerance of the invasive round goby population is poorly understood, we tested 230 laboratory-acclimated fish in three experimental scenarios: (1) rapid salinity increases (0–40 ppt), simulating ballast water exchange; (2) step-wise salinity increases, as during estuarine tidal fluxes or migration from fresh water to salt water; and (3) long-term survivorship and growth (to 4 months) at acclimated salinities. Almost all gobies survived experiments at 0–20 ppt, whereas none survived ≥30 ppt; at 25 ppt, only 15% withstood rapid changes and 30% survived step-wise increases. Ventilation frequencies were lowest at 10–15 ppt in step-wise experiments, under conditions that were near isotonic with the fish's internal plasma concentrations, reflecting lower energy expenditure for osmoregulation. Growth rates appeared greatest at 5–10 ppt, congruent with the larger sizes reached by gobies in Eurasian brackish waters. Thus, we predict that the Great Lakes round goby would thrive in brackish-water estuaries along North American coasts, if introduced. However, oceanic salinities appear fatal to the invasive round goby, which likely cannot withstand complete seawater ballast exchanges or oceanic habitats.
Semiotics is considered fundamental to an understanding of human–computer interaction, and of all computer artifacts. Informatics should therefore be viewed as technical semiotics (or semiotic engineering). In particular, interaction between human and computer is characterized by features of communication, a sort of communication, however, that lacks decisive communicative features. It must be identified as a process of pseudo-communication. Interaction is viewed as the coupling of two autonomous processes: a sign process (carried out by the human user) and a signal process (carried out by the computer). Software appears as a semiotic entity in a duplicate way: calculated and calculating, i.e. both a result and an agent of calculations. This dialectic characterizes the class of signs on the computer medium. Problems of software design (functionality and usability design) are specific problems of the coupling of sign and signal processes.