Similar Documents
20 similar documents found.
1.
2.
As part of the pre-flight calibration and validation activities for the Ocean Color and Temperature Scanner (OCTS) and the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) ocean color satellite instruments, a radiometric measurement comparison was held in February 1995 at the NEC Corporation in Yokohama, Japan. Researchers from the National Institute of Standards and Technology (NIST), the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the University of Arizona Optical Sciences Center (UA), and the National Research Laboratory of Metrology (NRLM) in Tsukuba, Japan, used their portable radiometers to measure the spectral radiance of the OCTS visible and near-infrared integrating sphere at four radiance levels. These four levels corresponded to the configuration of the OCTS integrating sphere when the calibration coefficients for five of the eight spectral channels, or bands, of the OCTS instrument were determined. The measurements of the four radiometers differed by −2.7 % to 3.9 % when compared to the NEC calibration of the sphere, and the overall agreement was within the combined measurement uncertainties. A comparison of the measurements from the participating radiometers also showed agreement within the combined measurement uncertainties. These results are encouraging and demonstrate the utility of comparisons using laboratory calibration integrating sphere sources. Future comparisons will focus on instruments scheduled for spacecraft in the Earth Observing System (EOS), the NASA study of climate change.
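The agreement criterion used in such comparisons can be made concrete: a radiometer and the sphere calibration agree when their difference lies within the root-sum-square combination of their uncertainties. A minimal sketch, with invented radiance and uncertainty values (not the comparison's actual numbers):

```python
import math

def percent_difference(measured, reference):
    """Percent difference of a radiometer reading from the reference calibration."""
    return 100.0 * (measured - reference) / reference

def agree_within_uncertainty(measured, reference, u_meas, u_ref, k=2.0):
    """Agreement at coverage factor k: |difference| <= k * combined uncertainty."""
    combined = math.sqrt(u_meas**2 + u_ref**2)   # root-sum-square combination
    return abs(measured - reference) <= k * combined

# Hypothetical values: sphere radiance 100.0 (arbitrary units) with standard
# uncertainty 1.6; a visiting radiometer reads 103.9 with uncertainty 1.2.
ref, meas, u_ref, u_meas = 100.0, 103.9, 1.6, 1.2
print(f"difference: {percent_difference(meas, ref):+.1f} %")
print("agree within k=2 combined uncertainty:",
      agree_within_uncertainty(meas, ref, u_meas, u_ref))
```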

3.
The pre-launch characterization and calibration of remote sensing instruments should be planned and carried out in conjunction with their design and development to meet mission requirements. Onboard calibrators, such as blackbodies, and sensors, such as spectral radiometers, should be characterized and calibrated using SI (International System of Units) traceable standards. In earth remote sensing, this allows inter-comparison and inter-calibration of different sensors in space to create global time series of climate records of high accuracy, in which some inevitable data gaps can be bridged. Recommended best-practice guidelines for this pre-launch effort are presented, based on experience gained in National Institute of Standards and Technology (NIST), National Aeronautics and Space Administration (NASA) and National Oceanic and Atmospheric Administration (NOAA) programs over the past two decades. The radiometric standards and calibration facilities currently available at NIST to serve the remote sensing community are described. Examples are presented of best-practice calibrations and intercomparisons that build an SI-traceable uncertainty budget for the instrumentation used in preflight satellite sensor calibration and validation.
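As an illustration of such an uncertainty budget, the sketch below combines hypothetical component uncertainties in quadrature; the component names and magnitudes are invented, not taken from the NIST/NASA/NOAA programs described:

```python
import math

# Hypothetical pre-launch calibration uncertainty budget (percent, k=1).
budget = {
    "reference standard (SI traceable)": 0.30,
    "transfer radiometer stability": 0.20,
    "source non-uniformity": 0.25,
    "stray light": 0.15,
}

# Combine independent components in quadrature (root-sum-square).
combined = math.sqrt(sum(u**2 for u in budget.values()))
for component, u in budget.items():
    print(f"{component:40s} {u:.2f} %")
print(f"{'combined standard uncertainty':40s} {combined:.2f} %")
print(f"{'expanded uncertainty (k=2)':40s} {2 * combined:.2f} %")
```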

4.
Researchers investigating climate change have used historical tide-gauge measurements from all over the world to investigate the changes in sea level that have occurred over the last century or so. However, such estimates combine any true sea-level variations with any vertical movements of the land at the specific tide-gauge. For a tide-gauge record to be used to determine the climate-related component of changes in sea level, it is therefore necessary to correct for the vertical land movement component of the observed change. In 1990, the Institute of Engineering Surveying and Space Geodesy and the Proudman Oceanographic Laboratory started developing techniques based on the Global Positioning System (GPS) for measuring vertical land movements (VLM) at tide-gauges in the UK. This paper provides brief details of these early developments and shows how they led to the establishment of continuous GPS (CGPS) stations at a number of tide-gauges. The paper then discusses the use of absolute gravity (AG) as an independent technique for measuring VLM at tide-gauges. The most recent results, from CGPS time series dating back to 1997 and AG time series dating back to 1995/1996, are then used to demonstrate the complementarity of these two techniques and their potential for providing site-specific estimates of VLM at tide-gauges in the UK.
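The correction the paper motivates amounts to a rate addition: the climate-related (geocentric) sea-level trend equals the observed relative trend plus the land uplift rate, since rising land masks sea-level rise at the gauge. A minimal sketch with invented rates:

```python
def climate_sea_level_trend(observed_relative_trend_mm_yr, land_uplift_mm_yr):
    """Geocentric (climate-related) sea-level trend from a tide-gauge record.

    observed_relative_trend_mm_yr: trend of sea level relative to the land.
    land_uplift_mm_yr: vertical land movement from CGPS or absolute gravity
                       (positive = uplift).
    """
    return observed_relative_trend_mm_yr + land_uplift_mm_yr

# Hypothetical UK gauge: relative sea level rising 1.2 mm/yr while CGPS shows
# the land rising 0.6 mm/yr, so the climate signal is 1.8 mm/yr.
print(climate_sea_level_trend(1.2, 0.6))
```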

5.
Particle size and mass concentration are now important parameters in the ambient air quality standards of several countries. The regulatory limits on the mass concentration of particulate matter (PM) for the size classes PM2.5 and PM10, i.e., particles less than or equal to 2.5 μm and 10 μm in aerodynamic diameter, respectively, are defined on yearly and hourly time-weighted-average bases. However, these limits differ among countries' regulations. Both parameters relate to human health, climate and other issues, so their accurate and precise measurement is very important. Despite this, little work has so far progressed in national metrology institutes (NMIs) worldwide on the calibration and traceability of PM measurements. In the context of PM measurement traceability, this paper systematically presents (1) air quality regulations in different countries, (2) reference methods for size and mass measurements, (3) variation/error and limitations of PM measurements based on the current results in this study and previously published results, (4) the current status of PM size and mass calibration facilities, (5) expected uncertainty in PM measurements, (6) the additional uncertainty introduced into other parameters of national ambient air quality standards by PM measurements, (7) where the traceability of PM stands relative to other parameters of air quality standards and its impact on health and climate, (8) the NMIs working on this issue, (9) the status at the Bureau International des Poids et Mesures (BIPM), France, and (10) conclusions. The aim of this paper is a better understanding of the importance of International System of Units (SI) traceability in PM measurements, so that PM data, wherever and whenever measured, are acceptable everywhere and comparable, thereby helping improve air quality and thus the quality of life. Funding agencies should be aware of this issue and accept results from principal investigators and their teams only when those results have a traceability link to the SI. NMIs should establish programs to involve industry in gas and aerosol metrology to meet the requirements for calibration and standards. Regulatory authorities and ministries should work together with NMIs to improve the data quality of ambient measurements. This will greatly help in making better policies and decisions on the related impacts. These were also the ultimate goals of the one-day pre-AdMet workshop organized at the National Physical Laboratory, New Delhi, India on February 20th, 2013.
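As a small illustration of the time-weighted-average limits mentioned above, the sketch below computes a 24-hour PM2.5 average and compares it with an assumed limit; the readings and the 60 μg/m³ value are placeholders, since actual limits differ between regulations:

```python
# Minimal sketch of a time-weighted-average check for PM2.5. The hourly
# readings and the 60 ug/m3 daily limit are illustrative placeholders;
# actual limits differ between countries' regulations.
hourly_pm25 = [35.0, 42.5, 58.0, 71.2] * 6   # 24 hypothetical hourly readings, ug/m3

daily_average = sum(hourly_pm25) / len(hourly_pm25)
limit = 60.0
verdict = "exceeds" if daily_average > limit else "meets"
print(f"24-h average: {daily_average:.1f} ug/m3 ({verdict} the assumed limit)")
```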

6.
The problem of identifying sources of airborne pollutants and providing quantitative estimates of each source's contribution is important for airborne particulate matter. Various forms of factor analysis have been applied to this problem. However, factor analysis suffers from the fundamental problem of rotational ambiguity, which makes the problem ill-posed, so incorporating additional information can improve the solutions. Especially for identifying local sources, wind data (direction and speed) can be valuable additional information in such receptor modeling. However, wind data cannot be used directly as dependent variables in factor analytic modeling because the dependence of observed concentrations on wind variables is far from linear. An expanded multilinear model has been developed in which wind direction, speed and other variables are included as independent variables. For each source, the analysis computes a directional profile indicating how much of the observed concentration is explained by that factor as a function of wind direction, speed, and the other variables. This model has been tested using simulated data developed by the U.S. Environmental Protection Agency as part of a workshop to test advanced factor analysis methods. For most of the local sources, well-defined directional profiles were obtained.
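The expanded multilinear model itself is more elaborate, but the notion of a directional profile can be illustrated with a simple conditional average of concentration by wind-direction sector. A toy sketch on synthetic data (a simplified stand-in, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic receptor data: a local source near 45 degrees dominates when the
# wind blows from that sector; the rest is exponential background noise.
n = 2000
wind_dir = rng.uniform(0.0, 360.0, n)            # wind direction, degrees
d = (wind_dir - 45.0 + 180.0) % 360.0 - 180.0    # signed angular distance from 45 deg
concentration = 5.0 * np.exp(-d**2 / (2 * 20.0**2)) + rng.exponential(0.5, n)

# Directional profile: mean concentration in twelve 30-degree wind sectors.
edges = np.arange(0, 361, 30)
sector = np.digitize(wind_dir, edges) - 1
for k in range(12):
    mean_c = concentration[sector == k].mean()
    print(f"{edges[k]:3d}-{edges[k+1]:3d} deg: {mean_c:5.2f}")
```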

7.
Since the British National Archives put forward the concept of digital continuity in 2007, several developed countries have worked out digital continuity action plans. However, technologies for guaranteeing digital continuity are still lacking. This paper first analyzes the requirements of a digital continuity guarantee for electronic records based on data quality theory, and points out the necessity of a data quality guarantee for electronic records. It then recasts the digital continuity guarantee of electronic records as ensuring their consistency, completeness and timeliness, and constructs the first technology framework for guaranteeing the digital continuity of electronic records. Finally, temporal functional dependency techniques are used to build the first integrated method for ensuring the consistency, completeness and timeliness of electronic records.
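As a rough illustration of the temporal-functional-dependency idea (not the paper's actual method), a dependency such as record_id → value can be checked by verifying that records sharing a key agree on the value within the same time window. A minimal sketch with hypothetical record tuples:

```python
from collections import defaultdict

# Hypothetical electronic records: (record_id, time_window, value).
# A temporal functional dependency record_id -> value must hold within each
# time window; a violation signals an inconsistency in the record series.
records = [
    ("doc-1", 1, "draft"), ("doc-1", 1, "draft"),
    ("doc-1", 2, "final"), ("doc-2", 1, "draft"),
    ("doc-2", 1, "final"),   # violation: two values in the same window
]

def tfd_violations(records):
    seen = defaultdict(set)
    for rid, window, value in records:
        seen[(rid, window)].add(value)
    return [key for key, values in seen.items() if len(values) > 1]

print(tfd_violations(records))   # [('doc-2', 1)]
```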

8.
Annually laminated sediments from marine or lacustrine settings represent valuable high-resolution archives of climate change that record variation due to changing precipitation and run-off from land or variation in biological productivity and flux in the water column. Because of their annual resolution such sediments may capture abrupt changes of interannual to decadal scales rivaling corals and ice cores in resolution. Laminated sediments often occur intermittently in the sediment column, and the onset and cessation of laminae commonly record the abrupt crossing of thresholds related to climate change, for example, in the degree of oxygenation of bottom waters. Such records from marginal basins and continental margins have been pivotal in demonstrating that abrupt changes hitherto documented only in high-latitude ice cores are synchronous with climatic change at low latitudes. These insights into global teleconnections have improved our understanding of the mechanisms of rapid climate change. In deep-sea settings, the discovery of the episodic occurrence of laminated diatom-rich sediments in the Equatorial Pacific and Southern Ocean provides evidence for massive climate-related biogeochemical excursions tied to abrupt changes in the input, distribution and availability of nutrients in the oceans.

9.
We developed a dataset of local-scale daily climate change scenarios for Japan (called ELPIS-JP) using the stochastic weather generators (WGs) LARS-WG and, in part, WXGEN. The ELPIS-JP dataset is based on observed (or estimated) daily weather data for seven climatic variables (daily mean, maximum and minimum temperatures; precipitation; solar radiation; relative humidity; and wind speed) at 938 sites in Japan, together with climate projections from the multi-model ensemble of global climate models (GCMs) used in the Coupled Model Intercomparison Project (CMIP3) and a multi-model ensemble of regional climate models from the Japanese downscaling project (called S-5-3). The capability of the WGs to reproduce the statistical features of the observed data for the period 1981-2000 was assessed using several statistical tests and quantile-quantile plots; overall performance of the WGs was good. The ELPIS-JP dataset consists of two types of daily data: (i) transient scenarios throughout the twenty-first century using projections from 10 CMIP3 GCMs under three emission scenarios (A1B, A2 and B1) and (ii) time-slice scenarios for the period 2081-2100 using projections from three S-5-3 regional climate models. The ELPIS-JP dataset is designed to be used in conjunction with process-based impact models (e.g. crop models) for assessing not only the impacts of mean climate change but also the impacts of changes in climate variability, wet/dry spells and extreme events, as well as the uncertainty of future impacts associated with climate models and emission scenarios. ELPIS-JP offers an excellent platform for probabilistic assessment of climate change impacts and potential adaptation at a local scale in Japan.
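The quantile-quantile assessment mentioned above pairs sorted observed values against sorted generated values; points near the 1:1 line indicate that the weather generator reproduces the observed distribution. A minimal sketch, with synthetic series standing in for observed and WG-generated daily temperatures:

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(15.0, 8.0, 7300)    # stand-in for 20 years of observed daily Tmean
generated = rng.normal(15.3, 7.6, 7300)   # stand-in for weather-generator output

# Quantile-quantile comparison at the 1st..99th percentiles; in practice one
# would plot q_obs against q_gen and look for departures from the 1:1 line,
# backed by tests such as Kolmogorov-Smirnov.
probs = np.arange(1, 100)
q_obs = np.percentile(observed, probs)
q_gen = np.percentile(generated, probs)
print(f"largest quantile deviation: {np.max(np.abs(q_gen - q_obs)):.2f} deg C")
```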

10.
The Workshop on Medical Preparedness for Chemical, Biological, Radiological, Nuclear, and Explosives (CBRNE) events: national scan was held on 20 and 21 May 2010 at the Diefenbunker Museum in Ottawa, Canada. The purpose of the workshop was to provide the CBRNE Research and Technology Initiative with a Canadian national profile of existing capabilities and anticipated gaps in casualty management consistent with the community emergency response requirements. The workshop was organised to enable extensive round-table discussions and provide a summary of key gaps and recommendations for emergency response planners.

11.
The impact of design on logistics cannot be ignored, and design for logistics is a new concept similar to design for manufacturing or design for assembly. An engineering change is one of the scenarios that requires logistics support. Change control in a product data management (PDM) system is one of the major approaches to handling engineering changes today. According to the principles of configuration management, three dates are used during the change control workflow for controlling and managing change planning and scheduling: the release date, the effective date, and the effectivity date. The effective date is the exact date on which a released change takes effect on the shop floor. The effectivity date is the expected date on which decision makers plan for the change to take effect. Normally, multiple disciplines, such as design and development, purchasing, the shop floor, and quality control, are involved in deciding when a change is to become effective. In this paper, a linear programming effectivity decision model is proposed to concurrently support changes to design scheduling, production planning and shop floor scheduling when an engineering change occurs. The proposed model solves an integrated problem of design scheduling, production planning and shop floor scheduling.
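The paper's model is not reproduced here, but a toy linear program conveys the flavor of an effectivity-date decision: trade off the cost of delaying the change against the cost of scrapping inventory made obsolete by it. A sketch with invented costs and rates, using scipy:

```python
from scipy.optimize import linprog

# Hypothetical effectivity-date LP (a toy stand-in for the paper's model):
# choose effectivity date t (days from now) and scrapped inventory w (units)
# to minimize delay cost plus scrap cost, subject to each discipline's lead
# time and depletion of old-part inventory at a fixed rate.
c_delay, c_scrap = 40.0, 3.0          # cost per day of delay, per scrapped unit
inventory, rate, lead_time = 500.0, 20.0, 10.0

res = linprog(
    c=[c_delay, c_scrap],             # minimize 40*t + 3*w
    A_ub=[[-rate, -1.0],              # w >= inventory - rate * t
          [-1.0, 0.0]],               # t >= combined lead time of all disciplines
    b_ub=[-inventory, -lead_time],
    bounds=[(0, None), (0, None)],
)
t, w = res.x
print(f"effectivity date: day {t:.0f}, scrapped units: {w:.0f}, cost: {res.fun:.0f}")
```

For these numbers the optimum waits until the old inventory is consumed (day 25) rather than scrapping it, because the scrap cost outweighs the delay cost.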

12.
We propose a change point approach based on the segmented regression technique for testing the constancy of the regression parameters in a linear profile data set. Each sample collected over time in the historical data set consists of several bivariate observations for which a simple linear regression model is appropriate. The change point approach is based on the likelihood ratio test for a change in one or more regression parameters. We compare the performance of this method to that of the most effective Phase I linear profile control chart approaches using a simulation study. The advantages of the change point method over the existing methods are greatly improved detection of sustained step changes in the process parameters and improved diagnostic tools to determine the sources of profile variation and the location(s) of the change point(s). Also, we give an approximation for appropriate thresholds for the test statistic. The use of the change point method is demonstrated using a data set from a calibration application at the National Aeronautics and Space Administration (NASA) Langley Research Center. Copyright © 2006 John Wiley & Sons, Ltd.
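The core of the likelihood ratio approach can be sketched for a single series with Gaussian errors: scan candidate split points and compare the pooled regression fit against separate fits on the two segments. The paper works with multiple profile samples; this toy collapses that structure into one series, and all numbers are synthetic:

```python
import numpy as np

def sse_linear_fit(x, y):
    """Residual sum of squares of an ordinary least squares straight-line fit."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def changepoint_lrt(x, y, min_seg=5):
    """Maximized likelihood ratio statistic for one change in the regression line.

    Under Gaussian errors the statistic for a split at k is
    n * log(SSE_pooled / (SSE_left + SSE_right)).
    """
    n = len(y)
    sse_pooled = sse_linear_fit(x, y)
    stats = {
        k: n * np.log(sse_pooled / (sse_linear_fit(x[:k], y[:k])
                                    + sse_linear_fit(x[k:], y[k:])))
        for k in range(min_seg, n - min_seg)
    }
    k_hat = max(stats, key=stats.get)
    return k_hat, stats[k_hat]

# Synthetic data: the slope changes from 1.0 to 1.6 at sample 60.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 120)
slope = np.where(np.arange(120) < 60, 1.0, 1.6)
y = slope * x + rng.normal(0.0, 0.5, 120)
k_hat, stat = changepoint_lrt(x, y)
print(f"estimated change point: sample {k_hat}, LRT statistic {stat:.1f}")
```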

13.
Assessing the impact of climate change on building energy consumption is of great significance for energy-efficient building design. This paper analyzes hourly meteorological data for Shanghai from 2008 to 2017. The results show that, relative to the standard weather-year baseline, the maximum cooling degree-hours (base 27 °C) increased by 206%, while the minimum heating degree-days (base 18 °C) decreased by about 10%. To quantify the range of impact of climate change on building energy consumption, an office building in Shanghai equipped with an air-cooled variable refrigerant flow (multi-split) system was taken as the study object, and measured hourly weather data for 2008-2017 were compared with standard weather-year data to analyze changes in air-conditioning operating energy consumption. The results show that the increase in cooling degree-hours is mainly caused by a large increase in hot weather in July and August: the building cooling load in these two months increased by up to 63%, the annual building load increased by up to 13%, and the operating energy consumption of the multi-split system increased by 16%-26% on average.
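The cooling degree-hour and heating degree-day indices used above have simple definitions: temperature excesses over (or deficits under) a base temperature, summed over the year. A minimal sketch on synthetic hourly temperatures (the 27 °C and 18 °C bases follow the abstract; the data are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic hourly outdoor temperatures for one year (8760 h): a seasonal
# cycle plus a diurnal cycle plus noise, standing in for measured data.
hours = np.arange(8760)
t_out = (17.0 + 11.0 * np.sin(2 * np.pi * (hours / 8760 - 0.25))
         + 4.0 * np.sin(2 * np.pi * hours / 24)
         + rng.normal(0.0, 2.0, 8760))

cdh_27 = np.maximum(t_out - 27.0, 0).sum()        # cooling degree-hours, base 27 C
daily_mean = t_out.reshape(365, 24).mean(axis=1)
hdd_18 = np.maximum(18.0 - daily_mean, 0).sum()   # heating degree-days, base 18 C
print(f"CDH (base 27 C): {cdh_27:.0f} degC·h, HDD (base 18 C): {hdd_18:.0f} degC·d")
```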

14.
Fluvial landforms and sediments can be used to reconstruct past hydrological conditions over different time scales once allowance has been made for tectonic, base-level and human complications. Field stratigraphic evidence is explored here at three time scales: the later Pleistocene, the Holocene, and the historical and instrumental period. New data from a range of field studies demonstrate that Croll-Milankovitch forcing, Dansgaard-Oeschger and Heinrich events, enhanced monsoon circulation, millennial- to centennial-scale climate variability within the Holocene (probably associated with solar forcing and deep ocean circulation) and flood-event variability in recent centuries can all be discerned in the fluvial record. Although very significant advances have been made in river system and climate change research in recent years, the potential of fluvial palaeohydrology has yet to be fully realized, to the detriment of climatology, public health, resource management and river engineering.

15.
This paper presents a general review of governmental activities in Japan on reference materials and the evaluation of thermophysical property data, and then describes recent developments at the National Research Laboratory of Metrology (NRLM) in the field of thermophysical properties and related standards. Regarding reference materials, past and present activities organized by the government and a few related associations are reviewed from the point of view of establishing traceability systems; on the evaluation of thermophysical properties, the framework of collaborative research for establishing database systems shared over networks is described. Recent NRLM studies on the measurement and standards of thermophysical properties, as well as its calibration services, are then summarized. Presented at the Japan-United States Joint Seminar on Thermophysical Properties, October 24-26, 1983, Tokyo, Japan.

16.
This paper summarizes the extension of new market mechanisms to environmental services, explains the importance of generating price information indicative of the cost of mitigating greenhouse gases (GHGs), and presents the rationale and objectives for pilot GHG-trading markets. It also describes the steps being taken to define and launch pilot carbon markets in North America and Europe and reviews the key issues related to incorporating carbon sequestration into an emissions-trading market. There is an emerging consensus to employ market mechanisms to help address the threat of human-induced climate change. Carbon-trading markets are now in development around the world: a UK market is set to launch in 2002, the European Commission has called for a 2005 launch of a European Union (EU)-wide market, and a voluntary carbon market is now forming in North America. These markets represent an initial step in resolving a fundamental problem in defining and implementing appropriate policy actions to address climate change. Policymakers currently suffer from two major information gaps: the economic value of potential damages arising from climate change is highly uncertain, and reliable information on the cost of mitigating GHGs is lacking. These twin gaps significantly reduce the quality of the climate policy debate. The Chicago Climate Exchange, for which the authors serve as lead designers, is intended to provide an organized carbon-trading market involving energy, industry and carbon sequestration in forests and farms. Trading among these diverse sectors will provide price discovery that will help clarify the cost of combating climate change when a wide range of mitigation options is employed. By closing the information gap on mitigation costs, society and policymakers will be far better prepared to identify and implement optimal policies for managing the risks associated with climate change. Practical experience in providing tradeable credits for carbon-absorbing land-use practices, especially reforestation and conservation management of agricultural soils, will also help demonstrate the viability of a new tool for financing activities that improve water quality, support biodiversity and constitute important elements of long-term sustainability in land-use management.

17.
Random effects tobit models are developed for predicting hourly crash rates using panel data refined in both the temporal and spatial domains. The proposed models address the left-censoring of crash rate data while accounting for unobserved heterogeneity across groups and serial correlation within groups. The use of panel data at refined temporal and spatial scales (hourly records and roadway segments of 1 mile on average) shows strong potential for capturing time-varying and spatially varying contributing variables that are usually ignored in traditional aggregated traffic accident modeling. One year of accident data and detailed traffic, environment, road geometry and surface condition data from a segment of I-25 in Colorado are used to demonstrate the proposed methodology. To better capture the distinct characteristics of crashes, two separate models, one for daytime and another for nighttime, were developed. The results show major differences in the factors contributing to crash rates between the daytime and nighttime models, implying a considerable need to investigate daytime and nighttime crashes separately using refined-scale data. After the models are developed, a comprehensive review of the various contributing factors is given, followed by discussion of some interesting findings.
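A pooled tobit likelihood conveys the left-censoring part of the model: censored observations contribute the probability mass at the censoring point, uncensored ones the usual normal density. The sketch below omits the paper's random effects and serial correlation, and uses synthetic data:

```python
import numpy as np
from scipy import optimize, stats

def tobit_negloglik(params, X, y, censor=0.0):
    """Negative log-likelihood of a left-censored (tobit) regression.

    Pooled model only; the paper adds random effects for group heterogeneity
    and serial correlation, which this sketch omits.
    """
    *beta, log_sigma = params
    sigma = np.exp(log_sigma)
    mu = X @ np.asarray(beta)
    censored = y <= censor
    ll = np.where(
        censored,
        stats.norm.logcdf((censor - mu) / sigma),          # mass at the limit
        stats.norm.logpdf((y - mu) / sigma) - np.log(sigma),  # usual density
    )
    return -ll.sum()

# Synthetic "hourly crash rate" data, left-censored at zero.
rng = np.random.default_rng(4)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
latent = X @ np.array([-0.5, 1.0]) + rng.normal(0.0, 1.0, 500)
y = np.maximum(latent, 0.0)

res = optimize.minimize(tobit_negloglik, x0=[0.0, 0.0, 0.0], args=(X, y))
print("beta:", res.x[:2], "sigma:", np.exp(res.x[2]))
```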

18.
In December 1990 the U.S. Environmental Protection Agency sponsored a workshop to discuss the applicability of an interim "toxicity equivalency factor" (TEF) approach to assessing risks posed by exposures to complex mixtures of polychlorinated biphenyls (PCBs). The group concluded that application of the TEF approach to PCBs would be less straightforward than it was in the case of chlorinated dibenzo-p-dioxins and dibenzofurans (CDDs/CDFs). It appears that "dioxin"-like properties of some PCB congeners are amenable to a TEF treatment that is compatible with that used for CDDs/CDFs. Such a scheme also seems to have utility in assessing risks to wildlife. Other non-"dioxin"-like toxic endpoints (e.g., neurotoxicity) appear to have a different structure-activity-related mechanism-of-action that requires a separate TEF scheme. The workshop identified data gaps in toxicology and analytical chemistry that hinder adoption of proposed TEF schemes for PCBs at this time.
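The TEF approach itself reduces to a weighted sum: each congener concentration is multiplied by its toxicity equivalency factor, and the products are summed into a toxic equivalent (TEQ). A minimal sketch with invented concentrations and placeholder TEF values (not values endorsed by the workshop):

```python
# Hypothetical mixture: congener -> (concentration in pg/g, assumed TEF).
# Both numbers are illustrative placeholders, not regulatory values.
mixture = {
    "PCB-126": (12.0, 0.1),
    "PCB-169": (30.0, 0.03),
    "PCB-118": (800.0, 0.0001),
}

# TEQ = sum over congeners of concentration * TEF.
teq = sum(conc * tef for conc, tef in mixture.values())
print(f"TEQ: {teq:.2f} pg TEQ/g")   # 12*0.1 + 30*0.03 + 800*0.0001 = 2.18
```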

19.
The economics of abrupt climate change
The US National Research Council defines abrupt climate change as a change of state that is sufficiently rapid and sufficiently widespread in its effects that economies are unprepared or incapable of adapting. This may be too restrictive a definition, but abrupt climate change does have implications for the choice between the main response options: mitigation (which reduces the risks of climate change) and adaptation (which reduces the costs of climate change). The paper argues that by (i) increasing the costs of change and the potential growth of consumption, and (ii) reducing the time to change, abrupt climate change favours mitigation over adaptation. Furthermore, because the implications of change are fundamentally uncertain and potentially very high, it favours a precautionary approach in which mitigation buys time for learning. Adaptation-oriented decision tools, such as scenario planning, are inappropriate in these circumstances. Hence learning implies the use of probabilistic models that include socioeconomic feedbacks.

20.
A small error in aerosol measurements can considerably influence our understanding of aerosols' impact on climate. To better simulate aerosol effects on the earth's radiation budget, accurate chemical and physical characterization of aerosol particles has been a key interest of aerosol research for the last several decades. Recent advances in the chemical characterization of aerosols at bulk and molecular levels, and in their physical characterization, such as size distribution, hygroscopicity, and cloud condensation nuclei (CCN) activity, have improved our ability to understand aerosol sources, concentration distributions, atmospheric processing and potential climate impacts. Beyond the inherent complexity of atmospheric aerosols, the limited availability of aerosol certified reference materials, traceability data and measurement protocols makes it a challenging task to measure aerosol properties with low uncertainty. Recent developments in aerosol analytical techniques (on-line and off-line), including gas chromatography/flame ionization detection (GC/FID)/mass spectrometry (MS)/isotope ratio mass spectrometry (irMS), ion chromatography (IC), organic carbon/elemental carbon (OC/EC) and water-soluble organic carbon (WSOC) analyzers, and physical measurements using the scanning mobility particle sizer (SMPS), hygroscopicity tandem differential mobility analyzer (HTDMA) and cloud condensation nuclei counter (CCN-C), are discussed together with the metrological issues in these measurements. The importance of aerosol metrology is highlighted using data obtained from laboratory studies and aerosol field campaigns.
