Similar Articles
20 similar articles found (search time: 15 ms)
1.
Tidal wetlands in California are mostly estuarine salt marshes characterized by tidal channels and mudflats that are flooded and drained on a semidiurnal basis. Depths are rarely greater than 2 or 3 m, except where dredging occurs for harbor operations, and lengths from head to mouth are usually in the range of 1–10 km. This paper presents a coupled set of models for prediction of flow, solute transport, and particle transport in these systems. The flow and solute transport models are based upon depth-integrated conservation equations, while the particle transport model is quasi-three-dimensional. Common to these models is the assumption that a turbulent boundary layer extends vertically from the bed and can be described by the law of the wall. This feature of the model accounts for: (1) momentum transfer to the bed, (2) longitudinal dispersion of dissolved material based on the work of Elder (1959), and (3) advection and turbulent diffusion of particles in three dimensions. A total variation diminishing finite volume scheme is used to solve the depth-integrated equations. Using this model, we show that dispersion can be predicted accurately with physically meaningful mixing coefficients. Calibration is therefore directed at adjusting bed roughness, which scales both the rate of advection and the rate of dispersion.
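Elder's (1959) result ties the longitudinal dispersion coefficient to depth and shear velocity, and the shear velocity itself follows from the law-of-the-wall profile the model assumes. A minimal sketch (the roughness length `z0` and the flow values are illustrative, not taken from the paper):

```python
import math

def shear_velocity(depth_mean_velocity, depth, z0):
    """Depth-averaged law of the wall: U = (u*/kappa) * (ln(h/z0) - 1)."""
    kappa = 0.41  # von Karman constant
    return kappa * depth_mean_velocity / (math.log(depth / z0) - 1.0)

def elder_dispersion(depth, u_star):
    """Elder (1959): longitudinal dispersion coefficient K = 5.93 * h * u*."""
    return 5.93 * depth * u_star

# Illustrative values: 2 m deep channel, 0.5 m/s mean velocity, z0 = 1 mm
u_star = shear_velocity(0.5, 2.0, 0.001)   # shear velocity in m/s
K = elder_dispersion(2.0, u_star)          # dispersion coefficient in m^2/s
```

This is the sense in which bed roughness (here via `z0`) scales both advection and dispersion at once.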

2.
In order to implement efficient and effective management strategies for coastal water quality in Southern California, it is important to consider the relative pollutant contributions from urban dry-weather flow (DWF) and wet-weather flow (WWF). This study uses historical flow and water-quality monitoring data together with computer modeling to characterize the annual DWF and WWF discharges from an urban catchment in Los Angeles, Calif. The DWF and WWF pollutant loading of the trace metals copper, lead, nickel, and chromium is predicted for six water years from 1991 to 1996. The results indicate that DWF contributes a considerable amount of flow and pollutants: approximately 9–25% of the total annual Ballona Creek flow volume is DWF. The simulations indicate that DWF accounts for 54, 19, 33, and 44% of the average annual load of total chromium, copper, lead, and nickel, respectively. In the dry season, the simulations indicate that DWF accounts for 89, 59, 58, and 90% of the load of total chromium, copper, lead, and nickel, respectively. This research suggests that DWF controls may be an important part of pollution mitigation plans for urban stormwater drainage systems in Southern California.
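Load percentages of this kind come from flow-weighted load sums of the form load = Σ (flow × concentration). A toy sketch of that bookkeeping, with hypothetical records rather than the Ballona Creek data:

```python
# Hypothetical daily records: (flow in m3/day, concentration in mg/L, wet-weather flag)
records = [
    (5.0e4, 0.02, False),   # dry-weather day
    (5.0e4, 0.02, False),   # dry-weather day
    (8.0e5, 0.05, True),    # storm day
]

def annual_loads(records):
    """Split total pollutant load (kg) into dry- and wet-weather parts.
    flow [m3] * conc [mg/L] * 1000 L/m3 * 1e-6 kg/mg = flow * conc * 1e-3 kg."""
    dwf = sum(q * c * 1e-3 for q, c, wet in records if not wet)
    wwf = sum(q * c * 1e-3 for q, c, wet in records if wet)
    return dwf, wwf

dwf, wwf = annual_loads(records)
dwf_pct = 100.0 * dwf / (dwf + wwf)   # DWF share of the annual load
```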

3.
Collection of accurate, complete, and reliable field data is essential not only for active management of construction projects involving tasks such as material tracking, progress monitoring, and quality assurance, but also for facility and infrastructure management during the service lives of facilities and infrastructure systems. Limitations of current manual data collection approaches in terms of speed, completeness, and accuracy render these approaches ineffective for decision support in highly dynamic environments, such as construction and facility operations. Hence, a need exists to leverage advancements in automated field data capture technologies to support decisions during construction and facility operations. These technologies can be used not only for acquiring data about the various operations being carried out at construction and facility sites but also for gathering information about the context surrounding these operations and for monitoring the workflow of activities. These capabilities make it possible for project and facility managers to better understand the effect of environmental conditions on construction and facility operations and to identify inefficient processes. This paper presents an overview of the various applications of automated field data capture technologies in construction and facility fieldwork. These technologies include image capture technologies, such as laser scanners and video cameras; automated identification technologies, such as barcodes and Radio Frequency Identification (RFID) tags; tracking technologies, such as the Global Positioning System (GPS) and wireless local area networks (LAN); and process monitoring technologies, such as on-board instruments (OBI). The authors observe that although applications exist for capturing construction and facility fieldwork data, these technologies have been underutilized for capturing the context at fieldwork sites and for monitoring the workflow of construction and facility operations.

4.
Monitoring data from event-based monitoring systems are becoming more and more prevalent in civil engineering. An example is truck weigh-in-motion (WIM) data. These data are used in the transportation domain for various analyses, such as analyzing the effects of commercial truck traffic on pavement materials and designs. It is important that such analyses use good quality data, or at least account appropriately for any deficiencies in the quality of the data they are using. Low quality data may exist due to problems in the sensing hardware, in its calibration, or in the software processing the raw sensor data. The vast quantities of data collected make it infeasible for a human to examine all the data. The writers propose a data mining approach for automatically detecting semantic anomalies, i.e., unexpected behavior, in monitoring data. The writers’ method provides automated assistance to domain experts in setting up constraints for data behavior. The effectiveness of this method is shown by reporting its successful application to data from an actual WIM system: the experimental data collected by the Minnesota Department of Transportation at its Minnesota Road Research Project (Mn/ROAD) facilities. The constraints the expert set up by applying this method were useful for automatic anomaly detection over the Mn/ROAD data; i.e., they detected anomalies the expert cared about, such as unlikely vehicles and erroneously classified vehicles, and the misclassification rate was low enough (usually less than 3%) for a human to handle. Moreover, the expert gained insights into system behavior, such as realizing that a system-wide change had occurred. The constraints detected, for example, periods in which the WIM system reported that roughly 20% of the vehicles classified as three-axle single-unit trucks had only one axle.
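A constraint of the kind described, such as the three-axle check behind the 20% anomaly periods, can be expressed as a simple predicate over each record. A hedged sketch with made-up records (the class code "SU3" is illustrative, not an actual Mn/ROAD code):

```python
# Each record: (vehicle_class, axle_count). Hypothetical constraint, in the
# spirit of the expert-defined constraints described above: a vehicle in
# class "SU3" (three-axle single-unit truck) must report exactly 3 axles.
records = [("SU3", 3), ("SU3", 1), ("SU3", 3), ("CAR", 2)]

def check_constraint(records, vclass="SU3", expected_axles=3):
    """Return the anomalous records and the anomaly rate within the class."""
    in_class = [r for r in records if r[0] == vclass]
    anomalies = [r for r in in_class if r[1] != expected_axles]
    rate = len(anomalies) / len(in_class) if in_class else 0.0
    return anomalies, rate

anoms, rate = check_constraint(records)   # flags the one-axle "SU3" record
```

Sustained spikes in `rate` over time windows are what would surface system-wide anomalies like the one the abstract mentions.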

5.
Interpretation of the data that can be collected by automated monitoring systems on construction sites is the most significant challenge to providing useful management information. Distinct construction operations must be identified and associated with construction activities, so that they can be related to construction plans. Earlier research has indicated that construction equipment can be monitored conveniently and that individual equipment operations can be isolated and characterized. In this work, an approach has been developed for unique association of isolated equipment operations with planned construction activities. The approach is based on comparison of the values of various characteristics, calculated for each equipment operation, against preset filters of characteristic values for all expected basic construction activities. The composition of the set of characteristics is different for each data stream monitored and is dependent on the nature of the construction activities. The method has the distinct advantage of ensuring the uniqueness of each filter within the collection of filters when the system is calibrated at the start of any project, rather than during online data processing. In this way, rapid and accurate interpretation of monitored data can be guaranteed. The method was tested using data collected during construction of a high-rise office tower.
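The filter-matching idea, comparing an operation's characteristic values against preset ranges for each expected activity, can be sketched as follows; the characteristics and ranges here are hypothetical, not the paper's calibrated filters:

```python
# Hypothetical characteristic filters, one per planned activity: each maps a
# characteristic name to an allowed (low, high) range of values.
filters = {
    "concrete_pour":  {"duration_min": (20, 90), "cycles": (10, 60)},
    "rebar_hoisting": {"duration_min": (2, 15),  "cycles": (1, 8)},
}

def match_operation(op, filters):
    """Return the activities whose filter admits every characteristic of op.
    A calibrated (mutually exclusive) filter set yields at most one match,
    which is the uniqueness property the method guarantees up front."""
    matches = []
    for activity, flt in filters.items():
        if all(flt[k][0] <= op[k] <= flt[k][1] for k in flt):
            matches.append(activity)
    return matches

op = {"duration_min": 45, "cycles": 30}   # an isolated crane operation
result = match_operation(op, filters)      # -> ["concrete_pour"]
```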

6.
Evaporation pan (Ep) data are often used to estimate reference evapotranspiration (ET0) for use in water resource planning and irrigation scheduling. This paper reviews equations to estimate ET0 from Ep and provides a simpler method for making this conversion under arid climatic conditions like those in California. The new method accounts for fetch differences by first adjusting the Ep rates to the values expected for 100 m of grass fetch. It then relies on an empirical relationship between ET0 and the adjusted Ep to determine Kp values, thus eliminating the need for relative humidity and wind speed data, which are often unavailable. The method is conceptually simpler, easier to code into computer applications, and, within California, it gave better results than methods based on relative humidity and wind speed. However, the method might require calibration in more humid or windier climates.
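The two-step conversion, a fetch adjustment followed by an empirical pan coefficient Kp that depends only on the adjusted Ep, might be sketched as below; the coefficients are placeholders for illustration, not the published calibration:

```python
def et0_from_pan(ep_mm, fetch_adjust=1.0, a=0.85, b=-0.02):
    """Sketch of the two-step conversion described above (coefficients a, b
    and the fetch factor are illustrative, not the paper's values):
      1. adjust pan evaporation to a 100 m grass-fetch equivalent;
      2. apply an empirical Kp that is itself a function of the adjusted Ep,
         so relative humidity and wind speed data are not needed."""
    ep_adj = ep_mm * fetch_adjust          # step 1: fetch correction
    kp = a + b * ep_adj                    # step 2: empirical Kp(Ep)
    return kp * ep_adj                     # ET0 estimate in mm/day

et0 = et0_from_pan(8.0, fetch_adjust=0.95)   # 8 mm/day of pan evaporation
```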

7.
Data quality is extremely important where information dramatically influences the decisions being made. In the context of civil infrastructure systems, planning and management activities depend critically on data to support the efficient allocation of resources, detailed cost-benefit analysis, and informed decision-making. A Web-based tool called Web-Vacuum, which employs data-mining (DM) techniques and partially implements a two-level data-quality assessment procedure, was developed to support general-purpose data-quality assessment. The algorithms, workflow, and interfaces used in Web-Vacuum are presented. A case study using a bridge management system data set demonstrates that Web-Vacuum can assist in determining the quality of a data set.

8.
Conceptual representations of information contained in product and process models are often difficult to use for accessing data when performing engineering tasks. This is especially true if project-management information contained in product and process models needs to be made accessible on a mobile computer on construction sites. To make this information accessible, customized conceptual and visual information representations are needed. For the project-management tasks of progress monitoring and creating and administering punch lists, existing approaches that provide access to relevant project information are ineffective and inefficient in transforming information from product and process models into usable representations. As a result, these applications do not always provide information representations that are of the required structure, granularity, and type. In this paper, we describe a navigational model framework, which is an approach that effectively and efficiently creates and manages different views of information contained in product and process models. We validated this framework by implementing a prototype system and testing it through a designed set of experiments. The use cases for these experiments were established in an extensive study on the information and data collection needs on construction sites.

9.
Past project data sources provide key information for construction cost estimators. Previous research shows that relying only on one’s own experience during estimation biases estimators. Having and referring to historical databases containing objective information on what happened in past projects is essential for reducing these biases. The first step toward developing useful project history databases is to understand what information estimators require from past projects. The research described in this paper identifies estimators’ information needs through interviews, brainstorming sessions, task analyses, and card games conducted with estimators of different experience levels specializing in heavy/civil and commercial construction projects, and through exploration of the historical and standard databases available in companies to determine what is currently represented. Findings show that estimators need contextual information depicting the conditions under which specific production rates were achieved, so that they can identify which production rate would be more realistic to use when estimating the production rate of an activity in a new bid. Comparison of the contextual information needs identified in this research with the information items available in historical data sources (such as company cost reports, RSMeans, and previous studies) highlighted gaps and important opportunities for improving those sources. The identified contextual information items are significant for practitioners in developing ways to augment their existing project history databases to make them more beneficial for estimators.

10.
After a natural disaster strikes, it is vital to immediately retrieve local information about affected buildings for efficient search and rescue (S&R) operations. Although it seems convenient to store the required local information (e.g., information about the neighborhood and buildings) in a centralized database, S&R teams usually cannot access centralized databases because the information infrastructure is typically damaged or overloaded immediately after a disaster. This paper describes the search and rescue data access point (SR-DAP) system, which was designed for storing and retrieving the required local information in/from data storage units deployed at buildings. The paper presents the developed approach and empirically evaluates the two key technologies (i.e., radio-frequency identification (RFID) tags and wireless sensor nodes) used as local storage media in SR-DAP. The results of the field experiments show that current technologies can be effectively utilized in the developed system. However, the comparison highlights that current wireless sensor technology holds advantages over RFID technology.

11.
The demand for urban underground space has been increasing in the past decades to create living space and to avoid traffic congestion. A critical concern during the design and development of the underground space is the influence of construction-related ground movements on neighboring facilities and utilities. Currently, engineers can estimate ground movements using a combination of semiempirical methods and numerical model simulation. However, these advanced analyses require accurate as-built construction staging data, which most projects lack. The traditional approach of collecting construction-staging data is both labor intensive and time consuming. This paper explores the use of three-dimensional laser scanning technology to accurately capture construction activities during development of an urban excavation. The paper describes the planning, execution, and data processing phases of collecting accurate construction as-built staging information over a period of 4 months at an urban excavation site in Evanston, Ill. The resulting data provide an unprecedented level of detail on the as-built site conditions and provide much needed information to civil engineering disciplines involved in an urban excavation including construction management and structural and geotechnical engineering.

12.
Most previous investigations on tide-induced watertable fluctuations in coastal aquifers have been based on one-dimensional models that describe the processes in the cross-shore direction alone, assuming negligible along-shore variability. A recent study proposed a two-dimensional approximation for tide-induced watertable fluctuations that took into account coastline variations. Here, we further develop this approximation in two ways, by extending the approximation to second order and by taking into account capillary effects. Our results demonstrate that both effects can markedly influence watertable fluctuations. In particular, with the first-order approximation, the local damping rate of the tidal signal could be subject to sizable errors.
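For reference, the classic one-dimensional, first-order shallow-aquifer solution that such approximations extend (a standard result, not reproduced from this paper) is

```latex
h(x,t) \approx \bar{h} + A\, e^{-kx} \cos(\omega t - kx),
\qquad
k = \sqrt{\frac{n_e\,\omega}{2K\bar{h}}},
```

where $A$ and $\omega$ are the tidal amplitude and frequency, $n_e$ the effective porosity, $K$ the hydraulic conductivity, and $\bar{h}$ the mean saturated thickness; the factor $e^{-kx}$ is the local damping rate to which the abstract refers.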

13.
In this study, bioassessment data collected between 1998 and 2005 were synthesized and analyzed for streams and rivers throughout the San Diego Hydrologic Region to provide a spatial and temporal context for the results of several monitoring projects conducted over that period and to ascertain the applicability of the Southern California benthic macroinvertebrate index of biological integrity (SoCal B-IBI) to the region’s streams. The water quality of the sites studied in the region, as reflected by temporal and spatial analyses of SoCal B-IBI scores, was found to be quite poor. When streams were analyzed individually, most showed stable scores over the time frame of the study, with some showing better scores in the fall. Spatially, scores improved farther from the coast in the upstream reaches of the watersheds. This study further explored the applicability of the SoCal B-IBI to a focused geographic region by demonstrating the necessity of each component metric to the assignment of biological condition. Although all component metric scores were deemed necessary, the percent-intolerant-individuals score had the most significant effect in driving impairment. The analysis of the component metrics of the SoCal B-IBI provides useful insights into the changes in scores among the sampled sites in the region’s watersheds. Based on this study, natural resource management agencies responsible for managing water quality should incorporate regular measures of biological integrity into their water quality programs to ascertain regional and temporal trends.

14.
Labor productivity is a fundamental piece of information for estimating and scheduling a construction project. The current practice of labor productivity estimation relies primarily on either published productivity data or an individual’s experience. There is a lack of a systematic approach to measuring and estimating labor productivity. Although historical project data hold important predictive productivity information, the lack of a consistent productivity measurement system and the low quality of historical data may prevent a meaningful analysis of labor productivity. In response to these problems, this paper presents an approach to measuring productivity, collecting historical data, and developing productivity models using historical data. This methodology is applied to model steel drafting and fabrication productivities. First, a consistent labor productivity measurement system was defined for steel drafting and shop fabrication activities. Second, a data acquisition system was developed to collect labor productivity data from past and current projects. Finally, the collected productivity data were used to develop labor productivity models using such techniques as artificial neural networks and discrete-event simulation. These productivity models were developed and validated using actual data collected from a steel fabrication company.
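At its core, a consistent productivity measure of the kind described is a ratio of installed quantity to direct work-hours, aggregated across projects into a data set a predictive model can be trained on. A hypothetical sketch (units and figures are illustrative):

```python
def labor_productivity(quantity, work_hours):
    """Productivity as output quantity per direct work-hour,
    e.g. tons of steel fabricated per hour."""
    if work_hours <= 0:
        raise ValueError("work-hours must be positive")
    return quantity / work_hours

# Hypothetical project history: (tons fabricated, direct work-hours)
history = [(120.0, 800.0), (95.0, 700.0), (150.0, 1100.0)]
rates = [labor_productivity(q, h) for q, h in history]
mean_rate = sum(rates) / len(rates)   # baseline rate in tons per work-hour
```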

15.
Currently, there is no clear understanding of which project factors have a statistically significant relationship with highway construction duration. Other industry sectors have successfully used statistical regression analysis to identify and model the project parameters related to construction duration. Although the need for such work in highway construction is recognized, very few studies have attempted to identify duration-influential parameters and their relationship with highway construction duration. The purpose of this work is to describe the highway construction data needed for such a study, identify a data source, collect early-design project data, and prepare the data for statistical regression analysis. The Virginia Department of Transportation is identified as the optimal data source. The data collected include historical contract- and project-level parameters. To prepare for statistical regression analysis, the collected contract duration is converted to construction duration by a seasonal adjustment process that removes historically typical nonworking days.
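A seasonal adjustment of the kind described can be sketched as below; the rule (skip weekends and an assumed winter shutdown) is purely illustrative, not the actual VDOT calendar:

```python
import datetime

def construction_duration(start, contract_days, nonworking_months=(12, 1, 2)):
    """Convert contract duration (calendar days) to construction duration by
    counting only days that are neither weekends nor in historically
    nonworking winter months (an assumed rule, for illustration)."""
    working = 0
    day = start
    for _ in range(contract_days):
        if day.weekday() < 5 and day.month not in nonworking_months:
            working += 1
        day += datetime.timedelta(days=1)
    return working

# A 120-calendar-day contract starting Nov 1, 2023 spans the winter shutdown,
# so only the November weekdays count.
d = construction_duration(datetime.date(2023, 11, 1), 120)
```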

16.
Vacuum bagging and pressure bagging are established techniques used by the composites industry for fabricating components. This paper describes a study that explored the adaptation of these techniques for improving the FRP-concrete bond in the repair of partially submerged piles. Prototype vacuum bagging and pressure bagging systems were developed, and bond improvement was assessed from the results of pullout tests on full-size piles repaired under simulated tide in the laboratory. Pressure bagging gave better bond and was found to be simpler because it did not require an airtight seal. A field demonstration project was conducted in which pressure bagging was used in combination with two different GFRP systems to repair two corroding piles supporting the Friendship Trails Bridge across Tampa Bay. Inspection of the postcured wrap showed no evidence of air voids. The study demonstrates that techniques developed by the composites industry may be readily adapted to provide effective and inexpensive means for improving FRP-concrete bond.

17.
The transportation infrastructure is key to economic development in the United States. Providing a high level of serviceability through periodic inspection and maintenance is important in keeping the transportation system operational and in avoiding major replacement efforts. Of particular importance is the inventory of bridges in the national transportation infrastructure, due to their high cost and direct impact on public safety. The focus of this paper is on information management in support of bridge maintenance functions. Particularly, the research project discussed in the paper addresses the need for inclusion of construction as-built data in the bridge management database along with the periodic inspection and maintenance data. Attention to this type of data has been lacking. Therefore, the paper promotes bridge as-built data, discusses its role in bridge management, and demonstrates the proper design of an as-built information management model and system that is integrated with existing standard bridge management systems such as Pontis.

18.
The geotechnical earthquake engineering community often adopts empirically derived models. Unfortunately, the community has not embraced the value of model validation, leaving practitioners with little information on the uncertainties present in a given model and the model’s predictive capability. In this study, we present a machine learning technique known as support vector regression (SVR), together with rigorous validation, for modeling lateral spread displacements, and we outline how this information can be used to identify gaps in the data set. We demonstrate the approach using free-face lateral displacement data. The results illustrate that the SVR model has better predictive capability than the commonly used empirical relationship derived using multilinear regression. Moreover, the analysis of the SVR model and its support vectors helps in identifying gaps in the data and defining the scope for future data collection.
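One simple way to see the data-gap problem is to measure how far a query point lies from its nearest training point in normalized input space: a model is extrapolating wherever that distance is large. This is a stand-in illustration of the idea, not the paper's support-vector-based analysis:

```python
def nearest_distance(x, training_inputs):
    """Euclidean distance from x to its nearest training point."""
    return min(sum((a - b) ** 2 for a, b in zip(x, t)) ** 0.5
               for t in training_inputs)

# Hypothetical normalized model inputs (e.g. free-face ratio, shaking
# intensity, fines content); values are made up for illustration.
train = [(0.1, 0.2, 0.3), (0.2, 0.25, 0.35), (0.8, 0.9, 0.7)]

query = (0.5, 0.6, 0.5)            # falls in a gap of the sparse data set
gap_threshold = 0.3                # illustrative cutoff
is_gap = nearest_distance(query, train) > gap_threshold
```

Flagged regions define the scope for future data collection, much as the abstract describes for the support vectors of a trained SVR model.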

19.
This study investigates the feasibility of using turbidity (T) as a surrogate for suspended sediment concentration (C) in an irrigation-dominated watershed in southeastern California. A nonlinear T–C relationship was developed and evaluated using two independent sets of data obtained by physical sampling and a laboratory turbidimeter. The relationship was interpreted in terms of the heterogeneous particle size distribution in the samples. The effects of spatial and temporal variation of particle sizes and water colors on the relationship were examined. Further, possible effects of laboratory procedures on the relationship, such as time delay of sample measurement and calibration of T for C using lab-prepared samples, were analyzed. The study showed that the variation of particle size distribution is the key factor controlling the T–C relationship. Water color and time delay before sample analysis did not significantly affect the turbidity values, whereas calibration against lab-prepared samples may bias the T–C relationship. It is concluded that turbidity may serve as a surrogate for suspended sediment concentration in such irrigation-dominated watersheds in arid regions, though the T–C relationship has to be established with care.
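Nonlinear T–C relationships are commonly fitted as a power law, C = a·T^b, via ordinary least squares in log-log space. A sketch with synthetic data (the study's actual functional form may differ):

```python
import math

def fit_power_law(turbidity, concentration):
    """Fit C = a * T**b by ordinary least squares on log-transformed data,
    a common form for nonlinear turbidity-sediment rating relationships."""
    xs = [math.log(t) for t in turbidity]
    ys = [math.log(c) for c in concentration]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic samples generated from C = 2 * T**1.5, so the fit is exact.
T = [1.0, 2.0, 4.0, 8.0]
C = [2.0 * t ** 1.5 for t in T]
a, b = fit_power_law(T, C)
```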

20.
When estimating activity production rates, cost estimators rely on historical production rates. To produce realistic and useful cost estimates based on historical production rates, such production rate data should be augmented with historical contextual information that depicts the conditions under which activity production rates were achieved in past projects. This information is needed to determine which production rate to use among alternates for a similar activity in a new bid. Estimators need contextual information especially when they are unfamiliar with the work being estimated. Hence, such information items need to be identified, collected, and stored for estimators’ use in new projects. This paper details a construction-method-specific, extensible approach developed to enable cost estimators to define the contextual information items that need to be collected on job sites and stored as part of project histories. Based on this approach, the writers implemented a prototype system, called ContextGen, and performed user tests with estimators of different experience levels. Results showed that the developed approach captures the method-specific information needs of estimators and is extensible to incorporate new contextual information items that can have different data representations. The developed approach is also precise in retrieving contextual information items specific to a construction method from a set of predefined contextual information items available in a library.
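An extensible, method-specific contextual-item schema of the kind described might look like the following; all names and fields are illustrative, not ContextGen's actual design:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class ContextualItem:
    """One contextual information item tied to a construction method.
    Items may carry different data representations via `dtype`/`value`."""
    name: str            # e.g. "soil type", "crew size"
    method: str          # the construction method the item applies to
    dtype: type          # representation of the value
    value: Any

# A small library of predefined items (hypothetical).
items = [
    ContextualItem("soil type", "excavation", str, "stiff clay"),
    ContextualItem("crew size", "excavation", int, 6),
    ContextualItem("pour rate", "concrete placement", float, 12.5),
]

# Retrieve the items specific to one construction method from the library.
excavation_context = [i for i in items if i.method == "excavation"]
```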


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号