Similar Documents (20 results)
1.
BACKGROUND: Quality measurement in long term care (LTC) presents many challenges: the lack of a uniform definition of quality, the existence of multiple domains for measurement, a multitude of potential perspectives, and regulatory influences that emphasize measurement only of poor quality. Research has yet to resolve these measurement issues; in the meantime, operators of long term care facilities must use the current state of the art in quality measurement as the basis for their quality improvement efforts. A project was commissioned by the management of a large integrated delivery system with a robust network of LTC facilities, who wished to implement a continuous quality improvement process based on a measurement tool that provides a comprehensive, resident-centered assessment of quality. The objectives of this project, therefore, were to identify domains of quality, to select and adapt validated instruments for measurement within each domain, to pilot test a data collection process, and to develop an operational quality profiling report format for LTC facilities. DESIGN AND METHODS: Using an expert panel and the LTC research literature, an operational measurement tool was developed, consisting of four domains of quality: organizational, clinical, environmental, and social. DISCUSSION: A pilot study conducted in two nursing facilities demonstrated that the data collection process could be operationalized within tight resource and budgetary constraints. The operational quality assessment tool enables management to take a consistent view of diverse institutions, focusing in detail on quality of care as residents perceive it. The tool allows evaluation of trends over time and comparison to external norms.

2.
BACKGROUND: Health status data are an increasingly important component of outcomes assessment and can be used to facilitate quality assessment and improvement efforts. An enormous challenge to the use of health status data among hospitalized patients, however, is collecting baseline data at the time of treatment, an essential component for risk-adjusting subsequent outcomes. The Mid America Heart Institute of Saint Luke's Hospital (Kansas City, Mo) attempted to integrate the collection of health status assessments into the process of performing coronary revascularization. THE DATA COLLECTION STRATEGY: The data collection strategy was developed for each admission portal: elective outpatients (admissions for same-day procedures), inpatients, and emergent cases. Health status data were collected on all patients with coronary artery disease who were receiving a percutaneous coronary intervention or coronary artery bypass graft, with no disruption to physician scheduling or nursing staff. RESULTS: In general, patients were willing to complete the health status survey. Despite initial efforts to educate the hospital staff about the goal and purpose of health status assessment, staff members who were unaware of the uses of these data tended to minimize their value. Providing examples of how to use these data in the context of each staff member's specific occupational role facilitated buy-in for the project. EPILOGUE: After the pilot study, which lasted until June 1999, data were collected continuously for 18 months, through August 2000, even after external grant funding for the project ceased. Baseline data collection finally stopped, primarily because of a failure to accommodate data collection within the routine flow of patient care by existing nursing staff.

3.
International Journal of Production Research, 2012, 50(17): 4846-4859
The article studies the characteristics of digital geographical data and their influence on the quality of the foundation available for decision-making processes. These characteristics are usually described in terms of technical parameters and adherence to technological indicators during data capture; our system adds the user's view to the assessment. We define user requirements on the data and propose a system for evaluating them. The assessment builds on the ISO 19113 standard and on the theory of value analysis, joined into a single comprehensive system for data evaluation. Technical characteristics are evaluated mainly by the degree to which qualitative indicators are met, e.g., compliance with horizontal and vertical mean square error limits, or completeness of the expected attribute information. Fulfilment of user requirements is expressed by the level of user satisfaction with the particular product, established through a survey among users; the weights of the individual evaluation criteria are set according to the specific types of tasks being solved. The evaluation system is supplemented by a calculation of the costs needed to obtain the data. One can work either with the complete database or weigh the contributions of particular data groups (e.g., roads, residential buildings) to the overall quality of the final product. Together, the cost-calculation and quality-evaluation systems enable optimal use of funds, or of the available time, for obtaining quality data. The complete system is demonstrated in a pilot project modelling terrain passability for a heavy military vehicle.
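The weighted quality-versus-cost logic described above can be made concrete with a short sketch. The criterion names, weights, scores, and cost below are hypothetical illustrations, not values from the study:

```python
# Minimal sketch of a weighted quality-vs-cost evaluation in the spirit of
# the ISO 19113 / value-analysis approach described above. All criterion
# names, weights, scores, and costs are hypothetical.

def quality_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted aggregate of per-criterion satisfaction levels in [0, 1]."""
    total_w = sum(weights.values())
    return sum(weights[c] * scores[c] for c in scores) / total_w

# Per-criterion satisfaction for one dataset (e.g., from accuracy checks,
# attribute completeness, and user surveys).
scores = {"positional_accuracy": 0.9, "completeness": 0.8, "user_satisfaction": 0.7}
# Task-specific weights (e.g., for a terrain-passability analysis).
weights = {"positional_accuracy": 0.5, "completeness": 0.3, "user_satisfaction": 0.2}

cost = 12_000.0  # acquisition cost of the dataset, arbitrary currency units
q = quality_score(scores, weights)
print(f"quality = {q:.2f}, quality per unit cost = {q / cost:.2e}")
```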

4.
Implementation of a Quality Systems approach to making defensible environmental program decisions depends on multiple, interrelated components. Often, these components are developed independently and implemented at various facility and program levels in an attempt to achieve consistency and cost savings. The U.S. Department of Energy, Office of Environmental Management (DOE-EM) focuses on three primary system components to achieve effective environmental data collection and use: (1) Quality System guidance, which establishes the management framework to plan, implement, and assess the work performed; (2) a standardized statement of work for analytical services, which defines data generation and reporting requirements consistent with user needs; and (3) a laboratory assessment program that evaluates the adherence of the work performed to the defined needs, e.g., documentation and confidence. This paper describes how DOE-EM fulfills these requirements and realizes cost savings through participation in interagency working groups and through integration of system elements as they evolve.

5.
The quality evaluation and assessment of radiological data is the final step in the overall environmental data decision process. It is performed outside the laboratory, generally without the radiochemist's involvement. However, with the laboratory quality management systems in place today, the data packages of radiochemical analyses are frequently far more complex than the project/program manager can effectively handle, and with little input from radiochemists the potential for misinterpretation of radiological data is increasing. Quality evaluation and assessment of radiochemistry data consists of making three decisions for each sample and result, bearing in mind that the laboratory reports a value for every analysis together with its uncertainty: (1) is the radionuclide of concern detected (each data point always has a number associated with it)? (2) is the uncertainty associated with the result greater than would normally be expected? and (3) if the laboratory rejected the analysis, are there serious consequences for other samples in the same group? The need for the radiochemist's expertise in this process is clear; it is needed particularly in radiochemistry because of the lack of redundancy in the analytical data. This paper describes the role of the radiochemist in the quality assessment of radiochemical data for environmental decision-making.
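A minimal sketch of the three per-result decisions, assuming one common detection convention (comparing the result against roughly 1.645 times its one-sigma uncertainty); the thresholds and field names are illustrative assumptions, not prescriptions from the paper:

```python
# Sketch of the three per-result decisions described above. The detection
# test (result vs. ~1.645 * one-sigma uncertainty) is one common
# convention, not the only one; the 15% expected relative uncertainty is
# an illustrative assumption.
from dataclasses import dataclass

@dataclass
class Result:
    activity: float      # reported activity concentration
    uncertainty: float   # one-sigma uncertainty reported by the lab
    rejected: bool       # lab rejected the analysis

def assess(r: Result, expected_rel_u: float = 0.15):
    detected = r.activity > 1.645 * r.uncertainty                          # decision 1
    u_too_large = r.uncertainty > expected_rel_u * max(r.activity, 1e-12)  # decision 2
    review_batch = r.rejected          # decision 3: re-examine the rest of the batch
    return detected, u_too_large, review_batch

print(assess(Result(activity=2.4, uncertainty=0.5, rejected=False)))
```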

6.
The Corps of Engineers works with local restoration advisory boards (RABs) to exchange information and develop plans for restoring closed military bases for civilian reuse. RAB meetings on the progress of environmental assessment and restoration of former defense sites can be contentious, given the complex technical nature of the information to be shared and the personal stake community members have in ensuring that contaminated areas are restored for safe use. A prime concern of community representatives is often the quality of the data used to make environmental decisions; laboratory case narratives and data flags may suggest laboratory errors and low data quality to readers who do not understand their full meaning. RAB members include representatives from local, state, and tribal governments, the Department of Defense, the Environmental Protection Agency, and the local community. The Corps of Engineers is usually represented by project technical and management personnel, but these individuals may not have sufficient expertise in the project's quality assurance components and laboratory data quality procedures to fully address community concerns about data quality. Having a quality assurance professional communicate this information to the RAB could resolve some of the questions members have about the quality of acquired data and the proper use of analytical results, and could increase community trust that appropriate restoration decisions are being made. This paper details the effectiveness of including a quality assurance professional in RAB discussions of laboratory data quality and project quality management.

7.
Development of energy-efficient data collection and routing schemes for Underwater Wireless Sensor Networks (UWSNs) is a challenging issue due to the peculiarities of the underlying physical layer technology. Since recharging or replacing sensor nodes is almost impossible after deployment, the critical issue of network lifetime maximization must be considered from the very beginning of routing scheme design. We propose a mobile sink (MS)-based data collection scheme that can extend network lifetime, taking into account power-constrained sensor nodes, partitioned networks with geographically distant data collection points, and periodic monitoring applications with delay tolerance. Lifetime extension is achieved by mitigating the 'sink neighbourhood problem' and by deferring data transmissions until the MS is at the most favourable location for data transfer. Unlike the models available for terrestrial WSNs, we consider non-zero travel time of the MS between data collection points, making our model more realistic for UWSNs, both connected and partitioned. The performance of the proposed mobility-assisted data collection scheme is thoroughly investigated using both analytical and simulation models. The analytical results are compared to those of existing models to assess their effectiveness and to investigate the trade-offs. Results show that, with a network size of 60 nodes, the network lifetime achieved by the proposed model is 188% higher than that of the static sink model and 91% higher than that of the mobile sink model (MSM). The proposed maximum lifetime routing scheme is implemented in the network simulation platform OMNeT++ to validate the analytical results and to evaluate other performance metrics that are not tractable via analytical methods. Both analytical and simulation results demonstrate the superiority of the proposed model in capturing realistic network conditions and providing useful performance indicators prior to network deployment.
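A toy calculation (not the paper's analytical model) of why the 'sink neighbourhood problem' caps network lifetime and why spreading collection over several sink stops helps; every number below is a hypothetical illustration:

```python
# Toy illustration of the 'sink neighbourhood problem': with a static
# sink, its one-hop neighbours relay all traffic and exhaust their
# batteries first; letting a mobile sink collect at several points
# spreads that relay load. All parameters are hypothetical.

NODES = 60
BATTERY = 1_000.0          # energy units per node
E_TX = 1.0                 # energy to forward one packet one hop
PACKETS_PER_ROUND = NODES  # one packet per node per round

def lifetime(num_collection_points: int, neighbours_per_point: int = 4) -> float:
    # Relay load concentrates on the sink's neighbours; more collection
    # points share that load among more nodes.
    relays = num_collection_points * neighbours_per_point
    load_per_relay = PACKETS_PER_ROUND * E_TX / relays
    return BATTERY / load_per_relay   # rounds until the busiest node dies

print("static sink:", lifetime(1), "rounds")
print("mobile sink, 4 stops:", lifetime(4), "rounds")
```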

8.
This article proposes a simple strategy for establishing sensitivity requirements (quantitation limits) for environmental chemical analyses when the primary data quality objective is to determine whether the concentration of a contaminant of concern is greater or less than an action level (e.g., an environmental "cleanup goal," regulatory limit, or risk-based decision limit). The approach assumes that the contaminant concentrations are normally distributed with constant variance (i.e., the variance is not significantly dependent on concentration near the action level). When the total or "field" portion of the measurement uncertainty can be estimated, the relative uncertainty at the laboratory's quantitation limit can be used to determine requirements for analytical sensitivity. If only the laboratory component of the total uncertainty is known, the approach can still be used to identify analytical methods or laboratories that will not satisfy objectives for sensitivity (e.g., when selecting methodology during project planning).
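One way to make those assumptions numeric, in the spirit of DQO-style planning: derive the largest standard deviation the decision tolerates from a gray region and error rates, then convert it to a maximum quantitation limit via the relative uncertainty at the QL. The gray-region width, error rates, and relative uncertainty below are illustrative assumptions:

```python
# Sketch of one way to turn the abstract's assumptions into a numeric
# sensitivity requirement. Inputs are illustrative, not from the article.
from statistics import NormalDist

def max_quantitation_limit(gray_region: float, alpha: float, beta: float,
                           rel_u_at_ql: float) -> float:
    """Largest acceptable quantitation limit (QL).

    Assumes normally distributed results with constant variance near the
    action level, so the standard deviation there is approximately
    rel_u_at_ql * QL. The decision requires
        sigma <= gray_region / (z_{1-alpha} + z_{1-beta}).
    """
    z = NormalDist().inv_cdf
    sigma_required = gray_region / (z(1 - alpha) + z(1 - beta))
    return sigma_required / rel_u_at_ql

# Example: a gray region 20 ug/kg wide around the action level, 5% error
# rates on each side, and ~30% relative uncertainty at the QL.
print(max_quantitation_limit(gray_region=20, alpha=0.05, beta=0.05,
                             rel_u_at_ql=0.30))
```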

9.
Experience data on equipment reliability have become vital to many types of engineering and maintenance analysis. The consequences of incorrect design or poor maintenance may adversely affect safety, the environment, or cost in most categories of process industries, and in particular in the offshore exploration and production industry. The OREDA project is a data collection programme for the offshore industry that has been operating since the early 1980s. The programme has yielded a high level of knowledge about the specification of data, data collection methods, and the utilization of data. Some of the results and the knowledge gained from this project are presented in this paper.

10.
To address time synchronization in seismic exploration, and acquisition synchronization in other data collection fields, a synchronization scheme based on GPS timing was designed and refined. Exploiting the global coverage, real-time availability, continuity, and high precision of GPS timing signals, the local clock is calibrated against the GPS PPS (pulse-per-second) signal, and synchronized data acquisition is implemented with a microcontroller and wireless communication. By synchronizing the seismic data of each wireless telemetry channel, the scheme solves the real-time acquisition and storage problems that arise as the number of channels recorded by a seismograph grows, and in particular the key problems of independent synchronization and time-sharing processing in future digital geophones.
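A minimal software sketch of the PPS disciplining idea: count local-oscillator ticks between successive PPS edges to estimate the true tick rate, then correct timestamps with it. Interrupt capture, MCU timers, and the radio link are omitted, and all numbers are illustrative:

```python
# Software model of PPS clock disciplining: the local oscillator's tick
# count between successive GPS PPS edges measures its true rate, and
# sample timestamps are corrected accordingly. Hardware details omitted;
# the nominal frequency and drift are hypothetical.

NOMINAL_HZ = 1_000_000  # nominal local-oscillator frequency (ticks/s)

class PpsDiscipline:
    def __init__(self):
        self.last_pps_ticks = None
        self.rate = NOMINAL_HZ  # estimated true ticks per second

    def on_pps_edge(self, ticks_now: int):
        # Called once per second on the GPS PPS rising edge.
        if self.last_pps_ticks is not None:
            self.rate = ticks_now - self.last_pps_ticks  # ticks in exactly 1 s
        self.last_pps_ticks = ticks_now

    def to_seconds(self, ticks: int) -> float:
        # Convert a raw tick count into drift-corrected seconds.
        return ticks / self.rate

d = PpsDiscipline()
d.on_pps_edge(0)
d.on_pps_edge(1_000_050)        # oscillator runs 50 ppm fast
print(d.to_seconds(2_000_100))  # ~2.0 s after correction
```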

11.
HPLC/MS is a linear technique characterized by serial injection and analysis of individual samples, so parallel-format high-throughput screens for druglike properties present a significant analytical challenge. Analysis speed and system ruggedness are key requirements for bioanalysis of thousands of samples per day. The tasks involved in LC/MS analysis divide readily into three areas: sample preparation/liquid handling, LC/MS method building/sample analysis, and data processing. Several automation and multitasking strategies were developed and implemented to minimize plating and liquid handling errors, reduce dead times within the analysis cycle, and allow comprehensive review of data. Delivering multiple samples to multiple injectors gives the autosampler time to complete its wash cycles and aspirate the next set of samples while the previous set is being analyzed. A dual-column chromatography system provides column cycling and peak stacking and allows rapid throughput using conventional LC equipment. Collecting all data for a compound into a single file greatly reduces the number of data files collected, increases the speed of data collection, allows rugged and complete review of all data, and provides facile data management. The described systems have analyzed over 40,000 samples per month for two years and have the capacity for over 2,000 samples per instrument per day.
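A back-of-the-envelope sketch of the throughput gain from hiding autosampler work behind the previous run, as described above; the cycle times are hypothetical, not measurements from the article:

```python
# Why overlapping autosampler work with the previous run raises
# throughput. Cycle times are invented for illustration.

analysis_s = 30.0       # LC/MS run time per sample
wash_aspirate_s = 25.0  # autosampler wash + aspirate time per sample

serial_cycle = analysis_s + wash_aspirate_s          # one injector, no overlap
overlapped_cycle = max(analysis_s, wash_aspirate_s)  # wash/aspirate hidden behind the run

for name, cycle in [("serial", serial_cycle), ("overlapped", overlapped_cycle)]:
    print(f"{name}: {86_400 / cycle:.0f} samples/day per instrument")
```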

12.
13.
A heuristic design method for rapid volumetric magnetic resonance imaging data acquisition trajectories is presented, using a series of second-order cone optimization subproblems. Other researchers have considered non-raster data collection trajectories and under-sampled data patterns; this work demonstrates that much higher rates of under-sampling are possible with an asymmetric set of trajectories, with very little loss in resolution but the addition of noise-like artefacts. The proposed data collection trajectory, Durga, further minimizes collection time by incorporating short un-refocused excitation pulses, resulting in above 98% collection efficiency for balanced steady-state free precession imaging. The optimization subproblems are novel in that they incorporate all requirements, including data collection (coverage), physicality (device limits), and signal generation (zeroth- and higher-moment properties), in a single convex problem, which allows the resulting trajectories to exhibit a higher collection efficiency than any existing trajectory design.
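To illustrate the family of subproblems involved, here is a minimal second-order cone program for a 2-D gradient waveform that reaches a target k-space point under amplitude and slew-rate cones. It is a toy, not the Durga formulation; the hardware limits and target are assumed values, and it requires the cvxpy and numpy packages:

```python
# Minimal second-order cone subproblem of the kind volumetric trajectory
# design builds on: find the smoothest gradient waveform that hits a
# target k-space point within hardware cones. All parameters are
# illustrative assumptions, not values from the article.
import cvxpy as cp
import numpy as np

N, dt = 64, 4e-6            # samples and dwell time [s]
gamma = 42.58e6             # gyromagnetic ratio [Hz/T]
Gmax, Smax = 40e-3, 150.0   # 40 mT/m amplitude, 150 T/m/s slew (typical limits)
k_target = np.array([150.0, 60.0])  # desired k-space endpoint [1/m]

g = cp.Variable((N, 2))     # gradient waveform [T/m]
constraints = [cp.norm(g[i], 2) <= Gmax for i in range(N)]                       # amplitude cone
constraints += [cp.norm(g[i + 1] - g[i], 2) <= Smax * dt for i in range(N - 1)]  # slew cone
constraints += [g[0] == 0, gamma * dt * cp.sum(g, axis=0) == k_target]           # start at 0, hit target

# Smoothest feasible waveform: minimize total slew energy.
objective = cp.Minimize(cp.sum_squares(g[1:] - g[:-1]))
cp.Problem(objective, constraints).solve()
print("solved:", g.value is not None)
```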

14.
Software reliability assessment models in use today treat software as a monolithic block; an aversion towards 'atomic' models seems to exist. Such models appear to add complexity to the modeling and the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes, and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. The model can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at the functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term 'element of the architecture' is used here in its broadest sense, meaning anything from a single failure mode to a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development and then identifies the mechanisms for incorporating these sources of relevant data into the FASRE model.
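A sketch of the two quantitative ingredients just described: a Bayesian (Beta-Binomial) update of a failure-mode probability from test evidence, and propagation to a higher level through a fault-tree OR gate under an independence assumption. The priors and counts are hypothetical, not FASRE's numbers:

```python
# Bayesian update of failure-mode probabilities plus fault-tree OR-gate
# propagation, assuming independent failure modes. Priors and counts are
# invented for illustration.

def beta_posterior_mean(a: float, b: float, failures: int, trials: int) -> float:
    """Posterior mean of a failure probability under a Beta(a, b) prior."""
    return (a + failures) / (a + b + trials)

def or_gate(probabilities: list[float]) -> float:
    """P(any failure mode occurs), assuming independence."""
    p_none = 1.0
    for p in probabilities:
        p_none *= 1.0 - p
    return 1.0 - p_none

# Two failure modes of one function: prior beliefs updated with evidence.
p1 = beta_posterior_mean(a=1, b=99, failures=0, trials=200)  # e.g., developer tests
p2 = beta_posterior_mean(a=2, b=50, failures=1, trials=120)  # e.g., historical data
print(f"function failure probability = {or_gate([p1, p2]):.4f}")
```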

15.
Environmental regulatory policy states a goal of "sound science." The practice of good science is founded on the systematic identification and management of uncertainties, i.e., knowledge gaps that compromise our ability to make accurate predictions. Predicting the consequences of decisions about risk and risk reduction at contaminated sites requires an accurate model of the nature and extent of site contamination, which in turn requires measuring contaminant concentrations in complex environmental matrices. Perfecting analytical tests to perform those measurements has consumed tremendous regulatory attention for the past 20-30 years. Yet, despite great improvements in environmental analytical capability, complaints about inadequate data quality still abound. This paper argues that the first-generation data quality model, which equated environmental data quality with analytical quality, was a useful starting point but is insufficient, because it is blind to the repercussions of the multifaceted issues collectively termed "representativeness." To achieve policy goals of "sound science" in environmental restoration projects, the environmental data quality model must be updated to recognize and manage the uncertainties involved in generating representative data from heterogeneous environmental matrices.

16.
This paper describes the design and implementation of a versatile, open-architecture research data acquisition system using a commercially available medical ultrasound scanner. The open architecture allows researchers and clinicians to rapidly develop applications and move them relatively easily to the clinic. The system consists of a standard PC equipped with a Camera Link interface and an ultrasound scanner equipped with a research interface. The ultrasound scanner is an easy-to-use imaging device capable of generating high-quality images. In addition to supporting the acquisition of multiple data types, such as B-mode, M-mode, pulsed Doppler, and color flow imaging, the machine gives users full control over imaging parameters such as transmit level, excitation waveform, beam angle, and focal depth. Beamformed RF data can be acquired from regions of interest throughout the image plane and stored to a file with a simple button press. For clinical trials and investigational purposes, when an identical image plane is desired for both an experimental and a reference data set, interleaved data can be captured. This form of data acquisition allows switching between multiple setups while maintaining identical transducer, scanner, region of interest, and recording time. Data acquisition is controlled through a graphical user interface running on the PC, which implements an interface through which third-party software can interact with the application. A software development toolkit gives researchers and clinicians the ability to use third-party software for data analysis and flexible manipulation of control parameters. Because of the advantages in speed of acquisition and clinical benefit, research projects have successfully used the system to test and implement customized solutions for different applications. Three examples of system use are presented in this paper: evaluation of synthetic aperture sequential beamforming, transverse oscillation for blood velocity estimation, and acquisition of spectral velocity data for evaluating aortic aneurysms.

17.
The present study demonstrated the utility of the FACETS software for evaluating items at the field-test stage of item development for a clinical early childhood instrument. The research focus was the equivalence of three data collection methods for a developmental inventory for children from birth to age seven: the parent/caretaker interview, structured assessment, and observation. Data for this study came from a field test with some missing responses. The Rasch-based software FACETS was used to test the equivalence of the data collection methods and to identify items for which the methods did not provide equivalent information. Thirty-five items were studied from the adaptive domain, 26 from the communication subdomain, 34 from the motor domain, and 31 from the personal-social domain. When the methods were unconstrained, the overall test indicated that at least two of the methods were not equivalent. However, FACETS bias analyses with the method measures constrained to equality allowed the identification of a limited number of items, and associated methods, that were possibly problematic. The use of FACETS allowed the test developers to focus on items from a field test event that were inconsistent with the targeted test development model.
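For readers unfamiliar with the model behind FACETS-style analyses, here is a sketch of the dichotomous many-facet Rasch model, in which the log-odds of success decompose into examinee ability minus item difficulty minus a term for the data-collection method; the logit values below are hypothetical:

```python
# Dichotomous many-facet Rasch model underlying FACETS-style analyses:
# log-odds of success = ability - item difficulty - method severity.
# All measures below are hypothetical logit values.
import math

def p_success(ability: float, difficulty: float, method_severity: float) -> float:
    logit = ability - difficulty - method_severity
    return 1.0 / (1.0 + math.exp(-logit))

# Same child and item under the three data-collection methods: if the
# methods are equivalent, their severity terms should be near zero.
for method, severity in [("interview", 0.0), ("structured", 0.1), ("observation", -0.05)]:
    print(method, round(p_success(ability=1.2, difficulty=0.5, method_severity=severity), 3))
```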

18.
EPA and other government organizations make decisions based on environmental measurements. How good are the data? How well are the data generators performing? What measurements apply to them? How can the data life cycle processes be improved so data generators can continually provide the best data? EPA's Quality Management System requirements go beyond evaluation of environmental data quality itself to examine the systems associated with production, collection, processing (validation/verification), transfer, reduction, storage, and retrieval of data throughout a life cycle. This QMS specifies minimum quality requirements for particular environmental programs. But how can you measure and compare programs that go well beyond the minimum, towards optimal quality? This paper compares EPA's requirements for Quality Management Systems (R2) and Project Plans (R5) to the Software Engineering Institute's Capability Maturity Model Integration (CMMI℠). The CMMI model provides for growth (staged or continuous) and a comprehensive assessment that is not yet provided in EPA's R2 or R5. Properly implemented, the CMMI model serves as a quality framework for integrating and aligning organizational processes and implementing a program of continual process improvement. It identifies process areas ("things to do") and provides measures of performance ("how well things are done") against specific goals and practices. CMMI uses a systems engineering management approach, built on process models, that helps identify "how good" the system is, where goodness is defined as stages in a complete model of optimal operation. CMMI provides two methods for evaluating the goodness of a project. The Staged model provides a Maturity Level, a well-defined evolutionary plateau describing the manner in which a specified set of processes is performed; as the organization advances in maturity, these levels become more defined and processes are tailored to specific project needs. The other method, the Continuous model, allows an organization to achieve Capability Levels, which describe how well each project is doing in relation to the different process areas. There are six Capability Levels, from 0 to 5, that apply to individual process areas. Organizations using the Capability Level approach can select the individual process areas that are important to specific projects and work to improve those processes; improving capability in individual process areas raises the overall quality of the products the organization delivers. The Continuous model, unlike the Staged model, lets you pursue higher-maturity process areas before completing all of those below. Environmental measurement programs need to focus on the quality of the systems through which data are collected, processed, transferred, and so forth. DynCorp built on the quality foundation of its experience with R2 to successfully implement CMMI practices in the development of Forms II Lite and other applications, and is now migrating to the CMMI model that has evolved from the existing CMM model. The CMMI model focuses on the full cycle of requirements management, from identification, development, collection, refinement, and analysis to validation throughout a project life cycle. It also has a more refined focus on the identification, development, collection, analysis, and evaluation of meaningful measurements, so the results can be used to improve a process or product.

19.
The extent of the drinking-driving problem is most directly measured by in-depth investigations of accidents, which provide the proportion of accidents in which alcohol was a contributory factor. Indirectly, the number of accidents caused by alcohol can be estimated from the results of case-control studies, alone or in combination with other BAC distributions of accident-involved road users or of a random sample of road users. The problems associated with this indirect method are numerous, and the number of case-control studies is limited. To facilitate data collection, surrogate measures for alcohol-related and non-alcohol-related accidents can be used to monitor changes over time. The choice and use of these measures, however, is complex and poorly documented in most studies. Using multiple measures in the evaluation of drinking-driving countermeasures may yield different results, which can help in interpreting a countermeasure's effects.
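The indirect estimate mentioned above is usually computed as an attributable fraction from case-control odds ratios; a minimal sketch with invented counts (the formulas are standard epidemiology, not results from this paper):

```python
# Estimating the fraction of accidents attributable to alcohol from a
# case-control comparison of exposed (elevated BAC) and unexposed road
# users. Counts are invented for illustration.

def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    return (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)

def attributable_fraction(cases_exposed, total_cases, or_):
    """Population attributable fraction via the case-based formula."""
    p_exposed_cases = cases_exposed / total_cases
    return p_exposed_cases * (or_ - 1.0) / or_

orr = odds_ratio(cases_exposed=120, cases_unexposed=380,
                 controls_exposed=40, controls_unexposed=460)
af = attributable_fraction(cases_exposed=120, total_cases=500, or_=orr)
print(f"OR = {orr:.2f}, attributable fraction = {af:.1%}")
```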

20.
Decision-making under uncertainty describes most environmental remediation and waste management problems. Inherent limitations in knowledge concerning contaminants, environmental fate and transport, remedies, and risks force decision-makers to select a course of action based on uncertain and incomplete information. Because uncertainties can be reduced by collecting additional data, uncertainty and sensitivity analysis techniques have received considerable attention. When the costs associated with reducing uncertainty are considered in a decision problem, the objective changes: rather than determine what data to collect to reduce overall uncertainty, the goal is to determine what data to collect to best differentiate between possible courses of action or decision alternatives. Environmental restoration and waste management requires cost-effective methods for characterization and monitoring, and these methods must also satisfy regulatory requirements. Characterization and monitoring activities imply that, sooner or later, a decision must be made about collecting new field data. Limited fiscal resources for data collection should be committed only to those data that have the most impact on the decision at the lowest possible cost. Applying influence diagrams in combination with data worth analysis produces a method that not only satisfies these requirements but also gives rise to an intuitive representation of complex structures not possible in the more traditional decision-tree representation. This paper demonstrates the use of influence diagrams in data worth analysis by applying it to a monitor-and-treat problem often encountered in environmental decision problems.
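A minimal preposterior ("data worth") sketch of such a monitor-and-treat decision: compare the expected cost of deciding now with the expected cost after buying an imperfect measurement, and collect the data only if the expected value of sample information (EVSI) exceeds its price. All probabilities and costs below are hypothetical:

```python
# Data worth analysis for a monitor-and-treat decision: Bayesian
# preposterior comparison of deciding now versus deciding after an
# imperfect test. All probabilities and costs are invented.

p_contaminated = 0.30
COST = {("treat", True): 100, ("treat", False): 100,     # treatment cost regardless
        ("monitor", True): 400, ("monitor", False): 10}  # monitoring: cheap unless contaminated
sensitivity, specificity, test_cost = 0.90, 0.85, 15

def expected_cost(action: str, p: float) -> float:
    return p * COST[(action, True)] + (1 - p) * COST[(action, False)]

prior_best = min(expected_cost(a, p_contaminated) for a in ("treat", "monitor"))

# Preposterior analysis: average the best posterior decision over outcomes.
p_pos = sensitivity * p_contaminated + (1 - specificity) * (1 - p_contaminated)
post_pos = sensitivity * p_contaminated / p_pos
post_neg = (1 - sensitivity) * p_contaminated / (1 - p_pos)
with_data = (p_pos * min(expected_cost(a, post_pos) for a in ("treat", "monitor"))
             + (1 - p_pos) * min(expected_cost(a, post_neg) for a in ("treat", "monitor")))

evsi = prior_best - with_data
print(f"EVSI = {evsi:.1f}; collect data? {evsi > test_cost}")
```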

