Similar literature
Found 20 similar articles
1.
The majority of Statistical Quality Control (SQC) microcomputer systems/packages are based on the idea of storing data and producing results, such as QC charts, from which a user may be able to make inferences regarding quality. The selection of the type of control chart (e.g. charts for variables vs. charts for attributes) and the interpretation of the results are usually left to the user. This is mainly because SQC packages are generic systems, while interpretations may be product specific.

Presented in this paper is a system that, through dialogue with the user, directs him or her to the proper chart(s) in the package. An analysis component for each chart detects statistical phenomena (both good and bad) and provides general explanations. Another component in the system accesses an accompanying knowledge database (DB), keyed by phenomena, which provides possible explanations as well as advice. The user may add to the knowledge base at any time, assisted by the statistical DB Management Subsystem.

The elements of the system are the IIE Microsoftware Statistical Quality Control package and a number of “add-on” routines supporting DB creation and the inference engine.
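The chart computations such a package automates are standard; as a hedged illustration, here is a minimal X-bar (variables) control-limit calculation in Python. The function name and the A2 constant for subgroups of five are illustrative choices, not details of the IIE package:

```python
# Minimal X-bar control-chart limits for variables data (illustrative only).
# a2 = 0.577 is the standard Shewhart constant for subgroup size n = 5.
def xbar_limits(subgroups, a2=0.577):
    """Return (LCL, center line, UCL) for subgroup means."""
    means = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbar = sum(means) / len(means)      # grand mean (center line)
    rbar = sum(ranges) / len(ranges)    # average subgroup range
    return xbar - a2 * rbar, xbar, xbar + a2 * rbar

lcl, cl, ucl = xbar_limits([[9.8, 10.1, 10.0, 9.9, 10.2],
                            [10.0, 10.3, 9.7, 10.1, 9.9]])
```

A point falling outside [LCL, UCL] is the basic "bad phenomenon" a chart-analysis component would flag.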


2.
In June 1989, Georgetown University Hospital (GUH) implemented an internally developed, Model 204 DBMS-based comprehensive decision support tool, the Financial Performance Report (FPR). FPR was designed to bring order out of management-reporting chaos by integrating labor productivity reporting and cost accounting in a progressively detailed display that provides the basis for “Total Productivity Management”. This paper describes the philosophy that led to developing the tool, the objectives, the system components, and the principal features of the report.

3.
The genomics, proteomics, clinical, and drug discovery laboratories have a growing need to maintain valuable samples at ultra-low (−80°C) temperatures in a validated, secure environment. Automated sample processing systems have until now required manual (off-line) storage of samples at −80°C, reducing system reliability and speed. Both of these important needs are addressed by the Sample Process Management System being introduced by BIOPHILE Inc. Conventional sample management processes, such as storage, retrieval, and cataloging, are increasingly strained by growing sample populations; sample types, access requirements, and storage requirements vary; security and inventory procedures are implemented manually; and the evolving technologies present in the laboratory cannot interface with conventional manual storage techniques. Addressing these limitations, the primary benefits of BIOPHILE's solutions are:
• Fully validated sample management process that coordinates the life-cycles of samples and their related data.
• Robotic technology to securely store and retrieve samples, improving their accessibility and stability. Thermal shock is reduced, improving sample longevity and quality. The robotic technology allows integration with larger automation systems.
• A process program to develop a Sample Management Strategy. This strategy is developed by analyzing long-term research goals, current baseline processes, and identification of current sample life cycles. A full validation documentation package can be generated, providing a high level of quality assurance.
• Improved sample visibility and quality assurance: automated sample population cataloging, with controlled sample management access and security.

4.
MICOSAS (pronounced “my c sas”, from Mixed Integer Continuous Scheduling Assist System) schedules the production of nearly one thousand parts on twenty-eight machines so as to minimize the total number of machine shifts assigned. The microcomputer software system consists of three custom-designed modules and a commercially available mathematical programming package. It enables the user to obtain an initial feasible schedule for a typical scheduling scenario in less than an hour. The user can then perturb the parameters defining the mathematical model in seeking better schedules with about the same turnaround time.

5.
L M Schleifer  O G Okogbaa 《Ergonomics》1990,33(12):1495-1509
Psychophysiological effects of computer system response time (slow vs. rapid) and method of pay (incentive vs. nonincentive) were assessed in a computer-based data entry task among forty-five professional typists. Cardiovascular responses (i.e., heart rate and blood pressure) were monitored on a regular basis over four consecutive workdays. Heart rate and blood pressure did not vary significantly with slow or rapid response times. Incentive pay, however, significantly increased blood pressure and decreased heart rate variability across the workdays compared to nonincentive pay. Irrespective of response time or method of pay, performance of the data entry task for sustained periods of time was associated with reduced heart rate and increased heart rate variability. This temporal effect was indicative of reduced effort or increased mental fatigue. The results of this study suggest that incentive pay programmes in data entry work may produce stress-related physiological reactivity among healthy workers.

6.
《Ergonomics》2012,55(12):1495-1505
Psychophysiological effects of computer system response time (slow vs. rapid) and method of pay (incentive vs. nonincentive) were assessed in a computer-based data entry task among forty-five professional typists. Cardiovascular responses (i.e., heart rate and blood pressure) were monitored on a regular basis over four consecutive workdays. Heart rate and blood pressure did not vary significantly with slow or rapid response times. Incentive pay, however, significantly increased blood pressure and decreased heart rate variability across the workdays compared to nonincentive pay. Irrespective of response time or method of pay, performance of the data entry task for sustained periods of time was associated with reduced heart rate and increased heart rate variability. This temporal effect was indicative of reduced effort or increased mental fatigue. The results of this study suggest that incentive pay programmes in data entry work may produce stress-related physiological reactivity among healthy workers.

7.
This paper presents a case of introducing new technology to a single stage in a maintenance operation composed of a sequence of stages. The process, thermal tile replacement, is a low-volume, high-value operation. A method for data collection at each stage, to estimate the variability in process quality, cost, and duration, is presented. The method involves identifying key product features, an accuracy measure for each, the rate of product rejection by feature, and the associated probability density functions at each stage. The method relates accuracy variability by feature (the “effect”) to the contributing stage in the process (the “cause”). Simulation is used to justify the introduction of a new technology and to predict the percentage of product conformity in “before” and “after” scenarios for the implementation of the new technology. The simulation model enables the quantification of the technology's impact on product quality, overall productivity, and the associated cost savings.

8.
Current effort at Kennedy Space Center (KSC) involves implementation of voice data entry (VDE) in the Thermal Tile Processing System (TPS). The large amounts of data collection and recording, along with advances in voice recognition and speech synthesis, justified the introduction of a voice data entry system in support of TPS.

This paper presents training and implementation strategies for the VDE system in TPS at KSC. Training strategies are tailored to different levels of the target population, with varying objectives for the training sessions. Training scenarios and training manuals are designed to support the objectives of each session. A unique feature of the VDE is that the “system” also needs to be trained for specific users.

The implementation strategy calls for running the VDE system in parallel with the current manual system over a period of four consecutive missions. This would be followed by a gradual phasing out of the manual system and phasing in of the VDE system over a span of three missions.

The methodologies used and the strategies developed can be applied to the introduction of new technologies in general, and to those related to voice and speech recognition in particular.


9.
Motivated by the need for unified management of material codes in a materials management system, a material-code management subsystem was designed with a Delphi 2007 front end (using the OCI11 component) and an Oracle 11g database back end. The system is flexible and easy to extend: the number of levels and the length of each level are user-definable, and any node at any level can serve as a leaf node. Node attributes include name, model, unit of measure, storage warehouse, shelf number, and quality grade. All references to materials in the materials management system go through material codes, which enables efficient material queries, convenient grouped statistics, and easy warehouse transfers and shelf-number relocation.
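As an illustration of the level-configurable code structure described above, here is a minimal Python sketch; the function name and the example three-level layout are assumptions for illustration, not details from the paper:

```python
# Split a flat, hierarchical material code into its per-level segments,
# given user-configurable level lengths (illustrative sketch).
def split_code(code, level_lengths):
    """Return the list of per-level segments of a material code."""
    parts, pos = [], 0
    for n in level_lengths:
        parts.append(code[pos:pos + n])
        pos += n
    return parts

# e.g. an assumed 3-level scheme: 2-digit category, 3-digit group, 4-digit item
segments = split_code("010230456", [2, 3, 4])
```

Storing the level lengths as data, rather than hard-coding them, is what makes the hierarchy user-definable as the abstract describes.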

10.
The Total Productivity Model (TPM), developed by Sumanth in 1979, has been applied to a number of situations involving manufacturing as well as service-oriented operations. The diversity of applications on the one hand, and the flexibility of microcomputers on the other, have highlighted the need for a microcomputer-based Decision Support System (DSS). This paper presents such a system for the Macintosh computer. The DSS is a highly interactive, menu-driven program that can display on screen the performance of individual operational units as well as of the firm that comprises them. The system has several convenient features for assessing the “productivity-oriented profitability” of any type of company or organization. The system logic and flow chart, data input and output formats, and sensitivity analysis are shown and discussed in the paper. A balanced critique is offered to place the system in proper perspective with respect to the structural elements of decision support systems.

11.
This paper presents a proof-theoretical framework that accounts for the entire process of register allocation: liveness analysis is proof reconstruction (similar to type inference), and register allocation is proof transformation from a proof system with unrestricted variable access to a proof system with restricted variable access. In our framework, the set of registers acts as a “working set” of the live variables at each instruction step, which changes during the execution of the code. This eliminates the ad hoc notion of “spilling”. Memory-register moves are systematically incorporated in our proof transformation process. Its correctness is a direct corollary of our construction; the resulting proof is equivalent to the proof of the original code modulo treatment of structural rules. The framework serves as a basis for reasoning about formal properties of the register-allocation process, and it also yields a clean and systematic register allocation algorithm. The algorithm has been implemented, demonstrating the feasibility of the framework.
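The liveness-analysis ingredient can be illustrated independently of the proof-theoretic machinery; below is a minimal backward pass over straight-line code in Python. The `(defs, uses)` instruction encoding is an assumed simplification, not the paper's proof-reconstruction formulation:

```python
# Backward liveness analysis over straight-line code: each instruction is
# a (defs, uses) pair of variable-name tuples (assumed encoding).
def liveness(instrs):
    """Return the live-in set immediately before each instruction."""
    live, live_ins = set(), []
    for defs, uses in reversed(instrs):
        # A variable is live-in if used here, or live-out and not redefined.
        live = (live - set(defs)) | set(uses)
        live_ins.append(frozenset(live))
    return list(reversed(live_ins))

# x = 1; y = x + 1; return y
prog = [(("x",), ()), (("y",), ("x",)), ((), ("y",))]
```

The live sets computed this way are exactly the "working sets" that a register allocator must keep in registers at each step.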

12.
Bit-plane coding (BPC) is a key component of EBCOT in the JPEG2000 encoder. To address the inefficiency of existing BPC implementations, an improved parallel hardware architecture is proposed to implement a word-level bit-plane coding algorithm. Pipelined parallel processing of coding-pass prediction and context formation is studied, enabling all bit samples of a stripe column to be coded in parallel within a single clock cycle. As the sample coefficients are coded in sequence, the three coding passes contained in each bit plane are completed in a single scan. The architecture has been implemented with ModelSim and synthesized with TSMC technology. The results show that it effectively reduces hardware cost while providing high-speed data processing, making it suitable for real-time image and video applications.

13.
Incentive mechanisms are an important research topic in peer-to-peer (P2P) networks, and this successful new model calls for an incentive mechanism built on existing, mature techniques. This paper proposes an auction-based incentive mechanism for the P2P environment. All resources in the P2P network are assigned virtual prices; a node determines through auction the price it must pay for the resources it needs and pays with virtual currency, which it earns by contributing resources. Under this mechanism each node can make maximal use of its own resources, and if every node in the P2P network does so, even selfish nodes become willing to contribute their resources.
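The abstract does not specify the pricing rule, so as a hedged sketch, here is a second-price (Vickrey) auction in Python, a common incentive-compatible choice for such virtual-currency settings; names are illustrative:

```python
# Second-price auction sketch: the highest bidder wins but pays the
# second-highest bid, which makes truthful bidding a dominant strategy.
# The paper's actual pricing rule may differ (assumption for illustration).
def run_auction(bids):
    """bids: node -> bid in virtual currency; returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

winner, price = run_auction({"node_a": 5, "node_b": 8, "node_c": 3})
```

The winner's virtual-currency balance would be debited by `price` and the resource provider's credited, closing the contribute-to-earn loop the abstract describes.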

14.
Incentive mechanisms are widely used in crowdsensing, P2P video-on-demand, opportunistic networks, and similar scenarios, and are key to improving the quality and efficiency of information network services. Existing incentive mechanisms usually rely on a bank-like trusted center, but such centers suffer from a lack of system trust and from privacy leakage, because their management is opaque and they are vulnerable to attack. Blockchain-based incentive mechanisms offer a solution to these problems: blockchains are decentralized, open, tamper-resistant, and anonymous, and can establish reliable trust among mutually unfamiliar parties, while blockchain-based cryptocurrencies have gained wide real-world attention and recognition. This paper introduces blockchain technology and the differences among cryptocurrencies; surveys the state of research on blockchain-based incentive mechanisms, including their transaction forms, classification, and evaluation criteria; and concludes with a summary and outlook on existing incentive mechanisms.

15.
A vector quantization image compression method based on wavelets and thresholding   (Total citations: 3; self-citations: 0; citations by others: 3)
毛玉星  杨士中 《计算机应用》2003,23(4):38-39,42
This paper presents a new classified vector search algorithm that speeds up codebook search while maintaining good classification quality. Given a distortion measure, the number of codewords in the codebook varies with the image, adaptively adjusting the classification result. After classification, the codebook is organized hierarchically by subband and in a specific order, and the related data are then run-length and entropy coded. Experiments show that the method achieves good compression performance with fast computation.
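A standard speedup in the same spirit as the faster codebook search described above is partial-distance elimination; here is a minimal Python sketch (the paper's classified search itself is more elaborate, so this is a generic illustration, not its algorithm):

```python
# Nearest-codeword search with partial-distance elimination: abandon a
# candidate codeword as soon as its accumulated squared distance exceeds
# the best distance found so far.
def nearest_codeword(vec, codebook):
    """Return (index, squared distance) of the closest codeword to vec."""
    best_i, best_d = 0, float("inf")
    for i, cw in enumerate(codebook):
        d = 0.0
        for a, b in zip(vec, cw):
            d += (a - b) ** 2
            if d >= best_d:          # partial distance already too large
                break
        else:
            best_i, best_d = i, d    # completed the sum: new best
    return best_i, best_d

best = nearest_codeword([1.0, 2.0], [[0.0, 0.0], [1.0, 2.0], [3.0, 3.0]])
```

Classified search goes further by restricting the scan to one class of codewords, but the early-exit idea is the same.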

16.
To increase the encoding speed of quality-scalable High-efficiency Video Coding (SHVC), an intra-frame prediction algorithm for quality SHVC is proposed. First, inter-layer correlation is used to predict the likely coding depths, excluding unlikely ones. Second, the candidate depths are coded with the inter-layer reference (ILR) mode, and a distribution-fitting test is applied to the resulting residual coefficients: if they satisfy a Laplacian distribution, the intra modes are skipped. Finally, the depth residual coefficients are checked against an early-termination condition; if it is met, depth coding terminates early to increase speed. Experimental results show that the proposed algorithm increases encoding speed by 79% with negligible loss of coding efficiency.

17.
Constrained multibody system dynamics: an automated approach   (Total citations: 1; self-citations: 0; citations by others: 1)
The governing equations for constrained multibody systems are formulated in a manner suitable for their automated, numerical development and solution. Specifically, the “closed loop” problem of multibody chain systems is addressed.

The governing equations are developed by modifying dynamical equations obtained from Lagrange's form of d'Alembert's principle. This modification, which is based upon a solution of the constraint equations obtained through a “zero eigenvalues theorem,” is, in effect, a contraction of the dynamical equations.

It is observed that, for a system with n generalized coordinates and m constraint equations, the coefficients in the constraint equations may be viewed as “constraint vectors” in n-dimensional space. In this setting, the system itself is free to move in the n - m directions which are “orthogonal” to the constraint vectors.
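The geometric picture can be made concrete with a toy example: for n = 3 coordinates and m = 1 constraint, Gram-Schmidt against the standard basis yields the n - m = 2 free directions orthogonal to the constraint vector. This is a generic sketch, not the paper's zero-eigenvalues construction:

```python
# Find the n - m = 2 directions in 3-space orthogonal to one constraint
# vector c, by Gram-Schmidt against the standard basis (toy illustration).
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def orthogonal_complement(c):
    """Return two directions orthogonal to constraint vector c in 3-space."""
    dirs = []
    for e in [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]:
        v = list(e)
        # Project out the constraint vector and previously found directions.
        for u in [c] + dirs:
            k = dot(v, u) / dot(u, u)
            v = [a - k * b for a, b in zip(v, u)]
        if dot(v, v) > 1e-12:        # keep only non-degenerate directions
            dirs.append(tuple(v))
        if len(dirs) == 2:
            break
    return dirs

c = (1.0, 1.0, 0.0)                  # one constraint: n = 3, m = 1
free_dirs = orthogonal_complement(c)
```

Any admissible velocity of the constrained system is a combination of these free directions, which is exactly the contraction of the dynamical equations the paper performs.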


18.
现有的低延迟语音编码算法(LD-CELP)需要16 kb/s比特率,无疑会妨碍它的应用。提出了一种采用两阶段码书搜索的方法可以在提高低延迟语音编码算法性能的同时降低码率。首先构造了两个子码书:一个后向更新的自适应码书和一个具有代数结构的固定码书;然后设计了两阶段码书搜索方法使滤波后的激励矢量和目标矢量之间的均方误差保持最小。这样就得到了一个在8 kHz采样率下具有2.5 ms延迟的10 kb/s两阶段码书搜索的CELP编码器。用平均分段信噪比(SSNR)和感知语音质量评价(PESQ)测试,本算法具有和16 kb/s的G.728相当的编码质量。  相似文献   

19.
In this contribution we report on a study of a very versatile neural network algorithm known as “Self-organizing Feature Maps”, based on earlier work of Kohonen [1,2]. In its original version, the algorithm addresses a fundamental issue of brain organization, namely how topographically ordered maps of sensory information can be formed by learning.

This algorithm is investigated for a large number of neurons (up to 16 K) and for an input space of dimension d of up to 900. To meet the computational demands, the algorithm was implemented on two parallel machines: a self-built Transputer systolic ring and a Connection Machine CM-2.

We will present below:

(i) a simulation based on the feature-map algorithm modelling part of the synaptic organization in the “hand-region” of the somatosensory cortex;
(ii) a study of the influence of the dimension of the input space on the learning process;
(iii) a simulation of the extended algorithm, which explicitly includes lateral interactions; and
(iv) a comparison of the transputer-based “coarse-grained” implementation of the model and the “fine-grained” implementation of the same system on the Connection Machine.
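The core adaptation rule of the feature-map algorithm can be sketched in a few lines of Python (1-D map, Gaussian neighbourhood; the learning rate and neighbourhood width are illustrative choices, not the study's parameters):

```python
import math

# One Kohonen adaptation step on a 1-D map of weight vectors: find the
# best-matching unit (BMU), then pull it and its neighbours toward the
# input, weighted by a Gaussian neighbourhood function.
def som_step(weights, x, lr=0.5, sigma=1.0):
    """Update weights in place for one input x; return the BMU index."""
    dists = [sum((w - xi) ** 2 for w, xi in zip(wv, x)) for wv in weights]
    bmu = dists.index(min(dists))
    for i, wv in enumerate(weights):
        h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
        weights[i] = [w + lr * h * (xi - w) for w, xi in zip(wv, x)]
    return bmu

weights = [[0.0], [0.5], [1.0]]
bmu = som_step(weights, [1.0])
```

Because neighbouring units are updated together, nearby inputs come to activate nearby units, which is the topographic ordering the algorithm models. The parallel implementations distribute exactly this per-unit update across processors.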

20.
This paper proposes a compound image coding method named united coding (UC). In UC, several lossless coding tools, such as dictionary-entropy coders, run-length encoding (RLE), Hextile, and a few filters used in the portable network graphics (PNG) format, are united into H.264-like intraframe hybrid video coding. The basic coding unit (BCU) has a size typically between 16 × 16 and 64 × 64 pixels. All coders in UC are used to code each BCU. The lossless coder that generates the minimum bit rate (R) is chosen as the optimal lossless coder, and the final optimal coder is then chosen from the lossy intraframe hybrid coder and the optimal lossless coder using an R-D cost based optimization criterion. Moreover, the data coded by one lossless coder can be used as the dictionary of other lossless coders. Experimental results demonstrate that, compared with H.264, UC achieves up to 20 dB PSNR improvement and better visual picture quality for compound images with mixed text, graphics, and natural pictures. Compared with lossless coders such as gzip and PNG, UC can achieve a 2-5 times higher compression ratio with only minor loss while keeping partial-lossless picture quality. The partial-lossless nature of UC is indispensable for some typical applications, such as cloud computing and rendering, cloudlet-screen computing, and remote desktop, where lossless coding of partial image regions is demanded. On the other hand, the implementation complexity and cost increment of UC are moderate, typically less than 25% of a traditional hybrid coder such as H.264.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号