Similar Documents
20 similar documents found
1.
With the development of the Industrial Internet of Things (IIoT), the demands placed on industrial control keep rising. To meet the needs of the IoT, new technologies must be introduced at the device level, among sensors and other field devices, to supply large volumes of data and thereby drive the development of the IoT. IO-Link is a new technology that emerged for exactly this purpose: it simplifies the connection of low-level devices in industrial control and provides a wealth of device data. This paper focuses on what IO-Link is, its technical advantages, and IO-L...

2.
The data reviewed here show that histamine, octopamine, and serotonin are abundant in the visual system of the horseshoe crab Limulus polyphemus. Anatomical and biochemical evidence, including new biochemical data presented here, indicates that histamine is a neurotransmitter in primary retinal afferents, and that it may be involved in visual information processing within the lateral eye. The presence of histamine in neurons of the central nervous system outside of the visual centers suggests that this amine also has functions unrelated to vision. However, the physiological actions of histamine in the Limulus nervous system are not yet known. Octopamine is present in and released from the axons of neurons that transmit circadian information from the brain to the eyes, and octopamine mimics the actions of circadian input on many retinal functions. In addition, octopamine probably has major functions in other parts of the nervous system as octopamine immunoreactive processes are widely distributed in the central nervous system and in peripheral motor nerves. Indeed, octopamine modulates functions of the heart and exoskeletal muscles as well as the eyes. A surprising finding is that although octopamine is a circulating neurohormone in Limulus, there is no structural evidence for its release into the hemolymph from central sites. The distribution of serotonin in Limulus brain suggests this amine modulates the central processing of visual information. Serotonin modulates cholinergic synapses in the central nervous system, but nothing further is known about its physiological actions.

3.
Serial histologic sections of a whole human brain may extend up to 130 × 130 mm within the coronal plane around the temporal lobe. To date, however, technology has not provided a bright-field microscope that is able to shift the object holder continuously in the x- and y-direction over such distances while retaining the optical capabilities of comparable devices. We developed a new light microscope to continuously quantify such sections, together with the computing environment for controlling the device and for analyzing the data produced. In principle, we are now able to quantify each neuron of a human brain. The data ultimately will provide the most detailed structural information about the human brain ascertained thus far. Such detailed information on the spatial distribution of neurons is essential for developing realistic models for simulation of large-scale neuronal networks and for investigating the significance of neuronal arrangements with respect to neuronal signal processing in the CNS. After preprocessing of the data produced by the new microscope, we are able to detect lamination patterns in the spatial distribution of the centers of gravity of cells. Furthermore, morphological features such as the size of the projection area and the mean staining intensity are visualized as a particle process. The particle process presents the sizes and staining intensities of perikarya and allows a distinction between gray matter and white matter. These results provide evidence that the system works correctly and can be applied to a systematic analysis of a larger sequence of serial histologic sections. The objective of this study is to introduce the very large section analyzing microscope (VLSAM) and to present the initial data produced by the system. Moreover, we discuss the workload and future developments of the parallel image analysis system associated with the microscope.
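The cell-detection step lends itself to a compact illustration. The sketch below is a hypothetical preprocessing fragment, not taken from the paper: it labels stained perikarya in a thresholded section image and returns their centers of gravity, the quantity in which the lamination patterns are detected. The threshold is an assumed, user-chosen input.

```python
import numpy as np
from scipy import ndimage

def cell_centroids(section, thresh):
    """Label contiguous stained regions (candidate perikarya) and return
    the center of gravity of each, weighted by staining intensity.
    `thresh` is an assumed binarization level, not the paper's value."""
    labels, n = ndimage.label(section > thresh)
    return np.array(ndimage.center_of_mass(section, labels, range(1, n + 1)))
```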

4.
This paper presents a study on the frictional anisotropy of semi-crystalline UHMWPE polymer film deposited on a DLC-overcoated Si substrate. For UHMWPE film slid against a silicon nitride ball, there is a remarkable difference in the coefficient of friction between the forward and reverse directions after the slider has initially been slid against the film for a certain number of cycles. The changes in the friction are greatly influenced by the initial number of sliding cycles. This frictional behavior is explained in terms of crystallinity changes and molecular orientation effects in the UHMWPE, together with micro-topographical effects due to the initial sliding. A nanoscratch test is conducted to understand the friction of the polymer film within the sliding track, and the data are compared with the macroscale friction data. The results show that the friction in the reverse of the initial sliding direction is high in comparison with that in the forward direction, and that this behavior depends mainly upon the number of initial sliding cycles. The initial sliding cycles affect the crystallinity and molecular orientation of the film, as well as the film topography. This combined effect on the polymer film results in its anisotropic frictional behavior.

5.
A means for improving the contrast in images produced from digital light micrographs is described that requires no intervention by the experimenter: zero-order, scaling, tonally independent, moderated histogram equalization. It is based upon histogram equalization, which often results in digital light micrographs that contain regions appearing saturated, negatively biased or very grainy. Here a non-decreasing monotonic function is introduced into the process, which moderates the changes in contrast that are generated. This method is highly effective for all three of the main types of contrast found in digital light micrography: bright objects viewed against a dark background, e.g. fluorescence and dark-ground or dark-field image data sets; bright and dark objects set against a grey background, e.g. image data sets collected with phase or Nomarski differential interference contrast optics; and darker objects set against a light background, e.g. views of absorbing specimens. Moreover, it is demonstrated that there is a single fixed moderating function, whose action is independent of the number of elements of image data, which works well with all types of digital light micrographs, including multimodal or multidimensional image data sets. The use of this fixed function is very robust, as the appearance of the final image is not altered discernibly when it is applied repeatedly to an image data set. Consequently, moderated histogram equalization can be applied to digital light micrographs as a push-button solution, thereby eliminating biases that those undertaking the processing might have introduced during manual processing. Finally, moderated histogram equalization yields a mapping function, so, through the use of look-up tables, indexes or palettes, the information present in the original data file can be preserved while an image with the improved contrast is displayed on the monitor screen.
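The abstract does not reproduce the paper's fixed moderating function, so the sketch below illustrates only the general idea under a stated assumption: blend the classical equalization mapping with the identity mapping through a simple monotonic combination, and apply the result as a look-up table so the original data are preserved. The blend weight `alpha` is an assumption standing in for the paper's function.

```python
import numpy as np

def moderated_hist_eq(img, alpha=0.5, levels=256):
    """Hedged sketch of moderated histogram equalization for an
    8-bit integer image. The linear blend below is an illustrative
    moderating scheme, not the paper's fixed function."""
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum() / img.size                 # normalized cumulative histogram
    eq_map = (levels - 1) * cdf                    # classical equalization mapping
    identity = np.arange(levels)                   # no-op mapping
    lut = (1 - alpha) * identity + alpha * eq_map  # moderated, still non-decreasing
    return lut.astype(np.uint8)[img]               # apply as a look-up table
```

Because both mappings are non-decreasing, any convex combination of them is too, which is what keeps a moderated result free of the saturated or grainy regions that plain equalization can produce.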

6.
With the tremendous advances in electronics today, it is now possible to take existing analytical instruments and give then a "brain" so that many analytical procedures which were impossible or costly a few years ago can now be done with relative ease, at a fraction of the costs. This paper deals with a microcomputer-controlled air monitor which allows us to identify many different components at a multitude of different locations. System operation is explained as well as data accuracy as they relate to application in a plastics research pilot plant.  相似文献   

7.
Biotribology and tribocorrosion are often not included in numerical or computational modeling efforts to predict wear because of the apparent complexity in the geometry, the variability in removal rates, and the challenge associated with mixing time-dependent removal processes such as corrosion with cyclic material removal from wear. The lollipop is an accessible bio-tribocorrosion problem that is well known but underexplored scientifically as a tribocorrosion process. Stress-assisted dissolution was found to be the dominant tribocorrosion process driving material removal in this system. A model of material removal was described and approached by lumping the intrinsically time-dependent process with a mechanically driven process into a single cyclic volumetric material removal rate. This required the collection of self-reported wear data from 58 participants that were used in conjunction with statistical analysis of actual lollipop cross-sectional information. Thousands of repeated numerical simulations of material removal and shape evolution were conducted using a simple Monte Carlo process that varied the input parameters and geometries to match the measured variability. The resulting computations were analyzed to calculate both the average number of licks required to reach the Tootsie Roll® center of a Tootsie Roll® pop, as well as the expected variation thereof.
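A minimal Monte Carlo sketch of the lumped removal model is given below. The shell-radius and per-lick removal-rate distributions are illustrative stand-ins, not the parameters fitted from the 58 participants or the measured cross sections.

```python
import numpy as np

rng = np.random.default_rng(0)

def licks_to_center(radius_mm, rate_mm, n_trials=10_000):
    """Draw a candy-shell thickness and a per-lick radial removal
    rate for each trial; both (mean, sd) pairs are assumptions."""
    radii = rng.normal(radius_mm[0], radius_mm[1], n_trials)  # shell thickness, mm
    rates = rng.normal(rate_mm[0], rate_mm[1], n_trials)      # removal per lick, mm
    rates = np.clip(rates, 1e-3, None)                        # exclude non-physical rates
    licks = radii / rates
    return licks.mean(), licks.std()

mean_licks, sd_licks = licks_to_center(radius_mm=(8.0, 0.5), rate_mm=(0.02, 0.005))
print(f"average licks ≈ {mean_licks:.0f} ± {sd_licks:.0f}")
```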

8.
S. Myrdal and M. Foster, Scanning, 1994, 16(3): 155–167
The in vivo function of a biologically active molecule is governed in part by the dynamics of its distribution within its target tissue. To enhance our ability to probe living cells, we have endeavored to improve live confocal microscopy methods and to develop analytical methods that simplify the handling of the resulting complex data sets. To do this we attached a recently developed micro-incubation system to the stage of a Leica confocal laser scanning microscope and were able to maintain physiologic culture conditions over several hours. Axial stability was achieved by modifying the room air conditioning. Laser illumination was low enough to retain cell viability through several hours of continuous scanning. With this setup, planar, time-resolved data sets (xyt) were produced by continuously rescanning a single xy plane at the rate of one scan/min. As an alternative, volumetric data sets (xyz) were acquired by stepping the scanned plane through the z axis. In both types of data sets, a semi-quantitative determination of the concentration of a fluorescent reporter molecule (e.g., FITC) over a gray level range of 0–255 was recorded along with the positional information. Thus, concentration (as intensity of fluorescence, or i) gave a fourth variable by either scan method, resulting in high-density xyti or xyzi data sets. The biological model we used to examine these methods was the penetration of a FITC-labeled, anti-carcinoma monoclonal antibody into cultured spheroids of tumor cells bearing the antibody-binding epitope. In one case, the distribution of antibody-FITC conjugate was compared with that of a long-wavelength membrane dye, DiIC18(5). Several different software analyses were compared, including examining xyt data sets as “volumes.” We observed that by increasing the displayed resolution of one variable, the demonstrable resolution of the other variables was reduced. For example, with high temporal resolution, either quantitative or positional resolution had to be sacrificed. Thus, we needed to perform several different analyses of a single data set to compare all of the variables properly. In these experiments, the dynamic aspects of the changes in antibody-FITC distribution were examined. Along with comparison of antibody-FITC penetration with that of DiI, these data suggest an as yet unexplained biological transport of antibody into a tumor spheroid, which is not consistent with mere passive diffusion through the fluid of extracellular clefts. Using this model system, we have performed and analyzed highly time-resolved confocal microscopy on living specimens maintained under physiologic conditions.

9.
The requirements for implementing a radiology imaging network are similar to those for local area networks now being designed for other purposes to manage large data files. A radiology department serving a 500-bed hospital generates about 927 megabytes of digitally formatted data per working day. These data are expected to be on line for the patient's hospitalization period. The retrieval rate of these data among the interactive diagnosis display stations requires data throughput rates of between 2 and 5 megabits per second. This throughput rate requires signaling rates of between 20 and 50 megabits per second. Analog hard-copy generation of the images on the network is required by the referring physician for selected images that support the consultation report. Digital laser recorders using paper may be quite satisfactory. Long-term archiving must be low in cost and requires a database scheme capable of managing more than a terabyte of image data. Radiology networks will also be required to bridge with other hospital information systems.
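The quoted figures can be sanity-checked with a few lines of arithmetic. The 10x ratio between signaling and payload rate below is simply what the abstract's own numbers (2–5 vs. 20–50 Mbit/s) imply, and the 14-day stay is an assumed example, not a figure from the paper.

```python
# Back-of-envelope check of the quoted network figures (illustrative).
mbytes_per_day = 927                        # image data per working day
stay_days = 14                              # assumed hospitalization period
online_gb = mbytes_per_day * stay_days / 1024
throughput_mbps = (2, 5)                    # required display-station throughput
ratio = 10                                  # signaling/payload ratio implied above
signaling_mbps = tuple(ratio * r for r in throughput_mbps)
print(f"{online_gb:.1f} GB on line, signaling {signaling_mbps} Mbit/s")
```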

10.
This article describes a technique to measure the temperature of a resistively heated ferromagnetic wire whose temperature increases so rapidly that the thermal inertia of a thermocouple or thermistor prevents it from keeping up with the variation. The temperature is derived from electrical measurands (voltage and current) and time, together with thermophysical data such as heat losses and emissivity, and is based on a dynamic thermal-electrical energy-conservation principle. We go on to use the technique for the quantitative determination of the Curie point as well as the magnetic susceptibility at elevated temperatures. The results are in good agreement with accepted values.
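The energy-conservation idea reduces to a balance of the form m·c·dT/dt = V·I − P_loss(T). The forward-Euler sketch below is a hypothetical rendering of that principle, not the paper's implementation; the loss model (convection plus radiative losses via the emissivity) is abstracted into a caller-supplied function.

```python
import numpy as np

def wire_temperature(t, V, I, m, c, loss):
    """Integrate m*c*dT/dt = V*I - P_loss(T) over sampled data.
    t, V, I are arrays of time, voltage, and current; m and c are the
    wire's mass and specific heat; `loss(T)` is an assumed loss model."""
    T = np.empty_like(t, dtype=float)
    T[0] = 293.0                                 # assumed room-temperature start, K
    for k in range(1, len(t)):
        dt = t[k] - t[k - 1]
        P = V[k - 1] * I[k - 1] - loss(T[k - 1])  # net heating power, W
        T[k] = T[k - 1] + P * dt / (m * c)        # forward-Euler update
    return T
```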

11.
The deformation and fracture of specimens of a carbon-carbon composite material with different dimensions of a stress concentrator, in the form of a central hole with diameters of 7, 10, and 13 mm, were studied using a combined method developed by the authors. The results of a numerical analysis of the experimental data are presented as plots of the shear-deformation intensity and the acoustic-emission activity as functions of the loading time. The factors that cause similarities and differences among the results are discussed and interpreted. It is proposed that the obtained data be used for the nondestructive testing of composite materials via the selection of characteristic stages of the deformation development and of the moment that precedes fracturing.

12.
This article describes the design and implementation of a wearable, multiparameter physiological monitoring system called the Sensing Belt system, which consists of multiple sensors integrated into fabric that communicates with a physiological data acquisition unit (PDAU) that in turn transmits these data to a remote monitoring center (RMC) for analysis. A number of vital signs can be acquired by the system, including electrocardiography (ECG), respiratory inductance plethysmograph (RIP), posture/activity, multipoint skin temperature (TSK), and rectal temperature (TRC). The physiological data can be stored on a MicroSD card or transmitted to the RMC, where specialized analysis will be provided to extract parameters such as heart rate (HR), respiratory rate (RR), respiratory sinus arrhythmia (RSA), and human energy expenditure. The RMC can receive physiological data from up to 16 Sensing Belt users simultaneously. A medical validation test was carried out to compare the accuracy of the physiological data obtained from the Sensing Belt system with data obtained concurrently from traditional, calibrated laboratory physiological monitoring instruments. The results showed that most of the variables measured by the Sensing Belt are within acceptable error limits. The mean temperature on two trials (walking and running) showed significantly higher mean differences than on other trials, but the correlation coefficient (r) remained high (0.985 and 0.989, respectively). This study demonstrates the accuracy of the Sensing Belt system for the monitoring of these physiological parameters and suggests that it could be used to provide a complete human physiological monitoring platform for the study of human heat stress, cold stress, and thermal comfort.
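As one illustration of the kind of RMC-side analysis described, the sketch below estimates heart rate from R-peak spacing in an ECG channel. The sampling rate, peak threshold, and refractory distance are assumptions for illustration, not the system's documented parameters.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250.0                                    # assumed ECG sampling rate, Hz

def heart_rate(ecg):
    """Estimate HR (beats per minute) from R-peak spacing; the
    threshold and minimum peak distance are illustrative choices
    and assume upright R waves."""
    peaks, _ = find_peaks(ecg, height=0.6 * ecg.max(), distance=int(0.4 * fs))
    rr_s = np.diff(peaks) / fs                # R-R intervals, seconds
    return 60.0 / rr_s.mean()
```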

13.
The application of data mining techniques in the design of modern foundry materials makes it possible to achieve higher product-quality indicators. Designing a new product always requires thorough knowledge of the effect of alloying elements on the microstructure and hence also on the properties of the examined material. Experimental studies allow a qualitative assessment of these relationships, but it is the use of intelligent computational techniques that enables building an approximation model of the microstructure and, owing to this, making predictions with high precision. The developed prediction model supports technology-related decisions as early as the casting-design stage and is considered the first step in selecting the type of material used.
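The abstract does not name the approximation technique, so the fragment below is only a generic stand-in for the idea: fit a regression model mapping alloying-element contents to a microstructure indicator, then query it at candidate compositions. The feature names, model family, and data are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 4))                       # stand-in contents of C, Si, Mn, Cu, wt.%
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] ** 2 + 0.05 * rng.normal(size=200)  # toy microstructure index
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(model.predict([[0.5, 0.4, 0.3, 0.2]]))   # predicted index for a candidate alloy
```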

14.
Discrimination between three different sources of variability in a vibration-based structural health monitoring system is investigated: environmental or operational effects, sensor faults, and structural damage. Separating the environmental or operational effects from the other two is based on the assumption that measurements under different environmental or operational conditions are included in the training data. Distinguishing between sensor fault and structural damage utilizes the fact that the sensor faults are local, while structural damage is global. By localizing the change to a sensor which is then removed from the network, the two different influences can be separated. The sensor network is modelled as a Gaussian process and the generalized likelihood ratio test (GLRT) is then used to detect and localize a change in the system. A numerical and an experimental study are performed to validate the proposed method.
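Under a mean-shift alternative with the covariance estimated from training data, the GLRT reduces to a Mahalanobis-type statistic. The sketch below implements that simplified version, with the per-sensor terms of the quadratic form used for localization; it is a simplification of the paper's Gaussian-process formulation, not a reproduction of it.

```python
import numpy as np

def glrt_statistic(x_train, x_test):
    """Mahalanobis-type GLRT sketch for a jointly Gaussian sensor
    network. Rows are time samples, columns are sensors. Returns the
    test statistic per sample and an average per-sensor contribution
    that can be used to localize the change."""
    mu = x_train.mean(axis=0)
    prec = np.linalg.inv(np.cov(x_train, rowvar=False))   # precision matrix
    resid = x_test - mu
    stat = np.einsum('ij,jk,ik->i', resid, prec, resid)   # quadratic form per sample
    contrib = (resid @ prec) * resid                      # per-sensor decomposition
    return stat, np.abs(contrib).mean(axis=0)
```

A change flagged by `stat` but confined to one column of the contribution vector points to a sensor fault; a change spread across sensors points to structural damage, and removing the implicated sensor separates the two cases, as the abstract describes.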

15.
Multiphase flow, especially two-phase gas-liquid flow, is of great importance for a variety of applications and industrial processes, for example in the nuclear, chemical, or oil and gas industries. In this contribution, we present simulation results for gas-liquid slug flow in large horizontal pipes. Six test cases with different oil, water, and gas flow rates are considered, covering a wide range of different slug flows. The numerical predictions are validated by comparison with experimental data obtained from video observations. The relative error of the mean liquid level between experiment and simulation is less than 12.3% for all but one test case. Furthermore, a frequency analysis is performed: the single-sided amplitude spectrum as well as the smoothed power spectral density are calculated. For both experimental and simulation data, one observes an increase of the dominant frequencies as the ratio of liquid to gas superficial velocity is increased.
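Both spectral quantities mentioned are standard, and the sketch below computes them for a stand-in liquid-level trace; the sampling rate and signal are assumptions, not the paper's data.

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                                      # assumed sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
level = 0.1 * np.sin(2 * np.pi * 1.5 * t) + 0.02 * rng.normal(size=t.size)  # toy level trace

# Single-sided amplitude spectrum
X = np.fft.rfft(level - level.mean())
amp = 2 * np.abs(X) / level.size
freqs = np.fft.rfftfreq(level.size, 1 / fs)

# Smoothed power spectral density (Welch's method)
f_psd, psd = welch(level, fs=fs, nperseg=1024)

print(freqs[amp.argmax()], f_psd[psd.argmax()])  # dominant slug frequency, Hz
```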

16.
Miniaturized machine tools have been established as a promising technology for machining miniature components in a wide range of materials. The spindle of a miniaturized machine tool needs to provide extremely high rotational speed while maintaining accuracy. In this work, a capacitive-sensor-based measurement technique is followed for assessing radial errors of a miniaturized machine tool spindle. The accuracy of spindle error measurement is affected by inherent error sources such as sensor offset, thermal drift of the spindle, centering error, and form error of the target surface installed in the spindle. In the present work, a model-based curve-fitting method is proposed for accurate interpretation and analysis of spindle error measurement data in the time domain. Experimental results of the proposed method are presented and compared with the commonly followed discrete-Fourier-transform-based frequency-domain filtering method. The proposed method provides higher resolution for the estimation of the fundamental frequency of the spindle error data. Synchronous and asynchronous radial error values are evaluated in accordance with the ANSI/ASME B89.3.4M [9] standard at various spindle speeds and numbers of spindle revolutions. It is found that the spindle speed and the number of spindle revolutions do not have much influence on the synchronous radial error of the spindle. On the other hand, the asynchronous radial error motion exhibits a significant speed-dependent behavior with respect to the number of spindle revolutions.
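A minimal version of the model-based curve-fitting idea is sketched below: fit one synchronous harmonic plus an offset to the time-domain probe signal, using the nominal spindle frequency as the initial guess. The one-harmonic model order is an assumption; the paper's full model is richer.

```python
import numpy as np
from scipy.optimize import curve_fit

def spindle_model(t, a, b, f, c):
    # one synchronous harmonic plus offset; more terms could be added
    return a * np.sin(2 * np.pi * f * t) + b * np.cos(2 * np.pi * f * t) + c

def fit_fundamental(t, r, f0):
    """Refine the fundamental frequency of the radial-error signal r(t).
    f0 is the nominal spindle frequency used as the initial guess."""
    p0 = [np.ptp(r) / 2, 0.0, f0, r.mean()]
    popt, _ = curve_fit(spindle_model, t, r, p0=p0)
    return popt[2]                               # fitted fundamental frequency, Hz
```

Because the frequency enters the fit as a continuous parameter, its estimate is not tied to the DFT bin spacing, which is the resolution advantage the abstract refers to.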

17.
Past condition-monitoring techniques for gearboxes have utilised many different approaches, such as time-series averaging, amplitude and phase demodulation, time–frequency distributions and wavelet analysis. Only recently have statistical approaches taken hold in gear tooth failure detection. Non-linear adaptive algorithms for independent component analysis (ICA) have been shown to separate unknown, statistically independent sources that have been mixed in dynamic systems. This paper proposes the application of an information-maximisation-based blind source separation algorithm (a type of ICA) to gear vibration measurements. It is shown that the individual gear and pinion vibrations cannot be separated using the blind separation algorithm, but the learning curve of the updated parameter can be used to detect impulsive and random changes in the data. It is shown that the algorithm is capable of tracking the higher-order statistics of the meshing signature using a single measure. This results in a detection scheme that is shown to find localised damage in a single tooth failure, two adjacent teeth failures, two non-adjacent teeth failures and multiple adjacent teeth failures. This method does not need a priori information about the loading, speed or type of gear measured.
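For readers wanting to experiment, the fragment below runs blind source separation on stand-in vibration channels. Note the substitution: the paper uses an information-maximisation (infomax) algorithm and monitors its learning curve, whereas FastICA from scikit-learn is used here purely because it is readily available.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1e-3)
s1 = np.sign(np.sin(2 * np.pi * 23 * t))       # stand-in gear meshing signature
s2 = rng.normal(size=t.size)                   # stand-in broadband vibration
X = np.c_[s1, s2] @ rng.random((2, 2)).T       # mixed accelerometer channels
S = FastICA(n_components=2, random_state=0).fit_transform(X)  # estimated sources
```

As the abstract notes, the separated outputs themselves are not the diagnostic for gears; it is the adaptation trajectory of the algorithm's parameters that reveals impulsive, localized tooth damage.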

18.
Our laboratory took part in a project, organized by the French wine and spirits association, to build a database of famous-brand liquors from around the world. This paper provides the project's analytical data for the trace elements Ca, Cu, Fe, and Pb. From the data reported by more than 100 analytical laboratories and processed by the association's statistical data center, the values measured by our laboratory fell within the normal-distribution chart of every statistical round and were close to the central value; the results are satisfactory.

19.
Scanning ion microscopy has received a boost in the last decade thanks to the development of novel ion sources employing light ions, like He+, or ions from inert gases, like Ne+ and Ar+. Scanning ion images, however, might not be as easy to interpret as SEM micrographs: the contrast mechanisms are different, and there is always a certain degree of sample sputtering. The latter effect, on the one hand, prevents assessing the resolution on the basis of a single image and, on the other hand, limits the probing time and thus the signal-to-noise ratio that can be obtained. To fully simulate what happens when energetic ions impact a sample, a Monte Carlo approach is often used. In this paper, a different approach is proposed: the contrast is simulated using curves of secondary electron yield versus the incidence angle of the beam, while the surface-modification prediction is based on similar curves for the sputtering yield. Finally, Poisson noise from primary ions and secondary electrons is added to the image. It is shown that the evaluation of an ion imaging tool cannot be condensed into a single number, like the spot size or the edge steepness, but must be based on a more complex analysis taking into account at least three parameters: sputtering, contrast and signal-to-noise ratio. It is also pointed out that noise contributions from the detector cannot be neglected, for they can actually be the limiting factor in imaging with focused ion beams. While already providing good agreement with experimental data in some imaging aspects, the proposed approach is highly modular; further effects, like edge enhancement and detection, can be added separately.
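The proposed non-Monte-Carlo model can be paraphrased in a few lines: map each pixel's local incidence angle onto a secondary-electron yield curve, then draw the primary-ion and SE counts from Poisson distributions. The yield curve and dose below are illustrative assumptions, not measured values.

```python
import numpy as np

rng = np.random.default_rng(1)

def se_image(incidence_deg, yield_curve, ions_per_px):
    """Simulate SE signal per pixel: interpolate the SE yield at each
    pixel's incidence angle, then apply Poisson statistics to both the
    primary-ion dose and the secondary-electron emission."""
    sey = np.interp(incidence_deg, yield_curve[0], yield_curve[1])
    ions = rng.poisson(ions_per_px, sey.shape)   # shot noise of primary ions
    return rng.poisson(ions * sey)               # SE emission statistics

angles = np.abs(np.linspace(-80, 80, 256))       # toy 1-D edge profile, degrees
curve = ([0, 30, 60, 80], [1.0, 1.3, 2.2, 3.5])  # illustrative SE yield vs angle
signal = se_image(angles, curve, ions_per_px=50)
```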

20.
Mass customization (MC) as a business strategy is designed to compete simultaneously on two rival competitive priorities—the price and the customization level of a product. MC academics and experts have gone a step further: they suggest that MC is a unique strategy whose implementation promises across-the-board improvement in all four of the competitive priorities (price, quality, flexibility, and speed) simultaneously. Its growing adoption by businesses in recent years, the steep rise in success stories associated with MC, and the voluminous body of publications in a short period of its existence have created a need to study the directions, trends, application potential, and research strategies embedded in these publications. Accordingly, this paper studies and analyzes the trends and directions of the research published in 1,124 MC publications that have appeared in journals and magazines since the inception of the term mass customization in 1987 by Stan Davis in his classic book Future Perfect. Statistical trend analyses are conducted to study the vitality and health of the field of MC using the number of publications and the number of publication outlets and their respective trends. The publication outlet data conform to an S curve, establishing the maturity of the MC field. The publication data show that the MC field has passed through successive stages of growth: incubation or slow growth (1987–1992), exponential growth (1993–2003), and stable, matured growth (2003–2005). There is a slight dip in 2006 in terms of publication outlets; there are, however, confirmatory factors indicating that the dip in 2006 may be an outlier. This paper also suggests developing a clear understanding of the value and type of research embodied in MC publications through three types of taxonomic analyses. The frameworks for all three taxonomies are set forth, two of which have previously been employed in other areas of OR/MS (Reisman and Kirschnik, Oper Res 42(4):577–588, 1994; Oper Res 43(5):731–740, 1995). The first taxonomic framework classifies each paper as a theory paper or an application paper; at the second stage, the application content of the publication is rated on a five-point scale ranging from simple modeling of the real world to bona fide real-world application. The second taxonomic framework uses a taxonomy comprising seven distinct types of research strategies. The former analysis provides important information about the application worthiness of MC publications and hence their usefulness to the real world; the second provides information about the types of research strategies used by MC researchers, which, in turn, allows conclusions to be drawn about the quality and rigor of such research. The third taxonomic framework recommends classifying all publications into multi-level containers based on the disciplines that intersect with MC and their branches.
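The S-curve claim can be made concrete with a logistic fit. The sketch below fits synthetic outlet counts (the paper's actual data are not reproduced here), with K the saturation level, r the growth rate, and t0 the inflection year.

```python
import numpy as np
from scipy.optimize import curve_fit

def s_curve(t, K, r, t0):
    return K / (1 + np.exp(-r * (t - t0)))      # logistic growth

years = np.arange(1987, 2007, dtype=float)
rng = np.random.default_rng(1)
outlets = s_curve(years, 120, 0.45, 1998) + rng.normal(0, 3, years.size)  # synthetic data
(K, r, t0), _ = curve_fit(s_curve, years, outlets, p0=[outlets.max(), 0.3, 1997])
print(f"saturation ≈ {K:.0f} outlets, inflection ≈ {t0:.0f}")
```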
