20 similar documents found.
1.
A. V. Bataev, A. A. Davydov, N. Yu. Nalutin, S. V. Sinitsyn 《Automatic Control and Computer Sciences》2011,45(7):373-379
A method for preparing test data that provides a specified level of requirements coverage for functional testing is proposed. The method simplifies keeping the project life-cycle data configuration, which includes the requirements, source code, and generated tests, in a consistent state. A classification of software defects is introduced. An approach to formalizing the analysis of the requirements and the implementation of the system under test is proposed, based on representing a partition into equivalence classes as a system of logic equations. An approximate method for solving the resulting equations is proposed. The applicability of the method in real industrial project processes is discussed.
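As an illustration of the equivalence-class idea (a minimal sketch, not the paper's formalism), the fragment below encodes requirement-derived equivalence classes as Boolean predicates over a small input domain and picks one representative input per class by brute-force enumeration; all variable and requirement names are hypothetical.

```python
# A minimal sketch (not the paper's algorithm): equivalence classes of the
# input domain are encoded as Boolean predicates, and test inputs are found
# by brute-force enumeration. Variables and requirement names are hypothetical.
from itertools import product

# Hypothetical input domain: two integer parameters with small ranges.
DOMAIN = {"speed": range(0, 4), "mode": range(0, 2)}

# Each requirement-derived equivalence class is a predicate over the inputs.
equivalence_classes = {
    "REQ1_low_speed":  lambda v: v["speed"] < 2 and v["mode"] == 0,
    "REQ2_high_speed": lambda v: v["speed"] >= 2,
    "REQ3_alt_mode":   lambda v: v["mode"] == 1,
}

def pick_representatives(domain, classes):
    """Return one concrete input per equivalence class (if satisfiable)."""
    names = list(domain)
    reps = {}
    for values in product(*(domain[n] for n in names)):
        point = dict(zip(names, values))
        for cname, pred in classes.items():
            if cname not in reps and pred(point):
                reps[cname] = point
    return reps

print(pick_representatives(DOMAIN, equivalence_classes))
# Each covered class contributes one test input, which is what "a specified
# level of requirements coverage" amounts to for this toy domain.
```

A production tool would solve the logic-equation system symbolically or approximately rather than enumerating the whole domain, which is where the paper's approximate solution method comes in.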
2.
Mathieu Le Coz, François Delclaux, Pierre Genthon, Guillaume Favreau 《Computers & Geosciences》2009,35(8):1661-1670
Digital Elevation Models (DEMs) are used to compute the hydro-geomorphological variables required by distributed hydrological models. However, the resolution of the most precise DEMs is too fine to run these models over regional watersheds. DEMs therefore need to be aggregated to coarser resolutions, affecting both the representation of the land surface and the hydrological simulations. In the present paper, six algorithms (mean, median, mode, nearest neighbour, maximum and minimum) are used to aggregate the Shuttle Radar Topography Mission (SRTM) DEM from 3″ (90 m) to 5′ (10 km) in order to simulate the water balance of the Lake Chad basin (2.5 Mkm²). Each of these methods is assessed with respect to selected hydro-geomorphological properties that influence Terrestrial Hydrology Model with Biogeochemistry (THMB) simulations, namely the drainage network, the Lake Chad bottom topography and the floodplain extent. The results show that the mean and median methods produce a smoother representation of the topography. This smoothing removes the depressions governing the floodplain dynamics (floodplain area < 5000 km²) but also eliminates the spikes and wells responsible for deviations in the drainage network. By contrast, the other aggregation methods yield a rougher relief representation that enables the simulation of a larger floodplain area (>14,000 km² with the maximum or nearest-neighbour methods) but produces anomalies in the drainage network. An aggregation procedure based on a variographic analysis of the SRTM data is therefore suggested. It consists of preliminary filtering of the 3″ DEM to smooth spikes and wells, followed by resampling to 5′ via the nearest-neighbour method so as to preserve the representation of depressions. With the resulting DEM, the drainage network, the Lake Chad bathymetric curves and the simulated floodplain hydrology are consistent with observations (3% underestimation of simulated evaporation volumes).
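To make the aggregation step concrete, here is a minimal NumPy sketch of block aggregation of a DEM grid by the six methods named above, assuming the factor divides the grid evenly; the data are random stand-ins, not SRTM tiles.

```python
# A minimal sketch of DEM block aggregation; "nearest" here means sampling
# one cell per block (the block's top-left cell). Toy data, not SRTM.
import numpy as np
from scipy import stats

def aggregate(dem, factor, method="mean"):
    """Aggregate a 2-D DEM by an integer factor using one of six methods."""
    h, w = dem.shape
    if method == "nearest":                  # nearest neighbour: sample one cell
        return dem[::factor, ::factor]
    blocks = dem.reshape(h // factor, factor, w // factor, factor)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(h // factor, w // factor, -1)
    if method == "mean":
        return blocks.mean(axis=-1)
    if method == "median":
        return np.median(blocks, axis=-1)
    if method == "mode":
        return stats.mode(blocks, axis=-1).mode
    if method == "max":
        return blocks.max(axis=-1)
    if method == "min":
        return blocks.min(axis=-1)
    raise ValueError(method)

dem = np.random.default_rng(0).integers(250, 400, size=(12, 12)).astype(float)
coarse = aggregate(dem, 4, "median")         # 12x12 -> 3x3
```

The smoothing/roughening trade-off discussed in the abstract falls out directly: mean and median average away local extrema, while max, min and nearest-neighbour preserve them.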
3.
Pattern formation is a complex-systems problem with a biological background. In contrast to traditional approaches such as dynamical methods, the CUP algorithm is used here to generate and analyse patterns. The CUP algorithm is easy to parallelise, supports multiple modes of interpretation (probabilistic and structural), offers multi-scale solving and interpretation capabilities, and allows diverse parameters and structures; its application to pattern formation provides practical guidance for the analysis of complex systems.
4.
Multidimensional visualisation for process historical data analysis: a comparative study with multivariate statistical process control
This paper describes a comparative study of a multidimensional visualisation technique and multivariate statistical process control (MSPC) for process historical data analysis. The visualisation technique uses parallel coordinates, which present multidimensional data in two dimensions and allow identification of clusters and outliers, and can therefore be used to detect abnormal events. The study is based on a database covering 527 days of operation of an industrial wastewater treatment plant. Both the visualisation technique and MSPC based on the Hotelling T² chart captured the same 17 days as “clearly abnormal” and another eight days as “likely abnormal”. Pattern recognition using K-means clustering has also been applied to the same data in the literature and identified 14 of the 17 “clearly abnormal” days.
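A minimal sketch of the MSPC side of the comparison: the Hotelling T² statistic per observation and a Phase-I control limit from the standard beta-distribution formula. The wastewater data are replaced by random in-control values.

```python
# A minimal sketch of a Hotelling T^2 chart as used in MSPC; the plant data
# are not public here, so random in-control data stand in.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
X = rng.normal(size=(527, 5))            # 527 days x 5 process variables (toy)

mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))
t2 = np.einsum("ij,jk,ik->i", X - mu, S_inv, X - mu)   # T^2 per observation

# Phase-I upper control limit from the beta distribution (standard formula).
n, p = X.shape
ucl = (n - 1) ** 2 / n * stats.beta.ppf(0.99, p / 2, (n - p - 1) / 2)

abnormal_days = np.flatnonzero(t2 > ucl)  # candidates for "clearly abnormal"
```

The parallel-coordinates view reaches the same conclusion visually: days whose polylines stray from the bundle of in-control profiles correspond to high-T² observations.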
5.
Thi Thieu Hoa Le, Luigi Palopoli, Roberto Passerone, Yusi Ramadian 《International Journal on Software Tools for Technology Transfer (STTT)》2013,15(3):211-228
The growing complexity of modern embedded systems, which are increasingly distributed and heterogeneous, calls for new design approaches able to reconcile mathematical rigour with flexibility, user-friendliness and scalability. In recent years, Timed Automata (TA) have emerged as a very promising formalism, owing to the availability of highly effective verification tools. However, their adoption by the industrial community is still slow. The reluctance of industrial practitioners is partly motivated by persistent concerns about the ease of use of the formalism, the scalability of the verification process and the quality of the feedback the designer receives. In this paper, we discuss these issues by applying the TA formalism to a case study of industrial complexity. We demonstrate the generality of the approach and the efficiency of state-of-the-art tools, but also the limitations in the semantics and in dealing with free design parameters.
6.
Eshghi Amin, Mousavi S. Meysam, Mohagheghi Vahid 《Neural computing & applications》2019,31(9):5109-5133
Major factors of project success include using tools of performance measurement and feedback. Earned value management (EVM) is a unique issue within...
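The abstract is truncated, but the EVM quantities it refers to are standard; below is a minimal worked example with hypothetical project figures (standard EVM definitions, not necessarily the paper's exact formulation).

```python
# Standard earned value management metrics; all figures are hypothetical.
def evm_metrics(pv, ev, ac, bac):
    """Planned value, earned value, actual cost, budget at completion."""
    return {
        "CV":  ev - ac,                 # cost variance
        "SV":  ev - pv,                 # schedule variance
        "CPI": ev / ac,                 # cost performance index
        "SPI": ev / pv,                 # schedule performance index
        "EAC": bac / (ev / ac),         # estimate at completion (CPI method)
    }

print(evm_metrics(pv=100_000, ev=90_000, ac=110_000, bac=500_000))
# CPI < 1 and SPI < 1: the project is over budget and behind schedule.
```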
7.
S. S. Luque 《International journal of remote sensing》2013,34(13-14):2589-2610
Suppression of natural disturbance and anthropogenic perturbations have altered the composition and structure of the New Jersey Pinelands National Reserve (NJPNR). The combination of satellite remote-sensing imagery and GIS provided the means to map and monitor land cover change at landscape scales in the NJPNR. The Pinelands has experienced a change in land cover, with mixed deciduous forest replacing the pine forest community.
8.
Using the analytic network process (ANP) in a SWOT analysis - A case study for a textile firm
İhsan Yüksel 《Information Sciences》2007,177(16):3364-3382
Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis does not provide an analytical means to determine the importance of the identified factors or to assess decision alternatives according to these factors. Although the analysis successfully pinpoints the factors, individual factors are usually described briefly and very generally. For this reason, SWOT analysis has deficiencies in the measurement and evaluation steps. Although the analytic hierarchy process (AHP) technique removes these deficiencies, it does not allow for measurement of possible dependencies among the factors. The AHP method assumes that the factors presented in the hierarchical structure are independent; however, this assumption may be inappropriate in light of certain internal and external environmental effects. Therefore, it is necessary to employ a form of SWOT analysis that measures and takes into account possible dependency among the factors. This paper demonstrates a process for quantitative SWOT analysis that can be performed even when there is dependence among strategic factors. The proposed algorithm uses the analytic network process (ANP), which allows measurement of the dependency among the strategic factors, as well as AHP, which is based on independence between the factors. Dependency among the SWOT factors is observed to affect the strategic and sub-factor weights, as well as to change the strategy priorities.
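As a sketch of the AHP building block the paper extends with ANP, the snippet below derives priority weights as the principal eigenvector of a pairwise-comparison matrix and checks Saaty's consistency ratio; the comparison values are hypothetical.

```python
# A minimal sketch of the AHP priority computation (Saaty's eigenvector
# method). The 3x3 pairwise-comparison values are hypothetical.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # factor 1 vs factors 1..3
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # normalised priority weights

# Consistency ratio: CR = ((lambda_max - n)/(n - 1)) / RI, with RI(3) = 0.58.
n = A.shape[0]
cr = ((eigvals.real[k] - n) / (n - 1)) / 0.58
print(w, cr)                         # CR < 0.1 means acceptably consistent
```

ANP generalises this step by arranging such local priority vectors into a supermatrix, which is how the dependencies among SWOT factors enter the weights.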
9.
F. Sunar 《International journal of remote sensing》2013,34(2):225-235
Change analysis has become an important use of multi-spectral Landsat TM data. With the repetitive acquisition of imagery, it is possible to determine the types and extent of changes in an environment. Many digital change-detection techniques, such as image overlay, image differencing, and principal component analysis, have been widely used for this purpose. The objective of this study was to detect land cover changes in the Ikitelli area, Istanbul, Turkey, using multi-date Landsat TM imagery and several different methods. Each change-detection method was assessed with respect to its ability to detect specific changes.
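Image differencing, one of the techniques named above, reduces to a subtraction and a threshold; a minimal sketch on synthetic "bands" (not Landsat TM data):

```python
# A minimal sketch of image differencing for change detection. The two
# arrays are toy stand-ins for co-registered bands from two dates.
import numpy as np

rng = np.random.default_rng(2)
band_t1 = rng.normal(100, 10, size=(256, 256))   # e.g. TM band, date 1
band_t2 = band_t1 + rng.normal(0, 3, size=(256, 256))
band_t2[100:140, 100:140] += 40                  # simulated land-cover change

diff = band_t2 - band_t1
# Threshold at mean +/- k standard deviations (a common convention).
k = 2.0
changed = np.abs(diff - diff.mean()) > k * diff.std()
print("changed pixels:", int(changed.sum()))
```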
10.
Since the 1980s, the development of information and communication technologies (ICTs) has greatly changed people’s modes of production and lifestyle, and it has also had a significant influence on traditional social structures. Microblogs – a type of social media application such as Twitter or Weibo – have served as an important platform for network governance in some local governments in China. This study attempts to answer the following questions: What types of strategies should governments implement on social media platforms during public emergencies? What are the effects of these strategies? Based on the case of the Shifang Incident, a large-scale environmental protest that occurred in Shifang, China in 2012, we analyze all the messages posted during the incident on the official microblog of the Shifang government and examine the public feedback using an online big data analysis tool. In line with the time sequence and the extent of the conflict, we divide the Shifang Incident into three phases: the fermentation period, the confrontation period, and the digestion period. In addition, we classify government strategies on social media into five categories: introducing, appealing, explaining, rumor-refuting, and decision-making. The analysis shows that different government strategies were applied in different phases of the incident and that the responses of the public also varied across periods.
11.
This paper deals with the observability analysis of nonlinear tubular bioreactor models. Owing to the lack of tools for observability analysis of nonlinear infinite-dimensional systems, the analysis is performed on a version of the model linearized around a steady-state profile, in which the coefficients can be functions of the spatial variable. The study starts from an example of a tubular bioreactor that serves as a case study throughout the paper. It is shown that such linear models with spatially varying coefficients are Sturm–Liouville systems and that the associated linear infinite-dimensional system dynamics are described by a Riesz-spectral operator that generates a C₀ (strongly continuous) semigroup. The observability analysis, based on infinite-dimensional system theory, shows that any finite number of dominant modes of the system can be made observable by an approximate point measurement.
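The modal observability argument can be sketched in a few lines (a sketch under assumed notation, not the paper's exact equations). For a Riesz-spectral operator A with eigenvalues lambda_n, eigenfunctions phi_n and biorthogonal sequence psi_n, the solution and the output of a point sensor at z_0 are

```latex
% Sketch only; notation assumed, not the paper's exact equations.
\[
  x(t) = \sum_{n \ge 1} e^{\lambda_n t} \langle x(0), \psi_n \rangle \, \phi_n ,
  \qquad
  y(t) = x(t)(z_0)
       = \sum_{n \ge 1} e^{\lambda_n t} \langle x(0), \psi_n \rangle \, \phi_n(z_0) ,
\]
% so a finite set of dominant modes (with distinct eigenvalues) is observable
% from y precisely when \phi_n(z_0) \neq 0 for every retained mode, which is
% why an approximate point measurement suffices.
```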
12.
Geologically, La Paz is located in an unstable area, and throughout the city’s history many landslides have destroyed houses and valuable infrastructure. In recent decades, time-series Interferometric Synthetic Aperture Radar (InSAR) techniques have demonstrated a great capacity for detecting slow ground displacement, achieving millimetre-level accuracy. To improve landslide monitoring of La Paz, in this study Sentinel-1 SAR images were processed with the Persistent Scatterer Interferometry (PSI) and Small Baseline Subset (SBAS) techniques. The datasets span March 2015 to August 2016. Both ascending and descending Synthetic Aperture Radar (SAR) images were processed to obtain the line-of-sight (LOS) ground velocity, and the results were then combined to estimate the up-down and east-west displacement. Several active movement areas have been identified, showing surface velocities of up to 158 mm year⁻¹ westward and 49 mm year⁻¹ eastward. Furthermore, two important findings emerged. First, the InSAR results detected movement on Auquisamaa hill before the area collapsed (15 February 2017), burying five houses. Second, the InSAR results identified remaining unstable sites in the Callapa area, where a mega-landslide destroyed more than a thousand houses in February 2011. In conclusion, we have verified that InSAR technology can be a very useful tool to help La Paz public institutions improve urban planning, delimit landslide areas and mitigate landslide risk.
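Combining ascending and descending LOS velocities into east-west and vertical components amounts to solving a small linear system per pixel. A minimal sketch follows; the incidence angles and sign conventions are illustrative assumptions (they depend on the viewing geometry), and the north-south component is neglected, as is usual for near-polar SAR orbits.

```python
# A minimal sketch of asc/dsc LOS decomposition into east and up components.
# Angles and signs are illustrative assumptions, not the paper's values.
import numpy as np

theta_asc, theta_dsc = np.radians(39.0), np.radians(34.0)   # incidence angles
# LOS unit-vector coefficients for (east, up); the sign of the east term
# flips between ascending and descending viewing geometries.
G = np.array([[-np.sin(theta_asc), np.cos(theta_asc)],
              [ np.sin(theta_dsc), np.cos(theta_dsc)]])

d_los = np.array([-12.0, 7.0])            # mm/yr, asc and dsc (toy values)
d_east, d_up = np.linalg.solve(G, d_los)  # 2x2 system, solved per pixel
print(f"east: {d_east:.1f} mm/yr, up: {d_up:.1f} mm/yr")
```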
13.
14.
15.
Classification management for grassland using MODIS data: a case study in the Gannan region, China
Xia Cui, Tian Gang Liang, Yu Ying Shen, Xing Yuan Liu, Yong Liu 《International journal of remote sensing》2013,34(10):3156-3175
Classification of grasslands is a convenient way to measure and manage the sustainability of Chinese grasslands. In this study, a timely and reliable procedure was examined using remote-sensing (RS) techniques. Linear regression analysis between field survey data and Moderate Resolution Imaging Spectroradiometer (MODIS) data showed that, among the 17 vegetation indices (VIs) evaluated, the enhanced vegetation index (EVI) was the best VI for simulating forage dry biomass and cover in the Gannan region. Precision estimation of the models showed that power and logarithmic regressions satisfactorily simulated grassland dry biomass and grassland cover, respectively. The index of classification management of grasslands (ICGs) was used to subdivide the grasslands of the Gannan region into conservation grasslands and moderately productive grasslands; no grasslands fell into the intensively productive category. Conservation grasslands accounted for 2.04% of the available grasslands, whereas moderately productive grasslands made up 97.96%, which is related to the history of grassland use and the per capita income in the Gannan region. This study proposes that the areas of conservation grasslands and moderately productive grasslands are determined by increases in per capita income and changes in the human use of grasslands.
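A minimal sketch of the two regression forms the study retained: a power model for dry biomass and a logarithmic model for cover, fitted by least squares to hypothetical EVI / field-survey pairs (the real coefficients come from the Gannan survey data, which are not reproduced here).

```python
# A minimal sketch of power and logarithmic regression fits; all data and
# coefficients are toy stand-ins for the Gannan field-survey/EVI pairs.
import numpy as np

rng = np.random.default_rng(3)
evi = rng.uniform(0.1, 0.6, 50)                           # MODIS EVI (toy)
biomass = 3000 * evi ** 1.4 * rng.lognormal(0, 0.1, 50)   # kg/ha (toy)
cover = 80 + 20 * np.log(evi) + rng.normal(0, 2, 50)      # percent (toy)

# Power model: ln(biomass) = ln(a) + b*ln(EVI)  -> linear in log-log space.
b_pow, ln_a = np.polyfit(np.log(evi), np.log(biomass), 1)
# Logarithmic model: cover = a + b*ln(EVI).
b_log, a_log = np.polyfit(np.log(evi), cover, 1)

print(f"biomass ~ {np.exp(ln_a):.0f} * EVI^{b_pow:.2f}")
print(f"cover ~ {a_log:.1f} + {b_log:.1f} ln(EVI)")
```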
16.
Bo-Wei Chen, Yang-Yen Ou, Chun-Chia Kung, Ding-Ruey Yeh, Seungmin Rho, Jhing-Fa Wang 《Multimedia Tools and Applications》2016,75(9):4851-4865
This study explores the relationship between maternal love and brain regions using functional magnetic resonance imaging (fMRI), and proposes a novel pattern analysis for fMRI based on the discovered brain regions. First, to identify which regions respond to the stimuli, a statistical t-test is applied after the scan. Based on these preliminary regions of interest, the study develops discriminant features extracted from multiple voxels for cognitive modeling. In total, five parameters are used in the time-series and contextual analysis: the proposed blood-oxygen-level-dependent (BOLD) contrast edge, the BOLD contrast centroid, the activated voxels, the mean, and the variance. The study also proposes a variance-based test function for examining voxel activation, so that insignificant voxels and irrelevant outliers can be removed from the features. After feature extraction from the brain regions of interest, the analysis uses Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) to reduce the feature size. Lastly, a computer-aided pattern recognizer, the Support Vector Machine (SVM), is adopted to automate the proposed analysis. A dataset of brain-scanning images from 22 subjects was used for evaluation. The statistical results show that the neural circuitry associated with maternal bonds indeed appears in the brain regions indicated by other research; these regions are subsequently used to assess the proposed analysis. The classification results show that the proposed approach can effectively identify activated samples, and the system achieves an accuracy as high as 83.33%. A comparison among different systems shows that the proposed system is superior to the others, establishing its feasibility.
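The classification stage (features, then PCA, then LDA, then SVM) can be sketched with scikit-learn; the arrays below are random stand-ins for voxel features, not fMRI data.

```python
# A minimal sketch of the PCA -> LDA -> SVM classification stage on random
# stand-in "voxel features"; shapes loosely follow the 22-subject setting.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(44, 500))          # 22 subjects x 2 conditions, 500 voxels
y = np.repeat([0, 1], 22)               # 0 = control, 1 = maternal-love stimulus
X[y == 1, :40] += 0.6                   # weak signal in a subset of voxels

clf = make_pipeline(PCA(n_components=20),
                    LinearDiscriminantAnalysis(n_components=1),
                    SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())
```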
17.
Today’s security threats, such as malware, are more sophisticated and targeted than ever, and they are growing at an unprecedented rate. Various approaches have been introduced to deal with them. One of them is signature-based detection, an effective and widely used method for detecting malware; however, it has a substantial problem with new instances: it is only useful once a malware sample has already been seen. Given the rapid proliferation of malware and the heavy human effort needed to extract signatures, this approach is a tedious solution; thus, an intelligent malware detection system is required to deal with new malware threats. Most intelligent detection systems use data mining techniques to distinguish malware from benign programs. One of the pivotal phases of these systems is extracting features from malware samples and benign ones in order to build a learning model. This phase is called “malware analysis” and plays a significant role in these systems. Since the API call sequence is an effective feature for recognising unknown malware, this paper focuses on extracting this feature from executable files. There are two major kinds of approach to analysing an executable file. The first is “static analysis”, which analyses a program at the source code level. The second is “dynamic analysis”, which extracts features by observing the program’s activities, such as system requests, during its execution. Static analysis has to traverse the program’s execution paths in order to find the called APIs; because it lacks sufficient information about the decision-making points in the executable, it cannot extract the real sequence of called APIs. Dynamic analysis does not have this drawback, but it suffers from execution overhead, so the feature extraction phase takes considerable time. In this paper, a novel hybrid approach, HDM-Analyser, is presented that combines the advantages of dynamic and static analysis to raise speed while preserving accuracy at a reasonable level. HDM-Analyser is able to predict the majority of decision-making points by using statistical information gathered through dynamic analysis; therefore, there is no execution overhead at scan time. The main contribution of this paper is taking the accuracy advantage of dynamic analysis and incorporating it into static analysis. In fact, the execution overhead is absorbed in the learning phase and thus does not burden the feature extraction performed during scanning. The experimental results demonstrate that HDM-Analyser attains better overall accuracy and time complexity than static and dynamic analysis methods alone.
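A heavily simplified sketch of the hybrid idea, not HDM-Analyser itself: branch frequencies observed during dynamic runs are used to resolve decision points while statically walking a (hypothetical) control-flow graph, so a likely API-call sequence is obtained without executing the sample. All node names, API names and statistics are invented for illustration.

```python
# A heavily simplified sketch of the hybrid static/dynamic idea; everything
# here (CFG, APIs, branch statistics) is hypothetical.
from collections import Counter

# Hypothetical CFG: node -> (api_call_or_None, successor nodes)
CFG = {
    "entry": (None,           ["b1"]),
    "b1":    ("CreateFileA",  ["b2", "b3"]),   # decision point
    "b2":    ("WriteFile",    ["b4"]),
    "b3":    ("RegSetValueA", ["b4"]),
    "b4":    ("ExitProcess",  []),
}

# Branch statistics gathered dynamically on a training corpus.
branch_stats = {"b1": Counter({"b2": 17, "b3": 3})}

def predicted_api_sequence(cfg, stats, node="entry"):
    """Walk the CFG statically, resolving branches by dynamic statistics."""
    seq = []
    while True:
        api, succ = cfg[node]
        if api:
            seq.append(api)
        if not succ:
            return seq
        # Take the statistically most frequent branch at decision points.
        node = stats[node].most_common(1)[0][0] if node in stats else succ[0]

print(predicted_api_sequence(CFG, branch_stats))
# -> ['CreateFileA', 'WriteFile', 'ExitProcess']
```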
18.
Olivier Stasse, R. Ruland, F. Lamiraux, A. Kheddar, K. Yokoi, W. Prinz 《Intelligent Service Robotics》2009,2(3):153-160
This paper illustrates, through a practical example, the integration of a humanoid robotic architecture with an open-platform collaborative working environment called BSCW (Be Smart - Cooperate Worldwide). BSCW is primarily designed as a futuristic shared workspace system for humans. We show how a complex robotic system (such as a humanoid robot) can be integrated as a proactive collaborative agent that provides services and interacts with other agents sharing the same collaborative workspace. Indeed, the robot is seen as a ‘user’ of BSCW that is able to handle simple tasks and report on their achievement status. We emphasise the importance of using standard software such as CORBA (Common Object Request Broker Architecture) in order to easily build interfaces between several interacting complex software layers, from real-time constraints up to basic Internet data exchange.
19.
Authenticity is a substantial matter and a current concern of the organic food industry. Organic foods are appreciated by customers for their health benefits and environmental friendliness. Currently, however, the most common way for customers to confirm that the food they are buying is organic is through certificates and label information, which can be fraudulent. It is therefore interesting to gain insight into organic food composition and identify which mineral components are fundamental in differentiating organic from conventional food. This work addresses these problems using data mining concepts and techniques in a comparative study of organic and conventional food, focusing on grape juice; the proposed methodology can be adapted for the analysis of other types of organic food. The article presents a data mining analysis of the elemental composition of 37 grape juice samples collected from different locations in Brazil. The elemental composition was determined by inductively coupled plasma mass spectrometry (ICP-MS), and 44 elements were determined in the two types of samples, organic and conventional grape juice. Special effort was devoted to selecting the variables (elements) that best describe each type of grape juice. Predictive models based on support vector machines, neural networks and decision trees were developed to successfully differentiate organic from conventional grape juice samples. According to the F-score, chi-square and Random Forest importance variable-selection measures, the elements Na, Sn, P, K, Sm and Nd are among the most important variables for the differentiation; in particular, Na, Sn and K were ranked first, second or third by at least two methods. On the other hand, all the variable-selection methods indicated that Ag, Zn, Cr, Be and Pd were among the least important variables for differentiating organic from conventional grape juices. The SVM yielded an accuracy of 89.18%, while both CART and MLP achieved 86.48%.
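A minimal sketch of the variable-selection and classification workflow on random stand-in data (37 samples by 44 elements, matching the paper's setting); chi-square and Random Forest importance are two of the three selection measures named above.

```python
# A minimal sketch of feature ranking plus SVM classification; the data are
# random stand-ins, not the Brazilian ICP-MS measurements.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.gamma(2.0, 1.0, size=(37, 44))    # element concentrations (toy, >= 0)
y = rng.integers(0, 2, size=37)           # 0 = conventional, 1 = organic
X[y == 1, :6] *= 1.8                      # pretend six elements differ

# Chi-square ranking (requires non-negative features, as here).
chi2_top = np.argsort(SelectKBest(chi2, k=6).fit(X, y).scores_)[::-1][:6]
# Random Forest importance ranking.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
rf_top = np.argsort(rf.feature_importances_)[::-1][:6]

print("chi2 top-6:", chi2_top, "RF top-6:", rf_top)
print("SVM accuracy:", cross_val_score(SVC(), X[:, chi2_top], y, cv=5).mean())
```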
20.
Position error signal generation in hard disk drives based on a field programmable gate array (FPGA)
The position error signal (PES) in current hard disk drives is generated from the embedded servo data and used as the input to the track-following controller. The servo pattern design and decoding are both quite complicated in terms of servo writing and servo detection, but they are important for studying the system dynamics and designing the track-following controller. In this paper, a novel scheme based on the discrete Fourier transform (DFT) is proposed to decode the servo signal from a special magnetic servo pattern and generate the PES. In this scheme, adjacent magnetic tracks with different frequencies are recorded on the disk and used as servo tracks to encode the position information. Simulation results show that the amplitudes at the two writing frequencies in the readback spectrum depend on the magnetic head position, and the quadrature PES, defined as the difference of the amplitudes, is almost linear between the two adjacent tracks. The simulation and off-line experimental results agree with each other and prove the feasibility of the scheme. A real-time signal acquisition and processing system with a commercial field programmable gate array (FPGA) and ADC/DAC chips was built, and the proposed scheme was implemented in the FPGA for high-speed signal analysis. The magnetic head position information is extracted from the readback spectrum in the FPGA and transferred to a PC host for real-time graphic display through a LabVIEW interface. The system demonstrates an ability to generate the PES at 25,000 samples per second with a resolution of around 3 nm; the sampling rate can be enhanced further to 125 kHz if more servo sectors are written to the disk. This system provides a reconfigurable research stage for studying the dynamic behaviour of hard disk drives and for developing track-following control algorithms.
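The core of the scheme is easy to reproduce numerically: a readback signal mixing two servo tones in proportion to head position, DFT amplitudes at the two frequencies, and their difference as the PES. A minimal sketch with illustrative parameters (sample rate, tone frequencies and sector length are assumptions, not the paper's values):

```python
# A minimal sketch of two-frequency PES demodulation via the DFT; all
# parameters are illustrative. Tones are bin-aligned to avoid leakage.
import numpy as np

fs, n = 1.0e6, 1024                        # sample rate (Hz), samples/sector
f1, f2 = 50 * fs / n, 75 * fs / n          # tones written on adjacent tracks
t = np.arange(n) / fs

def pes(head_pos):
    """head_pos in [0, 1]: 0 = centred on track 1, 1 = centred on track 2."""
    readback = ((1 - head_pos) * np.sin(2 * np.pi * f1 * t)
                + head_pos * np.sin(2 * np.pi * f2 * t))
    spectrum = np.abs(np.fft.rfft(readback)) / (n / 2)   # sine amplitudes
    freqs = np.fft.rfftfreq(n, 1 / fs)
    a1 = spectrum[np.argmin(np.abs(freqs - f1))]
    a2 = spectrum[np.argmin(np.abs(freqs - f2))]
    return a2 - a1                          # quadrature PES

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"head position {x:.2f} -> PES {pes(x):+.2f}")
```

With bin-aligned tones the computed PES is exactly linear in head position, matching the near-linearity the abstract reports between adjacent tracks.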