Similar Documents
20 similar documents found (search time: 31 ms)
1.
Context: The context of this research is software process improvement (SPI) success factors for small and medium Web companies.
Objective: The primary objective of this paper is to propose a theoretical framework of SPI success factors for small and medium Web companies.
Method: The theoretical framework presented in this study aggregated the results of three previous research phases by applying principles of theoretical integration and comparative analysis. Those three phases were all empirical in nature and comprise: a systematic review of SPI in small and medium Web companies [1], [2]; a replication study [3]; and a grounded theory-based initial exploratory framework of factors in small and medium Web companies [4].
Results: The theoretical framework includes 18 categories of SPI success factors, 148 properties of these categories, and 25 corresponding relationships, which bind these categories together. With the help of these relationships, the categories and properties of SPI success factors can be directly translated into a set of guidelines, which practitioners in small and medium Web companies can then use to improve the current state of SPI in their companies and achieve overall company success.
Conclusion: The comprehensive theoretical framework of SPI success factors presented herein provides evidence regarding key factors for predicting SPI success in small and medium Web companies. The framework can be used as a baseline for successful implementation of SPI initiatives in this domain.

2.
Context: Model-Driven Development (MDD) is a paradigm that prescribes building conceptual models that abstractly represent the system and generating code from these models through transformation rules. The literature is rife with claims about the benefits of MDD, but they are hardly supported by evidence.
Objective: This experimental investigation aims to verify some of the most cited benefits of MDD.
Method: We ran an experiment on a small set of classes, using student subjects, to compare the quality, effort, productivity, and satisfaction of traditional development and MDD. The participants built two web applications from scratch: one in which the developers implemented the code by hand, and another using an industrial MDD tool that automatically generates the code from a conceptual model.
Results: Outcomes show no significant differences between the two methods with regard to effort, productivity, and satisfaction, although quality in MDD is more robust to small variations in problem complexity. We discuss possible explanations for these results.
Conclusions: For small systems and less programming-experienced subjects, MDD does not always yield better results than a traditional method, even regarding effort and productivity. This contradicts some previous statements about MDD advantages. The benefits of developing a system with MDD appear to depend on certain characteristics of the development context.

3.
Abstract. We generalize the construction of Gabber and Galil to essentially every unimodular matrix in SL2(Z). It is shown that every parabolic or hyperbolic fractional linear transformation explicitly defines an expander of bounded degree and constant expansion. Thus all but a vanishingly small fraction of unimodular matrices define expanders.

4.
Context: Diagnosing processes in a small company requires process assessment practices that give qualitative and quantitative results and offer an overall view of process capability. The purpose is to obtain relevant information about how processes run, for use in their control and improvement. However, small organizations face problems in running process assessments, due to their specific characteristics and limitations.
Objective: This paper presents a methodology for assessing software processes which assists the activity of software process diagnosis in small organizations. It attempts to address issues such as the facts that: (i) process assessment is expensive and typically requires major company resources; and (ii) many light assessment methods do not provide information detailed enough for diagnosing and improving processes.
Method: To achieve this, the METvalCOMPETISOFT assessment methodology was developed. This methodology: (i) incorporates the strategy of internal assessments known as rapid assessment, meaning that these assessments do not take up too much time, use an excessive quantity of resources, or apply excessive rigor; and (ii) meets all the requirements described in the literature for an assessment proposal customized to the typical features of small companies.
Results: This paper also describes the experience of applying this methodology in eight small software organizations that took part in the COMPETISOFT project. The results obtained show that this approach yields reliable information about the strengths and weaknesses of software processes, along with information for companies on opportunities for improvement.
Conclusion: The proposed assessment methodology sets out the elements needed to assist with diagnosing processes in small organizations step by step, while seeking to make its application economically feasible in terms of resources and time. From this initial application, the assessment methodology appears useful, practical, and suitable for diagnosing processes in this type of organization.

5.
Abstract

The application of nitrogen to small plots of barley increased leaf chlorophyll concentrations and produced darker green canopies. Leaf pigment concentrations were estimated independently of crop biomass using a band-pass radiometer to view the canopy at an oblique angle to minimize the influence of the soil background.

6.
Abstract

Information-based organizations are structured to function with as small and efficient a staff as possible. To this end, executives at The Promus Companies are using IT to spread decision-making authority and responsibility for customer satisfaction and customer service throughout the organization.

7.
Abstract

This article addresses the primary threats to computer networks that a small business might encounter and also provides strategies to counter these threats. It emphasizes the key characteristics associated with each category of security threat and provides approaches to eliminate or alleviate these threats. The article also presents a case study of a small insurance company for which the authors helped design, implement and secure computer networks. This case study further clarifies the concepts and strategies presented in the paper.

8.
Context: More and more, small and medium-sized enterprises (SMEs) are using software to augment the functionality of their products and offerings. Variability management of software is becoming an interesting topic for SMEs with expanding portfolios and increasingly complex product structures. While the use of software product lines to resolve high variability is well known in larger organizations, less is known about the practices in SMEs.
Objective: This paper presents results from a survey of software-developing SMEs. The purpose of the paper is to provide a snapshot of the current awareness and practices of variability modeling in organizations that develop software under the constraints present in SMEs.
Method: A survey with questions about variability practices was distributed to software-developing organizations in a region of Sweden that has many SMEs. The response rate was 13%, and 25 responses are used in this analysis.
Results: We find that, although some SMEs develop implicit software product lines and have relatively sophisticated variability structures for their solution space, the structures of the problem space and the product space have room for improvement.
Conclusions: The survey answers indicate that SMEs are in situations where they can benefit from more structured variability management, but awareness needs to be raised. Even though problem-space similarity is high, few reuse and traceability activities are performed. The existence of SMEs with qualified variability management and product-line practices indicates that small organizations are capable of applying such practices.

9.
Background: The application of microarray data for cancer classification is important. Researchers have tried to analyze gene expression data using various computational intelligence methods.
Purpose: We propose a novel method for gene selection that uses particle swarm optimization combined with a decision tree as the classifier to select a small number of informative genes, from the thousands of genes in the data, that can contribute to identifying cancers.
Conclusion: Statistical analysis of experiments on 11 gene expression cancer datasets reveals that our proposed method outperforms other popular classifiers, i.e., support vector machine, self-organizing map, back-propagation neural network, and C4.5 decision tree.
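The wrapper scheme this abstract describes can be sketched as a minimal binary PSO whose particles encode gene subsets, scored by a decision-tree classifier. The synthetic dataset, swarm parameters, and subset-size penalty below are illustrative assumptions, not the authors' experimental setup:

```python
# Sketch: binary PSO gene selection with a decision-tree fitness function.
# Only genes 0 and 1 of the synthetic data are informative by construction.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_genes = 80, 40
X = rng.normal(size=(n_samples, n_genes))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # labels depend on genes 0 and 1

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(DecisionTreeClassifier(random_state=0),
                          X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.01 * mask.sum()               # assumed penalty on subset size

n_particles, n_iter = 10, 15
pos = (rng.random((n_particles, n_genes)) < 0.2).astype(float)
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))            # sigmoid -> bit-set probability
    pos = (rng.random(pos.shape) < prob).astype(float)
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()].copy()

selected = np.flatnonzero(gbest)
print("selected genes:", selected)
```

On real microarray data the fitness would use the cancer dataset in place of the synthetic matrix; the penalty term is what drives the swarm toward a small informative subset.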

10.
Hyperspectral decision-fusion classification using PCA and a moving-window wavelet transform
Objective: Hyperspectral data have high spectral resolution and strong inter-band correlation, which complicates classification. To improve classification accuracy, a hyperspectral decision-fusion classification algorithm combining PCA with a moving-window wavelet transform is proposed. Method: First, the bands of the original hyperspectral data are grouped using the correlation-coefficient matrix; then principal component analysis reduces the spectral dimensionality of each group; next, spatial features are extracted with the proposed moving-window wavelet transform; finally, the outputs of the multiple classifiers are fused using the linear opinion pool (LOP) decision-fusion rule. Results: In experiments on two datasets from different sensors, the proposed algorithm achieves higher classification accuracy and Kappa coefficients than five existing classification algorithms; compared with SVM-RBF, its accuracy is about 8% higher. Conclusion: The experimental results show that the algorithm fully exploits the spectral-spatial information of hyperspectral images, effectively improves classification accuracy, and also performs well with small training samples and in noisy environments.
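The first two steps of the pipeline, correlation-based band grouping followed by per-group PCA, can be sketched as below. The synthetic cube, the group-splitting threshold, and the component count are assumptions; the moving-window wavelet features and LOP fusion of the paper are not reproduced here:

```python
# Sketch: group hyperspectral bands by inter-band correlation, reduce each
# group with PCA. Bands 0-9, 10-19, 20-29 share a source signal by construction.
import numpy as np

rng = np.random.default_rng(1)
h, w, bands = 16, 16, 30
base = rng.normal(size=(h, w, 3))
cube = np.concatenate([base[:, :, [i]].repeat(10, axis=2) for i in range(3)],
                      axis=2) + 0.1 * rng.normal(size=(h, w, bands))

X = cube.reshape(-1, bands)                      # pixels x bands
corr = np.corrcoef(X, rowvar=False)

# split into groups wherever adjacent-band correlation drops below a threshold
breaks = [0] + [i + 1 for i in range(bands - 1) if corr[i, i + 1] < 0.5] + [bands]
groups = [list(range(breaks[k], breaks[k + 1])) for k in range(len(breaks) - 1)]

def pca_reduce(A, n_comp=1):
    A = A - A.mean(axis=0)
    _, _, vt = np.linalg.svd(A, full_matrices=False)
    return A @ vt[:n_comp].T                     # principal-component scores

features = np.hstack([pca_reduce(X[:, g]) for g in groups])
print("groups:", groups, "feature shape:", features.shape)
```

In the full algorithm, each group's reduced bands would then feed a spatial feature extractor and a per-group classifier whose outputs are fused.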

11.
Context: The Next Release Problem involves determining the set of requirements to implement in the next release of a software project. When the problem was first formulated in 2001, Integer Linear Programming, an exact method, was found to be impractical because of large execution times. Since then, the problem has mainly been addressed with metaheuristic techniques.
Objective: In this paper, we investigate whether the single-objective and bi-objective Next Release Problem can be solved exactly, and how to better approximate the results when exact resolution is costly.
Methods: We revisit Integer Linear Programming for the single-objective version of the problem. In addition, we integrate it within the epsilon-constraint method to address the bi-objective problem. We also investigate how the Pareto front of the bi-objective problem can be approximated by an anytime deterministic Integer Linear Programming-based algorithm when results are required within strict runtime constraints. Comparisons are carried out against NSGA-II. Experiments are performed on a combination of synthetic and real-world datasets.
Findings: We show that a modern Integer Linear Programming solver is now a viable method for this problem. Large single-objective instances and small bi-objective instances can be solved exactly very quickly. On large bi-objective instances, execution times can be significant when calculating the complete Pareto front; however, good approximations can be found effectively.
Conclusion: This study suggests that (1) approximation algorithms can be discarded in favor of the exact method for single-objective instances and small bi-objective instances, (2) the Integer Linear Programming-based approximate algorithm outperforms the NSGA-II genetic approach on large bi-objective instances, and (3) the run times of both methods are low enough for use in real-world situations.
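A toy single-objective instance makes the ILP formulation concrete: choose requirements maximizing stakeholder value under a cost budget. The values, costs, and budget are invented, and SciPy's MILP solver stands in for the industrial solvers the study discusses:

```python
# Sketch: the single-objective Next Release Problem as a 0/1 integer program.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

value = np.array([10, 6, 8, 4, 7], dtype=float)   # value of each requirement
cost = np.array([5, 3, 4, 2, 5], dtype=float)     # implementation cost
budget = 10

res = milp(c=-value,                              # milp minimizes, so negate value
           constraints=LinearConstraint(cost, 0, budget),
           integrality=np.ones(len(value)),       # every variable integer...
           bounds=Bounds(0, 1))                   # ...and restricted to {0, 1}

selected = np.flatnonzero(res.x > 0.5)
total_value = int(value[selected].sum())
print("selected requirements:", selected, "value:", total_value,
      "cost:", int(cost[selected].sum()))
```

The bi-objective version in the paper additionally minimizes cost; in the epsilon-constraint method, one objective is turned into a bound like the budget constraint above and swept over its range.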

12.
Context: A software product line is a family of related software products, typically created from a set of common assets. Users select features to derive a product that fulfills their needs. Users often expect a product to have specific non-functional properties, such as a small footprint or a bounded response time. Because a product line may have an exponential number of products with respect to its features, it is usually not feasible to generate and measure non-functional properties for each possible product.
Objective: Our overall goal is to derive optimal products with respect to non-functional requirements by showing customers which features must be selected.
Method: We propose an approach to predict a product's non-functional properties based on its feature selection. We aggregate the influence of each selected feature on a non-functional property to predict a product's properties. We generate and measure a small set of products and, by comparing measurements, approximate each feature's influence on the non-functional property in question. As a research method, we conducted controlled experiments and evaluated prediction accuracy for the non-functional properties footprint and main-memory consumption; in principle, however, our approach is applicable to all quantifiable non-functional properties.
Results: With nine software product lines, we demonstrate that our approach predicts the footprint with an average accuracy of 94%, and over 99% on average when feature interactions are known. In a further series of experiments, we predicted the main-memory consumption of six customizable programs and achieved an accuracy of 89% on average.
Conclusion: Our experiments suggest that, with only a few measurements, it is possible to accurately predict non-functional properties of the products of a product line. Furthermore, we show how even a little domain knowledge can improve predictions, and we discuss trade-offs between accuracy and the required number of measurements. With this technique, we provide a basis for many reasoning and product-derivation approaches.
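The aggregation idea can be sketched as a purely additive model: measure a few sample configurations, estimate each feature's influence on the property by least squares, then predict unmeasured products. The feature names and the hidden "true" footprint model are invented for illustration and ignore the feature interactions the paper also handles:

```python
# Sketch: estimate per-feature footprint influence from a few measurements,
# then predict a new product's footprint additively.
import numpy as np

features = ["base", "compress", "encrypt", "log"]          # hypothetical features
true_influence = np.array([100.0, 20.0, 35.0, 5.0])        # hidden ground truth (KB)

def measure(config):            # stands in for building and measuring a product
    return float(np.dot(config, true_influence))

# a small sample of configurations (rows: feature selections, base always on)
sample = np.array([[1, 0, 0, 0],
                   [1, 1, 0, 0],
                   [1, 0, 1, 0],
                   [1, 0, 0, 1]], dtype=float)
footprints = np.array([measure(c) for c in sample])

influence, *_ = np.linalg.lstsq(sample, footprints, rcond=None)

new_product = np.array([1, 1, 1, 0], dtype=float)          # base+compress+encrypt
predicted = float(np.dot(new_product, influence))
print("predicted footprint:", predicted, "actual:", measure(new_product))
```

Real product lines deviate from pure additivity when features interact, which is why the paper's accuracy rises once interactions are measured separately.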

13.
David Kahn, Cryptologia, 2013, 37(3): 254-256
Abstract

This article describes the operation of a small U.S. Navy station in post-World War II China that primarily intercepted Soviet naval traffic as part of the worldwide BOURBON project targeting all Soviet communications systems.

14.
EDPACS, 2013, 47(2): 12-20
Abstract

Much of the audit literature focuses on Fortune 1000 control and infrastructure issues. As a result, IT reviews of small firms or branch offices of larger firms sometimes inappropriately retrofit large-scale control and security practices onto small-office IT architectures. This article outlines a broad spectrum of practices that are practical for an office or plant with 10 to 50 employees. By focusing on size-appropriate practices, the auditor can more effectively promote day-to-day security and operational efficiency for these organizations.

15.
This paper presents a smooth control strategy for the regulation problem of an uncertain system, one that assures uniform ultimate boundedness of the closed-loop system inside a neighbourhood of the zero state. This neighbourhood can be made arbitrarily small. To this end, a class of nonlinear proportional-integral (PI) controllers was designed. The behaviour of this controller closely emulates a sliding-mode controller; to accomplish this, saturation functions were combined with a traditional PI controller. The controller did not need high gain or sliding modes to achieve robustness against unmodelled persistent perturbations. The closed-loop solution obtained converges in finite time to a small vicinity of the origin. The corresponding stability and convergence analysis was carried out with the traditional Lyapunov method. Numerical simulations were performed to assess the effectiveness of the controller.
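The idea of combining saturation functions with PI action can be sketched in a small simulation: a saturated PI controller regulating a first-order plant with a constant unmodelled perturbation. The plant, gains, and saturation limits are illustrative assumptions, not the paper's system or its exact controller:

```python
# Sketch: PI control with saturated proportional and integral actions,
# regulating x' = -x + u + d to the origin despite a constant disturbance d.
import numpy as np

def sat(u, limit):                                # saturation function
    return np.clip(u, -limit, limit)

dt, steps = 0.001, 40000
kp, ki, p_lim, i_lim = 8.0, 40.0, 2.0, 1.0
x, integ, disturbance = 1.0, 0.0, 0.5             # start away from the origin

for _ in range(steps):
    e = -x                                        # regulation error (target 0)
    integ = sat(integ + e * dt, i_lim / ki)       # clamp to avoid integrator windup
    u = sat(kp * e, p_lim) + ki * integ           # saturated PI control law
    x += (-x + u + disturbance) * dt              # Euler step of the plant

print("final state:", round(x, 6))
```

The integral term learns to cancel the disturbance while both actions stay bounded, which is what lets the saturated controller mimic sliding-mode robustness without high gain.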

16.
Objective: In this paper, we present findings from an empirical study aimed at identifying the relative "perceived value" of CMMI level 2 specific practices, based on the perceptions and experiences of practitioners in small and medium-sized companies. The objective of this study is to identify the extent to which a particular CMMI practice is used, in order to develop a finer-grained framework that encompasses the notion of perceived value within specific practices.
Method: We used face-to-face, questionnaire-based survey sessions as the main approach to collecting data from 46 software development practitioners in Malaysia and Vietnam. We asked practitioners to choose and rank CMMI level 2 practices against five types of assessment (high, medium, low, zero, or do not know). From this, we propose the notion of "perceived value" associated with each practice.
Results: We identified three "requirements management" practices as having a high perceived value. The results also reveal similarities and differences in the perceptions of Malaysian and Vietnamese practitioners with regard to the relative values of different practices in CMMI level 2 process areas.
Conclusions: Small and medium-sized companies should not be seen as being "at fault" for not adopting CMMI; instead, Software Process Improvement (SPI) implementation approaches and their transition mechanisms should be improved. We argue that research into "tailoring" existing process capability maturity models may address some of the issues facing small and medium-sized companies.

17.
The errors that arise in a quantum channel can be corrected perfectly if and only if the channel does not decrease the coherent information of the input state. We show that, if the loss of coherent information is small, then approximate quantum error correction is possible. PACS: 03.67.H, 03.65.U

18.
Objective: To address the serious false and missed detections of existing dominant-color extraction methods, as well as their requirement of a fixed number of dominant colors, a histogram peak selection and elimination algorithm for dominant-color extraction is proposed, based on an analysis of what the dominant-color feature means. Method: First, a robust color histogram of the image is computed from the spatial aggregation of pixels, and its local peaks are extracted to form a candidate dominant-color set. Then the candidates are iteratively filtered according to the number of member pixels of each candidate, their spatial distribution, and the number of similar pixels they share. Finally, an elimination step removes candidates with too few member pixels, an overly scattered spatial distribution, or too little difference from other candidates, yielding the final dominant colors. In addition, to overcome the one-sidedness of existing evaluation methods, a comprehensive dominant-color evaluation model is designed that reflects all factors influencing dominant colors. Results: Extensive experiments show that the dominant colors extracted by the proposed algorithm represent the color features of images more effectively than existing methods, and its average evaluation score is 1.1 times that of the best existing algorithm, a relative improvement of about 10 percentage points. Conclusion: Given its superior performance, the algorithm has considerable potential for application in image retrieval, segmentation, and editing.
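A heavily simplified version of this pipeline can be sketched as: build a quantized color histogram, take its strongest bins as candidate dominant colors, then discard candidates that cover too few pixels or lie too close to a stronger candidate. Bin counts and thresholds are assumptions, and the spatial-coherence weighting of the paper's robust histogram is omitted:

```python
# Sketch: dominant colors via quantized-histogram peak selection and elimination.
import numpy as np

rng = np.random.default_rng(2)
# synthetic "image": two dominant colors plus sparse random noise pixels
img = np.vstack([np.full((500, 3), [208, 16, 16]),
                 np.full((400, 3), [16, 16, 208]),
                 rng.integers(0, 256, size=(50, 3))]).astype(float)
img += rng.normal(0, 5, img.shape)               # small per-pixel variation

bins = 8                                          # 8 quantization levels per channel
q = np.clip((img // (256 // bins)).astype(int), 0, bins - 1)
codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
hist = np.bincount(codes, minlength=bins ** 3)

order = np.argsort(hist)[::-1]
dominant = []
for c in order:
    if hist[c] < 0.05 * len(img):                 # too few member pixels: stop
        break
    center = np.array([c // (bins * bins), (c // bins) % bins, c % bins])
    if all(np.abs(center - d).sum() > 2 for d in dominant):
        dominant.append(center)                   # keep: not near a stronger peak

colors = [tuple(d * (256 // bins) + (256 // bins) // 2) for d in dominant]
print("dominant colors (RGB):", colors)
```

The paper's filtering additionally uses each candidate's spatial distribution and shared similar pixels, which this bin-count sketch does not model.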

19.
Abstract

We investigate whether and how personal mathematical knowledge at an advanced level impacts on teaching at a lower school level. We study this in the context of functions because understanding them permeates secondary and advanced mathematics. Textbook treatment of these can be patchy, implying a need for knowledgeable teachers to rectify weaknesses. A small observational study of two mathematically well-qualified teachers, teaching at a range of levels, leads to a deeper understanding of how their personal mathematical knowledge is manifested in their teaching. Finally, we present a theory of pedagogy combining dual development, intellectual necessity, and repeated experience of reasoning, for which these teachers' practices provide illustrations.

20.
Abstract

The heterogeneity of landscapes and the small size of fields have made it difficult to apply space remote sensing techniques to agriculture and forestry. However, with the availability of high-resolution satellite data, research has led to a number of simple applications, and since the mid-1980s some large programmes have been launched, especially for agricultural information.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号