Similar Documents
Found 20 similar documents (search time: 15 ms).
1.
Big Data Analytics (BDA) is an emerging phenomenon with the reported potential to transform how firms manage and enhance high-value business performance. The purpose of our study is to investigate the impact of BDA on operations management in the manufacturing sector, an acknowledged but infrequently researched context. Using an interpretive qualitative approach, this empirical study leverages a comparative case study of three manufacturing companies with varying levels of BDA usage (experimental, moderate and heavy). The information technology (IT) business value literature and a resource-based view informed the development of our research propositions and the conceptual framework that illuminated the relationships between BDA capability and organizational readiness and design. Our findings indicate that BDA capability (in terms of data sourcing, access, integration, and delivery, analytical capabilities, and people’s expertise), along with organizational readiness and design factors (such as BDA strategy, top management support, financial resources, and employee engagement), facilitated better utilization of BDA in manufacturing decision making and thus enhanced high-value business performance. Our results also highlight important managerial implications related to the impact of BDA on the empowerment of employees, and how BDA can be integrated into organizations to augment rather than replace management capabilities. Our research will benefit academics and practitioners by furthering understanding of BDA utilization in transforming operations and production management. It adds to the limited body of empirically based knowledge by highlighting the real business value resulting from applying BDA in manufacturing firms, thus encouraging beneficial economic and societal change.

2.
The main purpose of this study is to examine the factors that are critical to creating business value from business analytics (BA). To that end, we conduct a meta-analysis of 125 firm-level studies spanning ten years of research from across 26 countries. We found evidence that the social factors of BA, such as human resources, management capabilities, and organizational culture, show a greater impact on business value, whereas technical aspects play a minor role in enhancing firm performance. Through these findings, we contribute to the ongoing debate concerning BA business value by synthesizing and validating the findings of the body of knowledge.

3.
Time is central to the purported business value of analytics. Yet research has adopted a simplistic ‘clock’ interpretation of time, ignoring its complex and socially embedded nature. There is also an overemphasis on analytics software rather than on the people using it. Although analytics may be ‘fast’ to realise business value, it must cater to the temporal complexities of the organisations and people using it. Drawing on temporality theory, this study develops temporal factors with which to examine the value of analytics. We also develop a research agenda that identifies opportunities to examine time, temporal personalities and other factors when people use analytics in the organisation.

4.
Big data brings great value but also many network security problems, giving hackers an ever-expanding set of attack strategies. This paper precisely describes the static behaviour of hackers and derives the optimal dynamic attack tactics under certain assumptions. When the hacker allocates resources in proportion to the static probability distribution, the hacker's income is maximized. In particular, under a uniform input-output ratio, every one-bit reduction in the hacker's entropy doubles the hacker's income. Furthermore, this paper studies the optimal combination of hacker attacks and proposes a logarithmic optimal combination strategy in which the hacker attacks several systems simultaneously. This strategy maximizes not only the hacker's overall income but also the income of each attack round. We find that, under the logarithmic optimal combination strategy, the input-output ratio of each system is unchanged at the end of each round, and that if the attacker obtains side information by other means, the growth rate of the additional income does not exceed the mutual information between the input-output ratio of the attacked system and that side information. Moreover, if the time-varying attacked system is a stationary stochastic process, there is an optimal attack growth rate. We conclude that, in the big data era, the more information the hacker obtains, the greater the hacker's income.
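Read in the language of log-optimal (Kelly-type) resource allocation, which is how the abstract's "logarithmic optimal combination" strategy is naturally understood, the entropy claim can be sketched as follows; the notation below is an assumption on my part, not the paper's.
\[
W^{*} \;=\; \max_{b}\,\sum_{i=1}^{m} p_i \log_2\!\bigl(b_i\,o_i\bigr)
\;=\; \log_2 m \;-\; H(p)
\qquad\text{(uniform input-output ratio } o_i = m\text{)},
\]
with the maximum attained at the proportional allocation \(b_i = p_i\), i.e. resources split according to the static probability distribution. Income after \(n\) rounds grows like \(2^{nW^{*}}\), so reducing the entropy \(H(p)\) by one bit raises \(W^{*}\) by one bit and doubles the per-round income. With extra side information \(Y\), the increase in growth rate is bounded by the mutual information, \(\Delta W \le I(X;Y)\).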

5.
It is now just over 50 years since the deployment of LEO, the first business computer and application, in 1951. The paper attempts to look 50 years beyond the birth of LEO in order to discern the nature and effects of business computing in 2051. Scenarios are offered of some possible business applications fifty years hence, including business information systems in space and the nature of manufacturing. The scenarios serve as a basis for addressing a number of issues: the availability of technology to support the scenarios presented, the nature of organizations shaped by future information systems, the nature of employment in the new organizational structure, consumer-vendor relations in the new economy, the effects of the new information technology on the nature of national governments, and the effect of information technologies on the structure of the global economy.

6.
Trond Haga 《AI & Society》2009,23(1):17-31
Incremental change and innovation are often regarded as two separate concepts, with some interconnections. Based on two practical cases that emphasize incremental change and innovation respectively, the article discusses whether a division between these concepts can be maintained. Action research is seen as having a central role, offering an integral working method in industrial networks.

7.
This paper discusses the capability of small firms to comply with legislative demands on risk assessment. The results of a national survey show that only a minor fraction of small firms comply. Two case studies demonstrate that small firms are able to meet the demands. An analysis of these cases leads to some hypotheses on the preconditions for compliance. Many firms need a person to mediate legislative demands, and the qualifications required to fulfil this role are discussed. The conclusion is that it is possible to qualify persons as mediators. Therefore, to stimulate ergonomic activities in small firms, resources for such mediators are required. The occupational health services are able to train staff to undertake this task.

8.
Online reviews have a significant influence on consumers, and consequently firms are motivated to manipulate online reviews to promote their own products. This paper develops an analytical model to systematically explore the impact of online review manipulation on asymmetric firms selling substitutable search products in a competitive market. Results show that a firm's manipulation of online reviews does not necessarily hurt its competitor's profit. In addition, if firms are free to choose whether to manipulate online reviews, both firms will always choose to manipulate. Moreover, there exists a prisoner's dilemma in which online reviews are over-manipulated.
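As a purely schematic illustration of the stated prisoner's dilemma (symbolic payoffs of my own, not the paper's model or parameter values), with N = do not manipulate and M = manipulate:
\[
\begin{array}{c|cc}
 & \text{Firm 2: } N & \text{Firm 2: } M \\ \hline
\text{Firm 1: } N & (\pi_{NN},\,\pi_{NN}) & (\pi_{NM},\,\pi_{MN}) \\
\text{Firm 1: } M & (\pi_{MN},\,\pi_{NM}) & (\pi_{MM},\,\pi_{MM})
\end{array}
\qquad
\pi_{MN} > \pi_{NN} > \pi_{MM} > \pi_{NM},
\]
so manipulating is a dominant strategy for each firm, (M, M) is the equilibrium, and yet both firms would earn more at (N, N); hence reviews end up over-manipulated.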

9.
The estimation of geophysical parameters from Synthetic Aperture Radar (SAR) data necessitates well-calibrated sensors with good radiometric precision. In this paper, the radiometric calibration of the new Advanced Synthetic Aperture Radar (ASAR-ENVISAT) sensor was assessed by comparing ASAR data with ERS-2 and RADARSAT-1 SAR data. By analysing the difference between radar signals of forest stands, the results show differences of varying importance between the ASAR on the one hand, and the ERS-2 and the RADARSAT-1 on the other. For recent data acquired at the end of 2005, the difference varies from −0.72 to +0.72 dB, with temporal variations that can reach 1.1 dB. For older data acquired in 2003 and 2004, we observe a sharp decrease in the radar signal in the range direction, which can attain 3.5 dB. The use of revised calibration constants provided recently by the European Space Agency (ESA) significantly improves the results of the radiometric calibration, where the difference between the ASAR and the other SARs will be lower than 0.5 dB.
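A minimal sketch of this kind of cross-sensor comparison, using hypothetical backscatter values (sigma-nought in dB) averaged over the same forest stands; the numbers, variable names and the calibration constant are illustrative assumptions, not the authors' data or processing chain.

import numpy as np

# Hypothetical mean backscatter (sigma0, dB) per forest stand
asar_db = np.array([-7.1, -6.8, -7.4, -6.9])   # ASAR-ENVISAT
ers2_db = np.array([-6.6, -6.9, -6.8, -7.3])   # ERS-2 reference

# Radiometric bias of ASAR relative to ERS-2 over identical stands
diff = asar_db - ers2_db
print(f"mean bias: {diff.mean():+.2f} dB, spread: {diff.max() - diff.min():.2f} dB")

# A revised calibration constant K (in dB) simply shifts the ASAR values
K = 0.4                                         # hypothetical value
print(f"bias after recalibration: {(asar_db + K - ers2_db).mean():+.2f} dB")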

10.
If a ‘perfect market’ in economic theory ever comes true in the digital world, sellers will not be able to earn profits above marginal cost, and resource allocation will become much more efficient. This paper investigates whether this prediction holds in the electronic marketplace. We observed 2000 prices listed by online and offline retailers. By comparing price levels, price adjustment, and price dispersion among CD (compact disc) retailers, we empirically test whether online retail channels are more efficient than conventional retailers. We found that prices on the Internet are lower than those in conventional outlets. We also found that Internet retailers adjust their prices in much smaller increments than their offline counterparts, in order to respond flexibly to demand and supply conditions. Finally, price dispersion among Internet retailers is lower than dispersion among traditional retail stores. Our findings suggest that the Internet improves market efficiency by lowering the costs of obtaining and disseminating information on products and prices.
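The three comparisons (price level, adjustment increment, dispersion) could be computed along the lines of the sketch below; the data, column names and the coefficient-of-variation measure are my illustrative assumptions, not the study's dataset or exact metrics.

import pandas as pd

# Hypothetical observations: one row per listed CD price
df = pd.DataFrame({
    "channel": ["online"] * 4 + ["offline"] * 4,
    "price":   [12.9, 13.4, 12.5, 13.1, 14.9, 15.2, 14.5, 15.0],
    "change":  [0.10, 0.15, 0.05, 0.10, 0.50, 1.00, 0.75, 0.50],  # size of last price adjustment
})

for channel, g in df.groupby("channel"):
    level = g["price"].mean()                 # price level
    increment = g["change"].abs().mean()      # typical adjustment increment
    dispersion = g["price"].std() / level     # price dispersion (coefficient of variation)
    print(f"{channel}: level={level:.2f}, increment={increment:.2f}, dispersion={dispersion:.3f}")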

11.
Personalization is a strategic tool for product or service differentiation, especially when competition in the market is keen. Many personalization strategies have been developed and realized with this in mind. Little is known, however, about how personalization strategies along different dimensions affect customer retention, and this has resulted in a lack of consensus on how best to design them. To address these issues, this study identifies the dimensions of personalization and investigates the effect of each dimension on customer retention. It does so by implementing actual personalization systems corresponding to two factorial design experiments involving 372 participants. Multiple analysis of covariance reveals the effectiveness of each dimension and the interactions among them. The study consequently proposes the optimal combination of personalization dimensions that leads to customer satisfaction and loyalty.
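If such a covariance analysis were reproduced in Python, one possible sketch uses statsmodels' multivariate testing; the factor, covariate and outcome names, and the synthetic data, are hypothetical placeholders rather than the study's actual design.

import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 372  # matches the reported sample size; everything else below is synthetic
df = pd.DataFrame({
    "content_dim":   rng.integers(0, 2, n),   # hypothetical personalization factor
    "interface_dim": rng.integers(0, 2, n),   # hypothetical personalization factor
    "prior_usage":   rng.normal(size=n),      # hypothetical covariate
})
df["satisfaction"] = 3 + 0.5 * df["content_dim"] + rng.normal(scale=0.5, size=n)
df["loyalty"]      = 3 + 0.4 * df["interface_dim"] + rng.normal(scale=0.5, size=n)

model = MANOVA.from_formula(
    "satisfaction + loyalty ~ C(content_dim) * C(interface_dim) + prior_usage",
    data=df,
)
print(model.mv_test())  # Wilks' lambda etc. for main effects and the interaction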

12.
The design of the database is crucial to the process of designing almost any Information System (IS) and involves two clearly identifiable key concepts: schema and data model, the latter allowing us to define the former. Nevertheless, the term 'model' is commonly applied indistinctly to both; the confusion arises because in Software Engineering (SE), unlike in the formal or empirical sciences, the notion of model has a double meaning of which we are not always aware. If we take our idea of model directly from the empirical sciences, then the schema of a database would actually be a model, whereas the data model would be a set of tools allowing us to define such a schema. The present paper discusses the meaning of 'model' in the area of Software Engineering from a philosophical point of view, an important topic since the resulting confusion directly affects other debates in which 'model' is a key concept. We also suggest that the need for a philosophical discussion of the concept of data model is a further argument in favour of institutionalizing a new area of knowledge, which could be called the Philosophy of Engineering.

13.
This paper analyses two judgments from the European Court of Justice. Both were delivered soon after the new Data Protection Regulation became applicable. The argumentation in the judgments provides timely clarification of certain key concepts in European data protection law. As the analysis shows, the ECJ continues its broad interpretation of the parties responsible for data processing. The judgments are generally compatible with the Court's previous case law, which seeks to fill any potential gaps in the protection of individuals' privacy rights. Hence, the aims of data protection rules are predominantly interpreted by the Court within a fundamental rights framework. Even though the cases have certain significant differences, the conclusions to be drawn are similar: the argumentation of the Court is focused on data protection throughout. Moreover, the Court emphasises strong protection of personal data in all kinds of situations, even those where other rights could also be relevant.

14.
Learning analytics is the analysis of electronic learning data, which allows teachers, course designers and administrators of virtual learning environments to search for unobserved patterns and underlying information in learning processes. The main aim of learning analytics is to improve learning outcomes and the overall learning process in electronic learning virtual classrooms and computer-supported education. The most basic unit of learning data in virtual learning environments for learning analytics is the interaction, but there is no consensus yet on which interactions are relevant for effective learning. Drawing upon extant literature, this research defines three system-independent classifications of interactions and evaluates the relation of their components with academic performance across two different learning modalities: virtual learning environment (VLE) supported face-to-face (F2F) and online learning. To do so, we performed an empirical study with data from six online and two VLE-supported F2F courses. Data extraction and analysis required the development of an ad hoc tool based on the proposed interaction classifications. The main finding from this research is that, for each classification, there is a relation between some types of interaction and academic performance in online courses, whereas this relation is non-significant in the case of VLE-supported F2F courses. Implications for theory and practice are then discussed.
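A per-modality analysis of this kind could look like the sketch below, correlating interaction counts with grades; the interaction categories, column names and synthetic data are illustrative assumptions, not the paper's classifications or ad hoc tool.

import pandas as pd
from scipy.stats import spearmanr

# Tiny synthetic log: one row per student (counts of three interaction types and final grade)
df = pd.DataFrame({
    "modality":        ["online"] * 5 + ["VLE-supported F2F"] * 5,
    "student_teacher": [12, 30, 8, 25, 18, 5, 7, 6, 9, 4],
    "student_student": [40, 80, 15, 60, 55, 10, 12, 9, 14, 8],
    "student_content": [100, 220, 60, 180, 150, 70, 90, 65, 85, 60],
    "grade":           [6.0, 8.5, 4.5, 8.0, 7.0, 6.5, 7.0, 6.0, 7.5, 6.2],
})

for modality, g in df.groupby("modality"):
    for col in ["student_teacher", "student_student", "student_content"]:
        rho, p = spearmanr(g[col], g["grade"])   # rank correlation per interaction type
        print(f"{modality} | {col}: rho={rho:+.2f}, p={p:.3f}")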

15.
Requirements Engineering - Expert judgement is a common method for software effort estimations in practice today. Estimators are often shown extra obsolete requirements together with the real ones...

16.
We used collocated observations from the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Atmospheric Infrared Sounder (AIRS) to investigate correlations between cloud parameters and atmospheric stability. We focus on low clouds and specifically investigate the cloud parameters cloud cover and cloud optical thickness from MODIS. The selected atmospheric parameters from AIRS are maximum relative humidity (MRH), lower tropospheric stability (LTS), and water vapour gradient (QTS). The correlations were tested for temporal and regional variation on a global scale and over a time frame of 10 years. Cloud cover and MRH show weak correlations and strong variations on both the temporal and spatial scales. However, cloud cover and lower tropospheric stability show a high correlation in areas with low maritime clouds. The correlation is relatively stable, but slightly increased for the years 2009–2012. Correlations between cloud cover and QTS show a similar behaviour, but slightly stronger variations on the spatial and temporal scales, with better correlations in the East Pacific and from 2004 to 2012. The correlations with cloud optical thickness are weaker in all three cases. A more detailed analysis of the Southeast Pacific shows the influence of El Niño Southern Oscillation (ENSO) on most parameters, but a relatively stable behaviour for the connection of cloud fraction and LTS. Based on the analysis, we suggest that relative humidity is an insufficient approach to link atmospheric properties and low cloud cover. However, we find good correlations with respect to LTS and QTS. LTS in particular indicates low temporal fluctuations, even in the case of influence by ENSO.
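For orientation, lower tropospheric stability is commonly defined in the low-cloud literature as a potential-temperature difference between 700 hPa and the surface; the analogous expression below for the water-vapour gradient is my assumption about the layer used, since the abstract does not state it.
\[
\mathrm{LTS} = \theta_{700\,\mathrm{hPa}} - \theta_{\mathrm{sfc}},
\qquad
\mathrm{QTS} \approx q_{700\,\mathrm{hPa}} - q_{\mathrm{sfc}},
\]
where \(\theta\) is potential temperature and \(q\) specific humidity; a larger LTS marks a stronger inversion, which in the study above goes together with higher low-cloud cover.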

17.
Gisin, Renner, Wolf 《Algorithmica》2008,34(4):389-412
After carrying out a protocol for quantum key agreement over a noisy quantum channel, the parties Alice and Bob must process the raw key in order to end up with identical keys about which the adversary has virtually no information. In principle, both classical and quantum protocols can be used for this processing. It is a natural question which type of protocol is more powerful. We show that the limits of tolerable noise are identical for classical and quantum protocols in many cases. More specifically, we prove that a quantum state between two parties is entangled if and only if the classical random variables resulting from optimal measurements provide some mutual classical information between the parties. In addition, we present evidence which strongly suggests that the potentials of classical and of quantum protocols are equal in every situation. An important consequence, in the purely classical regime, of such a correspondence would be the existence of a classical counterpart of so-called bound entanglement, namely 'bound information' that cannot be used for generating a secret key by any protocol. This stands in contrast to what was previously believed.
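One common way to formalise the stated equivalence (my restatement in terms of intrinsic information, which may differ from the paper's exact theorem) is:
\[
\rho_{AB}\ \text{entangled}
\;\Longleftrightarrow\;
I(X;Y\!\downarrow\!Z) \;=\; \inf_{P_{\bar Z \mid Z}} I(X;Y \mid \bar Z) \;>\; 0,
\]
where \(X\), \(Y\) and \(Z\) are the outcomes of suitably chosen measurements by Alice, Bob and the adversary on a purification of \(\rho_{AB}\). In this picture, 'bound information' would be a distribution with positive intrinsic information from which no protocol can distil a secret key, the classical counterpart of bound entanglement.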

18.
Requirements Engineering - A correction to this paper has been published: https://doi.org/10.1007/s00766-021-00354-4

19.
Some of our recent observations suggest that mental rotation may be important for reduction of motion sickness in microgravity as well as in the microgravity simulator. Therefore, we suggest that development of the ability to perform mental rotation may be important for adaptation to many virtual environments. Training virtual environment operators to perform mental rotation may enhance operator performance both by increasing their ability to "locomote in" and manipulate that environment and by reducing motion sickness associated with transitions between virtual and normal environments.

20.
Students in secondary education often struggle to understand basic programming concepts. Despite what is known about the benefits of programming, there is little published evidence showing how high school students can correctly learn basic programming concepts through innovative instructional formats while gaining or enhancing their computational thinking skills. This gap has contributed to students' lack of motivation and interest in Computer Science courses. This case study presents the opinions of twenty-eight (n = 28) high school students who participated voluntarily in a 3D game-like environment created in Second Life. This environment was combined with the 2D programming environment of Scratch4SL for the implementation of programming concepts (i.e. sequence and concurrent programming commands) in a blended instructional format. An instructional framework based on Papert's theory of Constructionism is also proposed to help students better coordinate and manage the learning material in collaborative practice-based learning activities. By conducting mixed-method research before and after several learning tasks, students' participation in a focus group (qualitative data) and their motivation based on their experiences (quantitative data) were measured. Findings indicate that an instructional design framework based on Constructionism for acquiring or strengthening students' social, cognitive, higher-order and computational thinking skills is meaningful. Educational implications and recommendations for future research are also discussed.
