Similar articles
20 similar articles found.
1.
Short-term forecasting of the dynamics of coronavirus disease 2019 (COVID-19) in the period up to its decline following mass vaccination was a task that received much attention but proved difficult to do with high accuracy. However, the availability of standardized forecasts and versioned datasets from this period allows for continued work in this area. Here, we introduce the Gaussian infection state space with time dependence (GISST) forecasting model. We evaluate its performance in one- to four-week-ahead forecasts of COVID-19 cases, hospital admissions and deaths in the state of California made with official reports of COVID-19, Google’s mobility reports and vaccination data available each week. Evaluation of these forecasts with a weighted interval score shows them to consistently outperform a naive baseline forecast and often score closer to or better than a high-performing ensemble forecaster. The GISST model also provides parameter estimates for a compartmental model of COVID-19 dynamics, includes a regression submodel for the transmission rate and allows for parameters to vary over time according to a random walk. GISST provides a novel, balanced combination of computational efficiency, model interpretability and applicability to large multivariate datasets that may prove useful in improving the accuracy of infectious disease forecasts.
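The weighted interval score mentioned in this abstract can be illustrated with a minimal sketch. This follows the commonly used formulation (median plus a set of central prediction intervals, weighted by their nominal levels); the abstract does not state the paper's exact weighting, so the weights below are an assumption, and the numbers in the example are invented for illustration.

```python
def interval_score(y, lower, upper, alpha):
    """Interval score of a central (1 - alpha) prediction interval:
    width plus penalties proportional to how far y falls outside."""
    width = upper - lower
    below = (2.0 / alpha) * max(lower - y, 0.0)
    above = (2.0 / alpha) * max(y - upper, 0.0)
    return width + below + above

def weighted_interval_score(y, median, intervals):
    """intervals: dict mapping alpha -> (lower, upper).
    Conventional weighting: 1/2 on the median term, alpha_k/2 on each interval."""
    K = len(intervals)
    total = 0.5 * abs(y - median)
    for alpha, (lo, hi) in intervals.items():
        total += (alpha / 2.0) * interval_score(y, lo, hi, alpha)
    return total / (K + 0.5)

# Hypothetical forecast: median 100 with 50% and 90% intervals, observed 120.
wis = weighted_interval_score(120, 100, {0.5: (90, 115), 0.1: (70, 140)})
```

Lower scores are better; a forecast is rewarded for narrow intervals that still cover the observation.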

2.
A simple, blended mechanistic-statistical ice motion model is presented. The model requires regular synoptic-scale wind and ice velocity observations as input. It is intended as an ice forecast tool for offshore operators. We anticipate this model will be useful for short-term forecasts from a few hours up to roughly 5 days. Ice velocity is estimated and forecast as the sum of mechanistic components (fully dynamic free drift, slowly varying deep ocean currents, and tidal oscillations) and a statistical component that minimizes errors between the mechanistic components and recent observations. The free-drift model includes the inertial effects of the ocean mixed layer and describes the combined effects of wind forcing and inertial oscillations. Deep ocean currents are estimated as the average of recent mechanistic component model errors. Tidal velocity contributions can be input by filtering recent observations at tidal frequencies or by an independent model. The effects of ice stress divergence and other unmodeled physics are approximated using optimal linear interpolation. The statistical component is estimated from the difference between recent ice motion observations and the mechanistic components of the model. Its mean and variance are then held constant for the forecast period. Using 60 days of CEAREX drift data, we found the following accuracy statistics. Average ice velocity component forecasts were accurate to within 0.002–0.003 m/s for all forecast periods up to 5 days. Nowcasts had a vector standard deviation of about 0.06 m/s, and 5-day forecasts had a vector standard deviation of 0.14 m/s. Concurrent free-drift forecasts had similar average component errors, but nowcasts were accurate to about 0.08 m/s, although after 3 days they agreed with the blended model forecasts, since the optimal estimate approaches zero after this period.

3.
Demand forecasting is a crucial input of any inventory system. The quality of the forecasts should be evaluated not only in terms of forecast accuracy or bias but also with regard to their inventory implications, which include the impact on the total inventory cost, the achieved service levels and the variance of orders and inventory. Forecast selection and combination are two very widely applied forecasting strategies that have repeatedly been shown to increase forecasting performance. However, the inventory performance of these strategies remains unexplored. We empirically examine the effects of forecast selection and combination on inventory when two sources of forecasts are available. We employ a large data-set that contains demands and (statistical and judgmental) forecasts for multiple pharmaceutical stock keeping units. We show that forecast selection and simple combination simultaneously improve forecasting and inventory performance.
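The two strategies compared in this abstract can be sketched in a few lines. This is a minimal illustration on synthetic data, not the paper's pharmaceutical data-set: selection picks whichever source had the lower in-sample error, while simple combination averages the two sources with equal weights.

```python
import numpy as np

rng = np.random.default_rng(0)
demand = rng.poisson(20, size=200).astype(float)
stat_fc = demand + rng.normal(0, 4, 200)   # hypothetical statistical forecast
judg_fc = demand + rng.normal(1, 3, 200)   # hypothetical judgmental forecast (slightly biased)

train, test = slice(0, 100), slice(100, 200)

def mae(f, y):
    """Mean absolute error of forecasts f against actuals y."""
    return float(np.mean(np.abs(f - y)))

# Selection: keep the source with the lower in-sample MAE.
if mae(stat_fc[train], demand[train]) <= mae(judg_fc[train], demand[train]):
    pick = stat_fc
else:
    pick = judg_fc
sel_mae = mae(pick[test], demand[test])

# Simple (equal-weight) combination of the two sources.
comb = 0.5 * (stat_fc + judg_fc)
comb_mae = mae(comb[test], demand[test])
```

In inventory applications these forecasts would then feed an order-up-to or similar replenishment policy, so accuracy gains translate into lower safety stock for the same service level.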

4.
Master production scheduling is complicated by demand uncertainty. The problem arising from uncertain forecasts is scheduling either too few or too many components relative to actual demand. To cope with this problem, schedulers often form judgments about future forecasts and adapt production schedules to reflect beliefs and opinions about forecast errors. The research literature on scheduling has largely ignored formal methods for capturing these informal judgments and injecting them explicitly, rather than informally, into the master scheduling process. The current research demonstrates a method for using subjective assessments of forecast accuracy to improve master scheduling. First, a production loss function is derived using performance data from computer simulations of the production environment. Second, subjective assessments of forecast errors are integrated with the loss function to reveal how forecasts should be adjusted to minimize expected scheduling losses. The procedure enhances intuitive judgmental scheduling; it reveals how managerial beliefs can be formally used to intentionally and optimally bias forecasts. In applying the procedure to 44 simulated MRP configurations, the optimally-biased forecasts are superior to unadjusted forecasts and provide an objective benchmark for judgmental adjustments. The results demonstrate opportunities for enhancing master schedules with subjective assessments of forecasts, compared with scheduling without them.
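The core idea of combining a loss function with a subjective error distribution can be sketched as follows. This is not the paper's loss function (which is derived from production simulations); it assumes, for illustration only, a piecewise-linear loss and a normally distributed subjective belief about forecast errors, in which case the optimal bias is a quantile of that belief (newsvendor logic).

```python
import numpy as np

# Scheduler's subjective belief about forecast error (actual - forecast):
# the forecast is thought to run about 5 units low, with sd 20 (assumed).
rng = np.random.default_rng(1)
errors = rng.normal(5, 20, 100_000)

def expected_loss(bias):
    """Expected scheduling loss after adding `bias` to the forecast,
    with an assumed asymmetric cost: shortage 4/unit, excess 1/unit."""
    net = errors - bias
    return np.mean(np.where(net > 0, 4 * net, -net))

# Search a grid of candidate adjustments for the minimizer.
grid = np.linspace(-40, 60, 401)
best_bias = grid[np.argmin([expected_loss(b) for b in grid])]
# Newsvendor logic predicts the 4/(4+1) = 0.8 quantile of the belief,
# i.e. roughly 5 + 0.8416 * 20, so an upward bias of around 22 units.
```

With shortage four times as costly as excess, the scheduler should deliberately over-forecast, and the size of the bias follows directly from the stated beliefs.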

5.
Starting with the linear inventory control model which Sargent and Bradley called the variable S inventory model, and incorporating recent Box and Jenkins methodology for modeling stochastic demand and forecasters, we derive additional conditions which will ensure a steady state distribution for the inventory level. It is shown that the variable S model is, because of its feedback of current and past deviations of the inventory level from the desired inventory level, robust to a variety of estimation and specification errors in the forecaster.

6.
Seasonal series can be forecast by using Brown's adaptive forecasting method. The main advantage of this approach is that explicit expressions for the variance of the forecast error are derived without the use of numerical matrix inversion. This allows the forecaster to devise means to obtain a signal warning of possible failure of the forecasting model. Explicit expressions are also given for the smoothing vector, and the coefficients of the model can be updated without the use of either matrix inversion or multiplication, thus making the computation of the forecasts simple.
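The idea of a warning signal built from the forecast-error statistics can be sketched with a much simpler relative of Brown's method. The sketch below uses plain exponential smoothing plus a Trigg-style tracking signal (smoothed error over smoothed absolute error); this is an illustrative stand-in, not Brown's seasonal model or his variance expressions, and the parameter values are assumptions.

```python
def ses_with_tracking(series, alpha=0.2, beta=0.1, threshold=0.5):
    """Simple exponential smoothing with a tracking signal.
    signal = smoothed error / smoothed |error|; a value near +/-1 warns
    that the model is persistently over- or under-forecasting."""
    level = series[0]
    e_bar, a_bar = 0.0, 1e-9
    warnings = []
    for t, y in enumerate(series[1:], start=1):
        err = y - level                      # one-step-ahead forecast error
        e_bar = beta * err + (1 - beta) * e_bar
        a_bar = beta * abs(err) + (1 - beta) * a_bar
        if abs(e_bar / a_bar) > threshold:
            warnings.append(t)
        level += alpha * err                 # update the smoothed level
    return level, warnings

# A level shift at t = 30 should trip the signal immediately afterwards.
data = [10.0] * 30 + [20.0] * 30
_, alerts = ses_with_tracking(data)
```

All updates are scalar, mirroring the abstract's point that the coefficients can be maintained without matrix inversion or multiplication.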

7.
We study the material requirements planning (MRP) system nervousness problem from a dynamic, stochastic and economic perspective in a two-echelon supply chain under first-order auto-regressive demand. MRP nervousness is an effect where the future order forecasts, given to suppliers so that they may plan production and organise their affairs, exhibit extreme period-to-period variability. We develop a measure of nervousness that weights future forecast errors geometrically over time. Near-term forecast errors are weighted higher than distant forecast errors. Focusing on replenishment policies for high volume items, we investigate two methods of generating order call-offs and two methods of creating order forecasts. For order call-offs, we consider the traditional order-up-to (OUT) policy and the proportional OUT policy (POUT). For order forecasts, we study both minimum mean square error (MMSE) forecasts of the demand process and MMSE forecasts coupled with a procedure that accounts for the known future influence of the POUT policy. We show that when retailers use the POUT policy and account for its predictable future behaviour, they can reduce the bullwhip effect, supply chain inventory costs and the manufacturer’s MRP nervousness.
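The OUT/POUT contrast can be illustrated with a small simulation. This is a minimal sketch under assumed parameters (AR(1) demand with phi = 0.7, zero lead time, MMSE one-step forecasts), not the paper's two-echelon model: the POUT policy recovers only a fraction 1/Ti of the inventory deficit each period, which smooths orders, while Ti = 1 reduces to the classical OUT policy.

```python
import numpy as np

def simulate_bullwhip(Ti, n=20_000, phi=0.7, sigma=10.0, seed=2):
    """Order-up-to (Ti = 1) vs proportional OUT (Ti > 1) under AR(1) demand.
    Returns the bullwhip ratio Var(orders) / Var(demand)."""
    rng = np.random.default_rng(seed)
    mu, d, inv, target = 100.0, 100.0, 0.0, 0.0
    orders = np.empty(n)
    demands = np.empty(n)
    for t in range(n):
        d = mu + phi * (d - mu) + rng.normal(0, sigma)
        inv -= d                              # demand depletes stock
        forecast = mu + phi * (d - mu)        # MMSE one-step-ahead forecast
        # Order the forecast plus a fraction 1/Ti of the inventory deficit.
        orders[t] = forecast + (target - inv) / Ti
        inv += orders[t]                      # replenishment arrives
        demands[t] = d
    return float(np.var(orders) / np.var(demands))

out_bullwhip = simulate_bullwhip(Ti=1)    # classical OUT
pout_bullwhip = simulate_bullwhip(Ti=4)   # proportional OUT, slower recovery
```

A ratio above 1 means the policy amplifies demand variability up the chain; the proportional feedback trades a little inventory variance for substantially calmer orders.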

8.
As enterprise resource planning (ERP) becomes the dominant management software in manufacturing and distribution systems in various industries, some problems associated with its origin, material requirements planning (MRP), still need to be resolved. We examine the effect of forecasting errors, one of the common operational problems in any business operation, in the context of an ERP-controlled manufacturing system. We consider a mitigating remedy, the use of lot-sizing rules, to cope with the consequences of forecasting inaccuracy without resorting to costly inventory-oriented buffers. An ERP-controlled manufacturing system is simulated to see how these lot-sizing rules mitigate the forecast errors and subsequently generate acceptable system performance. The simulation results should help ease ERP users’ fear of committing another fatal error in demand forecasts, instead encouraging them to consider proper lot-sizing rules to cope with forecast errors.

9.
方俊涛, 何桢, 宋琳曦, 张阳. 《工业工程》 (Industrial Engineering), 2012, 15(3): 98–103
Response surface methodology is a highly effective approach to process improvement and optimization. In building a traditional response surface model, the random errors are usually assumed to be independent and normally distributed with equal variance. In actual production, however, the error variances are not identical and outliers may be present among the observations, so robust estimation methods are needed to suppress the influence of outliers on model estimation. To reduce the effect of outliers on the optimum of the response surface model, and with attention to outliers that may occur at different design points of a central composite design, three robust M-regression estimators — the Huber, Tukey and Welsch estimators — are compared theoretically. The results show that the Welsch and Tukey estimators effectively reduce the influence of outliers on the response surface optimum and weaken their interference with the central composite design. A case study from the chemical industry computes the optimum of the response surface model with and without outliers at different positions of the central composite design; the comparison shows that when an outlier deviates strongly from the response mean (by 10 standard deviations), robust M-estimation — the Welsch and Tukey estimators in particular — markedly improves the robustness of response surface modeling.
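The three M-estimators compared in this study (Huber, Tukey, Welsch) are typically computed by iteratively reweighted least squares. The sketch below is a minimal numpy illustration on an invented one-factor quadratic response with a single gross outlier, not the paper's central composite design; the tuning constants (1.345, 4.685, 2.985) are the conventional 95%-efficiency values.

```python
import numpy as np

def irls(X, y, weight_fn, iters=50):
    """Iteratively reweighted least squares for robust M-regression."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        w = weight_fn(r / max(s, 1e-12))
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

# Standard weight functions with conventional tuning constants.
huber  = lambda u, c=1.345: np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
tukey  = lambda u, c=4.685: np.where(np.abs(u) < c, (1 - (u / c) ** 2) ** 2, 0.0)
welsch = lambda u, c=2.985: np.exp(-((u / c) ** 2))

# Quadratic response y = 5 + 2x - x^2 with one gross outlier at the centre.
x = np.linspace(-2, 2, 21)
X = np.column_stack([np.ones_like(x), x, x ** 2])
y_out = 5 + 2 * x - x ** 2
y_out = y_out.copy()
y_out[10] += 30.0

b_ols = np.linalg.lstsq(X, y_out, rcond=None)[0]   # pulled toward the outlier
b_wel = irls(X, y_out, welsch)                     # outlier weight decays to ~0
```

The Welsch and Tukey weights vanish (or decay exponentially) for large standardized residuals, which is why they suppress gross outliers more completely than the Huber weight, matching the study's conclusion.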

10.
The performance of an X‐bar chart is usually studied under the assumption that the process standard deviation is well estimated and does not change. This is, of course, not always the case in practice. We find that X‐bar charts are not robust against errors in estimating the process standard deviation or changing standard deviation. In this paper we discuss the use of a t chart and an exponentially weighted moving average (EWMA) t chart to monitor the process mean. We determine the optimal control limits for the EWMA t chart and show that this chart has the desired robustness property. Copyright © 2009 John Wiley & Sons, Ltd.
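The EWMA t chart can be sketched briefly. This is a minimal illustration, not the paper's optimal design: each subgroup contributes a t statistic (so no separate sigma estimate is needed), the t statistics are exponentially smoothed, and the control limit below uses the asymptotic EWMA variance with the t-distribution's variance (n-1)/(n-3); the constants lam = 0.1 and L = 2.7 are assumed, not the paper's optimized values.

```python
import numpy as np

def ewma_t_chart(subgroups, mu0, lam=0.1, L=2.7):
    """EWMA of subgroup t statistics; self-studentizing, so errors in a
    prior sigma estimate cannot distort the chart."""
    n = subgroups.shape[1]
    t_var = (n - 1) / (n - 3)                    # variance of t with n-1 df
    limit = L * np.sqrt(lam / (2 - lam) * t_var)
    z, signals = 0.0, []
    for i, g in enumerate(subgroups):
        t = (g.mean() - mu0) / (g.std(ddof=1) / np.sqrt(n))
        z = lam * t + (1 - lam) * z
        if abs(z) > limit:
            signals.append(i)
    return signals

rng = np.random.default_rng(3)
in_control = rng.normal(0, 1, size=(50, 5))
shifted = rng.normal(1.0, 1, size=(50, 5))       # 1-sigma shift in the mean
alerts = ewma_t_chart(np.vstack([in_control, shifted]), mu0=0.0)
```

Because each subgroup is studentized by its own standard deviation, a drifting or badly estimated process sigma rescales the statistic automatically, which is the robustness property the abstract describes.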

11.
A number of multiple-parameter adaptive exponential smoothing models have been proposed and demonstrated over the last two decades for short range forecasting. There have been conflicting results on their performance and no systematic study has been conducted to compare them in a controlled environment. The work reported here fills this void by testing a set of well known multiple-parameter adaptive procedures against the three-parameter Winters' model. First, sets of synthetic time series with known characteristics are used to compare performance for the different approaches using the standard deviation of forecast errors. Second, the information gathered at this point is used to predict the technique's performance on six empirical time series. And third, general guidelines are presented for model selection.

12.
We examine the impact of increasing product variety on two measures of a firm’s forecast performance – forecast accuracy and forecast bias – and test whether shared information mitigates this impact. With companies under pressure to expand product variety yet maintain good forecast accuracy, understanding this relationship is critical. We use data gathered before and after a vertical integration event, after which some information that previously had to be forecast became directly available. We show that increasing product variety, and thus the number of forecasts, indeed deteriorates both forecast accuracy and bias. The vertical integration event, by enabling information sharing, results in improved forecast performance. Further, different product variety attributes (e.g. brand variety and pack variety) are found to have differing impacts: increasing brand variety has a significantly greater impact on forecast accuracy than pack variety. Using the vertical integration event as a natural experiment, we document that expanding product variety negatively impacts forecasts and that information sharing can help mitigate the impact. This is an important contribution as it tests the value of ‘truthful’ information given the elimination of the firm boundary post merger. Further, we show that a firm’s decision to expand product variety should account for product variety attributes, given their differential impact.

13.
The paper investigates the impact of forecasting errors and information sharing on the performance of a supply chain. It also examines the impact of forecasting errors on the value of information sharing between retailers and a supplier. Analyses of the simulation outputs show that while information sharing can bring tremendous benefits to the supplier and the entire supply chain, it hurts the retailers under most conditions. Demand pattern and forecasting error distributions faced by the retailers significantly influence the magnitudes of the cost savings as a result of information sharing. The expected bias in forecast errors has a much more significant impact on supply chain performance and the value of information sharing than the standard deviation of forecasting errors and its pattern of deterioration over time. A slight positive bias in the retailer's forecast can actually increase the benefit of sharing information for the supplier and the entire supply chain. However, it can also increase the cost for retailers. The demand pattern faced by retailers also significantly influences the impact of forecasting accuracy on the value of the information sharing. These findings will motivate companies to share information, and will help to design incentive schemes to encourage information-sharing and justify investment in information-sharing projects. The findings can also be used to minimize the negative impact of forecasting errors on supply chain performance.

14.
Forecasts are plentiful. Accurate long-range forecasts are rare. But some forecasts are more accurate than others, and a few are very accurate. In this paper, we first explore the case of Moore's Law, a forecast that has proven quite accurate for almost 40 years. We illustrate how expectations that Moore's Law will continue to be accurate actually make it accurate. Based on the insights of this case, we hypothesize that two factors facilitate such self-fulfilling forecasts and so make accuracy more possible. We test these hypotheses on a set of 3142 forecasts about US manufacturing industries during the 1970s. We find that high industry concentration and high control over the predicted variable tend to increase the accuracy of forecasts.

15.
A critical evaluation of the statistics of the fatigue strength distribution as determined by the staircase (or up-and-down) method is presented. The effects of test parameters (namely, step size and sample size) were analyzed using numerical simulation to determine the accuracy of fatigue strength standard deviation calculations using traditional staircase statistics, resulting in a quantification of standard deviation bias as a function of step size and sample size. A non-linear correction was formulated to mitigate this standard deviation bias inherent in small-sample tests. In addition, the simulation was used to investigate the effectiveness of a bootstrapping algorithm on standard deviation estimates. The bootstrapping algorithm was found to significantly reduce the potential of large standard deviation errors in small-sample tests. Together, the use of the non-linear correction factor and the bootstrapping algorithm may allow an improved method to estimate the statistics of a material’s fatigue strength distribution using a small-sample staircase test strategy.
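The kind of simulation described here can be sketched compactly. This is an illustrative reconstruction, not the paper's code: it runs the staircase procedure (step down after a failure, up after a survival) on a normally distributed fatigue strength and applies the classical Dixon-Mood standard deviation estimator; the sample size, step size and true parameters are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def staircase_std(n_specimens=15, mu=500.0, sigma=20.0, step=20.0):
    """One small-sample staircase test plus the Dixon-Mood sigma estimate."""
    level, results = mu, []
    for _ in range(n_specimens):
        strength = rng.normal(mu, sigma)       # specimen's true strength
        fail = strength < level
        results.append((level, fail))
        level += -step if fail else step       # down after failure, up after survival
    fails = sum(f for _, f in results)
    use_fail = fails <= n_specimens - fails    # base stats on the less frequent event
    events = [lvl for lvl, f in results if f == use_fail]
    idx = np.rint((np.array(events) - min(events)) / step)
    N, A, B = len(idx), idx.sum(), (idx ** 2).sum()
    D = (N * B - A * A) / N ** 2
    return 1.62 * step * (D + 0.029)           # classical Dixon-Mood estimator

# Repeat many small-sample tests to expose the estimator's bias.
estimates = np.array([staircase_std() for _ in range(2000)])
mean_estimate = estimates.mean()               # true sigma is 20
```

Averaging the estimator over many replications, as the paper's simulations do, is what reveals the systematic bias that motivates their non-linear correction.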

16.
A combined method is developed for determining macroconcentrations of platinum in chloroplatinic acid and in an industrial concentrate obtained during the processing of some types of electronic engineering scrap. It is based on gravimetric precipitation of platinum by ammonium chloride followed by determination of the residual platinum concentration in the filtrate by ICP-AES. The errors at each stage of analysis, as well as the total error, are estimated. It is shown that the accuracy of the combined method for determining macroconcentrations of platinum is not inferior to that of classical gravimetric analysis (the relative standard deviation does not exceed 0.15% for platinum concentrations of 15–40 wt %). The combined gravimetry–ICP-AES method makes it possible to decrease labor expenditures and the time of analysis as compared to the classical one.

17.
We consider the problem of forecasting multi-class job flow times in a resource-sharing environment. We assume the deviation of flow times in each class from the class nominal value follows an exponential distribution with its parameter following a gamma distribution. A large simulation experiment is conducted to assess and compare the performance of the Bayes and empirical Bayes forecasting methods under differing model assumptions. Simulation results show that non-parametric empirical Bayes methods are more efficient and robust relative to parametric empirical Bayes methods.

18.
A new strategy for identifying proteins by MALDI-TOF-MS peptide mapping is reported. In contrast to current approaches, the strategy does not rely on a good relative or absolute mass accuracy as the criterion that discriminates false positive results. The protein sequence database is first searched for all proteins that match a minimum of five of the submitted masses within the maximum expected relative errors when the default or externally determined calibration constants are used, for instance, +/-500 ppm. Typically, this search retrieves many thousand candidate sequences. Assuming initially that each of these is the correct protein, the relative errors of the matching peptide masses are calculated for each candidate sequence. Linear regression analysis is then performed on the calculated relative errors as a function of m/z for each candidate sequence, and the standard deviation of the regression is used to distinguish the correct sequence among the candidates. We show that this parameter is independent of whether the mass spectrometric data were internally or externally calibrated. The result is a search engine that renders internal spectrum calibration unnecessary and adapts to the quality of the raw data without user interference. This is made possible by a dynamic scoring algorithm, which takes into account the number of matching peptide masses, the percentage of the protein's sequence covered by these peptides and, as a new parameter, the determined standard deviation. The lower the standard deviation, the fewer cleavage peptides are required for identification and vice versa. Performance of the new strategy is demonstrated and discussed. All necessary computing has been implemented in a computer program, free access to which is provided on the Internet.
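The discriminating statistic described here is easy to sketch. This is an illustration on invented numbers, not the published program: for a correct sequence, the relative mass errors lie close to a line in m/z (a miscalibration trend), so the standard deviation about the regression is small, whereas random matches inside the wide +/-500 ppm window scatter widely.

```python
import numpy as np

def error_regression_score(mz, ppm_error):
    """Fit relative mass error as a linear function of m/z and return the
    standard deviation of the residuals (2 fitted parameters removed)."""
    slope, intercept = np.polyfit(mz, ppm_error, 1)
    residuals = ppm_error - (slope * mz + intercept)
    return float(np.std(residuals, ddof=2))

rng = np.random.default_rng(5)
mz = rng.uniform(800, 3000, 12)

# Correct candidate: errors follow an (unknown) calibration drift + noise.
true_match = 120.0 + 0.05 * mz + rng.normal(0, 5, mz.size)
# False positive: random masses that merely fall inside the search window.
false_match = rng.uniform(-500, 500, mz.size)

s_true = error_regression_score(mz, true_match)
s_false = error_regression_score(mz, false_match)
```

Because the regression absorbs whatever linear calibration error is present, the score behaves the same for internally and externally calibrated spectra, which is the abstract's key claim.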

19.
In retail, distribution centres can forecast the stores’ future replenishment orders by computing planned orders for each stock-keeping-unit. Planned orders are obtained by simulating the future replenishment ordering of each stock-keeping-unit based on information about the delivery schedules, the inventory levels, the order policies and the point-estimate forecasts of consumer demand. Point-estimate forecasts are commonly used because automated store ordering systems do not provide information on the demand distribution. However, it is not clear how accurate the resulting planned orders are in the case of products with low and intermittent demand, which make up large parts of the assortment in retail. This paper examines the added value of modelling consumer demand with distributions, when computing the planned orders of products with low and intermittent demand. We use real sales data to estimate two versions of a planned order model: One that uses point-estimates and another that uses distributions to model the consumer demand. We compare the forecasting accuracies of the two models and apply them to two example applications. Our results show that using distributions instead of point-estimates results in a significant improvement in the accuracy of replenishment order forecasts and offers potential for substantial cost savings.

20.
This paper considers the problem of obtaining robust control charts for detecting changes in the mean µ and standard deviation σ of process observations that have a continuous distribution. The standard control charts for monitoring µ and σ are based on the assumption that the process distribution is normal. However, the process distribution may not be normal in many situations, and using these control charts can lead to very misleading conclusions. Although some control charts for µ can be tuned to be robust to non‐normal distributions, the most critical problem with non‐robustness is with the control chart for σ. This paper investigates the performance of two CUSUM chart combinations that can be made robust to non‐normality. One combination consists of the standard CUSUM chart for µ and a CUSUM chart of absolute deviations from target for σ, where these CUSUM charts are tuned to detect relatively small parameter shifts. The other combination is based on using winsorized observations in the standard CUSUM chart for µ and a CUSUM chart of squared deviations from target for σ. Guidance is given for selecting the design parameters and control limits of these robust CUSUM chart combinations. When the observations are actually normal, using one of these robust CUSUM chart combinations will result in some reduction in the ability to detect moderate and large changes in µ and σ, compared with using a CUSUM chart combination that is designed specifically for the normal distribution. Copyright © 2009 John Wiley & Sons, Ltd.
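The winsorized-CUSUM idea for the mean can be sketched briefly. This is a minimal illustration, not the paper's design: observations are clipped to a band around the target before entering a standard two-sided CUSUM, so heavy-tailed in-control data cannot inflate the sums; the reference value k, limit h, winsorizing constant and t(3) demand for the example are all assumptions.

```python
import numpy as np

def winsorize(x, mu0, c):
    """Clip observations to [mu0 - c, mu0 + c] so heavy tails cannot
    dominate the mean CUSUM."""
    return np.clip(x, mu0 - c, mu0 + c)

def cusum_mean(x, mu0, k, h):
    """Standard two-sided CUSUM for the mean; returns the index of the
    first signal, or None if neither sum ever exceeds the limit h."""
    cp = cm = 0.0
    for t, v in enumerate(x):
        cp = max(0.0, cp + (v - mu0) - k)
        cm = max(0.0, cm - (v - mu0) - k)
        if cp > h or cm > h:
            return t
    return None

rng = np.random.default_rng(6)
# Heavy-tailed in-control data (t with 3 df), then a shift in the mean.
x = np.concatenate([rng.standard_t(3, 300), rng.standard_t(3, 300) + 1.5])

plain = cusum_mean(x, mu0=0.0, k=0.25, h=8.0)
robust = cusum_mean(winsorize(x, 0.0, 3.0), mu0=0.0, k=0.25, h=8.0)
```

Winsorizing trades a little sensitivity to large shifts (as the abstract notes for the normal case) for a far more stable in-control false-alarm rate under heavy tails.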
