
Top-down or bottom-up forecasting?

Peter WankeI,* (corresponding author); Eduardo SalibyII

ICentre for Logistics Studies, The COPPEAD Graduate School of Business, Federal University of Rio de Janeiro (UFRJ), Rio de Janeiro – RJ, peter@coppead.ufrj.br

IIThe COPPEAD Graduate School of Business, Federal University of Rio de Janeiro (UFRJ), Rio de Janeiro – RJ, saliby@coppead.ufrj.br

ABSTRACT

The operations literature remains inconclusive as to the most appropriate sales forecasting approach (Top-Down or Bottom-Up) for the determination of safety inventory levels. This paper presents analytical results for the variance of the sales forecasting errors during the lead-time under both approaches. The forecasting method used was Simple Exponential Smoothing, and the results led to the identification of two complementary impacts upon the forecasting error variance, and consequently upon safety inventory levels: the Portfolio Effect and the Anchoring Effect. The former depends upon the correlation coefficient of demand between two individual items; the latter depends upon the smoothing constant and upon the share of the individual item in total sales. It is also analysed under which conditions these variables would favour one forecasting approach over the other.

Keywords: forecasting approach; exponential smoothing; safety inventory levels.

RESUMO

A literatura de operações permanece sem concluir sobre a abordagem mais adequada de previsão de vendas (Top-Down ou Bottom-Up) para o dimensionamento de estoques de segurança. Neste manuscrito são apresentados os resultados analíticos para a variância dos erros de previsão no tempo de resposta com amortecimento exponencial nessas duas abordagens. Os resultados apontam dois impactos complementares na variância do erro de previsão e, consequentemente, nos níveis de estoque de segurança: Efeito Portfólio e Efeito Ancoragem. O primeiro depende do coeficiente de correlação da demanda entre os produtos e o segundo, da constante de amortecimento e da participação das vendas do produto nas vendas totais. É analisado sob quais condições essas variáveis favoreceriam uma abordagem de previsão em detrimento da outra.

Palavras-chave: abordagem de previsão; amortecimento exponencial; níveis de estoque de segurança.

1. Introduction

The objective of this paper is to analyse the behaviour of the variance of the sales forecasting error under the Top-Down and Bottom-Up approaches, in order to identify under which conditions one approach would be preferred over the other in terms of lower safety inventory levels.

In previous studies, analyses were carried out using the Simple Exponential Smoothing method on actual sales data (see, for example, Kahn, 1998 and Lapide, 1998). They led to different conclusions about the adequacy of these approaches for different levels of the correlation coefficient, the sales variance and the share of a given product in total or aggregate sales. In this research, the expressions for the variance of the forecasting errors during the lead-time, assumed to be discrete, were analytically derived, and the two main effects that may favour the Top-Down approach over the Bottom-Up were identified: the Portfolio Effect and the Anchoring Effect. Unlike previous research, sales forecasts were considered frozen (held at the same value) during the lead-time.

The results indicate that a product with a small share of total sales that is negatively correlated with the aggregate sales of the remaining products (or individual items) tends to present lower forecasting error variance under the Top-Down approach, because part of its variance is offset by the aggregate variance of the remaining items. Even for a higher share of total sales and a higher correlation coefficient, the Top-Down approach may outperform the Bottom-Up if the product sales variance is sufficiently greater than the variance of the aggregate sales of the remaining products, given that a less than proportional portion of the total variance is added to the product variance.

The remainder of the paper is structured as follows: sections 2 and 3 review the Top-Down and Bottom-Up approaches and the Simple Exponential Smoothing concepts, respectively. In section 4, the expressions for the variance of the sales forecasting errors during the lead-time are analytically derived for these two approaches, and the impacts of the forecasting errors on safety inventories are discussed. Readers should recall that sales forecasts were considered frozen during the lead-time. Finally, in section 5, the adequacy of these approaches to different classes of products or items is discussed.

2. Top-Down and Bottom-Up Approaches

There is broad consensus amongst authors about the conceptualisation and operationalisation of the Top-Down (TD) and Bottom-Up (BU) sales forecasting approaches. For example, according to Lapide (1998), under the TD approach, sales forecasting is done first by aggregating all individual items, and then by disaggregating the aggregate forecast into individual items again, generally based on the historical percentage of each item within the total group. In this sense, Schwarzkopf et al. (1988) point out that in the TD approach the aggregate total is forecasted first, and the subsequent disaggregation is based on the historical proportions of each individual item. As regards the BU approach, each individual item is forecasted separately, and the forecasts are then summed up in case an aggregate forecast for the group is deemed necessary (Lapide, 1998). In other words, under the BU approach, the forecaster first prepares the forecasts for each individual item, aggregating them thereafter to the level of interest of the analysis (Jain, 1995).

Several authors have tried to relate the adequacy of the TD and BU approaches to different characteristics of the sales historical data, such as: the correlation coefficient between the sales of the individual item being studied and the aggregate sales of the remaining items ( r ), the share or proportion of the item being studied in the aggregate total sales ( f ), and the ratio between the sales variance of the individual item being studied and that of the aggregate sales of the remaining items ( k² ). An implicit premise in some of the researched papers is related to the Portfolio Effect, a concept initially defined by Zinn et al. (1989) to evaluate the impact of inventory centralization upon the total sales variance.

According to the Portfolio Effect, inventory centralization minimizes the total variance whenever the correlation coefficient of sales between the markets is –1 and the ratio between the sales variances of the markets is 1. Generally speaking, the rationale behind the variance reduction, according to the authors, is the compensation of sales fluctuations between two markets: when the sales in one market go up, the sales in the other market go down by the same amount.
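The identity behind this reasoning is V(A + B) = V(A) + V(B) + 2·rAB·sA·sB, which can be checked directly (a sketch with made-up variances):

```python
import math

def total_variance(var_a, var_b, rho):
    """V(A + B) for two random variables with correlation rho."""
    return var_a + var_b + 2.0 * rho * math.sqrt(var_a * var_b)

# Perfectly negative correlation and equal variances: the fluctuations
# of the two markets cancel and the total variance vanishes.
v_offsetting = total_variance(25.0, 25.0, -1.0)   # 0.0

# Independent markets: the variances simply add.
v_independent = total_variance(25.0, 25.0, 0.0)   # 50.0
```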

One example is Kahn's (1998) paper, according to which, in the TD approach, the peaks and valleys inherent to each item are cancelled out by the aggregation. A negative correlation among the individual items would reduce the aggregate sales variance. Also corroborating the main conclusions of the Portfolio Effect, Schwarzkopf et al. (1988) point out that estimates based on aggregate data are more precise than those based on individual forecasts when the individual items present independent sales patterns (null correlation).

However, Lapide (1998) suggests, as a rule of thumb, that the TD approach makes sense if and only if all the individual items' sales are growing, decreasing or remaining stable, thus characterizing a positive correlation amongst the sales of different items. The author goes on to assert that a family of products frequently comprises items that potentially cannibalise each other, as in the case of a family with new and old products. For those items, the sales patterns are very different, given that some items grow to the detriment of the others (negative correlation), which would render the BU approach preferable.

Gordon et al. (1997) and Gelly (1999) discuss the adequacy of the TD and BU approaches for other characteristics of the sales historical data. More specifically, these authors studied more than 15,000 aggregate and disaggregate historical data series, generating forecasts with Triple Exponential Smoothing. The BU approach yielded more precise forecasts in 75% of the series, and greater accuracy was verified for individual items with strong positive correlation and whenever they represented a large proportion of the aggregate total sales. On the other hand, when the data were negatively correlated, the TD approach proved more accurate regardless of the item's share of the aggregate total sales.

Finally, in the case study presented by Gelly (1999), the TD approach revealed itself to be more adequate for individual items that present a more predictable sales pattern over time, for example, a small sales coefficient of variation ( CV ). That small CV could result from a large share of the individual item in the aggregate total sales. It could also result from a small ratio between the sales variance of the individual item and the variance of the aggregate sales of the remaining items. The characteristics that favour the TD and BU approaches are summarized in Table 1.

3. Simple Exponential Smoothing

According to Gijbels et al. (1999), Simple Exponential Smoothing (SES) is the most commonly used model in sales forecasting. Its main advantages are related to the fact that it is a non-parametric model based on a simple algebraic formula that enables quick updating of the local level estimate of the sales data.

In the last twenty years, several studies were carried out to better understand and describe the SES and its extensions from a statistical perspective. For example, Chatfield et al. (2001) compare a variety of potential Exponential Smoothing models derived from autoregressive moving averages, structural models and non-linear dynamical spaces, and explain why the SES and its extensions are robust even despite changes in the variance of the historical data. Blackburn et al. (1995) show that the SES may introduce spurious autocorrelations in series from which the trend component has been removed, and that these autocorrelations depend upon the average age of the data and upon the smoothing constant value. Finally, Gijbels et al. (1999) compare the SES with Kernel Regression, enabling a better understanding of the equivalence and relative adequacy of both approaches.

The SES and its extensions were developed in the late 1950s by Brown, Winters, and Holt, amongst other authors (Chatfield et al., 2001). Among its main premises and limitations, it is worth highlighting that the SES does not consider growth or decline trends, seasonal fluctuations or cyclical variations. For example, the sales forecast for a random variable X with SES is as follows:

Ft = α·Xt–1 + (1 – α)·Ft–1

where Ft is the forecast of X for period t, Xt–1 is the actual sales of X in period t–1, Ft–1 is the forecast of X for period t–1, and α is the smoothing constant, which ranges between 0 and 1.
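As a runnable sketch of the recursion (initialising the forecast with the first observation, a common convention that the paper does not prescribe):

```python
def ses_forecast(history, alpha):
    """Simple Exponential Smoothing: F_t = alpha*X_{t-1} + (1-alpha)*F_{t-1}.

    Returns the forecast for the period following the last observation.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("the smoothing constant must lie between 0 and 1")
    forecast = history[0]                  # seed with the first observation
    for x in history[1:]:
        forecast = alpha * x + (1.0 - alpha) * forecast
    return forecast

next_forecast = ses_forecast([100, 120, 90, 110], alpha=0.3)
```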

4. Forecasting Errors During Lead-Time

This research is primarily motivated by the sampling errors that are intrinsic to corporate sales and inventory planning systems. When all the historical data are available, estimates for the mean and standard deviation may be determined at the desired significance level. Inferences about the probability distribution of the sales random variable may also be made. However, many corporate systems do not use or do not have all the historical data, which generates uncertainty about whether the forecasting errors result from random noise or from a change in one or more parameter values of the underlying demand model. Unfortunately, it is not possible to clarify these questions about the actual sales level a priori; they can only be resolved a posteriori (Silver et al., 2002).

Many forecasting models do not use all available historical data. Moving averages use only the last n data points, and the SES assigns declining weights to past data (Silver & Peterson, 1985). In these circumstances, according to the authors, the best estimate of average sales is simply the sales forecast for the next period. As regards estimating the sales variance, the variance of the forecasting error should be used.

According to Silver & Peterson (1985) and Greene (1997), the forecasting error variance and the sales variance are not equal. According to these authors, the forecasting error variance is the more adequate measure when forecasts are used to estimate sales. For instance, the safety inventory level must be determined to protect against variations in the sales forecasting errors. In general, the forecasting error variance tends to be greater than the sales variance (Silver et al., 2002), due to the additional sampling error introduced by forecasting models that use only part of the available historical data.
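A quick simulation (our own illustration, not from the paper) makes the point: for a constant-level process forecasted with SES, the steady-state one-step error variance is V(X) + V(F) = (1 + α/(2 – α))·V(X), strictly larger than the sales variance.

```python
import random
import statistics

random.seed(42)
alpha = 0.3
sales = [random.gauss(100.0, 10.0) for _ in range(20000)]  # i.i.d. demand

errors = []
f = sales[0]                                   # SES level, seeded on first point
for x in sales[1:]:
    errors.append(x - f)                       # one-step-ahead forecast error
    f = alpha * x + (1.0 - alpha) * f          # SES update

ratio = statistics.variance(errors) / statistics.variance(sales)
# Theory predicts ratio close to 1 + alpha/(2 - alpha), i.e. about 1.18 here.
```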

According to Harrison (1967) and Johnston & Harrison (1986), a more complex situation arises when the sales forecast is considered over the replenishment lead-time ( LT ), assumed here to be discrete. In this case, the variance of the forecasting error during the lead-time must be estimated. According to Silver & Peterson (1985), the exact relation between the forecasting error variance and the variance of the forecasting error during the lead-time depends upon complicated relations between the sales pattern, the forecasting review procedures, and the value of n used in the moving average or the value of the smoothing constant used in the SES (see Harrison, 1967). According to the authors, one of the reasons for such complexity is that the recurrence procedure in the SES introduces a certain degree of dependence between the forecast errors at different periods separated by the lead-time. More recently, Snyder et al. (2004) established formulae for calculating the means and variances of sales forecasts during the lead-time for several exponential smoothing models. Like Johnston & Harrison (1986), these authors considered that sales forecasts could vary during the lead-time.

Using frozen sales forecasts during the lead-time is a very common managerial practice, since it tends to be worthless to revise sales forecast values while replenishment orders are still being processed (Greene, 1997). Unless decision-makers can change previously placed orders during the replenishment lead-time, there is little value in revising sales forecasts during the time interval between the placement and the receipt of an order. This research departs from previous studies by considering the sales forecasts frozen during the lead-time. In other words, for the purpose of calculating the mean and variance of forecast errors, sales forecasts are assumed not to change during the lead-time.

Let the following variables be introduced (source: the authors):

At — product A sales at time period t;
Bt — aggregate sales of the remaining products (excluding At) at t;
Tt — total aggregate sales ( At + Bt ) at t;
µA, µB, µT — expected values of At, Bt and Tt (assumed to be constant over t);
sA, sB — standard deviations of At and Bt (assumed to be constant over t);
V(A), V(B) — variances of At and Bt;
rAB — correlation coefficient between A and B, also assumed constant;
sT — standard deviation of Tt;
V(T) — variance of Tt;
k — ratio between the standard deviations, sA/sB;
f — historical share of product A sales, µA/µT;
FA,t — sales forecast of At;
FT,t — sales forecast of Tt;
LT — replenishment lead-time, assumed here to be discrete;
µLT, sLT, V(LT) — expected value, standard deviation and variance of LT;
LTA — aggregated A sales during the lead-time;
LTFA — aggregated (frozen) sales forecast of A during the lead-time;
LTEA — sales forecast error of A during the lead-time, LTEA = LTA – LTFA.

In the remaining text, as a rule, E(·) denotes the expected value and V(·) denotes the variance of a given random variable. Also, whenever a variable is constant over time, the index t is omitted.

4.1 Bottom-Up Approach

In this case, the results are derived for an individual item (A). From the above definitions, it follows that:

(a) aggregated A sales during the lead-time:

LTA = ∑t=1..LT At      (1)

(b) frozen sales forecast of A during the lead-time:

LTFA = LT·FA      (2)

(c) sales forecast error of A during the lead-time:

LTEA = LTA – LTFA      (3)

Substituting (1) and (2) into (3), we have:

LTEA = ∑t=1..LT At – LT·FA      (4)

Taking the expected values in equation (4), we have:

But E(A) = E(FA), as shown in Appendix 1. So:

Supposing that FA and LT are not correlated, we have:

E(LTEA) = µLT·E(A) – µLT·E(FA) = 0

Taking the variance of LTEA in equation (4) and supposing that A and LT are not correlated, just as A and FA , but that LTA and LTFA present correlation given by rLTA·LTFA, we have:

But rLTA·LTFA is equal to:

That is:

Substituting (6) into (5), we have:

But, as indicated in Appendix 2:

V(LTA) = µLT·V(A) + V(LT)·E(A)²      (8)

And, as indicated in Appendix 3:

V(LT·FA) = E(FA)²·V(LT) + E(LT)²·V(FA) + V(FA)·V(LT)      (9)

Thus, substituting (8) and (9) into (7) and observing that E(FA) = E(A), we have:

Finally, as shown in Appendix 4:

V(FA) = [α/(2 – α)]·V(A)      (11)

Substituting (11) into (10), we have:

That is:

V(LTEA) = µLT·V(A) + [α/(2 – α)]·(µLT² + V(LT))·V(A)      (12)

This result is identical to that obtained by Eppen & Martin (1988) for the variance of the total forecast error during the lead-time, considering sales to be essentially a constant process and the use of simple exponential smoothing (see their equations (18) and (19)).

4.2 Top-Down Approach

Under the BU approach the variance of the forecasting error during the lead-time is given by equation (12). Assuming that f is nearly constant, it must be taken into account that the forecast of At under the TD approach depends upon the forecast of Tt, and is given by:

FA,t = f·FT,t      (13)

As such, taking the variance of FA,t in equation (13), we have:

In a similar way to equation (11), and supposing the same smoothing constant:

V(FT) = [α/(2 – α)]·V(T)      (15)

Substituting (15) into (14), we have:

Thus, expanding the term V(T) in (16), where:

V(T) = V(A) + V(B) + 2·rAB·sA·sB      (17)

Substituting (17) into (10) and also considering that k = sA /sB , we have:

That is:

V(LTEA) = µLT·V(A) + [α/(2 – α)]·(µLT² + V(LT))·E(f)²·(k² + 2·rAB·k + 1)·V(B)      (18)

As a validation procedure, spreadsheet Monte-Carlo simulations were performed to confirm the results of equations (6), (8), (9), (12) and (18). Long-range time series (more than 1,000 observations) were randomly generated and the forecasts were summed up during the lead-time, which was also randomly generated. Specifically, A and B sales were generated through correlated normal bivariate distributions and lead-times were supposed to be discrete and uniformly distributed. These distributions were chosen for the sake of simplicity, since the abovementioned results are non-parametric. Both forecasting approaches were applied when generating these data. The average error after 60 replications (randomly generated runs) was about 1% of the theoretical results given by these formulae. It is worth emphasizing that equations (12) and (18) are the key results for the analysis that follows.
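A condensed version of that validation can be reproduced in a few lines (our own code and parameters for the BU branch, not the authors' spreadsheet; the theoretical benchmark V(A)·[µLT + α/(2 – α)·(µLT² + V(LT))] is our independently derived frozen-forecast result):

```python
import random
import statistics

# Monte-Carlo sketch: normal demand for item A, discrete uniform lead-times
# on {1, 2, 3}, SES forecast frozen while each replenishment is outstanding.
random.seed(7)
alpha, mu_a, sd_a = 0.2, 50.0, 8.0

errors = []
level = mu_a                                   # SES level for item A
for _ in range(20000):
    lt = random.randint(1, 3)                  # discrete uniform lead-time
    frozen = level * lt                        # forecast frozen over the lead-time
    lt_sales = 0.0
    for _ in range(lt):
        a = random.gauss(mu_a, sd_a)
        lt_sales += a
        level = alpha * a + (1.0 - alpha) * level   # SES keeps updating
    errors.append(lt_sales - frozen)           # lead-time forecast error

var_empirical = statistics.variance(errors)
# Frozen-forecast BU theory: V(A) * (mu_LT + alpha/(2-alpha)*(mu_LT^2 + V(LT)))
mu_lt, var_lt = 2.0, 2.0 / 3.0
var_theoretical = sd_a**2 * (mu_lt + alpha / (2.0 - alpha) * (mu_lt**2 + var_lt))
```

The empirical variance should land close to the theoretical value, in line with the roughly 1% deviation the authors report.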

Thus, equating (12) and (18) and isolating k on the left-hand side, in order to express it in terms of E(f) and rAB (the remaining terms cancel out in this operation), one gets the critical value of k ( kcritical ), which equates the variances of the forecasting error during the lead-time under the TD and BU approaches:

kcritical = [E(f)²·rAB + E(f)·√(E(f)²·rAB² + 1 – E(f)²)] / (1 – E(f)²)      (19)

Finally, a closer look into equations (12) and (18) shows that if k > kcritical , the variance of the forecasting error during the lead-time is smaller under the TD approach.
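The decision rule can be sketched as follows; the closed form for kcritical below is our reconstruction from the indifference condition V(A) = E(f)²·V(T), which is what remains after the common lead-time and smoothing-constant terms cancel:

```python
import math

def k_critical(f, rho):
    """Critical k = sA/sB at which the TD and BU lead-time error variances
    coincide; f is the share E(f) (0 < f < 1), rho the correlation rAB.
    Positive root of (1 - f^2)*k^2 - 2*f^2*rho*k - f^2 = 0."""
    disc = f * f * rho * rho + 1.0 - f * f
    return (f * f * rho + f * math.sqrt(disc)) / (1.0 - f * f)

def preferred_approach(f, rho, k):
    """TD when k exceeds the critical value, BU otherwise."""
    return "TD" if k > k_critical(f, rho) else "BU"

# A small-share, negatively correlated item favours Top-Down:
choice_small = preferred_approach(f=0.1, rho=-0.8, k=1.0)   # "TD"
# A dominant, positively correlated item favours Bottom-Up unless k is large:
choice_large = preferred_approach(f=0.8, rho=0.9, k=1.0)    # "BU"
```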

4.3 Discussion of Results

According to the results, the smaller the values of rAB and E(f), the greater the chances that the TD approach will minimize the forecasting error variance, and consequently the safety inventory levels. The Portfolio Effect explains part of this result: an individual item negatively correlated with the aggregate sales of the remaining items presents lower forecasting error variance under the TD approach, because part of its variance is offset by the aggregate variance of the remaining items.

Even for higher values of rAB and E(f), the TD approach may present better results than the BU if the value of k is sufficiently high, that is, if sA is sufficiently greater than sB. This result is fully explained by the Anchoring Effect (given that the Portfolio Effect is not present): a less than proportional portion of the total variance is added to the variance of product A. In Figure 1, the Portfolio Effect and the Anchoring Effect are illustrated with numerical examples for the particular case where µLT = 1 and sLT = 0 (equations (12) and (18)).


In Figure 2, the results of equation (19) are generalized, and the indifference lines between the TD and BU approaches are presented for different values of E(f), k and rAB. Essentially, if k > kcritical the TD approach must be chosen; otherwise, the BU approach. In other words, given a pair (E(f), rAB), if the actual value of k is greater than the value of kcritical associated with that pair, the TD approach must be chosen over the BU approach under Simple Exponential Smoothing.


One notices from Figure 2 that the smaller the values of rAB and E(f), the greater the chances (area of the graph above the indifference lines) that the TD approach minimizes the forecasting error variance with SES. This result is explained by the joint presence of the Portfolio and Anchoring Effects. However, even for higher values of E(f) and rAB, the TD approach may present lower forecasting error variance if the value of k is sufficiently high, due to the Anchoring Effect.

5. Conclusions and Implications

In this paper, the impacts of the Top-Down and Bottom-Up approaches upon the variance of the forecasting errors during the lead-time, and consequently upon safety inventory levels, were analytically derived. In both approaches, greater values of the smoothing constant and of the lead-time (expected value and variance) increase the error variance by the same amount.

The Portfolio and Anchoring Effects explain why the Top-Down approach may minimize the variance of the forecasting errors during the lead-time. Under the Portfolio Effect, present when the correlation is negative, the total variance is lower due to the offsetting of the individual item variance against the variance of the remaining individual items. Under the Anchoring Effect, which operates even when the correlation is positive, a less than proportional portion of the total variance is incorporated into the variance of the individual item, given that E(f) and α are lower than 1.

When the analytical solutions for the variance of the forecasting errors during the lead-time under the Top-Down and Bottom-Up approaches are set equal, the terms α, µLT and sLT cancel out. The choice of the lower-variance approach depends upon the comparison of the value of k with the kcritical given by equation (19). That is, the lead-time and the smoothing constant do not influence the determination of the more adequate forecasting approach.

Practitioners may also benefit from these results, since they warrant the flexibility often sought in the sales forecasting and inventory management process. First, the results enable one to determine, with relatively small computational effort, the approach that leads to the lower variance of the forecasting errors during the lead-time. Second, the results may be used to segment the sales forecasting process for the purpose of determining safety inventory levels, as shown in Table 2.

For example, C items typically present a lower share of total sales and a greater coefficient of variation (Zinn & Croxton, 2005). Given that the value of E(f) is small, the Anchoring Effect is as relevant as the Portfolio Effect even when the correlation is positive, as indicated by the practically straight indifference lines for E(f) values up to 0.25 in Figure 2. In those circumstances it is very likely that the value of k is greater than the value of kcritical, and the TD approach would lead to a lower variance of the forecasting errors during the lead-time.

On the other hand, since A items typically present a greater share of total sales and a lower coefficient of variation, they should be individually forecasted (BU approach), especially if they are positively correlated with the aggregate sales of the remaining items. When the correlation is negative, the TD approach may be more adequate if the value of k is sufficiently high (see Figure 2).

Finally, suggestions for future research involve the behaviour of the forecasting error variance under these two approaches when the standard deviation of f is not assumed to be zero. Some pertinent issues are thereby raised. For example, would this assumption favour a given forecasting approach? Specifically, under which conditions of α, rAB and k is it a good approximation to assume f constant? What are the biases incurred in that approximation? What are the impacts of the covariance between T and f on both forecasting approaches? What are the implications for sales forecasting and for the determination of safety inventory levels?


Received February 2007; accepted September 2007

Appendix 1 – Establishing E(FA)

Let the forecast of A with the SES be given by (A 1):

FA,t = α·At–1 + (1 – α)·FA,t–1      (A 1)

Taking the expected values, assumed to be constant over time, we have:

E(FA) = α·E(A) + (1 – α)·E(FA), hence E(FA) = E(A).

Appendix 2 – Establishing V(LTA)

The variance of the random sum of product A sales during the discrete lead-time was determined with the factorial moment generating function (Zwillinger & Kokosa, 2000), thus corroborating the results previously presented by Mentzer & Krishnan (1988).

F(t) represents the factorial moment generating function of the random sum of A during the lead-time. Its variance is obtained by evaluating the following expression at t = 1:

V(LTA) = F''(t) + F'(t) – [F'(t)]²      (B 1)

where F'(t) and F''(t) are the first and second derivatives of F(t), respectively. Let FA(t) and PLT(t) be the factorial moment generating functions of the random variables A (continuous) and LT (discrete), respectively, and let t be a supporting variable:

F(t) is obtained by substituting (B 2) into (B 3):

Finally, differentiating (B 4) with respect to t and substituting the results into (B 1), we have:

V(LTA) = µLT·V(A) + V(LT)·E(A)²

Appendix 3 – Establishing V(FA·LT)

The variance of the product of the random variables FA,t and LT yields a result identical to that presented by Brown (1982):

V(FA·LT) = E(FA)²·V(LT) + E(LT)²·V(FA) + V(FA)·V(LT)      (C 1)

This result is equivalent to calculating the variance of the frozen sales forecast during the lead-time from equation (C 1).

Appendix 4 – Establishing V(FA)

Let the forecast of At with SES be given by (D 1):

FA,t = α·At–1 + (1 – α)·FA,t–1      (D 1)

Let (D 2) be the expansion of the recursive terms in (D 1):

FA,t = α·∑i=0..∞ (1 – α)^i·At–1–i      (D 2)

Taking the variances of (D 2), we have:

V(FA) = α²·∑i=0..∞ (1 – α)^(2i)·V(A)      (D 3)

Finally, solving the sum to infinity of the geometric progression with ratio (1 – α)² in (D 3), we have:

V(FA) = [α/(2 – α)]·V(A)      (D 4)
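The geometric-series result (D 4) can be checked numerically (our own sketch for a zero-mean, i.i.d. sales process):

```python
import random
import statistics

random.seed(1)
alpha, sigma = 0.4, 5.0

level = 0.0
samples = []
for t in range(200000):
    x = random.gauss(0.0, sigma)               # i.i.d. zero-mean sales
    level = alpha * x + (1.0 - alpha) * level  # SES recursion (D 1)
    if t > 1000:                               # discard burn-in
        samples.append(level)

var_forecast = statistics.variance(samples)
var_theory = alpha / (2.0 - alpha) * sigma**2  # equation (D 4): 6.25 here
```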

  • (1) Blackburn, K.; Orduna, F. & Sola, M. (1995). Exponential smoothing and spurious correlation: a note. Applied Economic Letters, 2, 76-79.
  • (2) Brown, R.G. (1982). Advanced Service Parts Inventory Control. Materials Management Systems Inc, Vermont.
  • (3) Chatfield, C.; Koehler, A.B. & Snyder, R.D. (2001). A new look at models for exponential smoothing. The Statistician, 50, 147-159.
  • (4) Eppen, G. & Martin, K. (1988). Determining safety stock in the presence of stochastic lead time and demand. Management Science, 34, 1380-1390.
  • (5) Gelly, P. (1999). Managing bottom-up and top-down approaches: ocean spray's experiences. The Journal of Business Forecasting, 18, 3-6.
  • (6) Gijbels, I.; Pope, A. & Wand, M.P. (1999). Understanding exponential smoothing via kernel regression. Journal of the Royal Statistical Society, 61, 39-50.
  • (7) Gordon, T.P.; Morris, J.S. & Dangerfield, B.J. (1997). Top-down or bottom-up: which is the best approach to forecasting? The Journal of Business Forecasting, 6, 13-16.
  • (8) Greene, J.H. (1997). Production and Inventory Control Handbook. McGraw-Hill, New York.
  • (9) Harrison, P.J. (1967). Exponential smoothing and short-term sales forecasting. Management Science, 13, 821-842.
  • (10) Jain, C.L. (1995). How to determine the approach to forecasting. The Journal of Business Forecasting, 14, 2-3.
  • (11) Johnston, F.R. & Harrison, P.J. (1986). The variance of lead-time demand. Journal of the Operational Research Society, 31, 303-308.
  • (12) Kahn, K.B. (1998). Revisiting top-down versus bottom-up forecasting. The Journal of Business Forecasting, 17, 14-19.
  • (13) Lapide, L. (1998). A simple view of top-down versus bottom-up forecasting. The Journal of Business Forecasting, 17, 28-29.
  • (14) Mentzer, J.T. & Krishnan, R. (1988). The effect of the assumption of normality on inventory control/customer service. Journal of Business Logistics, 6, 101-120.
  • (15) Schwarzkopf, A.B.; Tersine, R.J. & Morris, J.S. (1988). Top-down versus bottom-up forecasting strategies. International Journal of Production Research, 26, 1833-1843.
  • (16) Silver, E.A. & Peterson, R. (1985). Decision Systems for Inventory Management and Production Planning. Wiley & Sons, New York.
  • (17) Silver, E.A.; Pyke, D. & Peterson, R. (2002). Decision Systems for Inventory Management and Production Planning and Scheduling. Wiley & Sons, New York.
  • (18) Snyder, R.; Koehler, A.; Hyndman, R. & Ord, J. (2004). Exponential smoothing models: mean and variances for lead-time demand. European Journal of Operational Research, 158, 444-455.
  • (19) Zinn, W. & Croxton, K.L. (2005). Inventory considerations in network design. Journal of Business Logistics, 26, 149-168.
  • (20) Zinn, W.; Levy, M. & Bowersox, D.J. (1989). Measuring the effect of inventory centralization/decentralization on aggregate safety stock: the 'square root law' revisited. Journal of Business Logistics, 10, 1-14.
  • (21) Zwillinger, D. & Kokosa, S. (2000). Standard Probability and Statistics Tables and Formulae. Chapman & Hall, New York.


Publication dates: published in this collection 05 Mar 2008; date of issue 2007.

Sociedade Brasileira de Pesquisa Operacional, Rio de Janeiro – RJ, Brazil. E-mail: sobrapo@sobrapo.org.br