
Clusters and pivots for evaluating a large number of alternatives in AHP

Alessio Ishizaka

University of Portsmouth, Portsmouth Business School, Strategy and Business Systems, Richmond Building, Portland Street, PO1 3ED Portsmouth, United Kingdom. E-mail: Alessio.Ishizaka@port.ac.uk

ABSTRACT

AHP has been successful in many cases but it has a major limitation: a large number of alternatives requires a high number of judgements in the comparison matrices. In order to reduce this problem, we present a method with clusters and pivots. This method also helps with a further four problems of the Analytic Hierarchy Process. It enlarges the comparison scale, facilitates the construction of a consistent or near consistent matrix, eliminates the problem of the choice of the priorities derivation method and allows the use of incomparable alternatives.

Keywords: supplier selection, Analytic Hierarchy Process (AHP), Multicriteria Decision Making (MCDM).

1 INTRODUCTION

The Analytic Hierarchy Process (AHP) is a multi-criteria decision making method developed by Saaty (1977; 1980). Since its inception, it has been applied successfully on numerous occasions: see for example the reviews (Zahedi, 1986; Golden & Wasil et al., 1989; Shim, 1989; Vargas, 1990; Saaty & Forman, 1992; Forman & Gass, 2001; Kumar & Vaidya, 2006; Omkarprasad & Sushil, 2006; Ho, 2008; Liberatore & Nydick, 2008; Sipahi & Timor, 2010). The method is similar to the weighted sum model with the exception of the weight allocation process. AHP is a successive pairwise comparison process between each criterion and each alternative, rather than a simultaneous process like the weighted sum model. Psychologists have used this technique for a long time to compare affective alternatives (Yokoyama, 1921; Saaty & Ozdemir, 2003). They argue it is easier and more accurate to express one's opinion on only two alternatives than simultaneously on all the alternatives. The absence of units is another advantage: a comparison is a relative value or a quotient a/b of two quantities a and b of the same kind. The decision maker does not need to give a numerical judgement; instead a relative verbal appreciation, more familiar in our daily lives, is sufficient. Saaty has proposed a comparison scale of nine levels (see Table 1).

However, Saaty's scale sometimes suffers from shortcomings (Dyer, 1990; Ho & Xu et al., 2010). Some decision problems need a larger scale. AHP also suffers from other criticisms:

– the number of required comparisons could be very high,

– only consistent or near consistent comparison matrices can be used to calculate meaningful priorities, which could often necessitate many reconsiderations of the entries,

– the priorities depend on the method used to derive them,

– incomparable alternatives are not allowed.

This paper shows how clusters and pivots can solve these problems.

2 CLUSTERS AND PIVOTS

2.1 Description of the method

The clusters and pivots method is useful for application on matrices of high rank, when the number of pairwise comparisons becomes overwhelming. It is based on four steps:

a) For each criterion, the alternatives are preordered. Each criterion typically produces a different order; otherwise all criteria would be replicates of each other and the problem would be mono-criterion.

b) Alternatives are divided into clusters. Classical cluster analysis cannot be used in this case because we do not know a priori the scores of each alternative on the criteria, which are often subjective to the decision-maker (it is the task of AHP to calculate the local priorities). The decision-maker must evaluate which alternatives are close enough and therefore easy to compare. A heuristic way to construct the clusters is to compare the best ordered alternative successively with the next ones, from the second best to the worst, until:

– either the cluster contains 7 elements. Psychologists have observed that it is difficult to evaluate more than 7 elements (Saaty & Ozdemir, 2003). Therefore, it is recommended to build clusters that do not contain more than 7 elements.

– or the entered comparison is 9. As no higher intensity of preference is available on the comparison scale (Table 1), it is appropriate to close the cluster.

The last compared alternative becomes the pivot at the boundary of both clusters. The same process is repeated with the pivot until 7 elements are in the cluster or a comparison value of 9 is entered, or all entries are provided. In Figure 1, alternative D is the pivot.


c) Comparisons are entered in clustered matrices and priorities are calculated.

d) Priorities of the clusters are joined with a common element: the pivot. The ratio of the pivot's scores in the two clusters provides the conversion rate between them (a sketch of this step is given below).
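To make step d) concrete, the following is a minimal Python sketch of how two clusters sharing a pivot could be merged. The function and the priority values are illustrative assumptions, not part of the original method description.

```python
def join_clusters(cluster1, cluster2, pivot):
    """Merge the local priorities of two clusters that share a pivot alternative.

    cluster1, cluster2: dicts mapping alternative name -> local priority.
    pivot: the alternative present in both clusters.
    The second cluster is rescaled by the ratio of the pivot's scores
    (the "conversion rate"), then the merged vector is renormalised to sum to 1.
    """
    rate = cluster1[pivot] / cluster2[pivot]
    merged = dict(cluster1)
    for name, priority in cluster2.items():
        if name != pivot:                    # the pivot already appears in cluster 1
            merged[name] = priority * rate
    total = sum(merged.values())
    return {name: p / total for name, p in merged.items()}

# Example with made-up priorities: D is the pivot, as in Figure 1
first = {"A": 0.50, "B": 0.25, "C": 0.15, "D": 0.10}
second = {"D": 0.60, "E": 0.30, "F": 0.10}
print(join_clusters(first, second, "D"))
```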

In AHP all alternatives are compared with each other in a unique comparison matrix, which can be perceived as a one-cluster problem. In a scoring model, direct judgements are used. We can consider each element to be a separate cluster. The AHP and the scoring model represent the two extremes, where 1 or n clusters are used. Our model is the middle way between these two methods.

The next sections detail the advantages of clusters and pivots in AHP.

2.2 Comparison scale

Because the 1-9 scale sometimes suffers from shortcomings, Saaty (2001) has already proposed to use clusters and pivots to extend the comparison scale. The scale is thereby extended from 1-9 to 1-9^n, where n is the number of clusters.
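As a worked illustration (with hypothetical alternatives X, P and Z): if X in the first cluster is judged 9 times better than the pivot P, and P is in turn judged 9 times better than Z in the second cluster, the implied ratio between X and Z is 9 ∙ 9 = 81, a preference intensity that a single 1-9 scale could not express; with n clusters chained by pivots the reachable ratio grows to 9^n.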

The fundamental scale (Table 1) is based on the logarithmic law of stimulus response (Saaty, 1980). In 1846, Weber found, for example, that people could distinguish two cups of tea with 2g and 4g of sugar but could not do so if the second had only 3g. At the same time, while they could not distinguish a tea with 10g of sugar from one with 12g, they could distinguish between 10g and 20g, and so on at higher levels. We need to increase a stimulus s by a Δs proportional to s in order to reach a point where our senses can discriminate between s and s + Δs. Therefore, it is more accurate to compare near alternatives gathered in a cluster.

2.3 Reduction of the number of entries

Central to the analysis stage of the Analytic Hierarchy Process (AHP) is a redundant pairwise comparison process for entering judgements among defined alternatives or criteria. For each local priority vector, (n² − n)/2 comparisons are required (where n is the dimension of the matrix). It is known that the total number of comparisons and the demands placed upon the decision maker can become excessive as the number of alternatives, criteria and hierarchical levels increases. For example, a simple hierarchy with nine alternatives and five criteria would require the decision maker to perform (5² − 5)/2 + 5 ∙ (9² − 9)/2 = 190 paired comparisons. In problems with very large hierarchies, the total number of comparisons can number in the thousands. In group decisions, the problem is exacerbated since each comparison carries the potential for lengthy debate.

In fact, only the first n − 1 comparisons are actually necessary; the remaining comparisons are redundant but valuable to crosscheck judgements, measure consistency and increase accuracy. Realizing these facts, Harker (1987) developed a procedure for calculating AHP priorities with missing judgements, a gradient procedure for choosing the next comparison, and stopping rules once sufficient redundancy has been achieved. However this method, allowing one to make fewer judgements, has a corresponding loss of accuracy (Forman, 1993).

We suggest subdividing a large number of alternatives into relatively homogeneous groups for assessment. Clusters also considerably reduce the number of entries. For example, nine alternatives and five criteria, when working with two clusters of three criteria, two clusters of four alternatives and one cluster of three alternatives (Fig. 2), would require only 2 ∙ (3² − 3)/2 + 5 ∙ 2 ∙ (4² − 4)/2 + 5 ∙ (3² − 3)/2 = 72 entries. Without clusters, 190 entries are required.
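The counting behind these figures can be sketched as follows; the exact cluster structure of Figure 2 is not reproduced here, only the per-matrix counts implied by the formula (n² − n)/2.

```python
def n_comparisons(n: int) -> int:
    """Pairwise comparisons required by an n x n judgement matrix: (n^2 - n) / 2."""
    return (n * n - n) // 2

# Classical AHP with 5 criteria and 9 alternatives, as in the example above
print(n_comparisons(5) + 5 * n_comparisons(9))                  # 10 + 5 * 36 = 190

# Splitting the 9 alternatives into clusters of 4, 4 and 3 linked by pivots
# replaces each 36-comparison matrix by three much smaller ones
print(n_comparisons(4) + n_comparisons(4) + n_comparisons(3))   # 6 + 6 + 3 = 15
```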


2.4 Consistency

A pairwise matrix is called consistent if the transitivity (1) and the reciprocity (2) rules are respected:

a_ik = a_ij ∙ a_jk for all i, j, k    (1)

a_ji = 1/a_ij for all i, j    (2)

where a_ij is the comparison between the elements i and j.

The calculated priorities are plausible only if the comparison matrices are consistent or near consistent. Especially for high order matrices, consistency is difficult to reach because the number of transitive rules (1) to be satisfied increases rapidly with the dimension of the matrix.

To improve an inconsistent matrix, a user can be urged to reconsider pairwise comparisons until the consistency measure proves to be satisfactory (Harker, 1987). Feedback after the completion of the comparison matrix is frustrating to the user, because it gives no hint as to which comparisons should be reconsidered. In a previous paper (Ishizaka & Lusti, 2004), an expert module was presented that intervenes whenever a pairwise comparison contradicts the comparisons made so far, explains the contradiction and suggests consistent alternatives. In such an approach, the user is guided through a sequence of four steps, which leads to a consistent or near consistent matrix. However helpful the method may be, it is difficult to improve the consistency of high order matrices. Often the user does not know which judgement to revise in order to improve the consistency.

Clusters and pivots split high-order matrices, which decreases the dimension. The consistency improvement is easier because fewer transitive rules have to be considered. For example, in two clusters of four alternatives, only 2 ∙ 4 = 8 transitive rules (without considering reciprocals) have to be considered. Without clusters, the comparison matrix would be of dimension 7 with 35 transitive rules to be considered!
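A minimal sketch of how consistency could be checked on a small clustered matrix, using Saaty's consistency ratio with the usual random indices from the AHP literature; the 4 x 4 example matrix is invented for illustration.

```python
import numpy as np

RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}   # Saaty's random consistency indices

def consistency_ratio(A: np.ndarray) -> float:
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1)."""
    n = A.shape[0]
    lambda_max = max(np.linalg.eigvals(A).real)
    return ((lambda_max - n) / (n - 1)) / RI[n]

# A 4 x 4 cluster matrix: only C(4,3) = 4 transitive rules per cluster to check,
# against C(7,3) = 35 for an unclustered 7 x 7 matrix
A = np.array([[1,   2,   4,   8],
              [1/2, 1,   2,   4],
              [1/4, 1/2, 1,   3],
              [1/8, 1/4, 1/3, 1]])
print(consistency_ratio(A))    # well below the usual 0.1 threshold
```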

2.5 Derivation of priorities

Different methods have been developed to derive priorities. They can be divided into two groups: the eigenvalue approach and the methods minimizing the distance between the user-defined matrix and the nearest consistent matrix.

2.5.1 Eigenvalue approach

Saaty (1977; 1980) proposes the principal eigenvector p as the desired priorities vector. It is derived from the following equation:

A ∙ p = λ ∙ p    (3)

where A is the comparison matrix; p is the priorities vector; and λ is the maximal eigenvalue.

Saaty justifies the eigenvalue approach for slightly inconsistent matrices with the perturbation theory, which says that slight variations in a consistent matrix imply slight variations of the eigenvector and the eigenvalue.

The rank reversal problem for scale inversion is the most pertinent criticism of the eigenvalue method (Johnson & Beine et al., 1979). The solution of the eigenequation (3) gives the right eigenvector p, which is not necessarily the same as the left eigenvector p': the implication A ∙ p = λ ∙ p ⇒ p^T ∙ A = λ ∙ p^T does not hold. The solution depends on the formulation of the problem! This right and left inconsistency (or asymmetry) arises only for inconsistent matrices with a dimension higher than three (Saaty & Vargas, 1984).
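A minimal numpy sketch of the eigenvalue approach of equation (3); the 3 x 3 matrix is an assumed example. The left-eigenvector variant is obtained by applying the same computation to the transposed matrix and taking element-wise reciprocals; for inconsistent matrices of order higher than three the two variants may rank the alternatives differently.

```python
import numpy as np

def eigenvector_priorities(A: np.ndarray) -> np.ndarray:
    """Normalised principal right eigenvector of the comparison matrix A (eq. (3))."""
    eigenvalues, eigenvectors = np.linalg.eig(A)
    p = np.abs(eigenvectors[:, np.argmax(eigenvalues.real)].real)
    return p / p.sum()

A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])
print(eigenvector_priorities(A))          # right-eigenvector priorities

# Left-eigenvector priorities: same computation on A transposed, then reciprocals
q = eigenvector_priorities(A.T)
left = (1.0 / q) / (1.0 / q).sum()
print(left)                               # identical ranking here (order-3 matrix)
```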

2.5.2 Geometric mean

The priorities are given by the geometric mean of the rows:

p_i = (a_i1 ∙ a_i2 ∙ ... ∙ a_in)^(1/n)

This vector minimizes the logarithmic error Σ_i Σ_j [ln a_ij − ln(p_i/p_j)]² (Crawford & Williams, 1985),

where a_ij is the comparison between i and j; p_i is the priority of i; and n is the dimension of the matrix.

This method is insensitive to an inversion of the scale: the geometric mean of the rows and the columns give the same ranking.
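A corresponding sketch for the geometric mean, on the same illustrative matrix as before. Because reciprocity gives a_ji = 1/a_ij, the row geometric means of the transposed matrix are the exact reciprocals of those of the original matrix, which is why an inversion of the scale cannot scramble the ranking.

```python
import numpy as np

def geometric_mean_priorities(A: np.ndarray) -> np.ndarray:
    """Normalised geometric means of the rows of the comparison matrix A."""
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return g / g.sum()

A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])
print(geometric_mean_priorities(A))
```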

2.5.3 Other methods

Many other methods have been proposed. Some minimise the distance between the user-defined matrix and the nearest consistent matrix in different metrics. These methods are difficult to apply. In particular, the least squares method has several minima, which makes the choice ambiguous. Saaty & Vargas (1984) give an example where the least squares method produces an illogical result.

Other methods (e.g. the arithmetic mean of the rows), which provide exactly the same priorities as any other classical method for consistent matrices, have also been used. However, their application to inconsistent matrices has not yet been mathematically justified. These methods are rather used for fast calculations.

2.5.4 Choice of the derivation method

A heated discussion has arisen over the "best" method. One side supports the eigenvalue method (Saaty & Vargas, 1984; Saaty & Vargas, 1984; Harker & Vargas, 1987; Saaty, 2003),the other side argues for the geometric mean (Barzilai & Cook et al., 1987; Barzilai & Golany, 1990; Barzilai, 1997; Aguarón & Moreno-Jimenez, 2000; Barzilai, 2001; Aguaron & Moreno-Jiménez, 2003; Leskinen & Kangas, 2005).

In fact, the right eigenvalue method, the left eigenvalue method and the geometric mean produce the same result for matrices of order three (Saaty, 2001). For matrices of order four, ranking contradictions are rare (Ishizaka & Lusti, 2006). The use of clusters and pivots removes the problem of the choice of the derivation method if matrices are reduced to order three or four, where all methods provide the same or similar results.
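A small self-contained check of this property on an assumed, slightly inconsistent 3 x 3 matrix: the principal eigenvector and the row geometric means yield the same priorities.

```python
import numpy as np

A = np.array([[1,   4,   7],
              [1/4, 1,   2],
              [1/7, 1/2, 1]])                       # slightly inconsistent, order 3

eigenvalues, eigenvectors = np.linalg.eig(A)
p_eig = np.abs(eigenvectors[:, np.argmax(eigenvalues.real)].real)
p_eig /= p_eig.sum()

p_gm = np.prod(A, axis=1) ** (1/3)
p_gm /= p_gm.sum()

print(p_eig)
print(p_gm)        # same priorities for a matrix of order three
```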

2.6 Incomparable alternatives

Sometimes, alternatives are not comparable because their differences are too large. For example, it is easy to compare the prestige of a BMW and a Mercedes because they are close. It is much more difficult to quantify precisely how many times a Mercedes is more prestigious than a Fiat because of the large difference. Typically, they do not belong to the same class of prestige. To bypass this problem, we gather comparable alternatives in clusters; alternatives which cannot be compared are thus separated. To unify the clusters we use a common alternative, the pivot, for both clusters. The drawback of this process is to find a common alternative at the boundary of both clusters, which may not be trivial.

3 CASE STUDY: SUPPLIER SELECTION

3.1 Introduction

Suppliers are of tremendous importance to their clients. This importance is accentuated by the pressure to reduce the supply base (Ogden, 2006; Sarkar & Mohapatra, 2006). A wrong supplier selection could lead to serious consequences. Hoecht & Trott (2006) have enumerated several problems due to poor suppliers and unhealthy buyer-supplier relationships. Recent surveys confirm the importance given to the supplier selection process (Lieb & Bentz, 2005; Lieb & Bentz, 2006; Lieb & Butner, 2007), but the analysis of criteria for selecting suppliers and measuring their performance has been the focus of many researchers and purchasing practitioners since the 1960s. The multicriteria nature of the problem was recognised very early. Aissaoui et al. (2007), De Boer et al. (2001), El-Sawalhi et al. (2007) and Ho et al. (2010) have compiled reviews of selection methods. Among them, the Analytic Hierarchy Process (AHP) has been widely used, but only when the number of suppliers is small. In this case study, we illustrate the use of clusters and pivots in AHP for a supplier selection problem among twelve candidates evaluated on the basis of their quality, on-time delivery and costs.

3.2 Quality

The twelve prospective suppliers A, B, C, D, E, F, G, H, I, J, K, L are preordered according to their quality. Then A, the best supplier for quality, is compared successively with the following ones until the entered comparison is 9 (Table 2). As supplier A is 9 times better than F for quality, the alternative F is declared the pivot: the last element in the first cluster and the first element in the second cluster. It can be noticed that the process requires 30 fewer comparisons than the classical AHP approach (in grey in Table 2).

Priorities are calculated for both clusters (Tables 3 and 4). Results of the second cluster are linked to the first one by dividing them by the ratio of the scores of the pivot F in the two clusters: 0.311/0.029.
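Using only the two pivot scores quoted above (the remaining values of Tables 3 and 4 are not reproduced here), the conversion works as follows:

```python
# Pivot F scores 0.311 in the second cluster and 0.029 in the first (Tables 3 and 4)
conversion_rate = 0.311 / 0.029          # about 10.7

# Every priority of the second cluster is divided by this rate so that
# F receives the same value in both clusters
print(0.311 / conversion_rate)           # 0.029: F maps back onto its first-cluster score
```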

3.3 Costs

The twelve prospective suppliers are preordered according to the criterion costs: K, H, C, A, E, D, B, G, L, J, I, F. Three clusters are built with the alternatives C and B as pivots. The process requires 38 fewer comparisons than the classical AHP approach (in grey in Table 5).

Priorities are calculated for the three clusters. Results of the second cluster (Table 7) are linked to the first one (Table 6) by dividing them by the ratio of the scores of the pivot B in the two clusters: 0.454/0.041. Results of the third cluster (Table 8) are converted by the ratio of the pivot C: 5.293/0.068.

3.4 On-time delivery

The twelve prospective suppliers are preordered according to the criterion on-time delivery: E, H, C, K, I, L, D, A, G, J, F, B. Two clusters are built with the alternative A as pivot. The process requires 28 fewer comparisons than the classical AHP approach (in grey in Table 9). Results of the second cluster (Table 11) are linked to the first one (Table 10) by dividing them by the ratio of the scores of the pivot A in the two clusters: 0.460/0.021.

3.5 Criteria evaluation

The criteria quality, on-time delivery and costs are also evaluated in a pairwise matrix.

3.6 Aggregation of priorities

Finally, the local priorities are aggregated (Table 13). The best supplier is K, thanks especially to its low costs.
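A minimal sketch of this final aggregation step with hypothetical numbers; the actual criteria weights and local priorities are those of the paper's tables and are not reproduced here.

```python
# Hypothetical weights and local priorities, for illustration only
criteria_weights = {"quality": 0.3, "on-time delivery": 0.2, "costs": 0.5}
local_priorities = {
    "K": {"quality": 0.05, "on-time delivery": 0.09, "costs": 0.30},
    "A": {"quality": 0.25, "on-time delivery": 0.04, "costs": 0.08},
}

global_priorities = {
    supplier: sum(criteria_weights[c] * scores[c] for c in criteria_weights)
    for supplier, scores in local_priorities.items()
}
print(global_priorities)      # weighted sum of the local priorities per supplier
```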

4 CONCLUSION

Although AHP has been applied in numerous situations with impressive results, it can present certain problems. In particular, the high number of pairwise comparisons required may deter decision makers from considering a large number of alternatives. In fact, in the reviewed papers applying AHP to supplier selection, the number of alternatives is small. In order to prevent an artificial early elimination of alternatives, we have presented in this paper a method with clusters and pivots. This technique, which can also be applied in other decision problems, has the following benefits:

– decreases the number of required comparisons,

– enlarges the fundamental scale from 1-9 to 1-9^n, where n is the number of clusters,

– facilitates the construction of consistent or near consistent matrices,

– eliminates the problem of the choice of the derivation method because all methods provide the same result,

– gives the opportunity to avoid the direct comparison of two alternatives which are difficult to compare.

However, clusters and pivots should be used only when groups can be built so that the degree of association is strong between members of the same cluster and weak between members of different clusters. The pivot, which joins two clusters, must be selected carefully because of its central role: inaccuracies in comparisons involving the pivot are propagated to all elements of the cluster. An excessive use of clusters also reduces the redundancy of judgements and the possibility of consistency checks, and therefore increases inaccuracy.

Clusters and pivots help in five problems of AHP, but further research is needed to solve remaining problems of AHP:

– The rank reversal problem (Belton & Gear, 1983) has been much debated but never fully resolved, see a review in (Ishizaka & Labib, 2009; Ishizaka & Labib, 2011).

– The use of verbal scales may be ambiguous and their conversion to a numerical scale is subjective to the decision-maker (Donegan & Dodd et al., 1992; Ishizaka & Balkenborg et al., 2010).

– A ratio scale is inappropriate when an absolute zero does not exist on the preference scale (Barzilai, 2005).

– The consistency check has been criticized because it allows contradictory judgements in matrices (Kwiesielewicz & van Uden, 2004; Bana e Costa & Vansnick, 2008) or rejects reasonable matrices (Karapetrovic & Rosenbloom, 1999).

ACKNOWLEDGMENTS

I wish to thank Ian Stevens for the proofreading. I also thank the two anonymous reviewers for the valuable feedback and constructive criticism.

Received July 27, 2010 / Accepted June 15, 2011

  • 1. AGUARÓN J & MORENO-JIMÉNEZ J. 2000. Local Stability Intervals in the Analytic Hierarchy Process. European Journal of Operational Research, 125(1): 113-132.
  • 2. AGUARÓN J & MORENO-JIMÉNEZ J. 2003. The Geometric Consistency Index: Approximated Thresholds. European Journal of Operational Research, 147(1): 137-145.
  • 3. AISSAOUI N & HAOUARI M ET AL. 2007. Supplier selection and order lot sizing modeling: A review. Computers & Operations Research, 34(12): 3516-3540.
  • 4. BANA E COSTA C & VANSNICK J. 2008. A Critical Analysis of the Eigenvalue Method Used to Derive Priorities in AHP. European Journal of Operational Research, 187(3): 1422-1428.
  • 5. BARZILAI J. 1997. Deriving Weights from Pairwise Comparisons Matrices. Journal of the Operational Research Society, 48(12): 1226-1232.
  • 6. BARZILAI J. 2001. Notes on the analytic hierarchy process. Proceedings of the NSF Design and Manufacturing Research Conference, Tampa.
  • 7. BARZILAI J. 2005. Measurement and preference function modelling. International Transactions in Operational Research, 12(2): 173-183.
  • 8. BARZILAI J & COOK WD ET AL. 1987. Consistent weights for judgements matrices of the relative importance of alternatives. Operations Research Letters, 6(1): 131-134.
  • 9. BARZILAI J & GOLANY B. 1990. Deriving weights from pairwise comparison matrices: the additive case. Operations Research Letters, 9(6): 407-410.
  • 10. BELTON V & GEAR A. 1983. On a Shortcoming of Saaty's Method of Analytical Hierarchies. Omega, 11(3): 228-230.
  • 11. CRAWFORD G & WILLIAMS C. 1985. A Note on the Analysis of Subjective Judgement Matrices. Journal of Mathematical Psychology, 29(4): 387-405.
  • 12. DE BOER L & LABRO E ET AL. 2001. A review of methods supporting supplier selection. European Journal of Purchasing & Supply Management, 7(2): 75-89.
  • 13. DONEGAN H & DODD F ET AL. 1992. A new approach to AHP decision-making. The Statistician, 41(3): 295-302.
  • 14. DYER J. 1990. Remarks on the Analytic Hierarchy Process. Management Science, 36(3): 249-258.
  • 15. EL-SAWALHI N & EATON D ET AL. 2007. Contractor pre-qualification model: State-of-the-art. International Journal of Project Management, 25(5): 465-474.
  • 16. FORMAN E & GASS S. 2001. The Analytic Hierarchy Process: An Exposition. Operations Research, 49(4): 469-486.
  • 17. FORMAN EH. 1993. Facts and fictions about the analytic hierarchy process. Mathematical and Computer Modelling, 17(4-5): 19-26.
  • 18. GOLDEN B & WASIL E ET AL. 1989. The Analytic Hierarchy Process: Applications and Studies. Heidelberg, Springer-Verlag.
  • 19. HARKER P & VARGAS L. 1987. The Theory of Ratio Scale Estimation: Saaty's Analytic Hierarchy Process. Management Science, 33(11): 1383-1403.
  • 20. HARKER PT. 1987. Incomplete pairwise comparisons in the analytic hierarchy process. Mathematical Modelling, 9(11): 837-848.
  • 21. HO W. 2008. Integrated analytic hierarchy process and its applications: A literature review. European Journal of Operational Research, 186(1): 211-228.
  • 22. HO W & XU X ET AL. 2010. Multi-criteria decision making approaches for supplier evaluation and selection: A literature review. European Journal of Operational Research, 202(1): 16-24.
  • 23. HOECHT A & TROTT P. 2006. Outsourcing, information leakage and the risk of losing technology based competencies. European Business Review, 18(5): 395-412.
  • 24. ISHIZAKA A & BALKENBORG D ET AL. 2010. Influence of aggregation and measurement scale on ranking a compromise alternative in AHP. Journal of the Operational Research Society, 62(4): 700-710.
  • 25. ISHIZAKA A & LABIB A. 2009. Analytic Hierarchy Process and Expert Choice: benefits and limitations. OR Insight, 22(4): 201-220.
  • 26. ISHIZAKA A & LABIB A. 2011. Review of the main developments in the analytic hierarchy process. Expert Systems with Applications, 38(11): 14336-14345.
  • 27. ISHIZAKA A & LUSTI M. 2004. An Expert Module to Improve the Consistency of AHP Matrices. International Transactions in Operational Research, 11(1): 97-105.
  • 28. ISHIZAKA A & LUSTI M. 2006. How to derive priorities in AHP: a comparative study. Central European Journal of Operations Research, 14(4): 387-400.
  • 29. JOHNSON C & BEINE W ET AL. 1979. Right-Left Asymmetry in an Eigenvector Ranking Procedure. Journal of Mathematical Psychology, 19(1): 61-64.
  • 30. KARAPETROVIC S & ROSENBLOOM E. 1999. A Quality Control Approach to Consistency Paradoxes in AHP. European Journal of Operational Research, 119(3): 704-718.
  • 31. KUMAR S & VAIDYA O. 2006. Analytic hierarchy process: An overview of applications. European Journal of Operational Research, 169(1): 1-29.
  • 32. KWIESIELEWICZ M & VAN UDEN E. 2004. Inconsistent and Contradictory Judgements in Pairwise Comparison Method in AHP. Computers and Operations Research, 31(5): 713-719.
  • 33. LESKINEN P & KANGAS J. 2005. Rank reversal in multi-criteria decision analysis with statistical modelling of ratio-scale pairwise comparisons. Journal of the Operational Research Society, 56(7): 855-861.
  • 34. LIBERATORE M & NYDICK R. 2008. The analytic hierarchy process in medical and health care decision making: A literature review. European Journal of Operational Research, 189(1): 194-207.
  • 35. LIEB R & BENTZ B. 2005. The North American third party logistics in 2004: the provider CEO perspective. International Journal of Physical Distribution & Logistics Management, 35(8): 595-611.
  • 36. LIEB R & BENTZ B. 2006. The 3PL Industry in Asia/Pacific. Supply Chain Management Review, 10(9): 10-15.
  • 37. LIEB R & BUTNER K. 2007. The North American Third-Party Logistics Industry in 2006: The Provider CEO Perspective. Transportation Journal, 46(3): 40-52.
  • 38. OGDEN J. 2006. Supply base reduction: an empirical study of critical success factors. Journal of Supply Chain Management, 42(4): 29-39.
  • 39. OMKARPRASAD V & SUSHIL K. 2006. Analytic hierarchy process: an overview of applications. European Journal of Operational Research, 169(1): 1-29.
  • 40. SAATY T. 1977. A scaling method for priorities in hierarchical structures. Journal of Mathematical Psychology, 15(3): 234-281.
  • 41. SAATY T. 1980. The Analytic Hierarchy Process. New York, McGraw-Hill.
  • 42. SAATY T. 2001. Decision-making with the AHP: Why is the principal eigenvector necessary? Proceedings of the sixth international symposium on the analytic hierarchy process (ISAHP 2001), Bern.
  • 43. SAATY T. 2001. The seven pillars of the Analytic Hierarchy Process. Multiple Criteria Decision Making in the New Millennium. Proceedings of the 15th International Conference, MCDM. M. Köksalan. Berlin, Springer: 15-37.
  • 44. SAATY T. 2003. Decision-making with the AHP: Why is the Principal Eigenvector necessary? European Journal of Operational Research, 145(1): 85-91.
  • 45. SAATY T & FORMAN E. 1992. The Hierarchon: A Dictionary of Hierarchies. Pittsburgh, RWS Publications.
  • 46. SAATY T & OZDEMIR M. 2003. Why the magic number seven plus or minus two. Mathematical and Computer Modelling, 38(3-4): 233-244.
  • 47. SAATY T & VARGAS L. 1984. Comparison of Eigenvalue, Logarithmic Least Squares and Least Squares Methods in Estimating Ratios. Mathematical Modeling, 5(5): 309-324.
  • 48. SAATY T & VARGAS L. 1984. Inconsistency and Rank Preservation. Journal of Mathematical Psychology, 28(2): 205-214.
  • 49. SARKAR A & MOHAPATRA P. 2006. Evaluation of supplier capability and performance: A method for supply base reduction. Journal of Purchasing and Supply Management, 12(3): 148-163.
  • 50. SHIM J. 1989. Bibliography research on the analytic hierarchy process (AHP). Socio-Economic Planning Sciences, 23(3): 161-167.
  • 51. SIPAHI S & TIMOR M. 2010. The analytic hierarchy process and analytic network process: an overview of applications. Management Decision, 48(5): 775-808.
  • 52. VARGAS L. 1990. An overview of the analytic hierarchy process and its applications. European Journal of Operational Research, 48(1): 2-8.
  • 53. YOKOYAMA M. 1921. The nature of the affective judgment in the method of paired comparison. The American Journal of Psychology, 32: 357-369.
  • 54. ZAHEDI F. 1986. The analytic hierarchy process: a survey of the method and its applications. Interfaces, 16(4): 96-108.
