Kendall rank correlation coefficient

In statistics, the Kendall rank correlation coefficient, commonly referred to as Kendall's tau coefficient (after the Greek letter τ), is a statistic used to measure the association between two measured quantities. A tau test is a non-parametric hypothesis test for statistical dependence based on the tau coefficient.

It is a measure of rank correlation: the similarity of the orderings of the data when ranked by each of the quantities. It is named after Maurice Kendall, who developed it in 1938,[1] though Gustav Fechner had proposed a similar measure in the context of time series in 1897.[2]

Contents

  • 1 Definition
    • 1.1 Properties
  • 2 Hypothesis test
  • 3 Accounting for ties
    • 3.1 Tau-a
    • 3.2 Tau-b
    • 3.3 Tau-c
  • 4 Significance tests
  • 5 Algorithms
  • 6 See also
  • 7 References
  • 8 Further reading
  • 9 External links

Definition

Let (x_1, y_1), (x_2, y_2), …, (x_n, y_n) be a set of observations of the joint random variables X and Y, such that all the values of (x_i) and (y_i) are unique. Any pair of observations (x_i, y_i) and (x_j, y_j) is said to be concordant if the ranks for both elements agree: that is, if both x_i > x_j and y_i > y_j, or if both x_i < x_j and y_i < y_j. They are said to be discordant if x_i > x_j and y_i < y_j, or if x_i < x_j and y_i > y_j. If x_i = x_j or y_i = y_j, the pair is neither concordant nor discordant.

The Kendall τ coefficient is defined as:

\tau = \frac{(\text{number of concordant pairs}) - (\text{number of discordant pairs})}{\frac{1}{2} n (n-1) } .[3]
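
As an illustration, this definition translates directly into a short Python function (a minimal sketch assuming no tied values; the function name kendall_tau and the quadratic pairwise loop are illustrative only):

from itertools import combinations

def kendall_tau(x, y):
    # Count concordant and discordant pairs over all n(n-1)/2 pairs of observations.
    n = len(x)
    concordant = discordant = 0
    for (x_i, y_i), (x_j, y_j) in combinations(zip(x, y), 2):
        product = (x_i - x_j) * (y_i - y_j)
        if product > 0:
            concordant += 1
        elif product < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

For example, kendall_tau([1, 2, 3, 4], [1, 3, 2, 4]) counts 5 concordant and 1 discordant pairs, giving τ = (5 − 1)/6 ≈ 0.67.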

Properties

The denominator is the total number of pair combinations, so the coefficient must be in the range −1 ≤ τ ≤ 1.

  • If the agreement between the two rankings is perfect (i.e., the two rankings are the same) the coefficient has value 1.
  • If the disagreement between the two rankings is perfect (i.e., one ranking is the reverse of the other) the coefficient has value −1.
  • If X and Y are independent, then we would expect the coefficient to be approximately zero.

Hypothesis test

The Kendall rank coefficient is often used as a test statistic in a statistical hypothesis test to establish whether two variables may be regarded as statistically dependent. This test is non-parametric, as it does not rely on any assumptions on the distributions of X or Y or the distribution of (X,Y).

Under the null hypothesis of independence of X and Y, the sampling distribution of τ has an expected value of zero. The precise distribution cannot be characterized in terms of common distributions, but may be calculated exactly for small samples; for larger samples, it is common to use an approximation to the normal distribution, with mean zero and variance

\frac{2(2n+5)}{9n (n-1)}.[4]
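
In practice, statistical libraries expose this test directly; for example, SciPy's scipy.stats.kendalltau returns both the coefficient and a p-value for the null hypothesis of independence (a usage sketch only; whether exact or approximate p-values are used for small samples and ties depends on the library version):

from scipy import stats

x = [12, 2, 1, 12, 2]
y = [1, 4, 7, 1, 0]
tau, p_value = stats.kendalltau(x, y)  # tau coefficient and its two-sided p-value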

Accounting for ties

A pair {(x_i, y_i), (x_j, y_j)} is said to be tied if x_i = x_j or y_i = y_j; a tied pair is neither concordant nor discordant. When tied pairs arise in the data, the coefficient may be modified in a number of ways to keep it in the range [−1, 1]:

Tau-a

The Tau-a statistic measures the strength of association in cross tabulations where both variables are measured on an ordinal scale. It makes no adjustment for ties and is defined as:

\tau_A = \frac{n_c-n_d}{n_0}

where n_c, n_d and n_0 are defined as in the next section.

Tau-b

The Tau-b statistic, unlike Tau-a, makes adjustments for ties.[5] Values of Tau-b range from −1 (100% negative association, or perfect inversion) to +1 (100% positive association, or perfect agreement). A value of zero indicates the absence of association.

The Kendall Tau-b coefficient is defined as:

\tau_B = \frac{n_c-n_d}{\sqrt{(n_0-n_1)(n_0-n_2)}}

where

\begin{align} n_0 & = n(n-1)/2\\ n_1 & = \sum_i t_i (t_i-1)/2 \\ n_2 & = \sum_j u_j (u_j-1)/2 \\ n_c & = \text{Number of concordant pairs} \\ n_d & = \text{Number of discordant pairs} \\ t_i & = \text{Number of tied values in the } i^\text{th} \text{ group of ties for the first quantity} \\ u_j & = \text{Number of tied values in the } j^\text{th} \text{ group of ties for the second quantity} \end{align}
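
Following these definitions, the Tau-b computation can be sketched in Python (a hedged sketch; the helper name kendall_tau_b is illustrative, and tie groups are obtained by counting equal values with collections.Counter):

import math
from collections import Counter
from itertools import combinations

def kendall_tau_b(x, y):
    # n0: total number of pairs; n1, n2: tie corrections for x and y.
    n = len(x)
    n0 = n * (n - 1) // 2
    n1 = sum(t * (t - 1) // 2 for t in Counter(x).values())
    n2 = sum(u * (u - 1) // 2 for u in Counter(y).values())
    nc = nd = 0
    for (x_i, y_i), (x_j, y_j) in combinations(zip(x, y), 2):
        product = (x_i - x_j) * (y_i - y_j)
        if product > 0:
            nc += 1
        elif product < 0:
            nd += 1
    return (nc - nd) / math.sqrt((n0 - n1) * (n0 - n2))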

Tau-c

Tau-c differs from Tau-b in being more suitable for rectangular contingency tables than for square tables.

Significance tests

When two quantities are statistically independent, the distribution of \tau is not easily characterizable in terms of known distributions. However, for \tau_A the following statistic, z_A, is approximately distributed as a standard normal when the variables are statistically independent:

z_A = \frac{3(n_c - n_d)}{\sqrt{n(n-1)(2n+5)/2}}

Thus, to test whether two variables are statistically dependent, one computes z_A, and finds the cumulative probability for a standard normal distribution at -|z_A|. For a 2-tailed test, multiply that number by two to obtain the p-value. If the p-value is below a given significance level, one rejects the null hypothesis (at that significance level) that the quantities are statistically independent.
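
As a sketch of this procedure in Python (the function name z_a_test is illustrative; only the standard library is used):

import math
from statistics import NormalDist

def z_a_test(nc, nd, n):
    # Normal approximation for tau_A under the null hypothesis of independence.
    z_a = 3 * (nc - nd) / math.sqrt(n * (n - 1) * (2 * n + 5) / 2)
    p_value = 2 * NormalDist().cdf(-abs(z_a))  # two-tailed p-value
    return z_a, p_value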

When accounting for ties, a number of adjustments must be made to z_A. The following statistic, z_B, has the same distribution as the \tau_B distribution, and is again approximately distributed as a standard normal when the quantities are statistically independent:

z_B = {n_c - n_d \over \sqrt{ v } }

where

\begin{array}{ccl} v & = & (v_0 - v_t - v_u)/18 + v_1 + v_2 \\ v_0 & = & n (n-1) (2n+5) \\ v_t & = & \sum_i t_i (t_i-1) (2 t_i+5)\\ v_u & = & \sum_j u_j (u_j-1)(2 u_j+5) \\ v_1 & = & \sum_i t_i (t_i-1) \sum_j u_j (u_j-1) / (2n(n-1)) \\ v_2 & = & \sum_i t_i (t_i-1) (t_i-2) \sum_j u_j (u_j-1) (u_j-2) / (9 n (n-1) (n-2)) \end{array}
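
These quantities transcribe directly into code; a hedged Python sketch (with the illustrative name z_b_test, and the tie group sizes t_i and u_j again taken from collections.Counter) might read:

import math
from collections import Counter

def z_b_test(nc, nd, x, y):
    n = len(x)
    t = list(Counter(x).values())   # tie group sizes for the first quantity
    u = list(Counter(y).values())   # tie group sizes for the second quantity
    v0 = n * (n - 1) * (2 * n + 5)
    vt = sum(ti * (ti - 1) * (2 * ti + 5) for ti in t)
    vu = sum(uj * (uj - 1) * (2 * uj + 5) for uj in u)
    v1 = sum(ti * (ti - 1) for ti in t) * sum(uj * (uj - 1) for uj in u) / (2 * n * (n - 1))
    v2 = (sum(ti * (ti - 1) * (ti - 2) for ti in t)
          * sum(uj * (uj - 1) * (uj - 2) for uj in u) / (9 * n * (n - 1) * (n - 2)))
    v = (v0 - vt - vu) / 18 + v1 + v2
    return (nc - nd) / math.sqrt(v)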

The R package pvrank[6] computes rank correlations and their p-values with various options for tied ranks; it can compute exact Kendall coefficient test p-values for n ≤ 60.

Algorithms

The direct computation of the numerator n_c - n_d involves two nested iterations over all pairs, as in the following Python implementation:

def numerator(x, y):
    # Direct O(n^2) computation of n_c - n_d: sum the sign of the product of
    # differences over all pairs (i, j) with j < i.
    sign = lambda a: (a > 0) - (a < 0)
    numer = 0
    for i in range(1, len(x)):
        for j in range(i):
            numer += sign(x[i] - x[j]) * sign(y[i] - y[j])
    return numer

Although quick to implement, this algorithm is O(n^2) in complexity and becomes very slow on large samples. A more sophisticated algorithm[7] built upon the Merge Sort algorithm can be used to compute the numerator in O(n \cdot \log{n}) time.

Begin by ordering the data points by the first quantity, x, and secondarily (among ties in x) by the second quantity, y. With this initial ordering, y is not sorted, and the core of the algorithm consists of computing how many steps a Bubble Sort would take to sort this initial y. An enhanced Merge Sort algorithm, with O(n \log n) complexity, can be applied to compute the number of swaps, S(y), that would be required by a Bubble Sort to sort y. Then the numerator for \tau is computed as:

n_c-n_d = n_0 - n_1 - n_2 + n_3 - 2 S(y),

where n_3 is computed like n_1 and n_2, but with respect to the joint ties in x and y.
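
For completeness, the tie counts n_1, n_2 and n_3 can be obtained by grouping equal values (a minimal sketch; collections.Counter finds the tie groups, and Counter(zip(x, y)) groups the joint ties):

from collections import Counter

def tie_counts(x, y):
    # n_1: ties in x, n_2: ties in y, n_3: joint ties in (x, y).
    pairs = lambda counts: sum(c * (c - 1) // 2 for c in counts.values())
    return pairs(Counter(x)), pairs(Counter(y)), pairs(Counter(zip(x, y)))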

A Merge Sort partitions the data to be sorted, y, into two roughly equal halves, y_\mathrm{left} and y_\mathrm{right}, then sorts each half recursively, and finally merges the two sorted halves into a fully sorted vector. The number of Bubble Sort swaps is equal to:

S(y) = S(y_\mathrm{left}) + S(y_\mathrm{right}) + M(Y_\mathrm{left},Y_\mathrm{right})

where Y_\mathrm{left} and Y_\mathrm{right} are the sorted versions of y_\mathrm{left} and y_\mathrm{right}, and M(\cdot,\cdot) characterizes the Bubble Sort swap-equivalent for a merge operation. M(\cdot,\cdot) is computed as in the following function:

def M(L, R):
    # Count the Bubble Sort swaps needed to merge two already-sorted lists:
    # whenever R[j] precedes L[i], it is out of order with every remaining
    # element of L and therefore contributes len(L) - i swaps.
    i = j = n_swaps = 0
    while i < len(L) and j < len(R):
        if R[j] < L[i]:
            n_swaps += len(L) - i
            j += 1
        else:
            i += 1
    return n_swaps

A side effect of the above steps is that you end up with both a sorted version of x and a sorted version of y. With these, the factors t_i and u_j used to compute \tau_B are easily obtained in a single linear-time pass through the sorted arrays.
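
Putting the pieces together, the recursion for S(y) can be sketched in Python using the M function above (a hedged sketch; sort_and_count is an illustrative name, and heapq.merge performs the linear-time merge):

import heapq

def sort_and_count(y):
    # Returns a sorted copy of y together with the Bubble Sort swap count S(y),
    # following S(y) = S(y_left) + S(y_right) + M(Y_left, Y_right).
    if len(y) <= 1:
        return list(y), 0
    mid = len(y) // 2
    left, s_left = sort_and_count(y[:mid])
    right, s_right = sort_and_count(y[mid:])
    swaps = s_left + s_right + M(left, right)
    return list(heapq.merge(left, right)), swaps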

A second algorithm with O(n \cdot \log{n}) time complexity, based on AVL trees, was devised by David Christensen.[8] Yet another algorithm for O(n \cdot \log{n}) time complexity was proposed more recently.[9]

See also

References

  1. ^ Kendall, M. (1938). "A New Measure of Rank Correlation".  
  2. ^  
  3. ^ Nelsen, R.B. (2001), "Kendall tau metric", in Hazewinkel, Michiel,  
  4. ^ Prokhorov, A.V. (2001), "Kendall coefficient of rank correlation", in Hazewinkel, Michiel,  
  5. ^ Agresti, A. (2010). Analysis of Ordinal Categorical Data (Second ed.). New York: John Wiley & Sons. 
  6. ^ Amerise, I.L.; Marozzi, M.; Tarsitano, A. "R package pvrank". 
  7. ^ Knight, W. (1966). "A Computer Method for Calculating Kendall's Tau with Ungrouped Data".  
  8. ^ Christensen, David (2005). "Fast algorithms for the calculation of Kendall's τ".  
  9. ^ Campello, R.J.G.B.; Hruschka, E.R. (29 March 2009). "On comparing two sequences of numbers and its applications to clustering analysis". Information Sciences 179 (8): 1025–1039.  

Further reading

  • Abdi, H. (2007). "Kendall rank correlation" (PDF). In Salkind, N.J. Encyclopedia of Measurement and Statistics. Thousand Oaks (CA): Sage. 
  • Daniel, Wayne W. (1990). "Kendall's tau". Applied Nonparametric Statistics (2nd ed.). Boston: PWS-Kent. pp. 365–377.  
  • Kendall, M. (1948) Rank Correlation Methods, Charles Griffin & Company Limited
  • Bonett, DG & Wright, TA (2000) Sample size requirements for Pearson, Kendall, and Spearman correlations, Psychometrika, 65, 23–28.

External links

  • Tied rank calculation
  • Software for computing Kendall's tau on very large datasets
  • Online software: computes Kendall's tau rank correlation
  • The CORR Procedure: Statistical Computations – McDonough School of Business