# Integral test

In mathematics, the integral test for convergence is a method used to test infinite series of non-negative terms for convergence. It was developed by Colin Maclaurin and Augustin-Louis Cauchy and is sometimes known as the Maclaurin–Cauchy test.

## Statement of the test

Consider an integer N and a non-negative function f defined on the unbounded interval [N, ∞), on which it is monotonically decreasing. Then the infinite series

$\sum_{n=N}^\infty f(n)$

converges to a real number if and only if the improper integral

$\int_N^\infty f(x)\,dx$

is finite. In particular, if the integral diverges, then the series diverges as well.

### Remark

The proof also gives the lower and upper bounds

$\int_N^\infty f(x)\,dx\le\sum_{n=N}^\infty f(n)\le f(N)+\int_N^\infty f(x)\,dx$

for the infinite series.
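As a quick numerical illustration of these bounds, one can take f(x) = 1/x² with N = 1 (an illustrative choice for this sketch): the integral evaluates to 1 and the series to π²/6, which indeed lies between the two bounds.

```python
import math

# Sketch: check the sandwich bounds for f(x) = 1/x^2 with N = 1
# (an illustrative choice; any non-negative monotone decreasing f works).
def f(x):
    return 1.0 / x**2

N = 1
integral = 1.0 / N        # ∫_N^∞ x^(-2) dx = 1/N by the power rule
series = math.pi**2 / 6   # Σ_{n=1}^∞ 1/n^2 = π²/6 (Basel problem)

# ∫_N^∞ f ≤ Σ f(n) ≤ f(N) + ∫_N^∞ f
assert integral <= series <= f(N) + integral
print(integral, series, f(N) + integral)  # → 1.0 1.6449340668482264 2.0
```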

## Proof

The proof uses the comparison test, comparing the term f(n) with the integral of f over the intervals [n, n + 1) and [n − 1, n), respectively.

Since f is a monotone decreasing function, we know that

$f(x)\le f(n)\quad\text{for all }x\in[n,\infty)$

and

$f(n)\le f(x)\quad\text{for all }x\in[N,n].$

Hence, for every integer n ≥ N,

$\int_n^{n+1}f(x)\,dx\le\int_n^{n+1}f(n)\,dx=f(n),\qquad(1)$

and, for every integer n ≥ N + 1,

$f(n)=\int_{n-1}^n f(n)\,dx\le\int_{n-1}^n f(x)\,dx.\qquad(2)$

By summation over all n from N to some larger integer M, we get from (1)



$\int_N^{M+1}f(x)\,dx=\sum_{n=N}^M\underbrace{\int_n^{n+1}f(x)\,dx}_{\le\,f(n)}\le\sum_{n=N}^Mf(n)$

and from (2)



$\sum_{n=N}^Mf(n)\le f(N)+\sum_{n=N+1}^M\underbrace{\int_{n-1}^n f(x)\,dx}_{\ge\,f(n)}=f(N)+\int_N^M f(x)\,dx.$

Combining these two estimates yields

$\int_N^{M+1}f(x)\,dx\le\sum_{n=N}^Mf(n)\le f(N)+\int_N^M f(x)\,dx.$

Letting M tend to infinity, the result follows.
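The finite-M estimate can also be checked numerically before passing to the limit; the sketch below uses f(x) = 1/x² (an illustrative choice), whose antiderivative −1/x makes the integrals exact.

```python
# Sketch: verify ∫_N^{M+1} f ≤ Σ_{n=N}^M f(n) ≤ f(N) + ∫_N^M f
# for f(x) = 1/x^2 (illustrative choice with a closed-form antiderivative).
def f(x):
    return 1.0 / x**2

def integral(a, b):
    # ∫_a^b x^(-2) dx = 1/a - 1/b
    return 1.0 / a - 1.0 / b

N = 1
for M in (1, 10, 1000):
    partial = sum(f(n) for n in range(N, M + 1))
    assert integral(N, M + 1) <= partial <= f(N) + integral(N, M)
```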

## Applications



The harmonic series

$\sum_{n=1}^\infty \frac1n$

diverges because, using the natural logarithm, its derivative, and the fundamental theorem of calculus, we get

$\int_1^M\frac1x\,dx=\ln x\Bigr|_1^M=\ln M\to\infty \quad\text{for }M\to\infty.$

In contrast, the series



$\sum_{n=1}^\infty \frac1{n^{1+\varepsilon}}$

(cf. Riemann zeta function) converges for every ε > 0, because by the power rule

$\int_1^M\frac1{x^{1+\varepsilon}}\,dx=-\frac1{\varepsilon x^\varepsilon}\biggr|_1^M=\frac1\varepsilon\Bigl(1-\frac1{M^\varepsilon}\Bigr)\le\frac1\varepsilon<\infty \quad\text{for all }M\ge1.$
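Both behaviours are easy to observe numerically. The sketch below (cutoffs and the choice ε = 0.5 are illustrative) shows the harmonic partial sums tracking ln M, while the ε-series stays below the bound f(1) + 1/ε from the remark above.

```python
import math

# Sketch: harmonic partial sums H_M grow like ln M (their difference
# tends to the Euler–Mascheroni constant γ ≈ 0.5772), while for
# ε = 0.5 the partial sums stay below the bound f(1) + 1/ε = 3.
for M in (100, 10000):
    H = sum(1.0 / n for n in range(1, M + 1))
    assert abs(H - math.log(M) - 0.5772156649) < 0.01  # diverges like ln M

eps = 0.5
S = sum(1.0 / n**(1 + eps) for n in range(1, 10**6))
assert S < 1 + 1 / eps  # bounded above, hence convergent
```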

## Borderline between divergence and convergence

The above examples involving the harmonic series raise the question of whether there are monotone sequences such that f(n) decreases to 0 faster than 1/n but slower than $1/n^{1+\varepsilon}$ in the sense that

$\lim_{n\to\infty}\frac{f(n)}{1/n}=0 \quad\text{and}\quad \lim_{n\to\infty}\frac{f(n)}{1/n^{1+\varepsilon}}=\infty$

for every ε > 0, and whether the corresponding series of the f(n) still diverges. Once such a sequence is found, a similar question can be asked with f(n) taking the role of 1/n, and so on. In this way it is possible to investigate the borderline between divergence and convergence of infinite series.
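A standard example of such a sequence is f(n) = 1/(n ln n), and the two limits can be checked numerically in a quick sketch (the choice ε = 0.5 and the sample points are illustrative):

```python
import math

# Sketch: f(n) = 1/(n ln n) decays faster than 1/n but slower than
# 1/n^(1+ε): f(n)/(1/n) = 1/ln n → 0, f(n)/(1/n^(1+ε)) = n^ε/ln n → ∞.
def f(n):
    return 1.0 / (n * math.log(n))

eps = 0.5  # illustrative choice of ε
ns = (10**2, 10**4, 10**8)
slow = [f(n) * n for n in ns]              # = 1/ln n
fast = [f(n) * n**(1 + eps) for n in ns]   # = n^ε / ln n
assert slow[0] > slow[1] > slow[2]         # tends to 0
assert fast[0] < fast[1] < fast[2]         # tends to ∞
```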

Using the integral test for convergence, one can show (see below) that, for every natural number k, the series

$\sum_{n=N_k}^\infty\frac1{n\ln(n)\cdots\ln_k(n)}\qquad(3)$

still diverges (cf. proof that the sum of the reciprocals of the primes diverges for k = 1) but

$\sum_{n=N_k}^\infty\frac1{n\ln(n)\cdots\ln_{k-1}(n)(\ln_k(n))^{1+\varepsilon}}\qquad(4)$

converges for every ε > 0. Here $\ln_k$ denotes the k-fold composition of the natural logarithm defined recursively by



$\ln_k(x)= \begin{cases} \ln(x)&\text{for }k=1,\\ \ln(\ln_{k-1}(x))&\text{for }k\ge2. \end{cases}$

Furthermore, $N_k$ denotes the smallest natural number such that the k-fold composition is well-defined and $\ln_k(N_k)\ge1$, i.e.

$N_k\ge \underbrace{e^{e^{\cdot^{\cdot^{e}}}}}_{k\ e\text{'s}}=e \uparrow\uparrow k$

using tetration or Knuth's up-arrow notation.
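The recursive definition translates directly into code; the sketch below (function names are my own) also checks that just above a tower of k e's, the k-fold logarithm reaches 1.

```python
import math

# Sketch: k-fold iterated logarithm, transcribed from the recursive
# definition above (function names are illustrative).
def ln_k(k, x):
    return math.log(x) if k == 1 else math.log(ln_k(k - 1, x))

def e_tower(k):
    # e↑↑k, i.e. a tower of k e's (tetration)
    return math.e if k == 1 else math.exp(e_tower(k - 1))

# Just above the tower of k e's, the k-fold logarithm reaches 1,
# so N_k is the first integer at or above e↑↑k.
for k in (1, 2, 3):
    assert ln_k(k, math.ceil(e_tower(k))) >= 1
```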

To see the divergence of the series (3) using the integral test, note that by repeated application of the chain rule

$\frac{d}{dx}\ln_{k+1}(x) =\frac{d}{dx}\ln(\ln_k(x)) =\frac1{\ln_k(x)}\frac{d}{dx}\ln_k(x) =\cdots =\frac1{x\ln(x)\cdots\ln_k(x)},$

hence

$\int_{N_k}^\infty\frac{dx}{x\ln(x)\cdots\ln_k(x)} =\ln_{k+1}(x)\bigr|_{N_k}^\infty=\infty.$

To see the convergence of the series (4), note that by the power rule, the chain rule and the above result



$-\frac{d}{dx}\frac1{\varepsilon(\ln_k(x))^\varepsilon} =\frac1{(\ln_k(x))^{1+\varepsilon}}\frac{d}{dx}\ln_k(x) =\cdots =\frac{1}{x\ln(x)\cdots\ln_{k-1}(x)(\ln_k(x))^{1+\varepsilon}},$

hence

$\int_{N_k}^\infty\frac{dx}{x\ln(x)\cdots\ln_{k-1}(x)(\ln_k(x))^{1+\varepsilon}} =-\frac1{\varepsilon(\ln_k(x))^\varepsilon}\biggr|_{N_k}^\infty<\infty.$
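Numerically, the divergence of the k = 1 series Σ 1/(n ln n) is extremely slow: its partial sums grow like ln(ln M), as the following sketch (with illustrative cutoffs) confirms.

```python
import math

# Sketch: partial sums of Σ 1/(n ln n) (the k = 1 divergent series,
# starting at n = 2) grow like ln(ln M) — divergent, but very slowly.
def partial(M):
    return sum(1.0 / (n * math.log(n)) for n in range(2, M + 1))

s4, s6 = partial(10**4), partial(10**6)
growth = math.log(math.log(10**6)) - math.log(math.log(10**4))
assert s6 > s4                          # still increasing
assert abs((s6 - s4) - growth) < 0.001  # increments match ln(ln M)
```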