LOG#079. Zeta multiple integral.

My second post this day is a beautiful relationship between the Riemann zeta function, the unit hypercube and a certain multiple integral involving a “logarithmic and weighted geometric mean”. I discovered it on my rival blog, here:

First of all, we begin with the Riemann zeta function: $\displaystyle{\zeta (s)=\sum_{n=1}^\infty n^{-s}=\sum_{n=1}^\infty \dfrac{1}{n^{s}}}$

Obviously, $\zeta (1)$ diverges (it has a pole there), but the zeta value in $s=2$ and $s=3$ can take the following multiple integral “disguise”: $\displaystyle{\zeta (2) =-\int_0^1 \dfrac{\ln (x)}{1-x}dx=-\left(-\dfrac{\pi^2}{6}\right)=\dfrac{\pi^2}{6}}$ $\displaystyle{\zeta (3)=-\dfrac{1}{2}\int_0^1\int_0^1\dfrac{\ln (xy)}{1-xy}dxdy}$

Moreover, we can even check that $\displaystyle{\int_0^1\int_0^1\int_0^1\dfrac{\ln (xyz) }{1-xyz}dxdydz=-\dfrac{\pi^4}{30}=-3\zeta (4)}$

In fact, you can generalize the above multiple integral over the unit hypercube $H_n(1)=\left[0,1\right]^n=\underbrace{\left[0,1\right]\times\cdots\times\left[0,1\right]}_{n\text{ times}}$

(1) $\boxed{\displaystyle{-n\zeta (n+1)=\int_0^1\cdots \int_0^1 \dfrac{\ln (x_1 x_2\cdots x_n)}{1-x_1x_2\cdots x_n}dx_1dx_2\cdots dx_n}}$
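As a sanity check, the boxed identity (1) can be verified numerically. The sketch below (a minimal Monte Carlo estimate; the sample size and seed are arbitrary choices of mine, not anything canonical) compares the $n=2$ integral against $-2\zeta (3)$:

```python
import math
import random

def zeta(s, terms=200000):
    # Direct partial sum of the Dirichlet series for zeta(s); fine for s >= 2.
    return sum(1.0 / k**s for k in range(1, terms + 1))

def mc_integral(n, samples=400000, seed=1):
    # Monte Carlo estimate of the n-fold integral of
    # ln(x1...xn)/(1 - x1...xn) over the unit hypercube [0,1]^n.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        p = 1.0
        for _ in range(n):
            p *= rng.random()
        total += math.log(p) / (1.0 - p)
    return total / samples

n = 2
estimate = mc_integral(n)
exact = -n * zeta(n + 1)   # boxed identity (1): integral = -n*zeta(n+1)
print(estimate, exact)
```

The logarithmic singularity of the integrand at the faces of the cube is integrable, so plain uniform sampling converges; increasing `samples` tightens the estimate.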

or equivalently

(2) $\boxed{\displaystyle{\zeta (n+1)=-\dfrac{1}{n}\int_0^1\cdots \int_0^1\dfrac{\left(\displaystyle{\ln \prod_{i=1}^n x_i}\right)\displaystyle{\prod_{i=1}^n dx_i}}{\displaystyle{1-\prod_{i=1}^n x_i}}}}$

I consulted several big books of integrals (especially the famous Russian “Big Book” of integrals, series and products by Gradshteyn and Ryzhik, and the CRC handbook) but I could not find this integral anywhere. If you are a mathematician reading my blog, it would be nice to hear whether you know this result. Of course, there is a classical result that says: $\displaystyle{\zeta (n)=\left(\int_0^1\right)^n\dfrac{\displaystyle{\prod_{i=1}^n dx_i}}{\displaystyle{1-\prod_{i=1}^n x_i}}}$
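The classical result is easy to see term by term: expanding $\frac{1}{1-x_1\cdots x_n}$ as a geometric series and integrating each monomial $(x_1\cdots x_n)^k$ over the hypercube gives $\frac{1}{(k+1)^n}$, whose sum is $\zeta (n)$. A tiny sketch of that expansion (my own, nothing more than a partial sum):

```python
def partial_zeta_from_series(n, K=100000):
    # Each term of the geometric series integrates to (1/(k+1))^n over [0,1]^n,
    # so the truncated integral is just a partial sum of zeta(n).
    return sum((1.0 / (k + 1))**n for k in range(K))

print(partial_zeta_from_series(3))   # approaches zeta(3) = 1.2020569...
```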

but the last boxed equation was completely unknown to me. I knew the integral representations of $\zeta (2)$ and $\zeta (3)$, but not that general form of zeta in terms of a multidimensional integral. I like it!

In fact, it is interesting (but I don’t know if it is meaningful at all) that the last boxed integral (2) can be rewritten as follows

(3) $\boxed{\displaystyle{\zeta\left(n+1\right)=\int_0^1\cdots\int_0^1\left(\dfrac{1}{\displaystyle{1-\prod_{i=1}^n x_i}}\right)\ln\left(\dfrac{1}{\displaystyle{\sqrt[n]{\prod_{i=1}^n x_i}}}\right)\left(\prod_{i=1}^n dx_i\right)}}$

or equivalently

(4) $\boxed{\displaystyle{\zeta \left(n+1\right)=-\int_0^1\cdots \int_0^1 \omega (x_i) \ln \left(\overline{X}_{GM}\right) d^nX}}$

where I have defined the weight function $\displaystyle{\omega (x_i)=\dfrac{1}{\displaystyle{1-\prod_{i=1}^n x_i}}}$

and the geometric mean is $\displaystyle{\overline{X}_{GM}=\sqrt[n]{\prod_{i=1}^n x_i}}$

and the volume element reads $d^nX=dx_1dx_2\cdots dx_n$

I love calculus (derivatives and integrals) and I love the Riemann zeta function. Therefore, I love the Zeta Multiple Integrals (1)-(2)-(3)-(4). And you?

PS: Contact the author of the original multidimensional zeta integral (his blog is linked above) and contact me too if you know some paper or book where those integrals appear explicitly. I believe they can be derived with the aid of polylogarithms and multiple zeta values somehow, but I am not (yet) an expert in those functions.

PS(II): On math.stackexchange we found the “proof”:

Just change variables from $x_i$ to $u_i = -\log x_i$ and let $\displaystyle{u = \sum_{i=1}^{n-1} u_i}$. For $n \ge 2$, let $I$ denote the right-hand side of (2) written with $n-1$ variables: $\displaystyle{I=\dfrac{1}{n-1}\iiint_{0 < x_i < 1} \frac{-\log(\prod_{i=1}^{n-1} x_i)}{1-\prod_{i=1}^{n-1} x_i}\prod_{i=1}^{n-1}dx_i}$

Then $\displaystyle{I=\dfrac{1}{n-1}\iiint_{0 < u_i < \infty}\frac{u}{1-e^{-u}}e^{-u}\prod_{i=1}^{n-1}du_i}$ $\displaystyle{I=\dfrac{1}{n-1} \int_0^{\infty} \dfrac{u\,du }{e^u - 1 }\left\{\iint_{\stackrel{u_2,\ldots,u_{n-1} > 0}{u_2+\cdots+u_{n-1} < u}}\prod_{i=2}^{n-1} du_i \right\}}$ $\displaystyle{I=\dfrac{1}{n-1} \int_0^{\infty} \dfrac{u\,du }{e^u - 1 } \dfrac{u^{n-2}}{(n-2)!}=\dfrac{1}{\Gamma(n)} \int_0^{\infty} \dfrac{u^{n-1}}{e^u - 1}\, du=\dfrac{1}{\Gamma(n)}\,\Gamma(n)\,\zeta(n)=\zeta(n)}$
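The last step of the proof uses the standard Bose-type integral $\int_0^\infty \frac{u^{n-1}}{e^u-1}\,du=\Gamma(n)\zeta(n)$. A rough numerical confirmation for $n=3$ (my own sketch; the truncation point and step count are arbitrary choices):

```python
import math

def bose_integral(n, upper=60.0, steps=200000):
    # Composite trapezoid rule for the integral of u^(n-1)/(e^u - 1)
    # over (0, infinity), truncated at `upper` (the tail decays like e^-u).
    h = upper / steps
    def f(u):
        # the integrand extends continuously to 0 at u = 0 for n >= 2
        return 0.0 if u == 0.0 else u**(n - 1) / math.expm1(u)
    total = 0.5 * (f(0.0) + f(upper))
    for i in range(1, steps):
        total += f(i * h)
    return total * h

n = 3
lhs = bose_integral(n)
rhs = math.gamma(n) * sum(1.0 / k**n for k in range(1, 200001))  # Gamma(n)*zeta(n)
print(lhs, rhs)   # both approx 2.404
```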

LOG#078. Averages. I am going to speak a little bit about Statistics. The topic today is “averages”. Suppose you have a set of “measurements” $x_i$, where $i=1,...,n$. Then you can define the following quantities:

Arithmetic mean. $\boxed{\overline{X}_{AM}=\dfrac{1}{n}\displaystyle{\sum_{i=1}^n}x_i=\dfrac{\displaystyle{\sum_{i=1}^n x_i}}{n}=\dfrac{x_1+x_2+\cdots+x_n}{n}}$

Geometric mean. $\boxed{\displaystyle{\overline{X}_{GM}=\sqrt[n]{\prod_{i=1}^n x_i}=\sqrt[n]{x_1x_2\cdots x_n}=\left(\prod_{i=1}^n x_i\right)^{1/n}}}$

Harmonic mean. $\boxed{\displaystyle{\overline{X}_{HM}=\dfrac{1}{\displaystyle{\dfrac{1}{n}\sum_{i=1}^{n}\dfrac{1}{x_i}}}=\dfrac{n}{\dfrac{1}{x_1}+\dfrac{1}{x_2}+\cdots+\dfrac{1}{x_n}}}}$

Remark: In the harmonic mean we need that every measurement is nonzero, i.e., $x_i\neq 0\ \forall i=1,...,n$

Remark (II): For positive measurements, $\overline{X}_{AM}\geq\overline{X}_{GM}\geq\overline{X}_{HM}$
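For a concrete (hypothetical) sample, the three means and the AM-GM-HM chain can be checked directly:

```python
import math

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # exp of the mean log: numerically safer than taking the n-th root
    # of a possibly huge or tiny product
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

def harmonic_mean(xs):
    # requires every x_i != 0
    return len(xs) / sum(1.0 / x for x in xs)

data = [2.0, 3.0, 6.0]           # toy sample, chosen to give a nice HM
am, gm, hm = arithmetic_mean(data), geometric_mean(data), harmonic_mean(data)
print(am, gm, hm)                # 3.666..., 3.3019..., 3.0
assert am >= gm >= hm            # AM-GM-HM chain for positive data
```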

There are some other interesting “averages”/”means”, such as the quadratic mean (root mean square): $\boxed{\displaystyle{\overline{X}_{QM}=\sqrt{\dfrac{1}{n}\sum_{i=1}^{n}x_i^2}}}$

Generalized p-th mean. $\boxed{\displaystyle{\overline{X}_{GEN}=\sqrt[p]{\dfrac{1}{n}\sum_{i=1}^{n}x_i^p}}}$
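A small sketch of the generalized $p$-th mean on my toy data from above: $p=1$, $p=2$ and $p=-1$ recover the arithmetic, quadratic and harmonic means, and the power mean is nondecreasing in $p$:

```python
def power_mean(xs, p):
    # Generalized p-th mean for p != 0 (p -> 0 gives the geometric mean
    # as a limit, which this simple formula does not cover).
    return (sum(x**p for x in xs) / len(xs)) ** (1.0 / p)

data = [2.0, 3.0, 6.0]
print(power_mean(data, 1))    # arithmetic mean
print(power_mean(data, 2))    # quadratic mean (RMS)
print(power_mean(data, -1))   # harmonic mean
```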

Weighted mean/average. $\boxed{\displaystyle{\overline{X}_{WM}=\dfrac{\displaystyle{\sum_{i=1}^n w_i x_i}}{\displaystyle{\sum_{i=1}^n w_i}}}}$

where the $w_i$ are the weights. The denominator renormalizes them, so they need not sum to one; if they are normalized so that $\displaystyle{\sum_{i=1}^n w_i}=1$, the denominator simply drops out.

A particularly important case occurs when each weight equals the inverse of the variance of the corresponding measurement (generally denoted by $\sigma^2_i$), i.e., when $w_i=1/\sigma^2_i$; then the weighted mean yields: $\boxed{\displaystyle{\overline{X}_{WM}=\dfrac{\displaystyle{\sum_{i=1}^n \dfrac{x_i}{\sigma^2_i}}}{\displaystyle{\sum_{i=1}^n \dfrac{1}{\sigma_i^2}}}}}$
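A small sketch of inverse-variance weighting (the measurements and uncertainties below are made up for illustration): the most precise measurement dominates the average.

```python
def weighted_mean(xs, ws):
    # General weighted mean: sum(w_i * x_i) / sum(w_i). The weights need
    # not be normalized, since the denominator renormalizes them.
    return sum(w * x for w, x in zip(ws, xs)) / sum(ws)

# hypothetical measurements with per-measurement uncertainties sigma_i
xs     = [10.1, 9.8, 10.4]
sigmas = [0.1, 0.2, 0.4]
ws = [1.0 / s**2 for s in sigmas]   # inverse-variance weights
print(weighted_mean(xs, ws))        # approx 10.057, pulled toward x=10.1
```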

Midrange.

Finally, a “naive” and usually bad statistical measure for a sample or data set is the midrange. Really, it is a mere measure of central tendency and not much more: $\boxed{\displaystyle{\overline{X}_{MR}=\dfrac{max(x)+min(x)}{2}}}$

Here, $max(x)$ and $min(x)$ refer to the maximum and minimum values of the sampled variable $x$ in the full data set $x_i$.
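A tiny example of why the midrange is fragile (the numbers are made up): a single outlier drags it far from the bulk of the data.

```python
def midrange(xs):
    # (max + min)/2: a crude central-tendency measure, very outlier-sensitive
    return (max(xs) + min(xs)) / 2.0

data = [2.0, 3.0, 6.0, 100.0]   # one outlier dominates
print(midrange(data))            # 51.0 -- far from the bulk of the data
```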

Many of the above “averages” have their own relative importance in the theory of Statistics. But that will be the topic of a future blog post handling statistics and its applications.

What average do you like the most? Are you “on the average”? Are you “normal”? 😉 Of course, you can ask your students, friends or family whether they prefer some particular mean/average over any other in their grades, cash sharing, or the like :). See you soon in another blog post!