LOG#022. Kaniadakis and relativity.

Hello, I am back! After some summer rest/study/introspection! And after an amazing July with the Higgs discovery by ATLAS and CMS. After an amazing August with the arrival of the Curiosity rover, MSL (Mars Science Laboratory), at Mars. After a hot summer in my home town… I have written lots of drafts these days… And I will be publishing all of them step by step.

We will discuss today an interesting remark studied by Kaniadakis. He is known for his works on relativistic physics, condensed matter physics, and especially for his work on a cool function related to non-extensive thermodynamics. Indeed, Kaniadakis himself has proved that his entropy is also related to the mathematics of special relativity. Ultimately, his remarks suggest:

1st. Dimensionless quantities are the true fundamental objects in any theory.

2nd. A relationship between information theory and relativity.

3rd. The important role of deformation parameters and deformed calculus in contemporary Physics, and likely even more so in the future.

4th. Entropy could be more fundamental than previously thought, in the sense that non-extensive generalizations of entropy play a more significant role in Physics.

5th. Non-extensive entropies are more fundamental than the conventional entropy.

The fundamental object we are going to meet is related to the following function:

\exp_\kappa (x)=\left( \sqrt{1+\kappa^2x^2}+\kappa x\right)^{1/\kappa}
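
Before going into the physics, here is a minimal numerical sketch (plain Python; the helper name kexp is my own) showing that this deformed exponential reduces to the ordinary one as \kappa\rightarrow 0:

```python
import math

def kexp(x, kappa):
    """Kaniadakis kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k)."""
    if kappa == 0.0:
        return math.exp(x)  # the ordinary exponential is the limiting case
    return (math.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

# The smaller |kappa| is, the closer kexp gets to the ordinary exp:
for k in (0.5, 0.1, 0.001):
    print(k, kexp(1.0, k), math.exp(1.0))
```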

Let me first imagine two identical particles (of equal mass) A and B, whose velocities, momenta and energies are, in a certain frame S:

v_A, p_A=p(v_A), E_A=E(v_A)

v_B, p_B=p(v_B), E_B=E(v_B)

In the rest frame of particle B, S’, we have

p'_B=0

p'_A=p_A-p_B

If we define a dimensionless momentum parameter

q=\dfrac{p}{p^\star}

\dfrac{p'_A}{p^\star}=\dfrac{p_A}{p^\star}-\dfrac{p_B}{p^\star}

that is, q'_A=q_A-q_B, then after the usual exponentiation we get

\exp(q'_A)=\exp(q_A)\exp(-q_B)

Galilean relativity says that the laws of Mechanics are unchanged when we pass from rest to a uniformly moving reference frame. Equivalently, Galilean relativity in our context means invariance under the exchange q'_A\leftrightarrow q_A, and it implies invariance under the change q_B\rightarrow -q_B. In turn, plugging these into the previous equation, we get the known relationship

\exp (q)\exp (-q)=1

Wonderful, isn’t it? It is for me! Now, we will move to Special Relativity. In the S’ frame where B is at rest, we have:

v'_B=0, p'_B=0, E'_B=mc^2

and from the known relativistic transformations for energy and momentum

v'_A=\dfrac{v_A-v_B}{1-\dfrac{v_Av_B}{c^2}}

p'_A=\gamma (v_B)p_A-\dfrac{v_B\gamma (v_B)E_A}{c^2}

E'_A=\gamma (v_B)E_A-v_B\gamma (v_B)p_A

where of course we define

\gamma (v_B)=\dfrac{1}{\sqrt{1-\dfrac{v_B^2}{c^2}}}

p_B=m \gamma (v_B) v_B

E_B=m \gamma (v_B) c^2
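
As a sanity check of these transformation rules, the following sketch (my own conventions, units with m=c=1) boosts the energy and momentum of A into the rest frame of B and verifies that the invariant E^2-p^2c^2 is preserved:

```python
import math

m, c = 1.0, 1.0

def gamma(v):
    return 1.0 / math.sqrt(1.0 - v**2 / c**2)

vA, vB = 0.6 * c, 0.3 * c
EA, pA = m * gamma(vA) * c**2, m * gamma(vA) * vA
EB, pB = m * gamma(vB) * c**2, m * gamma(vB) * vB

# Boost A into the rest frame of B (the transformations written above):
pA_prime = gamma(vB) * pA - vB * gamma(vB) * EA / c**2
EA_prime = gamma(vB) * EA - vB * gamma(vB) * pA

print(EA**2 - (pA * c)**2)              # ~ (m c^2)^2 = 1.0
print(EA_prime**2 - (pA_prime * c)**2)  # same value: the invariant survives
```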

After this introduction, we can parallel what we did for Galilean relativity. After some easy algebra, we can rewrite the last two equations in the equivalent form

p'_A=p_A\dfrac{E_B}{mc^2}-E_A\dfrac{p_B}{mc^2}

E'_A=E_AE_B\dfrac{1}{mc^2}-\dfrac{p_Ap_B}{m}

Now, we can introduce dimensionless variables instead of the triple (v, p, E), defining the dimensionless set (u, q, \epsilon) through:

\dfrac{v}{u}=\dfrac{p}{mq}=\sqrt{\dfrac{E}{m\epsilon}}=\vert \kappa \vert c=v_\star<c

Note that the so-called deformation parameter \kappa is indeed a beta parameter of relativity, \kappa=v_\star /c. Again, from the special relativity requirement \vert \kappa \vert c<c we obtain, as expected, that -1< \kappa <+1. Classical physics, the Galilean relativity we know from our everyday experience, is recovered in the limit c\rightarrow \infty, or equivalently, if \kappa \rightarrow 0. In the dimensionless variables, the transformations of energy and momentum written above can be shown to be:

q'_A=\kappa^2q_A\epsilon_B-\kappa^2q_B\epsilon_A

\epsilon'_A=\kappa^2\epsilon_A\epsilon_B-q_Aq_B

In the rest frame of a particle, we get of course the result E(0)=mc^2, or in the new variables \epsilon (0)=\dfrac{1}{\kappa^2}. The energy-momentum dispersion relationship of special relativity, p^2c^2-E^2=-m^2c^4, becomes:

q^2-\kappa^2\epsilon^2=-\dfrac{1}{\kappa^2}

or

\kappa^4\epsilon^2-\kappa^2q^2=1
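
A quick numerical check of this deformed dispersion relation (same units m=c=1; the value of \kappa is an arbitrary choice with \vert\kappa\vert<1):

```python
import math

m, c = 1.0, 1.0
kappa = 0.8    # v_star = |kappa| c; any value with |kappa| < 1 will do
v = 0.5 * c

g = 1.0 / math.sqrt(1.0 - v**2 / c**2)
E, p = m * g * c**2, m * g * v

# Dimensionless variables from v/u = p/(m q) = sqrt(E/(m eps)) = |kappa| c:
q = p / (m * kappa * c)
eps = E / (m * kappa**2 * c**2)

print(kappa**4 * eps**2 - kappa**2 * q**2)   # prints ~ 1.0, as claimed
```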

Moreover, we can rewrite the equation

q'_A=\kappa^2q_A\epsilon_B-\kappa^2q_B\epsilon_A

in terms of the dimensionless energy-momentum variable

\epsilon_\kappa (q)=\dfrac{\sqrt{1+\kappa^2q^2}}{\kappa^2}

and we get the analogue of the Galilean addition rule for the dimensionless momenta

q'_A =q_A\sqrt{1+\kappa^2q_B^2}-q_B\sqrt{1+\kappa^2q_A^2}
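
The following lines (continuing the numerical conventions of the previous sketches) verify that this composition rule agrees with the direct relativistic velocity addition:

```python
import math

m, c, kappa = 1.0, 1.0, 0.8

def dimensionless_q(v):
    g = 1.0 / math.sqrt(1.0 - v**2 / c**2)
    return m * g * v / (m * kappa * c)   # q = p / (m kappa c)

vA, vB = 0.6 * c, 0.3 * c
qA, qB = dimensionless_q(vA), dimensionless_q(vB)

# The deformed composition rule...
qA_prime = qA * math.sqrt(1 + (kappa * qB)**2) - qB * math.sqrt(1 + (kappa * qA)**2)

# ...versus the direct relativistic velocity addition:
vA_prime = (vA - vB) / (1 - vA * vB / c**2)
print(qA_prime, dimensionless_q(vA_prime))   # both numbers agree
```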

Note that the classical limit is again recovered by sending \kappa\rightarrow 0. Now, we have to define some kind of deformed exponential function. Let us define:

\exp_\kappa (q) =\left(\sqrt{1+\kappa^2q^2}+\kappa q\right)^{1/\kappa}

Applying this function to the above last equation, we observe that

\exp_\kappa (q'_A)=\exp_\kappa (q_A) \exp_\kappa (-q_B)

Again, relativity means that observers in uniform motion with respect to each other should observe the same physical laws, so we should obtain equations invariant under the exchanges q'_A\leftrightarrow q_A and q_B\rightarrow -q_B. Plugging these conditions into the last equation implies that the following condition holds (and it can easily be checked from the definition of the deformed exponential):

\exp_\kappa (q)\exp_\kappa (-q)=1
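
Both this identity and the previous one are easy to confirm numerically with the kexp helper sketched earlier:

```python
import math

def kexp(x, kappa):
    return (math.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

kappa, qA, qB = 0.8, 1.2, 0.4
qA_prime = qA * math.sqrt(1 + (kappa * qB)**2) - qB * math.sqrt(1 + (kappa * qA)**2)

print(kexp(0.7, kappa) * kexp(-0.7, kappa))                       # ~ 1.0
print(kexp(qA_prime, kappa), kexp(qA, kappa) * kexp(-qB, kappa))  # equal values
```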

One interesting question is what the inverse of this deformed exponential is (the names q-exponential or \kappa-exponential are often found in the literature). It has to be some kind of deformed logarithm. And it is! The deformed logarithm, inverse to the deformed exponential, is the following function:

\ln_\kappa (q)=\dfrac{q^{\kappa}-q^{-\kappa}}{2\kappa}
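
Again, a one-minute numerical check (klog is my name for this deformed logarithm) that it really inverts the deformed exponential and tends to the usual logarithm as \kappa\rightarrow 0:

```python
import math

def kexp(x, kappa):
    return (math.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

def klog(q, kappa):
    """Kaniadakis kappa-logarithm: (q^k - q^(-k)) / (2k), inverse of kexp."""
    return (q**kappa - q**(-kappa)) / (2.0 * kappa)

kappa = 0.4
print(klog(kexp(1.3, kappa), kappa))    # prints ~ 1.3
print(klog(5.0, 1e-6), math.log(5.0))   # ~ ln 5 in the kappa -> 0 limit
```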

Indeed, this function is related (in units where the Boltzmann constant is set to one, k_B=1) to the so-called Kaniadakis entropy!

S_{\kappa}=-\sum_i p_i \ln_\kappa (p_i)=-\sum_i p_i\,\dfrac{p_i^{\kappa}-p_i^{-\kappa}}{2\kappa}

Furthermore, the equation \exp_\kappa (q)\exp_\kappa (-q)=1 also implies that

\ln_\kappa \left(\dfrac{1}{q}\right)=-\ln_\kappa (q)

The gamma parameter of special relativity is also recast as

\gamma =\dfrac{1}{\sqrt{1-\kappa^2}}

More generally, in fact, the deformed exponentials and logarithms develop a complete calculus based on the composition law

\exp_\kappa (q_A)\exp_\kappa (q_B)=\exp_\kappa (q_A\oplus q_B)

where the \kappa-deformed sum is q_A\oplus q_B=q_A\sqrt{1+\kappa^2q_B^2}+q_B\sqrt{1+\kappa^2q_A^2},

and on the deformed differential operator

\dfrac{d}{d_\kappa q}=\sqrt{1+\kappa^2q^2}\dfrac{d}{dq}

so that, e.g.,

\dfrac{d}{d_\kappa q}\exp_\kappa (q)=\exp_\kappa (q)
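
A crude finite-difference check of this eigenfunction property (the step h and the sample point q are arbitrary choices):

```python
import math

def kexp(x, kappa):
    return (math.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

kappa, q, h = 0.6, 0.9, 1e-6

# Ordinary derivative of exp_kappa by central differences...
dexp = (kexp(q + h, kappa) - kexp(q - h, kappa)) / (2 * h)

# ...promoted to the deformed derivative d/d_kappa q:
deformed = math.sqrt(1 + (kappa * q)**2) * dexp
print(deformed, kexp(q, kappa))   # both values agree, as the property demands
```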

This Kaniadakis formalism is useful, for instance, in generalizations of Statistical Mechanics. It is becoming a powerful tool in High Energy Physics. At low energies, the \kappa-deformed distribution function reduces to the classical Boltzmann exponential factor:

f\propto \exp_\kappa (-\beta E)\approx \exp(-\beta E)

At high energies, in the relativistic domain, the Kaniadakis approach predicts that the distribution function departs from the classical exponential and follows a power law:

f\propto E^{-1/\kappa}
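
A minimal numerical illustration of this crossover, with arbitrary values of \beta and \kappa (the asymptotic form \exp_\kappa(-\beta E)\sim (2\kappa\beta E)^{-1/\kappa} is the standard one):

```python
import math

def kexp(x, kappa):
    return (math.sqrt(1.0 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

beta, kappa = 1.0, 0.3
for E in (0.1, 1.0, 10.0, 100.0):
    f = kexp(-beta * E, kappa)
    # Low E: close to exp(-beta*E); high E: close to the power law.
    print(E, f, math.exp(-beta * E), (2 * kappa * beta * E) ** (-1.0 / kappa))
```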

There are other approaches and entropies that could be interesting for additional deformations of special relativity. This formalism is also useful in the foundations of Physics, in the Information Theory approach that surrounds the subject in current times. And of course, it is full of incredibly beautiful mathematics!

We can start from deformed exponentials and logarithms in order to get the special theory of relativity (reversing the order in which I have introduced this topic here). Aren’t you surprised?


LOG#002. Information and noise.

We live in the information era. Read more about this age here. Everything in your surroundings and environment is bound and related to some kind of “information processing”. Information can also be recorded and transmitted. Therefore, roughly speaking, information is something which is processed, stored and transmitted. Your computer is now processing information while you read these words. You also record and save your favourite pages and files on your computer. There are many tools to store digital information: HDs, CDs, DVDs, USBs,… And you can transmit that information to your buddies by e-mail, old-fashioned postcards and letters, MSN, phone,… You are even processing information with your brain and senses whenever you read this text. Thus, the idea of information is abstract and very general. Information theory (IT) is a really large and multidisciplinary field.

As a teenager, I enjoyed that old game in which a message is whispered in your ear, and you transmit it to another person, that one to another, and so on. Today, you can see it at a big scale on Twitter. Hey! The final message is generally very different from the original one! This simple example shows the other side of communication or information transmission: “noise”. (The term efficiency is also used in this context.) The storage or transmission of information is generally not completely efficient. You can lose information. Roughly speaking, every amount of information carries some quantity of noise that depends upon how you transmit the information (you can include a noiseless transmission as a subtype of information process in which there is no lost information). Indeed, this is also why we age. Our DNA, which is continuously replicating itself thanks to the metabolism (possible ultimately thanks to the solar light), gets progressively corrupted by free radicals and different “chemicals” that make our cellular replication more and more inefficient. Doesn’t this remind you of something you know from High School? Yes! I am thinking about Thermodynamics. Indeed, the reason why Thermodynamics has been a main topic from the 19th century till now is simple: the quantity of energy is constant, but its quality is not. Then, we must be careful to build machines/engines that are efficient with the available energy sources.

Before going into further details, you are likely wondering about what information is! It is a set of symbols, signs or objects with some well-defined order. That is what information is. For instance, the word ORDER is giving you information. A random permutation of those letters, like ORRDE or OERRD, is generally meaningless. I said information was “something”, but I didn’t go any further! Well, here is where Mathematics and Physics appear. Don’t run far away! The beauty of Physics and Maths, or as I like to call them, Physmatics, is that concepts, intuitions and definitions, rigorously made, are well enough to satisfy your general requirements. Something IS a general object, or a set of objects with a certain order. It can be a certain DNA sequence coding how to produce a certain substance (e.g.: a protein) our body needs. It can be a simple or complex message hidden in a highly advanced cryptographic code. It is whatever you are recording on your DVD (a new OS, a movie, your favourite music,…) or any other storage device. It can also be what your brain is learning how to do. That is “something”, or really whatever. You can say it is an obscure and weird definition. Really, it is! It can also be what electromagnetic waves transmit. Is it magic? Maybe! It has always seemed like magic to me how you can browse the internet thanks to your Wi-Fi network! Of course, it is not magic. It is Science. Digital or analog information can be seen as large ordered strings of 1’s and 0’s, making “bits” of information. We will not discuss bits in this log. Future logs will…

Now, we have to introduce the concepts through some general ideas we have mentioned and we know from High School. Firstly, Thermodynamics. As everybody knows, and as you have experienced, energy cannot be completely turned into useful “work”. There is a quality in energy. Heat is the most degraded form of energy. When you turn on your car and burn fuel, you know that some of the energy is transformed into mechanical energy and a lot of energy is dissipated into heat to the atmosphere. I will not talk about the details of the different cycles engines can realize, but you can learn more about them in the references below. Symbolically, we can state that

\begin{pmatrix} AVAILABLE \\ENERGY\end{pmatrix}=\begin{pmatrix}TOTAL \;\;ENERGY \\SUPPLIED\end{pmatrix} - \begin{pmatrix}UNAVAILABLE \\ENERGY\end{pmatrix}

The great thing is that an analogous relation does exist in information theory! The relation is:

\boxed{\mbox{INFORMATION} = \mbox{SIGNAL} - \mbox{NOISE}}

Therefore, there is some subtle analogy and likely some deeper idea behind all this stuff. How do physicists play this game? It is easy. They invent a “thermodynamic potential”! A thermodynamic potential is a gadget (mathematically, a function) that relates a set of different thermodynamic variables. For all practical purposes, we will focus here on the so-called Gibbs “free energy”. It allows us to measure how useful a “chemical reaction” or “process” is. Moreover, it also gives a criterion of spontaneity for processes at constant pressure and temperature. But that is not important for the present discussion. Let’s define the Gibbs free energy G as follows:

G= H - TS

where H is called enthalpy, T is the temperature and S is the entropy. You can identify these terms with the previous concepts. Can you see the similarity between these letters and the energy and communication concepts? Information is something like “free energy” (do you like freedom? Sure! You will love free energy!). Thus, noise is related to entropy and temperature, to randomness, i.e., to something that does not store “useful information”.
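
To make the criterion concrete, here is a toy numerical sketch (the enthalpy and entropy values are invented, purely for illustration) of the spontaneity test \Delta G<0 at constant temperature and pressure:

```python
def gibbs(H, T, S):
    """Gibbs free energy G = H - T*S (H in kJ, T in K, S in kJ/K)."""
    return H - T * S

dH, dS = 50.0, 0.2   # hypothetical enthalpy and entropy changes of a process
for T in (100.0, 300.0, 1000.0):
    dG = gibbs(dH, T, dS)
    print(T, dG, "spontaneous" if dG < 0 else "non-spontaneous")
```

At low temperature this (invented) process is non-spontaneous; raising T makes the entropy term dominate and the sign of \Delta G flip.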

The Internet is also a source of information and noise. There are lots of good readings, but there is also spam. Spam is not really useful for you, is it? Recalling our thermodynamic analogy, since the first law of thermodynamics says that the “quantity of energy” is constant and the second law says something like “the quality of energy, in general, decreases”, we have to be aware of information/energy processing. You find that there are signals and noise out there. This is also important, for instance, in High Energy Physics or particle Physics. In a collision process, you have to distinguish the events that form a “signal” from a generally big “background”.

We will learn more about information (or entropy) and noise in my next log entries. Hopefully, my blog and microblog will become signals and not noise on the whole web.

Where could you get more information? 😀 You have some good ideas and suggestions in the following references:

1) I found the analogy between Thermodynamics and Information many years ago in this cool book (easy to read even for non-experts):

Applied Chaos Theory: A paradigm for complexity. Ali Bulent Cambel. Academic Press; 1st edition (November 19, 1992).

Unfortunately, in those times, when I was an undergraduate student, my teachers were not very interested in this subject. What a pity!

2) There are some good books on Thermodynamics. I love (and fortunately own) these jewels:

Concepts in Thermal Physics, by Stephen Blundell, OUP. 2009.

A really self-contained book on Thermodynamics, Statistical Physics and topics not included in standard books. I like it very much. It includes some issues related to global warming and interesting Mathematics. I enjoy how it introduces polylogarithms in order to handle closed formulae in Quantum Statistics.

Thermodynamics and Statistical Mechanics (Dover Books on Physics & Chemistry). Peter T. Landsberg.

A really old-fashioned and weird book. But it has some insights to make you think about the foundations of Thermodynamics.

Thermodynamics. Dover Pub. Enrico Fermi.

This really tiny book is delicious. I learned a lot of fun stuff from it. Basic, concise and completely original, like Fermi himself. Are you afraid of him? Me too! E. Fermi was a really exceptional physicist and lecturer. Don’t lose the opportunity to read his lectures on Thermodynamics.

Mere Thermodynamics. Don S. Lemons. Johns Hopkins University Press.

Another great little book if you really need a crash course on Thermodynamics.

Introduction to Modern Statistical Physics: A Set of Lectures. R.O. Zaitsev. URSS publishers.

I have read and learned some extra stuff from URSS ed. books like this one. Russian books on Science are generally great and uncommon. And I enjoy some very good, poorly known books written by generally unknown Russian scientists. Of course, everyone has heard about the Landau and Lifshitz books, but there are many other Russian authors who deserve your attention.

3) Information Theory books. Classical information theory books for your curious minds are 

An Introduction to Information Theory: Symbols, Signals and Noise. John R. Pierce. Dover Pub., 2nd Revised ed., 1980.

A really nice and basic book about classical Information Theory.

An Introduction to Information Theory. Dover Books on Mathematics. F.M. Reza. A basic book for beginners.

The Mathematical Theory of Communication. Claude E. Shannon and W. Weaver. Univ. of Illinois Press.

A classical book by one of the fathers of information and communication theory.

Mathematical Foundations of Information Theory. Dover Books on Mathematics. A.Y. Khinchin.

A “must read” if you are interested in the mathematical foundations of IT.