# LOG#022. Kaniadakis and relativity.

**Posted:** 2012/08/22

**Filed under:** Information theory, Non-extensive entropy and Superstatistics, Physmatics, Relativity

**Tags:** Entropy, non-extensive, Physmatics, Relativity

Hello, I am back, after some summer rest/study/introspection! And after an amazing July with the Higgs discovery by ATLAS and CMS, and an amazing August with the arrival of the Curiosity rover, MSL (Mars Science Laboratory), on Mars. After a hot summer in my home town… I have written lots of drafts these days, and I will be publishing all of them step by step.

Today we will discuss an interesting remark studied by Kaniadakis. He is known for his works on relativistic physics and condensed matter physics, and especially for his work on a remarkable function related to non-extensive thermodynamics. Indeed, Kaniadakis himself has shown that his entropy is also related to the mathematics of special relativity. Ultimately, his remarks suggest:

1st. Dimensionless quantities are the true fundamental objects in any theory.

2nd. A relationship between information theory and relativity.

3rd. The important role of deformation parameters and deformed calculus in contemporary Physics, and perhaps increasingly so in the future.

4th. Entropy could be more fundamental than previously thought, in the sense that non-extensive generalizations of entropy play a more significant role in Physics.

5th. Non-extensive entropies are more fundamental than the conventional entropy.

The fundamental object we are going to find is related to the following function:

Let me first imagine two identical particles ( of equal mass) A and B, whose velocities, momenta and energies are, in certain frame S:
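The missing expressions are presumably the nonrelativistic ones, since the section treats the Galilean case first (this is my reconstruction):

```latex
v_A,\; v_B, \qquad
p_A = m v_A,\quad p_B = m v_B, \qquad
E_A = \tfrac{1}{2} m v_A^{2},\quad E_B = \tfrac{1}{2} m v_B^{2}
```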

In the rest frame of particle B, S’, we have
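In the Galilean case this presumably reads:

```latex
v'_B = 0, \qquad v'_A = v_A - v_B, \qquad p'_A = m\,(v_A - v_B)
```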

If we define a dimensionless momentum parameter

we get, after the usual exponentiation,

Galilean relativity says that the laws of Mechanics are unchanged when we pass from rest to a uniformly moving reference frame. Equivalently, Galilean relativity in our context means invariance under the exchange of the two particles' roles, and it implies the invariance under the corresponding change of the dimensionless momenta. In turn, plugging these into the last equation, we get the known relationship
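A plausible reconstruction of the missing steps, assuming a reference momentum $p_*$ for the adimensionalization (the normalization is my assumption), is:

```latex
q = \frac{p}{p_*}, \qquad
e^{\,q'_A} = e^{\,q_A - q_B} = \frac{e^{\,q_A}}{e^{\,q_B}},
\qquad\Longrightarrow\qquad
e^{\,x+y} = e^{\,x}\, e^{\,y}
```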

Wonderful, isn’t it? It is for me! Now, we will move on to Special Relativity. In the S’ frame, where B is at rest, we have:
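In the relativistic case, with B at rest in S’, presumably:

```latex
v'_B = 0, \qquad p'_B = 0, \qquad E'_B = m c^{2}
```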

and from the known relativistic transformations for energy and momentum

where of course we define
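These are the standard Lorentz transformations of energy and momentum for a boost with velocity $v$, together with the usual Lorentz factor:

```latex
p' = \gamma\left(p - \frac{v}{c^{2}}\,E\right), \qquad
E' = \gamma\,(E - v\,p), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```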

After this introduction, we can parallel what we did for Galilean relativity. After some easy algebra, we can write the previous equations in the equivalent form

Now, we can introduce dimensionless variables instead of the original triple, defining the dimensionless set:
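A common convention for this dimensionless set, following Kaniadakis' papers (the reference velocity $v_*$ is my assumption), is:

```latex
q = \frac{p}{m v_*}, \qquad
\epsilon = \frac{E}{m c^{2}}, \qquad
\kappa = \frac{v_*}{c}
```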

Note that the so-called deformation parameter is indeed related (in fact, equal) to the beta parameter of relativity. Again, from the special relativity requirement we obtain, as we expected, a bounded deformation parameter. Classical physics, the Galilean relativity we know from our everyday experience, is recovered in the limit of vanishing deformation parameter, or equivalently, in the limit of infinite light speed. In the dimensionless variables, the transformation of energy and momentum we wrote above can be shown to be:

In the rest frame of a particle, we get of course the celebrated result $E = mc^2$, or in the new variables, a unit dimensionless energy. The energy-momentum dispersion relationship from special relativity becomes:

or
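Assuming the conventions $q = p/(m v_*)$, $\epsilon = E/(m c^2)$ and $\kappa = v_*/c$ (my assumption), the relativistic dispersion relation $E^2 = c^2 p^2 + m^2 c^4$ takes the two equivalent dimensionless forms:

```latex
\epsilon^{2} - \kappa^{2} q^{2} = 1
\qquad\Longleftrightarrow\qquad
\epsilon = \sqrt{1 + \kappa^{2} q^{2}}
```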

Moreover, we can rewrite the equation

in terms of the dimensionless energy-momentum variable

and we get the analogue of the Galilean addition rule for dimensionless velocities
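The deformed addition rule in question is presumably the Kaniadakis κ-sum, which reduces to ordinary addition as κ → 0:

```latex
x \stackrel{\kappa}{\oplus} y \;=\; x\,\sqrt{1+\kappa^{2} y^{2}} \;+\; y\,\sqrt{1+\kappa^{2} x^{2}}
```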

Note that the classical limit is recovered again by sending the deformation parameter to zero. Now, we have to define some kind of deformed exponential function. Let us define:
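This deformed exponential is the Kaniadakis κ-exponential:

```latex
\exp_\kappa(x) \;=\; \left(\sqrt{1+\kappa^{2} x^{2}} + \kappa x\right)^{1/\kappa}
\;=\; \exp\!\left(\frac{1}{\kappa}\,\mathrm{arcsinh}(\kappa x)\right),
\qquad
\lim_{\kappa \to 0} \exp_\kappa(x) = e^{x}
```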

Applying this function to the above last equation, we observe that

Again, relativity means that observers in uniform motion with respect to each other should observe the same physical laws, and so we should obtain equations invariant under the appropriate exchanges of the two particles' variables. Plugging these conditions into the last equation implies that the following condition holds (and it can easily be checked from the definition of the deformed exponential):
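The condition in question is presumably the homomorphism property $\exp_\kappa(x)\,\exp_\kappa(y) = \exp_\kappa(x \stackrel{\kappa}{\oplus} y)$, where $\stackrel{\kappa}{\oplus}$ is the κ-sum. A minimal numerical sketch (the parameter values are arbitrary; the two functions are the standard Kaniadakis definitions) checks it:

```python
import math

def exp_k(x, k):
    """Kaniadakis kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k), for k != 0."""
    return (math.sqrt(1.0 + (k * x) ** 2) + k * x) ** (1.0 / k)

def k_sum(x, y, k):
    """Kaniadakis kappa-sum; reduces to the ordinary sum x + y as k -> 0."""
    return x * math.sqrt(1.0 + (k * y) ** 2) + y * math.sqrt(1.0 + (k * x) ** 2)

k, x, y = 0.3, 0.7, 1.2

# The kappa-exponential maps the kappa-sum into an ordinary product:
lhs = exp_k(k_sum(x, y, k), k)
rhs = exp_k(x, k) * exp_k(y, k)
print(abs(lhs - rhs) < 1e-12)  # True

# Classical limit: for tiny kappa, exp_k approaches the ordinary exponential.
print(abs(exp_k(x, 1e-6) - math.exp(x)) < 1e-9)  # True
```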

One interesting question is what the inverse of this deformed exponential is (the names q-exponential or κ-exponential are often found in the literature). It has to be some kind of deformed logarithm. And it is! The deformed logarithm, inverse to the deformed exponential, is the following function:
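That inverse is the Kaniadakis κ-logarithm:

```latex
\ln_\kappa(x) \;=\; \frac{x^{\kappa} - x^{-\kappa}}{2\kappa}
\;=\; \frac{1}{\kappa}\,\sinh\!\left(\kappa \ln x\right),
\qquad
\lim_{\kappa \to 0} \ln_\kappa(x) = \ln x
```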

Indeed, this function is related (in units with the Boltzmann constant set to unity) to the so-called Kaniadakis entropy!
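In the discrete case, and with the Boltzmann constant set to unity, the Kaniadakis entropy built from the deformed logarithm reads:

```latex
S_\kappa \;=\; -\sum_i p_i \ln_\kappa(p_i)
\;=\; -\sum_i \frac{p_i^{\,1+\kappa} - p_i^{\,1-\kappa}}{2\kappa},
\qquad
\lim_{\kappa\to 0} S_\kappa = -\sum_i p_i \ln p_i
```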

Furthermore, the equation also implies that

The gamma parameter of special relativity is also recast as
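Assuming the dimensionless conventions $q = p/(m v_*)$, $\epsilon = E/(m c^2)$, $\kappa = v_*/c$, and using $E = \gamma m c^2$, one gets:

```latex
\gamma \;=\; \epsilon \;=\; \sqrt{1 + \kappa^{2} q^{2}}
```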

More generally, in fact, the deformed exponentials and logarithms develop a complete calculus based on:

and the differential operators

so that, e.g.,
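The deformed calculus can plausibly be reconstructed from the standard Kaniadakis formalism: the κ-sum plays the role of addition, and the κ-derivative makes $\exp_\kappa$ its own eigenfunction:

```latex
x \stackrel{\kappa}{\oplus} y = x\sqrt{1+\kappa^{2} y^{2}} + y\sqrt{1+\kappa^{2} x^{2}},
\qquad
\frac{d}{d_\kappa x} \equiv \sqrt{1+\kappa^{2} x^{2}}\;\frac{d}{dx},
\qquad
\frac{d}{d_\kappa x}\exp_\kappa(x) = \exp_\kappa(x)
```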

This Kaniadakis formalism is useful, for instance, in generalizations of Statistical Mechanics, and it is becoming a powerful tool in High Energy Physics. At low energy, classical statistical mechanics yields a Boltzmann exponential factor in the distribution function:

At high energies, in the relativistic domain, the Kaniadakis approach predicts that the distribution function departs from the classical exponential towards a power law:
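The two regimes can be summarized as follows, with $\beta = 1/k_B T$ and proportionality constants omitted (the power-law exponent follows from the large-argument behaviour of the κ-exponential):

```latex
f(E) \;\propto\; e^{-\beta E}
\qquad\text{vs.}\qquad
f_\kappa(E) \;\propto\; \exp_\kappa(-\beta E)
\;\sim\; \left(2\kappa\beta E\right)^{-1/\kappa}
\quad (E \to \infty)
```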

There are other approaches and entropies that could be interesting for additional deformations of special relativity. This formalism is also useful in the foundations of Physics, in the Information Theory approach that surrounds the subject in current times. And of course, it is full of incredibly beautiful mathematics!

We can start from deformed exponentials and logarithms in order to derive the special theory of relativity (reversing the order in which I have introduced this topic here). Aren’t you surprised?

# LOG#002. Information and noise.

**Posted:** 2012/02/02

**Filed under:** Information theory, Physmatics

**Tags:** Entropy, Information theory, Physics, Physmatics, science, Thermodynamics

We live in the **information era**. Read more about this age here. Everything in your surroundings and environment is bound and related to some kind of “information processing”. Information can also be recorded and transmitted. Therefore, roughly speaking, information is *something* which is processed, stored and transmitted. Your computer is processing information right now, while you read these words. You also record and save your favourite pages and files on your computer. There are many tools to store digital information: HDs, CDs, DVDs, USBs,… And you can transmit that information to your buddies by e-mail, old-fashioned postcards and letters, MSN, phone,… You are even processing information with your brain and senses whenever you read this text. Thus, the information idea is abstract and very general. The following diagram shows you how large and multidisciplinary information theory (IT) is:

As a teenager, I enjoyed the old game in which a message is whispered in your ear and you pass it on to another person, who passes it to another, and so on. Today, you can see it at large scale on Twitter. Hey! The final message is generally very different from the original one! This simple example illustrates the other side of communication or information transmission: “noise” (the term “efficiency” is also used). The storage or transmission of information is generally not completely efficient: you can lose information. Roughly speaking, every amount of information carries some quantity of noise that depends upon how you transmit the information (you can include noiseless transmission as a subtype of information process in which no information is lost). Indeed, this is also why we age. Our DNA, which is continuously replicating itself thanks to the metabolism (ultimately possible thanks to sunlight), gets progressively corrupted by free radicals and different “chemicals” that make our cellular replication more and more inefficient. Doesn’t this remind you of something you know from High School? Yes! I am thinking about Thermodynamics. Indeed, the reason why Thermodynamics has been a main topic from the 19th century until now is simple: the quantity of energy is constant, but its quality is not. Therefore, we must be careful to build machines/engines that are energy-efficient with the available energy sources.

Before going into further details, you are likely wondering what information is! It is a set of symbols, signs or objects with some well-defined order. That is what information is. For instance, the word ORDER is giving you information. A random permutation of those letters, like ORRDE or OERRD, is generally meaningless. I said information was “something”, but I didn’t go any further! Well, here is where Mathematics and Physics appear. Don’t run far away! The beauty of Physics and Maths, or as I like to call them, Physmatics, is that concepts, intuitions and definitions, rigorously made, are well enough to satisfy your general requirements. *Something* is a general object, or a set of objects with a certain order. It can be a certain DNA sequence coding how to produce a certain substance (e.g., a protein) our body needs. It can be a simple or complex message hidden in a highly advanced cryptographic code. It is whatever you are recording on your DVD (a new OS, a movie, your favourite music,…) or any other storage device. It can also be what your brain is learning how to do. That is “something”, or really whatever. You can say it is an obscure and weird definition. Really, it is! It can also be what electromagnetic waves transmit. Is it magic? Maybe! It has always seemed magic to me how you can browse the Internet thanks to your Wi-Fi network! Of course, it is not magic. It is Science. Digital or analog information can be seen as large ordered strings of 1’s and 0’s, making “bits” of information. We will not discuss bits in this log. Future logs will…

Now, we have to introduce the concepts through some general ideas we have mentioned and we know from High School. Firstly, Thermodynamics. As everybody knows, and as you have experienced, energy cannot be completely turned into useful “work”. There is a quality in energy. Heat is the most degraded form of energy. When you start your car and burn fuel, you know that some of the energy is transformed into mechanical energy and a lot of energy is dissipated as heat into the atmosphere. I will not talk about the details of the different cycles engines can realize, but you can learn more about them in the references below. Symbolically, we can state that
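The missing relation is presumably a schematic energy balance of this kind (my reconstruction from the surrounding discussion):

```latex
\text{Energy} \;=\; \text{useful energy (work)} \;+\; \text{degraded energy (heat)}
```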

The great thing is that an analogue relation in information theory does exist! The relation is:
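The information-theoretic analogue is presumably a schematic decomposition of this kind (again, my reconstruction):

```latex
\text{Transmitted message} \;=\; \text{useful information (signal)} \;+\; \text{useless information (noise)}
```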

Therefore, there is some subtle analogy, and likely some deeper idea, behind all this stuff. How do physicists play this game? It is easy. They invent a “thermodynamic potential”! A thermodynamic potential is a gadget (mathematically, a function) that relates a set of different thermodynamic variables. For all practical purposes, we will focus here on the so-called Gibbs “free energy”. It allows us to measure how useful a “chemical reaction” or “process” is. Moreover, it also gives a criterion of spontaneity for processes at constant pressure and temperature. But that is not important for the present discussion. Let’s define the Gibbs free energy G as follows:
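The definition, which the next sentence unpacks term by term, is the standard one:

```latex
G = H - TS
```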

where H is called the enthalpy, T is the temperature and S is the entropy. You can identify these terms with the previous concepts. Can you see the similarity with what we wrote in terms of energy and communication concepts? Information is something like “free energy” (do you like freedom? Sure! You will love free energy!). Thus, noise is related to entropy and temperature, i.e., to randomness: something that does not store “useful information”.

The Internet is also a source of information and noise. There are lots of good readings, but there is also spam. Spam is not really useful for you, is it? Recalling our thermodynamic analogy: since the first law of thermodynamics says that the “quantity of energy” is constant, and the second law says something like “the quality of energy, in general, *decreases*”, we have to be aware of information/energy processing. You find that there are signals and noise out there. This is also important, for instance, in High Energy Physics or particle physics: in a collision process, you have to distinguish the events that are a “signal” from a generally big “background”.

We will learn more about information (or entropy) and noise in my next log entries. Hopefully, my blog and microblog will become signals, and not noise, in the whole web.

Where could you get more information? 😀 You have some good ideas and suggestions in the following references:

*1) I found, many years ago, the analogy between Thermodynamics and Information in this cool book (easy to read, even for non-experts):*

*Applied Chaos Theory: A paradigm for complexity.* Ali Bulent Cambel (Author)

**Publisher:** Academic Press; 1st edition (November 19, 1992)

Unfortunately, in those times, as an undergraduate student, my teachers were not very interested in this subject. What a pity!

*2) There are some good books on Thermodynamics. I love (and fortunately own) these jewels:*

*Concepts in Thermal Physics*, *by Stephen Blundell, OUP. 2009.*

A really self-contained book on Thermodynamics, Statistical Physics and topics not included in standard books. I really like it very much. It includes some issues related to the global warming and interesting Mathematics. I enjoy how it introduces polylogarithms in order to handle closed formulae for the Quantum Statistics.

*Thermodynamics and Statistical Mechanics.* **(Dover Books on Physics & Chemistry).** Peter T. Landsberg

A really old-fashioned and weird book. But it has some insights to make you think about the foundations of Thermodynamics.

*Thermodynamics*,

*Dover Pub. Enrico Fermi*

This really tiny book is delicious. I learned a lot of fun stuff from it. Basic, concise and completely original, like Fermi himself. Are you afraid of him? Me too! E. Fermi was a really exceptional physicist and lecturer. Don’t lose the opportunity to read his lectures on Thermodynamics.

*Mere Thermodynamics.* *Don S. Lemons. Johns Hopkins University Press.*

Another great little book, if you really need a crash course on Thermodynamics.

**Introduction to Modern Statistical Physics: A Set of Lectures.** *Zaitsev, R.O. URSS publishings.*

I have read and learned some extra stuff from URSS ed. books like this one. Russian books on Science are generally great and uncommon, and I enjoy some very good, poorly known books written by generally unknown Russian scientists. Of course, you have always known about the Landau and Lifshitz books, but there are many other Russian authors who deserve your attention.

*3) Information Theory books. Classical information theory books for your curious minds are:*

**An Introduction to Information Theory: Symbols, Signals and Noise.** *Dover Pub. 2nd Revised ed. 1980. John R. Pierce.*

A really nice and basic book about classical Information Theory.

**An Introduction to Information Theory**. *Dover Books on Mathematics. F. M. Reza. Basic book for beginners.*

**The Mathematical Theory of Communication**. *Claude E. Shannon and W. Weaver. Univ. of Illinois Press.*

A classical book by one of the fathers of information and communication theory.

**Mathematical Foundations of Information Theory.** *Dover Books on Mathematics. A. Y. Khinchin.*

A “must read” if you are interested in the mathematical foundations of IT.