The Entropy   








Summary: We discuss the concepts of information, entropy, and the observer's capability in the discrete measurement space. We derive the propagation of information in discrete j-space. The q-values are introduced.
 
 
Definition
    
       
     The concept of entropy is probably one of the most profound ideas developed in physics, with enormous relevance to just about every aspect of our lives. It allows us to visualize how the available resources are utilized to perform an action.

      We can consider a simple example. Suppose Joe is given $100 by his mom to purchase a textbook. Joe goes and buys the book, which costs him $50. On his way back he stops to grab a pizza and a soda for another $15. Then he spends another $10 on comics. Upon arriving back home, Joe returns the remaining $25 to his mother.

      Now as far as Joe's mom is concerned, the money spent on junk food and comics is not useful; therefore from her perspective the entropy is $15 + $10 = $25. Our friend Joe thinks otherwise. He is not the studious type and he did not like the textbook he purchased. But for him the money spent on snacks and comics was worth it. So for Joe the entropy is $50 for the book plus the $25 which he could not use, hence a total of $75. Notice the word "use". The entropy is observer dependent. The entropy tells us about the resources which cannot be used for the assigned task due to the inherent limitations of the system or the observer. The system for the macroscopic observer ObsM (humans, v/c << 1) exists based on measurements made by Obsc (v/c ~ 1). Thus while in thermodynamic systems we may not notice the observer dependence, the entropy is dependent on the observer with the maximum capacity to make precise measurements.
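      As a small numerical sketch of the example above (the dollar amounts are those of the story; the function and variable names are purely illustrative):

```python
# A toy illustration of observer-dependent "entropy": the resources that
# cannot be used for the assigned task, as judged by a particular observer.

def unusable_resources(spending, useful_for_observer):
    """Sum the amounts this observer considers not usable for the task."""
    return sum(amount for item, amount in spending.items()
               if item not in useful_for_observer)

spending = {"textbook": 50, "pizza_and_soda": 15, "comics": 10, "returned": 25}

# Mom: the textbook was the assigned task, and the returned $25 is still usable to her.
mom_entropy = unusable_resources(spending, {"textbook", "returned"})

# Joe: the snacks and comics were worth it; the textbook and the returned money were not.
joe_entropy = unusable_resources(spending, {"pizza_and_soda", "comics"})

print(mom_entropy)  # 25 -> $15 + $10
print(joe_entropy)  # 75 -> $50 + $25
```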


      We will use the concept of entropy in j-space and correlate the results obtained from measurements with the observer's capability. Simply put, if the observer Obsj measured a large number of states to complete a PE1j result or event, then the entropy of the observer is high. If the observer Obsj measured a small number of states to complete the same PE1j event, then its entropy is low.

      Information is a combination of multiple PE1j events. At the same time the observer's resources are fixed by the interval [0j, j]. Therefore the larger the entropy, the less information an observer is likely to measure. Information is defined by the change in entropy: if we are reducing the entropy, we are gaining more information. We can define the entropy S as,
$$ S = k_{ij}\,\log_e \Omega \qquad \text{(i)} $$
Here Ω is the number of measurements and kij is a constant characterizing the discrete measurement space or j-space. In thermodynamics kij is kB, known as the Boltzmann constant. We will have to determine the significance of kij in the measurement space; we will do so in a later section.
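
      As a quick numerical sketch of equation (i) (here kij is set to 1 for illustration; in a thermodynamic system it would be kB):

```python
import math

def entropy(omega, k_ij=1.0):
    """Equation (i): S = k_ij * ln(Omega), with Omega the number of measurements."""
    return k_ij * math.log(omega)

print(entropy(1))     # 0.0 -> a single measured state gives zero entropy
print(entropy(1000))  # ~6.91, in units of k_ij
```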

      Recall the discussion towards the end of the Qbox section, where we introduced δi in j-space. Due to δi, Obsi measures all the information required for a PE1j event in j-space in a single state; hence the entropy in this case will be zero. At the same time δi far exceeds Obsj's capacity, and as a consequence the entropy for the same measurement is quite large for Obsj. This is an important distinction we need to keep in mind.

      Suppose an observer Obsj measures the same event twice, once by performing ΩA measurements and then by performing ΩB measurements with improved efficiency, such that ΩA > ΩB. The entropies for the two cases are SA and SB, with SA > SB. The low-entropy state SB is obtained by subtracting the information IAB from the high-entropy state SA. The more the entropy, the less the information available to an observer; hence information is also known as negentropy. The amount of information IAB needed for the efficiency improvement is,
$$ I_{AB} = S_A - S_B = k_{ij}\,\log_e\!\left(\frac{\Omega_A}{\Omega_B}\right) \qquad \text{(ii)} $$
We can write equation (ii) as,
$$ S_B = S_A - I_{AB} \qquad \text{(iii)} $$
Thus we have information, entropy, and resources. We use fewer resources if we have more information or less entropy, or equivalently if we make a smaller number of measurements to obtain the same result. This is a very obvious general rule of everyday life. The rule applies to Nature as well.
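
A minimal sketch of equations (ii) and (iii), with ΩA and ΩB chosen arbitrarily for illustration:

```python
import math

def entropy(omega, k_ij=1.0):
    # Equation (i): S = k_ij * ln(Omega)
    return k_ij * math.log(omega)

omega_a, omega_b = 1000, 10             # Omega_A > Omega_B: the second run is more efficient
s_a, s_b = entropy(omega_a), entropy(omega_b)

i_ab = s_a - s_b                        # Equation (ii): the information (negentropy) gained
assert abs(s_b - (s_a - i_ab)) < 1e-12  # Equation (iii): S_B = S_A - I_AB
print(i_ab)                             # ~4.61 = ln(Omega_A / Omega_B), in units of k_ij
```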

Progress of Information in j-space

     
We now consider the progress of information from i-space into the discrete j-space, as the information is measured by the Qbox.
(Figure: the i-space/j-space interface, showing the expansion within the Qbox.)
Keep in mind that the expansion shown is within the Qbox, implying that the location of the true-point or source becomes more and more uncertain as time progresses; in other words, the universe being measured is within the Qbox.

     We note that the source (i-space) and the measurement space defining the observer's environment (j-space) are separate. For the same source, the environments of different observers and their measurements will vary with the capability of the observers. Therefore it is prudent to separate the concept of energy, which is measured in j-space using the Qbox and is used to define thermodynamic properties, from the information received from i-space or the source.

 
 

     In discrete j-space the evolution from the <t = 0j> state on the time scale, due to increasing entropy, will degrade the information from its original value I(t = 0j) to I(t = 0j+) such that,

$$ I(t = 0_{j^+}) < I(t = 0_j) \qquad \text{(iv)} $$
The instant (t = 0j) represents the instant at which the measurements in j-space start, and (t = 0j+) represents the next instant, an infinitesimal time interval later. The exponential function is selected because its derivative is equal to itself and its initial value at q = 0 is 1. The variable q takes positive integer values. Each q-value represents a PE1j event in discrete j-space. We can write equation (iv) as,
$$ I(t = 0_{j^+}) \propto e^{-q} \qquad \text{(v)} $$

We need to remove the proportionality sign. To do that we determine the initial condition based on Obsi measurements. For Obsi, the observer in i-space, all the outcomes are known and hence at the instant <t = 0j>,
$$ I(q = 0) = I(t = 0_j) \qquad \text{(vi)} $$

We have obtained the initial condition based on Obsi's capabilities. This was not possible with Obsj, the observer pair in discrete j-space, due to their inherent limitation in determining the origin with absolute accuracy. The variation of information in discrete j-space can be obtained from equation (v) as,

$$ I(t = 0_{j^+}) = I(t = 0_j)\, e^{-q} \qquad \text{(vii)} $$

We can rewrite equation (vii) as,

$$ \frac{I(t = 0_{j^+})}{I(t = 0_j)} = e^{-q} \qquad \text{(viii)} $$

Equation (viii) defines the relationship between the information at a later instant and the initial information, in terms of PE1j measurements in the discrete measurement space or j-space. The relationship is independent of kij, the constant characterizing the j-space. Please note that in deriving equation (viii) we have not considered any physical characteristics of the medium or of the observer in j-space, except for the observer's capability.
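
A small numerical sketch of equation (viii), with q counting PE1j events as positive integers and the initial information normalized to 1 for illustration:

```python
import math

i_initial = 1.0                         # I(t = 0j), normalized for illustration
for q in range(0, 6):                   # q = 0 recovers the initial value; q = 1, 2, ... are PE1j events
    print(q, i_initial * math.exp(-q))  # Equation (viii): I / I(t = 0j) = exp(-q)
```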

     
      Finally we have to consider what happens when the entropy for an observer is maximized, i.e. at what point the observer cannot make any more measurements. This part is straightforward. With increasing entropy the observer Obsj becomes highly unlikely to complete a PE1j measurement. The value of the information measured in j-space becomes 0j for the observers in j-space at t = j-, where t = j- is the instant at which the entropy is maximized. The notation <j-> signifies an instant just before the value on the time axis equals j, or completion. The condition for this case can be written from equation (viii) as,
$$ \frac{I(t = j^-)}{I(t = 0_j)} = e^{-q} = 0_j \qquad \text{(ix)} $$

The 0j on the R.H.S. of the above equation is measured as finite by Obsi, which means that the information I(t = j-) is finite, but Obsj cannot measure it as it is below its measurement capability.
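
      To illustrate this condition numerically: if we stand in for 0j with a small finite resolution ε (an assumed stand-in for Obsj's measurement capability, not a quantity defined here), the information drops below it after a finite number of PE1j events:

```python
import math

epsilon = 1e-6    # assumed stand-in for 0j: the smallest value Obsj can still resolve
i_initial = 1.0   # I(t = 0j), normalized

q = 0
while i_initial * math.exp(-q) >= epsilon:
    q += 1
print(q)          # 14 -> for q >= 14 the remaining information is below Obsj's reach,
                  #       though Obsi would still measure it as finite
```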

      Before closing out this section, an explanation of the entropy relationship is necessary. When we wrote equation (i), i.e. S = kij loge Ω, the logarithmic function was due to the additive nature of entropy, which itself is due to the assumption of no interaction between the internal structures of the molecules. The lack of interaction corresponds to the absence of information about the <t = 0j> or initial state.

      In j-space the assumption of non-interacting components holds for values of q approaching j, but for low q-values the interaction between states is much stronger, and hence the contribution of interaction to the entropy cannot be ignored.

      The classical entropy defined by Boltzmann's H-theorem assumes no memory of the initial state, and hence the ΔS ≥ 0 relationship is valid for interacting macro-states (collisions), such as a system of ideal monoatomic gas molecules. The calculations presented use the logarithmic relationship based on this classical picture. However, in the case where memory of the initial state is available, the process is not truly stochastic.

      Well, writing equations is good and everybody does it. But then what is the connection to nature? We shall take this up next, when we discuss the fine-structure constant alpha.
 

Information on www.ijspace.org is licensed under a Creative Commons Attribution 4.0 International License.


































"Entropy measures the lack of information about the exact state of a system. - Brillouin"

- Zemansky and Dittman, Heat and Thermodynamics, The McGraw-Hill Corporation.














































"This principle (the negentropy principle of information), imposes a new limitation on physical experiments and is independent of the well-known uncertainty relations of the quantum mechanics."                      

- Leon Brillouin in Science and Information Theory.





















"His final view, expressed in 1912, seems to be that the interaction between ether and resonators is continuous, whereas the energy exchange between ordinary matter, such as atoms, and resonators is somehow quantized."

- Hendrik Antoon Lorentz’s struggle with quantum theory, A. J. Kox, Arch. Hist. Exact Sci. (2013) 67:149–170.


















"If the world has begun with a single quantum, the notions of space and time would altogether fail to have any meaning at the beginning; they would only begin to have a sensible meaning when the original quantum had been divided into a sufficient number of quanta. If this suggestion is correct, the beginning of the world happened a little before the beginning of space and time."

- G. LEMAÎTRE, "The Beginning of the World from the Point of View of Quantum Theory.", Nature 127, 706 (9 May 1931).