The Entropy 

Summary: We discuss the concepts of information, entropy, and the observer's capability in the discrete measurement space. We derive the propagation of information in discrete jspace. The qvalues are introduced.

Definition

The concept of entropy is probably one of the most profound ideas developed in physics, with enormous relevance to just about every aspect of our lives. It allows us to visualize the utilization of available resources to perform an action. Consider a simple example. Suppose Joe is given $100 by his mom to purchase a textbook. Joe goes and buys the book, which costs him $50. On his way back he stops to grab a pizza and soda for another $15, and then spends another $10 on comics. Upon arriving back home Joe returns the remaining $25 to his mother. As far as Joe's mom is concerned, the money spent on junk food and comics was not useful; therefore from her perspective the entropy is $15 + $10 = $25. Our friend Joe thinks otherwise. He is not the studious type and he did not like the textbook he purchased, but for him the money spent on snacks and comics was worth it. So for Joe the entropy is $50 for the book plus the $25 he could not use, hence a total of $75.

Notice the word "use". The entropy is observer dependent. The entropy tells us about the resources which cannot be used for the assigned task due to inherent limitations of the system or the observer. The system for the macroscopic observer Obs_{M} (humans, v/c << 1) exists based on measurements made by Obs_{c} (v/c ~ 1). Thus while in thermodynamic systems we may not notice the observer dependence, the entropy is dependent on the observer with the maximum capacity to make precise measurements.

We will use the concept of entropy in jspace and correlate the results obtained from measurements with the observer's capability. Simply put, if the observer Obs_{j} measured a large number of states to complete a PE1_{j} result or event, then the entropy of the observer is high.
If the observer Obs_{j} measured a small number of states to complete the same PE1_{j} event, then its entropy is low. The information is a combination of multiple PE1_{j} events. At the same time the observer's resources are fixed by the interval [0_{j}, ∞_{j}]. Therefore the larger the entropy, the less information an observer is likely to measure. The information is defined by the change in entropy: if we are reducing the entropy, we are gaining information. We can define the entropy S as,

S = k_{ij} log_{e}Ω .....(i)

Here Ω is the number of measurements and k_{ij} is a constant characterizing the discrete measurement space or jspace. In thermodynamics k_{ij} is k_{B}, known as the Boltzmann constant. We will have to determine the significance of k_{ij} in the measurement space; we will do so in a later section.
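The observer dependence of equation (i) can be illustrated numerically. The sketch below is purely illustrative: the value k_{ij} = 1 and the state counts are assumptions, since the significance of k_{ij} in jspace is only determined in a later section.

```python
import math

def entropy(omega: int, k_ij: float = 1.0) -> float:
    """Entropy S = k_ij * ln(Omega), equation (i).
    k_ij = 1.0 is an illustrative placeholder value."""
    return k_ij * math.log(omega)

# Suppose one observer needs 1000 states to complete a PE1_j event
# while a more capable observer completes it in 10 states.
S_A = entropy(1000)   # observer making many measurements
S_B = entropy(10)     # observer making few measurements

print(S_A > S_B)      # the less capable observer carries the higher entropy
print(entropy(1))     # a single-state measurement gives zero entropy
```

Note that entropy(1) returns 0, matching the limiting case of an observer who completes the event in one state.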
Remember the discussion towards the end of the Qbox section! We introduced δ_{i} in jspace. Due to δ_{i}, the observer Obs_{i} measures all the information required for a PE1_{j} event in jspace in one state; hence the entropy in this case will be zero. At the same time δ_{i} lies well beyond Obs_{j}'s capacity, and as a consequence the entropy for the same measurement is quite large for Obs_{j}. This is an important distinction we need to keep in mind.

Suppose an observer Obs_{j} measures the same event twice, first by performing Ω_{A} measurements and then Ω_{B} measurements with improved efficiency, such that Ω_{A} > Ω_{B}. The entropy in each case is S_{A} and S_{B} respectively, with S_{A} > S_{B}. If we subtract the information I_{AB} from the high entropy state S_{A}, we obtain the low entropy state S_{B}. The more the entropy, the less the information available to an observer; hence the information is also known as negentropy. The amount of information I_{AB} needed for the efficiency improvement is,

I_{AB} = S_{A} − S_{B} = k_{ij} log_{e}(Ω_{A}/Ω_{B}) .....(ii)
We can write equation (ii) as,

Ω_{B} = Ω_{A} e^{−I_{AB}/k_{ij}} .....(iii)
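The bookkeeping between entropy and information in equation (ii) can be checked directly. The state counts below are assumed values for illustration, with k_{ij} again set to 1 as a placeholder.

```python
import math

k_ij = 1.0  # placeholder; the jspace value of k_ij is determined later

def entropy(omega):
    return k_ij * math.log(omega)        # equation (i)

omega_A, omega_B = 1000, 10              # Omega_A > Omega_B
S_A, S_B = entropy(omega_A), entropy(omega_B)

# Information gained by the efficiency improvement, equation (ii)
I_AB = k_ij * math.log(omega_A / omega_B)

# Subtracting I_AB from the high-entropy state recovers the low-entropy state
print(math.isclose(S_A - I_AB, S_B))

# Inverting the relation recovers Omega_B from Omega_A
print(math.isclose(omega_A * math.exp(-I_AB / k_ij), omega_B))
```

The check makes the "negentropy" reading concrete: the same quantity I_AB is simultaneously an entropy difference and the information needed to move between the two measurement strategies.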
Thus we have information, entropy, and resources. We use fewer resources if we have more information or less entropy, or equivalently if we make a smaller number of measurements to obtain the same result. A very obvious general rule of everyday life. The rule applies to Nature as well.

Progress of Information in jspace

We now consider the progress of information from ispace into the discrete jspace, as the information measured by the Qbox.
Keep in mind that the expansion shown is within the Qbox, implying that the location of the truepoint or source becomes more and more uncertain as time progresses. In other words, the universe being measured is within the Qbox. We note that the source, ispace, and the measurement space defining the observer environment, jspace, are separated. For the same source, the environment of different observers and their measurements will vary per the capability of the observers. Therefore it is prudent to separate the concept of energy, which is measured in jspace using the Qbox and used to define thermodynamic properties, from the information received from the ispace or the source.

In discrete jspace the evolution from the <t = 0_{j}> state on the time scale, due to increasing entropy, will degrade the information from its original value I(t = 0_{j}) to I(t = 0_{j}^{+}) such that,

I(t = 0_{j}^{+}) ∝ e^{−q} .....(iv)

The instant (t = 0_{j}) represents the instant the measurements in jspace start, and (t = 0_{j}^{+}) represents the next instant at an infinitesimal time interval. The exponential function is selected because its derivative is equal to itself and its initial value at q = 0 is 1. The variable q takes positive integral values. Each qvalue represents a PE1_{j} event in discrete jspace. We can write equation (iv) as,

I(t = 0_{j}^{+}) = C e^{−q} .....(v)

where C is a proportionality constant.
We need to remove the proportionality sign. To do that, we determine the initial condition based on Obs_{i} measurements. For Obs_{i}, the observer in ispace, all the outcomes are known, and hence at the instant <t = 0_{j}>,

I(t = 0_{j}) = 1 .....(vi)
We have obtained the initial condition based on Obs_{i} capabilities. It was not possible with Obs_{j}, the observer pair in discrete jspace, due to their inherent limitation in determining the origin with absolute accuracy. The variation of information in discrete jspace can be obtained from equation (v) as,

I(t = 0_{j}^{+}) = I(t = 0_{j}) e^{−q} .....(vii)

We can rewrite equation (vii) as,

I(t = 0_{j}^{+})/I(t = 0_{j}) = e^{−q} .....(viii)
Equation (viii) defines the relationship between the information at a later instant and the initial information, in terms of PE1 measurements in a discrete measurement space or jspace. The relationship is independent of k_{ij}, the constant characterizing the jspace. Please note that in deriving equation (viii) we have not considered any physical characteristics of the medium or the observer in jspace, except for the observer's capability.

Finally we have to consider what happens when the entropy for an observer is maximized, i.e. at what point the observer cannot make any more measurements. This part is straightforward. With increasing entropy the observer Obs_{j} is highly unlikely to complete a PE1_{j} measurement. The value of the information measured by the observers in jspace becomes 0_{j} at t = ∞_{j}^{−}, where t = ∞_{j}^{−} is the instant at which the entropy is maximized. The notation <∞_{j}^{−}> signifies an instant just before the value on the time axis equals ∞_{j}, or completion. The condition for this case can be written from equation (viii) as,

I(t = ∞_{j}^{−})/I(t = 0_{j}) = e^{−q} → 0_{j} as q → ∞_{j} .....(ix)

Before closing out this section, an explanation of the entropy relationship is necessary. In equation (i), S = k_{ij} log_{e}Ω, the logarithmic function is due to the additive nature of entropy, which itself is due to the assumption of no interaction between the internal structures of the molecules. The lack of interaction corresponds to the absence of the information about the <t = 0_{j}> or initial state. In jspace the assumption of non-interacting components is true for values of q approaching ∞_{j}, but for low q values the interaction between states is much stronger, and hence the contribution of interaction to entropy cannot be ignored. The classic entropy defined by Boltzmann's H-theorem assumes no memory of the initial state, and hence the ΔS ≥ 0 relationship is valid for the interacting macrostates (collisions), such as a system of ideal monoatomic gas molecules.
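The decay of the information ratio with q, and its limiting behavior at maximum entropy, can be sketched numerically. The q values chosen below are arbitrary illustrations; the initial information is normalized to 1 per the Obs_{i} initial condition.

```python
import math

def information_ratio(q: int) -> float:
    """I(t)/I(t=0_j) = exp(-q): each completed PE1_j event
    degrades the information measured in jspace."""
    return math.exp(-q)

# The ratio starts at 1 (all information available at t = 0_j)
# and decays with every additional PE1_j event.
for q in (0, 1, 5, 50):
    print(q, information_ratio(q))

# As q grows without bound the ratio approaches 0_j:
# entropy is maximized and no further PE1_j measurement completes.
print(information_ratio(700) < 1e-300)
```

The loop makes the observer-capability reading visible: an observer who needs many PE1_{j} events (large q) to reach a result retains almost none of the initial information.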
The calculations presented use the logarithmic relationship based on the classical picture. However, in the case where the memory of the initial state is available, the process is not truly stochastic.

Well, writing equations is good and everybody does it. But then what is the connection to Nature? We shall discuss it next, when we discuss the fine-structure constant alpha.

Information on www.ijspace.org is licensed under a Creative Commons Attribution 4.0 International License.

"Entropy measures the lack of information about the exact state of a system.  Brillouin"  Zemansky and Dittman, Heat and Thermodynamics, The McGrawHill Corporation. "This principle (the negentropy principle of information), imposes a new limitation on physical experiments and is independent of the wellknown uncertainty relations of the quantum mechanics."  Leon Brillouin in Science and Information Theory. "His final view, expressed in 1912, seems to be that the interaction between ether and resonators is continuous, whereas the energy exchange between ordinary matter, such as atoms, and resonators is somehow quantized."  Hendrik Antoon Lorentz’s struggle with quantum theory, A. J. Kox, Arch. Hist. Exact Sci. (2013) 67:149–170. "If the world has begun with a single quantum, the notions of space and time would altogether fail to have any meaning at the beginning; they would only begin to have a sensible meaning when the original quantum had been divided into a sufficient number of quanta. If this suggestion is correct, the beginning of the world happened a little before the beginning of space and time."  G. LEMAÎTRE, "The Beginning of the World from the Point of View of Quantum Theory.", Nature 127, 706 (9 May 1931). 