
Entropy concept


Albert Einstein considered the thermodynamic laws to be "the primordial laws of all sciences", and Sir Arthur Eddington said about them: "the second principle of thermodynamics holds supremacy among the laws of nature ... If your theory is found to be against the second law of thermodynamics, I can give you no hope; there is nothing left for the new theory but to fall into the deepest humiliation possible ...."
The world is full of false moralists and of those who plant scarecrows! Too bad for those who go in one direction only because the whole herd is going the same way!

Entropy concept in modern science

Background and actual explanation
There is no unique definition for the concept of entropy; although different definitions have been proposed over time, there is an accepted opinion that all of them are equivalent to each other. Thus, entropy can be defined as:

• A measure of the probability of obtaining a specific result;
• A measure of the disorder of a thermodynamic system;
• A ratio between heat and absolute temperature of the system.
There is no need to dwell too much on these definitions because they are extensively described in any elementary physics textbook.
Of course, having these options for defining entropy, the second law of thermodynamics does not have a standard form. Only a few formulations are exemplified here, but there are at least 10 different ,,expressions" of this law.
The formulation given by Kelvin: no process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work.
The formulation given by Clausius: "Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time."
Clausius introduced the concept of entropy and gave other formulations of the second law based on this concept, such as: the entropy of an isolated system never decreases.
It is accepted that the second law of thermodynamics is the most supported and most universal law of physics; according to it, it can be concluded that the entire Universe evolves in time toward greater and greater disorder.
Moreover, it is accepted that the law makes it possible to define the notions of past and future, both in physics and in the real universe. Since the entropy of an isolated system increases with time, by measuring its value at different times we can draw a simple graph of the entropy change and in this way define a time axis, that is, a direction of time. ,,Time" increases the degree of ,,disorder" of a system, and the correspondence is one-to-one. This asymmetry at the macroscopic level can provide an empirical possibility of differentiating between the past and future of events, and therefore thermodynamic time is an anisotropic quantity. For any other physical law, time is an isotropic quantity, i.e. physical laws are invariant with respect to time reversal.
Later, researchers studied the effects of the second law of thermodynamics and acknowledged "that this law is universally valid for a much wider range of processes, not just physical ones". It is considered that there is no exception to this law and that all events, from the simplest to the most complex, all processes in the universe, obey it. Consequently, the effects of the law were extended to chemical, biological, sociological, informational and other processes.

Why the current concept of entropy is full of nonsense ...

1. Entropy and biological or chemical processes

Although physics postulates that entropy always increases or remains constant for an isolated system, this is not at all respected in chemistry. Since biological processes are based mainly on chemical and biochemical processes, the law is not respected in biological systems either. There is no need to introduce new concepts into current chemistry to show that the second law of thermodynamics is blatantly contradicted by common chemical processes: a chemical reaction does not always take place so as to increase the entropy of the system.
Since the beginning of chemistry as a science, experimental data have shown that there are two factors that affect the spontaneity of a chemical process:
• spontaneity is favored when the process is exothermic;
• spontaneity is favored when the degree of ,,randomness" of the system increases.
In order to describe mathematically the influence of these factors on the spontaneity of chemical processes, two concepts were introduced in physical chemistry:
• the enthalpy of reaction, which characterizes the endo- or exothermicity of a chemical process;
• the entropy, which measures the degree of disorder of the system.
Later, Gibbs defined a new quantity called the free energy of reaction, G = H − TS, where H is the enthalpy, T is the absolute temperature and S is the entropy. In chemistry (and subsequently in biology), the reaction free energy is used as the indicator of spontaneity of a chemical process, and not the entropy, as required by the second principle of thermodynamics.
Mathematically, it can be determined (Tab. 1) which combinations of enthalpy and entropy lead to spontaneous chemical reactions (in short, a negative ΔG means a spontaneous chemical process).








Tab. 1. Spontaneity as a function of the signs of ΔH and ΔS (ΔG = ΔH − TΔS):

Case 1 (ΔH < 0, ΔS > 0): ΔG < 0 at any T; spontaneous at any T.
Case 2 (ΔH < 0, ΔS < 0): ΔG > 0 over a certain T value and ΔG < 0 under it; spontaneous under that T value, non-spontaneous over it.
Case 3 (ΔH > 0, ΔS > 0): ΔG < 0 over a certain T value and ΔG > 0 under it; spontaneous over that T value, non-spontaneous under it.
Case 4 (ΔH > 0, ΔS < 0): ΔG > 0 at any T; non-spontaneous at any T.

As is easily observed, in current chemistry the spontaneity and evolution of a chemical system is given by the combination of three factors, because temperature has to be considered as an independent factor beside enthalpy and entropy.
When the enthalpy and entropy have different signs (cases 1 and 4), they operate in the same direction and the spontaneity of the process is not temperature dependent. When the enthalpy and entropy have the same sign (cases 2 and 3), their effects are opposed, and therefore a temperature change will cause one factor to become dominant. In these cases, spontaneity depends on the temperature. For case 2, lowering the temperature below a certain limit makes the free energy of reaction negative and the process spontaneous; the opposite holds for case 3, where a temperature rise above a certain limit makes the process spontaneous.
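The temperature dependence in cases 2 and 3 can be sketched numerically. A minimal Python sketch, assuming hypothetical values of ΔH and ΔS chosen only to illustrate case 2 (the crossover temperature is T* = ΔH/ΔS):

```python
# Spontaneity criterion dG = dH - T*dS (see Tab. 1).
# The numbers below are hypothetical, chosen only to illustrate case 2
# (dH < 0 and dS < 0: spontaneous only below the crossover temperature).

def delta_g(dH, dS, T):
    """Free energy change; dH in J/mol, dS in J/(mol*K), T in K."""
    return dH - T * dS

dH = -100_000.0            # J/mol, exothermic
dS = -200.0                # J/(mol*K), order increases
T_star = dH / dS           # crossover temperature: 500 K here

dG_cold = delta_g(dH, dS, 400.0)   # below T*: negative, spontaneous
dG_hot = delta_g(dH, dS, 600.0)    # above T*: positive, non-spontaneous
```

For case 3 the same function applies with both signs positive, and the spontaneity condition reverses to T > T*.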
It is more than hilarious to examine the motivation for introducing thermodynamic potentials (the free energy of reaction) into science instead of working directly with the second principle of thermodynamics. If the variation of entropy can tell which processes are spontaneous, why does chemistry need these thermodynamic potentials? The logical conclusion to be drawn is that, in this case, the evolution of chemical and biological systems cannot be inferred from the second principle of thermodynamics.
Consequently, without the need for new concepts, theoreticians should limit the generality of the second law of thermodynamics to physical phenomena only (which ones?!) and accept that, for chemical and biological systems, entropy is just one factor of a set of three that affect their evolution.
In the new theory, which will be described in detail in the book, the current concept of entropy is reconsidered at its ,,true value".

2. Order, disorder and entropy   

The definition of entropy as a measure of disorder for a physical system can be easily ruled out by a brief analysis of the concepts of order and disorder.
In 1865, Rudolf Clausius introduced the term "entropy" for a quantity that increases due to heat. Because at that time (and even now) heat generally referred to the random movement of the particles constituting a system, entropy was considered similar to the degree of "disorder" of the system. It is important to note that thermodynamics studies thermal phenomena without taking into account the atomic-molecular structure of bodies; it does not study the mechanism of phenomena and therefore does not use structural representations of material bodies. Of course, atomic order is associated with a particular pattern of arrangement of atoms within a specific structure, while disorder is associated with the absence of such a pattern. To be more specific, in the case of the solid state, the crystalline structure is considered an ordered state while the amorphous structure is considered a disordered one. For liquids and gases (ideal or real), the existence of an ordered or disordered structure cannot be accepted, since the constituent particles of a liquid or gas are in constant motion.
Consider an amorphous material (glass, asphalt, etc.) subjected to a cooling process from a temperature T1 = 300 °C down to a temperature T2 = −100 °C. At both temperatures the material is in solid form, but we are interested in how the order of the system ,,increases or decreases" between the two temperatures. Let us suppose that at temperature T1 the system entropy is S1 and, similarly, that at temperature T2 it is S2. Since entropy decreases with decreasing temperature (in the spirit of the third law of thermodynamics), we must have S2 < S1.
But is this decrease in entropy reflected in an increase in the order of the system?
The question is rhetorical, and even a novice in physics can answer it. Lowering the temperature does not increase the degree of order of the system, i.e. an amorphous material cannot be transformed into a crystal. X-ray and electron diffraction studies can confirm that the change of temperature does not lead to changes in the arrangement of the atoms in the network, so it does not change the degree of order or disorder. With decreasing temperature there is, in fact, only a limitation of the movement of these atoms around their equilibrium positions.
Let us consider another experiment, often used in arguments for the irreversibility of physical phenomena and the increase of entropy. A ball hits a plate and, depending on the nature of the collision, a smaller or larger fraction of the ball's kinetic energy is transformed into heat. If the collision is almost elastic, only a small fraction of the ball's energy is transformed into heat. If the collision is plastic, the ball and/or the surface are deformed and nearly all the kinetic energy of the ball is converted into heat. In both cases, current theorists maintain that the entropy of the system increases (less obviously for the elastic collision), because the amount of heat released during impact corresponds to an increase in the entropy of the system.
But does this ,,growth" of system entropy correspond to an increase in the disorganization of the system?
For a nearly elastic collision, the amount of heat released is dissipated without noticeable macroscopic changes in the studied bodies. At the atomic level, using X-ray and electron diffraction, it can be verified that there is no change in the original structure (crystalline or amorphous) of the constituent materials.
Even in the case of a plastic collision, where the total amount of kinetic energy is transformed into heat, there is no change in the state of order or disorder at the microscopic level (excluding the case when the amount of heat generated is so great that the material melts or vaporization occurs!). Current theoreticians claim that during these collisions the system switches from a state of low probability to a state of high probability, and that as a consequence this change will lead to chaotic motion of the material's particles: but this assumption is pure fabrication. Of course, in the case of a plastic collision there is a change in the shape of the ball and/or surface, but these are side effects that do not affect order and disorder; such changes affect only the utility value of some goods for further practical use.
Therefore, the proposed theory claims that there is no connection between the concept of entropy and order at the microscopic or macroscopic scale. Expanding the laws of thermodynamics to human behavior, where order and disorder are more than subjective concepts, is an unpardonable error of modern physics, and this error has to be avoided in the future.

3. Why entropy cannot be defined as a probability, nor as heat divided by absolute temperature

Experiment no. 1. The entropy of a mixture of ideal gases
Consider two containers filled with equal volumes of noble gases, connected by a narrow tube as shown in Fig. 1. One container is filled with helium and the other with xenon. The temperature of both containers is 25 °C, in equilibrium with the surroundings, and the gas pressure inside is 0.5 atm. Under these conditions both gases behave similarly to an ideal gas.

Figure 1. Entropy change during gas mixing.

When the valve that connects the two compartments is opened, the gases mix spontaneously and the system parameters remain constant (pressure, temperature, total number of moles of He or Xe in the entire volume).
It is accepted that the entropy change (an increase of entropy in this case) is the driving force that generates the redistribution of He and Xe atoms between the two containers, until a state of maximum distribution (scattering) is reached, as shown in Fig. 1.
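The entropy increase invoked here is the ideal entropy of mixing: each gas expands from its own volume V into the total volume 2V, contributing ΔS = nR ln 2. A short Python sketch under the stated conditions (0.5 atm, 298 K, 1 m³ per container); the variable names are illustrative:

```python
import math

R = 8.314            # J/(mol*K), universal gas constant
P = 0.5 * 101325.0   # Pa (0.5 atm)
V = 1.0              # m^3, volume of each container
T = 298.0            # K

n = P * V / (R * T)                 # moles of each gas (ideal gas law), ~20.4 mol
dS_mix = 2 * n * R * math.log(2.0)  # J/K: n*R*ln(2V/V) for each of the two gases
# dS_mix is positive, which is the textbook justification for spontaneous mixing.
```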

Why is the current explanation a monument of absurdity ... ?
Kinetic molecular theory assumes that the particles of an ideal gas undergo elastic collisions with each other and with the walls of the container; but you will not find in any physics textbook, or even in advanced treatises, what happens when two gases with different molecular weights mix at the same temperature.
Let's see what conclusions can be inferred when a volume of helium (molar mass M_He = 4) mixes with a volume of xenon (M_Xe = 131). For simplicity, consider V_He = V_Xe = 1 m³ and a system temperature T = 298 K. From the experimental point of view, after opening the valve the gases mix without absorbing or releasing heat, and the temperature of the system remains the same. But are these experimental facts in agreement with kinetic molecular theory?

We consider the temperature of a gas to be a measure of the kinetic energy of its molecules, and the kinetic energy depends on the square of the molecular velocity.
At temperature T = 298 K, helium atoms and xenon atoms move at significantly different average speeds. These can be calculated with the formula u = √(3RT/M), where R is the universal gas constant, T is the temperature and M is the molar mass.

For He we have u_He = √(3 · 8.314 · 298 / 0.004) ≈ 1363 m/s; for Xe we have u_Xe = √(3 · 8.314 · 298 / 0.131) ≈ 238 m/s.

As can be seen, there is a major difference between the speeds of the helium atoms and the xenon atoms at the same temperature.
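These speeds follow from u = √(3RT/M) and can be checked with a few lines of Python (a minimal sketch; molar masses in kg/mol):

```python
import math

R = 8.314   # J/(mol*K)
T = 298.0   # K

def u_rms(M):
    """Root-mean-square speed, m/s, for a gas of molar mass M (kg/mol)."""
    return math.sqrt(3 * R * T / M)

u_He = u_rms(0.004)   # ~1363 m/s for helium
u_Xe = u_rms(0.131)   # ~238 m/s for xenon
```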
When the valve is open and the gases start to mix, and even after equilibrium is reached, there will be collisions between helium and xenon atoms, as well as collisions of individual xenon or helium atoms with the container walls.
As a result of collisions between atoms and walls, the xenon or helium speeds remain constant (the wall can be considered of infinite mass by comparison with the mass of the atoms).
However, helium atoms can collide with the xenon atoms inside the container. Kinetic molecular theory gives us a specific quantity, the mean free path, which estimates the distance traveled by such an atom before a collision takes place, and it can be shown that such collisions have a high frequency.
To simplify the situation, we consider only head-on (frontal) collisions between the He and Xe atoms; in the case of real collisions, which can take place at any angle of incidence, the situation is mathematically more difficult to treat, but the conclusions are the same.
Denoting by u1 and u2 the speeds before the collision, by v1 and v2 the speeds after the collision, and by m1 and m2 the masses of the bodies participating in the collision, we can apply the laws of conservation of momentum and energy in order to calculate the final velocities.
Solving this system of equations with respect to v1 and v2 yields:
v1 = [(m1 − m2)·u1 + 2·m2·u2] / (m1 + m2);   v2 = [(m2 − m1)·u2 + 2·m1·u1] / (m1 + m2)

In our case, having u1 = 1363 m/s, u2 = −238 m/s, m1 = 4 and m2 = 131, we finally get:

v1 ≈ −1744 m/s and v2 ≈ −143 m/s
Of course, the minus sign means a change in the direction of the velocity vector of the helium atom relative to the initial direction, considered positive.
As can be seen, the velocities of the atoms involved in the elastic collision change, and this will lead to the existence of two different temperatures, both different from the experimental one.
The temperature given by the helium atoms must be T_He = M_He·v1² / (3R) ≈ 488 K (about 215 °C), and the one given by the xenon atoms T_Xe = M_Xe·v2² / (3R) ≈ 108 K (about −165 °C).
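The head-on collision arithmetic above can be verified numerically; this sketch reproduces the final velocities and the two implied temperatures (masses in atomic mass units for the velocities, since only their ratio matters; molar masses in kg/mol for the temperatures):

```python
import math

R = 8.314  # J/(mol*K)

def elastic_1d(m1, u1, m2, u2):
    """1-D elastic collision: momentum and kinetic energy conserved."""
    v1 = ((m1 - m2) * u1 + 2 * m2 * u2) / (m1 + m2)
    v2 = ((m2 - m1) * u2 + 2 * m1 * u1) / (m1 + m2)
    return v1, v2

# He (m1 = 4) at 1363 m/s meets Xe (m2 = 131) at -238 m/s.
v1, v2 = elastic_1d(4.0, 1363.0, 131.0, -238.0)
# v1 ~ -1744 m/s (He rebounds), v2 ~ -143 m/s (Xe slows down)

def temp_from_speed(M, v):
    """Temperature implied by inverting u_rms = sqrt(3RT/M); M in kg/mol."""
    return M * v * v / (3 * R)

T_He = temp_from_speed(0.004, v1)   # ~488 K, about 215 C
T_Xe = temp_from_speed(0.131, v2)   # ~108 K, about -165 C
```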

These results are disastrous for the kinetic molecular theory. When mixing the two ideal gases, the energy of the particles should change depending on a whole range of factors, such as the molecular weights of the molecules involved in the collision, the collision angle, etc. Under these conditions, the current theoretical concept of temperature becomes meaningless, and of course the whole of thermodynamics, in the absence of this concept, is nonsense.
Not only the concept of temperature, but also the concept of gas diffusion is inconsistent with the experimental data.
Suppose that the pipe connecting the two containers has a length of at least 2 m and that the valve is fixed in the middle of the tube. When the valve is open, the helium atoms will tend to move toward the xenon tank and the xenon atoms will tend to move toward the helium tank. However, as a result of collisions between these species, helium atoms are returned to the same tank with a higher temperature, while the xenon atoms are able to advance along the connection tube into the helium reservoir. Therefore, a diffusion of the high-molecular-weight gas into the low-molecular-weight gas would have to be observed. This assumption is in clear conflict with the law of diffusion. Of course, besides contradicting the diffusion law, the actual model leads to major thermal effects that have never been observed experimentally. It is impossible, when mixing two gases both at ambient temperature, to arrive at different temperatures: one increasing its temperature to about 215 °C and the other decreasing it to about −165 °C.
In the proposed theory, as stated above, the entire kinetic molecular theory of gases is considered absurd and is removed from science. Additionally, the concept of entropy as currently defined is considered as absurd as the kinetic theory.

Toward the end, but just as important, it is necessary to compare the two definitions of entropy, based on the probability concept and on heat.
It is recognized that entropy characterizes each state of a thermodynamic system and is closely related to the thermodynamic probability (also called statistical weight) of the respective state. The relationship between the entropy of a state and the thermodynamic probability of that state was established by Boltzmann: S = k ln W, where k is a proportionality constant called the Boltzmann constant and W is the thermodynamic probability of the respective state.
On the other hand, entropy as defined by the second principle of thermodynamics can be written, for an infinitesimal reversible process, as dS = δQ/T, and for irreversible processes dS > δQ/T.

A gas consists of individual molecules with weak attractive forces between them. But for matter in liquid form, and especially in the solid state, the forces of attraction between the particles are strong. This makes a probabilistic distribution of particles impossible for such systems. For example, when a solid is formed as a result of a chemical reaction, its constituent particles are not free to move randomly and to take on a random ordering. There is a certain pattern of arrangement of the atoms in a solid. The simplest example would be the formation of solid ammonium chloride, when hydrochloric acid and ammonia gases react. For hydrochloric acid and ammonia in the gaseous state, it is acceptable that these molecules can have a random distribution and we can describe their behavior with statistical laws. However, once ammonium chloride is formed, no atom of this compound has freedom of movement; it is part of a macroscopic structure called a crystal. Nor is the ammonium chloride molecule as a whole free to move inside the crystal or to redistribute statistically. Therefore, the concept of entropy as a statistical weight is, for solids and liquids, more than a nonsense.
Equally absurd are the current interpretations of dissolution and other physico-chemical phenomena, which are considered to be driven by a state of maximum probability (increased entropy).
If we put a few grains of salt (NaCl) in a glass of water, the salt is observed to dissolve after a certain period of time, even in the absence of agitation. It is recognized that the dissolution occurs because the entropy of the system increases when the salt dissociates into ions and forms a solution. But this interpretation gives no indication about what happens to the entropy when salt is added in a larger amount, so as to exceed the solubility limit of sodium chloride (36 g per 100 ml of water at 25 °C). If we add 40 g of salt per 100 ml of water (25 °C), we note that only about 36 g of the salt dissolves and the remaining 4 grams stay solid, with or without stirring of the flask.
How should this simple fact be interpreted?
A ,,common sense" interpretation should admit that there is a limiting factor that prevents entropy from growing continuously. In the case presented, solubility is a deeper and more important factor than entropy, and we cannot speak about entropy without taking solubility into account.
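The salt example amounts to a saturation cap on dissolution. A toy Python sketch (the function name is illustrative; the 36 g per 100 ml figure is the solubility quoted in the text):

```python
def dissolved_mass(added_g, water_ml, solubility_g_per_100ml=36.0):
    """Mass of salt that actually dissolves, capped by saturation."""
    cap = solubility_g_per_100ml * water_ml / 100.0
    return min(added_g, cap)

in_solution = dissolved_mass(40.0, 100.0)   # 36.0 g dissolves
left_over = 40.0 - in_solution              # 4.0 g stays solid, stirred or not
```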
Of course, yet another simple experiment can demonstrate that solubility is more important than the tendency postulated by the second principle of thermodynamics.
It is well known that water and (edible) oil do not mix, due to their low mutual solubility. However, if an oil-water mixture is subjected to vigorous stirring, a dispersion of oil in water is formed, and it can be maintained as long as the stirring continues. This state of dispersion has a much higher degree of disorder compared to the initial state, when there were two separate immiscible liquids layered by density. However, as soon as the agitation is stopped, the system returns to the initial state, and we again have two immiscible layers separated by the difference in density. It seems that the law of solubility is more important than the entropy increase and the second principle of thermodynamics, because the system reverts to the initial state even though the entropy of the system decreases in that final state.