Meaning of Entropy
What is Entropy:
Entropy is the natural tendency of a system to lose order. The word comes from the Greek ἐντροπία (entropía), which literally means 'turning', although today it is used in various figurative senses.
The term entropy was coined by the German physicist Rudolf Clausius when he observed that, in any irreversible process, a small amount of thermal energy always escapes across the boundary of the system. Since then, the term has been used in the most varied disciplines, such as physics, chemistry, mathematics, astrophysics, linguistics, computing, and ecology, to refer to the measure of disorder toward which a system tends.
Thus, for example, in physics, entropy refers to the degree of irreversibility reached by a thermodynamic system after a process that involves the transformation of energy. In chemistry, it refers to the entropy change associated with the formation of a chemical compound. In astrophysics, it refers to the entropy of black holes. In information theory, entropy is the degree of uncertainty associated with a set of data. In computing, it refers to the randomness collected by an operating system or an application for use in cryptography.
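The information-theoretic sense mentioned above can be made concrete with Shannon's formula, H = -Σ p·log₂(p), which measures the uncertainty of a data set in bits. The following is a minimal sketch (the function name and sample strings are illustrative, not from the source):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two equally likely symbols carry one bit of uncertainty per symbol;
# a constant sequence carries none.
print(shannon_entropy("abab"))  # → 1.0
print(shannon_entropy("aaaa"))  # → 0.0
```

A uniform distribution over symbols maximizes this quantity, mirroring the thermodynamic idea that disorder tends toward a maximum.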
Entropy in thermodynamics
In thermodynamics, entropy is the physical quantity that measures the part of the energy that cannot be used to do work and is, consequently, lost. In an isolated system, this quantity always tends to grow over the course of any naturally occurring process. In this sense, entropy describes the irreversibility of thermodynamic systems. For example, when an ice cube is placed in a glass of water at room temperature, the cube melts after a few minutes while the surrounding water cools, until both reach thermal equilibrium. This happens because the universe tends to distribute energy uniformly, that is, to maximize entropy.
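The entropy gained in the ice-cube example can be estimated with the classical relation ΔS = Q/T, which holds for a reversible phase change at constant temperature. A minimal sketch, using the standard latent heat of fusion of ice (the function name and the 20 g mass are illustrative assumptions):

```python
# Entropy change for melting ice at 0 °C, via ΔS = Q / T.
LATENT_HEAT_FUSION = 334.0   # J per gram of ice (standard value)
T_MELT = 273.15              # melting point of ice, in kelvin

def melting_entropy(mass_g):
    """Entropy gained by mass_g grams of ice as it melts, in J/K."""
    q = mass_g * LATENT_HEAT_FUSION  # heat absorbed from the warmer water
    return q / T_MELT

# A 20 g ice cube gains roughly 24.5 J/K as it melts.
print(round(melting_entropy(20.0), 1))  # → 24.5
```

The water that supplies this heat loses entropy too, but because it is warmer (larger T in Q/T), it loses less than the ice gains, so the total entropy of the glass still increases.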
Negative entropy, or negentropy, is the entropy that a system exports in order to keep its own entropy low. To compensate for the degradation process to which every system is subject over time, some open systems manage to keep their entropy in check thanks to contributions from the other subsystems with which they interact. In this way, in an open system, negative entropy provides a resistance, sustained by the associated subsystems, that allows the system to rebalance itself, unlike a closed system, in which the entropic process cannot stop by itself.