
Entropy: what is it? An explanation of the term in simple words


Definition

Entropy (from the ancient Greek for "turning, transformation") is a measure of the degree of disorder (chaos) in a system. It is used in the following exact and natural sciences:

  • In mathematics, it is the logarithm of the number of states available to a system;
  • In statistical science, it characterizes the probability of a given macroscopic state of the system occurring;
  • In thermodynamics (physics), it is the degree of irreversible dissipation of energy, i.e. the unavoidable losses that occur when a hotter body interacts with a colder one;
  • In computer science, it is the information capacity of a system. An interesting fact: Claude Shannon, who introduced the term into information theory, originally considered calling this quantity "information" (see the sketch after this list).
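
To make the information-theoretic meaning concrete, here is a minimal Python sketch (an illustration added here, not part of the original article) that computes Shannon entropy, H = −Σ p·log2(p), for a probability distribution:

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain (1 bit); a biased coin is more predictable.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.47
```

The more evenly the probabilities are spread, the higher the entropy; a certain outcome (probability 1) gives zero entropy.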

The complicated made simple

Entropy is a concept used in more than one area of human activity, so its definitions can seem somewhat vague. The term denotes a quantity whose essence is easiest to grasp through simple examples: entropy is the degree of disorder, uncertainty and disarray in a system.

High entropy can be pictured as a street strewn with scraps of paper. If the scraps are stacked neatly in a pile, the system is ordered and the entropy is low. To lower the entropy you have to spend a lot of effort: gather the scraps piece by piece and collect them into a pile.


The entropy of a closed system is just as easy to picture. A closed system can be imagined as a closed cabinet: if things are scattered inside it, they cannot be influenced from the outside, and the chaos in the cabinet will persist for a long time.

Over time the things will decompose, and this will eventually lead to order, but decomposition takes a long time: about 5 years for a woolen sock and about 40 years for leather shoes. In this example the cabinet acts as an isolated system, and the decomposition of the things in it is what slowly restores order to their structures.

Entropy is lowest for macroscopic objects, those that can be observed with the naked eye; the highest values are usually associated with a vacuum.

History of origin

The concept of entropy was first introduced during the development of thermodynamics, when the need arose to study the processes occurring inside thermodynamic bodies. In 1865, the German physicist Rudolf Clausius used the term to describe the state of a system in which heat can be transformed into other types of energy (mechanical, chemical, light, etc.).


An increase in entropy is caused by an influx of thermal energy into the system and depends on the temperature at which that influx occurs. The quantity was needed because all of physics is built on the idealization of abstract objects (the ideal pendulum, uniform motion, mass, etc.).

In the everyday sense, entropy is the degree of chaos and uncertainty of a system: the more order there is in a system, and the more its elements obey some order, the lower the entropy.

Example: a cabinet is a particular system. If all the things in it are in their places, the entropy is lower. If the things are scattered and not on their shelves, the entropy accordingly becomes higher.

Closely related to this term is the thermal function called enthalpy: it characterizes the state of a thermodynamic system in equilibrium when pressure, entropy and the number of particles are chosen as the independent variables.

The opposite of entropy is called extropy.

Types of entropy

The term is used in thermodynamics, economics, information theory, and even sociology. What does it describe in each of these areas?

In physical chemistry (thermodynamics)

The basic postulate of thermodynamics about equilibrium: any isolated thermodynamic system comes to an equilibrium state over time and cannot leave it spontaneously. In other words, every system strives toward its own equilibrium state, and in very simple terms that state is characterized by disorder.

Entropy is a measure of disorder. How do we quantify disorder? One way is to assign to each state the number of ways in which that state can be realized: the more such ways, the greater the entropy. The more organized a substance's structure, the lower its uncertainty (randomness).

The entropy change (ΔS) is equal to the change in the energy available to a substance or system during heat transfer at a given temperature. Its value is found by dividing the amount of heat transferred (Q) by the absolute temperature (T) at which the process takes place: ΔS = Q / T. This means that transferring a larger amount of heat increases ΔS; the same effect is observed when heat is transferred at a lower temperature.
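
As a rough numerical illustration of the formula above (a sketch added for this article, not from the original sources), the same amount of heat produces a larger entropy change when transferred at a lower temperature:

```python
def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Entropy change dS = Q / T for heat Q transferred at absolute temperature T."""
    return heat_joules / temperature_kelvin

# Transferring 1000 J of heat at two different temperatures:
print(entropy_change(1000, 300))  # ~3.33 J/K
print(entropy_change(1000, 600))  # ~1.67 J/K -- same heat, higher T, smaller entropy change
```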

In economics

Economics uses the concept of an entropy coefficient, which is used to study the level of market concentration and how it changes. The higher the coefficient, the higher the economic uncertainty and, consequently, the lower the probability that a monopoly will emerge. The coefficient also helps to indirectly assess the benefits a firm gains from possible monopoly activity or from changes in market concentration.
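
One common textbook formulation of such an entropy coefficient is the Shannon entropy of the firms' market shares; the sketch below assumes that formulation (the article itself does not give a formula):

```python
from math import log

def market_entropy(shares):
    """Entropy of market shares: E = -sum(s * ln(s)). Higher E means lower concentration."""
    return -sum(s * log(s) for s in shares if s > 0)

# Four equal competitors vs. one dominant firm:
print(market_entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.39 (maximum for four firms)
print(market_entropy([0.85, 0.05, 0.05, 0.05]))  # ~0.59 (close to monopoly)
```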

In statistical physics or information theory

Information entropy (uncertainty) is a measure of the unpredictability or uncertainty of a certain system. This value helps to determine the degree of randomness of the experiment or event being conducted. The greater the number of states in which the system can be, the greater the value of the uncertainty. All processes of ordering the system lead to the emergence of information and the reduction of information uncertainty.

Using information entropy, one can determine the channel capacity needed for reliable transmission of information (in a system of coded symbols). One can also partially predict the course of an experiment or event by splitting it into its component parts and computing the uncertainty of each of them. This method of statistical physics helps to estimate the probability of an event; with its help one can even decipher an encoded text by analysing the probabilities with which symbols appear and their entropy.

There is also the notion of the absolute entropy of a language. This value expresses the maximum amount of information that can be conveyed per unit of that language, where a single symbol of the language's alphabet is taken as the unit (so the entropy is measured in bits per symbol).
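
As an illustration of estimating entropy from symbol frequencies (a sketch with a made-up sample string, not from the original article):

```python
from collections import Counter
from math import log2

def entropy_per_symbol(text: str) -> float:
    """Estimate entropy in bits per symbol from the character frequencies of a text."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(entropy_per_symbol("abababab"))                 # 1.0 -- two equally likely symbols
print(entropy_per_symbol("entropy in simple words"))  # a few bits per symbol
```

The more repetitive and predictable the text, the lower its per-symbol entropy.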

In sociology

Here entropy (information uncertainty) characterizes how far a society (system) or its links deviate from an accepted (reference) state, which shows up as reduced efficiency in the development and functioning of the system and as a deterioration of its self-organization. A simple example: the employees of a company are so heavily loaded with work (producing a large number of reports) that they have no time for their main activity (carrying out inspections). In this example, information uncertainty serves as the measure of how inappropriately management uses the firm's labour resources.

How entropy manifests itself in our lives

With the help of entropy, you can explain many incomprehensible and surprising facts, for example:

Why is our life so extraordinary

Imagine the human body. The atoms that make up the body could have folded into an almost infinite number of variants and not create any form of life. From the point of view of mathematics, the likelihood of our existence is very small. And yet we exist.

In a universe where entropy rules everything, the existence of life with such a clear, stable organization is amazing.

Why we love art and beauty

Entropy can explain why art and beauty seem so aesthetically pleasing to us. The artist creates a special form of order and symmetry that the Universe most likely would never have generated on its own. The number of beautiful combinations is much less than the total number of all combinations. Beauty is a rarity in a universe full of disorder. Therefore, a symmetrical face is rare and beautiful, because there are incomparably more asymmetrical options.

Why do you need to create the ideal conditions for yourself?

Each of us has our own talents, skills and interests. But the society and culture in which we live were not created specifically for us. With entropy in mind, consider what are the chances that the environment in which you grew up is ideal for unleashing your talents?

It is extremely unlikely that life will create a situation for you that perfectly suits your abilities. Most likely, you will find yourself in a position that does not quite match your skills and needs.

We usually describe this state as “out of place”, “out of their element.” Naturally, in such conditions it is much more difficult to achieve success, to be useful, to win. Knowing this, we must ourselves create ideal living conditions for ourselves. Difficulties in life arise not because the planets are lined up so, and not because some higher forces have conspired against you. It’s just the law of entropy at work. There are many more states of disorder than ordered ones. With all this in mind, it’s not surprising that there are problems in life, but that we can solve them.

Equation and calculation of entropy

There are several ways to calculate entropy. But the two most common equations relate to reversible thermodynamic and isothermal processes (with constant temperature).
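
The article does not write these equations out; presumably they are the standard ones, which for a reversible process and for an isothermal expansion of an ideal gas read:

```latex
% Entropy change when heat Q_rev is transferred reversibly at absolute temperature T:
\Delta S = \frac{Q_{\mathrm{rev}}}{T}

% Entropy change of n moles of an ideal gas expanding isothermally from volume V_1 to V_2:
\Delta S = n R \ln\frac{V_2}{V_1}
```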

Entropy and heat death of the universe

Some scientists predict that the entropy of the universe will increase until the system becomes incapable of useful work and only heat energy remains. The universe, they say, will die a heat death.

However, other scientists dispute the heat death theory. They argue that the universe as a system is moving further and further away from a state of maximum entropy, even if the entropy inside some of its inner regions increases.

Others see the universe as part of an even larger system. Still others say that the possible states do not have equal probability. Therefore, the usual equations for calculating entropy are irrelevant.

The most common formulations of entropy in physics

Many famous physicists have tried to explain the concept of entropy in an accessible way for ordinary people. Let’s highlight the 3 most famous formulations of the explanation.

Clausius’ statement

A body at a lower temperature cannot heat a body at a higher temperature.

For example: you can put a kettle of water on a piece of ice (the water is a priori warmer than the ice), but you cannot expect the water to boil, even though the first law of thermodynamics alone does not forbid this possibility.

Thomson’s formulation

In a closed system, no process is possible whose only result is work done at the expense of thermal energy received from a single body.

Boltzmann’s statement

A decrease in entropy in a closed system is impossible.

This wording causes a lot of controversy, although everything is intuitively clear. Chaos will grow in an abandoned dwelling – dust will settle, some things will fall apart. You can put things in order, but only by applying external energy, that is, the work of a cleaner.

The problem is that, in modern conceptions, the Universe is a closed system. It formed some 14-15 billion years ago. Over that time its entropy should have caused the galaxies to disintegrate, the stars to go out, and no new stars to appear at all. Yet our Sun is no more than 5 billion years old, and the Universe as a whole has not fallen into a state of chaos.

Entropy: thesis and examples

An example. The T9 predictive-text program. If a word contains only a few typos, the program easily recognizes it and suggests a replacement. The more typos there are, the less information the program has about the word being entered. Consequently, an increase in confusion leads to an increase in information uncertainty, and vice versa: the more information, the less uncertainty.

Example. Dice. There is only one way to roll a total of 12 or of 2: 1 plus 1, or 6 plus 6. The total with the most ways of being rolled is 7 (6 possible combinations), so the unpredictability of rolling a seven is the greatest.
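
A small sketch (added for illustration, not part of the original article) that counts how many ordered outcomes of two dice produce each total and confirms that 7 is the least predictable:

```python
from collections import Counter

# Count the ordered (die1, die2) pairs that produce each total.
ways = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

print(ways[2], ways[12], ways[7])  # 1 1 6 -- totals 2 and 12 have one way each, 7 has six
print(max(ways, key=ways.get))     # 7 -- the total with the most microstates
```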

  • In a general sense, entropy (S) can be understood as a measure of energy distribution. At a low value of S, the energy is concentrated, and at a high value, it is distributed chaotically.

Example. H2O (ordinary water) in its liquid state has greater entropy than in its solid state (ice), because in a crystalline solid each atom occupies a definite position in the crystal lattice (order), while in a liquid the atoms have no fixed positions (disorder). That is, a body with a more rigid arrangement of atoms has a lower entropy (S). A pure diamond without impurities has the lowest S value among crystals.

Example. A molecule is in a vessel that has a left and a right half. If it is not known in which half of the vessel the molecule is located, the entropy is maximal: S = S max = k ln W, where k is Boltzmann's constant and W is the number of equally probable parts of the vessel (here W = 2); the information in this case is zero, I = I min = 0. If it is known exactly in which half the molecule is located, then S = S min = k ln 1 = 0, and I = I max = log2 W. Therefore, the more information there is, the lower the information uncertainty.

Example. The more order there is on a desk, the more information you can learn about the things on it. Here, ordering the objects reduces the entropy of the "desk" system.

Example. There is more information about a class during a lesson than during a break. During the lesson the entropy is lower, since the students are seated in an orderly manner (there is more information about each student's location). During breaks the students' positions change chaotically, which increases the entropy.

Example. When an alkali metal reacts with water, hydrogen is released. Hydrogen is a gas; since gas molecules move chaotically and have high entropy, the reaction proceeds with an increase in entropy.

From everyday life:

  1. When writing sms messages on a mobile phone, we often use the T9 program. The fewer errors in the word we are typing, the easier it will be to recognize it by the program and the faster it will offer us its replacement. Conclusion: the more confusion, the greater the information uncertainty.
  2. When we roll two dice when playing dice, there is only one way to roll a combination of 2 or 12 (1 and 1, 6 and 6). The maximum number of ways to roll out the number 7 (6 possible combinations). Unpredictability in this case will be maximum.
  3. There is more information about the number of students during the lesson than during the break. Since in the lesson each student sits in his place, the entropy is lower. Outside the classroom, the movement of schoolchildren is characterized by randomness, which leads to an increase in the value of entropy.
  4. If you clean up the work desk, put objects in their places, then you can get more information about this or that object on it. The orderliness of things on the desk reduces the amount of entropy.

Important! Everything around us tends toward increasing entropy, while a person strives to extract the maximum amount of information from the surrounding world. All theoretical directions in the study of entropy (in physics, chemistry, economics, mathematics, sociology) aim to establish a balance between the intentions and desires of people and the natural processes occurring in nature.

Entropy: what is it in simple words

The Russian language, like any other, is constantly changing under the pressure of constant technological borrowing and cooperation with other states. Thanks to this, our language is rich in various foreign language borrowings.

One of the relatively new words in the Russian language is "entropy": many of us have come across it, but not everyone understands what it really means.

What is entropy in simple words

Most often, the word "entropy" is encountered, of course, in classical physics. It is one of the most difficult concepts of that science, so even physics students often struggle to grasp the term.

Let's explain it with an example: suppose I tell you that I live in Moscow. That is a fairly specific coordinate – the capital of the Russian Federation – but Moscow is a rather big city, so you still do not know my exact location. When I then tell you, say, my postal code, the entropy about me as an object decreases.

This is not a perfectly accurate analogy, so let's give one more example for clarity. Suppose we take ten six-sided dice, throw them all in turn, and then I tell you only the total of the results: thirty.

Knowing only the sum, you cannot say for sure which number came up on which die – you simply do not have enough data. In our case each individual result is what a physicist would call a microstate, and the total of thirty, in the same physical dialect, is the macrostate.

If we count how many possible microstates can produce a total of thirty, it turns out there are almost three million of them. Using a special formula (the logarithm of the number of microstates), we can compute the entropy of this probabilistic experiment: about six and a half.

Where does the half come from, you might ask? The fractional part appears because the number of microstates, about 2.9 million, is not an exact power of ten: it lies between 10^6 and 10^7, so its decimal logarithm falls between 6 and 7.
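
A short sketch (an illustration added here, under the assumption that the "entropy index" above is the base-10 logarithm of the number of microstates) that reproduces these numbers:

```python
from math import log10

def count_microstates(n_dice: int, total: int) -> int:
    """Count ordered outcomes of n six-sided dice that sum to the given total (dynamic programming)."""
    ways = [1] + [0] * total              # with zero dice only the sum 0 is reachable
    for _ in range(n_dice):
        new_ways = [0] * (total + 1)
        for s, w in enumerate(ways):
            if w:
                for face in range(1, 7):
                    if s + face <= total:
                        new_ways[s + face] += w
        ways = new_ways
    return ways[total]

w = count_microstates(10, 30)
print(w)          # 2930455 -- "almost three million" microstates
print(log10(w))   # ~6.47   -- the "six and a half" entropy index
```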

The modern word “entropy” has Greek roots, therefore, because of the translation, it is often called the “measure of chaos.” Let’s say you decide to have a feast in your apartment on the occasion of your little daughter’s birthday.

You cleaned the entire apartment, washed the floors and windows, scrubbed the dishes, and then laid everything out beautifully and elegantly on the table. The initial household chaos of your apartment has decreased significantly, so your home has become a system with low entropy.

Entropy in the Universe

According to the forecasts of astrophysicists, one of the options for the development of the Universe is heat death.

Our universe is (imagine how far-sighted the ancient Greeks were in this regard) sheer chaos, in which something is constantly happening: stars are born and die, new galaxies are formed, in short, beauty! At one fine moment, the entropy of the Universe will reach its maximum and there will be simply nothing to happen in it. So much for death from idleness.

Chaos permeates the entire cosmos, our entire nature, down to atoms and elementary particles. Everything is in constant motion and interaction, like a perfectly worked mechanism. And all these processes are governed by laws that we, miserable people, can express in no less beautiful mathematical language.

But how, with such a level of entropy (that is, chaos) in the Universe, could anything have arisen at all? The answer is quite simple: all matter transfers its entropy to its environment, to everything it can reach.

For example, to regulate the level of entropy on Earth, a star named the Sun constantly supplies us with energy, which it produces through the incessant thermonuclear reactions in its interior.

If our planet were a closed system, then, according to the second law of thermodynamics, the entropy inside it could only increase, but, as you already understood, the Sun allows us to keep the level of the Earth’s entropy normal.

Entropy and chaos permeate everything that surrounds us and even what is inside us. In gases and liquids entropy plays a key role, and even our momentary desires and impulses are in fact nothing more than a product of universal chaos.

It is not difficult to come once again to the most beautiful conclusion: the Universe, no matter how huge it may be, is a collection of an infinite number of particles of the most diverse sizes and no less diverse properties.

Everything in it, from an elementary boson to Alpha Centauri and entire galaxies, is connected by invisible threads. Such discoveries of physicists are striking not only in their complexity, but also in their beauty.

It would seem that the ordinary mathematical and physical formulas on the blackboards of unshaven, pensive men with glasses are key to our knowledge of ourselves and our place in the vast Universe.

We hope that this article helped you clarify what entropy really is, in which cases this word is used, and also what the discovery of this indicator led scientists and philosophers to.

Who knows, maybe reading this article will inspire you to purposefully study this wonderful science – physics. One way or another, it is vitally necessary for a modern person to be interested in science, at least for his own development.

Finally

If we put all of the above together, entropy turns out to be a measure of the disorder or uncertainty of a system and its parts. An interesting fact: everything in nature tends toward maximum entropy, while man strives for maximum information.

