THE MINISTRY OF EDUCATION AND SCIENCE OF THE KYRGYZ REPUBLIC
KYRGYZ NATIONAL UNIVERSITY
named after J. BALASAGYN
FACULTY: MATHEMATICS, COMPUTER SCIENCE AND CYBERNETICS
DEPARTMENT: MATHEMATICS, COMPUTER SCIENCE AND COMPUTER TECHNOLOGY
INDEPENDENT WORK
Theme: Entropy problems
Checked by: Kalenderova T.
Done by: Kriuchkova K.
Group: MCHM-1-22
Bishkek 2022
Contents
Abstract
Introduction
Entropy
History
References
Abstract
Earlier papers raised the question of minimizing the entropy increment when
certain goals are achieved. As an application of the theory of partial
differential equations, this paper considers the problem of the minimum entropy
increment when the straightening of a plastic rod depends on time. Based on the
analysis of the processes discussed in this paper, hypotheses on entropy have
been put forward. An accurate estimate of the entropy increment is obtained for
unlimited braking of a point, and, based on it, an accurate estimate of the
entropy increment when a plastic rod is straightened with unlimited braking.
The problem of the minimum entropy increment in straightening a plastic rod as
a function of time has been solved.
Introduction
The concept of entropy is interesting and meaningful. Many have focused on
the grim picture of growing disorder and thermal death that characterizes
equilibrium thermodynamics and isolated systems. In this regard, entropy is used
to indicate and measure the decrease in the availability of high-quality resources
(i.e. resources with low entropy and high free energy), the increase in pollution due
to the release of waste, chemicals and heat into the environment, the growth of
social disorder due to the deterioration of living conditions in megacities around
the world, the "collapse" of the economy, etc.
Others identify entropy with the style of nature: since the biosphere is an
open system supported by a constant influx of solar energy, its structures and life
phenomena undergo a continuous process of self-organization. Geological
processes, atmospheric systems, ecosystems, and societies are interconnected
through a series of infinitely different and changing relationships, each receiving
energy and materials from the other, returning them, and acting through feedback
mechanisms to self-organize the whole in a grand interaction of space, time,
energy and information. During this process of self-organization, entropy is generated and
then released into outer space. Living structures do not violate the second law, but
are fed by input sources that continuously supply low-entropy material and energy
(free energy) for the development of the system.
While the first point of view requires increased attention to preventing the
misuse of resources and the degradation of both the natural and the human
environment (i.e. the accumulation of disordered materials), the second
requires adaptation to the style of nature, in which order is sustained by the
influx of resources from outside, recognizing the existence of fluctuations
(growth and decline) and resource constraints, within which, however, many
variations and new models are possible. Both points of view are interesting,
and both stimulate and illustrate the richness of the concept of entropy. The
latter, first introduced in the field of energy conversion, very soon acquired
citizenship in a number of other fields.
Many experiments have shown that heat passes from the warmer components
of a system to the colder ones, but not vice versa. This paper considers the
problem of minimizing the increase in entropy when certain goals are achieved,
and the problem of developing a general methodology for investigating and
proving the existence, and establishing the properties, of solutions to
nonlinear partial differential and integro-differential equations; on this
basis, hypotheses about entropy are put forward. Such a technique is needed to
obtain quantitative lower bounds for the increase in entropy in almost closed
systems described by differential equations with control. A lower estimate is
obtained for the increment of entropy under a controlled transformation of an
extended object as a function of time. The second law of thermodynamics and
hypotheses about the lower estimate for the increase in entropy are presented.
The practical significance of this law lies, in particular, in the fact that
the increase in entropy corresponds, in a certain sense, to the concept of
environmental pollution.
Entropy
Entropy is the state function of a thermodynamic system. There is no
concept of the absolute value of entropy. In any process, only the magnitude of
its change can be determined.
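Written out (this is the Clausius definition recalled later in the text), the change between two states 1 and 2 along a reversible path is

ΔS = S₂ − S₁ = ∫ δQ / T,

where δQ is an infinitesimal amount of heat received by the system and T is the absolute temperature at which it is received; only this difference, not an absolute value of S, enters the calculations of classical thermodynamics.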
In thermodynamics, entropy measures the degree of energy dissipation,
i.e. the conversion into thermal energy of any other form of energy contained
in the system. Any thermodynamic system isolated from the external world tends
to equalize the temperatures of all its parts, i.e. toward the maximum increase
of its entropy. A system that was in a non-equilibrium thermal state passes to
an equilibrium one when the heat-transfer processes die out.
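A minimal worked example of this tendency: if a small quantity of heat Q passes from a hot part at temperature T_hot to a cold part at temperature T_cold, the total entropy change of the system is

ΔS = Q/T_cold − Q/T_hot = Q (1/T_cold − 1/T_hot) > 0,

which is positive precisely because T_hot > T_cold; the reverse flow would give a negative ΔS, which the second law forbids.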
In statistical physics, entropy is interpreted as a measure of the probability
of a system being in a given state: the greater the disorder, the greater the
entropy. Any system gradually moves to its most probable state; in the process,
disorder in it increases, chaos grows, and hence entropy increases.
The concept of entropy is used in chemistry, biology, computer science,
etc.
Examples of an increase in entropy are well known. These include: the
processes of cooling and heating of objects until the heat flows die out;
destruction of any kind; gardens becoming overgrown with weeds; the loss of
information on a computer's hard drive under the influence of viruses; and even
the growth of domestic chaos, which requires us to periodically clean the
apartment.
It is generally accepted in science that, apart from the initial stages of the
formation of the Universe, up to the formation of galaxies, entropy increases
in all natural processes in inanimate nature.
Living objects, producing their own kind, order the surrounding inanimate
matter, building living organisms out of it. Reasonable human activity most
often leads to the creation of improbable states of matter; these include
almost all the products, works, structures, and so on that humans create.
Entropy is a scientific concept, as well as a measurable physical property,
that is most commonly associated with a state of disorder, randomness, or
uncertainty. The term and the concept are used in diverse fields, from classical
thermodynamics, where it was first recognized, to the microscopic description of
nature in statistical physics, and to the principles of information theory. It has
found far-ranging applications in chemistry and physics, in biological systems and
their relation to life, in cosmology, economics, sociology, weather science, climate
change, and information systems including the transmission of information
in telecommunication.
The thermodynamic concept was referred to by Scottish scientist and
engineer William Rankine in 1850 with the names thermodynamic
function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the
leading founders of the field of thermodynamics, defined it as the quotient of an
infinitesimal amount of heat to the instantaneous temperature. He initially
described it as transformation-content, in German Verwandlungsinhalt, and later
coined the term entropy from a Greek word for transformation. Referring to
microscopic constitution and structure, in 1862, Clausius interpreted the concept as
meaning disgregation.
A consequence of entropy is that certain processes are irreversible or
impossible, aside from the requirement of not violating the conservation of energy,
the latter being expressed in the first law of thermodynamics. Entropy is central to
the second law of thermodynamics, which states that the entropy of isolated
systems left to spontaneous evolution cannot decrease with time, as they always
arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of
the number of possible microscopic arrangements or states of individual atoms and
molecules of a system that comply with the macroscopic condition of the system.
He thereby introduced the concept of statistical disorder and probability
distributions into a new field of thermodynamics, called statistical mechanics, and
found the link between the microscopic interactions, which fluctuate about an
average configuration, to the macroscopically observable behavior, in form of a
simple logarithmic law, with a proportionality constant, the Boltzmann constant,
that has become one of the defining universal constants for the
modern International System of Units (SI).
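The simple logarithmic law mentioned here is the Boltzmann entropy formula

S = k_B ln W,

where W is the number of microstates compatible with the macroscopic state of the system and k_B ≈ 1.380649 × 10⁻²³ J/K is the Boltzmann constant, whose value is fixed exactly in the modern SI.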
In 1948, Bell Labs scientist Claude Shannon applied similar statistical
concepts of measuring microscopic uncertainty and multiplicity to the problem
of random losses of information in telecommunication signals. Upon John von
Neumann's suggestion, Shannon named this entity of missing information in
analogous manner to its use in statistical mechanics as entropy, and gave birth to
the field of information theory. This description has been identified as a universal
definition of the concept of entropy.
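In Shannon's formulation, for a source emitting symbols with probabilities p₁, …, pₙ the entropy is

H = − Σᵢ pᵢ log₂ pᵢ (bits per symbol);

a fair coin (p₁ = p₂ = 1/2) gives H = 1 bit, while a certain outcome (some pᵢ = 1) gives H = 0, the same logarithmic form as in statistical mechanics up to the choice of constant and base.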
History
In his 1803 paper, Fundamental Principles of Equilibrium and Movement,
the French mathematician Lazare Carnot proposed that in any machine, the
accelerations and shocks of the moving parts represent losses of moment of
activity; in any natural process there exists an inherent tendency towards the
dissipation of useful energy. In 1824, building on that work, Lazare's son, Sadi
Carnot, published Reflections on the Motive Power of Fire, which posited that in
all heat-engines, whenever "caloric" (what is now known as heat) falls through a
temperature difference, work or motive power can be produced from the actions of
its fall from a hot to cold body. He used an analogy with how water falls in a water
wheel. That was an early insight into the second law of thermodynamics.[5] Carnot
based his views of heat partially on the early 18th-century "Newtonian hypothesis"
that both heat and light were types of indestructible forms of matter, which are
attracted and repelled by other matter, and partially on the contemporary views
of Count Rumford, who showed in 1789 that heat could be created by friction, as
when cannon bores are machined.[6] Carnot reasoned that if the body of the
working substance, such as a body of steam, is returned to its original state at the
end of a complete engine cycle, "no change occurs in the condition of the working
body".
The first law of thermodynamics, deduced from the heat-friction
experiments of James Joule in 1843, expresses the concept of energy, and
its conservation in all processes; the first law, however, is unsuitable to separately
quantify the effects of friction and dissipation.
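In the usual sign convention the first law reads

ΔU = Q − W,

i.e. the change in internal energy equals the heat added to the system minus the work done by it; this fixes the energy balance of a process but, as noted, cannot by itself distinguish useful work from frictional dissipation or indicate the direction in which a process runs.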
In the 1850s and 1860s, German physicist Rudolf Clausius objected to the
supposition that no change occurs in the working body, and gave that change a
mathematical interpretation, by questioning the nature of the inherent loss of
usable heat when work is done, e.g., heat produced by friction.[7] He described his
observations as a dissipative use of energy, resulting in a transformation-content (Verwandlungsinhalt in German) of a thermodynamic system or working
body of chemical species during a change of state.[7] That was in contrast to earlier
views, based on the theories of Isaac Newton, that heat was an indestructible
particle that had mass. Clausius discovered that the non-usable energy increases as
steam proceeds from inlet to exhaust in a steam engine. From the prefix en-, as in
'energy', and from the Greek word τροπή [tropē], which is translated in an
established lexicon as turning or change[8] and that he rendered in German
as Verwandlung, a word often translated into English as transformation, in 1865
Clausius coined the name of that property as entropy.[9] The word was adopted into
the English language in 1868.
Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs,
and James Clerk Maxwell gave entropy a statistical basis. In 1877, Boltzmann
visualized a probabilistic way to measure the entropy of an ensemble of ideal
gas particles, in which he defined entropy as proportional to the natural
logarithm of the number of microstates such a gas could occupy.
The proportionality constant in this definition, called the Boltzmann constant, has
become one of the defining universal constants for the modern International
System of Units (SI). Henceforth, the essential problem in statistical
thermodynamics has been to determine the distribution of a given amount of
energy E over N identical systems. Constantin Carathéodory, a Greek
mathematician, linked entropy with a mathematical definition of irreversibility, in
terms of trajectories and integrability.
References
1. Wehrl, Alfred (1 April 1978). "General properties of entropy". Reviews of
Modern Physics. 50 (2): 221–260. Bibcode:1978RvMP...50..221W.
doi:10.1103/RevModPhys.50.221.
2. Truesdell, C. (1980). The Tragicomical History of Thermodynamics, 1822–1854.
New York: Springer-Verlag. p. 215. ISBN 0387904034.
3. Brush, S. G. (1976). The Kind of Motion We Call Heat: a History of the
Kinetic Theory of Gases in the 19th Century, Book 2, Statistical Physics and
Irreversible Processes. Amsterdam: Elsevier. pp. 576–577. ISBN 0-444-87009-1.
4. Ben-Naim, Arieh (2008). A Farewell to Entropy: Statistical Thermodynamics
Based on Information. Singapore: World Scientific Publishing. ISBN 9789812707062.
5. "Carnot, Sadi (1796–1832)". Wolfram Research. 2007. Retrieved 24 February 2010.
6. McCulloch, Richard S. (1876). Treatise on the Mechanical Theory of Heat and
its Applications to the Steam-Engine, etc. D. Van Nostrand.
7. Clausius, Rudolf (1850). "Über die bewegende Kraft der Wärme und die
Gesetze, welche sich daraus für die Wärmelehre selbst ableiten lassen".
Annalen der Physik. 155 (3): 368–397. Bibcode:1850AnP...155..368C.
doi:10.1002/andp.18501550306.