History of the development of probability theory

Libert Elena

Gambling excitement and the desire to get rich gave impetus to the emergence of a new and extremely important mathematical discipline: probability theory. Mathematicians of such stature as Pascal, Fermat, and Huygens took part in developing its foundations.


MBOU secondary school No. 8, Yartsevo, Smolensk region

Mathematics project:

"The history of the emergence of probability theory"

Prepared by: 11th grade student

secondary school No. 8 Libert Elena

Head: mathematics teacher

Borisenkova Olga Vladimirovna

Yartsevo, 2015

History of the emergence of probability theory

Medieval Europe and the beginning of the modern era

17th century: Pascal, Fermat, Huygens

18th century

19th century. General trends and criticism

Application of probability theory in the 19th-20th centuries

  1. Astronomy
  2. Physics
  3. Biometrics
  4. Agriculture
  5. Industry
  6. Medicine
  7. Bioinformatics
  8. Economics and banking

History of the emergence of probability theory

A French nobleman, a certain Monsieur de Mere, was a passionate dice gambler who badly wanted to get rich. He spent a great deal of time trying to discover the secret of the dice and invented various versions of the game, assuming that in this way he would acquire a large fortune. For example, he proposed throwing a die four times in a row and bet his partner that at least one six would come up; if no six appeared in the four throws, his opponent won.
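A modern check (not part of de Mere's own reasoning, but standard today) shows why this bet favored him: the probability of at least one six in four throws exceeds one half.

```latex
P(\text{at least one 6}) = 1 - \left(\tfrac{5}{6}\right)^{4}
= 1 - \tfrac{625}{1296} \approx 0.518 > \tfrac{1}{2}.
```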

At that time, the branch of mathematics that we today call probability theory did not yet exist, so, in order to check whether his assumptions were correct, Monsieur de Mere turned to his friend, the famous mathematician and philosopher B. Pascal, with a request to study two famous questions, the first of which he had tried to solve himself. The questions were:

How many times must two dice be thrown so that getting a double six at least once is more likely than not?

How to fairly divide the money bet by two players if for some reason they stopped the game prematurely?

Pascal not only became interested in this, but also wrote a letter to the famous mathematician P. Fermat, which prompted Fermat to study the general laws of the game of dice and the probability of winning.

Thus, gambling excitement and the thirst for riches gave impetus to the emergence of a new, extremely significant mathematical discipline: probability theory. Mathematicians of such caliber as Pascal and Fermat, Huygens (1629-1695), who wrote the treatise "On Calculations in Games of Chance", Jacob Bernoulli (1654-1705), de Moivre (1667-1754), Laplace (1749-1827), Gauss (1777-1855) and Poisson (1781-1840) took part in its development. Nowadays, probability theory is used in almost all branches of knowledge: statistics, weather forecasting, biology, economics, technology, construction, and so on.

Medieval Europe and the beginning of the modern era

The first problems of a probabilistic nature arose in various games of chance: dice, cards, and so on. The 13th-century French canon Richard de Fournival correctly calculated all possible sums of points after throwing three dice and indicated the number of ways in which each of these sums could be obtained. This number of ways can be regarded as the first numerical measure of the expectedness of an event, analogous to probability. Before Fournival, and sometimes after him, this measure was often calculated incorrectly, by assuming, for example, that the sums of 3 and 4 points are equally probable, since each can be obtained "in only one way": as "three ones" or as "a two with two ones" respectively. This overlooks the fact that three ones are indeed obtained in only one way, 1+1+1, while a two with two ones is obtained in three ways: 1+1+2, 1+2+1, 2+1+1, so these events are not equally likely. Similar errors occurred repeatedly in the subsequent history of science.
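Fournival's count is easy to verify by direct enumeration; the following sketch (a modern illustration, not part of the historical argument) lists the ordered outcomes of three dice:

```python
from itertools import product
from collections import Counter

# Count the ordered ways each sum of three dice can occur.
ways = Counter(sum(dice) for dice in product(range(1, 7), repeat=3))

print(ways[3])             # 1   -> only 1+1+1
print(ways[4])             # 3   -> 1+1+2, 1+2+1, 2+1+1
print(sum(ways.values()))  # 216 ordered outcomes in total
```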

The extensive mathematical encyclopedia "Sum of Arithmetic, Geometry, Relations and Proportions" by the Italian Luca Pacioli (1494) contains original problems on the topic: how to divide the stake between two players if a series of games is interrupted early. An example of such a problem: the game goes up to 60 points and the winner receives the entire stake of 22 ducats; during the game the first player scored 50 points and the second 30, and then the game had to be stopped; the original stake must be divided fairly. The solution depends on what is meant by a "fair" division; Pacioli himself proposed dividing in proportion to the points scored (55/4 and 33/4 ducats); his solution was later shown to be erroneous.

[Figure: distribution of the sum of points after throwing two dice]

The major algebraist of the 16th century, Gerolamo Cardano, devoted a substantial monograph to the analysis of the game, "The Book of the Game of Dice" (written around 1564, published posthumously in 1663). Cardano carried out a complete and error-free combinatorial analysis of the values of the sum of points and indicated, for various events, the expected proportion of "favorable" outcomes: for example, when throwing three dice, the proportion of cases in which the values of all three dice coincide is 6/216, or 1/36. Cardano also made an insightful remark: in a small number of games the actual number of occurrences of the event studied can differ greatly from the theoretical one, but the more games in the series, the smaller the relative size of this difference. In essence, Cardano came close to the concept of probability:

So, there is one general rule for calculation: one must take into account the total number of possible outcomes and the number of ways in which these outcomes can appear, and then find the ratio of the latter number to the number of remaining possible outcomes.

Another Italian algebraist, Niccolo Tartaglia, criticized Pacioli's approach to the problem of dividing the stake: after all, if one of the players has not yet scored a single point, Pacioli's algorithm gives the entire stake to his opponent, which can hardly be called fair, since the player who is behind still has some chance of winning. Cardano and Tartaglia proposed their own (different) methods of division, but these methods were later also judged unsuccessful.

Galileo Galilei also studied this topic, writing the treatise "On the Outcome of Points in the Game of Dice" (written around 1620, published posthumously in 1718). Galileo's presentation of the theory of the game is distinguished by its exhaustive completeness and clarity. In his main book, the "Dialogue Concerning the Two Chief World Systems, Ptolemaic and Copernican", Galileo also pointed out the possibility of estimating the error of astronomical and other measurements, stating that small measurement errors are more likely than large ones, that deviations in both directions are equally probable, and that the average result should be close to the true value of the measured quantity. These qualitative considerations became the first ever anticipation of the normal distribution of errors.

17th century: Pascal, Fermat, Huygens

In the 17th century, a clear understanding of the problems of probability theory began to form, and the first mathematical (combinatorial) methods for solving probabilistic problems appeared. The founders of the mathematical theory of probability were Blaise Pascal and Pierre Fermat.

Shortly before, the amateur mathematician Chevalier de Mere had approached Pascal with the dice problem: how many times must two dice be thrown so that betting on getting a double six at least once is profitable? Pascal and Fermat entered into correspondence with each other over this problem and related questions (1654). In the course of this correspondence the scientists discussed a number of problems connected with probabilistic calculations; in particular, they considered the old problem of dividing the stake, and both came to the conclusion that the stake must be divided according to the remaining chances of winning. Pascal pointed out to de Mere the mistake he had made in solving the problem: de Mere incorrectly identified equally probable events and obtained the answer of 24 throws; Pascal gave the correct answer, 25 throws.
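In modern notation (a standard reconstruction, not spelled out in the project), the probability of at least one double six in n throws of two dice is 1 - (35/36)^n, and

```latex
1 - \left(\tfrac{35}{36}\right)^{24} \approx 0.491 < \tfrac{1}{2},
\qquad
1 - \left(\tfrac{35}{36}\right)^{25} \approx 0.506 > \tfrac{1}{2},
```

so 25 is the smallest number of throws for which the bet is favorable.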

Pascal in his works far advanced the use of combinatorial methods, which he systematized in his book “Treatise on the Arithmetic Triangle” (1665). Based on a probabilistic approach, Pascal even argued (in posthumously published notes) that being a believer is more profitable than being an atheist.

Huygens set out his results in the treatise "On Calculations in Games of Chance" (1657). Huygens himself used the term "value"; the term "expectation" first appeared when van Schooten translated Huygens' treatise into Latin, and it then became generally accepted in science.

The book contains a large number of problems, some with solutions, others "for independent solution." Among the latter, the "gambler's ruin problem" aroused particular interest and lively discussion. In a somewhat generalized form it is formulated as follows: players A and B have a and b coins respectively; one coin is won in each game; the probability of A winning each game is p; find the probability of his complete ruin. A complete general solution of the "ruin problem" was given by Abraham de Moivre half a century later (1711). Nowadays the probabilistic scheme of the "ruin problem" is used in solving many problems of the random walk type.
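De Moivre's solution can be written in modern form (a standard result, quoted here for reference): with q = 1 - p,

```latex
P(\text{ruin of } A) =
\begin{cases}
\dfrac{(q/p)^{a} - (q/p)^{a+b}}{1 - (q/p)^{a+b}}, & p \neq q,\\[2ex]
\dfrac{b}{a+b}, & p = q = \tfrac{1}{2}.
\end{cases}
```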

Huygens also analyzed the problem of dividing the bet, giving its final solution: the bet must be divided in proportion to the probabilities of winning when the game continues. He also pioneered the application of probabilistic methods to demographic statistics and showed how to calculate average life expectancy.

The publications of the English statisticians John Graunt (1662) and William Petty (1676, 1683) date from the same period. Having processed data covering almost a hundred years, they showed that many demographic characteristics of the London population, despite random fluctuations, are quite stable: for example, the ratio of the number of newborn boys to girls rarely deviates from the proportion of 14 to 13, and the percentages of deaths from specific accidental causes fluctuate little. These data prepared the scientific community for the acceptance of new ideas.

Graunt was also the first to compile mortality tables - tables of the probability of death as a function of age. Issues of probability theory and its application to demographic statistics were also taken up by Johann Hudde and Jan de Witt in the Netherlands, who in 1671 also compiled mortality tables and used them to calculate the size of life annuities. This range of issues was outlined in more detail in 1693 by Edmund Halley.

18th century

At the beginning of the 18th century two treatises based on Huygens' book appeared: Pierre de Montmort's "Essay on the Analysis of Games of Chance" (published in 1708 and republished with additions in 1713) and Jacob Bernoulli's "The Art of Conjecturing" (published after the scientist's death, also in 1713). The latter was especially important for probability theory.

19th century

General trends and criticism

In the 19th century the number of works on probability theory continued to grow, and there were even attempts, which discredited the science, to extend its methods far beyond reasonable limits: for example, into the fields of morality, psychology, law enforcement, and even theology. In particular, the Welsh philosopher Richard Price, and after him Laplace, considered it possible to calculate the probability of the coming sunrise using Bayes' formula; Poisson tried to carry out a probabilistic analysis of the fairness of court verdicts and the reliability of witness testimony. The philosopher J. S. Mill, pointing to such speculative applications in 1843, called the calculus of probability "the disgrace of mathematics." This and other assessments showed that the foundations of probability theory were still insufficiently rigorous.

Meanwhile, the mathematical apparatus of probability theory continued to improve. The main area of ​​its application at that time was the mathematical processing of observational results containing random errors, as well as calculations of risks in the insurance business and other statistical parameters. Among the main applied problems of probability theory and mathematical statistics of the 19th century are the following:

finding the probability that the sum of independent random variables with the same (known) distribution law lies within specified limits; this problem was especially important for the theory of measurement errors, above all for estimating the error of observations;

establishing the statistical significance of the difference between random values or between series of such values; example: comparing the results of a new and an old type of drug in order to decide whether the new drug is really better;

studying the influence of a given factor on a random variable (factor analysis).

By the middle of the 19th century a probabilistic theory of artillery fire had taken shape. Most large European countries established national statistical organizations. At the end of the century the field of application of probabilistic methods began to spread successfully into physics, biology, economics, and sociology.

Application of probability theory in the 19th-20th centuries.

In the 19th and 20th centuries, probability theory penetrates first into science (astronomy, physics, biology), then into practice (agriculture, industry, medicine), and finally, after the invention of computers, into the everyday life of any person using modern means of receiving and transmitting information. Let's trace the application in various fields.

1. Astronomy.

It was for use in astronomy that the famous “method of least squares” was developed (Legendre 1805, Gauss 1815). The main problem for which it was initially used was the calculation of the orbits of comets, which had to be done from a small number of observations. It is clear that reliably determining the type of orbit (ellipse or hyperbola) and accurately calculating its parameters are difficult, since the orbit is observed only over a small area. The method turned out to be effective, universal, and caused heated debate about priority. It began to be used in geodesy and cartography. Now that the art of manual calculations has been lost, it is difficult to imagine that when mapping the world's oceans in the 1880s in England, a system of approximately 6,000 equations with several hundred unknowns was numerically solved using the least squares method.

2. Physics.

In the second half of the 19th century, statistical mechanics was developed in the works of Maxwell, Boltzmann and Gibbs, which described the state of rarefied systems containing a huge number of particles (of the order of Avogadro’s number). If earlier the concept of distribution of a random variable was primarily associated with the distribution of measurement errors, now a variety of quantities are distributed - speed, energy, free path length.

3. Biometrics.

In 1870-1900, the Belgian Quetelet and the Englishmen Francis Galton and Karl Pearson founded a new scientific direction - biometrics, in which for the first time the indefinite variability of living organisms and the inheritance of quantitative characteristics began to be systematically and quantitatively studied. New concepts were introduced into scientific circulation - regression and correlation.

So, until the beginning of the 20th century, the main applications of probability theory were related to scientific research. Introduction into practice - agriculture, industry, medicine - occurred in the 20th century.

4. Agriculture.

At the beginning of the 20th century in England, the task was set of quantitatively comparing the effectiveness of various agricultural methods. To solve it, the theory of experimental design and analysis of variance were developed. The main credit for developing this purely practical use of statistics belongs to Sir Ronald Fisher, an astronomer by training, and later a farmer, statistician, geneticist, and president of the Royal Society. Modern mathematical statistics, suitable for widespread practical use, was developed in England (Karl Pearson, Student, Fisher). Student was the first to solve the problem of estimating an unknown distribution parameter without using the Bayesian approach.

5. Industry.

Statistical control methods were introduced into production (Shewhart control charts), reducing the required number of product quality tests. Mathematical methods proved so important that some of them were classified as secret. Thus, a book describing a new technique that made it possible to reduce the number of tests ("Sequential Analysis" by Wald) was published only after the end of the Second World War, in 1947.

6. Medicine.

The widespread use of statistical methods in medicine began relatively recently, in the second half of the 20th century. The development of effective treatments (antibiotics, insulin, effective anesthesia, artificial blood circulation) required reliable methods for assessing their effectiveness. The new concept of "evidence-based medicine" emerged, and a more formal, quantitative approach to the treatment of many diseases began to develop, with the introduction of protocols and guidelines.

Since the mid-1980s a new and most important factor has emerged that has revolutionized all applications of probability theory: the possibility of widespread use of fast and affordable computers. One can feel the enormity of this revolution by considering that a single modern personal computer surpasses in speed and memory all the computers available in the USSR and the USA by 1968, a time when projects involving nuclear power plants, flights to the Moon, and the creation of the thermonuclear bomb were already under way. Now, through direct experimentation, it is possible to obtain results that were previously inaccessible: thinking the unthinkable.

7. Bioinformatics.

Since the 1980s, the number of known protein and nucleic acid sequences has increased rapidly. The volume of accumulated information is such that only computer analysis of this data can solve problems of information extraction.

8. Economics and banking.

Risk theory is widely used here. Risk theory is the theory of decision making under conditions of probabilistic uncertainty. From a mathematical point of view it is a branch of probability theory, and its applications are almost limitless. The most advanced area of financial applications is banking and insurance: market and credit risk management, investment, business risk, telecommunications. Non-financial applications are also developing, related to threats to health, the environment, the risks of accidents and environmental disasters, and other areas.


Definition. Probability theory is a science that studies patterns in random phenomena.

Definition. A random phenomenon is a phenomenon that, when the experiment is repeated, may proceed differently each time.

Definition. An experiment (trial) is a human activity or a process of testing.

Definition. An event is the result of an experiment.

Definition. The subject of probability theory is random phenomena and specific patterns of mass random phenomena.

Event classification:

  1. An event is called certain if, as a result of the experiment, it is sure to occur.

Example. A school lesson will definitely end.

  2. An event is called impossible if, under the given conditions, it can never occur.

Example. If there is no electric current in the circuit, the lamp will not light up.

  3. An event is called random if, as a result of the experiment, it may or may not occur.

Example. Event: passing an exam.

  4. Events are called equally possible if the conditions of their occurrence are identical and there is no reason to assert that one of them has a greater chance of appearing than another.

Example. The appearance of heads or tails when a coin is tossed.

  5. Events are called joint if the occurrence of one of them does not exclude the occurrence of the other.

Example. When shooting, a miss and an overshoot are joint events.

  6. Events are called incompatible if the occurrence of one of them excludes the occurrence of the other.

Example. With one shot, a hit and a miss are incompatible events.

  7. Two incompatible events are called opposite if, as a result of the experiment, one of them is sure to occur.

Example. When taking an exam, the events "passed the exam" and "failed the exam" are opposite.

Notation: A denotes an event, Ā the opposite event.

  8. Several events form a complete group of incompatible events if, as a result of the experiment, exactly one of them occurs.

Example. When taking an exam, the outcomes "failed the exam", "passed with a "3"", "passed with a "4"", "passed with a "5"" form a complete group of incompatible events.

Sum and product rules.

Definition. The sum of two events A and B is the event C consisting in the occurrence of event A, or event B, or both at once.

The sum of events is the union of events (the occurrence of at least one of them).

If it follows from the statement of the problem that A OR B must occur, then we are looking for the sum.

Definition. The product of events A and B is the event C consisting in the simultaneous occurrence of events A and B.

The product is the intersection of two events.

If the problem asks for A AND B, then we are looking for the product.

Example. With two shots:

  1. if it is required that there be at least one hit, we look for the sum;
  2. if it is required that both shots hit, we look for the product.
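As an illustration (my own sketch, not from the original notes), events can be modeled as sets of outcomes; the sum is then a set union and the product a set intersection:

```python
# Sample space for two shots: (first shot hit?, second shot hit?)
outcomes = {(a, b) for a in (0, 1) for b in (0, 1)}

A = {o for o in outcomes if o[0] == 1}  # hit with the first shot
B = {o for o in outcomes if o[1] == 1}  # hit with the second shot

sum_AB = A | B      # "A or B": at least one hit
product_AB = A & B  # "A and B": both shots hit

print(sorted(sum_AB))      # [(0, 1), (1, 0), (1, 1)]
print(sorted(product_AB))  # [(1, 1)]
```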

Probability. Properties of probability.

Definition. The frequency of an event is a number equal to the ratio of the number of experiments in which the event occurred to the number of all experiments performed.

Notation: r(A) is the frequency of event A.

Example. If a coin is tossed 15 times and heads comes up 10 times, then the frequency of heads is r(A) = 10/15 = 2/3.

Definition. As the number of experiments becomes infinitely large, the frequency of an event approaches the probability of that event.

Definition of classical probability. The probability of an event is the ratio of the number of cases favorable for the occurrence of this event to the number of all uniquely possible and equally possible cases.

Notation: P(A) = m/n, where P is the probability,

m is the number of cases favorable to the occurrence of the event,

n is the total number of uniquely possible and equally possible cases.

Example. 60 CHIEP students, with race numbers 1 through 60, take part in a running competition. Find the probability that the number of the student who wins the race does not contain the digit 5.
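The notes leave this example unsolved; a direct count (my sketch, assuming the race numbers are 1 through 60) gives P = 45/60 = 0.75:

```python
# Numbers 1..60 whose decimal representation avoids the digit 5.
favorable = [k for k in range(1, 61) if '5' not in str(k)]

print(len(favorable))       # 45
print(len(favorable) / 60)  # 0.75
```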

Properties of probability:

  1. The probability of an event is non-negative and lies between 0 and 1.
  2. The probability is 0 if and only if it is the probability of an impossible event.
  3. The probability is 1 if and only if it is the probability of a certain event.
  4. The probability of a given event is constant: it does not depend on the number of experiments performed and changes only when the conditions of the experiment change.

Definition of geometric probability. The geometric probability of an event is the ratio of the measure of the part of the region favorable to the event to the measure of the entire region, within which a hit at any point is equally possible.

The measure of a region may be a length, an area, or a volume.

Example. A point is thrown at random onto a segment 10 km long. Find the probability that it lands no further than 1 km from one of the ends of the segment.
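A worked solution under the geometric definition (assuming the point is uniform over the segment): the favorable part consists of two 1 km pieces at the ends, so

```latex
P = \frac{s}{S} = \frac{1\ \text{km} + 1\ \text{km}}{10\ \text{km}} = 0.2 .
```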

Comment.

If, according to the conditions of the problem, the measures s and S are expressed in different units, then to solve the problem s and S must first be reduced to a single unit of measurement.

Compounds. Elements of combinatorics.

Definition. Groups made up of elements and differing from one another in the order of the elements or in at least one element are called compounds.

Compounds are of three kinds:

Arrangements

Combinations

Permutations

Definition. Arrangements of n elements taken m at a time are compounds that differ from one another in at least one element or in the order of the elements. Their number is A(n, m) = n! / (n - m)!.

Definition. Combinations of n elements taken m at a time are compounds of m elements that differ from one another in at least one element (the order does not matter). Their number is C(n, m) = n! / (m! (n - m)!).

Definition. Permutations of n elements are compounds consisting of the same n elements and differing from one another only in the order of the elements. Their number is P(n) = n!.

Example.

1) In how many ways can a convoy of 5 cars be formed?

Since the compounds consist of the same elements and differ only in order, we count permutations: P(5) = 5! = 120 ways.

2) In how many ways can 3 duty officers be appointed in a class of 25 people?

Since the order of the elements is not important and the compounds differ in at least one element, we count the number of combinations of 25 elements taken 3 at a time: C(25, 3) = 25! / (3! · 22!) = 2300 ways.

3) In how many ways can a 4-digit number (without repeated digits) be made from the digits 1, 2, 3, 4, 5, 6? Since the compounds differ both in the order of the elements and in at least one element, we count arrangements of 6 elements taken 4 at a time: A(6, 4) = 6! / 2! = 360.
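These three counts are easy to check with the standard library (an illustrative sketch; math.perm and math.comb require Python 3.8+):

```python
import math

print(math.factorial(5))  # 120  -- permutations: convoy of 5 cars
print(math.comb(25, 3))   # 2300 -- combinations: 3 duty officers out of 25
print(math.perm(6, 4))    # 360  -- arrangements: 4-digit numbers from 6 digits
```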

An example of using combinatorics elements and calculating probability.

In a batch of n products, m are defective. We select l products at random. Find the probability that among them there are exactly k defective ones.
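The answer is given by the hypergeometric formula (reconstructed here, since the formula itself evidently dropped out of the notes):

```latex
P = \frac{C_m^{k}\, C_{n-m}^{\,l-k}}{C_n^{l}} .
```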

Example.

10 refrigerators were delivered to a store warehouse, of which 4 are three-chamber and the rest are two-chamber.

Find the probability that among 5 randomly selected refrigerators, exactly 3 will be three-chamber.
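A worked solution by the hypergeometric formula above (my reconstruction):

```latex
P = \frac{C_4^{3}\, C_6^{2}}{C_{10}^{5}}
= \frac{4 \cdot 15}{252} = \frac{5}{21} \approx 0.238 .
```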

Basic theorems of probability theory.

Theorem 1.

The probability of the sum of two incompatible events is equal to the sum of the probabilities of these events: P(A + B) = P(A) + P(B).

Corollaries.

1) If events form a complete group of incompatible events, then the sum of their probabilities is equal to 1.

2) The sum of the probabilities of two opposite events is equal to 1.

Theorem 2.

The probability of the product of two independent events is equal to the product of their probabilities: P(AB) = P(A) · P(B).

Definition. Event A is said to be independent of event B if the probability of the occurrence of event A does not depend on whether event B occurs or not.

Definition. Two events are called dependent if the probability of the occurrence of one of them depends on the occurrence or non-occurrence of the other.

Definition. The probability of event B, calculated under the assumption that event A has already occurred, is called the conditional probability and is denoted P(B | A).

Theorem 3.

The probability of the product of two dependent events is equal to the probability of one of them multiplied by the conditional probability of the other, given that the first has occurred: P(AB) = P(A) · P(B | A).

Example.

The library has 12 mathematics textbooks. Of these, 2 are textbooks on elementary mathematics, 5 on probability theory, and the rest on higher mathematics. Two textbooks are chosen at random. Find the probability that both are textbooks on elementary mathematics.
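By the multiplication theorem for dependent events (a worked solution, not given in the notes):

```latex
P = \frac{2}{12} \cdot \frac{1}{11} = \frac{1}{66} \approx 0.015 .
```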

Theorem 4. Probability of an event occurring at least once.

The probability of the occurrence of at least one of the events A1, A2, ..., An, which are independent in the aggregate, is equal to the difference between unity and the product of the probabilities of the opposite events.

Let qi = 1 - pi be the probabilities of the opposite events; then P(A) = 1 - q1 · q2 · ... · qn.

Corollary.

If the probability of occurrence of each of the events is the same and equal to p, then the probability that at least one of these events occurs is P(A) = 1 - (1 - p)^n,

where n is the number of experiments performed.

Example.

Three shots are fired at a target. The probability of a hit is 0.7 for the first shot, 0.8 for the second, and 0.9 for the third. Find the probability that with three independent shots at the target there will be:

a) 0 hits;

b) 1 hit;

c) 2 hits;

d) 3 hits;

e) at least one hit.
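The notes stop at the problem statement; a direct enumeration over the eight hit/miss patterns (my sketch, assuming independent shots) gives the answers:

```python
from itertools import product

p = [0.7, 0.8, 0.9]  # hit probabilities for the three shots

# Accumulate the probability of each possible number of hits.
prob_hits = {k: 0.0 for k in range(4)}
for pattern in product((0, 1), repeat=3):
    pr = 1.0
    for hit, pi in zip(pattern, p):
        pr *= pi if hit else (1 - pi)
    prob_hits[sum(pattern)] += pr

print(prob_hits)         # approx {0: 0.006, 1: 0.092, 2: 0.398, 3: 0.504}
print(1 - prob_hits[0])  # approx 0.994 -- at least one hit
```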

Theorem 5. Total probability formula.

Let event A occur only together with one of the hypotheses H1, H2, ..., Hn; then the probability that event A occurs is found by the total probability formula: P(A) = P(H1) · P(A | H1) + P(H2) · P(A | H2) + ... + P(Hn) · P(A | Hn).

Example. Which is more likely against an equal opponent: winning one game out of two, or two games out of four? The respective probabilities are C(2,1) / 2^2 = 1/2 and C(4,2) / 2^4 = 3/8. Bringing them to a common denominator gives 4/8 and 3/8.

Thus, winning one game out of 2 against an equal opponent is more likely than winning 2 games out of 4.

INTRODUCTION

CHAPTER 1. PROBABILITY
1.1. THE CONCEPT OF PROBABILITY
1.2. PROBABILITY AND RANDOM VARIABLES

CHAPTER 2. APPLICATION OF PROBABILITY THEORY IN APPLIED INFORMATICS
2.1. THE PROBABILISTIC APPROACH
2.2. THE PROBABILISTIC, OR CONTENT, APPROACH
2.3. THE ALPHABETICAL APPROACH TO MEASURING INFORMATION

Introduction

Applied informatics cannot exist separately from other sciences; it creates new information techniques and technologies that are used to solve problems in various fields of science and technology and in everyday life. Its main directions of development are theoretical, technical, and applied informatics. Applied informatics develops general theories of searching, processing, and storing information, elucidates the laws of creating and transforming information, studies the use of information in various fields of activity and the interaction between man and computer, and shapes information technologies. As a branch of the national economy it includes automated information processing systems, the creation of the latest generations of computer technology, flexible technological systems, robots, and artificial intelligence. It builds knowledge bases, develops rational methods for automating production, theoretical foundations of design, and links between science and production. Informatics is now regarded as a catalyst of scientific and technological progress: it promotes the activation of the human factor and fills all areas of human activity with information.

The relevance of the chosen topic lies in the fact that probability theory is used in various fields of technology and natural science: in computer science, reliability theory, queuing theory, theoretical physics, and other theoretical and applied sciences. Without probability theory one cannot build such important theoretical courses as "Control Theory", "Operations Research", and "Mathematical Modeling".

Probability theory is also widely used in practice. Many random variables, such as measurement errors, the wear of parts of various mechanisms, and deviations of dimensions from standard ones, follow a normal distribution. In reliability theory the normal distribution is used to assess the reliability of objects subject to aging and wear, that is, when assessing gradual failures.

The purpose of this work is to consider the application of probability theory in applied informatics. Probability theory is a very powerful tool for solving applied problems and a multifunctional language of science, but it is also an object of general culture. Information theory is the basis of informatics and, at the same time, one of the main areas of technical cybernetics.

Conclusion

So, having examined probability theory, its history, current state, and possibilities, we can say that the emergence of this concept was not an accidental phenomenon in science but a necessity for the further development of technology and cybernetics. Program-based control alone cannot, on its own, help man build cybernetic machines that think like a human being, and probability theory directly contributes to the emergence of artificial intelligence. "Control procedures, wherever they occur (in living organisms, machines, or society), follow definite laws," as cybernetics asserts. This means that the processes, not yet fully understood, that occur in the human brain and allow it to adapt flexibly to a changing environment may be reproduced artificially in the most complex automatic devices.

An important concept in mathematics is that of a function, which has traditionally meant a single-valued correspondence: one value of the argument is associated with one value of the function, and the functional connection between them is well defined. In reality, however, random phenomena occur, and many events are connected in non-deterministic ways. Finding patterns in random phenomena is the task of probability theory. Probability theory is a tool for studying the hidden and many-valued connections between various phenomena in numerous fields of science, technology, and economics; it makes it possible to estimate fluctuations in demand, supply, prices, and other economic indicators correctly. Probability theory underlies such fundamental fields as statistics and applied informatics, since without it neither application programs nor the computer as a whole can truly work; it is also fundamental in game theory.



A short excursion into the history of the application of probability theory in practice.

Until the end of the 18th century, applied statistics, without which state accounting and control are unthinkable (and which therefore had existed for a long time), was of an elementary, purely arithmetical nature. Probability theory remained a purely academic discipline, and its relatively complex "applications" were limited to gambling. Improvements in dice manufacturing in the 18th century stimulated the development of probability theory: without intending to, players began to perform reproducible experiments en masse, since dice became identical and standardized. This gave rise to an example of what would later be called a "statistical experiment": an experiment that can be repeated an unlimited number of times under the same conditions.

