
Markov analysis


Author: Anonymous

Introduction

This paper sets out to define each of the concepts that make up Markov analysis, from the most basic to the most complex, and to explain why the study bears this name: it is named after its author, Markov, whose biography and achievements are also covered in this document.

Another important topic we cover is Markov chains, a tool for analyzing the behavior of certain types of stochastic processes, that is, processes that evolve non-deterministically over time around a set of states.

A Markov chain, therefore, represents a system whose state changes over time, each change being a transition of the system.

To work with such chains, several terms must be understood:

  • State: the characterization of the situation in which the system finds itself at a given instant.
  • Transition matrix: the numerical arrangement that condenses the probabilities of moving from one state to another.
  • Regular matrix: a transition matrix is regular if some power of it has all strictly positive entries.
  • Recurrent state: a state is recurrent if, after entering it, the process is certain to return to it eventually.
  • Ergodic chain: a chain whose states are all recurrent, aperiodic, and communicate with each other.
  • Absorbing states: a state is absorbing if, once entered, it is never left; a Markov chain with one or more absorbing states is an absorbing Markov chain.

All these topics are explained in more detail within the document.

Markov analysis

Markov analysis is named after the studies carried out by the Russian mathematician Andrei Andreyevich Markov between 1906 and 1907 on sequences of chain-linked experiments and the need to describe physical phenomena mathematically. Markov's theory was further developed in the 1930s and 1940s by A. N. Kolmogorov, W. Feller, W. Doeblin, P. Lévy, J. L. Doob, and others.

Markov analysis is a way of analyzing the current movement of some variable in order to forecast its future movement. In recent years it has begun to be used as a marketing research instrument, to examine and forecast customer behavior in terms of loyalty to a brand and switching to other brands. The application of this technique is not limited to marketing; it has been applied in many other fields.

Andrei Markov

Andrei Andreyevich Markov: (June 14, 1856 - July 20, 1922) was a Russian mathematician known for his work on number theory and probability theory.

Markov was born in Ryazan, Russia. Before he was ten, his father, a state official, was transferred to Saint Petersburg, where Andrei studied at a city institute. He showed a talent for mathematics early on, and by the time he graduated in 1874 he already knew several mathematicians from the University of Saint Petersburg, which he entered after graduation. At the university he was a student of Chebyshev, and after completing his master's and doctoral theses, in 1886 he joined the St. Petersburg Academy of Sciences at Chebyshev's own proposal. Ten years later Markov had won the post of regular academician. From 1880, after defending his master's thesis, Markov taught at the university, and when Chebyshev left the university three years later, it was Markov who replaced him in the probability theory courses. In 1905, after 25 years of academic activity, Markov retired definitively from the university, although he continued to teach some courses on probability theory.

Apart from his academic work, Andrei Markov was a committed political activist. He opposed the privileges of the Tsarist nobility and went so far as to reject decorations from the Tsar himself in protest at political decisions related to the Academy of Sciences. His involvement in politics reached such a point that he became known as "the militant academician".

Throughout his life Markov suffered from a congenital knee malformation that took him to the operating room several times and, in time, caused his death: on July 20, 1922, one of the many operations he underwent led to a generalized infection from which he could not recover.

Although Markov influenced various fields of mathematics, for example through his work on continued fractions, history remembers him mainly for his results in probability theory. In 1887 he completed the proof that generalized the central limit theorem, which Chebyshev had already advanced. But his best-known contribution is another.

His theoretical work on processes involving random components (stochastic processes) would bear fruit in a mathematical instrument now known as the Markov chain: a sequence of values of a random variable in which the future value depends on the present value but is independent of the variable's history. Markov chains are today considered an essential tool in disciplines such as economics, engineering, and operations research, among many others.

Markov chains


Markov chains are a tool for analyzing the behavior of certain types of stochastic processes: processes that evolve non-deterministically over time around a set of states.

A Markov chain, therefore, represents a system whose state changes over time, each change being a transition of the system. These changes are not predetermined, but the probability of the next state as a function of the current state is, and this probability is constant over time (the system is homogeneous in time). In a transition, the new state may be the same as the previous one, and it may be possible to influence the transition probabilities by acting appropriately on the system (a decision).
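As a minimal sketch of these ideas, the following example simulates a chain by repeatedly sampling the next state from the row of the current state. The two-state "weather" chain and its probabilities are assumptions for illustration, not taken from the original text:

```python
import numpy as np

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# Entry P[i][j] is the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(seed=42)

def simulate(start, steps):
    """Walk the chain for `steps` transitions, returning the visited states."""
    state = start
    path = [state]
    for _ in range(steps):
        # The next state depends only on the current one (Markov property).
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

path = simulate(start=0, steps=10)
```

Note that the sampling step looks only at the current state, never at the earlier part of `path`; that restriction is exactly the Markov property described above.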

Basic concepts

For the study of the Markov chains, some key concepts such as the following should be taken into account:

State

The state of a system at time t is a variable whose values can only belong to the set of states of the system. The system modeled by the chain, therefore, is a variable that changes over time; we call such a change a transition.

Transition matrix

The matrix elements represent the probability that the next state is the one corresponding to the column, given that the current state is the one corresponding to the row.

It has 3 basic properties:

  1. The sum of the probabilities in each row must equal 1.
  2. The transition matrix must be square.
  3. The transition probabilities must be between 0 and 1.
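These three properties can be checked mechanically. The following sketch (the function name is ours, chosen for illustration) validates a candidate transition matrix:

```python
import numpy as np

def is_valid_transition_matrix(P):
    """Check the three properties: square, entries in [0, 1], rows sum to 1."""
    P = np.asarray(P, dtype=float)
    if P.ndim != 2 or P.shape[0] != P.shape[1]:
        return False                      # property 2: must be square
    if np.any(P < 0) or np.any(P > 1):
        return False                      # property 3: entries lie in [0, 1]
    return np.allclose(P.sum(axis=1), 1)  # property 1: each row sums to 1

valid = is_valid_transition_matrix([[0.7, 0.3], [0.2, 0.8]])     # rows sum to 1
invalid = is_valid_transition_matrix([[0.7, 0.4], [0.2, 0.8]])   # first row sums to 1.1
```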

Current distribution (vector P0): the way the probabilities of the states are distributed in the initial period (period 0). This information allows one to find the distribution in subsequent periods.
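To make the idea concrete: the distribution after n periods is the initial vector multiplied by the n-th power of the transition matrix. The brand-loyalty numbers below are assumed for illustration:

```python
import numpy as np

# Hypothetical brand-loyalty chain: state 0 = brand A, state 1 = brand B.
P = np.array([[0.8, 0.2],    # from A: 80% stay, 20% switch to B
              [0.3, 0.7]])   # from B: 30% switch to A, 70% stay
p0 = np.array([0.5, 0.5])    # initial distribution (period 0)

# Distribution in subsequent periods: p_n = p0 @ P^n.
p1 = p0 @ P
p2 = p0 @ np.linalg.matrix_power(P, 2)
```

Here `p1` is `[0.55, 0.45]` and `p2` is `[0.575, 0.425]`: brand A's share grows period by period, and each vector still sums to 1.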

Stable state: the probability distribution that, at a certain point, becomes fixed for the vector P and does not change in subsequent periods.
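One simple way to find the stable state is power iteration: keep multiplying a starting distribution by the transition matrix until it stops changing. This is a sketch using the same assumed two-state matrix as above:

```python
import numpy as np

P = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# Power iteration: for a regular chain, repeated multiplication converges
# to the stable state regardless of the starting distribution.
pi = np.array([1.0, 0.0])
for _ in range(100):
    pi = pi @ P

# The stable state satisfies pi = pi @ P (no change in subsequent periods).
```

For this matrix the iteration settles at `[0.6, 0.4]`, and multiplying by `P` once more leaves it unchanged, which is exactly the defining property of the stable state.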

A state is transient if, once left, there is a positive probability of never returning to it; therefore a state is recurrent if and only if it is not transient.

Example:


Suppose there are s − m transient states (t1, t2, …, t(s−m)) and m absorbing states (a1, a2, …, am). Write the transition probability matrix P as follows:

[Figure: the canonical form of P, with the transient states listed first and the absorbing states last.]
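A small sketch of this arrangement, using an assumed 3-state example (one transient state, two absorbing states) rather than figures from the original: in block form the canonical matrix is P = [[Q, R], [0, I]], where Q holds transient-to-transient probabilities and R transient-to-absorbing ones. From Q one obtains the standard fundamental matrix N = (I − Q)⁻¹ and the absorption probabilities B = N·R:

```python
import numpy as np

# One transient state (index 0) and two absorbing states (indices 1, 2).
Q = np.array([[0.5]])          # transient -> transient probabilities
R = np.array([[0.3, 0.2]])     # transient -> absorbing probabilities

# Canonical form: transient states first, then absorbing states.
P = np.block([[Q, R],
              [np.zeros((2, 1)), np.eye(2)]])

# Fundamental matrix N = (I - Q)^-1: expected visits to transient states.
N = np.linalg.inv(np.eye(1) - Q)
# B[i][j]: probability that, starting from transient state i,
# the process is eventually absorbed in absorbing state j.
B = N @ R
```

Here N is [[2.0]] (the transient state is visited twice on average) and B is [[0.6, 0.4]]; each row of B sums to 1, since an absorbing chain is eventually absorbed somewhere with certainty.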

Conclusion

To conclude, we can say that Markov chains are a tool for analyzing the behavior of certain types of stochastic processes, that is, processes that evolve non-deterministically over time around a set of states.

Their construction requires knowledge of several elements, such as the state and the transition matrix.

These elements were developed by their creator, Markov, through his study of sequences of chain-connected experiments and the need to describe physical phenomena mathematically.

This method is important because in recent years it has begun to be used as a marketing research tool, to examine and forecast customer behavior in terms of brand loyalty and switching to other brands. The application of this technique is no longer limited to marketing; its field of action extends to many other areas.

We hope this article proves useful and that the concepts it contains are clearly explained.
