Every finite state Markov chain has a stationary probability distribution
I am trying to understand the following proof that every finite-state Markov chain has a stationary distribution. The proof is from here.
Let $P$ be the $k \times k$ (stochastic) transition probability matrix for our Markov chain. Now,

> ... $1$ is an eigenvalue for $P$ and therefore also for $P^t$. Writing a $P^t$-invariant $v$ as $v = v^+ - v^-$ with $v^+, v^- \in (\mathbb{R}_+)^k$, we obtain $P^t v^\pm = v^\pm$ because $P^t$ preserves the positive cone; if $v^+ \neq 0$ take $\nu = (\sum v^+_i)^{-1} \cdot v^+$, otherwise normalize $v^-$.
The main thing I don't understand is

> we obtain $P^t v^\pm = v^\pm$ because $P^t$ preserves the positive cone
Why is this true?
I also don't understand why $(\sum v^+_i)^{-1} \cdot v^+$ works if $v^+ \neq 0$.
Is there any easier way to show that every finite state Markov chain has a stationary probability distribution?
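For concreteness, the claimed statement can be checked numerically (a sketch using numpy; the matrix below is an arbitrary made-up example, not taken from the proof):

```python
import numpy as np

# A made-up 3-state stochastic matrix (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# 1 is an eigenvalue of P (since P @ ones = ones), hence also of P^t.
eigvals, eigvecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(eigvals - 1.0))   # locate the eigenvalue 1
v = np.real(eigvecs[:, i])             # a P^t-invariant vector

pi = v / v.sum()                       # normalize so the entries sum to 1
assert np.allclose(pi @ P, pi)         # stationary: pi P = pi
assert np.isclose(pi.sum(), 1.0)
```

Here the chain has all-positive entries, so the invariant vector already has one sign and normalizing it directly gives a probability distribution; the proof's $v^+ / v^-$ splitting handles the general case.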
probability proof-verification markov-chains stochastic-matrices
For your second question, what must a state distribution vector look like, and what can you say about the elements of the scaled version of $v^+$? – amd, Dec 2 '18 at 0:23
They add up to 1, which makes sense. I get that part now. – jackson5, Dec 2 '18 at 0:25
asked Dec 1 '18 at 22:23 by jackson5
2 Answers
The wording in this article is a little ambiguous. I thought of two interpretations, the first of which is incorrect. The second is correct, but it doesn't explain the bit about "preservation of the positive cone".
It looks like it may be a case of mistakenly asserting that a mapping fixes a subset when it only preserves a subset (i.e. $f|_X = \text{id}_X$ vs. $\text{im}(f) \subset X$).
Interpretation 1. Maybe the statement below is being claimed:
(*) Let $C$ be the positive cone. If $P^tv = v$ then for all $v^+,v^- in C$ such that $v = v^+ - v^-$ we have $P^tv^+ = v^+$.
This is false unless $P$ is the identity matrix. Let $x \in C$; then according to (*), $\bar v^\pm = v^\pm + x$ must satisfy $P^t \bar v^+ = \bar v^+$ as well, and by linearity $P^t x = x$ too. So $P^t$ fixes every element of the positive cone, which is only possible if $P$ is the identity matrix, because the span of the positive cone is the whole space.
Interpretation 2. Maybe instead it means to set $v^+$ to be the vector of positive entries of $v$, with zeros in place of the negative ones. E.g. if $v = (1,0,2,-7)$ then $v^+ = (1,0,2,0)$ and $v^- = (0,0,0,7)$. Then the claim would be:
If $P^tv = v$ then we have $P^tv^+ = v^+$ where $v^+$ is the vector of positive entries described above.
This is true, but I don't think the cited article offers any explanation as to why, and I don't know how to prove it without using Perron-Frobenius, which is arguably a harder theorem than the one we are trying to prove.
It is a trivial consequence of Perron-Frobenius in the case of an irreducible stochastic matrix, because then one has either $v = v^+$ or $v = v^-$: there is a stationary state $v$ (by Perron-Frobenius), and the eigenspace for $\lambda = 1$ is simple (also Perron-Frobenius), so any invariant vector is a scalar multiple of it and inherits this property.
For reducible matrices the eigenspace for $\lambda = 1$ is no longer simple, so we can form vectors like $v = v_1 - v_2$, where $v_i$ is the stationary state for the $i$th block; then $v^+ = v_1$ and $v^- = v_2$. Following the suggestion in the article, one would then find a stationary distribution by normalizing just the positive part $v^+ = v_1$.
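The reducible case described above can be checked numerically (a sketch using numpy with a made-up block-diagonal chain; the helper `stationary` is mine, not from the article):

```python
import numpy as np

# A hypothetical reducible chain: two independent 2-state blocks.
A = np.array([[0.9, 0.1],
              [0.4, 0.6]])
B = np.array([[0.5, 0.5],
              [0.2, 0.8]])
P = np.block([[A, np.zeros((2, 2))],
              [np.zeros((2, 2)), B]])

def stationary(Q):
    """Left eigenvector of Q for eigenvalue 1, normalized to sum to 1."""
    w, V = np.linalg.eig(Q.T)
    v = np.real(V[:, np.argmin(np.abs(w - 1.0))])
    return v / v.sum()

v1 = np.concatenate([stationary(A), np.zeros(2)])  # stationary on block 1
v2 = np.concatenate([np.zeros(2), stationary(B)])  # stationary on block 2

v = v1 - v2                    # an invariant vector that changes sign
assert np.allclose(v @ P, v)

# Its positive part is v1; normalizing it gives a stationary distribution.
v_plus = np.where(v > 0, v, 0.0)
pi = v_plus / v_plus.sum()
assert np.allclose(pi @ P, pi)
```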
answered Dec 2 '18 at 13:37 by Ben (edited Dec 7 '18 at 14:48)
To your last question, as to whether there is a simpler way of proving the existence of a stationary distribution for finite state Markov chains: that depends on what tools you have at your disposal. Here is a nice and short consequence of a fixed point theorem.
Let $\mathbf{P}$ be a Markov chain over $d$ states. The simplex $\Delta_d$ is a convex and compact subset of $\mathbb{R}^d$, which is a Euclidean vector space with the usual inner product. We can view the kernel as the linear operator
\begin{equation}
\begin{split}
\mathbf{P} : \Delta_d &\to \Delta_d \\
\mu &\mapsto \mu \mathbf{P}
\end{split}
\end{equation}
As $\|\mathbf{P}\|_2 \leq \sqrt{d} < \infty$, the operator is bounded and therefore continuous. As a consequence, we can apply Brouwer's fixed point theorem to conclude that there exists $\pi \in \Delta_d$ with $\pi \mathbf{P} = \pi$.
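Brouwer's theorem only asserts that a fixed point exists; it gives no algorithm. Still, for chains that are additionally irreducible and aperiodic, simply iterating the map $\mu \mapsto \mu \mathbf{P}$ converges to the fixed point (a numpy sketch with an arbitrary example matrix; convergence of the iteration is a property of such chains, not part of the Brouwer argument):

```python
import numpy as np

# A made-up irreducible, aperiodic chain (all entries positive, rows sum to 1).
P = np.array([[0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4],
              [0.5, 0.1, 0.4]])

mu = np.array([1.0, 0.0, 0.0])   # any starting point in the simplex
for _ in range(200):
    mu = mu @ P                   # stays in the simplex: entries >= 0, sum 1

assert np.allclose(mu @ P, mu)    # a fixed point: pi P = pi
assert np.isclose(mu.sum(), 1.0)
```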
answered Mar 16 at 8:46 by ippiki-ookami
Your Answer
StackExchange.ifUsing("editor", function ()
return StackExchange.using("mathjaxEditing", function ()
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
);
);
, "mathjax-editing");
StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "69"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);
else
createEditor();
);
function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);
);
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3021916%2fevery-finite-state-markov-chain-has-a-stationary-probability-distribution%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
2 Answers
2
active
oldest
votes
2 Answers
2
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
The wording in this article is a little ambiguous. I thought of two interpretations, the first of which is incorrect. The second is correct but it doesn't explain the bit about "preservation of the positive cone".
It looks like it may be a case of mistakenly using that a mapping fixes a subset when it only preserves a subset. (I.e. $f|_X = textid_X$ vs $textim(f) subset X$.)
Interpretation 1. Maybe the statement below is being claimed:
(*) Let $C$ be the positive cone. If $P^tv = v$ then for all $v^+,v^- in C$ such that $v = v^+ - v^-$ we have $P^tv^+ = v^+$.
This is false unless $P$ is the identity matrix. Let $x in C$, then according to (*) then $bar v^pm = v^pm + x$ must satisfy $P^tbar v^+ = bar v^+$ as well. But linearity says then $P^tx = x$ too. So $P^t$ fixes the positive cone. This is only true if $P$ is the identity matrix because the span of the positive cone is the whole space.
Interpretation 2. Maybe instead it means to set $v^+$ to be the vector of positive entries of $v$ with 0s in place of negatives. E.g. if $v = (1,0,2,-7)$ then $v^+ = (1,0,2,0)$ and $v^- = (0,0,0,7)$. Then the claim would be:
If $P^tv = v$ then we have $P^tv^+ = v^+$ where $v^+$ is the vector of positive entries described above.
This is true, but I don't think the cited article offers any explanation as to why, and I don't know how to prove it without using Frobenius-Perron, which is maybe a harder theorem than the one we are trying to prove.
It is a trivial consequence of Frobenius-Perron in the case of an irreducible stochastic matrix, because one has either $v = v^+$ or $v = v^-$. This is because there is a stationary state $v$ (by F-P) and the eigenspace for $lambda = 1$ is simple (also F-P). So any invariant vector is a scalar multiple of it and also has this property.
For reducible matrices the eigenspace for $lambda = 1$ is no longer simple, so we can do things like $v = v_1 - v_2$ where $v_i$ is the stationary state for the $i$th block. Then $v^+ = v_1, v^- = v_2$. Following the suggestion in the article, one would then find a stationary distribution by normalizing just the positive part $v^+ = v_1$.
$endgroup$
add a comment |
$begingroup$
The wording in this article is a little ambiguous. I thought of two interpretations, the first of which is incorrect. The second is correct but it doesn't explain the bit about "preservation of the positive cone".
It looks like it may be a case of mistakenly using that a mapping fixes a subset when it only preserves a subset. (I.e. $f|_X = textid_X$ vs $textim(f) subset X$.)
Interpretation 1. Maybe the statement below is being claimed:
(*) Let $C$ be the positive cone. If $P^tv = v$ then for all $v^+,v^- in C$ such that $v = v^+ - v^-$ we have $P^tv^+ = v^+$.
This is false unless $P$ is the identity matrix. Let $x in C$, then according to (*) then $bar v^pm = v^pm + x$ must satisfy $P^tbar v^+ = bar v^+$ as well. But linearity says then $P^tx = x$ too. So $P^t$ fixes the positive cone. This is only true if $P$ is the identity matrix because the span of the positive cone is the whole space.
Interpretation 2. Maybe instead it means to set $v^+$ to be the vector of positive entries of $v$ with 0s in place of negatives. E.g. if $v = (1,0,2,-7)$ then $v^+ = (1,0,2,0)$ and $v^- = (0,0,0,7)$. Then the claim would be:
If $P^tv = v$ then we have $P^tv^+ = v^+$ where $v^+$ is the vector of positive entries described above.
This is true, but I don't think the cited article offers any explanation as to why, and I don't know how to prove it without using Frobenius-Perron, which is maybe a harder theorem than the one we are trying to prove.
It is a trivial consequence of Frobenius-Perron in the case of an irreducible stochastic matrix, because one has either $v = v^+$ or $v = v^-$. This is because there is a stationary state $v$ (by F-P) and the eigenspace for $lambda = 1$ is simple (also F-P). So any invariant vector is a scalar multiple of it and also has this property.
For reducible matrices the eigenspace for $lambda = 1$ is no longer simple, so we can do things like $v = v_1 - v_2$ where $v_i$ is the stationary state for the $i$th block. Then $v^+ = v_1, v^- = v_2$. Following the suggestion in the article, one would then find a stationary distribution by normalizing just the positive part $v^+ = v_1$.
$endgroup$
add a comment |
$begingroup$
The wording in this article is a little ambiguous. I thought of two interpretations, the first of which is incorrect. The second is correct but it doesn't explain the bit about "preservation of the positive cone".
It looks like it may be a case of mistakenly using that a mapping fixes a subset when it only preserves a subset. (I.e. $f|_X = textid_X$ vs $textim(f) subset X$.)
Interpretation 1. Maybe the statement below is being claimed:
(*) Let $C$ be the positive cone. If $P^tv = v$ then for all $v^+,v^- in C$ such that $v = v^+ - v^-$ we have $P^tv^+ = v^+$.
This is false unless $P$ is the identity matrix. Let $x in C$, then according to (*) then $bar v^pm = v^pm + x$ must satisfy $P^tbar v^+ = bar v^+$ as well. But linearity says then $P^tx = x$ too. So $P^t$ fixes the positive cone. This is only true if $P$ is the identity matrix because the span of the positive cone is the whole space.
Interpretation 2. Maybe instead it means to set $v^+$ to be the vector of positive entries of $v$ with 0s in place of negatives. E.g. if $v = (1,0,2,-7)$ then $v^+ = (1,0,2,0)$ and $v^- = (0,0,0,7)$. Then the claim would be:
If $P^tv = v$ then we have $P^tv^+ = v^+$ where $v^+$ is the vector of positive entries described above.
This is true, but I don't think the cited article offers any explanation as to why, and I don't know how to prove it without using Frobenius-Perron, which is maybe a harder theorem than the one we are trying to prove.
It is a trivial consequence of Frobenius-Perron in the case of an irreducible stochastic matrix, because one has either $v = v^+$ or $v = v^-$. This is because there is a stationary state $v$ (by F-P) and the eigenspace for $lambda = 1$ is simple (also F-P). So any invariant vector is a scalar multiple of it and also has this property.
For reducible matrices the eigenspace for $lambda = 1$ is no longer simple, so we can do things like $v = v_1 - v_2$ where $v_i$ is the stationary state for the $i$th block. Then $v^+ = v_1, v^- = v_2$. Following the suggestion in the article, one would then find a stationary distribution by normalizing just the positive part $v^+ = v_1$.
$endgroup$
The wording in this article is a little ambiguous. I thought of two interpretations, the first of which is incorrect. The second is correct but it doesn't explain the bit about "preservation of the positive cone".
It looks like it may be a case of mistakenly using that a mapping fixes a subset when it only preserves a subset. (I.e. $f|_X = textid_X$ vs $textim(f) subset X$.)
Interpretation 1. Maybe the statement below is being claimed:
(*) Let $C$ be the positive cone. If $P^tv = v$ then for all $v^+,v^- in C$ such that $v = v^+ - v^-$ we have $P^tv^+ = v^+$.
This is false unless $P$ is the identity matrix. Let $x in C$, then according to (*) then $bar v^pm = v^pm + x$ must satisfy $P^tbar v^+ = bar v^+$ as well. But linearity says then $P^tx = x$ too. So $P^t$ fixes the positive cone. This is only true if $P$ is the identity matrix because the span of the positive cone is the whole space.
Interpretation 2. Maybe instead it means to set $v^+$ to be the vector of positive entries of $v$ with 0s in place of negatives. E.g. if $v = (1,0,2,-7)$ then $v^+ = (1,0,2,0)$ and $v^- = (0,0,0,7)$. Then the claim would be:
If $P^tv = v$ then we have $P^tv^+ = v^+$ where $v^+$ is the vector of positive entries described above.
This is true, but I don't think the cited article offers any explanation as to why, and I don't know how to prove it without using Frobenius-Perron, which is maybe a harder theorem than the one we are trying to prove.
It is a trivial consequence of Frobenius-Perron in the case of an irreducible stochastic matrix, because one has either $v = v^+$ or $v = v^-$. This is because there is a stationary state $v$ (by F-P) and the eigenspace for $lambda = 1$ is simple (also F-P). So any invariant vector is a scalar multiple of it and also has this property.
For reducible matrices the eigenspace for $lambda = 1$ is no longer simple, so we can do things like $v = v_1 - v_2$ where $v_i$ is the stationary state for the $i$th block. Then $v^+ = v_1, v^- = v_2$. Following the suggestion in the article, one would then find a stationary distribution by normalizing just the positive part $v^+ = v_1$.