Talk:Harris chain


Merger with Markov chains on a measurable state space


Based on the current state of the article, it looks like Harris chain and Markov chains on a measurable state space are synonyms, but I do not have the relevant mathematical knowledge to be sure. Is there any reason to have two separate articles, or should they be merged? If they should stay separate, could someone provide a description of how the two concepts differ? 7804j (talk) 15:13, 4 December 2019 (UTC)

A reply to this same question was given at Talk:Markov chains on a measurable state space. 67.198.37.16 (talk) 05:05, 22 October 2020 (UTC)
My understanding is that a Harris chain is analogous to a recurrent Markov chain. Not all Markov chains on measurable state spaces are Harris chains. LachlanA (talk) 06:44, 19 April 2023 (UTC)

Examples


It would be useful to expand the example on countable state spaces to show how a recurrent chain is Harris recurrent, and how a non-recurrent chain fails to be. At the moment, the example essentially says "if the chain satisfies the condition in the definition, it is Harris recurrent", which is true but not illuminating. LachlanA (talk) 06:46, 19 April 2023 (UTC)
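
A possible sketch of the countable-state argument requested above, assuming the standard definition of a Harris chain (sets \(A\), \(B\), a constant \(\varepsilon > 0\), and a probability measure \(\rho\) with \(\rho(B) = 1\)); the notation \(p(a,b)\) for the transition probability and \(\delta_b\) for the point mass at \(b\) is mine:

```latex
% For an irreducible recurrent chain on a countable state space,
% pick any state a and any b with p(a,b) > 0, and set
A = \{a\}, \qquad B = \{b\}, \qquad \varepsilon = p(a,b), \qquad \rho = \delta_b .
% Condition (1): irreducibility plus recurrence give, for the hitting time \tau_A,
\Pr(\tau_A < \infty \mid X_0 = x) = 1 \quad \text{for all states } x .
% Condition (2): for x \in A and any C \subseteq B,
\Pr(X_1 \in C \mid X_0 = x) \;\ge\; \varepsilon \, \rho(C) ,
% so the chain is Harris recurrent.
```

Conversely, for an irreducible transient chain every finite set is visited only finitely often almost surely, so condition (1) fails for any finite choice of \(A\); this is the direction that would show a non-recurrent chain is not Harris recurrent. If someone can check this against the article's exact definition, it might be a basis for expanding the example. LachlanA's point above would then read as: Harris recurrence generalises recurrence, not the Markov property itself.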