As for how this particular Markov chain works, viewing the source code of this page will give you the best understanding: http://www.tangotiger.net/markov.html (right click, select “View Source”). But the gist of it is this:

1) Begin with the basic probabilities of singles, doubles, walks, etc. (per PA)

2) Start with the bases-loaded, 2-out state: the probability of not scoring any more runs in the inning is simply the chance that the batter makes an out. Next comes the runner-on-3rd, 2-out state with one open base: now the batter can walk without a run scoring, or make an out, so the chance of not scoring is a little higher. If the only runner is on 3rd, there can be two walks without scoring, so you multiply the chance of a walk by the chance of not scoring in the previously mentioned (two-runner) state and again add the chance of making an out. It gets more and more complex as you go on, figuring in the chances of taking extra bases on hits or outs, and chaining off of one or more of the simpler states.

3) Find the chance of the runner scoring from a particular base by calculating 1 minus the average of the applicable not-scoring states (you have to convert from the chance of not scoring, which I believe Tango did to make the calculations more efficient).

4) Apply these chances to the chances of a batter getting on the applicable bases.

5) Find total runs per game by adding up the chances in step 4 (plus the chance of a HR) and multiplying by the team’s expected PA per game.
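The chaining in step 2 can be sketched in a few lines. This is a toy version of my own, not Tango’s full model: it assumes only walks and outs can occur without a run scoring, and the per-PA probabilities are made up for illustration.

```python
# Toy sketch of the step-2 chaining (simplified; invented probabilities).
p_walk = 0.08  # chance of a walk per PA (assumed)
p_out = 0.68   # chance of an out per PA (assumed)

# Bases loaded, 2 out: only an out prevents more scoring.
p_noscore_loaded = p_out
# Runner on 3rd plus one other runner, 2 out: a walk just loads the
# bases, so chain off the loaded state.
p_noscore_two_on = p_out + p_walk * p_noscore_loaded
# Runner on 3rd only, 2 out: two walks are possible before the bases
# are loaded, so chain off the two-runner state.
p_noscore_third_only = p_out + p_walk * p_noscore_two_on

# Step 3: convert back from "not scoring" to "scoring".
p_score_from_third = 1 - p_noscore_third_only
print(round(p_score_from_third, 4))  # → 0.2612
```

Each state’s no-score probability is built from states that were already solved, which is why the calculation starts from the bases-loaded case and works backward.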

Hopefully that makes some sense. It’s not simple, as you can see in the Calculations sheet.
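Steps 4 and 5 amount to a weighted sum. Here is a hedged sketch with invented numbers (none of these are Tango’s actual figures): the chance of reaching each base times the chance of scoring from it, plus home runs, scaled by plate appearances per game.

```python
# Hypothetical per-PA event probabilities (assumed for illustration).
p_1b, p_2b, p_3b, p_hr, p_bb = 0.155, 0.045, 0.005, 0.03, 0.08

# Assumed step-3 outputs: chance a runner eventually scores from each
# base (made-up values).
score_from_1b, score_from_2b, score_from_3b = 0.26, 0.42, 0.55

pa_per_game = 38  # assumed team PA per game

runs_per_pa = (
    (p_1b + p_bb) * score_from_1b   # singles and walks put him on 1st
    + p_2b * score_from_2b
    + p_3b * score_from_3b
    + p_hr                          # a home run always scores the batter
)
print(round(runs_per_pa * pa_per_game, 2))
```

With these placeholder numbers the sketch lands in the vicinity of a typical team’s runs per game, but the point is the shape of the calculation, not the values.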

For example, if there is a 50% chance of rain tomorrow given that it rained today, and a 10% chance of rain if it was dry today, that could be modeled by a Markov Chain. This is because the next day’s weather can be predicted based on the weather today, and ALL YOU NEED TO KNOW is the weather today. You would write it like this:

S = [0,1] ~ The State Space, or possible outcomes. 0=Dry, 1=Rain

P =     0    1
    0 [.9   .1]
    1 [.5   .5]

This is called the Probability Transition Matrix. The vertical 0,1 represents the weather today; the horizontal 0,1 represents the weather tomorrow. This tells you that if it is dry today (vertical 0), it will be dry tomorrow (horizontal 0) with probability .9, and rain tomorrow with probability .1. If it is raining today (vertical 1), it will be dry tomorrow (horizontal 0) with probability .5 and rain tomorrow with probability .5.

Feel free to ask any more questions! I hope my formatting comes out.

Secondly, I would say that you should write more articles, but I realize that it probably takes an immense amount of research for each one, so instead I’ll say that you should keep prioritizing quality over quantity.

Thirdly, how the hell do you do a Markov chain?

Would anybody be interested in a simulator of that?
