File Exchange Pick of the Week

Puzzler: Coin game 11

Posted by Doug Hull,

This is a simplified version of the first problem I ever attempted to solve in MATLAB. For my original problem, I was trying to figure out the probability of being on any square in the board game Monopoly. That is a difficult problem from a book-keeping sense, but the essence of the problem is in today’s Puzzler.

[Image: coingame1.jpg, a diagram of the coin game rules]

With the rules shown in the diagram, what is the probability of being on each of the four squares after each of the first twenty turns?

I can think of two good ways of solving today’s Puzzler. I look forward to seeing how everyone else does it.

Take a little time and work on the Puzzler. I don’t want to give away strategies before you get a chance to work on the puzzle, so I am just showing the results of each method in the preview below without saying what each method was.

[There is a typo in the comments on this second video. It should say the probability of moving from Square 4 to Square 1, not the other way around. Sorry!]

Discussion of code and critiques should be placed in the comments. [Click here]

11 Comments

Here’s a brute-force, and perhaps less-than-elegant, solution to the problem.


gencoins = @(x)rand(2,x)>0.5; % anonymous function to generate the coins
probs(1,:) = [1,0,0,0]; % First step probability is always square 1
nTrials = 1e6; % How many trials are we running to determine probabilities
curPos = zeros(1,nTrials);
for n = 2:20
    coins = gencoins(nTrials);            % flip two coins for every trial
    twoTails = ~any(coins);               % trials where both coins came up tails
    curPos = mod(curPos+sum(coins),4);    % move forward by the number of heads
    curPos(twoTails) = 0;                 % two tails sends that trial back to square 1
    probs(n,:) = sum(bsxfun(@eq,curPos.',[0,1,2,3]))/nTrials;
end
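One quick way to look at the resulting probs matrix (just a sketch, assuming the variables defined above):

plot(1:20, probs, '.-')                              % one curve per square
legend('Square 1','Square 2','Square 3','Square 4')
xlabel('Step'), ylabel('Estimated probability')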

Thanks for running these puzzles, Doug!

Dan

Dan,

I like this solution. It follows exactly the idea I put in the last wrap-up where you make your code as readable as possible, and then modify it for speed when needed.

Your implementation is very similar to my solution (video one above), but yours is a lot faster. The reason yours is faster is the use of BSXFUN. In this case, a little less clarity is worth the performance increase. I have to admit, I needed to read the doc for this function, as I have never used it!
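For anyone else who has not met BSXFUN before, here is a minimal sketch of what that one line computes (the positions are made up purely for illustration):

pos    = [0 2 2 1 3 0];                   % example 0-based positions
hits   = bsxfun(@eq, pos.', [0 1 2 3]);   % 6-by-4 logical: hits(i,j) is true
                                          % when trial i sits on square j+1
counts = sum(hits)                        % returns [2 1 2 1]

BSXFUN expands the column vector and the row vector against each other, so the comparison happens for every trial and every square in a single call.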

I would have placed the definition of twoTails immediately before the line where it was used, so it is easier to follow, but this is pretty clear code with some amount of commenting where needed.

While this code is fast and pretty accurate, I am curious whether anyone will come up with other methods that are exact and even faster, like my second video.

Thanks!
Doug

Hello Doug,

Here is a function working for a slightly more general situation:

function probs = game(steps)
N = 4; % number of game states

% Basic transition matrices
% move(k) corresponds to "move k squares"
move = @(k)circshift(eye(N),[0 k]);
% go(k) corresponds to "go to square k"
go = @(k)ones(N,1)*((1:N)==k);

% Build the actual transition matrix
M = 1/2 * move(1) + 1/4 * move(2) + 1/4 * go(1);

% Compute probabilities 
if steps < inf
   % Finite number of steps: intermediate distributions are computed 
   % using successive matrix-vector products
   state = (1:N)==1;
   for i = 1:steps
       state = state*M;
       probs(i, 1:N) = state;
   end
else 
   % Infinite number of steps: stationary distribution is computed 
   % (assuming it exists) by solving a linear system
   probs = [zeros(1,N) 1] /[M-eye(N) ones(N,1)];
end    
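If that function is saved as game.m, it can be used like this (a small usage sketch):

probs20 = game(20);     % one row per turn; row 20 is after the twentieth turn
disp(probs20(20,:))
pLimit  = game(inf)     % long-run (stationary) distribution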

Alternatively, as a finite time-homogeneous Markov chain

p20 = (0.25 * [1 2 1 0; 1 0 2 1; 2 0 0 2; 3 1 0 0])^20;
disp(p20(1, :))
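If you also want the distribution after each of the first twenty turns, the same matrix can be applied one turn at a time (a sketch building on the line above):

P = 0.25 * [1 2 1 0; 1 0 2 1; 2 0 0 2; 3 1 0 0];
state = [1 0 0 0];            % start on square 1
probs = zeros(20, 4);
for n = 1:20
    state = state * P;        % advance one turn
    probs(n, :) = state;      % distribution after turn n
end
disp(probs)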

cheers,

Richard

Yeah, the Markov chain is of course the obvious method here, assuming you are familiar with statistics. Richard beat me to it, though. My solution is identical to his, although quite a bit more verbose:

startPos = [1; 0; 0; 0];
numberOfMoves = 20;

probabilityMatrix = ...
    [ 1/4 1/2 1/4 0  ; ...
      1/4 0   1/2 1/4; ...
      1/2 0   0   1/2; ...
      3/4 1/4 0   0  ]';

probabilityAfter20Turns = probabilityMatrix^numberOfMoves*startPos

Hi.
One general comment about how you write your blog. I noticed that Loren always puts a link to the comments section in the text of her entries. I read these blogs with a feed reader, so I don’t get the automatic footer with links to the comments section that I get when I visit the homepage. If there is a link in the text, I can go directly to the comments section, which is nice.

Sincerely
Daniel Armyr

Richard,

A beautiful solution. I like how you derived the M matrix rather than it being a “Magic Number Matrix” (which is what I did).

When I was designing this challenge, I was going to have an add-on where the number of coins and the number of squares were variable, but I decided that simpler was better. I see that your code would have handled such a complication nicely.
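As a rough sketch of what that add-on might look like, built from the move and go helpers above (the function name and the generalized "all tails sends you to square 1" rule are assumptions here, not part of the original challenge):

function M = coin_game_matrix(nCoins, nSquares)
% Flip nCoins fair coins and move forward by the number of heads;
% if every coin comes up tails, go back to square 1.
move = @(k) circshift(eye(nSquares), [0 k]);      % "move k squares"
go1  = ones(nSquares, 1) * ((1:nSquares) == 1);   % "go to square 1"
M = (1/2^nCoins) * go1;                           % the all-tails outcome
for k = 1:nCoins
    M = M + nchoosek(nCoins, k) / 2^nCoins * move(k);
end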

Thanks for a great solution that works in a different way than the earlier one.

Doug

Daniel, Richard,

I am surprised that so many people jumped on the Markov chain method; I expected that exhaustive search or Monte Carlo methods would dominate. Maybe that is because I worked with distributed computing clusters for MATLAB for a while but never really studied Markov chains.

Thanks for playing!
Doug

as with last week’s puzzler, when i had to second guess the task from a few examples (with consecutive bad proposals, of course), i’m again NOT able to run the videos from behind our two firewalls (the flash player [9,0,115,0] works fine, though, for other stuff): it shows about one third of the first and kills the browser for the second…
it’s just a bit frustrating…

best from zurich
urs

ps: maybe i’m not meant to contribute… or it’s just the date…

Output: x

p = [.25 .25 .5 .75 ; .5 0 0 .25 ; .25 .5 0 0 ; 0 .25 .5 0];  % column-stochastic transition matrix
[P, D] = eig(p);
D = (abs(D) > .99) .* D;                  % keep only the eigenvalue(s) of magnitude one
x = real(P * D * P^(-1) * [1 ; 0 ; 0 ; 0]);

The idea here is to use the eigenvalue decomposition, noting that the contributions of the eigenvalues with norm less than one vanish in the limit. Remove those eigenvalues, reassemble the matrix, and multiply by the starting vector to get the result. This is accurate to machine precision. The call to real removes the roundoff imaginary components.
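A quick sanity check (a sketch, using the p and x from the code above): raising p to a large power and applying it to the starting vector should give the same limit.

xCheck = p^500 * [1; 0; 0; 0];     % brute-force the limiting distribution
disp(max(abs(x - xCheck)))         % difference should be near machine precision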

These postings are the author's and don't necessarily represent the opinions of MathWorks.