May 27, 2015

Playing Mastermind with GoldSim

Posted by Ryan Roper

Last weekend, I went into nerd mode and created a GoldSim version of the classic game Mastermind. If you're not familiar with Mastermind, the Wikipedia article Mastermind (board game) gives a good overview. According to the article, "Mastermind...is a code-breaking game for two players. The modern game with pegs was invented in 1970 by Mordecai Meirowitz, an Israeli postmaster and telecommunications expert. It resembles an earlier pencil and paper game called Bulls and Cows that may date back a century or more." I play this game with my kids, and it often leaves me musing about what kind of strategy or algorithm I might devise to make guesses more systematically and crack the code. This musing usually lasts about 5 or 10 minutes before I decide it's not worth my time and I go on with my life. Recently, though, I thought it would be a fun and interesting exercise to build a GoldSim version of Mastermind.

Mastermind". Licensed under CC BY-SA 2.0 via Wikimedia Commons.
This doesn't just make for an entertaining blog post: the resulting GoldSim model also exercises some of GoldSim's most interesting features, including (1) advanced Dashboard capabilities such as dynamically hiding and showing controls, (2) Script elements, and (3) a Looping Container. I've also added the model to our Model Library: Mastermind in GoldSim.

The rules of the game as described in the Wikipedia article are as follows:

"The two players decide in advance how many games they will play, which must be an even number. One player becomes the codemaker, the other the codebreaker. The codemaker chooses a pattern of four code pegs. Duplicates are allowed, so the player could even choose four code pegs of the same color. The chosen pattern is placed in the four holes covered by the shield, visible to the codemaker but not to the codebreaker. The codebreaker may have a very hard time finding out the code.

"The codebreaker tries to guess the pattern, in both order and color, within twelve (or ten, or eight) turns. Each guess is made by placing a row of code pegs on the decoding board. Once placed, the codemaker provides feedback by placing from zero to four key pegs in the small holes of the row with the guess. A colored or black key peg is placed for each code peg from the guess which is correct in both color and position. A white key peg indicates the existence of a correct color code peg placed in the wrong position."

In the GoldSim version of the game, each realization is a game and each time step in a realization allows the codebreaker one guess. The duration of the simulation is 12 days with 1-day time steps, giving the codebreaker 12 guesses. Be sure to step the model to day 1 to register your first guess. NOTE that in order to play the game, you must uncheck the 'Begin simulation immediately on entering Run Mode' option in the 'General' tab of the Model Options dialog. Also, note that the model file is intended to be used with GoldSim version 11.1.3 or later.

Both the code-making and the code-breaking can be done by a human player or by GoldSim. Before play begins, these options are specified using dashboard drop-lists. If the player-specified option is selected for either, a set of sliders is dynamically shown or enabled to allow the code and the guesses to be specified. See the model library article for more details about the GoldSim implementation.

Playing around with the game in different modes, I observed a few interesting things.

(1) First, I set the game to 'Randomly-Generated Code' and 'Computer Guess'. On each realization, GoldSim randomly generates a new code to break. Running anywhere from 100 to 500 realizations, I observed that the computer guesses the correct code within 5 guesses 85% of the time and within 6 guesses 99% of the time. It almost never needs more than 7 guesses; only once did I see it take 8. (A sketch of one possible guessing strategy follows these observations.)

(2) Next, I set the game to 'Player-Specified Code' and 'Computer Guess'. In this mode, you can specify the code only at the very start, not between realizations. However, GoldSim starts each realization with a new initial guess, so the number of guesses needed differs from realization to realization. I was interested to see whether some code patterns are easier to crack than others, so I tried the following patterns: AAAA, AAAB, AABB, ABCC and ABCD. For all but the AAAA pattern, the computer guessed the right code within 5 guesses 85% of the time; the AAAA pattern was easier, with the code cracked within 5 guesses 95% of the time.
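For readers curious how a codebreaker can converge this quickly, here is a hedged sketch of one well-known strategy: guess any code that is still consistent with all of the feedback received so far. The strategy, color set, and names below are my own illustration (reusing score_guess from the earlier sketch), not necessarily what the GoldSim model's logic does, and the guess-count statistics it produces will differ somewhat from those reported above.

```python
import itertools
import random

COLORS = "ABCDEF"  # six peg colors, four positions, as in the standard game

def random_consistent_solver(secret, max_turns=12):
    """Count the turns needed when each guess is drawn at random from the
    codes still consistent with every (black, white) response so far."""
    candidates = ["".join(p) for p in itertools.product(COLORS, repeat=4)]
    for turn in range(1, max_turns + 1):
        guess = random.choice(candidates)
        feedback = score_guess(secret, guess)
        if feedback == (4, 0):
            return turn
        # Discard every code that would not have produced this feedback.
        candidates = [c for c in candidates if score_guess(c, guess) == feedback]
    return None  # not solved within the allowed turns

# Rough distribution over many random secrets (analogous to realizations).
results = [random_consistent_solver("".join(random.choices(COLORS, k=4)))
           for _ in range(500)]
```

Filtering the candidate list this way is what keeps the guess count small: each response typically eliminates the large majority of the remaining codes.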

Anyway, I hope you have as much fun with it as I did!
