USCHO Fan Forum


Monty Hall, we have a PROBLEM

Re: Monty Hall, we have a PROBLEM

I apologize because it wasn't stated clearly that the guy wants to keep the 12 good bottles'o'wine. He doesn't want to toss 4 of them - just one.

The solution I proposed would allow him to keep 12. I like Unofun's solution better though, it is more reliable and more elegant and far more clever.

Maybe I didn't phrase mine properly before: reprised in white

open bottles 1 - 4, so rat A gets sample from bottle 1, rat B gets sample from bottle 2, etc.

5-1/2 hours later, repeat with bottles 5 - 8, so rat A gets sample from bottle 5, rat B from bottle 6, etc.

5-1/2 hours later, repeat with bottles 9 - 12, and so on. If rat A dies, it is from 1 or 5 or 9; and the timing of rat A's death determines which of the three bottles is responsible.
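For anyone who wants to convince themselves the timing trick works, here is a quick Python sketch. The poison's exact time-to-death isn't specified in the thread, so the DELAY value below is a made-up assumption; all that matters is that it is fixed and observable.

```python
# Staggered-dosing sketch: rat r (0-3) tastes bottle r + 4*round + 1
# at time round * INTERVAL. If the poison kills a fixed DELAY hours
# after ingestion, the (rat, death time) pair pins down the bottle.
DELAY = 24.0     # hypothetical fixed time-to-death in hours (assumed)
INTERVAL = 5.5   # hours between dosing rounds

def identify(poisoned):
    """Return the bottle an observer would deduce from the death."""
    for rnd in range(3):
        for rat in range(4):
            bottle = rat + 4 * rnd + 1
            if bottle == poisoned:
                death_time = rnd * INTERVAL + DELAY
                # The observer reverses the schedule from the death time:
                deduced_round = round((death_time - DELAY) / INTERVAL)
                return rat + 4 * deduced_round + 1
    return None  # no rat dies within the window

assert all(identify(b) == b for b in range(1, 13))
```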
 
Re: Monty Hall, we have a PROBLEM

FreshFish, your answer in #82 is a correct one, as correct as the 4-digit binary rat solution from unofan ... given that I mentioned the reception was "about 36 hours away". I should have said at the outset, however, that "even if the reception is 25 hours away, the solution remains the same". Since I didn't state that until several posts later, bravo for your answer based on the initial story.

But the concepts are along the same lines:
<table border="1"><tr><td>Unique combinations. There are more unique combinations among 4 rats than there are bottles to "test" (2 more, in fact). And there are 12 combinations of rat-times (three 5-1/2-hour intervals for 4 rats), meaning either one rat-time combination results in death, or no rats die, for 13 outcomes in all.</td></tr></table>

I wasn't looking for the latter answer, but I left myself open to it :).
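For the curious, unofan's binary idea (as I understand it from the posts above) can be sketched in a few lines of Python; the bottle count of 13 is my assumption based on the discussion.

```python
# Binary-encoding sketch: rat k sips from every bottle whose number
# has bit k set; the set of rats that die spells the bottle number.

def doses(n_bottles=13, n_rats=4):
    """Which bottles each rat samples."""
    return [[b for b in range(1, n_bottles + 1) if b >> k & 1]
            for k in range(n_rats)]

def identify(poisoned):
    dead_rats = [k for k, bottles in enumerate(doses())
                 if poisoned in bottles]
    return sum(1 << k for k in dead_rats)

assert all(identify(b) == b for b in range(1, 14))
```

With 4 rats there are 2^4 = 16 possible death patterns, which is why 4 rats cover more bottles than you might first guess.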
 
Knock yourself out, though I'm not sure what can be gained.

Mix two samples, each from six bottles. Give to two rats and observe.

If one dies, split that group of six bottles into two samples of three bottles each. Now give each sample to a rat.

One rat will die narrowing it to three bottles. Two rats left, give them each a sample from a single bottle of the three.

If either rat dies, you have the poisoned bottle. If neither dies, give the rat you like least a taste from the last remaining bottle just to be sure!
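Here is a quick Python sketch of those stages, assuming 12 bottles, exactly one poisoned, and enough time for a rat to die before the next stage begins. Each membership test stands in for "give a rat a mixed sample."

```python
def find_poison(poisoned):
    # Stage 1: two mixed samples of six bottles each.
    group = list(range(1, 7)) if poisoned <= 6 else list(range(7, 13))
    # Stage 2: split the suspect six into two samples of three.
    group = group[:3] if poisoned in group[:3] else group[3:]
    # Stage 3: two rats each taste one of the three remaining bottles;
    # if neither dies, it is the untasted third bottle.
    for bottle in group[:2]:
        if bottle == poisoned:
            return bottle
    return group[2]

assert all(find_poison(b) == b for b in range(1, 13))
```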
 
Re: Monty Hall, we have a PROBLEM

Mix two samples, each from six bottles. Give to two rats and observe.

If one dies, split that group of six bottles into two samples of three bottles each. Now give each sample to a rat.

One rat will die narrowing it to three bottles. Two rats left, give them each a sample from a single bottle of the three.

If either rat dies, you have the poisoned bottle. If neither dies, give the rat you like least a taste from the last remaining bottle just to be sure!
The only issue is we need time to do this, and I think we ran way over 36 hours.
 
Re: Monty Hall, we have a PROBLEM

In front of you are three boxes. Each box has two drawers on it, each drawer containing a coin. One box has two gold coins. Another has two silver coins. The third has one gold and one silver. You randomly pick a box and randomly open a drawer. You find a gold coin. At this point, what are the odds you have the box with two gold coins?

2/3. We can obviously rule out the box with two silvers. Now there are two boxes in play, but there are THREE gold coins available to select from them. It must be either gold coin A from the two-gold box (win), gold coin B from the two-gold box (win), or the gold coin from the gold/silver box (lose).
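If you don't trust the counting argument, a Monte Carlo check is easy, assuming a uniformly random choice of box and then of drawer:

```python
# Simulate the three-box puzzle and estimate P(two-gold box | saw gold).
import random

rng = random.Random(0)
boxes = [("G", "G"), ("S", "S"), ("G", "S")]
hits = trials = 0
for _ in range(100_000):
    box = rng.choice(boxes)
    coin = rng.choice(box)
    if coin == "G":              # condition on having seen a gold coin
        trials += 1
        hits += box == ("G", "G")
print(hits / trials)             # comes out near 0.667
```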
 
Re: Monty Hall, we have a PROBLEM

In front of you are three boxes. Each box has two drawers on it, each drawer containing a coin. One box has two gold coins. Another has two silver coins. The third has one gold and one silver. You randomly pick a box and randomly open a drawer. You find a gold coin. At this point, what are the odds you have the box with two gold coins?

I had to avert my eyes oh Lord when I unwittingly Replied With Quote :)

My answer:
I have a 2/3 probability of having picked the box with two gold coins. This is because I either (a) picked drawer 1 from the Gold-Gold box, (b) drawer 2 from the Gold-Gold box, or (c) the Gold-coin drawer from the Gold-Silver box. That last one had a 1/3 probability, so the remaining 2/3 chance is that I picked one of the Gold-Gold box drawers.
 
Re: Monty Hall, we have a PROBLEM

owls: it's a variation on the Monty Hall prob, I think:

You have a 50/50 shot at the two golds. Why? You eliminated the silver/silver at this point (much like removing one door that contains the goat).

Crap. You are right. I just looked it up. Stupid math.
 
Re: Monty Hall, we have a PROBLEM

I remember reading a story about a computer programming tournament: write a program that could win at rock-paper-scissors.

It was a round-robin format: each program would play a bunch of games against each of the other programs, aiming for the most overall wins. There were all kinds of debates about which strategies would be most effective.

One team got clever and simply entered several dozen identical programs, save for one feature. Each program had a distinct sequence of 5 moves it would throw to open each game, and they were programmed to recognize the other programs' opening sequences. If two of their programs met and recognized each other's sequences, one would immediately begin throwing nothing but rock and the other nothing but paper.

They took something like the top 10 spots AND the bottom 10 spots. Next year they limited entries to one per team :D
 
Re: Monty Hall, we have a PROBLEM

Twitch Boy's post reminded me of a different computer programming tournament.

The challenge: you and another player have an opportunity to enter into a transaction. You can either decline the transaction, or agree to proceed. If you agree to proceed, do you uphold your end of the bargain, or do you take the other player's money / goods and run?

The tournament tested a wide variety of strategies in "competition" with each other.

Obviously, in the short run, it would appear that you come out ahead by betraying the other.

However, in the long run, I'm sure many people could guess what the optimal strategy was, no matter how many variations and ideas were tested:

The optimal long-term strategy was called "tit for tat." It was very simple: cooperate on the first encounter, and from then on do whatever the other player did the last time you met. A betrayal is punished immediately, but the strategy forgives as soon as the other player cooperates again.

If you do a search for the name of the winning strategy in quotation marks, you will find a number of articles describing the tests and outcomes in more detail.
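Tit for tat is easy to try out yourself. Here is a minimal sketch; the payoff numbers are the textbook prisoner's-dilemma values, not anything from those articles.

```python
# Iterated prisoner's dilemma: two strategies and a match runner.
COOP, DEFECT = "C", "D"
PAYOFF = {("C", "C"): (3, 3), ("D", "D"): (1, 1),
          ("C", "D"): (0, 5), ("D", "C"): (5, 0)}

def tit_for_tat(my_history, their_history):
    # Cooperate first, then mirror the opponent's previous move.
    return their_history[-1] if their_history else COOP

def always_defect(my_history, their_history):
    return DEFECT

def play(p1, p2, rounds=20):
    h1, h2, s1, s2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = p1(h1, h2), p2(h2, h1)
        r1, r2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        s1 += r1; s2 += r2
    return s1, s2
```

Two tit-for-tat players cooperate forever; against an always-defector, tit for tat loses only the very first round and then matches it defection for defection.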
 
Re: Monty Hall, we have a PROBLEM

There are five pirates. (We'll call them A, B, C, D, and E in order of seniority.) They come across 100 gold coins and have to figure out how to divvy them up.

They decide to do this: Pirate A will propose a distribution. All five pirates (including A) then vote whether to accept the distribution, or throw Pirate A overboard. If they throw him overboard, Pirate B gets to propose a distribution, and the surviving pirates vote to accept it or throw him overboard, and so on. Majority rules, tie vote equals acceptance.

The pirates are logical beings and do not trust each other (no promises, retaliation considerations, pinky swears, etc.) Their motivations in order of priority:

1) Survive.
2) Get as many gold coins as you can out of the deal.
3) All other things being equal, throw someone overboard (because they're pirates and it's fun!)

How many gold coins can A get away with proposing for himself?

Believe it or not, A can get away with 98 coins.

We'll work backwards. Start with just D and E. D proposes D100/E0 and there's not a darn thing E can do about it. D accepts, E rejects, proposal passes.

Now bring back C. We know that if C gets thrown overboard, D will offer D100/E0 and E gets hosed, so C can simply offer E 1 coin to get his vote. C proposes C99/D0/E1, C and E accept, D rejects, proposal passes.

Now bring back B. We know that if B gets tossed C will offer C99/D0/E1 and D gets hosed. B can offer D 1 coin to obtain his vote, and doesn't have to worry about C or E (who have motivation to toss him anyway.) B proposes B99/C0/D1/E0, B and D accept, C and E reject, proposal passes.

Now bring back A. We know that if A gets tossed B will offer B99/C0/D1/E0, and C and E get hosed. All A has to do is offer C and E one coin each and he has their vote, and doesn't have to worry about B or D. A proposes A98/B0/C1/D0/E1, A, C, and E accept, B and D reject, proposal passes.

You can even extend this to larger crews, at least as long as there are enough coins to buy the necessary votes: simply offer every alternating pirate 0, 1, 0, 1, etc. and yourself the rest.
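The backward induction above is mechanical enough to code up. A small sketch in Python, where the tie-passes and toss-for-fun rules are baked into "buy a coin's worth of vote from exactly the pirates who would get zero next round":

```python
# Pirate-game backward induction. Index 0 is the current proposer;
# the list runs down the seniority order.

def allocation(n_pirates, coins=100):
    """Return the accepted proposal as a list of coin counts."""
    # Base case: two pirates left, the proposer takes everything
    # (his own vote makes a passing tie).
    alloc = [coins, 0]
    for n in range(3, n_pirates + 1):
        next_round = alloc
        # The new proposer buys the cheapest votes: every pirate who
        # would get 0 next round accepts 1 coin now.
        offers = [1 if c == 0 else 0 for c in next_round]
        alloc = [coins - sum(offers)] + offers
    return alloc

print(allocation(5))   # [98, 0, 1, 0, 1]
```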
 
Re: Monty Hall, we have a PROBLEM

Here is a famous contemporary unsolved puzzle: the Kryptos sculpture outside CIA headquarters.

Kryptos is a sculpture located on the grounds of CIA Headquarters in Langley, Virginia. Installed in 1990, its thousands of characters contain encrypted messages, of which three have been solved (so far). There is still a fourth section at the bottom consisting of 97 characters which remains uncracked. This webpage contains some information about the sculpture, including some photos collected from around the web, some rubbings of the sculpture taken by your intrepid webmistress, links to other articles and Kryptos discussion groups here and there, and information about other encrypted sculptures which have been created by the sculptor, Jim Sanborn.

http://www.elonka.com/kryptos/
 
Re: Monty Hall, we have a PROBLEM

There are five pirates. (We'll call them A, B, C, D, and E in order of seniority.) They come across 100 gold coins and have to figure out how to divvy them up.

They decide to do this: Pirate A will propose a distribution. All five pirates (including A) then vote whether to accept the distribution, or throw Pirate A overboard and let Pirate B propose. Majority rules, tie vote equals acceptance.

The pirates are logical beings and do not trust each other (no promises, retaliation considerations, pinky swears, etc.) Their motivations in order of priority:

1) Survive.
2) Get as many gold coins as you can out of the deal.
3) All other things being equal, throw someone overboard (because they're pirates and it's fun!)

How many gold coins can A get away with proposing for himself?

I LOVE this problem. However, I'm perplexed about whether the "accept / throw overboard" option extends to Pirates B, C, .... By omitting that, do you really mean it can't be extended on down to the 2nd-to-last pirate?
 
Re: Monty Hall, we have a PROBLEM

I LOVE this problem. However, I'm perplexed about whether the "accept / throw overboard" option extends to Pirates B, C, .... By omitting that, do you really mean it can't be extended on down to the 2nd-to-last pirate?

Yeah, I should have been more clear about that. That's how it works. If A gets tossed, B gets to propose and the surviving pirates vote to accept or toss him. Rinse and repeat until a proposal is accepted.
 
Re: Monty Hall, we have a PROBLEM

Yeah, I should have been more clear about that. That's how it works. If A gets tossed, B gets to propose and the surviving pirates vote to accept or toss him. Rinse and repeat until a proposal is accepted.
Interesting little thought experiment, but it sure doesn't translate to real life. If I'm pirate E, I'm *definitely* voting against pirate A if he proposes to keep 98 coins - I'd very happily "spend" my one coin just to bugger that selfish SOB. So if the question is posed as "how many coins can A get away with proposing for himself?" then the answer is absolutely not 98. I'd rather see D walk away with 100 when it comes down to the two of us - I couldn't blame D for taking all 100, but I absolutely have a problem with A trying to take 98 for himself first. One of your premises is that the pirates can't trust each other; pirate A certainly shouldn't trust me to take my one stinking coin if I'm pirate E even if that's the "rational" thing to do. In fact, in the real world, I would think that pirate A would need to propose something less than 20 coins for himself in order to ensure a majority vote. Even if he proposes an even split of 20 each, the remaining 4 would probably all think they can get more by splitting it fewer ways.

Edit: Also, E has no skin in the game - he can't be tossed overboard, because he'll never have to make a proposal. So if the best you offer him is 1 coin better than his status quo when there are 100 on the table, I'd predict a "no" vote virtually every time.
 
Interesting little thought experiment, but it sure doesn't translate to real life. If I'm pirate E, I'm *definitely* voting against pirate A if he proposes to keep 98 coins - I'd very happily "spend" my one coin just to bugger that selfish SOB. So if the question is posed as "how many coins can A get away with proposing for himself?" then the answer is absolutely not 98. I'd rather see D walk away with 100 when it comes down to the two of us - I couldn't blame D for taking all 100, but I absolutely have a problem with A trying to take 98 for himself first. One of your premises is that the pirates can't trust each other; pirate A certainly shouldn't trust me to take my one stinking coin if I'm pirate E even if that's the "rational" thing to do. In fact, in the real world, I would think that pirate A would need to propose something less than 20 coins for himself in order to ensure a majority vote. Even if he proposes an even split of 20 each, the remaining 4 would probably all think they can get more by splitting it fewer ways.

Edit: Also, E has no skin in the game - he can't be tossed overboard, because he'll never have to make a proposal. So if the best you offer him is 1 coin better than his status quo when there are 100 on the table, I'd predict a "no" vote virtually every time.

By pure logic, Twitch's answer is correct, except we know that pirates (or people, in that case) will never act completely rationally.
 
Re: Monty Hall, we have a PROBLEM

A hot topic that seems to be making the rounds on Facebook the last couple of days is "Take the High School Quiz", which tests an adult's ability to answer questions that a current high school senior should be able to answer, in a variety of areas (chemistry, algebra, physics, etc.).

One of the questions (to which I am sure many adults would leap to an incorrect conclusion) is this:

Q: A horse runs a two-lap race around a circular track. During the first lap, its average speed is 20 miles per hour. What must the horse's average speed be during the second lap so its average speed over the course of the entire two-lap race is 40 m.p.h.?

In light of the method used to answer the above question, I'll pose another one (see below, not on the Quiz). The parameters are different, but should the thinking used to solve this one be the same?

Q: Larry is missing a concert ticket to a show he's been waiting months to see. If he doesn't leave his home in half an hour, he won't get there on time. The only thing he knows for certain is that it's in a pocket of a pair of pants he owns. He also knows that he hasn't done laundry since he bought the ticket a week ago. The one thing about Larry, though, is that when something goes missing, it always ends up being in the last likely place he mentally lists, which is this: he owns 11 pairs of pants, all of which have four pockets (two front, two back).

He now goes on his search, and for the first 15 minutes, he manages to search at an average rate of 40 pockets per hour. Is it possible for him to find his concert ticket on time?



Bonus: What is the ESSENTIAL difference between the first and second question?
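Spoiler warning for anyone who wants to work these out first: the arithmetic for both questions fits in a few lines of Python. The lap length L below is arbitrary, since it cancels out.

```python
# Quick arithmetic check of both questions.
L = 1.0  # lap length in miles; the value cancels out

# Horse: averaging 40 mph over 2L miles allows 2L/40 hours total,
# but lap one at 20 mph already took L/20 = 2L/40 hours.
time_budget = 2 * L / 40
time_used = L / 20
print(time_budget - time_used)   # 0.0 -> no time left, so no finite speed works

# Larry: 11 pairs x 4 pockets = 44 pockets, ticket in the last one searched.
pockets = 11 * 4
searched = 40 * 0.25             # 40 pockets/hour for 15 minutes
remaining_time = 0.25            # hours left
required_rate = (pockets - searched) / remaining_time
print(required_rate)             # 136 pockets/hour -- brisk, but finite
```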
 
Re: Monty Hall, we have a PROBLEM

A hot topic that seems to be making the rounds on Facebook the last couple of days is "Take the High School Quiz", which tests an adult's ability to answer questions that a current high school senior should be able to answer, in a variety of areas (chemistry, algebra, physics, etc.).

One of the questions (to which I am sure many adults would leap to an incorrect conclusion) is this:

Q: A horse runs a two-lap race around a circular track. During the first lap, its average speed is 20 miles per hour. What must the horse's average speed be during the second lap so its average speed over the course of the entire two-lap race is 40 m.p.h.?

In light of the method used to answer the above question, I'll pose another one (see below, not on the Quiz). The parameters are different, but should the thinking used to solve this one be the same?

Q: Larry is missing a concert ticket to a show he's been waiting months to see. If he doesn't leave his home in half an hour, he won't get there on time. The only thing he knows for certain is that it's in a pocket of a pair of pants he owns. He also knows that he hasn't done laundry since he bought the ticket a week ago. The one thing about Larry, though, is that when something goes missing, it always ends up being in the last likely place he mentally lists, which is this: he owns 11 pairs of pants, all of which have four pockets (two front, two back).

He now goes on his search, and for the first 15 minutes, he manages to search at an average rate of 40 pockets per hour. Is it possible for him to find his concert ticket on time?



Bonus: What is the ESSENTIAL difference between the first and second question?

Larry needs to do laundry more often. He also shouldn't mix his dirty and clean clothes either.
 