Truncated Paper-Rock-Scissors.

(If you haven’t read Part II of my Game Theory essays, you may be missing out.)

OK, let’s look at some Known and Unknown Games.

Rock-Paper-Scissors (henceforth RPS) is a Known game. The ultimate judge of your opponent’s psychology, or a trivial exercise in picking the best random move? It’s the latter, and sadly, this is the case for many games that can be reduced to RPS.

(If you don’t Know RPS, consider: if your opponent picks randomly, you have a 1/3 chance of tying, 1/3 of winning, and 1/3 of losing. All of your choices are equivalent. Congratulations, you now Know RPS.)

Consider the situation of Truncated Paper-Rock-Scissors (henceforth RPS-). In RPS-, there are two players, A and B. Player A is prohibited from using rock. Player B is aware of this limitation.

Who is more likely to win? How much more likely? What strategies might A employ? What about B? (Note: ‘Choose randomly’ is an acceptable strategy; however, because R, P, and S now have actual functional differences, the game drifts back towards an exercise in psychology.)
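As a sanity check (my own sketch, not from the post), enumerating all six possible throws shows that under uniformly random play each outcome (A wins, B wins, tie) still lands exactly 1/3 of the time, which is why random play alone doesn’t settle who’s favored:

```python
from itertools import product

BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def outcome(a, b):
    """Return 'A', 'B', or 'tie' for a single throw."""
    if a == b:
        return "tie"
    return "A" if BEATS[a] == b else "B"

a_moves = ["paper", "scissors"]          # rule: A is barred from rock
b_moves = ["rock", "paper", "scissors"]  # B may throw anything

counts = {"A": 0, "B": 0, "tie": 0}
for a, b in product(a_moves, b_moves):
    counts[outcome(a, b)] += 1

print(counts)  # {'A': 2, 'B': 2, 'tie': 2}
```

So per throw the odds are still even; the asymmetry only matters once the players start reasoning about each other.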

If a series of games is played, where A must win one game to win the series and B must win two, who is more likely to win, assuming A and B both choose from their rational choices*? (*hint: this is different from choosing randomly.) Assume that neither player wants the game to go on forever.

Eventually, extensive analysis of RPS- will turn it into an empty game. I just came up with it last night though, and I (believe that I) Know RPS-, yet I still find it interesting. So there might be salvation for equilibria*-less games. Gamist design depends on it.

(*A big word economists like to use, and subject of a future essay.)


“Assume that neither player wants the game to go on forever.”

It would muck up my math, but if I were to actually play this, I’d say that if the game lasts 10 throws, then player A wins. (You can increase this, to say, 20, if you want to reduce the issue of a possible tie-fest, and decrease it, to say 7, if you want to really put pressure on B.)

B wins after 2 turns.

Short version:

B plays scissors every turn. Since A would rather lose than tie, A plays paper.

Long version:

Rules

1. A can’t use rock (B knows this)

2. A needs to win 1 game, B needs to win 2 games

3. neither player wants the game to go on forever. i think, logically, what you mean is that if a player no longer has a chance to win, they would rather lose than tie forever?

Knowing (1), then by (3) player B would never play paper, because it could only result in a loss or a tie.

But, A, knowing this would never play scissors, as that would then only result in a loss or tie.

So, B would simply play scissors every turn.
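The elimination argument above can be sketched mechanically (my own illustration; the pruning rule, drop any move that can only lose or tie, is how I read rule 3):

```python
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def prune(mine, theirs):
    # keep only my moves that can still beat something the opponent might throw;
    # per rule (3), a move that can only lose or tie is worthless
    return {m for m in mine if any(BEATS[m] == t for t in theirs)}

a = {"paper", "scissors"}              # rule (1): no rock for A
b = {"rock", "paper", "scissors"}

b = prune(b, a)
print(sorted(b))  # ['rock', 'scissors'] -- B drops paper
a = prune(a, b)
print(sorted(a))  # ['paper'] -- A drops scissors
b = prune(b, a)
print(sorted(b))  # ['scissors'] -- B drops rock: scissors every turn
```

Note that one more round of pruning would empty A’s set entirely (paper no longer beats anything B still throws), an early hint that no stable pure strategy exists here.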

Consider this: Playing Scissors every turn is a safe strategy for B. If A knows B will do this, A can play scissors every turn.

It’s interesting that in my mental calculus, I had B willing to change and you had A willing to change. Because if B changes to rock, he has a 50% chance to win, and a 50% chance to lose.

See also my note in comment 1: here B has a very strong incentive not to drag out the game.

(That’s there because B can force the game to go on forever; A can’t.)

Well, yes, I took the statement “Assume that neither player wants the game to go on forever” to mean both players found playing forever equally distasteful. But I guess B could throw down paper to A’s scissors just as easily.

If I understand your comment 1 correctly, you’re imposing another rule: A wins after 10 rounds. It doesn’t really matter what this number is as long as it’s present.

Then, everyone has to play randomly again, because

– no matter what B plays, A can tie or win

– no matter what A plays, B can win

so any strategy can be beaten by an inverse counter strategy.

Randomly, A has amazing odds, because they have a 1/3 chance of winning on any given turn AND if B doesn’t win in 10 rounds they win as well. The exact odds are hard to calculate because it’s a first one to get # wins scenario, but I imagine A has something like 75% chance of winning.

Scratch that, I was just being lazy. If A wins after n turns, the odds of B winning for n>2 are:

the sum k=2 to n of 2^(k-2)/3^k

which makes it pretty easy to see that as n increases, the odds of B winning increase.

at 5, the odds of B winning are 26.749%

at 10, 32.466%

at 20, 33.318%

yeah, i’ll give you 3 guesses what limit that sum approaches.
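The quoted percentages follow from the sum of 2^(k-2)/3^k for k=2 to n (the reading of the formula that reproduces them); a quick check of the values, plus the closed form that gives away the limit, written by me as a sketch:

```python
from fractions import Fraction

def b_win(n):
    """Sum_{k=2}^{n} 2^(k-2) / 3^k -- B's odds of winning within n rounds."""
    return sum(Fraction(2 ** (k - 2), 3 ** k) for k in range(2, n + 1))

for n in (5, 10, 20):
    print(n, round(float(b_win(n)) * 100, 3))  # 26.749, 32.466, 33.318

# Closed form: 1/3 - (1/2)(2/3)^n, so the sum approaches 1/3 as n grows.
assert b_win(10) == Fraction(1, 3) - Fraction(1, 2) * Fraction(2, 3) ** 10
```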

Here’s my analysis. Let’s suppose that n is infinity, and A and B choose randomly. A’s rational options are scissors and paper. B’s are rock and scissors. (B has no possible reason to play paper, so that’s out of the rational options, but rock can be worth it for the chance to smash scissors.)

I put A’s odds of winning at 5/9. Why? There are four matchups:

A S / B S – tie

A S / B R – A loses

A P / B S – A loses

A P / B R – A wins

Let’s discount the ties, since we have an infinite number of games to work with. A has a 1/3 chance of winning any single relevant game, and B has a 2/3 chance of winning.

What’s A’s chance of winning the first game? 1/3. 2/3 of the time, it goes to a second game. A’s chance of winning that game is 1/3, so it’s 3/9 that A wins on the first game, and 2/9 that they win on the second, for a total of 5/9. (If A doesn’t win by the second game, by definition B has.)
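The bookkeeping checks out; as a quick sketch of my own, with ties discounted as described:

```python
from fractions import Fraction

# Each decisive (non-tie) game: A wins with prob 1/3, B with prob 2/3.
p_a = Fraction(1, 3)

# A needs one win before B collects two:
# A takes game 1, or B takes game 1 and A takes game 2.
p_series = p_a + (1 - p_a) * p_a
print(p_series)  # 5/9
```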

Decreasing n, as you noted, makes A’s chances of winning larger.

To make a larger point, I suspect if you and I played this game it would suck, because you’d be choosing randomly and I’d know that.

If I was playing with someone where I had to analyze their risk-taking stance, it’d be interesting for a couple of throws.

And at the very least, I’ve enjoyed our correspondence on the matter.

yeah, you’re right, B wouldn’t play paper.

so, the odds of B winning for n>=2 are

the sum k=2 to n of 1/2^k

a much less complex sum. math’s been a while, but i think you can simplify it to (2^(n-1)-1)/2^n

at any rate, as n increases, B’s odds approach 50%

n=1, 0%

n=2, 25%

n=3, 37.5%

n=4, 43.75%

n=5, 46.875%

n=10, 49.902%
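A quick check (my own) that the sum of 1/2^k from k=2 to n really does simplify to (2^(n-1)-1)/2^n and reproduces the listed percentages:

```python
from fractions import Fraction

def b_odds(n):
    """Sum_{k=2}^{n} 1/2^k -- B's odds under random play with an n-round cap."""
    return sum(Fraction(1, 2 ** k) for k in range(2, n + 1))

for n in (2, 3, 4, 5, 10):
    closed = Fraction(2 ** (n - 1) - 1, 2 ** n)  # the simplification above
    assert b_odds(n) == closed
    print(n, round(float(closed) * 100, 3))  # 25.0, 37.5, 43.75, 46.875, 49.902
```

Since 1/2 - 1/2^n approaches 1/2 from below, B’s odds approach but never reach 50%.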

yeah, i should have figured B wouldn’t play paper.

wow, you’ve come up with a much more interesting situation than i first thought.

note, unlike other versions of RPS, in this one if you know your opponent is guaranteed to play randomly, you can improve your odds.

-if B knows A is playing randomly, B could just play scissors

-if A knows B is playing randomly (between rock and scissors) then A would always throw paper on the first throw (and then proceed randomly), as a 50% chance of winning is better than 25%

because of this, the first throw would always be scissors / scissors: B knows A wants to throw paper on the first throw, so B throws scissors, and for that reason A has to block with scissors. by induction this would be the next throw as well, until you reach a point significantly close to n, where n is the number of rounds you are playing.

so really what gets interesting is how close to n B decides to start playing randomly. it’s in B’s best interest to start playing randomly as soon as possible, because the longer B waits the further their odds drop from 50%. but if A switches to paper while B is still throwing scissors, then A automatically loses.

for example, say n is 6.

imagine a perfectly random B versus an A who throws paper first:

B’s odds of winning are Bwin6 – (1/8) = 35.94%

cf. Bwin6 = 48.4375%

but B throws scissors first versus A who throws paper first = Bwin6 + 1/4 = 73.4375%

and B throws scissors first versus A who throws randomly = (Bwin6 – 1/4)*1.5 + 1/4 = 60.16%

so, B would like to throw scissors first, except B throws scissors first versus A throws scissors first = Bwin5 = 46.875%

so really, the best strategy for B is to play scissors until an arbitrary point and then start playing randomly. to combat that, A has to play scissors and try to switch to paper exactly when B switches, but not before (which is an automatic win for A).

as such, you’ve created a meta-game called when to switch. because if at any time you switch and A switches as well you lose the game, the best time to switch is dun-dun-dun random, with sooner being better (even as early as the first round). so basically, if i were to play this game and n was set to something like 8, i would randomly pick a number between say 1 and 5 in my head, play scissors until then, and then play rock, continuing randomly from that point.

so…what’s the best defense against that? (and is there a better offense against it, or is stability reached?)

ok, it took some talking back and forth, but i think it’s time to beat this mini-game.

best strategy for B:

randomly pick 2 numbers between 1 and n and on those throws play rock. play scissors on everything else.

odds of winning rapidly approach 100% as n increases.

btw, there’s no stable solution, because the best defense against this is to

(A) randomly choose two throws to play paper

which can be better combated with the strategy (B) always play scissors (100%)

which can be better combated with the strategy (A) always play scissors (100%)

bringing us full circle since this strategy will always beat that.

at any rate, the 2 in ‘pick 2 numbers’ really only exists for the possibility you’re playing an A who throws all scissors. provided A switches at least once, then picking 1 number is enough.

however, if you do pick 1 then not switching becomes a valid strategy for A, so you have to cover it.

sorry for so many comments, i just keep thinking of things.

until n=7, random is still best at the odds given before:

— random —

n=1, 0%

n=2, 25%

n=3, 37.5%

n=4, 43.75%

n=5, 46.875%

n=6, 48.4375%

— pick 2 —

n=7, 54.76%

n=8, 60.71%

n=9, 65.28%

n=10, 68.89%

n=20, 84.74%

n=40, 92.44%

n=100, 96.99%

etc
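Out of curiosity, here is a Monte Carlo sketch (mine, not from the thread) of B’s ‘pick 2 rock rounds’ strategy. The opponent model is an assumption: A plays the mirrored defense mentioned above, paper on 2 randomly chosen rounds and scissors otherwise, so these percentages need not match the table, which presumably assumes a different A:

```python
import random

def play(n, rng):
    """One series of up to n throws; True if B wins (2 round-wins before A gets 1)."""
    b_rock = set(rng.sample(range(n), 2))   # B: rock on these two rounds, else scissors
    a_paper = set(rng.sample(range(n), 2))  # assumed A: paper on these two rounds, else scissors
    b_wins = 0
    for t in range(n):
        a = "paper" if t in a_paper else "scissors"
        b = "rock" if t in b_rock else "scissors"
        if (a, b) == ("paper", "rock"):
            return False                    # A wins a round, and so the game
        if (a, b) in {("scissors", "rock"), ("paper", "scissors")}:
            b_wins += 1                     # B takes the round
            if b_wins == 2:
                return True
        # scissors / scissors ties; play on
    return False                            # timeout rule: A wins

rng = random.Random(1)
trials = 50_000
for n in (7, 10, 20):
    p = sum(play(n, rng) for _ in range(trials)) / trials
    print(n, round(100 * p, 2))
```

Against this particular A, B’s odds come out roughly 63%, 74%, and 87% for n = 7, 10, 20, so the broad pattern (pick-2 gets stronger as n grows) holds here too.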
