How to equalize the chance of throwing the highest dice? (Riddle)

I just invented the following riddle, doing statistics work. (I actually need the answer!)


Imagine a dice game with the aim of throwing the highest number.
The dice are special: they have infinitely many sides, with numbers ranging from 0 to 1 (uniform, no bias).

There are 2 players: player A throws 3 dice, player B throws 7
dice. This means player B has a 7/10 chance of winning, i.e. of
throwing the highest number of all 10.
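
The 7/10 claim is easy to sanity-check by simulation. A minimal sketch using `numpy` (the seed and trial count are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 200_000
a_max = rng.random((trials, 3)).max(axis=1)  # player A's best of 3 throws
b_max = rng.random((trials, 7)).max(axis=1)  # player B's best of 7 throws
print((b_max > a_max).mean())  # close to 7/10
```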

Now, to bring fairness to the situation, the players agree to multiply each number
thrown by player-A by a certain constant. What is the value of this
constant, so that each player has a 50% chance of winning?

Can you find a general formula to determine this constant, based on the amounts of dice the 2 players have?

(And in case this is a known problem: do you know what it is called?)

Considerations/ Spoiler:
The adjustment constant does not depend only on the ratio of throws (3:7 in this case); the absolute numbers matter too. For example, if the players had 300 and 700 throws, the constant would be much closer to 1.

My intuition: I think a good estimate is to assume the throws are evenly spaced. For example, the 3 throws land at 0.25, 0.5 and 0.75, so the highest number would be 0.75. Do the same for player B, and the ratio of the expected highest numbers gives the adjustment constant. Unfortunately that's just my intuition and I am not sure it is correct.

I am thankful for all the answers, but surprised that nobody used an approach similar to the one I described. For completeness, here is where I went wrong:

I assumed the expected maximum of n throws would be 1 - 1/(n+1) = n/(n+1), which is correct, as confirmed by the following simulation:

    import numpy as np
    import matplotlib.pyplot as plt

    x, y, y2 = [], [], []
    for n in range(1, 21):
        temp = []
        for _ in range(10000):
            sample = np.random.random_sample(n)
            temp.append(sample.max())
        x.append(n)
        y.append(np.mean(temp))     # simulated mean of the maximum
        y2.append(1 - 1/(n + 1))    # theoretical value n/(n+1)

    plt.plot(x, y, "o", label="simulated")
    plt.plot(x, y2, label="1 - 1/(n+1)")
    plt.title("Mean max = 1 - 1/(n+1)")
    plt.xlabel("Number of throws")
    plt.ylabel("Mean max of throws")
    plt.legend()
    plt.show()


This means that if we multiply each of player A's n throws by a constant c, the expected maximum would equal that of player B's m throws, if we use this formula for c:

c = \frac{1 - \frac{1}{m+1}}{1 - \frac{1}{n+1}} (or)
c = \frac{m(n+1)}{n(m+1)}

But this is wrong, because the riddle does not ask to equalize the means of the maxima. Instead it asks to equalize the win probability, i.e. the rank-sum of the two players' distributions of maxima (if we ranked every maximum across both distributions).
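
This distinction can be made concrete by simulation. The sketch below (seed and trial count are my assumptions) uses the mean-equalizing constant c = m(n+1)/(n(m+1)): it matches the means of the scaled maxima, yet player A's win probability is noticeably above 0.5:

```python
import numpy as np

n, m = 3, 7
rng = np.random.default_rng(1)
trials = 200_000
a = rng.random((trials, n)).max(axis=1)
b = rng.random((trials, m)).max(axis=1)

c_mean = m * (n + 1) / (n * (m + 1))   # equalizes the expected maxima (7/6 here)
print(np.mean(c_mean * a), np.mean(b))  # nearly identical means (both ~0.875)
print((c_mean * a > b).mean())          # win probability well above 0.5
```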

Here, just for illustration, I show that my formula fails to fit the median of the maxima:



Multiply by \left(\frac{2(7)}{3+7}\right)^{1/3} \approx 1.1187
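
A quick Monte Carlo check of this constant (a sketch; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
c = (2 * 7 / (3 + 7)) ** (1 / 3)              # ≈ 1.1187
a = rng.random((500_000, 3)).max(axis=1)      # player A's maxima
b = rng.random((500_000, 7)).max(axis=1)      # player B's maxima
print((c * a > b).mean())                     # close to 0.5
```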

More generally, suppose that player A rolls n times and player B rolls m times (without loss of generality, we assume m \geq n). As others have already noted, the (unscaled) score of player A is
X \sim Beta(n, 1)
and the score of player B is
Y \sim Beta(m, 1)
with X and Y independent. Thus, the joint distribution of X and Y is
f_{XY}(x, y) = nmx^{n-1}y^{m-1}, \ 0 < x, y < 1.
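
The Beta(n, 1) claim holds because P(\max \leq t) = t^n for n independent Uniform(0, 1) throws, which is exactly the Beta(n, 1) CDF. A small empirical check (sample size is my assumption):

```python
import numpy as np

n = 3
rng = np.random.default_rng(3)
mx = rng.random((200_000, n)).max(axis=1)
for t in (0.3, 0.6, 0.9):
    # empirical P(max <= t) vs. the Beta(n, 1) CDF t^n
    print((mx <= t).mean(), t ** n)
```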

The goal is to find a constant c such that

P(Y \geq cX) = \frac{1}{2}.

This probability can be found in terms of c, n and m as follows.

\begin{align}
P(Y \geq cX) &= \int_0^{1/c}\int_{cx}^1 nmx^{n-1}y^{m-1}\,dy\,dx \\[1.5ex]
&= \int_0^{1/c} nx^{n-1}\left(1 - (cx)^m\right)dx \\[1.5ex]
&= c^{-n} - \frac{n}{n+m}\,c^{-n} \\[1.5ex]
&= c^{-n}\left\{\frac{m}{n+m}\right\}
\end{align}

(The limits assume c \geq 1, which holds since m \geq n.)

Setting this equal to 1/2 and solving for c yields

c = \left(\frac{2m}{n+m}\right)^{1/n}.
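
As a sketch, the formula wrapped in a small helper (the name `fair_constant` is mine). It reproduces the 1.1187 value and also confirms the questioner's observation that for 300 vs. 700 throws the constant is much closer to 1:

```python
def fair_constant(n: int, m: int) -> float:
    """Scale factor for player A's n dice against player B's m dice (m >= n)."""
    return (2 * m / (n + m)) ** (1 / n)

print(fair_constant(3, 7))      # ≈ 1.1187
print(fair_constant(300, 700))  # ≈ 1.0011, much closer to 1
```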

Source: Link, Question Author: KaPy3141, Answer Author: knrumsey
