What is the variance of the maximum of a sample?

I’m looking for bounds on the variance of the maximum of a set of random variables. In other words, I’m looking for a closed-form expression $B$ such that
$$\mbox{Var}(\max_i X_i) \leq B \enspace,$$
where $X = \{ X_1, \ldots, X_M \}$ is a fixed set of $M$ random variables with finite means $\mu_1, \ldots, \mu_M$ and variances $\sigma_1^2, \ldots, \sigma_M^2$.

I can deduce that
$$\mbox{Var}(\max_i X_i) \leq \sum_i \sigma_i^2 \enspace,$$
but this bound seems very loose. A numerical test seems to indicate that $B = \max_i \sigma_i^2$ might be a possibility, but I have not been able to prove this. Any help is appreciated.

For any $n$ random variables $X_i$, the best general bound is
$\newcommand{\Var}{\mathrm{Var}}\Var(\max_i X_i) \le \sum_i \Var(X_i)$, as stated in the original question.
Here is a proof sketch. If $X, Y$ are IID, then $E[(X-Y)^2] = 2\Var(X)$. Given a vector of possibly dependent variables $(X_1,\ldots,X_n)$, let $(Y_1,\ldots,Y_n)$ be an independent copy of that vector with the same joint distribution; then $\max_i X_i$ and $\max_i Y_i$ are IID. Since $|\max_i X_i - \max_i Y_i| \le \max_i |X_i - Y_i|$, the union bound gives, for any $r>0$,
$$P\bigl[\,|\max_i X_i - \max_i Y_i|^2 > r\,\bigr] \le \sum_i P\bigl[\,|X_i - Y_i|^2 > r\,\bigr] \enspace,$$
and integrating this $dr$ from $0$ to $\infty$ yields $E[(\max_i X_i - \max_i Y_i)^2] \le \sum_i E[(X_i - Y_i)^2]$, which by the identity above is the claimed inequality.
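A quick Monte Carlo sanity check of the inequality (the correlated Gaussian example here is my own illustrative choice, not from the argument above):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# Hypothetical dependent example: a shared factor Z plus independent noise,
# so each X_i ~ N(0, 2) and the X_i are positively correlated.
Z = rng.standard_normal(trials)
X = Z[:, None] + rng.standard_normal((trials, n))

var_max = X.max(axis=1).var()   # sample estimate of Var(max_i X_i)
sum_var = X.var(axis=0).sum()   # sample estimate of sum_i Var(X_i), about 2n

print(var_max, sum_var)         # var_max should not exceed sum_var
```

Here the gap is large (the shared factor makes the maximum much less variable than the sum-of-variances bound suggests), which is consistent with the bound being loose for strongly dependent variables.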
The bound cannot be improved in general. If the $X_i$ are IID indicators of events of probability $\epsilon$,
then $\max_i X_i$ is the indicator of an event of probability $1-(1-\epsilon)^n = n\epsilon+O(n^2 \epsilon^2)$. Fixing $n$ and letting $\epsilon$ tend to zero, we get $\Var(X_i)=\epsilon-\epsilon^2$ while $\Var(\max_i X_i)= n\epsilon +O(n^2\epsilon^2)$, so the ratio $\Var(\max_i X_i)/\sum_i \Var(X_i)$ tends to $1$. In particular, $\Var(\max_i X_i)$ exceeds $\max_i \sigma_i^2$ by a factor of roughly $n$ here, so the conjectured bound $B = \max_i \sigma_i^2$ fails.
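The indicator example can be checked numerically as well (parameter values below are illustrative choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps, trials = 10, 0.001, 1_000_000   # illustrative parameters

X = rng.random((trials, n)) < eps        # n IID indicators with P[X_i = 1] = eps
var_single = eps * (1 - eps)             # exact Var(X_i)
var_max = X.any(axis=1).astype(float).var()  # sample Var(max_i X_i)

# Var(max_i X_i) is roughly n * Var(X_i) for small eps, so the conjectured
# bound B = max_i sigma_i^2 fails by a factor of about n.
ratio = var_max / var_single
print(ratio)  # close to n = 10 when eps is small
```

With $n$ fixed and $\epsilon$ shrunk further, the ratio approaches $n$ exactly, matching the limit computed above.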