I have some covariance matrix
$$A = \begin{bmatrix}121 & c\\c & 81\end{bmatrix}$$
The problem is to determine the possible values of $c$.
Now I know that the elements of this matrix are given by the usual definition of the covariance,
$$ \frac{1}{N-1} \sum_{i=1}^N (X_i - \bar{x})(Y_i - \bar{y})$$
and so e.g.
$$ \frac{1}{N-1} \sum_{i=1}^N (X_i - \bar{x})^2 = 121$$
$$ \frac{1}{N-1} \sum_{i=1}^N (Y_i - \bar{y})^2 = 81$$
But I can’t see how to go from here to determining $c$?
Answer
You might find it instructive to start with a basic idea: the variance of any random variable cannot be negative. (This is clear, since the variance is the expectation of the square of something and squares cannot be negative.)
Any $2\times 2$ covariance matrix $\mathbb A$ explicitly presents the variances and covariances of a pair of random variables $(X,Y),$ but it also tells you how to find the variance of any linear combination of those variables. This is because whenever $a$ and $b$ are numbers,
$$\operatorname{Var}(aX+bY) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y) + 2ab\operatorname{Cov}(X,Y) = \pmatrix{a&b}\mathbb A\pmatrix{a\\b}.$$
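This identity holds exactly for sample variances and covariances as well, so it is easy to check numerically. Here is a sketch with NumPy; the data and the coefficients $a, b$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)             # arbitrary sample for X
y = 0.5 * x + rng.normal(size=1000)   # arbitrary correlated sample for Y
a, b = 3.0, -2.0

# Left side: sample variance of aX + bY (ddof=1 matches np.cov's default)
lhs = np.var(a * x + b * y, ddof=1)

# Right side: the quadratic form (a b) A (a b)^T, with A the sample covariance matrix
A = np.cov(x, y)
rhs = np.array([a, b]) @ A @ np.array([a, b])

print(np.isclose(lhs, rhs))  # the two sides agree up to rounding
```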
Applying this to your problem we may compute
$$\begin{aligned}
0 \le \operatorname{Var}(aX+bY) &= \pmatrix{a&b}\pmatrix{121&c\\c&81}\pmatrix{a\\b}\\
&= 121 a^2 + 81 b^2 + 2c\,ab\\
&=(11a)^2+(9b)^2+\frac{2c}{(11)(9)}(11a)(9b)\\
&= \alpha^2 + \beta^2 + \frac{2c}{(11)(9)} \alpha\beta.
\end{aligned}$$
The last few steps in which $\alpha=11a$ and $\beta=9b$ were introduced weren’t necessary, but they help to simplify the algebra. In particular, what we need to do next (in order to find bounds for $c$) is complete the square: this is the process emulating the derivation of the quadratic formula to which everyone is introduced in grade school. Writing
$$C = \frac{c}{(11)(9)},\tag{*}$$
we find
$$\alpha^2 + \beta^2 + \frac{2c}{(11)(9)} \alpha\beta = \alpha^2 + 2C\alpha\beta + \beta^2 = (\alpha+C\beta)^2+(1-C^2)\beta^2.$$
Because $(\alpha+C\beta)^2$ and $\beta^2$ are both squares, they are not negative. Therefore if $1-C^2$ also is nonnegative, the entire right side is not negative and can be a valid variance. Conversely, if $1-C^2$ is negative, you could set $\alpha=-C\beta$ to obtain the value $(1-C^2)\beta^2\lt 0$ on the right hand side, which is invalid.
You therefore deduce (from these perfectly elementary algebraic considerations) that
If $A$ is a valid covariance matrix, then $1-C^2$ cannot be negative.
Equivalently, $|C|\le 1,$ which by $(*)$ means $-(11)(9) \le c \le (11)(9).$
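The bound $-99 \le c \le 99$ can be confirmed numerically via eigenvalues: a symmetric matrix is a valid covariance matrix exactly when it has no negative eigenvalues. A sketch using NumPy's `eigvalsh` (the sample values of $c$ are arbitrary):

```python
import numpy as np

def min_eig(c):
    """Smallest eigenvalue of the candidate covariance matrix [[121, c], [c, 81]]."""
    return np.linalg.eigvalsh(np.array([[121.0, c], [c, 81.0]])).min()

# Inside the bound |c| <= 99: no negative eigenvalues (up to rounding)
for c in (-99, -50, 0, 50, 99):
    assert min_eig(c) >= -1e-9

# Just outside the bound: a negative eigenvalue, so not a covariance matrix
assert min_eig(100) < 0
assert min_eig(-100) < 0
print("valid exactly when -99 <= c <= 99")
```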
There remains the question whether every such $c$ does correspond to an actual variance matrix. One way to show this is true is to find a random vector $(X,Y)$ with $\mathbb A$ as its covariance matrix. Here is one way (out of many).
I take it as given that you can construct independent random variables $A$ and $B$ having unit variances: that is, $\operatorname{Var}(A)=\operatorname{Var}(B) = 1.$ (For example, let $(A,B)$ take on the four values $(\pm 1, \pm 1)$ with equal probabilities of $1/4$ each.)
The independence implies $\operatorname{Cov}(A,B)=0.$ Given a number $c$ in the range $-(11)(9)$ to $(11)(9),$ define random variables
$$X = \sqrt{11^2 - c^2/9^2}\,A + (c/9)B,\quad Y = 9B$$
(which is possible because $11^2 - c^2/9^2\ge 0$) and compute that the covariance matrix of $(X,Y)$ is precisely $\mathbb A.$
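A simulation makes the construction concrete. This sketch uses the $\pm 1$-valued $A$ and $B$ described above; the choice $c = 50$ and the sample size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
c = 50.0

# Independent, mean-zero, unit-variance A and B (values +/-1 with probability 1/2)
A_ = rng.choice([-1.0, 1.0], size=n)
B_ = rng.choice([-1.0, 1.0], size=n)

# The construction from the answer (valid because 11^2 - c^2/9^2 >= 0)
X = np.sqrt(11**2 - (c / 9) ** 2) * A_ + (c / 9) * B_
Y = 9 * B_

print(np.cov(X, Y))  # approximately [[121, 50], [50, 81]], up to sampling error
```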
Finally, if you carry out the same analysis for any symmetric matrix $$\mathbb A = \pmatrix{a & b \\ b & d},$$ you will conclude three things:

$a \ge 0.$

$d \ge 0.$

$ad - b^2 \ge 0.$
These conditions characterize symmetric, positive semidefinite matrices. Any $2\times 2$ matrix satisfying these conditions indeed is a variance matrix. (Emulate the preceding construction.)
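These three elementary conditions can be cross-checked against the spectral definition of positive semidefiniteness (no negative eigenvalues). A sketch in NumPy; the number of trials, the sampling range, and the tolerance are arbitrary choices:

```python
import numpy as np

def is_valid_covariance(a, b, d):
    """The three elementary conditions for the symmetric matrix [[a, b], [b, d]]."""
    return a >= 0 and d >= 0 and a * d - b * b >= 0

rng = np.random.default_rng(2)
for _ in range(10_000):
    a, b, d = rng.uniform(-5, 5, size=3)
    # Spectral criterion: smallest eigenvalue not below zero (small rounding tolerance)
    psd = np.linalg.eigvalsh(np.array([[a, b], [b, d]])).min() >= -1e-12
    assert psd == is_valid_covariance(a, b, d)
print("the elementary conditions agree with the eigenvalue test")
```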
Attribution
Source: Link, Question Author: user1887919, Answer Author: Max