In everyday usage, "marginal" refers to something minor or peripheral, something at the edge of a larger system. The word tends to diminish the importance of whatever it describes.

So how does that apply to the probability of a subset of random variables?

Assuming that words are used because of their everyday meaning can be risky in mathematics, so I know there isn't necessarily an answer here. But sometimes the answer to this sort of question yields genuine insight, which is why I'm asking.

**Answer**

Consider the table below (copied from this website), which shows the joint probabilities of the outcomes from rolling two dice:

In this common and natural way of displaying the distribution, the marginal probabilities of each individual die's outcomes are written literally in the margins of the table (the highlighted row and column).

Of course we can't construct such a table for continuous random variables, but in any case I'd guess that this is the origin of the term.
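The "summing into the margin" operation the table illustrates can be sketched in a few lines; this is a minimal illustration of my own, not part of the original answer, and the variable names are mine:

```python
# Sketch: build the joint probability table for two fair dice and
# compute a marginal by summing over the other variable -- the same
# operation as totaling a row or column into the table's margin.
from fractions import Fraction

faces = range(1, 7)

# Joint distribution: all 36 (i, j) pairs are equally likely.
joint = {(i, j): Fraction(1, 36) for i in faces for j in faces}

# Marginal of the first die: for each face i, sum the joint
# probabilities over every possible value j of the second die.
marginal_first = {i: sum(joint[(i, j)] for j in faces) for i in faces}

print(marginal_first[3])  # 1/6, as expected for a fair die
```

Each marginal probability comes out to 1/6, which is exactly the row (or column) total you would write in the table's margin.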

**Attribution**
*Source: Link, Question Author: stephan, Answer Author: Jake Westfall*