I have sometimes seen textbooks refer to the second parameter of the normal distribution as the standard deviation, and sometimes as the variance. For example, given the random variable X ~ N(0, 4), it’s not clear whether sigma or sigma squared equals 4. I just want to find out the general convention used when it isn’t specified whether the parameter is the standard deviation or the variance.
From what I’ve seen, when statisticians* are writing algebraic formulas, the most common convention is (by far) $N(\mu,\sigma^2)$, so $N(0,4)$ would imply the variance is $4$. However, the convention is not completely universal, so while I’d fairly confidently interpret the intent as “variance 4”, it’s hard to be completely sure without some additional indication (often, careful examination will yield some additional clue, such as an earlier or subsequent use by the same author).
Speaking for myself, I try to write an explicit square in there to reduce confusion. For example, rather than write $N(0,4)$, I would usually write $N(0,2^2)$, which makes it clearer that the variance is 4 and the sd is 2.
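To make the arithmetic concrete: under the variance convention, $N(0,4)$ and $N(0,2^2)$ are the same distribution, with sd $\sqrt{4}=2$. A minimal sketch using Python's standard-library `statistics.NormalDist` (chosen purely for illustration; note its second argument is the sd, not the variance):

```python
from statistics import NormalDist

# NormalDist(mu, sigma) takes the standard deviation as its second argument,
# so the distribution written as N(0, 2^2) in the text (variance 4) is
# constructed with sigma = 2.
dist = NormalDist(mu=0, sigma=2)
print(dist.stdev)     # 2.0
print(dist.variance)  # 4.0
```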
When calling functions in statistics packages (such as R’s
dnorm, for one example), the arguments are nearly always $(\mu, \sigma)$. (As user11852 points out, check the documentation. Of course, in the worst case – missing or ambiguous documentation, unhelpful argument names – a little experimentation would resolve any dilemma about which parameterization it uses.)
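The experimentation idea can be made precise: the normal density at the mean is $1/(\sigma\sqrt{2\pi})$, so evaluating a package's density function at $\mu$ tells you which parameter it actually takes. A hedged sketch using Python's standard-library `NormalDist` as a stand-in for whatever function you are unsure about:

```python
from math import sqrt, pi
from statistics import NormalDist

# The normal pdf at the mean equals 1/(sigma * sqrt(2*pi)). Compare the value
# the function returns against the two candidate interpretations of its
# second argument.
val = NormalDist(mu=0, sigma=2).pdf(0)
if_sd = 1 / (2 * sqrt(2 * pi))           # second argument is the sd (2)
if_var = 1 / (sqrt(2) * sqrt(2 * pi))    # second argument were the variance (2 = sd^2)
print(val)  # matches if_sd (~0.1995), confirming the argument is the sd
```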
* Here I mean people whose primary training is in statistics, rather than people who learned statistics for application to some other area; conventions can vary across application areas.