"Normally distributed IQ with mean 100 IQ have a standard deviation of 15. Up to 15 IQ points may be due to random variations." (underline mine)
That's an interesting generalization, Nightmare, but it's not commensurate with either the definition or the application of "standard deviation." All one can say is that roughly 34.1% of the observations fall between the mean and one standard deviation above it, another 34.1% fall between the mean and one standard deviation below it, and therefore about 68.2% of the observations fall within one standard deviation of the mean.
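To make that 68.2% figure concrete, here is a minimal sketch in Python. It simply evaluates the normal CDF with the mean of 100 and SD of 15 quoted above; the use of SciPy is my own choice, not anything from the original post:

```python
# Minimal sketch: fraction of a Normal(100, 15) distribution lying within
# one standard deviation of the mean, via SciPy's normal CDF.
from scipy.stats import norm

mean, sd = 100, 15  # illustrative IQ parameters from the quoted claim

# P(mean - sd < X < mean + sd) for X ~ Normal(mean, sd)
within_one_sd = norm.cdf(mean + sd, loc=mean, scale=sd) - norm.cdf(mean - sd, loc=mean, scale=sd)

print(f"Within one SD of the mean: {within_one_sd:.3f}")      # ~0.683
print(f"Between mean and mean + 1 SD: {within_one_sd / 2:.3f}")  # ~0.341
```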
What one cannot say is that values within one standard deviation "may be due to random variations." That may have been given to you at some point as a rule of thumb, but it's a poor one, because it ignores both the sample size and the confidence interval (C.I.), each of which plays a mandatory role in assessing statistical significance, i.e. in separating what may be due to random variation from what reflects a genuine difference relative to the data set.
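To show why sample size matters, here is a hedged sketch of a two-sided 95% confidence interval for a mean IQ at a few sample sizes. The values of n are made up purely for illustration, and the population SD of 15 is treated as known:

```python
# Minimal sketch: width of a 95% confidence interval for a mean IQ
# as a function of (hypothetical) sample size n.
from math import sqrt
from scipy.stats import norm

sd = 15              # population SD assumed known, as in the quoted claim
z = norm.ppf(0.975)  # ~1.96 for a two-sided 95% interval

for n in (10, 100, 1000):  # illustrative sample sizes only
    margin = z * sd / sqrt(n)
    print(f"n={n:4d}: mean ± {margin:.2f} IQ points")
```

With ten observations the margin of error is about ±9 IQ points; with a thousand it is under ±1. A blanket "up to 15 IQ points may be random" rule ignores exactly this dependence.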
Standard deviation alone will not give you that answer. For that, one must resort to hypothesis testing.
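As one possible illustration, a one-sample z-test of whether a group's mean IQ differs from 100 might look like the sketch below. The sample values are entirely hypothetical and the population SD is assumed known; a t-test would be the usual choice when it is not:

```python
# Minimal sketch of a hypothesis test: is a sample's mean IQ consistent
# with a population mean of 100 (SD 15)? Sample values are made up.
from math import sqrt
from scipy.stats import norm

pop_mean, pop_sd = 100, 15
sample = [112, 98, 105, 120, 101, 95, 108, 115, 99, 104]  # hypothetical data
n = len(sample)
sample_mean = sum(sample) / n

# One-sample z-test statistic (population SD treated as known)
z = (sample_mean - pop_mean) / (pop_sd / sqrt(n))
p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value

print(f"sample mean = {sample_mean:.1f}, z = {z:.2f}, p = {p_value:.3f}")
```

Only with a test like this, which accounts for both the spread and the sample size, can one say whether an observed difference is plausibly due to random variation.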