Standard deviation indicates the average amount by which the values in a data set differ from their mean. It is used when we want to know how uncertain our values are: in other words, if we look at one value, how sure can we be that it gives us a good indication of reality?
If the standard deviation is low, our values lie closer to the mean and we can say that we are more certain of our data. In a data set with a low standard deviation, any one value is likely to represent the overall trend more closely than in a data set with a high standard deviation.
However, a low standard deviation doesn't guarantee that the value we have chosen is representative; it only indicates that it is likely to be. We could have all of our values very close to the mean and one value way off the scale. This could still give a lower standard deviation than a data set whose values all lie a long way from the mean. But if we chose the wayward value from the first data set as our indication of reality, we'd be wrong.
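The caveat above can be checked with a small numerical sketch. The two data sets below are made-up numbers chosen purely for illustration (they are not from the source): one set clusters tightly around its mean except for a single wayward value, the other spreads every value far from its mean, yet the first still has the lower standard deviation.

```python
from statistics import pstdev  # population standard deviation

# Nine values sitting right next to each other, plus one wayward outlier.
clustered_with_outlier = [10] * 9 + [50]

# Every value a long way (13 units) from the mean of 13.
uniformly_spread = [0, 26] * 5

sd_outlier = pstdev(clustered_with_outlier)  # 12.0
sd_spread = pstdev(uniformly_spread)         # 13.0

# The clustered set has the LOWER standard deviation, yet picking its
# outlier (50) as an indication of reality would be badly wrong.
print(sd_outlier, sd_spread)
```

Here `pstdev` treats each list as a whole population; `stdev` (the sample version) would change the numbers slightly but not the comparison.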
Standard deviation is therefore an important indication of the uncertainty of our data and should usually be reported alongside any set of numerical results, such as lexicostatistical calculations.