Standard deviation is simply a measure of the variability of your data. As you know, the mean (or average) is a measure of the central tendency of your data. The standard deviation gives you a way to tell how far the data points are likely to fall from the mean of your data set.
For example, we might have two data sets that we assume to be normally distributed. Both have a mean of 10, but one has an SD of 1 and the other has an SD of 2. If we were to graph the occurrences of data points in the two data sets, the first curve would be narrower than the second, because with the smaller SD the data points are more likely to occur close to the mean.
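A quick sketch of this in Python (the sample size and random seed are made up for illustration): we draw two large samples with mean 10 and SDs of 1 and 2, then check what fraction of each sample lands within 1 unit of the mean.

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

# Two hypothetical data sets: both mean 10, but SD 1 vs SD 2.
narrow = [random.gauss(10, 1) for _ in range(100_000)]
wide = [random.gauss(10, 2) for _ in range(100_000)]

for name, data in [("SD=1", narrow), ("SD=2", wide)]:
    within_1 = sum(abs(x - 10) <= 1 for x in data) / len(data)
    print(f"{name}: mean≈{statistics.mean(data):.2f}, "
          f"sd≈{statistics.stdev(data):.2f}, "
          f"fraction within 1 of the mean: {within_1:.2f}")
```

For the SD=1 sample, roughly 68% of the points fall within 1 unit of the mean; for the SD=2 sample, only about 38% do, which is why its curve looks wider and flatter.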
The standard deviation helps you to understand your data. Its business use can best be understood by an example.
Suppose that you have a production line making widgets. After analyzing and recording the data from your production line, you determine that your average number of defects for a production day of 10,000 units is 5 (from, say, a year's worth of data). You also calculate the standard deviation and determine that it is 1. You would probably assume that your distribution is normal. If you get 8 or more defects, you will begin to suspect that something has changed in your production line, because the probability of getting 8 or more defects in a day is less than 1%. See Normal Probability Distributions.
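The calculation behind that last claim can be sketched in Python. This assumes the daily defect count is well approximated by a normal distribution with mean 5 and SD 1 (the numbers from the example above); 8 defects is then 3 standard deviations above the mean.

```python
import math

mean_defects = 5  # long-run daily average from the example
sd_defects = 1    # estimated standard deviation

def normal_cdf(x, mu, sigma):
    """P(X <= x) for a normal distribution with mean mu and SD sigma."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Probability of seeing 8 or more defects on a given day.
p_8_or_more = 1 - normal_cdf(8, mean_defects, sd_defects)
print(f"P(defects >= 8) ≈ {p_8_or_more:.4%}")  # about 0.13%, well under 1%
```

So a day with 8 defects would be a roughly 1-in-740 event under the old process, which is strong evidence that something in the line has changed.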