Data Glossary

Coefficient of Variation
Coefficient of variation (CV) is a measure that statisticians use to compare the variability of an estimate to the size of the estimate itself. It is calculated from the standard error, typically by dividing the standard error by the estimate and expressing the result as a percentage.
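The calculation can be sketched in Python. This follows the common convention for survey estimates (standard error divided by the estimate, expressed as a percentage); the function name and numbers are illustrative, not part of any specific Forsyth Futures report:

```python
def coefficient_of_variation(estimate, standard_error):
    """Return the CV of an estimate as a percentage of the estimate."""
    return (standard_error / estimate) * 100

# An estimate of 12.0% with a standard error of 0.6 percentage points
# has a CV of 5%, suggesting a fairly precise estimate.
cv = coefficient_of_variation(12.0, 0.6)  # 5.0
```

Lower CV values indicate more reliable estimates; higher values signal that an estimate should be interpreted with caution.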
Confidence Intervals
A range in which analysts believe the true value of a measure in the community falls, specified by how certain analysts are that the true value lies within that range. Forsyth Futures uses a 95% confidence interval for most reports, meaning that analysts are 95% sure that the true value of the measure in the community falls within that range.
Major Urban Counties
When comparing to other communities, Forsyth Futures reports typically compare Forsyth County to Durham, Guilford, Mecklenburg, and Wake counties because they are the four other heavily populated counties with mid- to large-sized cities in North Carolina.
Margin of Error
The difference between the estimate of a measure and the upper or lower bound of its confidence interval is called the margin of error, which Forsyth Futures also reports at the 95% confidence level.
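Under a normal approximation, the 95% margin of error is roughly 1.96 times the standard error, and the confidence interval is the estimate plus or minus that margin. A minimal sketch (the function names and values are illustrative):

```python
def margin_of_error(standard_error, z=1.96):
    """Margin of error at the 95% confidence level (z = 1.96)."""
    return z * standard_error

def confidence_interval(estimate, standard_error, z=1.96):
    """Return the (lower, upper) bounds of the confidence interval."""
    moe = margin_of_error(standard_error, z)
    return (estimate - moe, estimate + moe)

# An estimate of 12.0 with a standard error of 0.6 has a margin of
# error of about 1.18, giving an interval of roughly (10.82, 13.18).
lower, upper = confidence_interval(12.0, 0.6)
```

This shows how the three terms relate: the margin of error is the half-width of the confidence interval around the estimate.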
Standard Error
Standard error is a measure of the variability of an estimate due to sampling, with higher values indicating more variability and lower values indicating less.
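For a simple random sample, the standard error of a mean can be estimated as the sample standard deviation divided by the square root of the sample size. A small illustration (a general textbook formula, not a description of Forsyth Futures' exact methods):

```python
import math
import statistics

def standard_error_of_mean(sample):
    """SE of the sample mean: sample standard deviation / sqrt(n)."""
    return statistics.stdev(sample) / math.sqrt(len(sample))

# Larger samples yield smaller standard errors for the same spread.
se = standard_error_of_mean([2, 4, 4, 4, 5, 5, 7, 9])  # ≈ 0.756
```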
Statistical Significance 
Statistically significant differences are differences that are likely not attributable to random chance.  Forsyth Futures describes measures as statistically different when analysts are at least 95% sure that the differences between two measures cannot be attributed to random chance alone. 
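One standard way to test whether two independent survey estimates differ significantly is to compare the z-statistic of their difference to the 95% critical value (1.96). This sketch assumes independent estimates and a normal approximation; it is an illustration of the general technique, not Forsyth Futures' published procedure:

```python
import math

def significantly_different(est_a, se_a, est_b, se_b, z_crit=1.96):
    """Check whether two independent estimates differ at the 95% level.

    The difference is statistically significant if the z-statistic
    (difference divided by the combined standard error) exceeds the
    critical value, 1.96 for 95% confidence.
    """
    z = abs(est_a - est_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    return z > z_crit

significantly_different(12.0, 0.6, 14.5, 0.7)  # True: z ≈ 2.71
significantly_different(12.0, 0.6, 12.5, 0.7)  # False: z ≈ 0.54
```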
Survey Data
Data in which a sample smaller than the population of interest is measured to make estimates about the larger population. 
Total Count Data
An actual count of every person or event being studied.
Trends
In reports, Forsyth Futures generally only calculates trends for indicators with at least four years of data. Data is considered to be trending upward or downward if later years of data are consistently higher or lower than earlier years at a statistically significant level.
Variance
Variance is a measure of how spread out the numbers in a dataset are. Higher variance means the data are more spread out than in datasets with lower variance, and a variance of zero indicates that all of the numbers in the dataset are exactly the same.
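The definition can be illustrated directly: variance is the average squared deviation from the mean (the population form is shown here; the values are illustrative):

```python
def variance(values):
    """Population variance: mean of squared deviations from the mean."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

variance([5, 5, 5, 5])  # 0.0 -> identical values, no spread
variance([1, 5, 9, 5])  # 8.0 -> same mean, but more spread out
```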