Let's write a function to calculate the variance of a collection of numeric data points. The variance is a measure of dispersion: if the data points are spread far apart, their variance is high; if they are close together, their variance is low.
The variance is defined as the mean of the squared differences between each data point and the mean of the data points. For clarity, the calculation should follow the logic below:

1. Calculate the mean of the data points.
2. Subtract the mean from each data point.
3. Square each of these differences.
4. Calculate the mean of the squared differences.
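Written as a formula (standard notation, not part of the original exercise), for data points $x_1, \dots, x_n$:

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \mathrm{Var}(x) = \frac{1}{n}\sum_{i=1}^{n}\bigl(x_i - \bar{x}\bigr)^2$$

This is the population form of the variance (dividing by $n$, matching the definition above); the unbiased sample variance divides by $n - 1$ instead.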
Write a function called variance that computes the variance of a sample of data points. The function must take a single input, a list of numeric data points, and must return the variance of those data points as its output.
Note: To avoid duplication, you can also define a helper function that calculates the mean (used in steps 1 and 4).
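A minimal sketch of one way to solve this in Python (the exercise does not name a language, so Python is an assumption here), using the mean helper suggested in the note:

```python
def mean(data):
    """Return the arithmetic mean of a list of numbers (steps 1 and 4)."""
    return sum(data) / len(data)


def variance(data):
    """Return the variance: the mean of squared differences from the mean."""
    m = mean(data)                                # step 1: mean of the data
    squared_diffs = [(x - m) ** 2 for x in data]  # steps 2 and 3
    return mean(squared_diffs)                    # step 4: mean of the squares
```

For example, `variance([1, 2, 3, 4, 5])` returns `2.0`: the mean is 3, and the squared differences 4, 1, 0, 1, 4 average to 2.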