Confidence Interval

In statistics, a confidence interval is a type of interval estimate, computed from observed data, that is likely to contain the true value of an unknown population parameter. It is associated with a confidence level, which quantifies how often intervals constructed by the same procedure would contain the parameter.

Confidence Interval Definition

The confidence level represents the long-run proportion (frequency) of confidence intervals that contain the true value of the unknown parameter. In other words, if intervals were computed at a given confidence level from an endless number of independent samples, the proportion of intervals containing the true parameter value would equal that confidence level. The confidence level is usually chosen before examining the data. The most commonly used confidence level is 95%; however, other confidence levels, such as 90% and 99%, are also used.

Confidence Interval Formula

The confidence interval is based on the mean and standard deviation. Thus, the formula to find CI is

X̄ ± Zα/2 × [ σ / √n ]


X̄ = Sample mean

Zα/2 = Critical value of the standard normal distribution

α = Significance level (α = 1 − confidence level)

σ = Standard deviation

n = Sample size

The value after the ± symbol is known as the margin of error.
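The formula above can be turned into a short Python sketch using only the standard library; the function name confidence_interval is ours, not part of any established API:

```python
import math
from statistics import NormalDist

def confidence_interval(mean, stdev, n, confidence=0.95):
    """Return (lower, upper) bounds of X̄ ± Z(α/2) × σ/√n."""
    # Z(α/2) is the standard normal quantile leaving α/2 in each tail,
    # i.e. the point where the CDF equals (1 + confidence) / 2.
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    margin = z * stdev / math.sqrt(n)
    return mean - margin, mean + margin

lower, upper = confidence_interval(86, 6.2, 46)
print(round(lower, 2), round(upper, 2))  # 84.21 87.79
```

The margin of error is the `margin` term: everything after the ± symbol in the formula.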

Confidence Interval Table

The Z values for common confidence levels are given as follows:

Confidence Level    Z Value
80%                 1.282
85%                 1.440
90%                 1.645
95%                 1.960
99%                 2.576
99.5%               2.807
99.9%               3.291
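The Z values in the table are not arbitrary: each is the standard normal quantile at (1 + confidence level)/2, which Python's standard library can compute directly:

```python
from statistics import NormalDist

for confidence in (0.80, 0.85, 0.90, 0.95, 0.99):
    # Quantile of the standard normal that leaves α/2 in each tail
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    print(f"{confidence:.0%}: {z:.3f}")
```

Running this reproduces the table entries (1.282, 1.440, 1.645, 1.960, 2.576).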

How to Calculate Confidence Interval?

To calculate the confidence interval, go through the following procedure.

Step 1: Find the number of observations n (sample size), the mean X̄, and the standard deviation σ.

Step 2: Choose the desired confidence level, commonly 95% or 99%. Then find the Z value for the corresponding confidence level from the table.

Step 3: Finally, substitute all the values in the formula.


Confidence Interval Example

Question: A tree bears hundreds of apples. You randomly choose 46 apples; their weights have a mean of 86 and a standard deviation of 6.2. Construct a confidence interval for the true mean weight.


Given: Mean, X̄ = 86

Standard deviation, σ = 6.2

Number of observations, n = 46

Take the confidence level as 95%. Therefore, the value of z = 1.960 (from the table)

The formula to find the confidence interval is

X̄ ± Zα/2 × [ σ / √n ]

Substituting the values into the formula, we get

86 ± 1.960 × [ 6.2 / √46 ]

86 ± 1.960 × [ 6.2 / 6.78 ]

86 ± 1.960 × 0.914

86 ± 1.79

Here, the margin of error is 1.79.

Therefore, we can say with 95% confidence that the true mean weight lies between 84.21 and 87.79.
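The arithmetic of this worked example can be checked with a short snippet (the variable names are ours):

```python
import math

mean, stdev, n, z = 86, 6.2, 46, 1.960

# Margin of error: Z(α/2) × σ / √n
margin = z * stdev / math.sqrt(n)
print(f"margin of error = {margin:.2f}")                     # 1.79
print(f"interval = ({mean - margin:.2f}, {mean + margin:.2f})")  # (84.21, 87.79)
```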