The earlier post contained an interesting remark. Is it true more generally that

\sqrt{\pi n} \approx \frac{4^n \, (n!)^2}{(2n)!}

for large n? Sort of, but the approximation gets better if we add a correction factor.
If we square both sides of the approximation and move the factorials to one side, the question becomes whether

\left( \frac{(2n)!}{(n!)^2} \right)^2 \approx \frac{16^n}{\pi n}.
Now the task becomes to estimate the middle coefficient when we apply the binomial theorem to (x + y)^{2n}.
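Concretely, the quantity being squared above is that middle coefficient:

(x + y)^{2n} = \sum_{k=0}^{2n} \binom{2n}{k} x^k y^{2n-k}, \qquad \binom{2n}{n} = \frac{(2n)!}{(n!)^2}.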
A cleaner way to state the question is as an approximation for the middle binomial coefficient:

\binom{2n}{n} \approx \frac{4^n}{\sqrt{\pi n}}.

The right-hand side is the first term of an asymptotic series for the left: the ratio of the two sides goes to 1 as n → ∞.
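Here is a quick numerical sketch of that convergence, using the same scipy and numpy functions as the code later in the post; the particular values of n are arbitrary.

# Watch the ratio of the middle binomial coefficient to 4^n / sqrt(pi*n)
# approach 1 as n grows.
from scipy.special import binom
from numpy import pi, sqrt

for n in [10, 50, 100, 500]:
    approx = 4.0**n / sqrt(pi*n)
    print(n, binom(2*n, n) / approx)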
We could prove this asymptotic result using Stirling's approximation, but it's more fun to use a probability argument.
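For the record, here is a sketch of the Stirling route, using just the leading term of Stirling's formula, n! ≈ √(2πn) (n/e)^n:

\binom{2n}{n} = \frac{(2n)!}{(n!)^2} \approx \frac{\sqrt{4\pi n}\,(2n/e)^{2n}}{2\pi n\,(n/e)^{2n}} = \frac{4^n}{\sqrt{\pi n}}.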
Let X be a binomial random variable with distribution B(2n, 1/2). As n grows, X converges in distribution to a normal random variable with the same mean and variance, i.e. with μ = n and σ² = n/2. Since the normal density at its mean is 1/√(2πσ²), this says that for large n,

P(X = n) = \binom{2n}{n}\, 2^{-2n} \approx \frac{1}{\sqrt{2\pi\sigma^2}} = \frac{1}{\sqrt{\pi n}},

which is the approximation above after multiplying both sides by 4^n.
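As a quick sanity check on this normal approximation (a sketch using scipy.stats, not part of the original argument), compare the exact probability P(X = n) with the normal density at the mean:

# P(X = n) for X ~ B(2n, 1/2) versus the normal density at its mean,
# 1/sqrt(2*pi*sigma^2) = 1/sqrt(pi*n). The ratio should approach 1.
from numpy import pi, sqrt
from scipy.stats import binom as binom_rv

for n in [10, 100, 1000]:
    p_exact = binom_rv.pmf(n, 2*n, 0.5)
    p_normal = 1/sqrt(pi*n)
    print(n, p_exact/p_normal)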
The argument above only gives the first term of the asymptotic series for the middle coefficient. If you want more terms in the series, you need to use more terms of Stirling's series. Adding a couple more terms gives

\binom{2n}{n} \approx \frac{4^n}{\sqrt{\pi n}} \left( 1 - \frac{1}{8n} + \frac{1}{128 n^2} + \cdots \right).
Let's see how much accuracy we get when estimating 52 choose 26.
from scipy.special import binom
from numpy import pi, sqrt

n = 26

# Exact value of the middle binomial coefficient, 52 choose 26
exact = binom(2*n, n)

# Successive truncations of the asymptotic series above
approx1 = 4**n/sqrt(pi*n)
approx2 = approx1*(1 - 1/(8*n))
approx3 = approx1*(1 - 1/(8*n) + 1/(128*n**2))

for a in [approx1, approx2, approx3]:
    print(exact/a)
This prints
0.9952041409266293
1.0000118903997048
1.0000002776131290
and so we see substantial improvement from each additional term. That is not always the case with asymptotic series. We are guaranteed that for a fixed number of terms, the relative error goes to zero as n increases. But for a fixed n, we do not necessarily get more accuracy by including more terms.
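To illustrate the first of those claims, here is a small sketch reusing the approximations above: with the number of terms held fixed, the relative error shrinks as n grows. The particular values of n are arbitrary.

# Relative error of one-term and two-term truncations of the series
# for binom(2n, n); with the number of terms fixed, the error
# decreases as n increases.
from scipy.special import binom
from numpy import pi, sqrt

for n in [10, 50, 250]:
    exact = binom(2*n, n)
    one_term = 4.0**n/sqrt(pi*n)
    two_terms = one_term*(1 - 1/(8*n))
    print(n, abs(exact - one_term)/exact, abs(exact - two_terms)/exact)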
