This post contains some simple code for calculating the square root of 2 and then generating the bits of its binary representation.

It’s a companion to a post on my other blog.

I may come back later and document this better, maybe post some more detailed code to go with it, so consider this a Published Draft post for now. Updates to follow (probably).

The whole point was finding the smallest possible program to generate 16,384 bits of the square root of 2. (All because Max Tegmark wrote a book in which he claimed it could be done in 100 bits. Not bytes; bits.)

This code has a few extras (print statements and timing instrumentation), so it’s still not the smallest possible package…

from time import perf_counter
def sqrt2(depth=12):
    # Evaluate the continued fraction sqrt(2) = 1 + 1/(2 + 1/(2 + ...))
    # as an exact integer fraction (numerator, denominator).
    def tail(level):
        # The repeating tail x = 2 + 1/x, recursed to the requested depth.
        if depth < level:
            return (1, 2)                 # seed value; any start converges
        t = tail(1 + level)
        return ((2 * t[0]) + t[1], t[0])  # x' = 2 + 1/x = (2p + q)/p
    t = tail(0)
    return (t[0] + t[1], t[0])            # sqrt(2) = 1 + 1/x = (p + q)/p

def divider(a, b, prec=12, base=2):
    # Long division by repeated subtraction: returns the digits of a/b
    # in the given base, with the integer part in quo[0] and one
    # fractional digit per subsequent entry.
    if b == 0:
        raise ValueError('/0!')
    quo = [0]
    while a:
        if b <= a:
            quo[-1] += 1          # another b fits into the current remainder
            a -= b
        else:
            if prec < len(quo):   # stop once we have enough digits
                break
            quo.append(0)         # shift to the next digit position
            a *= base
    return quo

t0 = perf_counter()
p,q = sqrt2(depth=500)
print(p); print()
print(q); print()
print(p/q); print()

t1 = perf_counter()
bits = divider(p, q, prec=2000)
print(bits[0:75]); print()

t2 = perf_counter()
print('sqrt2:', t1 - t0, 'divider:', t2 - t1)

The first function, sqrt2, calculates the recursive fraction for the square root of 2; the second function, divider, implements a long-division algorithm that generates a binary representation of that fraction.
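As a quick sanity check (my addition, not from the original post), divider can be exercised on fractions whose binary expansions are known. This sketch assumes the precision test belongs in the branch that starts a new digit:

```python
def divider(a, b, prec=12, base=2):
    # Long division by repeated subtraction; quo[0] is the integer part,
    # the remaining entries are fractional digits in the given base.
    if b == 0:
        raise ValueError('/0!')
    quo = [0]
    while a:
        if b <= a:
            quo[-1] += 1
            a -= b
        else:
            if prec < len(quo):
                break
            quo.append(0)
            a *= base
    return quo

# 3/2 is 1.1 in binary; 5/8 is 0.101 in binary.
print(divider(3, 2))  # [1, 1]
print(divider(5, 8))  # [0, 1, 0, 1]
```

Note that the returned list mixes the integer part and the fractional digits: for a value between 1 and 2, like the square root of 2, the first entry is always 1 and everything after it is the binary fraction.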

Here’s the recursive fraction we’re implementing, the standard continued-fraction expansion of the square root of 2:

    sqrt(2) = 1 + 1/(2 + 1/(2 + 1/(2 + ...)))

The accuracy of the fraction depends on how deeply we recurse. The number of bits we get depends on the precision requested from the division.
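To see that convergence in action, here is a small sketch (my addition, not from the post) comparing the fraction at a few depths against the float value of the square root of 2. Each extra recursion level adds roughly a constant number of correct bits, which is why depth=500 is more than enough for 2,000 binary digits:

```python
from math import sqrt

def sqrt2(depth=12):
    # Continued fraction sqrt(2) = 1 + 1/(2 + 1/(2 + ...)),
    # returned as an exact (numerator, denominator) pair.
    def tail(level):
        if depth < level:
            return (1, 2)
        t = tail(1 + level)
        return ((2 * t[0]) + t[1], t[0])
    t = tail(0)
    return (t[0] + t[1], t[0])

# Error shrinks rapidly as depth grows; by depth 20 it is already
# below double-precision resolution.
for depth in (5, 10, 20):
    p, q = sqrt2(depth)
    print(depth, abs(p / q - sqrt(2)))
```

The float comparison only measures accuracy down to about 15–16 decimal digits, of course; past that you have to compare the exact fractions (or their bit streams) directly.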