Fractals And Number Theory: Part 2

My research project is also within the subject of pure math, and I am investigating a type of number known as "normal numbers". Normality is a property that we would expect a 'random' number, such as pi or e, to have. Many of us have heard that these numbers contain every possible combination of digits, but that is a property they are only guaranteed to have if they are 'normal'. These numbers are widely suspected to be normal, but it has yet to be proven.

The mathematical definition of normality, in base 10, is that each digit should appear 1 out of 10 times in a number's infinite decimal expansion. That is, the probability of finding a 2 should be 1 out of 10, and the probability of finding the sequence "1234" should be 1 out of 10^4, since it is 4 digits long. We call the numbers we are used to base 10 because there are 10 different digits, from 0 to 9. Base 4, for example, would use the digits from 0 to 3, since there are 4 different digits. A more general version of normality says that, for a number in base β, each string of k digits should occur with frequency (1/β)^k. While this may seem intimidating, it is just a broader statement of the same idea. For example, in base 2, where the only digits are 0 and 1, we would expect each 1 to appear half of the time, each string "00" to appear a quarter of the time, and so on.
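To make the frequency claim concrete, here is a small Python sketch. It does not use a real normal number; the pseudo-random digit string simply stands in for the expansion of a hypothetical 'random' number, and the variable names are illustrative:

```python
import random
from collections import Counter

random.seed(0)

# Simulate the first N digits of a "random" number in base 10.
N = 100_000
digits = "".join(random.choice("0123456789") for _ in range(N))

# Single-digit frequencies should be close to 1/10.
single = Counter(digits)
for d in "0123456789":
    print(d, single[d] / N)

# Frequencies of length-2 strings should be close to (1/10)^2 = 0.01.
pairs = Counter(digits[i:i + 2] for i in range(N - 1))
print("'12' frequency:", pairs["12"] / (N - 1))
```

Running this, each single-digit ratio lands near 0.1 and each two-digit ratio near 0.01, which is exactly the pattern the definition demands of a normal number's expansion.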

An idea that may seem unfamiliar is that of an infinite decimal. Most of the numbers we know, such as 7 or 12, seem finite. In reality, every number has an infinite decimal expansion, such as 7.00000… and so on. An interesting note is that every real number has a unique infinite decimal expansion, except for numbers whose expansion terminates. For example, 1 is equal to both 1.000… and 0.999… For our purposes, however, none of these numbers are normal, since it has been proved that no rational number is normal. This is because their infinite decimals are all eventually repeating, so they can never behave randomly.

One of the first numbers proved to be normal, about 25 years after the concept of normality was introduced, is the Champernowne number 0.12345678910111213…, formed by writing out the positive integers one after another. Normality is a property expected of random numbers; however, it is clear that although this number is normal, its digits are not random at all. This motivated a new, stronger definition: strong normality. For a number to be strongly normal, not only must it be normal, but the difference between the observed count of each string and the expected proportion (1/β)^k must oscillate in the way expected of genuinely random digits. No strongly normal number has yet been discovered, and this is what my research mainly focuses on.
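The Champernowne construction is simple enough to sketch in a few lines of Python (the helper name below is my own, not standard). Tallying its first digits shows both sides of the story: the ratios do creep toward 1/10, but the prefix is visibly non-random, with the digit 1 over-represented:

```python
from collections import Counter
from itertools import count

def champernowne_digits(n):
    """Return the first n digits after the decimal point of
    0.12345678910111213..., built by concatenating 1, 2, 3, ..."""
    parts, total = [], 0
    for k in count(1):
        s = str(k)
        parts.append(s)
        total += len(s)
        if total >= n:
            break
    return "".join(parts)[:n]

N = 100_000
digits = champernowne_digits(N)
freq = Counter(digits)
for d in "0123456789":
    # Each ratio tends to 1/10 as N grows, but slowly: at this prefix
    # the digit 1 is noticeably over-represented because so many of
    # the concatenated integers begin with 1.
    print(d, freq[d] / N)
```

That lingering bias in a provably normal number is one way to see why a finer notion such as strong normality is needed.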

Image Source: Wikimedia

Tyler Kastner

Originally published in Bandersnatch Vol. 47 Issue 10 on March 14, 2018