Learning Some Logarithms
A colleague asked how many digits of \(\pi\) I can recite from memory. I realised I still remember the 27.3 digits I memorised in my childhood. (Wait, what does it mean to remember 0.3 digits? I know the digit after the last one is in the range 0–4, which is one bit of information, whereas a full digit would be just over three bits; hence roughly 1/3, or 0.3, of a digit. And why do I know the digit after the last one is in the range 0–4? I deliberately picked a cut-off point that didn’t force me to round the last digit up, so that if I wanted to expand and learn even more digits later I wouldn’t have to re-learn the last, rounded digit.) I was slightly embarrassed to admit this, because of how useless it is to know more than, say, four significant figures of \(\pi\).
There are many more useful values to memorise; some logarithms, for example, would be neat.
Base ten logarithms
I have picked up spaced repetition again, so I now have a way to learn some logarithms basically for free, as well as how to use them for mental maths. The values below are good to know and enough to get started (the rest can be roughly interpolated).
- \(\;\;\;\; \log{1} = 0\)
- \(\;\;\;\; \log{2} \approx 0.3\)
- \(\;\;\;\; \log{3} \approx 0.5\)
- \(\;\;\;\; \log{5} \approx 0.7\)
- \(\;\;\;\; \log{8} \approx 0.9\)
- \(\;\;\;\; \log{10} = 1\)
A few weeks after I memorised that batch, I added a second batch to fill out some of the gaps. These are probably overkill but convenient once the memories are formed.
- \(\;\;\;\; (\log{1.25} \approx 0.1)\)
- \(\;\;\;\; (\log{1.6} \approx 0.2)\)
- \(\;\;\;\; (\log{2.5} \approx 0.4)\)
- \(\;\;\;\; (\log{4} \approx 0.6)\)
- \(\;\;\;\; (\log{6} \approx 0.8)\)
- \(\;\;\;\; (\log{7} \approx 0.85)\)
- \(\;\;\;\; (\log{9} \approx 0.95)\)
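None of this is needed for the mental arithmetic, but if you want to see how much slack those rounded values carry, a throwaway Python snippet does it (the table below is just the two batches above transcribed):

```python
import math

# The memorised approximations from the two batches above.
approx = {
    1: 0, 1.25: 0.1, 1.6: 0.2, 2: 0.3, 2.5: 0.4, 3: 0.5, 4: 0.6,
    5: 0.7, 6: 0.8, 7: 0.85, 8: 0.9, 9: 0.95, 10: 1,
}

for x, a in approx.items():
    exact = math.log10(x)
    print(f"log {x:>4} ≈ {a:<4}  (exact {exact:.3f}, off by {abs(a - exact):.3f})")
```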
Base ten logarithms are particularly convenient because we write numbers in base ten, so we only need to know the logarithms of 1–10 to compute the logarithm of any other number.
For larger numbers, we can approximate using the rule that turns multiplication into addition, \(\log{(ab)} = \log{a} + \log{b}\). (This rule was in fact why Napier invented logarithms in the first place: multiplication problems are hairy, but addition is easier.) For example:
\[\log{50} = \log{(5 \times 10)} = \log{5} + \log{10} \approx 0.7 + 1 = 1.7\]
(Incorrect by 0.06 %.)
We can always just round to the nearest number with a bunch of zeroes on it:
\[\log{87234} \approx \log{(8.7 \times 10^4)} = \log{8.7} + \log{10^4} \approx 0.93 + 4 = 4.93\]
(Incorrect by 0.2 %.)
To approximate \(\log{8.7}\), we see that 8.7 is about a third of the way from 8 to 10, and since smooth functions look linear when you zoom in close enough, we just go a third of the way from \(\log{8}\) to \(\log{10}\), both of which we know!
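Written out, that interpolation step is

\[\log{8.7} \approx \log{8} + \frac{1}{3}\left(\log{10} - \log{8}\right) \approx 0.9 + \frac{1}{3} \times 0.1 \approx 0.93\]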
Smaller numbers work on the same principle:
\[\log{0.055} = \log{(5.5 \times 10^{-2})} = \log{5.5} + \log{10^{-2}} \approx 0.73 - 2 = -1.27\]
(Incorrect by 0.8 %.)
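The whole procedure (normalise to scientific notation, interpolate the mantissa between memorised anchors, then add the exponent back on) is mechanical enough to sketch in a few lines of Python. The names approx_log10 and ANCHORS below are arbitrary choices; this is only an illustration of the method:

```python
import math

# Anchor points: the memorised values from the tables above.
ANCHORS = [
    (1, 0.0), (1.25, 0.1), (1.6, 0.2), (2, 0.3), (2.5, 0.4), (3, 0.5),
    (4, 0.6), (5, 0.7), (6, 0.8), (7, 0.85), (8, 0.9), (9, 0.95), (10, 1.0),
]

def approx_log10(x: float) -> float:
    """Estimate log10(x) from the memorised anchors, interpolating linearly."""
    # Write x as m * 10^e with 1 <= m <= 10, like the 8.7 * 10^4 step above.
    # (In your head you would just count digits; here we cheat with math.)
    e = math.floor(math.log10(x))
    m = min(max(x / 10 ** e, 1.0), 10.0)
    # Find the two memorised values that bracket the mantissa ...
    for (lo, log_lo), (hi, log_hi) in zip(ANCHORS, ANCHORS[1:]):
        if m <= hi:
            # ... and pretend the logarithm is a straight line between them.
            t = (m - lo) / (hi - lo)
            return e + log_lo + t * (log_hi - log_lo)
    return e + 1.0  # unreachable thanks to the clamp above

for x in (50, 87234, 0.055):
    print(f"log {x} ≈ {approx_log10(x):.2f}  (exact {math.log10(x):.4f})")
```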
Multiplication
If we need to multiply two numbers, say \(365 \times 48\), we can instead estimate the equivalent
\[10^{\log 365 + \log 48}\]
where the relevant logs turn it into something like
\[10^{2.55 + 1.68} = 10^{4.23} \approx 17000\]
(Incorrect by 3 %.)
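To see where that 3 % comes from, we can compare the identity with exact logs against the version with the rounded logs 2.55 and 1.68 (a quick sanity check, nothing more):

```python
import math

# With exact logs the identity is exact (up to floating point): 17520.0
print(10 ** (math.log10(365) + math.log10(48)))

# With the mentally estimated logs 2.55 and 1.68: about 16982, i.e. "17000"
print(10 ** (2.55 + 1.68))
```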
Natural logarithms
If we add another two numbers to our memory we can do even more useful things:
- \(\;\;\;\; \log{e} \approx 0.43\)
- \(\;\;\;\; \frac{1}{\log{e}} \approx 2.3\)
Now we can convert base-ten logarithms to natural logarithms simply by multiplying by 2.3, since \(\ln{x} = \log{x}/\log{e}\). Quick example:
\[\ln{57} \approx 2.3 \times \log{57} \approx 2.3 \times 1.77 \approx 4.1\]
(Incorrect by 1 %.)
One neat property of the natural logarithm is that for small \(x\), we can estimate \(\ln{(1 + x)} \approx x\). So \(\ln{1.03} \approx 0.03\). (This sort of conversion is convenient when discussing growth rates of lognormal random walks and log-odds differences of small effects.)
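Both of these natural-log facts are easy to spot-check with another throwaway snippet:

```python
import math

# ln x ≈ 2.3 * log10 x
print(math.log(57), 2.3 * math.log10(57))   # 4.043 vs 4.039

# ln(1 + x) ≈ x for small x
for x in (0.01, 0.03, 0.1):
    print(x, math.log(1 + x))               # 0.00995, 0.02956, 0.09531
```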
Powers
Raising one number to the power of another comes with its own handy rule, namely that
\[a^b = 10^{b \log{a}}\]
Since we have \(\log{e}\) from before we can estimate
\[e^3 = 10^{3 \log{e}} \approx 10^{3 \times 0.43} \approx 10^{1.3} \approx 20\]
(Incorrect by 0.4 %.)
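The same estimate in code, skipping the mental rounding of the exponent from 1.29 up to 1.3 (so the number comes out slightly lower than above):

```python
import math

# e^3 = 10^(3 * log10 e), with the memorised log10(e) ≈ 0.43
print(10 ** (3 * 0.43))   # ≈ 19.5; rounding the exponent up to 1.3 gives ≈ 20
print(math.e ** 3)        # ≈ 20.09
```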
Square roots
Square roots are, of course, just powers under the hood.
\[\sqrt{512} = 512^{0.5} = 10^{0.5 \times \log{512}} \approx 10^{0.5 \times 2.7} \approx 10^{1.3} \approx 20\]
(Incorrect by 13 %. The main imprecision comes from sloppily rounding \(2.7 / 2 = 1.35\) down to 1.3; if we compute with 1.35 instead, we land a quarter of the way between 20 and 30, i.e. 23, which is off by only 2 %.)
What’s neat is you can use the same technique for third roots, which are otherwise much trickier to guess!
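For example, reusing 512 from above, which is a convenient test case because its cube root is exactly 8:

\[\sqrt[3]{512} = 512^{1/3} = 10^{\log{512}/3} \approx 10^{2.7/3} = 10^{0.9} \approx 8\]

(And \(10^{0.9} \approx 8\) can be read straight off the memorised table, since \(\log{8} \approx 0.9\).)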
Next steps
I’m not fast at doing this in my head yet, but I can at least do it without a calculator, which is more than I was able to before!
I would like to write a script that lets me practise this with random numbers to get a better feel for the distribution of logarithms. I have a vague suspicion that having a good feel for logarithms could be one of those secret superpowers that nobody knows is possible because people don’t bother doing it.
Imagine for example that someone asks what the arithmetic half of 80 is – you know immediately that it’s 40. What if someone asked you about the geometric half of 256? It’s 16, because \(\log{256} \approx 2.4\), half of which is \(1.2\), and \(10^{1.2}\) is 16.
You might guess already where I’m going with this: \(16^2 = 256\), so “geometric half” is just a fancy way of saying square root. Geometric third is the third root. Geometric double is the square, and so on. Imagine you had as good a sense of powers of a number as you do for fractions and multiples! That sounds, at least to me, like it has to be useful.
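As for that practice script, a minimal sketch could be as simple as the following; the number range and the 0.05 tolerance are arbitrary choices, not anything settled:

```python
import math
import random

# Drill: guess log10 of a random number, then see how far off you were.
while True:
    x = round(random.uniform(1, 100_000), 1)
    answer = input(f"log {x} ≈ ?  (blank to quit) ").strip()
    if not answer:
        break
    guess = float(answer)
    exact = math.log10(x)
    verdict = "close enough" if abs(guess - exact) < 0.05 else "off"
    print(f"exact: {exact:.3f}  ({verdict})")
```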