Addition Is Useless, Multiplication Is King: Channeling Our Inner Logarithm

TIME magazine has been running a series called “Brilliant: The science of smart” by Annie Murphy Paul. The latest column, “Why Guessing Is Undervalued,” quoted several results from research on learning estimation, a topic near to my heart. One result surprised me particularly:

…good estimators possess a clear mental number line — one in which numbers are evenly spaced, or linear, rather than a logarithmic one in which numbers crowd closer together as they get bigger. Most schoolchildren start out with the latter understanding, shedding it as they grow more experienced with numbers.


I do agree that children start out with a logarithmic understanding. I first learned this idea from a wonderful episode of WNYC’s Radio Lab on “Innate numbers” (Nov. 30, 2009). The producers had asked Stanislas Dehaene to discuss his research on innate number perception.

One of his studies involved an Indian tribe in the Amazon. This tribe does not have words for numbers beyond five, and does not have formal teaching of arithmetic. But the members have a sophisticated understanding of numbers. They were given the following problem: on a line with one object at one end, and nine objects at the other end, they were asked, “How many objects would you place directly in the middle of the line?”

What number would you choose?

Twenty years ago I would have chosen five, for five is the average of one and nine: It is larger than one by four, and smaller than nine also by four. Thus, it is halfway on a linear (or additive) number line. Almost all Western students also choose five. However, the members of the Indian tribe chose three. That choice also makes sense, but in a different way: Three is larger than one by a factor of 3, and smaller than nine also by a factor of 3. Thus, three is halfway between one and nine on a multiplicative or logarithmic number line.
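
To see the two notions of "halfway" side by side, here is a minimal Python sketch; only the endpoints 1 and 9 come from the study, the rest is illustration. The linear midpoint is the arithmetic mean, and the logarithmic midpoint is the geometric mean.

```python
from math import sqrt

lo, hi = 1, 9

# Halfway on a linear (additive) number line: the arithmetic mean.
additive_midpoint = (lo + hi) / 2          # 5.0

# Halfway on a logarithmic (multiplicative) number line: the geometric mean.
multiplicative_midpoint = sqrt(lo * hi)    # 3.0

print(additive_midpoint, multiplicative_midpoint)
```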

Dehaene concludes, and I agree, that our innate perception of numbers is logarithmic (or multiplicative); and that we learn our linear (or additive) scale through our culture. More details are in their paper in Science, whose abstract contains this summary:

This indicates that the mapping of numbers onto space is a universal intuition and that this initial intuition of number is logarithmic. The concept of a linear number line appears to be a cultural invention that fails to develop in the absence of formal education.

So far so good. However, in the Radio Lab interview that I mentioned above, and in the research on estimators cited in the TIME article, the further implication is that the culturally induced shift to a linear number scale is a good thing.

As a street-fighting mathematician, I respectfully disagree. I have been thinking about, teaching, and practicing the art of approximation for almost 20 years: at Caltech, the University of Cambridge, MIT, and Olin College. And I almost always use logarithmic number lines. Halfway between one and nine, I place three.

Here are reasons why a multiplicative, or logarithmic, number line is so useful:

  1. Many quantities grow by an amount proportional to the amount already present. For example, our raises or cost-of-living adjustments (if any!) are expressed as percentages, say 2.5 percent. This change is a multiplicative change: the new salary is 1.025 times the old salary. In contrast, an additive change would be: “Your annual raise is $2000.” If we happen to get the information in that form, as an absolute amount, we automatically translate it into a fractional change – that is, onto a multiplicative number line.

  2. Human perception follows the famous Weber–Fechner law: The just-noticeable difference between two stimuli is proportional to the magnitude of the stimuli. As an example, the light-intensity change to which we are sensitive is proportional to the current light intensity. More metaphorically: A $1000 per-head (poll) tax means far more to a migrant laborer than to a hedge-fund partner.

  3. The world talks to us using signals with a huge range of intensities. For example, our eyes function in starlight and bright sunlight, an intensity range of a factor of 1 billion. Our minds, including our perceptual hardware, manage this vast range partly by using a multiplicative number line. As one piece of that computation, the retina's rod cells, each one a "pixel" of the image, roughly compute the logarithm of light intensity; that is, they place the intensity onto a logarithmic (multiplicative) number line.

    Here are typical illumination levels (data from this Wikipedia article) shown on such a number line, with the levels given in scientific notation (where, for example, 10² means 100 and 10⁵ means 100,000):

    This transformation compresses a huge range of light intensities into a manageable range: It replaces the absolute light intensity with the answer to the question, "How many factors of 10 does the intensity have?" (For those curious about how biochemistry can do such a computation: I also wondered about it, and wrote up an approximate analysis in Chapter 7 of my PhD dissertation.) A short numerical sketch of this factors-of-10 idea appears just after this list.
  4. In everyday life, multiplication is far more useful than addition. For example, suppose you want to estimate, very roughly, an annual budget. Maybe you are a management consultant trying to understand a new industry, or an intelligence agent trying to understand a foreign culture. A budget usually has several important components: for example, salaries and buildings. When finding the total cost, there are three possibilities:

    1. The salary cost is much larger than the buildings’ cost. In this case, the total cost is roughly just the salary cost.

    2. The buildings' cost is much larger than the salary cost. In this case, the total cost is roughly just the buildings' cost.

    3. The salary and buildings’ costs are comparable. Now the total cost is roughly just twice the salary (or buildings) cost.

    All we need is multiplication, or scaling, by 2. This idea of scaling is fundamentally different from the idea of repeated joining (how we often think of multiplication, because, sadly, we learn addition before multiplication — but that story is for another time).

    Here is another example. If you are like me, after you open your monthly credit-card statement, you might find yourself saying, "$1,500!! How in the hell did that happen?" To find out, I ignore all the trivial line items, like the pad of paper or the bagel and apple juice when I had no cash on hand. I run my eye over what remains in order to guesstimate two quantities: (1) the cost of a typical large item, and (2) the number of such items. Then I multiply these quantities and check whether the product mostly explains the monthly bill. (This rough check, and the budget estimate above, are sketched in code just after this list.) Almost never do I start adding up the charges. (A far better way to spend time, alas, is to ensure that all the charges are legitimate.)
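
Here, as a minimal Python sketch, is the factors-of-10 computation from item 3. The lux values are rough, typical figures of my own choosing (stand-ins for the Wikipedia data in the figure), so treat them as illustrative only; the point is that log10 compresses a billion-fold range into single-digit numbers.

```python
from math import log10

# Rough, typical illumination levels in lux (illustrative ballpark figures).
illumination = {
    "starlight (moonless night)": 1e-4,
    "full moon":                  1e-1,
    "indoor lighting":            1e2,
    "overcast daylight":          1e3,
    "direct sunlight":            1e5,
}

# The retina's trick, in spirit: replace each intensity with the answer to
# "how many factors of 10 does it have?"
for scene, lux in illumination.items():
    print(f"{scene:28s} {lux:10.4g} lux  ->  log10 = {log10(lux):5.1f}")
```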
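
And here is the street-fighting arithmetic of item 4 as a second sketch. The function name, the cutoff factor of 3, and all the dollar figures except the $1,500 bill are my illustrative assumptions, not values from the post.

```python
def rough_total(a, b):
    """Street-fighting sum: if one cost dominates, keep it alone;
    if the two costs are comparable, just double the larger one."""
    big, small = max(a, b), min(a, b)
    return big if big > 3 * small else 2 * big   # 3 is an arbitrary "much larger" cutoff

# Budget example: salaries dwarf buildings, so the total is roughly the salary cost.
print(rough_total(2_000_000, 300_000))   # 2000000

# Credit-card check: typical large charge times the number of such charges.
typical_charge, n_charges = 80, 20       # guesses read off the statement
print(typical_charge * n_charges)        # 1600, which mostly explains a $1,500 bill
```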

In short, we need multiplication, not addition. We need multiplicative number lines, not linear number lines. If we start out with logarithmic intuitions — and the evidence is strong that we do — let’s not shed them, let’s amplify them!

robin marlowe

Why is adding not understood at all, or not yet?


Nice post, but people tune out when you mention logs. Try thinking of multiplication as a count of iterations, so the 1 and 9 objects become 1 iteration of 3 and then 3 iterations of that.

I think of multiplication as "instances", as in 3 x 2 is 3 instances of 2. That's more technically correct than iterations - no process implications - but the word is more confusing.

Logs are hard to understand. There is no explanation of why we use the base we do for natural logarithms, other than that the math works out to that number. It is extremely difficult to put into simple words why that number, or another selected base like 10, acts as a base for growth. We generally rely on illustrations, like how a dead body cools in a room, but these raise questions about why the setting involves more than one object, and they teach the mechanics of calculating more than the underlying meaning. It's easy for a mathematically inclined person and nearly opaque to the vast majority.

All fields use rules of thumb. You can try to calculate how much the site work will be for a building - not in a big city, just on dirt that doesn't need environmental clean-up - or you can plug in $50 psf and adjust as you get bids.


Eric M. Jones

Here's a way to think of logs:

Let's say we have a string of digits like 1234567890. We can specify where we want to put a decimal place to get various values, e.g. 1234567890 with decimal place 3 would be 123.4..., etc. But we can represent ANY value if we don't restrict the decimal place to integral values, e.g. 1234567890 with decimal place 3.678 = (whatever).

It should be clear to the student that we can use ANY number as the Base (e.g. 1234567890) and any number as the exponent e.g. 3.678. Certain bases like 2, 10, or "e" confer calculating ease in particular fields.
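
A quick numeric check of that idea, as a Python sketch (the value 123.4 and the bases below are arbitrary illustrative choices): for any base greater than 1, the exponent log(value)/log(base) recovers the value.

```python
from math import e, log

value = 123.4
for base in (2, 10, e, 1234567890):
    exponent = log(value) / log(base)     # the logarithm of value in this base
    print(f"{base} ** {exponent:.6f} = {base ** exponent:.4f}")   # each line recovers ~123.4
```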

Eric Frey

I put zero on the line. That problem does not specify that you are trying to build a mathematical sequence, only that there is 1 item in one place and 9 in another. Nothing about that implies that any number of other objects should be between them.

And multiplicative thinking can just as easily be misleading. It might be natural, but it's often unhelpful. People will drive to another store to save 50% on a $10 item, but won't lift a finger to save $5 when buying a computer. Same savings, but when viewed as 0.5% it seems unimportant.


"People will drive to another store to save 50% on a $10 item, but won’t lift a finger to save $5 when buying a computer."

Some people act like that. Others of us don't. It'd be interesting to find out why, and whether it actually has any relation to how people perceive quantities.


How does this explain people's seeming indifference between millions and billions? A billion dollars is a thousand times more than a million dollars, but people don't see it that way. Three orders of magnitude is pretty significant, even on a logarithmic scale.

Multiplicative thinking creates an inability to identify or care about small variances in large items. For estimating, that might be good, but for actual detail work, it's a terrible approach. If people intuitively understand the multiplicative approach, doesn't it make sense to teach them the technical skill of the additive approach in school instead?


Dear Mike;

When a person is used to thinking in terms of 10's, 100's, or 1,000's, it is not easy to begin to think in terms of 10,000's, let alone millions. The more real the opportunity (given or taken), perhaps the less the indifference. Now, my husband is comfortable thinking in large terms. But then, he reads the Times cover to cover - wars, politics, economics, science, you name it - and he is not indifferent to the numerical side.

Mike B

I'm going to give you a good example of why you're wrong. When I drive from Philly to Baltimore, I'm not halfway there when I hit the Delaware Memorial Bridge; however, according to that tribesman, 30 miles would be halfway through a 100-mile journey. Logs might work great for some abstract concepts, but the truth is that we LIVE on a number line. Distances are a number line. Cutting things to fit (or gluing them together) manipulates a number line. Most things you do, day in and day out, involve linear numbers, not logs. What good is someone who can calculate the proportion of their raise if they fail at the basic number-line math they need to function effectively in society?

Here's a little tip I've learned from Freakonomics: if our brains are hard-wired for something that our ancestors needed to survive, it's probably hindering us in a modern society where we don't have to be on constant lookout for murderous bears. I'll provide a concrete example. People will often go well out of their way to save some small amount of money, let's say $25 on a $100 purchase. However, when making a large purchase, like a car, people will always go for those $250 floor mats, because in comparison to the size of the purchase that amount seems small. That's the result of logarithmic thinking: poor opportunity-cost calculation. We would all be better off making decisions based on absolute valuations of numbers instead of marginal ones.


Enter your name...

Sure, with 30% of the journey completed, you're not truly halfway there. But the human impact of a 60-mile trip and a 100-mile trip isn't that different; neither is a "short" trip. If the little kids ask, "Are we there yet?" after 30 miles, in both cases the answer is "No, and it's going to be a long time still."

Normal people (especially out on the "square states", where I learned to drive) might drive 30 miles for a fairly minor reason, but typically think twice -- and about as much -- about 60- and 100-mile trips. The actual distance, gasoline consumed, etc., is different, but the psychological impact is about the same.


Pretty sure you need both modes of thought; you can cherry-pick examples where one system is better than the other, but it's hard (IMO impossible) to prove that either is strictly better than the other.


I agree that we need both modes of thinking. But as the article points out, our school system seems to have chosen one mode and ignored the other. Schools need to get better at helping students use their brains the way they're designed, in addition to using them for other, less natural things.

Students also need to learn more about statistics in High School instead of Calculus. Engineers may need Calculus, but every citizen needs to be able to understand Statistics and Probabilities.

robin marlowe

Not instead of, but in addition to. That is the problem: until you understand it, you won't see the obvious hole in one.

santosh kumar

Your PhD dissertation is extraordinary, most instructive and informative. And again, why is this not taught at college? My teacher at college would always insist on setting up a differential model, and then the complicated solution follows. This approach simplifies and provides extraordinary insight into physical phenomena. Now a question arises: can our intuitive understanding of physical quantities like force, velocity, and space be equally exploited to get a more comprehensive view of the physical world?

Adam W

Another example is the Richter scale. Most people think of a magnitude 6 earthquake as only slightly milder than a 7. In reality, because the Richter scale is logarithmic, a magnitude 6 earthquake has only one-tenth the shaking amplitude of a 7 (and releases roughly one-thirtieth of the energy).

Pierre Maxted

Logarithmic scales - dontchajusluvem?! As an astronomer I'd have to vote for the magnitude scale for measuring stellar faintness as my favourite, but let's hear it for decibels and the Richter scale too.