Another Way to Look at Free-Throw Percentage

In a recent blog post, we linked to a New York Times article by John Branch showing that the percentage of made basketball free throws has remained steady for 50 years.

A reader named Ashley Smart (aptonym?) replied with an amplification/caveat that is well worth sharing:

I, like many of your other Freakonomics readers, was intrigued by John Branch’s article on free-throw shooting stagnation. Unlike many of your other readers, I would suppose, I instantly recalled a similar study which yielded a contrasting, if not contradictory result. You are probably not familiar with the other study, and that is quite forgivable; it was my own — impulsively undertaken, purely curiosity-driven, and very much unpublished.

Though relatively informal, my study was quantitative and easily repeatable. I simply found the average free-throw percentage of the top 20 N.B.A. free-throw shooters for each season, as listed in the N.B.A. encyclopedia, and plotted it as a function of time:


The contrast between this figure and the figure from the Times article is stark: the mean free-throw percentage in my study increased appreciably, particularly between the N.B.A.’s inception circa 1950 and the mid-1980s. A saturating exponential model (dashed line in the figure) seems to provide a reasonable fit, suggesting that the top free-throw shooters are asymptotically approaching a performance ceiling (which the model predicts is about 93.5 percent).

This result doesn’t contradict John Branch’s article so much as add perspective to it: sure, average N.B.A. shooters have stayed nearly the same, but the best shooters have certainly gotten better. It’s not the mean; it’s the variance. There are several potential explanations: growth of the league (more players, all else equal, means relatively more poor shooters), less emphasis on free-throw shooting at certain positions, etc. Perhaps taking only the top 20 shooters has the effect of muting the statistical noise; i.e., the relatively small increase in the overall mean is real, just overshadowed by noise. No doubt your readers could produce more convincing, more colorful explanations; I would rather leave that part up to them.


(1) The * in the figure title is because there were four seasons (’46, ’47, ’48, and ’00) for which I could only find data for the top 10 free-throw shooters.

(2) I fit the data with the model y - y_o = y_g*[1 - exp(-x'/tau)]. Here y_o is a baseline percentage and y_g is the gain above that baseline, so y_max = y_o + y_g is the performance ceiling. x' = x - x_o, where x_o is the onset time of the exponential, and tau is the characteristic growth time. I got y_o = 48 percent, y_max = 93.5 percent, x_o = 1898, and tau = 47 years.
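For readers who want to try the fit themselves, here is a minimal sketch of fitting Ashley's saturating-exponential model with SciPy. This is not Ashley's code, and the season data below are synthetic, generated from the reported parameters (y_o = 48, y_max = 93.5, tau = 47 years) with a little noise, purely to demonstrate the mechanics; x_o is held fixed at 1898 because it trades off against the other parameters in the fit.

```python
# Sketch (not Ashley's code) of fitting y = y_o + y_g*[1 - exp(-(x - x_o)/tau)].
# The "observations" are synthetic, generated from the reported fit parameters,
# just to show how the fit works; swap in real season averages to do it properly.
import numpy as np
from scipy.optimize import curve_fit

X_O = 1898.0  # onset year, held fixed (it trades off against y_g otherwise)

def ft_model(x, y_o, y_g, tau):
    """Mean top-20 free-throw percentage: baseline plus a saturating gain."""
    return y_o + y_g * (1.0 - np.exp(-(x - X_O) / tau))

rng = np.random.default_rng(0)
seasons = np.arange(1950, 2009, dtype=float)
# Noisy synthetic observations around the reported curve (y_g = 93.5 - 48 = 45.5)
pct = ft_model(seasons, 48.0, 45.5, 47.0) + rng.normal(0.0, 0.5, seasons.size)

# Fit the three free parameters, starting from rough guesses
(y_o, y_g, tau), cov = curve_fit(ft_model, seasons, pct, p0=[50.0, 40.0, 40.0])
print(f"baseline y_o = {y_o:.1f}%, ceiling y_max = {y_o + y_g:.1f}%, tau = {tau:.0f} yr")
```

With the real season-by-season top-20 averages in place of the synthetic `pct` array, the same call should recover something like Ashley's parameters; the second return value of `curve_fit` is the covariance matrix, which gives a rough error bar on the estimated ceiling.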


Comments
  1. Matt says:

    Top 20 by percentage or by volume of shots?

  2. Rebekah says:

    This suggests that the bottom tier of free-throw shooters has gotten worse over time. Not being a basketball fan, I can only assume other skills are being emphasized for certain positions. Do the top free-throw shooters represent a particular position (or two)?

  3. Adam says:

    Looking at the graph, you would think they would be shooting better than 0.9% :)

  4. chappy says:

    I said this earlier, but it seems like there is a huge selection issue over time. If we assume that there is more shooting variance over time (and that advance scouting has gotten better), there is a greater cost to fouling the best players. I think the fact that the “Hack a Shaq” strategy has been in vogue helps illustrate this.

  5. frankenduf says:

    i wonder if (relative) # of fouls called has been constant- it may be the case that if fouls are increasing, there may be an increase in the ‘hack-a-(your gangly big man here)’ strategy here, where not only skill of shooter but allowable dilution of free throw percentage comes into play- come to think of it, who invented the hack-a-shaq? did it predate shaq foo?

  6. Math Man says:

    A brief review of 2007-08 statistics reveals a probable explanation for the difference between these two analyses. Generally speaking, players that have a higher free throw percentage attempt fewer free throws, and players with lower free throw percentages attempt more free throws.

    This makes sense. Late in a close game, the losing team will foul the opposing team players, to stop the clock and have them shoot free throws, instead of letting the clock run and risking an even higher score. The strategy only works when the winning team misses the free throws. So, the losing team will foul the worst shooter in the game.

    So, the analysis of the top 20 free throw percentages does show that free throws are being made more often by the best free throw shooters. The league average is pulled down by a corresponding drop in free throw percentage by the worst shooters.

  7. Andrew says:

    Hey I agree with Chappy. Terrible free throw shooters will get fouled more, and good free throw shooters will be fouled less. But that has probably not changed over time.

    I think it is very interesting that the best players in the NBA have been getting better. I do not think that the self selection factor explains it all.

  8. Zach says:

    I wonder if part of the explanation for these contradictory conclusions is that NBA players are becoming more specialized over time, similar to pitchers in baseball. If some players specialized in shooting (free-throw and otherwise) at the expense of other skills and some players specialized in non-shooting skills (thus hurting their FT%), then we might expect little change in the league-wide average, but a wider distribution. The wider distribution would explain Ashley’s findings.
