
In 2010, Jerome Himmelstein, the Andrew W. Mellon Professor of Sociology, launched “a numbers course that wasn’t about numbers.” With the Spring 2012 semester coming to a close, he discussed his Mellon Seminar, “Numbers Rule the World.”

Below are edited excerpts from an interview with Katherine Duke ’05.

Professor Jerome Himmelstein with students in his Mellon Seminar, "Numbers Rule the World"

On the title “Numbers Rule the World”

That’s been a common claim: that numbers are the predominant form of evidence people point to when they make an argument, [that] numbers are crucial in all kinds of decision-making, and so on. It’s a common theme in all the literature on numbers: numbers are important, [and] we live in a quantified world. The point of the course is to get people to think about the pervasiveness of numbers in their lives and to think more critically about numbers as they are used and presented to make a case for something.

On the structure of the course

We start by reading some basic texts on the role of numbers in public discourse. These are all “how numbers go wrong” books. We try to come up with a catalogue of ways that numbers get distorted or misused.

I argue that the real key issues with regard to the use of numbers aren’t numerical—they’re issues of methodology and research design. So that’s the second part of the course.

The third part is, we start looking at how numbers travel. Say a survey is done, so you end up with maybe hundreds of pages of results, and then the authors or the funding agency comes out with a summary and maybe a press release [or] press conference. The media report on this study and then may use those numbers in other contexts. I look, for example, at studies of drug use, particularly during the crack scare in 1986.

Then I complicate things by saying, “Look, the starting point is often complicated and obscure.” When you look at studies on, for example, the efficacy of drugs, you often find that you don’t know what studies are out there, because they’re selectively reported and selectively published. It takes some digging to figure out what all the studies on a particular antidepressant are saying and what they show. Then we look more generally at what are called “evidence gaps.”

[We end the course by] looking at how numbers frame our everyday lives. We look at things like the development of the SATs and their impact on people, the impact of public-opinion research on how people define themselves, and census data. Now we’re looking particularly at the issue of expertise. One argument is that quantitative expertise is, in some ways, replacing more traditional forms of expertise.

On the development of the Mellon Seminar

I got appointed Mellon Professor, which gave me a grant for three years, and I decided to use it to buy books. I’m constantly noticing studies in the media, claims being made, and then trying to track them down. I was also interested in research methodology; I teach a social research course. I’ve been on the CEP [Committee on Educational Policy], so I had all the documents that had been generated by the Committee on Quantitative Literacy. Most of the literature on quantitative literacy says, “Yes, it should be taught across the curriculum,” but what its authors have in mind is different versions of a kind of statistics course. So I was wondering about doing a course that incorporated a little bit of that but provided more of a historical and sociological perspective.

On student work

There are two research projects and then a final paper. For the final paper, they’re going to look at three very different assessments of expertise, compare them and reach some kind of conclusion.

The first [research project] was to pick a set of numbers in the media and critically assess how well the media handle them. We go back to the methodological issues: Are there issues of causality? Establishing causality is a very complex process. There are also issues that come up with measuring anything. Any measurement is imprecise, for example, so any good measurement should have some way of indicating the range of variation or error, but often that gets lopped off.
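To make that point concrete, here is a minimal sketch of the kind of error range a careful report would keep: the roughly 95 percent margin of error for a poll percentage. The sample size and result below are made-up figures, not taken from any study discussed in the course.

```python
# Minimal sketch: ~95% margin of error for a proportion from a simple random sample.
# The poll result (52%) and sample size (1,000) are hypothetical.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the ~95% confidence interval for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

p_hat, n = 0.52, 1000
moe = margin_of_error(p_hat, n)      # about 0.031, i.e., roughly 3 percentage points
print(f"{p_hat:.0%} ± {moe:.1%}")    # the headline "52%" really spans about 49% to 55%
```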

I have one econ student who writes a financial newsletter and is very interested in the rare earth metal industry. He was attempting to explain that there are different ways of assessing the quality of an ore deposit, that some are clearly better than others, and that the better numbers don’t always get the attention.

The second [research project] was to pick a study and show what happens to the numbers as they move from the study to the media.

Those two projects end up overlapping a lot.



On scientific truth

It’s often assumed that science speaks with a clear voice—that there is research out there that all says the same thing. That’s not always the case. Often, claims are made that are not based on any good research at all. Often, where there is good research, it’s then contradicted by later studies. The closer you look, the more contested scientific truth is. It’s not that there’s no truth out there, but establishing what’s true and then actually figuring out how to use that information is particularly complicated.

That kind of story, about the instability of scientific research, has itself now become news. Is the fact that there are some dissenters on global warming enough to say that anything might be true? I don’t want [students] to come away with that idea. I want them simply to have a more critical perspective and be able to ask good questions about what the research actually looks like. I told them at the beginning of the course, “Being a total skeptic is no better than being totally gullible.”

On the sociological theories and histories of numbers

[Michel] Foucault raised the idea that one important form of power occurs, in this very subtle way, simply by submitting everyone to a certain kind of measurement. When you measure something, you’re not just observing and reflecting the world—you have an impact on it. That feeds into a more recent body of literature that characterizes contemporary Western societies as “audit societies”: societies, increasingly, where people and institutions are responsible for evaluating themselves and presenting the results as numbers that anyone, supposedly, can read and interpret. For example, law schools reorient themselves in important ways around their rankings.

There are so many distinct literatures that usually don’t talk to each other. There’s a literature on the use of numbers in public discourse and how to evaluate those numbers. There are multiple literatures on the history of measurement and the history of probability and statistics. Starting with the literature on quantitative literacy, I found myself branching out in multiple directions, and that’s exciting, and it’s also very challenging to figure out how to synthesize it all.

Photos by Rob Mattson