One of my favorite lectures at SIPA was Paul Thurman’s last lecture for Quantitative Analysis for International & Public Affairs.
Stats is required for every SIPA student, not just those concentrating in economic development or international finance, and a lot of people hate it in the same way that high school students hate math. Thurman was tasked with the very tough job of convincing us that it was relevant to each and every one of our careers. After a semester of practice sets and STATA and normal distributions and t-tests and midterms, Thurman sat us down, flipped on a projector, and pretty much convinced us all.
We had just finished presenting our final projects, which (he teased) had probably been rush jobs—all-nighters pulled in the library frantically running multiple regressions. He asked us to walk through another rushed stats project with him to see if we could figure out what had gone wrong. The slides he put up were partially blacked out to protect the confidentiality of the client, but the essence was that a set of tests meant to identify possible correlations between temperature and the failure of a certain mechanical part had been horrifically misinterpreted.
Turns out the mechanical part was an O-ring—the O-ring that failed on an unusually cold day in January 1986, causing the Challenger to explode.
While the engineers in charge of determining safety—people whose job it was to run the numbers and interpret the math—had correctly identified this as a problem, multiple people at other layers of the project—managers at both NASA and Morton Thiokol, the contracting company responsible for building the O-rings—had decided to go ahead with the launch anyway.
Thurman’s point was that statisticians, engineers, and data geeks aren’t the only ones who need to pay attention to the numbers. Most of us would at some point be in a position where we would need to make decisions based on quantitative analysis, and given our collective interest in development, finance, and economics, many of these decisions could have real, serious impact on people’s lives. He then put a photo of his kids up on the screen and charged us with making sure, essentially, that we didn’t fuck things up for them or future generations.
(It sounds almost unforgivably hokey now, but I wasn’t the only one with goosebumps, and some people actually cried.)
Four years later, I’m taking David Malan’s intro to computer science course, CS50. Monday’s class was partially about imprecision in float variables, and Malan showed a video that took me right back to Thurman’s class:
(In case you’re not up for 9 minutes of Modern Marvels: because “one tenth of a second” can’t be represented exactly in binary, the Patriot system’s internal clock drifted further from the true time the longer it ran. Because of that accumulated error, the system failed to intercept an incoming Iraqi Scud missile during the Gulf War, leading to the deaths of 28 American soldiers. A short explanation is here.)
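If you’d rather see the problem than watch it, here’s a tiny C sketch of the same idea. It’s my own illustration, not the Patriot’s actual code (which, as I understand it, used fixed-point arithmetic rather than IEEE floats): store “one tenth of a second” in a float and let a simulated clock add it up, ten ticks per second, for an hour.

```c
#include <stdio.h>

int main(void)
{
    // Illustrative only: IEEE floats, not the Patriot's real arithmetic.
    // 0.1 has no exact binary representation, so the stored value
    // is only an approximation of one tenth.
    float tenth = 0.1f;
    printf("0.1 stored as a float: %.20f\n", tenth);

    // Simulate a clock that ticks ten times per second for an hour
    // (36,000 ticks of "one tenth of a second"). The tiny per-tick
    // error compounds instead of canceling out.
    float clock = 0.0f;
    for (int tick = 0; tick < 36000; tick++)
    {
        clock += tenth;
    }

    printf("Expected seconds after one hour: 3600.000000\n");
    printf("Float clock after one hour:      %f\n", clock);
    return 0;
}
```

The stored value isn’t exactly 0.1, and the simulated clock drifts measurably away from 3600 seconds after just one hour; the Patriot’s version of this drift, accumulated over roughly a hundred hours of continuous operation, was enough to put the intercept calculation in the wrong place.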
To sum up: numbers are really important.