The McNamara fallacy (also known as the quantitative fallacy),[1] named for Robert McNamara, the U.S. Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.
Daniel Yankelovich criticized McNamara's decision making as follows:
But when the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can't easily be measured or given a quantitative value. The third step is to presume that what can't be measured easily really isn't important. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide.[2]
While Yankelovich originally referred to McNamara's ideology during the two months that he was president of Ford Motor Company, commentators have also discussed the McNamara fallacy in relation to his attitudes during the Vietnam War.
The McNamara fallacy is often considered in the context of the Vietnam War, in which the U.S. Secretary of Defense Robert McNamara attempted to reduce war to a mathematical model.
One example arose in an early 1962 conversation between U.S. Air Force Brigadier General Edward Lansdale and McNamara. Lansdale reportedly told McNamara, who was trying to develop a list of metrics to allow him to scientifically follow the progress of the war, that he needed to add an 'x-factor'; McNamara wrote that down on his list in pencil and asked what it was. Lansdale told him it was the feelings of the common rural Vietnamese people. McNamara then erased it and sarcastically told Lansdale that he could not measure it.[3]
New York Times correspondent David Halberstam wrote about McNamara's fixation on metrics:
In 1964Desmond FitzGerald, the number-three man in the CIA, was briefing him every week on Vietnam, and FitzGerald, an old Asia hand, was made uneasy by McNamara's insistence on quantifying everything, of seeing it in terms of statistics, infinite statistics. One day after McNamara had asked him at great length for more and more numbers, more information for the data bank, FitzGerald told him bluntly that he thought most of the statistics were meaningless, that it just didn't smell right, that they were all in for a much more difficult time than they thought. McNamara just nodded curtly, and it was the last time he asked FitzGerald to brief him.[4]
Another example of the fallacy is the enemy body count metric, which was taken to be a precise and objective measure of success: the assumption was that maximizing estimated enemy deaths while minimizing one's own losses would assure victory. Critics such as Jonathan Salem Baskin and Stanley Karnow noted that guerrilla warfare, widespread resistance, and inevitable inaccuracies in estimates of enemy casualties can thwart this formula.[5][6]
David Halberstam wrote of another such incident in which qualitative facts were disregarded due to quantitative bias:
One particular visit seemed to sum it up: McNamara looking for the war to fit his criteria, his definitions. He went to Danang [sic] in 1965 to check on the Marine progress there. A Marine colonel in I Corps had a sand table showing the terrain and patiently gave the briefing: friendly situation, enemy situation, main problem. McNamara watched it, not really taking it in, his hands folded, frowning a little, finally interrupting. "Now, let me see," McNamara said, "if I have it right, this is your situation," and then he spouted his own version, all in numbers and statistics. The colonel, who was very bright, read him immediately like a man breaking a code, and without changing stride, went on with the briefing, simply switching his terms, quantifying everything, giving everything in numbers and percentages, percentages up, percentages down, so blatant a performance that it was like a satire. Jack Raymond of the New York Times began to laugh and had to leave the tent. Later that day Raymond went up to McNamara and commented on how tough the situation was up in Danang, but McNamara wasn't interested in the Vietcong, he wanted to talk about that colonel, he liked him, that colonel had caught his eye. "That colonel is one of the finest officers I've ever met," he said.[7]
Halberstam concluded: "...he [McNamara] did not serve himself nor the country well; he was, there is no kinder or gentler word for it, a fool."[8]
Donald Rumsfeld, U.S. Secretary of Defense under George W. Bush, sought to prosecute wars with better data, clear objectives, and achievable goals. Jon Krakauer writes:
... the sense of urgency attached to the mission came from little more than a bureaucratic fixation on meeting arbitrary deadlines so missions could be checked off a list and tallied as 'accomplished'. This emphasis on quantification has always been a hallmark of the military, but it was carried to new heights of fatuity during Donald Rumsfeld's tenure at the Pentagon. Rumsfeld was obsessed with achieving positive 'metrics' that could be wielded to demonstrate progress in the Global War on Terror.[9]
However, Rumsfeld did show a sense of the existence, and importance, of data that is not quantifiable with his famous "There are unknown unknowns" answer to a question posed at a U.S. Department of Defense (DoD) news briefing on February 12, 2002.
This shows at least some progression away from the McNamara fallacy ('The fourth step is to say that what can't be easily measured really doesn't exist').[10]
There has been discussion of the McNamara fallacy in the medical literature.[11][12] In particular, the McNamara fallacy is invoked to describe the inadequacy of using only progression-free survival (PFS) as a primary endpoint in clinical trials of agents for metastatic solid tumors simply because PFS is readily measurable, while failing to capture outcomes that are more meaningful, such as overall quality of life or overall survival.
In competitive admissions processes, such as those used for graduate medical education,[13] evaluating candidates using only numerical metrics ignores non-quantifiable factors and attributes that may ultimately be more relevant to the applicant's success in the position.