OK, I am no pundit, but I am a scientist, so I know my way around research methodology. Sampling procedures, survey design, data analysis with all its beta-weights and effect sizes are like family to me. So, I get very interested when I hear stuff like this:
So, the FOX News folks are discussing the increase in liberal views among college graduates. The discussion is titled "Trouble with Schools: Indoctrination University" and "How well are universities teaching kids?", and the correspondents call into question the entire purpose of going to college, declaring it an "open question". While I am laughing through my tears, I am tracking down the original document, which you can read at your leisure right here.
While Tucker Carlson admits to his sheer impotence when it comes to delivering the core of the study (he has not seen the methodology and claims that it's pretty straightforward: give college graduates a survey and then process their answers), I am diving right into the depths of right-wing insanity (which the Intercollegiate Studies Institute, the outfit that conducted the study, without a doubt is) in hopes of understanding how it was done.
I took a detour to look at the ISI's front page, just to make sure that Lasciate ogne speranza, voi ch'intrate ("Abandon all hope, ye who enter here") is still inscribed above their gates. It is still there, but the translation is inaccurate. It says: "Every study shows that the university is dominated by liberal professors. It is no wonder this country is currently on a slippery slope to socialism."
Part 1: Sampling. "The telephone survey data can be taken to represent a probability sample of all individuals who live in households with residential telephone service in the United States."
There are definite issues associated with this method of selection. One dates as far back as 1948, when the Chicago Tribune shamed itself with the "Dewey Defeats Truman" headline. The pollsters, too, relied on telephone polls, disregarding the fact that those who owned telephones were more affluent, older and, well, more likely Republican. Today the situation is similar, with a twist: cellular service is on the rise. Nielsen Mobile did a study in 2008 comparing the wireless-only and landline-owning populations. According to the results, those living in poverty (26%) or nearing it (22.6%) were more likely to cut the cord than those with higher incomes (14.2%). So, yet again, we are dealing with the more affluent population, and affluence in this country is related to conservative views, a very Marxian picture.
So, what's with the wireless people? According to the same Nielsen Mobile report, roughly 17% of the population was completely wireless, and the number keeps growing fast. So that demographic is omitted from the survey entirely. To be fair, the report clearly states the inference: the findings represent landline owners. Nielsen suggests that the cordless population is substantially different from those who are wired. Then let's interpret it correctly: these findings do not represent all of America, since the initial sampling procedure excludes nearly a fifth of Americans by default. That, my friends, is a lot of people to overlook.
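To make the coverage problem concrete, here is a minimal sketch in Python with made-up numbers (the 17% wireless-only share echoes the Nielsen figure above; the attitude percentages are purely illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population of 100,000 adults.
# Roughly 17% are wireless-only (per the Nielsen figure cited above);
# assume, purely for illustration, that 55% of them hold "liberal" views
# versus 40% of landline households.
N = 100_000
wireless_only = rng.random(N) < 0.17
p_liberal = np.where(wireless_only, 0.55, 0.40)
liberal = rng.random(N) < p_liberal

true_share = liberal.mean()

# A "telephone survey" that can only reach landline households:
landline_sample = rng.choice(np.flatnonzero(~wireless_only), size=2000, replace=False)
survey_share = liberal[landline_sample].mean()

print(f"True share of liberal views:   {true_share:.3f}")
print(f"Landline-only survey estimate: {survey_share:.3f}")
# The gap is coverage bias: resampling more landlines does not fix it.
```

The point is simply that when the excluded fifth differs from the covered four fifths, the estimate is off no matter how large the landline sample gets.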
Part 2: Weighting. "A standard weighting process was applied to the data to adjust for error inherent in the sampling methodology. The frame of the general population was aligned to the national population, as taken from the 2006 American Community Survey, and a weight was applied based on age, gender, education, and race."
Well, during the period of my scholastic indoctrination, one of my doctoral emphases was Research Methodology, so I have taken my fair share of classes that discussed every minute detail of processing data. I heard this from every single prof and read it in every single textbook, from remedial to advanced: you have to have a pretty good theoretical reason to assign weights, and when you do, you have to explain in detail how and why you went about it. Looking further into the text, I do see a regression equation (which is doomed to get an eye roll from anyone who is not in the research business), but I don't see a reasonable explanation of how the weights were applied.
Using weights is not hard, and it is helpful when you want to beat your data into submission. A weighting system has been used in the US for a long time: I am rich, so I deserve to have health care; you are poor, so you really don't. So, I would like to know more about the weighting procedure used by the study, since I am not sure what a "standard" procedure is.
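For the record, a "standard" weighting step usually means post-stratification: compare the sample's demographic mix to a census benchmark and weight respondents up or down until the two match. Here is a minimal sketch of that idea in Python; the age brackets, population shares, and answers are all invented for illustration, since the ISI writeup does not spell out its own cells:

```python
import pandas as pd

# Toy sample: each row is a respondent with an age bracket and an answer.
sample = pd.DataFrame({
    "age":    ["18-34", "18-34", "35-54", "35-54", "55+", "55+", "55+", "55+"],
    "answer": [1, 0, 1, 1, 0, 0, 1, 0],   # 1 = agrees with some item
})

# Population shares for the same brackets, e.g. from the American Community
# Survey. These numbers are invented for illustration.
population_share = pd.Series({"18-34": 0.30, "35-54": 0.40, "55+": 0.30})

sample_share = sample["age"].value_counts(normalize=True)
weights = sample["age"].map(population_share / sample_share)

unweighted = sample["answer"].mean()
weighted = (sample["answer"] * weights).sum() / weights.sum()

print(f"Unweighted agreement: {unweighted:.2f}")
print(f"Weighted agreement:   {weighted:.2f}")
# Reporting this step means readers can see which cells were adjusted and by
# how much, which is exactly what the study's description leaves out.
```

Nothing exotic: the procedure is transparent precisely when the weights and the benchmark used to build them are reported, which is the part I am missing here.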
Part 3: The use of regression. I typically find the use of linear regression problematic: very few things in the social world are associated in this direct fashion; most social relationships are moderated, mediated, and take funny shapes (quadratic or cubic). So, while to a lay person all those beta-coefficients and stochastic errors will look impressive and scientific, crucial pieces of information went missing. What is the adjusted R-squared of all these labors? R-squared is the number that tells you how much of the variance in the dependent variable (in this case, I believe, that would be liberal values and civic knowledge) is explained by the regression components (in this case, the degrees). I would also like to see the effect sizes, that is, how much each component influences the outcome, and to know how well the model fits overall (give me that F-value!).
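Just to show what reporting those missing numbers looks like, here is a quick sketch with statsmodels on fabricated data; the variable names and the effect built into the data are mine, not the study's:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Fabricated data: a 0/1 "holds a bachelor's degree" flag, age, and an
# attitude score that only weakly depends on the degree.
n = 500
degree = rng.integers(0, 2, size=n)
age = rng.normal(45, 12, size=n)
attitude = 0.3 * degree - 0.01 * age + rng.normal(0, 1, size=n)

X = sm.add_constant(np.column_stack([degree, age]))
model = sm.OLS(attitude, X).fit()

print("Coefficients:      ", model.params)        # effect of each predictor
print("Adjusted R-squared:", model.rsquared_adj)  # share of variance explained
print("F-statistic:       ", model.fvalue, "p =", model.f_pvalue)
# A table of beta-weights without these numbers says nothing about how much
# of the outcome the degrees actually explain.
```

Three extra lines of output, and suddenly the reader knows whether the impressive-looking equation explains 40% of the outcome or 4%.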
Now, an additional issue is as empirical as it is theoretical: is there a real relationship with the actual degree, or with other variables that were discounted? Is the difference attributable to hanging out on campus for x number of years, socializing and expanding one's horizons (thanks, Tucker Carlson, you opened the door)? Universities have been trying to promote diversity on campus for years, so perhaps being exposed to all kinds of people and learning more about different beliefs, traditions and values helps adjust one's view of the world, as opposed to, how did they put it, those liberal professors "pushing their values on young impressionable kids"? In fact, in the professorial trade the aspiration is not to let your students know what views you support, so as to foster free learning.
And finally, some things you learn in college have little to do with values and a lot to do with objective reality. So, spinning the Horatio Alger dime-novel tales of rags to riches can be easily challenged with facts: the US has the second-lowest economic mobility rate among developed industrialized nations (the UK has the lowest). 42% of kids born into the bottom 20% of the US population will remain there, and the picture is similar at the top 20%. More than 50% of people at the bottom will still be there ten years later. At the same time, those people work the most physically demanding jobs, and many hold more than one job at a time, so it is not a matter of working hard. Puritan values are obsolete, because our achievement system is no longer pure. So, when you are in college, you are exposed to facts and figures that are hard to miss, and there is no axe that can battle them. Durkheim called it sui generis: a reality of its own, independent of the individual properties of the participants.
Part 4: The questions. Every professional survey designer knows the mighty power of wording: it is all too easy to skew the results through careless language. The golden rule is to use neutral, unambiguous, simple language. This is remedial stuff; at ISU we teach it in the first-year methods course. So, let me give you a couple of poster examples of what not to do in a survey.
"America is the world’s greatest melting pot where people from all countries can unite into one nation." - this item is loaded, it is leading the participants to respond in a certain way. Who would want to appear Anti-American and contradict the "greatness"?
"Abortion should be available at any stage and for any reason." - same here, this is a severe, polarizing issue and there are no absolutes, therefore, issues of this caliber should be approached with certain gentleness. As an avid pro-choicer who doesn't believe that abortion should be allowed at any stage for any reason, I have no choice, but to support this statement. The Strongly Disagree to Strongly Agree scale is irrelevant in this matter, because the item is worded in the worst possible manner.
"The Bible is the Word of God." - last time I checked, we are guaranteed the freedom of religion, which means there are other religions possible than the one that assumes the Bible as the key reading. So, this questions takes out about 22% of the American population who are not Christian. That in itself has absolutely nothing to do with the education those people receive, does it?
"Raising the minimum wage decreases employment." - well, considering the climbing unemployment rate, who wants to contribute to that cause? An initially biased question produces the desires result.
Anyway, I could pound on the quality of this study all day and have fun doing it. I am not against the results, and I find the conservative outcry over the liberalization (or, shall I say, liberation) of their fellow conservatives fairly amusing. I am just not sure that equating "liberal" with the poor quality of civic education (which is an entirely different topic for a different post) is befitting. And, again, what on Earth is "liberal" when put in this context? The ability to think critically and use objective facts when making a decision? If that's the case, please turn me liberal! Oh, hold on, I already am...