how social scientists think: doing it right
Thanks to everyone who commented on the How Social Scientists Think posts last week. You raised a lot of important issues about the differences in the ways that advocates and academics approach our respective jobs, and I hope it will be helpful to be aware of those differences in future debates. So much of the time, we talk past one another without understanding one another's "languages." I'd like to get past that and into a more productive dialogue through which we can gather and disseminate information in a timely fashion so that it might really make a difference.
To that end, I want to draw your attention to one advocacy group that got it right. This past July in Kampala, I attended an event at which 24 Hours for Darfur's Darfurian Voices report was presented by Jonathan Loeb, one of the primary authors of the report. The goal of the project was to collect and document "Darfurian Refugees' Views on Issues of Peace, Justice, and Reconciliation." This project is one of the best examples I know of advocacy work that used the best social science research methods available to address their questions. What are some of the things the group did right?
- Not starting with an answer. The team asked a research question, but did not presume to know the answer beforehand. Too many advocacy organizations I've encountered decide up front what they think about a situation and then proceed to discount information that conflicts with their pre-determined view. (To be fair, some academics do this, too.) Being open to all possible answers - even when they don't confirm the common wisdom - is key to doing solid research.
- Random sampling. Darfurian Voices had a limited pool from which to work (Darfuri refugees in camps in eastern Chad), but the team made every effort to draw a random sample within that group so as to represent the full range of opinions. Moreover, they were careful to note that the responses in the report do not represent the views of all Darfuris.
- Collaboration with academics. Loeb doesn't have a PhD, but he was smart enough to collaborate with people who do. The research design and survey instrument were vetted by a team of academic experts, and the data were analyzed using solid, well-established statistical methods.
- Transparency. At the meeting in July, Loeb had his code book. He let me flip through it at length so I could inspect the team's method. It was super-solid, and the report itself contains detailed information about sampling methods, research methodology, and even how the surveys were translated into Arabic.
Labels: how social scientists think