"Africa is, indeed, coming into fashion." - Horace Walpole (1774)


shameless self-promotion and you should read this

I'm speaking on the Congo twice tomorrow, both times in Georgia. Here are the details:
  • Thursday, October 28 at Columbus State University at 12:30pm in the International House. I'll be giving a lecture on Conflict & Peace in the DR Congo.
  • Thursday, October 28 at Emory University, I'll be on a panel about conflict minerals and the Congo sponsored by Delta Sigma Theta. It's at 7:13pm (yes, that's the right time) in White Hall room 206.
I'd love to meet blog readers at either of these events; please stop by to say hi if you can make it.

Also, you should go read Dave Algoso's piece at Foreign Policy. Dave explains (much more politely than I could have) why The Kristof's DIY aid solution is not a good idea. At all.


how social scientists think: doing it right

Thanks to everyone who commented on the How Social Scientists Think posts last week. You raised a lot of important issues about the differences in the ways that advocates and academics approach our respective jobs, and I hope it will be helpful to be aware of those differences in future debates. So much of the time, we talk past one another without understanding one another's "languages." I'd like to get past that and into a more productive dialogue through which we can gather and disseminate information in a timely fashion so that it might really make a difference.

To that end, I want to draw your attention to one advocacy group that got it right. This past July in Kampala, I attended an event at which 24 Hours for Darfur's Darfurian Voices report was presented by Jonathan Loeb, one of its primary authors. The goal of the project was to collect and document "Darfurian Refugees' Views on Issues of Peace, Justice, and Reconciliation." It is one of the best examples I know of advocacy work that used rigorous social science research methods to address its questions. What are some of the things the group did right?
  • Not starting with an answer. The team asked a research question, but did not presume to know the answer beforehand. Too many advocacy organizations I've encountered decide up front what they think about a situation and then proceed to discount any information that conflicts with that pre-determined view. (To be fair, some academics do this, too.) Being open to all possible answers - even ones that don't confirm the common wisdom - is key to doing solid research.
  • Random sampling. Darfurian Voices had a limited pool from which to work (Darfuri refugees in camps in eastern Chad), but they made every effort to get a random sample within that group so as to represent the full range of opinions. Moreover, they were careful to report that the responses in the report do not represent the view of all Darfuris.
  • Collaboration with academics. Loeb doesn't have a PhD, but he was smart enough to collaborate with people who do. The research methodology and survey were vetted by a team of academic experts and analyzed using solid, well-established statistical methods.
  • Transparency. At the meeting in July, Loeb had his code book. He let me flip through it at length so I could inspect the team's method. It was super-solid, and the report itself contains detailed information about sampling methods, research methodology, and even how the surveys were translated into Arabic.
With the publication of this report, 24 Hours for Darfur worked themselves out of a job: the need for the organization no longer exists, so neither does the organization, which is a lesson some other advocacy groups could take as well. The team behind the Darfurian Voices report did a great job of showing how solid research techniques can help advocates make stronger arguments. It's doable.



costumes for development geeks

Following the popularity of last year's post, we here at Texas in Africa headquarters humbly offer some suggestions for your Halloween costuming needs should you be hanging out with other people who understand why dressing as a development economist is hilarious. And, really, everybody else is going to come as the Double Rainbow guy or Christine O'Donnell. Be creative:
  • One Million Shirts - Oh, like you didn't see this one coming a mile away. This is easy for those of you who have lots of t-shirts left over from college clogging up your dresser drawers, but it's also handy if you're on a remote posting in a place that has a market full of cast-off t-shirts. The trick is two-fold: 1) you need to wear as many t-shirts as possible. Layer them, tie one around your head, make them into sweatbands, wear them as pants, whatever. 2) Carry a bottle of hatorade.
  • SWEDOW - In a similar vein, pull all the Stuff WE DOn't Want (as @talesfromthehood so brilliantly named all the throwaway junk Americans have a habit of donating to people who neither want nor need it) you can find out of the closet and wear it. TOMS shoes, used lingerie, jeans with too many holes to be wearable - whatever works.
  • Aid Transparency - Wear something floaty and/or transparent, then get a friend to dress as Bill Easterly and chase you around all night. (See last year's post for hints on dressing like the Greatest of All Aid Bloggers.) When asked for data or proof of anything at all, make up answers that blame your inability to answer on some nonexistent permission problem. Bonus: add in Jeff Sachs, who should run around all evening throwing fake money at problems.
  • Rwanda's New Times - The New Times is, of course, Rwanda's government-supported/written daily. To go in this costume, wear black and white so you look like a newspaper. Spend the entire party denying obvious facts with outrage: "This drink isn't cold, it's hot!" "Halloween isn't in October if we say it isn't!" Bonus points if you ask everyone who questions you, "Why do you want this party to fail?"
  • Omar al-Bashir - Sudan's president/indicted war criminal has quite a few looks as this handy Foreign Policy fashion spread makes clear. Pick one of those to wear. To really pull this off, you'll want to flaunt your obviously well-stamped passport and tout the frequent flier miles you've accrued since the ICC indictment came down. (Tasteless? Possibly. But not as tasteless as what the countries letting al-Bashir get away with this stuff are doing.)
  • Saving Haiti - This is a great costume for two: one of you should go as Wyclef Jean (wearing a Wyclef 4 Prez button or t-shirt) and the other as Sean Penn (with a gun in your belt). This would be another great place to pull out your SWEDOW, but the key to pulling it off is to focus entirely on yourselves and your so-called contribution while ignoring the desires and suggestions of the people around you and all experts.
  • The Sucking Vortex - Some friends and I have been trying to figure out how you could dress up as Time Africa Bureau Chief Alex Perry's infamous description of the DRC pretty much since he made the mistake of calling it that and then made the bigger mistake of reacting angrily (and in public) to criticism of his comment. We can't figure out how you'd pull it off, but we all agree: the creation of the outfit should involve a BeDazzler. If you come up with something, let us know in the comments.
Other ideas?





how social scientists think: we're not completely sure about much

If we've gathered information thoroughly using solid methods, if we've chosen from where and from whom to get data in a systematic manner, and if we haven't mixed up correlation and causation, then eventually we ought to have an answer to our original research question, right?

Well, maybe. If most social scientists are honest, they'll admit that we rarely know anything for sure. We study human behavior, and since we can't ethically design massive social experiments that would tell us once and for all how humans will behave in any given political situation, our data - and therefore our explanation - is never perfect. Unlike our colleagues in the hard sciences, whose worlds are filled with laws and near-certainties (like gravity), social scientists usually operate in a land of heavily caveated pronouncements and exceptions to rules. (Hence talk titles like Professor Blattman's recent "10 Things I Kindof Believe About Conflict and Governance" lecture, or claims that "it seems to be the case that under condition X, Q prevails.") As Hans Noel points out, political science really has only one law - the one explaining why third parties will never have a chance in the United States - and even that has caveats.

Basically, we're pretty sure about a lot of things, but aren't completely sure about any of them.

Why? Part of the reason has to do with the pesky inconsistency of human behavior. Many social scientists are reasonably convinced that most people are rational, self-interested actors most of the time, but that doesn't help us explain why someone would give away a fortune to live in poverty, run into a burning building to save children at the cost of his own life, or engage in other altruistic behavior. Add in cultural differences, religious beliefs, and differing conceptions of what constitutes self-interest, and the rational actor model starts to explain so much that it might not explain anything at all.

Another reason we can't be completely certain about our findings ("findings," by the way, is just the fancy word for "what we figured out") is that our data is often imperfect. Sometimes, the data you need just doesn't exist. (Nowhere is this more true than for those of us working in extremely fragile states. What I wouldn't give for solid, reliable population figures.) When data doesn't exist, sometimes the only choice left is to eliminate all other possible explanations and hope that the one left standing is right.

Social scientists operate on the basis of eliminating possible explanations. When we begin a research project, we propose a series of hypotheses to explain causal relationships. Then we use our data to eliminate the ones that are obviously wrong - that is, those built on correlations that turn out not to be causal. If we can show that our hypothesis works while all the other plausible explanations fail to account for the effect, then we can be reasonably certain that we've gotten it right. But there's always a little doubt. In statistical analysis, that doubt is expressed through what is called a "confidence interval" - a range of values that, given the data, is likely to contain the true answer - along with a confidence level that says how likely. If we can be 95% confident in a finding, that's good enough for most social scientists. But we're still not 100% sure, and usually can't be.
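For readers who want to see the mechanics behind that 95% figure, here's a minimal sketch (using invented numbers, not data from any study discussed here) of how a 95% confidence interval for a simple survey proportion is computed:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """95% confidence interval for a sample proportion (normal approximation)."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return (p - margin, p + margin)

# Hypothetical survey: 540 of 900 respondents favor a peace agreement.
low, high = proportion_ci(540, 900)
print(f"60% support, 95% CI: ({low:.3f}, {high:.3f})")
```

Notice that the interval shrinks as the sample grows: with 900 respondents we can only say support is somewhere around 57-63%, which is exactly the kind of hedged claim described above.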

If there's only one explanation left, but there's no data with which to analyze that explanation, it's harder. A researcher in that situation is dealing with a known unknown - he usually knows exactly what data he needs, but for whatever reason (war, rebels, absence of a time machine, etc.) can't access it. It's not satisfying, but sometimes it's the best we can do.

Another reason we're hesitant to claim absolute certainty about our findings is that the world is a pretty complex place. Out of necessity, we try to explain human behavior in complicated social systems in simple terms. We do this by trying to isolate variables and control for the rest of them. When we control, we invoke an assumption called ceteris paribus - a fancy Latin phrase meaning "all other things held constant." If I can hold constant factors like the mineral trade, rebel activity, and the state's presence, I can do a reasonably solid analysis of the role of land rights disputes in causing violence in the DRC and really get at just why and how that variable matters.

Problem is, in the real world, variables are never isolated. Some social scientists (me included) try to account for this issue by providing lots of context in our research, but at the end of the day, we have to figure out causal relationships, too. It's messy.

Then there's the problem of endogeneity. Endogeneity is a little hard to get your head around, but here's the simplest explanation I could come up with: the independent variable that you think is explaining a phenomenon (the dependent variable) isn't really an explanation at all, because it is itself determined within the system you're trying to explain. In other words, the independent variable isn't independent of the dependent variable in any meaningful way. There are fancy statistical ways to avoid making an endogeneity error, but it's still tricky. How do you know for sure that X caused Y, rather than X causing A, which then caused Y?
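A toy simulation can make the danger concrete. In this sketch (all variables and numbers invented purely for illustration), a hidden factor Z drives both X and Y; X has no causal effect on Y at all, yet a naive regression of Y on X finds a strong relationship, because X isn't determined independently of the system that produces Y:

```python
import random

random.seed(42)

def ols_slope(x, y):
    """Slope from a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Toy world: hidden factor Z drives both X and Y.
# X has NO causal effect on Y here, but shares Z's fingerprint.
z = [random.gauss(0, 1) for _ in range(10_000)]
x = [zi + random.gauss(0, 1) for zi in z]
y = [2 * zi + random.gauss(0, 1) for zi in z]

print(round(ols_slope(x, y), 2))  # close to 1.0, though the true effect is 0
```

The "fancy statistical ways" mentioned above (instrumental variables, natural experiments, and the like) exist precisely to break this kind of entanglement.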

I'm not sure that advocates think about uncertainty and complexity in quite the same way as academics do, and for good reason. While I'm writing for an audience that understands all these methodological and theoretical issues, advocates need a story that average readers can understand, which means they often need a simple, straightforward narrative of a situation. Do advocates claim uncertainty about the causes of events? What do you think?



how social scientists think: correlation is not causation

Whenever I teach students about the difference between causation and correlation, I try to have them do something ridiculous. I might have one student repeatedly flip the light switch while having another jump up and down while another sings "I'm a Little Teapot." Then I ask, what caused the light to go on and off?

Students generally roll their eyes as they answer, "flipping the switch," but the point is clear: just because events happen at the same time doesn't mean they're in any way related. They learn one of the key principles of all sciences: correlation does not imply causation.

Social scientists spend our time trying to figure out whether phenomena are causally related - that is, whether one event or occurrence or circumstance causes another event to happen. We want to explain how and why specific events are related to one another in hopes of being able to explain more generally why similar events are related to one another.

Problem is, it's a lot harder to determine causality in the real world than it is with a ridiculous example in the classroom. It's especially difficult when there are multiple causes for an event, as is the case with violence in the eastern Congo. It's impossible to ever be 100% sure we have correctly determined the cause of an event, but we can reach a reasonable degree of certainty and have developed a number of means by which we can (hopefully) avoid confusing correlation and causation.

We do this through a couple of mechanisms. One is to isolate variables. Variables are just another way of talking about causes (which we call "independent variables") and effects ("dependent variables"). Of course, most human behaviors and situations involve far more than just two variables, so we try to control for the others' effects. This is pretty easy using statistical analysis; a social scientist using that method will use math (and, these days, sophisticated software) to control for the effects of the other variables so that she can look only at the one she thinks matters. She can then run statistical tests to determine whether she can establish a reasonable degree of certainty that the cause she has identified is indeed producing the observed effect.

With qualitative methods, it's a lot harder to establish causality, because the goal of causal inference is to determine what the effect would have looked like had an event or circumstance not happened. We call this idea of what could have been the "counterfactual." But real life doesn't usually allow us to establish counterfactuals (although the world of randomized controlled trials is now opening up all kinds of possibilities in this regard). We can, however, look for real-life counterfactuals, or places in which natural controls are in place. For example, I have an observed effect in my research which suggests that ethnicity may be an important causal variable, but I'm not certain enough about that to publish it yet. However, I have some new data from a town that is ethnically homogeneous. It's my hope that the data from this town will function as a kind of natural control, helping me figure this out with a higher degree of certainty.

The distinction between causation and correlation - and the obsession with making sure the two are not confused - sets quality research apart from shoddy or sloppy research. It's incredibly frustrating to me to read a hastily put-together advocacy report or journalist's account that assumes correlation means causation, despite the lack of evidence for such a claim. I understand why it happens; advocates and journalists have to work quickly, and if they talk to people who don't understand the difference, how would they know otherwise? But it's incredibly frustrating to see these errors made, especially when they lead to bad policy decisions.

Advocates, what do you think? Do most researchers in your field do a good job of distinguishing between correlation and causation? How could we better work together to make sure that the causes we're identifying are actually the causes of various events?



how social scientists think: what your driver says isn't evidence

We all do it. We fly into a random capital in the developing world with only a couple of days to get what needs doing done, find our taxi driver, and, in a fit of jet lag and the need to get critical information quickly, ask, "So, what's going on with the election/president/other serious political issue?"

Taxi drivers can be a great source of information about traffic patterns, road quality, and even politics. But to a social scientist, the information any one random individual provides is not in and of itself evidence. No matter how much talk radio the driver listens to.

Why not? I've already written about the need to gather huge amounts of data and the necessity of using appropriate methods to gather and assess information. But how do you know that the data you've gathered is an accurate reflection of reality?

One way to be more certain that the data you've gathered will actually tell you something useful is to be deliberate and systematic about from whom you get information. Whenever a social scientist uses research methods that involve asking people questions (be that through interviews, surveys, or participant observation), it's very important to be sure that you are asking those questions of different kinds of people. Why? Because while it's impossible to talk to every single person who is part of or affected by the issue you're studying, you want to be certain that you're getting a reasonably accurate portrayal of what's really going on.

Just like doctors running medical trials for a new drug don't usually test that drug on only one gender, racial group, or age group, social scientists aim to get information on what we're studying from representatives of all demographic groups in society. We call the process of selecting subjects at random - so that everyone in the population has an equal chance of being chosen - "random sampling," and the group from which we eventually gather data is the "representative sample." Without getting too far into the nitty-gritty details of how this is done, the main thing is that the representative sample needs to look as much like the general population as possible. If the general population is 60% Ethnic Group A and 40% Ethnic Group B, the sample should reflect that, as it should reflect age, race, gender, education level, religious affiliation, and any number of other demographic characteristics.
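As a quick illustration (with a made-up population, not real demographic data), random selection is what makes a sample mirror the population without anyone hand-picking respondents:

```python
import random

random.seed(1)

# A made-up population of 10,000 people: 60% Ethnic Group A, 40% Group B.
population = ["A"] * 6000 + ["B"] * 4000

# Simple random sampling: every person has the same chance of being drawn.
sample = random.sample(population, 1000)

share_a = sample.count("A") / len(sample)
print(f"Group A share in sample: {share_a:.2f}")  # close to the true 0.60
```

Because every person has an equal chance of selection, the sample's roughly 60/40 split emerges on its own - which is exactly why relying on a driver's acquaintances (discussed below) breaks the logic.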

Not every study requires a completely random sample, of course. When I'm interviewing civil society leaders in the Congo about their organizations' activities, I don't need to get information from all the housewives in a village that's not connected to those organizations' activities. What I do need to do, however, is talk to as many civil society leaders as I can find. Remember, the goal is to get all possible information.

While this isn't always possible, an ideal social science study has what we call a "large n." Large n is a shorthand way of referring to a large number of cases, or subjects of study. Individuals can be cases, but so can organizations and events. For example, people who study the causes or likelihood of civil wars want to be sure that their answers apply across a wide range of cases, so they often compare what happened in hundreds of civil wars.

When you do a thorough search for data, you're almost inevitably going to get contradictory information. How do you know who's right? The method by which you gather that information helps. If you've done a truly thorough search, then you can be fairly confident that you've gathered a wide range of opinions. You can also compare answers and decide who is a more reliable source on a particular piece of information. But there's always uncertainty, and it's best to acknowledge that up front (more on this later in the week).

How do you know whether a source of information is reliable? You can't always know, but it's a good idea to seek out people who are known to be honest and reliable. Reliability of information is another reason it's extremely important to talk to as wide a range of sources as possible - and to not rely on a driver or fixer to find all your subjects. Like most of us, when asked who would know about a subject, drivers, research assistants, and enumerators will typically refer me to the people they know. In Africa, I've found that these people are almost always family members, neighbors, members of their religious group, and members of their ethnic group or community. And there's nothing wrong with that; just because someone is related to your driver doesn't mean she won't be a helpful source.

But it's not enough. Your driver and his acquaintances (or any other individual) do not constitute a random sample. Particularly if you're studying a place in which the politics and dynamics of ethnicity matter, it's extremely important to make an effort to interview or survey everyone.

But everyone doesn't mean everyone. Like all academics, social scientists are bound by laws and ethical guidelines regarding research involving human subjects. We have to have all our research - including the questions we ask - approved by ethics review boards at our institutions. These boards - usually known as IRBs, or Institutional Review Boards - are there to protect vulnerable individuals from being exploited, endangered, or otherwise put in harm's way. IRBs typically require us to get what is known as "informed consent" from subjects before we ask them any questions. Informed consent procedures let subjects know what the research is about, why it is being undertaken, and the specific steps that will be used to protect their identities. IRBs also limit the populations we can research depending on the goal and circumstances of the project.

In my research, for example, I can't interview children, identify any of my interview subjects by name, or interview anyone without getting their informed consent. This is to protect the individuals who are kind enough to provide me with information. Especially when dealing with people who might not fully understand how research is disseminated in the modern world (how would you if you've never used the internet?), it's extra important to ensure that they know what is going on. Is it a hassle to get informed consent from every single research subject? Yes. But is it absolutely necessary in order to protect people from being harmed? Definitely.

So there you have it: from whom you gather information and how matters. How do social scientists' procedures here differ from those of advocates? Do advocates use informed consent procedures? Should they?



how social scientists think: anecdotes aren't evidence

Every time I write about the lack of systematic evidence that mining is the root cause of violence in the DRC, I get a long list of links to stories, articles, and advocacy materials that purport to show just that. But the links to which passionate readers direct me almost never prove their point. Why? It has to do with the nature of evidence and the way it's gathered. Social scientists are a diverse bunch, but if there's one thing we can agree on, it's this: anecdotes aren't evidence.

The most important thing to understand about social scientists is that we are just as obsessed with how information is obtained and analyzed as we are with what that data tells us. We call the process of obtaining and analyzing data our research method.

Research methods are important because they determine the quality, kind, and extensiveness of the data we get. The method by which we analyze that data also determines whether we are actually answering the research question we set out to answer.

It's extremely important that the method match the research question. While we'll fight over this to high heaven on boring panels at obscure conferences, deep down, most social scientists agree that certain methods are better for certain kinds of questions. We divide methods into several types; some are numbers-based (those are called quantitative), others are based on descriptive information (qualitative methods). There are also ways of using logical reasoning and advanced mathematics to explain or predict the decisions leaders or states might take - that falls into a broad category called "game theory."

Before your eyes glaze over too much, remember that you don't need to understand the different types of methods. All you really need to understand is that the method used to obtain data has to allow the researcher to answer the original research question. If I'm studying how young women perceive militias in the DRC in relation to their personal religious and political beliefs, for example, my method needs to allow me to answer that question. In this case, I'd probably use interviews, focus groups, and possibly a survey - methods that would provide the kind of data I need to explain that relationship. If, however, I want to know whether conflicts in the DRC have an effect on school enrollment, attendance, and completion figures, I would use a very different method involving lots of numerical data and statistical analysis. Interviewing families might give me some useful background on the subject, but it won't really provide the data necessary to get an answer. The method has to match the question.

It's also important to get all the information about a subject you possibly can. This means that not only do you have to read everything that's ever been written on the topic, but also that you have to make a good-faith effort to gather all available data. That means finding and evaluating everything you can possibly get. If you're thinking, "That sounds really inefficient," you're right. It takes forever. It took me four years to gather the data for my dissertation. Now that it's a book project, the data-gathering continues, well into year seven, with no end in sight.

Why do you need so much information? Because having a lot of information is the only way to cover all the bases and tease out every possible explanation so that you can be sure yours is the right one. The goal of social science is to develop theories (more on that later this week). For a theory to be useful, it needs to give us generalizable information - that is, information that can be applied in more than one situation. If information can't be applied at a general level, across contexts, space, and time, it's of limited utility. We don't just want to know how to end the use of rape as a weapon of war in the DRC; we want to know how to end the use of rape as a weapon of war everywhere. More importantly, we want to know which strategies don't end the use of rape as a weapon of war, as well as which factors don't explain why rape stopped being used as a weapon of war in a particular situation. So we need to find generalizable information, which means that we have to eliminate every possible alternative explanation, which means that we need to analyze as much information as we can.

How does this make us different from advocates, other than that we have the luxury of time and self-set deadlines? It means that we can't rely on anecdotal evidence. Why not? Because anecdotes aren't good predictors of general phenomena - that is, we can't say that because something happened once, we know the cause of every similar event. Just because one woman was raped by LRA soldiers does not mean that all rapes in the DRC are committed by the LRA.

Now, I can hear some advocates protesting, "But we do gather systematic evidence!" And I know that's true. In some cases. However, far too often, the advocacy materials that end up in the general public's view rely far too much on anecdotes and far too little on systematically-gathered, methodically-analyzed evidence.

Why does this happen? Part of it is time constraints. Most advocates don't have the luxury of spending two to ten years on a single project. Many have to do research on quick, in-and-out trips to the field that last at most two or three weeks. It's impossible to gather solid evidence in that time. But it's easy to gather heartbreaking, soul-stirring anecdotes that will move campaigners to action. More importantly, it's necessary for a good advocate to do so. Would you give money and time to a cause that didn't make you feel something, or didn't make you believe you could make a difference?

That's part one. Tomorrow we'll talk about why from whom evidence is gathered matters. For now, do you perceive differences in the ways that academic social scientists and advocates gather information? What are the strengths and weaknesses of each approach? I'm particularly interested in hearing from advocacy people on this.



how social scientists think week

After several days of posts on a pie-in-the-sky idea about military intervention against the LRA (See here, here, and here for arguments as to why it's a bad idea that won't work and here for a defense of it) and overstated claims about impending genocide in South Sudan from an actor and a celebrity-courting advocate, I'm reminded of a fundamental truth: advocates and academics think differently.

Not that any of this is really news, but I've been thinking a lot lately about the ways that social scientists consider evidence, facts, and forecasting in light of the way that our jobs as researchers are different from those whose job it is to persuade others to take action on an issue. Academic researchers are trained to think in particular ways. In the social sciences, we are trained to take the most messy of subjects - human behavior - and think about it in systematic ways that explain causal relationships between phenomena. Advocates, though, are trained to stir emotions and to draw personal connections between international events and Western students, consumers, and families.

I think this difference in training and purpose accounts for a lot of the disconnect between academics and advocates on a number of policy questions (e.g., conflict minerals in the DRC). So I thought it might be useful to spend a few days this week explaining how those of us in the social sciences think. Stay tuned for the first post in the series tomorrow. Here's hoping it's not mind-bogglingly boring.



end the Obiang Prize

It is past time for UNESCO to end this altogether. The chorus of voices grows louder:
Renowned African elites have written to UNESCO asking it to cancel a prize named for the president of Equatorial Guinea, Teodoro Obiang Nguema Mbasogo, who took over the country in a coup and has shown little tolerance for opposition during the three decades of his rule. Although the oil-rich central African country’s GDP per capita is on par with Spain’s and Italy’s, it has some of the worst socioeconomic indicators in the world.

UNESCO had chosen to award President Obiang the Prize for Research in the Life Sciences, but 125 African scholars and human rights defenders have opposed UNESCO’s choice.

Among the signatories to the letter of protest were former Archbishop Desmond Tutu, internationally acclaimed author Chinua Achebe, and Nobel laureate Wole Soyinka.


say it ain't so

This is not a good idea:
...there is no better case for the humanitarian use of force than the urgent need to arrest Joseph Kony, the ruthless leader of the Lord's Resistance Army (LRA), and protect the civilians who are his prey. And far from requiring a non-consensual intervention, Kony's apprehension would be welcomed by the governments concerned.

The LRA began as a rebel movement in northern Uganda, but it now terrorizes the civilian population of northern Democratic Republic of the Congo as well as southern Sudan and the Central African Republic. Its cadre often descends on a remote village, slaughters every adult in sight, and then kidnaps the children, some shockingly young -- the boys to become soldiers slinging AK-47s, the girls to serve as "bush wives." Over more than two decades, many thousands have fallen victim to these roving mass murderers.

The International Criminal Court has issued arrest warrants for Kony and other LRA commanders, charging them with war crimes and crimes against humanity, but the court depends on governments to make arrests.

So far Uganda has done the most to pursue the LRA, but ineffectively. The LRA is not large -- an estimated 200 to 250 seasoned Ugandan combatants, plus at least several hundred abductees -- but as Ugandan President Yoweri Museveni recently told me, Uganda lacks the special forces, expert intelligence, and rapid-deployment capacity needed to stamp out this enemy.

In May, Obama signed a bill committing the United States to help arrest Kony and his commanders and protect the affected population. Now it is high time to act. Arresting Kony would reaffirm that mass murder cannot be committed with impunity. And it would show that, despite the difficulties in Iraq and Afghanistan, the humanitarian use of force remains a live option at the Obama White House.
Oh, Kenneth Roth, Executive Director of Human Rights Watch. Really? Sending some kind of US force into the weakest corners of three extremely weak states, plus one state that could have dealt with this long ago had its leadership really wanted to, into territory the troops don't know, where they don't speak the local languages, to track down an enemy nobody has yet been able to nab, with limited resources? Is this what you're advocating? Really?

I have a ton of respect for Human Rights Watch and the incredible work they do, especially in Africa's Great Lakes region. While I don't agree that it's the worst idea on the internet from Tuesday, this recommendation is off base. Aside from the significant logistical and diplomatic quandaries such operations would pose (How, for example, does Roth think Khartoum would react to an American military presence on south Sudanese soil? Would the French agree to the presence of an American force in the CAR?), fighting in the dense forests in which the LRA hides without knowing the territory, the languages, or the local cultures means that troops undertaking such an operation would be at a significant tactical disadvantage.

Of course all reasonable people agree that Kony needs to be arrested and prosecuted for the unbelievable crimes for which he is allegedly responsible. But if it were that easy, it would've been done already. Say, by the French troops who are already in the Central African Republic. Though mostly engaged in training operations these days, they at least theoretically have enough force strength to get the job done.

Part of the reason Kony has been able to evade capture for so long is the way he positions his fighters around his camps and the early-warning systems he uses to learn of impending attacks. You can't always track the LRA's movements with satellites and open-source intelligence; they're smart enough to stay under tree cover most of the time, and there aren't many mobile phone networks in these areas through which informants can phone in sightings. Kony may be crazy, but he's not an idiot - he's got a system. This is not an operation that can be undertaken quickly with a few helicopters and some RPGs.

While the humanitarian use of force may be a good idea in theory, as we've seen before, it doesn't often work out as well as planned. Especially in unfamiliar territory. Tread lightly on this one, policy makers. It's going to take far more than a quick in-and-out sweep to take down Kony.


sustainable agriculture in South Kivu

TED Fellow Alex Petroff of Working Villages International discusses a sustainable agriculture project in Ruzizi, South Kivu. In three minutes:

Other than the last part (Yachting? Seriously?), I found this an interesting model of the ways that best practices from around the world can be integrated into local contexts while using local expertise. What do you think?



The AP's Michelle Faul does a wonderful job explaining the DRC Mapping Report, and, more importantly, covering the effects Rwanda's actions in Zaire had on the local population:
Matata Ihigihugo has relatives in three mass graves: her husband and two sons in the one reserved for males, a sister in the women's grave, and her 8-year-old daughter in the one where children's small bodies were buried.

"They killed all my people. I have no life left," said Ihigihugo, who thinks she is 40 but looks many years older.

She objected to being asked to name her massacred family. "Why do you ask me to call out the names of those who are dead?" she demanded. "There can be no peace for me until they are properly buried."

...Uncovering the graves, proving how people were killed and even perhaps identifying them could bring closure for people like Ihigihugo, one of the widows of Musekera.

"There can be no rest for people buried like that," she said of the mass graves. "Giving a proper burial to my family also would put my heart at rest."
The decisions made in foreign capitals - about whether to prosecute these crimes, where to try potential accused war criminals, and what this report means for regional stability - really, really matter. As former investigator Reed Brody told Faul, Rwanda's actions in some ways established the culture of impunity that surrounds crime in the eastern DRC today:
"The fact that these killings of tens of thousands, if not more, went utterly unpunished, the fact that there was clearly not the political will to identify the authors of these massacres and to bring them to justice, has facilitated the cycle of violence," he said.
Faul also covers Brody's frustrations with trying to get information from the American government. Suffice it to say that the Clinton Administration's Africa policy team - in particular Susan Rice, who was at the time Assistant Secretary of State for African Affairs - really messed this up the first time around. They decided early on to more-or-less unquestioningly support Kagame, Museveni, and Kabila. The decisions they made in those critical months in 1996-97 allowed Rwanda to get away with slaughtering Hutus in Zaire.

Many of those same policy makers are now in the Obama administration. They've got a rare, second chance to get it right. Do they have the courage to cut through the diplomatic niceties and stand up for the rights of all the people of the Great Lakes? Will they finally hold Kagame to account for the events for which he bears full responsibility? I don't know. But I do know that people like Mrs. Ihigihugo deserve nothing less.


this & that


budgets, rape, & the UN

File this under "peacekeeping operations don't pay for themselves:"
Budget cuts mean U.N. peacekeepers in Democratic Republic of the Congo do not have enough helicopters to operate effectively in DRC's unstable east, a U.N. official said on Wednesday.

Under pressure from Congolese President Joseph Kabila, the U.N. Security Council agreed in May to allow a phased withdrawal of the U.N.'s biggest peacekeeping force (MONUSCO) and a shifting of its focus to reconstruction, training and other aid.

The move triggered a $73 million cut to MONUSCO's roughly $1.3 billion budget, of which $61 million affects the type and number of aircraft available to the force, Paul Buades, head of MONUSCO's logistic support base in Entebbe, told Reuters.

Buades said the cutbacks will make it harder to carry out operations such as the capture on Tuesday of a rebel commander accused of orchestrating a series of mass rapes in Congo.

"We can't support the forces in more robust operations like this," Buades said. "The jungle is the jungle," he said, referring to the country's sprawling eastern provinces, which are roughly the size of France.
With all the outrage about the Walikale mass rape situation, it's hard to see how cutting MONUSCO's force size and budget is even remotely justifiable.

Many people who've not worked in the eastern DRC have a hard time understanding what conditions there are like. Unless you've visited an extremely fragile state, it's really hard to get your head around how incredibly difficult even the most basic logistics can be there. For example, a lot of people were really upset when they learned that MONUSCO has a base only 12 kilometers from where the Walikale rapes happened. But covering 12 kilometers isn't nearly as easy as it sounds. On a good day, that might take less than an hour. On a bad, rainy one, it could take two or three times as long. Many parts of the DRC aren't even accessible by road; particularly in Walikale, it's not uncommon for human rights researchers to have to fly to an airstrip, drive to the end of the road, and hike for several days to reach victims and record stories. Inaccessible doesn't even begin to describe it.

Then there are the communication issues. As you probably know by now, UN forces actually traveled through the town where the mass rapes took place while they were going on. They didn't stop to intervene because they weren't told that anything was going on.

I have no basis for knowing whether that's true or not, but I don't doubt that the soldiers patrolling the area had no idea what was going on. An issue that doesn't get talked about much is the fact that most of the peacekeeping troops working in the eastern DRC can barely communicate with the people they're trying to protect. Why? Because they come from non-Francophone and non-Swahiliphone states. The most significant battalions come from, among other places, India, Bangladesh, South Africa, Nepal, and Morocco. Like most troops in most countries, the soldiers are not highly educated. They speak their national or local language and not much else. Only a few - usually officers - speak English.

This leads to a lot of problems, to put it mildly. While the UN certainly has people who can translate between English and French, when a group is on patrol in a place like rural Walikale, the odds of finding villagers who speak either are pretty slim. As are the odds of having a fully qualified translator on the patrol. I have no idea what happened in Walikale in July, but whatever was or was not reported to those troops, it's hard to see how they could have understood what was going on.

That's not to excuse MONUSCO's failure to protect those civilians. The UN has admitted as much. Clearly, the lack of communication about the situation led to even more suffering on the part of the population. Jason Stearns has further analysis about what went wrong and how similar situations could be prevented in the future.

But it really comes down to this: an undersized force made up almost entirely of under-equipped soldiers from developing countries can't do everything. The UN in Congo is burdened with an almost impossible task. If the budget and size of the force continue to be cut, we're likely to see less civilian protection, not more.


time to vote

Via Roving Bandit, you need to watch this:

peace in the DRC?

Insight on Conflict interviews Severine Autesserre about peacebuilding in the Congo here.

Remember, you can buy Autesserre's book directly from the publisher with a 20% discount ($23.19 plus tax) by using code E10CONGO until October 31, 2010.


in which the law of unintended consequences takes its nasty course

You may remember that a few weeks ago, Congolese President Joseph Kabila suspended all mining in the country's eastern North Kivu, South Kivu, and Maniema provinces. This ban was ostensibly undertaken to curtail the use of mineral exploitation as a means of financing the region's various and sundry rebel movements.

But then it turned out that it actually wasn't a ban on all mining; rather, only the activities of artisanal miners were put to a stop.

Predictably, this didn't go over well with the 50,000 or so people who suddenly found themselves out of a job. That's because mining in the eastern Congo is not simply an activity in which warlords engage; it's part of the regional economy. Without mining, people don't eat. And that includes people who are in no way directly involved in the mining industry. As one Western donor told Reuters, "It's already damaging the economy in the east so badly we are considering humanitarian aid."

Now Martin Kabwelulu, the Minister of Mines, says that he will allow mining to resume in the provinces around October 15. He says this can be done because:
"...we have stopped some of the illegal mining and put in place measures so that when we re-start we want to be sure that the production is coming from one known source and going to the next stage. We had to improve our control because there was total chaos."
And the DRC government - which, let's remember, has been incapable of really controlling the east for the last 16 years - managed to do this in less than a month?

Clearly the question we should be asking is not whether the mining ban will stop all the violence in the east (it hasn't), but rather who is now benefiting from the trade. The source of most violence and human rights violations in the Kivus over the past few years has not been rebel groups, but the national army itself. Since Kabila presumably couldn't take control of the mineral trade without the army, odds are that it's now getting a cut of the profits.

Most observers also believe that the detente between Kabila and Kagame (which ended the fighting in January 2009, and which is why Nkunda is under house arrest outside Kigali rather than marauding in North Kivu) almost certainly involved some guarantee of Rwandan access to Congolese minerals, since Rwanda lacks significant mineral deposits of its own and needs the revenue. I seriously doubt that access will be permanently cut off any time soon, especially given the fairly credible reports that Rwandan troops are currently in North Kivu to support (read: force) the redeployment of CNDP troops who've been integrated into the FARDC.

Meanwhile, many civil society and community leaders in the region are distressed over the loss of livelihoods for thousands and thousands of Congolese who were already on the brink of starvation. As a general concept, just about everyone agrees that allowing the population to benefit from the mineral trade would be a good thing for the region. But the actual steps taken toward that goal, especially when it's not at all clear that these latest measures will benefit the population, have been quite harmful to ordinary people. Without institutions in place to ensure the rule of law, the fair distribution of resources and resource revenues, and basic security, as civil society leaders in Walikale pointed out, this move was premature.

This series of events also underscores the need for advocates not to oversimplify narratives, which is exactly what happened with the story of conflict minerals and the consumer's capacity to change the DRC. The legislation passed in the US this year signaled to the DRC government that it had to do something about the situation. But the US government offered no serious institution-building support; it passed an ill-conceived law without addressing any of the underlying issues that actually drive fighting in the region. That failure, coupled with the DRC's entrenched corruption in almost every sector, means the government's actions are likely only to shift who exploits the minerals on the backs of the population. They certainly won't bring about peace or help the DRC's poor.

In the short term, this decision created very serious suffering for the people who depended on their work in the mines. I have no doubt that children have become malnourished - and may die - as a direct consequence of the ban. You cannot cut off an entire sector of a weak economy overnight without severe consequences, and the most vulnerable are hit hardest by shocks like this one. That's why it's so terribly important for advocates, policy makers, and politicians to get it right.


how the other half weds

Congolese politician/2006 presidential candidate/once and possible future rebel leader Mbusa Nyamwisi tied the knot in Kinshasa last week. Beni-Lubero Online has all the pictures of the civil and religious ceremonies (and the reception buffet) you could want. And, believe me, you don't want to miss this. There's a seven-tier cake, at least two wedding gowns, and plenty of awkward to go around.

Now if only we knew what else Mbusa Nyamwisi is up to these days. He's a member of Kabila's coalition, but that may or may not hold through to next year's election season. And then there's all the nasty business with the ADF-NALU, which used to be close to Mbusa Nyamwisi's old rebel force. There were rumors for a while that his brother was involved in the July fight with the FARDC. Mbusa Nyamwisi is still very, very popular in the far north of North Kivu; it's an open question whether he'll run as an opposition candidate in next year's elections.

(Photo: Beni-Lubero Online)


the mapping report release

Today marks the long-awaited release of the official version of the United Nations High Commissioner for Human Rights' Democratic Republic of Congo Mapping Report on human rights abuses committed in the country between 1993 and 2003. Most of the fireworks over this have already passed, given the leak of a draft version of the report over a month ago, but the significance of today cannot be overstated.

According to the Associated Press, the final version of the report is largely unchanged, although apparently the genocide language has been tempered to something more cautious. That it remained mostly unchanged is huge, not only because the draft version accused the Rwandan government of perpetrating large-scale abuses against Hutus in Zaire/Congo during the first DRC war, but also because it implicates a number of the DRC's neighbors in very serious human rights abuses.

A few key points as the discussion takes off today:
  • Rwanda has provided comments on the final draft, including seven objections to the report's claims about its role in the DRC. These include disagreements over historical context, the report's methodology and evidentiary standards, its use of anonymous sources, contradictory reports regarding the presence of Hutu fighters in the Zaire refugee camps, and the argument that the genocide charge is inconsistent with Rwanda's repatriation of Hutus into Rwanda.
  • I would note that none of the above are valid claims. That the Rwandan government did repatriate Hutus does not mean they did not also slaughter Hutus in Zaire. Additionally, the fact that Hutu militants were using the refugee camps as a base to attack Rwanda (a claim no one disputes) did not give the Rwandan army the right to hunt down and slaughter those individuals, especially when they clearly could have arrested and put many of those genocidaires on trial. The methodology and standards for admission of evidence were solid, and the use of anonymous sources is standard (and the only ethical way to do it) in cases involving human rights abuses.
  • Uganda is angry over the report as well. All countries implicated in the mapping report were given the chance to respond, and it's important to remember that almost every group involved in the DRC wars is responsible for human rights violations at one point or another. Taking a page from Rwanda's playbook, Ugandan government officials have attacked the report's methodology and sources.
  • Not surprisingly, Burundi has also objected to its inclusion in the report.
  • To reiterate, nothing in the mapping report was really unknown. There were witnesses to almost all of the violence at the time and humanitarian actors in the region were acutely aware of what was going on, but previous efforts to provide a comprehensive accounting of human rights abuses in the DRC wars were stymied by various parties over the years. This happened most notably with the never-released 1994 Gersony Report - see Howard French and Jeffrey Gettleman's piece on it here. UN bureaucratic wrangling aside, the central point stands: none of this is new.
  • (By the way, I'm with Wronging Rights: Howard French deserves an award for not writing, "I told you so" in his current work on the story he covered as it happened. French reported most of these events from the field at the time and is at long last getting some validation for his excellent work. Give him a Pulitzer already.)
  • Claims like those of Raymond Bonner, writing at The Atlantic a couple of weeks ago, that suggest that the abuses were the work of rogue soldiers defy all the evidence. Bonner implies that all that is really needed is for the involved soldiers to be punished. But Rwanda's activities in Zaire/Congo were very deliberate, and were clearly conceived and coordinated at the very highest levels. This is not My Lai.
  • At some point, Rwanda's most vocal defenders, including Bill Clinton, Tony Blair, Philip Gourevitch, and Rick Warren, are going to have to deal with the reality reflected in the report. There are ways to help Rwanda's people that don't involve supporting the antidemocratic, authoritarian RPF regime. And if anyone is creative enough to figure those out, it's these guys.
  • Jason Stearns has analysis with an eye to possible legal scenarios here. Keep in mind that outcomes on this question will be contingent on what actors throughout the region have to lose from a tribunal. Since many, many, many, many, many, many politicians throughout the Great Lakes region would be in trouble were witnesses to take to the stand, I'm pretty cynical that any tribunal will really be able to deal with the abuses outlined in the mapping report. But stranger things have happened.
  • Wronging Rights covers the legal issues surrounding this in great detail here.
  • It appears that Rwanda will not withdraw its peacekeepers from Darfur as threatened. I'd be interested to know what role Clinton played in mediating between Ban and Kagame on this issue.
  • Acknowledging that the RPF is responsible for horrible crimes in the Congo is not the same as wanting Rwanda to be destroyed. I feel like any reasonable person would agree with that, but some commenters on this blog will claim otherwise, so I feel a need to get that out of the way. No one wants Rwanda to fall apart again. No one.
I'll have more on the final report and reactions to it after its release. It's very likely that the New Times will run an hysterical editorial dictated to their staff by some high-level government official in the next 24 hours or so; I'll keep an eye out.

(BTW, you should totally read today's New Times editorial, which is a paean to the RPF, which invaded Rwanda from Uganda twenty years ago today. Oh, the irony.)

UPDATE: The full report, comments from Angola, Burundi, Rwanda, and Uganda, and the press releases are available here.

UPDATE #2: Human Rights Watch statement on the mapping report is here. "Human Rights Watch supports the establishment of a mixed chamber, with jurisdiction over past and current war crimes and crimes against humanity committed in Congo."

UPDATE #3: Reed Brody, a member of the 1997 UN team that investigated these atrocities and whose report was suppressed, weighs in. Money quote: "By seeking to quash publication of the report, the Rwandan government is raising further questions about what it may be trying to hide."

UPDATE #4: Jason Stearns weighs in here. Official DRC response here.

UPDATE #5: A great roundup and personal reflection from @sonjasugira is here. Amnesty International's statement is here.

UPDATE #6: Colum Lynch parses the differences between the leaked draft and the final report here. MSF France has a statement (in French) on the report here. This is important as MSF is named several times in the report as a source; the statement as I read it is aimed at clarifying the fine line MSF (like many other organizations) walks in remaining neutral while drawing attention to serious human rights abuses.

UPDATE #7: Rwanda's New Times government daily weighs in with a hissy fit of an editorial that, while not exactly factual, is mighty entertaining. A choice quote: "There are several reports that were sponsored by unfriendly countries that always falsely accused the Government of Rwanda of exploiting DRC minerals. Rwanda is rich in mineral deposits and there is no need of illegal exploitation of minerals in other countries." Sure. Except that it is well established that Rwanda is not rich in mineral deposits, but that's just a detail, right guys?