From a recent post by Jay Ulfelder:
“When you hear the term ‘conflict of interest,’ you probably think of corporations paying for studies that advance their commercial interests. I know I do. It’s easy to see why studies on the effectiveness of new drug therapies or the link between pollution and cancer, for example, warrant closer scrutiny when they’re funded by firms with profits riding on the results. You don’t have to be a misanthrope to believe that the profit motive might have shaped the analysis, and there are enough examples of outright fraud to make skepticism the prudent default setting.
That’s not the only conflict that can arise, though. What I think many scholars working in comparative politics don’t appreciate as much as we should is that it’s also possible for political values and advocacy to play a similar role, and to similar effect. When a researcher’s work deals with issues on which he or she has strong moral beliefs, that confluence can hinder his or her ability to identify and fairly weigh relevant evidence. Confirmation bias is hard to overcome, especially in studies that rely entirely on an author’s interpretation, as many qualitative studies do. The problem is even more intense if the researcher’s personal life is interwoven with her work. Certain conclusions may be more palatable or appealing to people with certain values, and it can be professionally and personally damaging for researchers to report findings that suggest the work their friends and colleagues are doing may not be all that useful, or may even be counterproductive.”
I think this is a larger problem than people realize. We all carry certain moral and conceptual beliefs that help us make sense of the world, and it is very difficult to separate those beliefs from our research and from our assessments of other people’s research. In that sense, I think we need to be more upfront about these realities and about the beliefs that each of us carries.
In my primary research field, Aboriginal-settler relations in Canada, this is a particularly tricky problem, given how thoroughly Indigenous peoples have been disempowered and impoverished by the Crown. There’s a real temptation to use one’s scholarship and peer-review role as advocacy. I know I certainly struggle with this issue.
As an author, I think the solution is to be upfront about your beliefs. Recently, I’ve been reading a bunch of books on Indigenous methodology, and one of the many things they do right is to announce early on who the authors are, where they come from, and what kind of perspective they bring to the table. Rather than pretending to be “objective”, which, quite frankly, is an impossibility in my view, we as authors should be upfront about our beliefs so that readers know where we are coming from. And reviewers should assess manuscripts by respecting those beliefs rather than rejecting ideas outright because they don’t gel with their own.
Some of my work, for instance, uses rational choice to analyze treaty and devolution negotiations. During the peer review process, I’ve encountered multiple reviewers who rejected my research outright because they don’t like rational choice. Rarely do they say why rational choice is inappropriate to the case at hand or how the evidence fails to support the argument.
As a peer reviewer, I try to approach new research by accepting the theoretical choices of the author as a given (at least at first). So if an author decides to use political culture, for instance, a concept I’m highly skeptical of, I initially accept that choice and ask: a) why is this concept better than others for explaining the phenomenon at hand? b) how well does the author sketch out, deploy, and defend the concept and argument with evidence?
So what are my beliefs? Quite frankly, I’m more confused and uncertain about the world than anything else! My scholarship has been characterized as right wing, left wing, moderate, and libertarian, all at the same time, by different people. I hope that reflects my commitment to remaining open to the very real possibility that my past and present views about the world are wrong (or maybe I’m just engaging in Bayesian updating!).