Victor Asal, Tiffiany Howard, and Stephen Shellman
In terms of methods, researchers working on nationalism, ethnicity, and migration have used everything from broad historical narratives to automated coding and event data analysis. Traditionally, narratives were the dominant methodological approach in these areas. The narrative approach allowed for the explication of groundbreaking theoretical arguments that generated testable hypotheses, for the deep inspection of particular regions of the world or particular issues with rich detail and attention to process, and for the investigation of a small number of cases. In addition, the use of formal theory to explore issues related to nationalism, ethnicity, and migration also has a long tradition. Formal theory allows for the construction of concise decision-making models that force the researcher to be explicit about key assumptions regarding preferences and the political structure involved. The formal approach has encouraged greater specificity in the arguments of scholars of nationalism, ethnicity, and immigration and has generated important theoretical insights. Finally, the most rapidly expanding approach to the study of nationalism, ethnicity, and immigration over the past two decades has been statistical analysis. Statistical analyses offer the advantage of placing confidence intervals around the causal inferences one makes and of controlling more formally for a variety of competing factors. As statistical software and training have become more widespread, the use of statistics has grown substantially.
Collecting and examining datasets on ethnicity and religion involves translating and codifying real-world phenomena, such as actions taken by governments and other groups, into data that can be analyzed with social science statistical techniques. This methodology is intended for phenomena that in their original form are not readily accessible to statistical analysis, i.e. “softer” phenomena and events such as government policies and conflict behavior. It is thus not necessary for measures like GDP or government military spending; rather, it applies to behavior by organizations or groups of individuals that is assessed by a coder who translates that behavior into data. Aggregate data collected by this methodology should have three qualities. First, they must be reproducible. Second, they must be transparent, in that all aspects of the data collection process and its products are clear and understandable to other researchers, to the extent that the process could, in theory, be replicated. Third, they must measure what they intend to measure in a clear, accurate, and precise manner. A project that accomplishes all of this must be conceptualized properly from the beginning, including the decision on which unit of analysis to use and which cases to include and exclude. It must have appropriate sources and a tight variable design. Finally, the data must be collected in a systematic, transparent, and reproducible manner based on those sources.
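The coding process described above can be sketched in miniature. The snippet below is an illustrative example only, not any project's actual coding scheme: the codebook categories, severity codes, and events are all hypothetical. It shows the two elements the passage emphasizes, a written codebook that translates observed behavior into numeric codes, and a reproducibility check in which two coders independently code the same cases and their agreement is measured.

```python
# Hypothetical codebook: maps a category of government action toward a
# minority group to an ordinal severity code. Real projects specify these
# definitions in a written codebook so other researchers can replicate them.
CODEBOOK = {
    "no restriction": 0,
    "monitoring of minority organizations": 1,
    "ban on minority political party": 2,
    "mass arrests of minority activists": 3,
}

def code_event(category: str) -> int:
    """Translate a categorized event into its numeric code."""
    return CODEBOOK[category]

def percent_agreement(coder_a: list[int], coder_b: list[int]) -> float:
    """Simple inter-coder reliability: share of cases coded identically."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Two coders independently code the same five hypothetical cases.
coder_a = [code_event(c) for c in [
    "no restriction",
    "monitoring of minority organizations",
    "ban on minority political party",
    "no restriction",
    "mass arrests of minority activists",
]]
coder_b = [0, 1, 2, 1, 3]  # the second coder disagreed on the fourth case

print(percent_agreement(coder_a, coder_b))  # prints 0.8 (4 of 5 cases match)
```

Percent agreement is the simplest reliability statistic; projects concerned with chance agreement typically report a chance-corrected measure instead, but the underlying logic, independent coders applying one transparent codebook, is the same.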