Accountable Decision-Making

Analysing and Interpreting Data

In this section, we explore how every stage of data analysis involves decision-making, highlighting the importance of transparency and accountability in shaping research outcomes.

Introduction

Every research discipline works with some form of data that must be analysed to draw conclusions. Data can take many forms, including sound recordings, text, interviews, measurements, drawings, simulations, and symbols. It serves as the crucial link between our research question and its answer. However, every step of working with data involves decision-making, and these choices shape the outcomes of research in significant ways.

Regardless of its form, data is never truly ‘raw’ [1].

Researchers make choices at every stage—whether in collecting, cleaning, labelling, aggregating, or filtering data—to turn it into something interpretable. These decisions introduce researcher values, assumptions, and biases into the dataset, which can influence the conclusions drawn. Some examples of how data analysis decisions impact research are introduced below.

Data Exclusion Decisions

Bias can be introduced when data is excluded because it doesn’t conform to the researcher’s expectations, which may themselves rest on false assumptions about the subject. For example, researchers dismissed data from Black participants in fear-conditioning experiments as ‘unusable’ because it did not match their expectations. Read Addressing Racial and Phenotypic Bias in Human Neuroscience Methods by Webb et al. for more detail.

Data Labelling Choices

Bias can be introduced when researchers or annotators categorise data. The wording of labelling instructions and the annotator’s own cognitive biases shape how data is classified.
Read Don’t Blame the Annotator: Bias Already Starts in the Annotation Instructions by Parmar et al. and Annotators with Attitudes: How Annotator Beliefs And Identities Bias Toxic Language Detection by Sap et al.

Data Aggregation Decisions

Foundational research on IQ differences between countries used inappropriate aggregation methods, combining studies with different sample sizes and participant ages, leading to misleading conclusions.
Read ‘National IQ’ datasets do not provide accurate, unbiased or comparable measures of cognitive ability worldwide by Rebecca Sear.

Statistical Analysis Decisions

The choice of statistical methods, models, and interpretations can lead to vastly different outcomes.
For example, Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results by Silberzahn et al. demonstrates how different analytical approaches applied to the same dataset led to different results.

Earlier in the toolkit, we emphasised the importance of representative and inclusive study design—these principles should also extend to data analysis. Transparent documentation of decisions about data—such as when it was collected, who collected it, and what it represents—ensures that others working with the data or interpreting the results can do so with full awareness of its context.

Writing a data biography, a structured account of a dataset’s origins and context (a form of metadata), is one framework for ensuring transparency in decision-making.
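As an illustrative sketch, a data biography can be kept as simple structured metadata recording who collected the data, when, what it represents, and what decisions were made along the way. The field names and values below are hypothetical examples, not a standard schema:

```python
# Hypothetical sketch of a data biography as structured metadata.
# Field names and values are illustrative, not a prescribed schema.
data_biography = {
    "dataset": "interview_transcripts_2024",  # hypothetical dataset name
    "collected_by": "project research team (two interviewers)",
    "collected_when": "March to June 2024",
    "represents": "semi-structured interviews with volunteer participants",
    "exclusions": "incomplete recordings removed; rationale documented",
    "known_limitations": "participants self-selected; not a representative sample",
}

# Print the biography so it can be reviewed alongside the analysis.
for field, value in data_biography.items():
    print(f"{field}: {value}")
```

Keeping the biography in a machine-readable form like this makes it easy to version alongside the data and share with anyone who later works with it.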

By recognising that every step of data analysis involves human decisions, researchers can take responsibility for ensuring those decisions are transparent, justified, and accountable. Thoughtful and well-documented decision-making leads to more reliable, ethical, and inclusive research.

Practical Steps and Tools

Write a data biography for your own data.

Clearly document your data analysis and interpretation process, including the rationale behind your decisions.

Be reflexive about your analysis decisions. Ask yourself:

  • What assumptions am I making about this data? Do these align with the reality presented in the data biography?
  • Why have I chosen this method of analysis and these specific variables?
  • Has my identity or positionality influenced my analytical choices?

Be reflexive about your interpretation of the data. Consider:

  • Could cognitive biases have shaped my conclusions?
  • Why have I interpreted the results in this way? How might my positionality have influenced my interpretation?
  • Is there an alternative explanation for these results?
  • Could my interpretation be harmful to anyone or reinforce harmful stereotypes?

References and Further Resources

Reflexive Accounts and Accounts of Reflexivity in Qualitative Analysis by Mauthner & Doucet – This paper discusses the importance and practicalities of reflexivity in qualitative data analysis.

Statistical Tales: Bringing in Reflexivity to Make Sense of Quantitative Data by Lakew – This chapter provides a researcher’s account of reflexivity in quantitative data analysis and interpretation.

Watch Critical Process Matrix, a talk by Dr Paola Buedo for Black and Brown in Bioethics where she discusses an inclusive framework for data analysis.

Contribute to the Hub

Feedback helps improve research quality, refine methods, and keep insights relevant and impactful. By sharing their perspectives, users help shape future studies and contribute to a more dynamic and collaborative research community.

Contribution Submission

The Hub is a living resource. As such, we welcome critical feedback and contributions of all kinds. In particular, we invite feedback on:

  • Concepts or practices we may have missed or under-explained
  • Our use of language, and how it could be clarified or made more inclusive
  • The organisation and presentation of information and resources

We would especially appreciate suggestions for subject-specific case studies that are relevant to the various sections of the Hub.
