Alison Heppenstall - Challenges of using microsimulation for simulating the impact of interventions on populations

  • Presenting author: Alison Heppenstall (University of Glasgow)

  • Authors: Alison Heppenstall, Nik Lomax, Corinna Elsenbroich, Rob Clay, Luke Archer

  • Session: C04D - Uncertainty - Wednesday 16:00-17:00 - Erika-Weinzierl Hall

  • Slides: PDF

The past decade has seen a rapid increase in both the amount of variable-rich data and the computational resources available to researchers. This has prompted renewed interest in population synthesis techniques for creating bespoke populations, as well as in the use of both static and dynamic microsimulation for exploring the potential impacts of interventions on different subpopulations. However, whilst there have been some useful developments in population synthesis, e.g. CTGAN for reducing oversampling, a number of challenges remain for microsimulation. Drawing on our experience of building complex dynamic microsimulation models for a large research project (sipher.ac.uk), we discuss some of these issues, including:

(i) Creating transition probabilities: Microsimulation's main purpose is to model heterogeneity, and great care is taken to differentiate individuals by their attributes in the underlying population. In contrast, transition probabilities rarely, if ever, vary between subpopulations. Methods for dealing with complex causality, such as Causal Loop Diagrams, can help to better formulate the effects that interventions have on different subpopulations.

(ii) Quantification of uncertainty: Uncertainty in the input data needs to be distinguished from uncertainty in the transition probabilities themselves, which accumulates, and can amplify, as the simulation steps forward in time.

(iii) Communicating outputs: As with all models, care has to be taken in communicating results. The data-driven nature of microsimulation makes it a powerful forecasting tool, but the differing levels of accuracy of the transition probabilities, and their consequences for forecasts, need to be communicated.
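
The contrast in point (i) can be sketched as a single simulation step in which transition probabilities do vary by subgroup. This is a minimal illustration only: the states, age bands and probabilities below are invented for the sketch and are not taken from the SIPHER models.

```python
import random

# Hypothetical employment transition probabilities that differ by
# subgroup (here, age band). All states and numbers are invented.
TRANSITIONS = {
    "16-34": {"employed": {"employed": 0.95, "unemployed": 0.05},
              "unemployed": {"employed": 0.40, "unemployed": 0.60}},
    "35-64": {"employed": {"employed": 0.97, "unemployed": 0.03},
              "unemployed": {"employed": 0.25, "unemployed": 0.75}},
}

def step(person, rng):
    """Advance one individual by one annual time step, drawing the next
    state from that individual's subgroup-specific transition row."""
    probs = TRANSITIONS[person["age_band"]][person["state"]]
    states, weights = zip(*probs.items())
    person["state"] = rng.choices(states, weights=weights)[0]
    return person

rng = random.Random(42)
population = [{"age_band": "16-34", "state": "unemployed"}
              for _ in range(1000)]

for _ in range(5):  # five simulated years
    for person in population:
        step(person, rng)

employed = sum(p["state"] == "employed" for p in population)
print(f"employed after 5 steps: {employed}/1000")
```

In a homogeneous model the two age bands would share one transition row; differentiating the rows is what lets the simulated cohorts diverge under an intervention.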
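
Point (ii) can likewise be made concrete with a small Monte Carlo sketch: if a transition probability is only an estimate (here an invented rate with mean 0.05 and standard error 0.01), redrawing it per replicate shows the spread of the simulated outcome widening as the forecast horizon grows. All quantities are illustrative assumptions, not SIPHER estimates.

```python
import random
import statistics

def simulate(p_exit, years, n=1000, rng=None):
    """Count how many of n initially 'healthy' individuals have exited
    (e.g. developed a condition) after `years` annual steps, given a
    per-year exit probability p_exit."""
    rng = rng or random.Random()
    exited = 0
    for _ in range(n):
        for _ in range(years):
            if rng.random() < p_exit:
                exited += 1
                break
    return exited

rng = random.Random(1)
spread = {}
for horizon in (1, 5, 10):
    outcomes = []
    for _ in range(200):  # Monte Carlo replicates
        # The transition probability itself is uncertain: redraw it
        # each replicate from a (hypothetical) sampling distribution.
        p = max(0.0, rng.gauss(0.05, 0.01))
        outcomes.append(simulate(p, horizon, rng=rng))
    spread[horizon] = statistics.stdev(outcomes)
    print(f"horizon {horizon:>2}: mean {statistics.mean(outcomes):6.1f}, "
          f"sd {spread[horizon]:5.1f}")
```

The standard deviation across replicates grows with the horizon because the same parameter error is compounded at every step, which is the amplification the abstract refers to.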