Wednesday 14th October
Jessica Gurevitch: Thinking in Meta-analysis: Research Synthesis, Generalisation, and the Nature of Scientific Evidence
Research synthesis consists of systematic reviewing and meta-analysis. What can and can't these approaches do to synthesize results to reach general conclusions about scientific questions? What can we expect, and where do our expectations fall short? Are there alternatives? I will discuss some of the philosophical and practical differences and similarities between different disciplines in how these powerful tools are used, where developments are headed, and how they can and can't address risk. (Spoiler alert: meta-analysis is probably a lot better at interpolation than at extrapolation, but it's been underutilized in parameter estimation).
Dr. Jessica Gurevitch is a Distinguished Professor in the Department of Ecology and Evolution at Stony Brook University, a research university of the State University of New York. Dr. Gurevitch’s research interests span several traditional categories within the field of ecology, including research synthesis and meta-analysis, biological invasions, and plant community and population ecology more broadly. She introduced contemporary quantitative research synthesis and meta-analysis to the fields of ecology and evolution, changing the way scientists in these fields conceptualize and review scientific data. This work has been controversial and highly influential, and grew out of her interests in applying rigorous statistical methodology to the analysis of ecological data and the design of ecological experiments.
In addition to carrying out scientific studies, Prof. Gurevitch has co-authored and co-edited several books, including Design and Analysis of Ecological Experiments (Scheiner and Gurevitch 1993, Chapman and Hall; 2nd ed. 2001, Oxford University Press), The Ecology of Plants (Gurevitch, Scheiner and Fox, Sinauer Assoc. 2002, 2006, 3rd ed. 2020—just out) and Handbook of Meta-analysis in Ecology and Evolution (2013, Koricheva, Gurevitch and Mengersen, Princeton University Press). She also co-authored an early software package for meta-analysis in ecology (MetaWin 2.0, Rosenberg, Adams and Gurevitch, publ. Sinauer Assoc.) as well as an open-access package, OpenMEE (2013, with several collaborators).
Friday 4th September
Jessica Gurevitch: Potential Ecological Risks and Impacts of Solar Radiation Modification and Why You Should Pay Attention
Climate intervention is a set of proposed activities designed to intentionally modify global climate to reduce anthropogenic global warming. A major proposed approach, solar radiation management (SRM), aims to deliberately reduce or stabilize global temperatures by reflecting incoming solar radiation, increasing Earth’s albedo. The best-studied approach to SRM is stratospheric aerosol intervention (SAI). While a great deal of work has been done on climate projections for SAI, almost nothing is known about its predicted ecological impacts. I will talk about what is known and unknown about the predicted impacts and risks of SAI on ecological systems, and why the involvement of scientists who study risk is important in assessing the possible future of SAI.
Wednesday 9th September
Anne Michiels van Kessenich: Risk is a tool, not a problem: communicating risk to make confident decision makers
Dr. Anne Michiels van Kessenich is a political science graduate of Maastricht University, The Netherlands, with experience working with decision-makers in political organisations and local government. She has first-hand experience of political decision-making at the public interface and is interested in the patterns through which the public approaches and uses the concept of risk. Anne’s research develops methods through which children can be taught to handle the risk concept confidently, including their approach to selecting behavioural options and the decision-making that follows. Dr. van Kessenich focuses on the use of embodied learning to address the mental processes we use to arrive at a decision, and on the quality, or emotional colour, that we learn to give to the concepts we use.
In this lecture Anne presents the guiding material and principles that undergird the idea of risk as a communicative tool. In a nutshell, Anne’s research has identified a category error that is consistently made when thinking about risk: within the general population, risk is often conceptualised as a threat in and of itself, because of how it has been communicated from early infancy. This perception can result in decision-making avoidance and may exacerbate the effect of hazards. A successful shift in perception away from risk-as-a-threat requires new frames of communication: risk is a sign that asks you to pay close attention, like a traffic sign. There is no point in fearing a traffic sign; instead, you should learn to understand what it means and to use it wisely. Attendees are encouraged to share their views and ideas on the problems and solutions of effective risk communication, and are asked to consider one question in particular: is there a “silver bullet” with which we can help grown-ups lose their fear and avoidance of real engagement with the concept?
Wednesday 30th September
Kelli Johnson & Christian Luhmann: Epistemic uncertainty explains seemingly maladaptive behavioral findings
Kelli Johnson recently received her PhD in Cognitive Science from Stony Brook University where she studied decision making using a combination of behavioral and computational methods. She is now a postdoctoral researcher in the Meinig School of Biomedical Engineering at Cornell University, and works on tools for medical decision making.
Christian Luhmann is an Associate Professor of Cognitive Science at Stony Brook University in the Department of Psychology and Institute for Advanced Computational Science. His research interests include decision making, learning, and computational modeling.
Researchers studying decision making often present subjects with numerical information and assume (sometimes implicitly) that such information is perceived as precise. We discuss the possibility that decision makers treat all numerical information as estimates, which are subject to sampling error. Such an assumption can explain, and even justify, seemingly maladaptive behavioral phenomena. Behavioral evidence, and opportunities for improving decision making in medical contexts, are discussed.
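To make the premise concrete, here is a minimal Python sketch (my own illustration, not material from the talk): the same stated percentage carries very different epistemic uncertainty depending on the sample size assumed to lie behind it.

```python
# Toy illustration (assumptions mine): a stated "70% success rate"
# read as an estimate from n observations, not as a precise number.
import math

def estimate_sd(p, n):
    """Standard error of a proportion p estimated from n observations."""
    return math.sqrt(p * (1 - p) / n)

for n in (10, 100, 10000):
    print(n, round(estimate_sd(0.7, n), 3))
# The same "70%" is vague when backed by 10 cases and near-precise
# when backed by 10,000.
```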
Wednesday 5th August
Dominik Hose: The Embarrassingly Simple Calculus of Possibility Theory
Dominik Hose received his bachelor's and master's degrees in Simulation Technology from the University of Stuttgart (Germany) in 2015 and 2017. He is currently a PhD student under the supervision of Michael Hanss at the University of Stuttgart. Most importantly, the REC Workshop hosted by the Risk Institute in 2018 was the first UQ conference he ever attended, and it sparked his passion for imprecise probabilities. Dominik's research focuses on solutions to inverse problems in possibility theory and their numerical implementation.
Possibility theory is the mathematically rigorous heir to fuzzy set theory and can be used very efficiently for the quantification of polymorphic uncertainty in both forward and inverse problems.
We discuss how a possibility measure can be derived from an axiomatic basis, and how - equipped with some suitable principles - this provides a mathematical framework for imprecise probabilities. We then see how possibility distributions arise very naturally in many situations and which techniques are available for modeling possibilistic membership functions. Finally, many problems of possibilistic calculus may be expressed in a simple and general manner for which we consider several numerical solution approaches. Special emphasis is put on the intuitiveness, applicability, and simplicity of the presented results.
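As a concrete illustration of the basic calculus (a toy sketch under my own assumptions, not material from the talk): a possibility distribution over a finite set induces a possibility measure via a maximum, and a dual necessity measure via the complement.

```python
# Illustrative sketch: possibility and necessity measures induced by a
# discrete possibility distribution pi (outcome -> degree in [0, 1]).

def possibility(pi, event):
    """Pos(A) = max of pi over outcomes in A (0 if A is empty)."""
    return max((pi[x] for x in event), default=0.0)

def necessity(pi, event):
    """Nec(A) = 1 - Pos(complement of A)."""
    complement = set(pi) - set(event)
    return 1.0 - possibility(pi, complement)

# A normalized possibility distribution over four outcomes
pi = {"a": 1.0, "b": 0.7, "c": 0.3, "d": 0.0}

print(possibility(pi, {"b", "c"}))  # 0.7
print(necessity(pi, {"a", "b"}))    # 1 - Pos({c, d}) = 0.7
```

Note the characteristic pairing: for a normalized distribution, necessity never exceeds possibility for the same event.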
Friday 7th August
Bruno Merk: Nuclear Technology - A High-Risk Technology? Or What Happens if Risk Communication Fails?
Professor Bruno Merk is currently NNL/RAEng Research Chair in Computational Modelling for Nuclear Engineering at the University of Liverpool, NNL Laboratory Fellow for Physics of Nuclear Reactors, and Member of the Academic Editorial Board of PLOS ONE. He is an internationally recognized expert and thought leader in advanced nuclear reactor technologies and nuclear waste management strategies. He is currently involved in several national nuclear innovation projects sponsored by the Department for Business, Energy & Industrial Strategy and represents innovative technologies like iMAGINE – a nuclear energy system operating on spent nuclear fuel. Before coming to the UK, he was PI in the German National Programme on Nuclear Waste Management, Safety and Radiation Research (NUSAFE) and advised the German government on Nuclear Waste Management Strategies.
The seminar will introduce nuclear reactor safety to non-specialists by surveying the fundamentals and basic principles of safety in terms of historical examples and describing the safety design approach and engineering principles for controlling high-risk technologies. It will review how the perception of nuclear technologies and nuclear energy production has changed with time in Germany and what impact perception might have on new and old technologies and their acceptability, comparing examples of projects that failed in Germany although similar projects precipitated no major discussions in France or the UK. The presentation will lead into an open discussion on several fundamental questions. Why is risk communication important for engineering? What went wrong to cause the historical failures? How could we do better in the future?
Wednesday 19th August
Ullrika Sahlin: Precise Versus Bounded Probability
Ullrika Sahlin is an Associate Professor in Environmental Science at Lund University, Sweden. Her research concerns methods to treat uncertainty in scientific assessments, with applications in environmental assessment and evidence-based decision making. Ullrika leads the research group Uncertainty and Evidence Lab at Lund University.
What does it mean to quantify epistemic uncertainty by probability? Can there be more than one way to do it? In this discussion seminar I will compare precise and bounded probability as measures of epistemic uncertainty. I look for criteria for a suitable quantitative measure of epistemic uncertainty, such as a coherent theory, a clear interpretation, the ability to learn from data in different types of situations, the ability to integrate expert knowledge transparently, and the ability to propagate epistemic uncertainty through a model. We focus the discussion on Bayesian inference (for precise probability), Robust Bayesian inference (for bounded probability) and Confidence theory (for confidence boxes). View the slides for Ullrika's talk here.
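As one minimal illustration of the bounded-probability viewpoint (an assumed toy example, not from the seminar): Robust Bayesian inference replaces a single prior with a set of priors and reports bounds on the posterior quantity of interest. For binomial data with a box of Beta(a, b) priors, the posterior mean is monotone in a and b, so the bounds are attained at the corners.

```python
# Hypothetical sketch of Robust Bayesian inference: bound the posterior
# mean of a binomial success probability over a *set* of Beta(a, b)
# priors rather than a single precise prior.

def posterior_mean(a, b, k, n):
    # Beta(a, b) prior + k successes in n trials -> Beta(a+k, b+n-k)
    return (a + k) / (a + b + n)

def robust_posterior_mean(a_range, b_range, k, n):
    """Lower/upper posterior mean over all priors in the box.

    The mean is increasing in a and decreasing in b, so the extremes
    sit at the corners of the (a, b) box.
    """
    (a_lo, a_hi), (b_lo, b_hi) = a_range, b_range
    return (posterior_mean(a_lo, b_hi, k, n),
            posterior_mean(a_hi, b_lo, k, n))

lo, hi = robust_posterior_mean((1, 5), (1, 5), k=7, n=10)
print(lo, hi)  # an interval containing every precise Bayesian answer
```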
Friday 21st August
Maria Tsoutsou: Risk Communication in Perioperative Medicine: Discussing the Questionnaire Feedback
Dr. Maria Tsoutsou is a Clinical Fellow in perioperative medicine and anaesthesia at the Royal Liverpool University Hospital.
Perioperative medicine covers the full range of a patient’s experience through treatment, from contemplation of surgery to full recovery. Communicating the risks present throughout this period is necessary to keep the patient informed and to manage anxiety and expectations. A questionnaire was sent to anaesthetists working at the Royal Liverpool University Hospital asking them about the difficulties of communicating risks and their perception of risk calculators and risk communication tools. This meeting will cover the findings of this questionnaire, based on 38 responses. The aim is to develop a theme around which a subsequent meeting on 9th October can be based, which could address concerns and ideas raised in the questionnaire feedback. This could lead to further research projects, but should also serve to inform both anaesthetists and risk analysts about the intricacies of risk communication in perioperative medicine.
Wednesday 26th August
Ottone Scammacca: Risk assessment of mining projects at the territory level: the case of gold mining in French Guiana
Ottone Scammacca is a geographer and soil scientist working at the GeoRessources Laboratory, Université de Lorraine. He is currently completing his thesis on the development of a methodology for the assessment of gold mining risks at the territory level for land-planning purposes. His research interests also include soil ecosystem services, soil contamination, land reclamation and land planning. He has been a researcher at the French National Institute for Agriculture, Food, and Environment (INRAE) in soil science and cartography, developing a spatially explicit indicator approach for quantifying soil ecosystem services to support urban planning. He was educated in soil science at AgroParisTech and in geography at the Université de Paris 1 Panthéon-Sorbonne, where he studied the socio-environmental impacts of nickel mining in New Caledonia. He also has a degree in law and juridical sciences from Università Degli Studi Roma Tre (Italy).
Mining is the source and target of various positive and negative risks that generally extend beyond the mine-site perimeter and may impact the socio-ecological system in which mining is performed. However, current mandatory risk and impact assessment methodologies are often project-centered, performed on one project at a time, and sometimes neglect the cumulative dimension of risks, the great variability of mining activities and the socio-ecological vulnerability of the areas in which mining takes place.
Therefore, since a mining project should be considered a matter of land planning and territorial management rather than a simple industrial object, this PhD project (2017-2020) aims to develop a methodology for proposing and comparing different mining development strategies for land-planning purposes, based on the risk assessment of given scenarios. The methodology is applied to the case of gold mining in French Guiana as a demonstrative example. In this French region in the Amazon, gold exploitation plays a critical role in territorial dynamics, taking a great variety of forms in a very sensitive socio-ecological context. Furthermore, gold resources are still underexploited, which is intensifying public debate and the urgent need for public authorities to develop future strategies that integrate the development of gold mining activities at a regional level.
Wednesday 1st July
Michael Balch: Numerical Methods for Propagating Confidence Curves
Michael Balch is the Technical Lead at Alexandria Validation Consulting, LLC, where he designs the algorithms underpinning the company's software and personally renders all consulting services. Dr. Balch has twelve years of experience as a research-practitioner specializing in uncertainty quantification. He has worked on applications spanning engineering, medicine, defense, and finance. His career has included time as a contractor at both NASA Langley and AFRL Wright-Patterson. He received his Doctorate in Aerospace Engineering from Virginia Tech in 2010.
Confidence curves—aka consonant confidence structures, aka inferential models—fuse the comprehensiveness and flexibility of Bayesian inference with the statistical performance and rigor of classical frequentist inference. Rooted in possibility theory, these structures visualize the long-known connection between confidence intervals and significance testing. More importantly, they enable the statistically reliable assignment of belief to propositions (or sets, hypotheses, etc.) about a fixed parameter being inferred from random data. This presentation explores a Monte Carlo approach to propagating these structures through black-box functions, a necessity if these methods are to be widely applied in engineering work.
Further discussion with Michael Balch about his talk
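A rough sketch of the kind of propagation involved (my own toy construction, not Dr. Balch's algorithm): by the possibilistic extension principle, the plausibility that a black-box output lands in a set is the supremum of the input contour over inputs whose image lands there, which Monte Carlo sampling can approximate.

```python
# Illustrative toy: Monte Carlo propagation of a possibility contour
# through a black-box function (the contour, function, and set are all
# my own assumed examples).
import random

def propagate_plausibility(contour, g, in_set, samples, lo, hi):
    """Pl(g(theta) in A) ~= max contour(theta_i) over sampled theta_i
    whose image lands in A (extension principle, sampled)."""
    best = 0.0
    for _ in range(samples):
        theta = random.uniform(lo, hi)
        if in_set(g(theta)):
            best = max(best, contour(theta))
    return best

# Triangular contour peaked at theta = 2 on [0, 4]
contour = lambda t: max(0.0, 1.0 - abs(t - 2.0) / 2.0)
g = lambda t: t * t          # the "black box"

random.seed(0)
pl = propagate_plausibility(contour, g, lambda y: y >= 4.0,
                            samples=20000, lo=0.0, hi=4.0)
print(round(pl, 2))  # close to 1.0: g(2) = 4 and contour(2) = 1
```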
Wednesday 8th July
Jürgen Hackl: ABM (agent-based modeling) simulations of epidemic spreading in urban areas
Human mobility is a key element in the understanding of epidemic spreading. Thus, correctly modeling and quantifying human mobility is critical for studying large-scale spatial transmission of infectious diseases and improving epidemic control. In this study, a large-scale agent-based transport simulation (MATSim) is linked with a generic epidemic spread model to simulate the spread of communicable diseases in an urban environment. The use of an agent-based model allows reproduction of the real-world behavior of individuals’ daily paths in an urban setting and captures the interactions among them in the form of a spatial-temporal social network. This model is used to study seasonal influenza outbreaks in the metropolitan area of Zurich, Switzerland. The observations of the agent-based models are compared with results from classical SIR models. The model presented is a prototype that can be used to analyze multiple scenarios of disease spread at an urban scale, considering variations of different model parameter settings. The results of this simulation can help to improve comprehension of disease spread dynamics and to take better steps towards the prevention and control of an epidemic.
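For reference, the classical SIR baseline mentioned above can be sketched in a few lines (illustrative parameters of my own choosing, not calibrated to the Zurich study):

```python
# A minimal sketch of the classical SIR model that agent-based results
# are typically compared against.

def sir(s0, i0, r0, beta, gamma, days, dt=0.1):
    """Forward-Euler integration of dS=-bSI, dI=bSI-gI, dR=gI."""
    s, i, r = s0, i0, r0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# Fractions of the population: one infectious case per 10,000 people
s, i, r = sir(s0=0.9999, i0=0.0001, r0=0.0, beta=0.3, gamma=0.1, days=200)
print(round(s + i + r, 6))  # conservation: fractions still sum to 1
```

With beta/gamma = 3, most of the population is eventually infected; an agent-based model can depart from this aggregate picture because contacts are structured, not uniformly mixed.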
Wednesday 15th July
Alan Calder: Resolving Thermonuclear Supernovae
Alan Calder is an Associate Professor of Physics and Astronomy at Stony Brook University in New York, working on the nuclear physics of explosive astrophysical phenomena. With his extensive background in large-scale computing, he is Deputy Director of the Institute for Advanced Computational Science. He has held research appointments at the National Center for Supercomputing Applications and at the University of Chicago's Center for Astrophysical Thermonuclear Flashes. His research is principally on bright stellar explosions known as Type Ia supernovae, which produce and distribute heavy elements and are therefore important for galactic chemical evolution, and whose light curves can be used as distance indicators for cosmological studies of the expansion of the universe. His simulations explore how stellar age and composition affect an event's brightness, which is critical to addressing the variability that is a significant source of uncertainty in cosmology.
Thermonuclear (Type Ia) supernovae are bright stellar explosions distinguished by light curves that can be calibrated to allow for their use as "standard candles" for measuring cosmological distances. While many fundamental questions remain, it is accepted that the setting of these events involves a white dwarf star (or two), and that the explosion is powered by explosive thermonuclear burning under degenerate conditions. Modeling these events presents a challenge because the outcome of an event sensitively depends on the details of the physics occurring on scales orders of magnitude smaller than the star. Such "microphysics" includes nuclear burning, fluid instabilities, and turbulence. I will give an overview of our understanding of thermonuclear supernovae and describe our approach to capturing these sub-grid-scale processes in macroscopic simulations.
Wednesday 22nd July
Michael Balch: Beyond False Confidence
For those who take frequentist notions of reliability seriously, normative statistical inference remains an unresolved challenge. For any one problem, there are multiple solutions that satisfy the Martin-Liu validity criterion. Some of these are obviously more efficient than others, but the vigorous pursuit of efficient and reliable inference can yield counter-intuitive results. This presentation explores three counter-intuitive phenomena that can arise in the use of confidence curves. Two of these phenomena hint at the need for additional constraints on statistical inference, beyond simple reliability.
Wednesday 29th July
Renata Schiavo: Health Equity and Health Communication: Moving Toward a New Paradigm during COVID-19 and Beyond
Renata Schiavo, PhD, MA, is a health communication, public health and global health specialist. She is the founding president and CEO of Health Equity Initiative, a nonprofit organisation. She is also a Senior Lecturer at Columbia University Mailman School of Public Health, Department of Sociomedical Sciences.
Increasingly, global health communication has been recognized as a key discipline in advancing health and human rights and promoting behavioral, social, organizational and policy change. This presentation focuses on why health equity matters in health communication and highlights the role of communication in improving population health and patient outcomes. It also features key elements of a proposed paradigm shift to a health equity and human rights-driven approach to global health communication research and practice during COVID-19 and beyond.
Wednesday 10th June
Yan Wang: Generalized Interval Probability and Its Applications in Engineering
Yan Wang, Ph.D. is a Professor of Mechanical Engineering at Georgia Institute of Technology. He is interested in multiscale systems engineering, modeling and simulation, and uncertainty quantification, and has published over 90 archived journal papers and 80 peer-reviewed conference papers. He recently edited the first book of its kind on uncertainty quantification in multiscale materials modeling.
Uncertainty in engineering analysis is composed of two components. One is inherent randomness due to fluctuation and perturbation, known as aleatory uncertainty; the other is epistemic uncertainty due to a lack of perfect knowledge about the system. Imprecise probability provides a compact way to quantify and differentiate the two components, where the probability measures randomness and the interval range quantifies the imprecision associated with the probability. Several forms of imprecise probability have been proposed, such as Dempster-Shafer theory, coherent lower prevision, p-box, possibility theory, fuzzy probability, and random set. To simplify the computation for engineering analysis, we introduced generalized interval probability, where the interval bounds take the form of directed or modal intervals instead of classical set-based intervals. Interval calculation is based on the more intuitive Kaucher interval arithmetic. Generalized interval probability has been applied in studying stochastic dynamics, hidden Markov models, Kalman filtering, random set sampling, and molecular dynamics simulation.
Read the talk notes here, or watch the full talk below.
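A toy illustration of interval-valued probability (using classical set-based intervals for simplicity, not the directed/modal intervals and Kaucher arithmetic of the talk): bounds on the probabilities of two independent events propagate to bounds on their conjunction and disjunction, with the extremes at the interval endpoints because the combination rules are monotone.

```python
# Toy sketch with classical intervals; the talk's generalized/Kaucher
# intervals refine this set-based picture.

def and_indep(p, q):
    """Bounds on P(A and B) for independent A, B with interval-valued
    probabilities; products of bounds in [0, 1] are monotone."""
    return (p[0] * q[0], p[1] * q[1])

def or_indep(p, q):
    """Bounds on P(A or B) = P(A) + P(B) - P(A)P(B); the formula is
    monotone in each argument on [0, 1]."""
    f = lambda a, b: a + b - a * b
    return (f(p[0], q[0]), f(p[1], q[1]))

pA, pB = (0.2, 0.4), (0.5, 0.7)
print(and_indep(pA, pB))  # bounds on P(A and B)
print(or_indep(pA, pB))   # bounds on P(A or B)
```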
Wednesday 17th June
Jürgen Hackl: Complex Infrastructure Systems: Intelligent risk and resilience assessments
Dr Jürgen Hackl is a Lecturer at the University of Liverpool and a member of the Data Analytics Group at the University of Zurich. He received his doctorate in Engineering Science from the ETH Zurich in July 2019. His research interests lie in complex urban systems and span both computational modelling and network science. Much of his work has been on improving the understanding, design, and performance of complex interdependent infrastructure systems, affected by natural hazards. Presently, he works on getting a better understanding of how the topology of the system influences dynamic processes and how this can be used to decrease the complexity of computational models. In order to transfer this knowledge to the industry, he co-founded the start-up Carmentae Infrastructure Management, helping infrastructure managers in their decision-making processes. Furthermore, he has a long history of supporting a sustainable digital world by developing and maintaining various open-source projects.
Dr Hackl's presentation focuses on complex infrastructure systems (such as transportation and supply chains), intelligent risk and resilience assessments for climate change, and integrated solutions to future challenges facing our cities and society. To gain a deeper understanding of such complex systems, new mathematical approaches and computational models are needed. To achieve this, we have to go beyond the classical boundaries of the individual disciplines and work in interdisciplinary teams. In this sense, smart mobility and smart cities have developed as new research areas.
The aim of this presentation is to give an overview of how complex infrastructure systems are currently modelled; how novel network-analytic methods for spatial-temporal networks can be utilized to gain a better understanding of our complex urban environment; how advances in data analytics and machine learning provide new ways to extract knowledge and support decision-making processes; and how cloud-based simulations might offer a solution for computational risk and resilience assessments of complex infrastructure systems.
Monday 22nd June
Noémie Le Carrer: Making sense of ensemble predictions in weather forecasting: Can possibility theory overcome the limitations of standard probabilistic interpretations?
Ensemble forecasting is widely used in weather prediction to reflect uncertainty about high-dimensional, nonlinear systems with extreme sensitivity to initial conditions. Results are generally interpreted probabilistically but this interpretation is not reliable because of the chaotic nature of the dynamics of the atmospheric system as well as the fact that the ensembles were not actually generated probabilistically. We show that probability distributions are not the best way to extract the information contained in ensemble prediction systems. A more workable possibilistic interpretation of ensemble predictions takes inspiration from fuzzy and possibility theories. This framework also integrates other sources of information such as the insight on the local system’s dynamics provided by the analog method and provides more meaningful quantitative results.
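One standard way to turn ensemble-member frequencies into a possibility distribution (shown here as my own illustration; the authors' construction may differ) is the classical probability-possibility transform, in which each outcome's possibility is the total probability of outcomes no more probable than it:

```python
# Sketch (my illustration): the Dubois-Prade probability-possibility
# transform applied to ensemble-member frequencies.
from collections import Counter

def prob_to_poss(probs):
    """probs: dict outcome -> probability. Returns outcome -> possibility
    pi(x) = sum of all probabilities q with q <= p(x)."""
    return {x: sum(q for q in probs.values() if q <= p)
            for x, p in probs.items()}

# 10 ensemble members forecasting a discretized temperature bin
members = ["cold", "mild", "mild", "mild", "mild", "mild",
           "warm", "warm", "warm", "hot"]
n = len(members)
probs = {x: c / n for x, c in Counter(members).items()}
poss = prob_to_poss(probs)
print(poss)  # the modal bin ("mild") gets possibility 1 (up to float error)
```

The transform is maximally specific among possibility distributions dominating the probabilities, which is one reason it is attractive for summarizing ensembles.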
Wednesday 24th June
Ryan Martin: False confidence, imprecise probabilities, and valid statistical inference
Dr. Ryan Martin is a Professor in the Department of Statistics at North Carolina State University, USA. His research interests include asymptotics, empirical Bayes analysis, high- and infinite-dimensional inference problems, foundations of statistics, imprecise probability, and mixture models. He is co-author of the monograph Inferential Models and co-founder of the Researchers.One peer review and publication platform.
Despite remarkable advances in statistical theory, methods, and computing in the last 50+ years, fundamental questions about probability and its role in statistical inference remain unanswered. There is no shortage of ways to construct data-dependent probabilities for the purpose of inference, Bayes being the most common, but none are fully satisfactory. One concern is the recent discovery that, for any data-dependent probability, there are false hypotheses about the unknown quantity of interest that tend to be assigned high probability -- a phenomenon we call false confidence -- which creates a risk for systematically misleading inferences. Here I argue that these challenges can be overcome by broadening our perspective, allowing for uncertainty quantification via imprecise probabilities. In particular, I will demonstrate that it is possible to achieve valid inference, free of false confidence and the associated risks of systematic errors, by working with a special class of imprecise probabilities driven by random sets. Examples will be given to illustrate the key concepts and results, and connections between this new framework and familiar things from classical statistics will be made.
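A textbook-style example of such a valid uncertainty quantification (the details here are my assumption, not necessarily the speaker's example): for a single observation x from N(theta, 1), the plausibility contour pl(theta) = 1 - |2*Phi(x - theta) - 1| peaks at the observation, and its level sets are exactly the classical confidence intervals, which is what rules out false confidence.

```python
# Illustrative sketch of a valid plausibility contour for a normal mean
# (standard example; the observed value x is my own choice).
import math

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def plausibility(theta, x):
    """Peaks at 1 when theta = x; the level set {pl >= alpha} is the
    (1 - alpha)-confidence interval, which is what makes it 'valid'."""
    return 1.0 - abs(2.0 * std_normal_cdf(x - theta) - 1.0)

x = 1.3  # observed data
print(plausibility(1.3, x))                   # 1.0 at the observation
print(round(plausibility(1.3 + 1.96, x), 2))  # ~0.05 at the 95% bound
```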
Wednesday 27th May
Dan Rozell: Technological Risk Attitudes in Science Policy
Daniel Rozell has two decades of experience in the fields of engineering and science, working in private industry and for public regulatory agencies. Dr. Rozell holds an affiliation as a Research Professor in the Department of Technology and Society at Stony Brook University in New York. His recent book, Dangerous Science: Science Policy and Risk Analysis for Scientists and Engineers, is available open access from Ubiquity Press.
Science and technology policy decisions must often be made before there is sufficient data, widely accepted theories, or consensus in the scientific community. Furthermore, what constitutes credible science is sometimes itself a contentious issue. The result is that we frequently encounter science and technology policy debates where well-intentioned and reasonable individuals can arrive at different conclusions. In the face of inconclusive data, people tend to evaluate new information using heuristics that include their pre-existing attitudes about science and technology.
Read the talk notes here, or watch the full talk below.