Science is a critical partner in developing the tools, evidence, analysis, and actionable knowledge that help us disentangle the complexity of current global challenges - not least those captured in the UN's 2030 Agenda for Sustainable Development. Yet there are many barriers to implementing evidence-based approaches in designing effective policies and programmes.
On Monday 24 June 2019, the SDG Solution Space at Geneva’s Campus Biotech offered the perfect setting for the launch of the GSPI’s Evidence-Based Thinking in Practice event series.
The series aims to support exchange and learning on challenges and practical approaches to evidence-based thinking across Geneva’s policy and research communities. The target audience is professionals engaged in strategic planning, research management or production, and science-policy collaboration functions within Geneva’s international community.
During this event, approximately 20 participants from a variety of backgrounds (such as knowledge management, research, policy analysis and advice) and sectors (such as humanitarian, ICTs, health, environment, economic affairs) actively unpacked barriers and opportunities within evidence-based approaches.
The discussion started from a working definition of evidence-based thinking:
“A process of making good-quality decisions based on a combination of critical thinking and best-available, relevant evidence of the highest quality”
It was then stimulated by interventions from two experts, who shared real-life successes and struggles in their efforts to bring evidence-based approaches to their work.
Brenda Koekkoek, Programme Officer, SAICM Secretariat (UN Environment Programme), shared her perspective on the needs, opportunities and barriers for scientific collaboration to support the sound management of chemicals throughout their lifecycle. This is an issue that not only requires an interdisciplinary approach, but also resonates strongly with several SDGs. She stressed that, unlike climate change and biodiversity, chemical management is a fragmented field that is not currently covered by strong science-policy interfaces and would benefit from more holistic, evidence-based mechanisms. Sharing her experience of mobilising students to help produce new policy-relevant knowledge, she explained that her fundamental challenge was finding the time and space to engage both with academics and with her professional hierarchy, so that the knowledge produced does not remain in the classroom but becomes practical and functional for effective policy and action in chemical management.
Nathan Ford shared his experience working as the Head of the World Health Organisation’s Guidelines Review Committee, whose mission is to ensure the existence of sufficient evidence to produce health guidelines. As authoritative policy products, guidelines are based on a thorough quality assurance process that evaluates existing scientific evidence. Beyond assessing evidence, the role of the committee is also to avoid conflicts of interest and to ensure that there is diverse representation among the committee members making the decisions.
Evidence must not only underpin policies; evidence-based policies must also translate into actual practice. Displaying a hierarchy of evidence (from single case reports to randomised trials and meta-analyses), Ford also stressed that when considering evidence and research, factors such as feasibility, cost and ethics must be taken into account.

Based on these thought-provoking presentations, a dynamic discussion with participants led to the identification of a set of barriers and opportunities, including:
Barriers:
- Time: neither researchers nor policy professionals are time-rich people;
- Space: conveners are needed to gather science, policy and implementation actors around the table to define their common needs;
- Timeframes: referring to the amount of time between the production of research-based evidence and policy and programmatic needs;
- Mindsets: there is a difference of mindsets and language between academics and practitioners. Not all disciplines are (or should be) aimed at producing actionable research. It can also be challenging for scientists with a neutral research ethos to find the right setting to work with practitioners or policy actors who have an agenda;
- Asking the right questions: people spend a lot of energy on questions that might not be the most pressing (given time and resource constraints), or might already have answers (information asymmetry).
Opportunities:
- Datasets: policy and implementation actors often produce important datasets that they would like to exploit better, but don’t necessarily have the time or expertise to do so. This offers opportunities for researchers, who are themselves often in need of this type of source material;
- Recognising that different actors can produce evidence: think tanks, advocacy organisations and international organisations produce evidence, often codifying knowledge at a given point in time. Complementing this and adding further value, researchers may spend more time on in-depth analysis and knowledge production, and can provide a longer-term outlook on existing knowledge gaps;
- Integrating citizens into the equation: communicating evidence to people affected by policies and getting them involved in implementation for more impactful research, policy and programmes; and strengthening the alliance between citizens and scientists to positively influence political priorities.
The GSPI’s Program Associate, Maxime Stauffer, then provided a synthesis that contextualised evidence-based thinking not only as a normative goal but also in terms of pragmatic strategies that can be adapted to barriers and contextual realities. He shared some best practices and recommendations, including building a toolbox of methods that can be selected on a case-by-case basis. Helping professionals in Geneva build that toolbox will be the aim of future sessions of the Evidence-Based Thinking in Practice series.
Recommendations for upcoming events included moving beyond the notion of evidence as monolithic and exploring its individual elements: what type of evidence do we need? For whom? At which part of a policy or programmatic cycle is it most useful? Similarly, participants encouraged exploring different models in the evidence hierarchy: though softer instruments might be easier to produce and put into policymakers’ hands, they might produce less authoritative outcomes. This, again, calls for adapting evidence-based thinking to specific contexts.
As we take stock of the feedback received, we are starting to plan for upcoming events in this series that will dive into specific methodologies and scientific disciplines that resonate with efforts of professionals in the Geneva ecosystem.
Interested in receiving information about our next events? Subscribe to our newsletter.