Evidence & literature review
All our projects begin with the same two stages.
First, what is the research question? In other words, what are we trying to find out, or what problem are we trying to solve?
Second, what do we already know? This stage involves looking at relevant work – our own and others' – to unearth ideas and insights that could prove useful to the people we are working with.
Drawing from existing evidence & ideas.
How much time should be invested?
Making good decisions about where to look for evidence, how far to cast the net and what constitutes high-quality, useful and relevant evidence is critical to producing an excellent review. We will talk with you in depth about whom the evidence review is for and how you plan to use the findings, so that we can design the review and plan our reporting with that in mind. And we're realistic: we'll ask about your budget and advise you on how to get the best quality for what you can afford.
Where we look & why
International research: Relationships with centres of excellence across the UK and internationally give us access to the best academic research.
Public policy contexts: Rapid reviews of the history and current policy context can help organisations think strategically about the problems they face or likely opportunities.
Calls for unpublished evidence: We know that many smaller voluntary organisations hold unpublished literature which, while less robust, is often at the leading edge of practice that has yet to be captured in academic research.
Volunteering & early childhood outcomes: Evidence review
The following describes how we used practitioner literature to inform our work.
Our review was conducted by researchers working in collaboration with practitioners. This corroborated our intuition that practitioners will always know more than researchers about how an intervention works, yet their detailed knowledge is rarely captured in outcome-focused peer-reviewed papers. We therefore adopted a dual approach to the review: searches of bibliographic databases for the peer-reviewed literature, and a call for evidence (drawing on our collective professional networks of practitioners and a manual search for relevant organisations) for the practitioner ('grey') literature.
The quality of the evidence was highly variable: some interventions were poorly described, and some reported findings were based on low response rates. Existing systematic reviews had already screened many of the quantitative and (quasi-)experimental studies for quality. We therefore used an abridged, pragmatic quality appraisal process, with questions applicable to both quantitative and qualitative methodologies, to mitigate any risks to the validity of the research arising from integrating findings across these different methodologies. These questions centred on the relevance and transparency of the evidence, its methodological robustness, and confidence in the data.
Learn more about IVAR