Abstract for the talk by Christina Bergmann

Robust cumulative psychological science using existing data: MetaLab and metalabR

For theory building and for translating findings into practice, psychologists usually try to generalize as far as possible from single data points. Yet summaries based on one or a few studies, including (null) results from often under-powered experiments, can misrepresent a messy and complex evidence base. Moreover, even true results may not generalize beyond their specific context, for theoretically important reasons. Factors typically dismissed as "noise" can also impede generalizability, including the lab where testing takes place, the type of stimuli, and the methods used. Systematic reviews and meta-analyses help bring coherence to a complex evidence base, control for confounds, and integrate across a range of studies. However, meta-analyses are underused in experimental research and suffer from limited replicability and transparency.
In this talk, I will present MetaLab and metalabR as a case study of how meta-analytic tools and open science can be used to increase the robustness of our conclusions and to assess their generalizability. Both are domain-specific and aggregate findings from developmental studies, but the approach can be generalized to other disciplines. MetaLab (https://metalab.stanford.edu/) is a platform for open, dynamic meta-analytic datasets. In six years, the site has grown to 30 meta-analyses covering data from 45,000 infants and children, and a direct spin-off has been built on the basis of MetaLab: https://metavoice.au.dk/. A key feature is the standardized data storage format, which allows a unified framework for analysis and visualization. This facilitates the addition of new data points as new studies emerge or are unearthed from the file drawer, resulting in community-augmented meta-analyses (CAMAs; Tsuji, Bergmann, & Cristia, 2014), which provide the most up-to-date summary of the body of literature. Use of this standardized format is supported by tailored documentation and the metalabR package, which is currently under development.
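To illustrate the kind of aggregation such a standardized format enables, here is a minimal sketch of a random-effects meta-analysis using the DerSimonian-Laird estimator, a standard way to pool effect sizes across studies while accounting for between-study heterogeneity. The effect sizes and variances below are invented for illustration and are not MetaLab data, and the code is a generic sketch rather than the metalabR implementation.

```python
import math

def random_effects_pool(effects, variances):
    """Pool per-study effect sizes under a DerSimonian-Laird random-effects model."""
    k = len(effects)
    w = [1.0 / v for v in variances]  # fixed-effect (inverse-variance) weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    # Cochran's Q quantifies heterogeneity across studies
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # estimated between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Hypothetical standardized effect sizes (e.g. Cohen's d) from five studies
effects = [0.45, 0.30, 0.62, 0.10, 0.38]
variances = [0.04, 0.06, 0.09, 0.05, 0.07]
estimate, se = random_effects_pool(effects, variances)
```

Because every dataset in a shared format exposes the same effect-size and variance columns, a single routine like this can be rerun unchanged whenever new studies are added, which is what keeps a CAMA up to date.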
This talk will highlight how the approach taken in MetaLab and metalabR can support robust science at different stages of a typical research project, from literature review through experiment design to result integration. At the same time, building MetaLab revealed gaps in experimental psychology's current appraisal of evidence, including the lack of study quality indicators and of a framework to compare studies on the same topic and quantify their closeness. This is also reflected in ongoing debates about what constitutes a "replication" and which forms of replication are "useful" - after all, meta-analyses are nothing other than collections of replications (whether narrowly or broadly construed).