Grants and Awards, Department of Methodology and Statistics
2020 Grants and Awards Methodology and Statistics
Maria Bolsinova: postdoctoral fellowship from the Spencer Foundation and the National Academy of Education (USA)
Statistical Tools to Track a Multitude of Abilities as They Develop
With education’s ambition to move towards personalized learning, it is crucial that digital learning systems can accurately track learners’ skills and abilities as they develop during the learning process. Traditional statistical methods are not well equipped to address this challenge. This project will therefore develop novel, flexible, and efficient statistical methods for dynamically tracking a large number of interrelated abilities and skills in learners over time, updating the assessment of a learner’s ability levels directly after every new response. The envisioned statistical machinery balances the accuracy of the assessment against the speed with which a change in ability is tracked, and can form the statistical basis for a wide range of large-scale online adaptive learning systems. The concrete outcome of the project will be well-tested, open-source statistical software that makes these multidimensional analyses possible. By developing these statistical foundations, the project will serve as a key stepping stone towards personalized learning systems, which require this machinery to be in place before they can be fully developed, tested, and optimized.
Michèle Nuijten: NWO Veni Grant (€250,000)
Project: “The Four-Step Robustness Check: Assessing and Improving Robustness of Psychological Science”
Results of psychological research underlie important decisions concerning health, education, etc. Unfortunately, it appears that many psychological findings may be unreliable. In this project, the researcher will develop a protocol to determine efficiently if a result is robust, by focusing on verification of reported results in articles.
Leonie Vogelsmeier: IOPS Best Presentation Award
At the 2020 Winter Conference of the Interuniversity Graduate School of Psychometrics and Sociometrics (IOPS), which was held online, Leonie Vogelsmeier received the Best Presentation Award for a talk in which she discussed the main ideas and results of her PhD project, entitled “Latent Markov Factor Analysis: A mixture modeling approach for evaluating within- and between-person measurement model differences in intensive longitudinal data”.
E. Damiano D’Urso: IOPS Best Poster Award
At the 2020 Winter Conference of the Interuniversity Graduate School of Psychometrics and Sociometrics (IOPS), which was held online, Damiano D’Urso received the Best Poster Award for his poster entitled “Scale Length Does Matter: Recommendations for Measurement Invariance Testing with Categorical Factor Analysis and Item Response Theory.” This work is part of Damiano’s PhD project, in which he compared the impact of multiple-group categorical factor analysis and multiple-group item response theory hypotheses and testing strategies on detecting violations of measurement invariance. A relevant outcome of the project was a set of recommendations for applied researchers interested in testing measurement invariance for ordinal data.
Jelte Wicherts: Fellow of the Association for Psychological Science (APS)
Jelte Wicherts has been named Fellow of the Association for Psychological Science (APS). Fellow status is awarded to APS members who have made sustained outstanding contributions to the science of psychology.
Paul Lodder: Elsevier Young Investigator Award
Under the auspices of the European Association of Psychosomatic Medicine (EAPM), Elsevier has established a scientific award for young investigators in the field of psychosomatic medicine, consultation-liaison psychiatry, and integrated care.
Michèle Nuijten: Open Science Champion Award
Michèle Nuijten was named Open Science Champion by the Open Science Community Tilburg for her work on statcheck and for advocating Open Science throughout all her work.
Inga Schwabe, together with Karin Gehring (main applicant), Elke Butterbrod, Eline Verhaak, Geert-Jan Rutten & Margriet Sitskoorn: HRSI Seed Funding
Discovering latent patterns of fatigue in patients with benign and malignant brain tumors – A first step towards personalized monitoring and treatment
This collaboration investigates latent patterns in previously obtained data on fatigue in patients with brain tumors before the start of treatment, and their relationship with disease and patient characteristics. The aim is to 1) better understand the heterogeneity in the manifestation of fatigue in this population, and 2) provide a starting point for investigations into its longitudinal course and its value in predicting functional and disease outcomes.
Michèle Nuijten: Herbert Simon Research Institute Seed Funding for COVID-19 Research (€10,000)
Project: “Employing meta-research to enrich COVID-19 preprints and study the impact of time pressure on research quality”. A co-application with Dr. Robbie van Aert and Prof. Jelte Wicherts.
Richard Klein and colleagues: SIPS Commendation
Richard Klein and colleagues received a commendation from the Society for the Improvement of Psychological Science for developing a Lab Philosophy document and research templates (https://psyarxiv.com/6jmhe/) that help organize a research lab to use open science tools efficiently and that transparently lay out expectations and resources for new lab members. These tools are tailored to one lab but designed to be easy to duplicate and customize.
Michèle Nuijten: Center for Open Science / SCORE project ($10,536.56)
As part of the SCORE project (Systematizing Confidence in Open Research and Evidence), the Center for Open Science (COS) contracted Michèle Nuijten to automatically extract statistical reporting errors from a large set of papers.
2019 Grants and Awards Methodology and Statistics
Robbie van Aert: IOPS Best Paper Award
Robbie van Aert received the IOPS Best Paper Award for his work, in collaboration with Marcel van Assen, on the development of a method to combine an original published study and a replication study. This meta-analysis method is called the hybrid method because it treats the original and replication study differently, to correct for a likely overestimation in the original study. The proposed method was applied to data of the Reproducibility Project: Psychology (Open Science Collaboration, 2015) and revealed that the conclusions based on the hybrid method are often in line with those of the replication, suggesting that many published psychological studies have smaller effect sizes than reported in the original study and that some effects may even be absent.
Olmo van den Akker and Stanislav Vlasov: Fetzer Franklin Fund
The EVA-algorithm: Author name disambiguation for large Web of Science datasets
The creation of co-authorship networks is a valuable way to depict the social structure of scientific fields. However, these co-authorship networks often get distorted by author name synonymy (the same author is split into two nodes because their name is spelled differently in different publications) and author name homonymy (different authors are merged into one node because they share the same name). In this project, we will develop an open-source author name disambiguation algorithm in the programming language R that is easy to use for any researcher who wants to create a co-authorship network with as few author name ambiguities as possible.
Olmo van den Akker and Jelte Wicherts: Fetzer Franklin Fund
Preregistering your meta-analysis: A template and tutorial
Preregistration—the practice where researchers define their research questions, research design, data collection plan, and analysis plan before collecting and analyzing their data—has been lauded as one of the main solutions to the so-called ‘crisis of confidence’ in the social sciences. Preregistration has grown in popularity in recent years, as evidenced by the development of preregistration templates for many different types of primary research. However, as of yet no templates exist that specifically cater to meta-analyses. For this reason, we will organize a workshop in which we invite several leading experts in meta-analysis to extend a general-purpose template for systematic reviews into a ready-to-use preregistration template tailored to meta-analyses in the social sciences. The template will be accompanied by a tutorial paper with a worked example showing how to use it effectively.
Jia He: DAAD grant
Michael Bender (Department of Social Psychology), together with Jia He and Mark Brandt, secured a DAAD (German Academic Exchange Service) grant for a longitudinal experience sampling study investigating how character affects the way international and Dutch students deal with different types of adversity. The grant for this study exceeds €44,000.
Michèle Nuijten: Center for Open Science
Michèle received a grant from the Center for Open Science as part of the SCORE project (Systematizing Confidence in Open Research and Evidence). In this project, she automatically extracted statistical reporting errors from over 3,000 psychology papers and 100 papers on COVID-19. These data will become part of a larger database that will be used to build models to predict which studies will replicate and which will not. ($10,536.56)
Leonie Vogelsmeier, Shuai Yuan & E. Damiano D’Urso: IOPS Award for Organizing a Classification Symposium
The grant was awarded by the Interuniversity Graduate School of Psychometrics and Sociometrics (IOPS) to support the organization of the first ‘Classification Methods in the Social and Behavioral Sciences (CSBS)’ symposium, which was held in October 2019 at Tilburg University. The goal of this one-day event was to exchange knowledge about the latest methods and applications, with room for informal discussions and the opportunity to build a network in the field of classification.
2018 Grants and Awards Methodology and Statistics
Leonie Vogelsmeier: Best Junior Scientist Presentation Award from the European Association of Methodology
At the 2018 European Congress of Methodology in Jena (Germany), Leonie Vogelsmeier received the Best Junior Scientist Presentation Award for her presentation entitled “Latent Markov Factor Analysis for Exploring Measurement Model Changes in Time-Intensive Longitudinal Studies.” In this work, Leonie extended latent Markov factor analysis—which makes it possible to evaluate measurement model changes in time-intensive longitudinal data—to accommodate unequally spaced measurement occasions. As part of the award, Leonie was granted the opportunity to publish this work in a special issue of the journal Methodology.
Inga Schwabe: John B. Carroll Award for Research Methodology by the International Society for Intelligence Research (ISIR)
For her methodological contributions in the field of behavior genetics, Inga Schwabe has been awarded the John B. Carroll Award for Research Methodology by the International Society for Intelligence Research (ISIR). In her research, she has shown that ignoring psychometric issues such as heterogeneous measurement error or scale transformations can result in spurious findings of genotype by environment interactions. To solve these problems, she introduced a new framework that integrates psychometric models with twin data.
Leonie van Grootel: Thomas C. Chalmers Award
The Thomas C. Chalmers Award is given at each Cochrane Colloquium to the principal author of the best presentation on methodological issues related to systematic reviews given by an early-career investigator. Presentations must demonstrate originality of thought, high-quality science, relevance to the advancement of the science of systematic reviews, and clarity of presentation. Leonie presented one of the chapters of her dissertation, in which she describes the rationale and steps for incorporating quantitized qualitative findings in an informative prior distribution in a Bayesian meta-analysis.
Olmo van den Akker: IOPS Best Poster Award
In a vignette study, Olmo van den Akker and colleagues studied the way researchers in psychology interpret situations where they are presented with the results of multiple studies that all test a given theory. They found that only 1% of the 505 participants used the normative approach of Bayesian inference, while a majority of the participants used simple vote counting approaches. These findings indicate that researchers fail to use important information like power and the significance level when assessing the results of scientific papers with multiple experiments.
Xynthia Kavelaars: NWO Talent Grant
Randomized controlled trials (RCTs) are considered the gold standard for investigating the effectiveness of new treatments. However, as treatments become more personalized and address smaller subpopulations, it becomes increasingly hard to set up powerful trials. We address this problem by developing novel methods that 1) combine data from different endpoints within a trial, 2) include evidence from similar trials using different endpoints, and 3) include evidence from similar trials conducted on different groups of patients.
We will develop and evaluate a Bayesian framework for information sharing within and between trials to advance the efficiency of RCTs.
2017 Grants and Awards Methodology and Statistics
Leonie Vogelsmeier: IOPS Best Poster Award
At the 2017 Winter Conference of the Interuniversity Graduate School of Psychometrics and Sociometrics (IOPS) in Tilburg, Leonie Vogelsmeier was awarded the Best Poster Award for her poster entitled “Latent Markov Factor Analysis for Exploring Measurement Model Changes in Time-Intensive Longitudinal Studies.” This work is part of Leonie's PhD project, in which she develops new methods for evaluating within- and between-person differences in measurement models underlying participants’ answers in intensive longitudinal data (e.g., experience sampling data).
Leonie Vogelsmeier: NWO Talent Grant
Understanding between- and within-person differences in experience sampling measurements using mixture factor analysis
Experience sampling, in which participants are questioned repeatedly via smartphone apps, is popular for studying psychological constructs (e.g., wellbeing, depression) within subjects over time. The validity of such studies, e.g., regarding decisions about treatment allocation over time, may be hampered by distortions of the measurement of the relevant constructs, e.g., by response styles or substantively altered interpretations of questionnaire items. This project develops a new approach for disentangling the distortions from the actual construct measurements while taking the specific features of experience sampling studies into account.
Shuai Yuan: NWO Talent Grant
Social science research in the era of Big Data
Genetic markers, GPS coordinates, and online behavior information are becoming increasingly available and can be linked to traditional survey data. Although this may improve our understanding of social phenomena, sociological theory is usually insufficiently specific to guide analyses of such multi-source data sets. In this project, we aim to develop novel exploratory methods that combine elements of principal component, regression, and cluster analysis to automatically detect interpretable associations within and between sources.
Joris Mulder: Vidi
Joris Mulder has been awarded a Vidi grant worth 800,000 euros by the Netherlands Organisation for Scientific Research (NWO). In this project, Joris Mulder and colleagues will develop a new Bayesian statistical framework for analyzing relational data between individuals or groups of individuals in a social network. The new framework will be implemented in user-friendly software (e.g., R, JASP).
Joris Mulder: ERC Starting Grant
In this project, Joris Mulder will set up a research group to work on Bayesian relational event history modeling. The goal is to learn about how and how fast social relationships change in continuous time. The focus will be on social networks of colleagues in large organizations, social networks of children and teachers in classrooms, and social networks of criminal gangs in city districts.
Paulette Flore and Jelte Wicherts: NWO replication grant
A large scale registered replication report of the stereotype threat effect in female math performance.
In this project, the researchers will team up with other labs across the world to replicate a well-known study by Johns, Schmader, and Martens (2005) on the effects of stereotype threat on female students’ math test performance, in order to examine the replicability and generalizability of this common explanation of the gender gap in mathematics.
2016 Grants and Awards Methodology and Statistics
Katrijn van Deun: Vidi grant for data research
Katrijn Van Deun (Tilburg School of Social and Behavioral Sciences, department Methodology and Statistics) has been awarded a Vidi grant worth 800,000 euros by the Netherlands Organisation for Scientific Research NWO. The grant will enable her to develop her own research program and research group. Van Deun's research concerns the development of statistical tools for the analysis of so-called Big Data from multiple sources.
Kim de Roover: Innovational Research Incentives Scheme Veni
Lack of measurement invariance in multilevel data: A cluster-based solution for making valid attribute comparisons.
When measuring unobservable attributes through observed variables such as questionnaire items, psychologists assume a measurement model in which each item measures the intended attribute. When comparing attributes based on item scores, they assume that this measurement model is invariant across the compared groups or subjects. This project develops a cluster-based solution for dealing with a lack of such invariance in multilevel data. The grant comprises 250,000 euros.
Katrijn van Deun: Aspasia Grant
Katrijn Van Deun (Tilburg School of Social and Behavioral Sciences, department of Methodology and Statistics) was awarded an Aspasia premium by the Netherlands Organisation for Scientific Research NWO. Following her promotion to associate professor, the university received a premium worth 200,000 euros. €50,000 of the premium is used to fund the university’s structural diversity policy, while the remaining sum supports Van Deun’s research.
Jelte Wicherts: European Research Council (ERC)
With this €2 million grant, Jelte Wicherts and his colleagues will investigate, refine, and develop innovative methodological and statistical tools that help make research at different levels, from individual test results to meta-analyses, stronger, more efficient, and more useful.
Michèle Nuijten: Leamer-Rosenthal prize for Open Social Science
This award is an initiative of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), and comes with a cash prize of $10,000.
statcheck is an R package that extracts statistical results from scientific articles and recalculates p-values. It offers a concrete tool for researchers to check their own work before submitting it, and for journals to detect misreported statistics during peer review.
The Leamer-Rosenthal prize was created to reward those driving change in social science research by educating others, developing tools to facilitate openness, and carrying out transparent and reproducible science themselves.
Chris Hartgerink: R2RC "Next generation leadership award"
At OpenCon 2016 in Washington D.C., Chris Hartgerink was awarded the Right to Research Coalition (R2RC) "Next Generation Leadership Award". His prize was announced by European Commissioner for Science, Carlos Moedas, who lauded him for his "tireless work on open access" and stated that "he is a worthy recipient of the R2RC next generation leadership award for 2016."
US Office of Research Integrity grant for research on detecting data fabrication
Chris Hartgerink (main applicant), Jelte Wicherts, and Marcel van Assen have been awarded $100,000 by the US federal government to investigate how valuable statistical tools are for detecting data fabrication. In the spirit of Open Science, and to allow for feedback on the research plans, the entire research proposal was published upon submission of the grant application. The grant funds their research during the academic year 2016–2017.
Robbie van Aert (main applicant): SSMART grant
"Getting it Right with Meta-Analysis: Correcting Effect Sizes for Publication Bias in Meta-Analyses from Psychology and Medicine", Social Science Meta-Analysis and Research Transparency (SSMART) grant of $30,000.