This page lists six peer-reviewed papers (2017–2025) in the research area of Crowdsourcing Behavioral Research from the Prolific Citations Library, a curated collection of research powered by high-quality human data from Prolific's diverse participant panel.
-
Authors: Y Ba, MV Mancenido, EK Chiou, R Pan
Year: 2025
Published in: Behavior Research Methods, 2025 - Springer
Institution: University of Delaware, National Taiwan University, University of British Columbia, Monash University
Research Area: Crowdsourcing, Data Quality, Spamming Behavior Detection, LLM Applications in Behavioral Research
Discipline: Computer Science, Artificial Intelligence, Large Language Models
The paper introduces a systematic method to evaluate crowdsourced data quality and detect spam behaviors through variance decomposition, proposing a spammer index and credibility metrics to improve consistency and reliability in labeling tasks.
Methods: Variance decomposition, Markov chain models, and generalized random effects models were used to assess annotator consistency and credibility; metrics were applied to both simulated and real-world data from two crowdsourcing platforms.
Key Findings: The proposed spammer index and credibility metrics characterize crowdsourced data quality, spammer behavior, and annotators’ consistency and credibility in labeling tasks.
Citations: 2
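The spammer-detection idea above can be illustrated with a toy agreement metric. This is a deliberate simplification, not the paper's actual variance-decomposition method or spammer index: an annotator who answers at random agrees with the majority vote of the other annotators far less often than an attentive one. All annotator names and accuracy rates below are hypothetical.

```python
import random

random.seed(0)

NUM_ITEMS, NUM_LABELS = 300, 3
true_labels = [random.randrange(NUM_LABELS) for _ in range(NUM_ITEMS)]

def careful(label):
    # attentive annotator: labels correctly 90% of the time (assumed rate)
    return label if random.random() < 0.9 else random.randrange(NUM_LABELS)

def spammer(label):
    # spammer: ignores the item and answers uniformly at random
    return random.randrange(NUM_LABELS)

# hypothetical panel: four attentive annotators plus one spammer
ratings = {f"a{i}": [careful(t) for t in true_labels] for i in range(4)}
ratings["spam"] = [spammer(t) for t in true_labels]

def agreement_index(name):
    # fraction of items where this annotator matches the majority vote
    # of the remaining annotators -- a crude credibility proxy
    hits = 0
    for i in range(NUM_ITEMS):
        others = [r[i] for n, r in ratings.items() if n != name]
        majority = max(set(others), key=others.count)
        hits += ratings[name][i] == majority
    return hits / NUM_ITEMS

for name in ratings:
    print(f"{name}: {agreement_index(name):.2f}")
```

In this simulation the attentive annotators score near their 0.9 accuracy while the spammer hovers around chance (1/3), so even a simple threshold separates them; the paper formalizes this intuition with variance decomposition and generalized random effects models rather than raw majority agreement.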
-
Authors: DT Esch, N Mylonopoulos, V Theoharakis
Year: 2025
Published in: Behavior Research Methods, 2025 - Springer
Institution: University of Cologne, University of Piraeus, Aristotle University of Thessaloniki
Research Area: Crowdsourcing Behavioral Research, Mobile Data Collection
Discipline: Behavioral Research
Mobile-based responses collected via platforms like Pollfish are comparable in quality to computer-based responses from MTurk and Prolific, though attentiveness varies significantly across platforms and is influenced by factors such as incentives, distractions, and System 1 thinking.
Methods: Conducted two studies distributing the same survey across MTurk, Prolific, Pollfish, and Qualtrics panels to compare data quality and analyze attentiveness scores.
Key Findings: Attentiveness, device usage (mobile vs. computer), and factors influencing data quality such as incentives, respondent activity, distractions, and survey familiarity.
Citations: 1
-
Authors: F Joessel, S Denkinger, PE Joessel, CS Green
Year: 2025
Published in: Acta Psychologica, 2025 - Elsevier
Institution: Max Planck Institute, University of Potsdam, University of Maryland, University of Zurich, University of Arizona
Research Area: Online Cognitive Training, Automated Psychological Studies, Crowdsourcing Behavioral Research
Discipline: Psychology
The study introduces a fully online method for conducting cognitive training experiments using Prolific, significantly reducing resource demands while achieving robust results and diverse participant recruitment.
Methods: Participants were recruited via Prolific, assigned to groups using a pseudo-randomized procedure, and completed a 12-hour remote cognitive training study with pre- and post-test assessments monitored via custom dashboards.
Key Findings: A fully remote, automated 12-hour cognitive training intervention can be run entirely online, yielding robust results and diverse recruitment with substantially reduced resource demands.
-
Authors: R Kapitany, C Kavanagh
Year: 2023
Published in: 2023 - OSF
Research Area: Crowdsourcing ethics, best practices in behavioral science research
Discipline: Behavioral Science
Citations: 4
-
Authors: S Connors, K Spangenberg, AW Perkins
Year: 2020
Published in: Journal of ..., 2020 - Taylor & Francis
Institution: University of Guelph, University of Utah, University of Victoria
Research Area: Psychological Measurement, Crowdsourcing, Behavioral Research Methods
Discipline: Behavioral Science
DOI: 10.1080/00913367.2020.1806155
Citations: 10
-
Authors: E Peer, L Brandimarte, S Samat, A Acquisti
Year: 2017
Published in: Journal of Experimental Social ..., 2017 - Elsevier
Institution: Eller College of Management, University of Arizona, Heinz College, Carnegie Mellon University
Research Area: Crowdsourcing behavioral research
Discipline: Behavioral Science
Citations: 3928