
Commit

Update cite.bib
Miccighel committed Jun 22, 2024
1 parent ea4744f · commit 72c80e3
Showing 1 changed file with 1 addition and 0 deletions.
content/publication/journal-paper-tsc-2024/cite.bib (1 change: 1 addition & 0 deletions)
@@ -6,4 +6,5 @@ @article{soprano2024longitudinal
 year = 2024,
 keywords = {Longitudinal Studies, Crowdsourcing Platforms, Surveys, Online Sampling, Amazon Mechanical Turk, Prolific, Toloka},
 note = {Under Publication. Journal Ranks: Journal Citation Reports (JCR) Q2 (2023), Scimago (SJR) Q2 (2023)},
+abstract = {Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by requesters and performed only once by crowd workers, others require the same worker to perform the same task or a variant of it more than once, thus participating in a so-called longitudinal study. Despite the prevalence of longitudinal studies in crowdsourcing, there is a limited understanding of factors that influence worker participation in them across different crowdsourcing marketplaces. We present results from a large-scale survey of 300 workers on 3 different micro-task crowdsourcing platforms: Amazon Mechanical Turk, Prolific and Toloka. The aim is to understand how longitudinal studies are performed using crowdsourcing. We collect answers about 547 experiences and we analyze them both quantitatively and qualitatively. We synthesize 17 take-home messages about longitudinal studies together with 8 recommendations for task requesters and 5 best practices for crowdsourcing platforms to adequately conduct and support such kinds of studies. We release the survey and the data at: https://osf.io/h4du9/.}
 }
