
Commit 13f116e

deploy: 8e117d1

Miccighel committed Jun 22, 2024
1 parent 245d890 commit 13f116e
Showing 14 changed files with 24 additions and 9 deletions.
index.html: 5 additions, 0 deletions
@@ -631,6 +631,11 @@ <h2 id=view-detailspublication><a href=./publication/>View details</a></h2>
<div class="section-subheading article-title mb-0 mt-0">
<a href=/publication/journal-paper-tsc-2024/>Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Studies On Crowdsourcing Platforms</a>
</div>
+<a href=/publication/journal-paper-tsc-2024/ class=summary-link>
+<div class=article-style>
+Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by …
+</div>
+</a>
<div class="stream-meta article-metadata">
<div>
<span>
index.json: 1 addition, 1 deletion

Large diffs are not rendered by default.

publication-type/2/index.html: 1 addition, 0 deletions
@@ -141,6 +141,7 @@ <h1>2</h1>
<div>
<h2><a href=/publication/journal-paper-tsc-2024/>Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Studies On Crowdsourcing Platforms</a></h2>
<div class=article-style>
+Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by requesters and performed only once by crowd workers, others require the same worker to perform the same task or a …
</div>
</div>
<div>
publication/journal-paper-ipm-2024-2/index.html: 1 addition, 1 deletion
@@ -234,8 +234,8 @@ <h3>Related</h3>
<ul>
<li><a href=/publication/workshop-paper-iir-2023/>Fact-Checking at Scale with Crowdsourcing Experiments and Lessons Learned</a></li>
<li><a href=/publication/journal-paper-ipm-2024/>Cognitive Biases in Fact-Checking and Their Countermeasures: A Review</a></li>
-<li><a href=/publication/journal-paper-ipm-2021/>The Many Dimensions of Truthfulness: Crowdsourcing Misinformation Assessments on a Multidimensional Scale</a></li>
<li><a href=/publication/journal-paper-puc-2021/>Can The Crowd Judge Truthfulness? A Longitudinal Study on Recent Misinformation About COVID-19</a></li>
+<li><a href=/publication/journal-paper-ipm-2021/>The Many Dimensions of Truthfulness: Crowdsourcing Misinformation Assessments on a Multidimensional Scale</a></li>
<li><a href=/publication/conference-paper-facct-2022/>The Effects of Crowd Worker Biases in Fact-Checking Tasks</a></li>
</ul>
</div>
publication/journal-paper-tsc-2024/index.html: 7 additions, 5 deletions
@@ -5,7 +5,7 @@
<meta http-equiv=x-ua-compatible content="IE=edge">
<meta name=generator content="Wowchemy 5.1.0 for Hugo">
<meta name=author content="Michael Soprano">
-<meta name=description content>
+<meta name=description content="Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by requesters and performed only once by crowd workers, others require the same worker to perform the same task or a variant of it more than once, thus participating in a so-called longitudinal study. Despite the prevalence of longitudinal studies in crowdsourcing, there is a limited understanding of factors that influence worker participation in them across different crowdsourcing marketplaces. We present results from a large-scale survey of 300 workers on 3 different micro-task crowdsourcing platforms: Amazon Mechanical Turk, Prolific and Toloka. The aim is to understand how longitudinal studies are performed using crowdsourcing. We collect answers about 547 experiences and we analyze them both quantitatively and qualitatively. We synthesize 17 take-home messages about longitudinal studies together with 8 recommendations for task requesters and 5 best practices for crowdsourcing platforms to adequately conduct and support such kinds of studies. We release the survey and the data at: https://osf.io/h4du9/.">
<link rel=alternate hreflang=en-us href=https://michaelsoprano.com/publication/journal-paper-tsc-2024/>
<link rel=preconnect href=https://fonts.gstatic.com crossorigin>
<meta name=theme-color content="#3f51b5">
@@ -31,11 +31,11 @@
<meta property="og:site_name" content="Michael Soprano">
<meta property="og:url" content="https://michaelsoprano.com/publication/journal-paper-tsc-2024/">
<meta property="og:title" content="Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Studies On Crowdsourcing Platforms | Michael Soprano">
<meta property="og:description" content><meta property="og:image" content="https://michaelsoprano.com/media/icon_hu4525676fb5a133c3f276d8a3d210c954_1943161_512x512_fill_lanczos_center_3.png">
<meta property="og:description" content="Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by requesters and performed only once by crowd workers, others require the same worker to perform the same task or a variant of it more than once, thus participating in a so-called longitudinal study. Despite the prevalence of longitudinal studies in crowdsourcing, there is a limited understanding of factors that influence worker participation in them across different crowdsourcing marketplaces. We present results from a large-scale survey of 300 workers on 3 different micro-task crowdsourcing platforms: Amazon Mechanical Turk, Prolific and Toloka. The aim is to understand how longitudinal studies are performed using crowdsourcing. We collect answers about 547 experiences and we analyze them both quantitatively and qualitatively. We synthesize 17 take-home messages about longitudinal studies together with 8 recommendations for task requesters and 5 best practices for crowdsourcing platforms to adequately conduct and support such kinds of studies. We release the survey and the data at: https://osf.io/h4du9/."><meta property="og:image" content="https://michaelsoprano.com/media/icon_hu4525676fb5a133c3f276d8a3d210c954_1943161_512x512_fill_lanczos_center_3.png">
<meta property="twitter:image" content="https://michaelsoprano.com/media/icon_hu4525676fb5a133c3f276d8a3d210c954_1943161_512x512_fill_lanczos_center_3.png"><meta property="og:locale" content="en-us">
<meta property="article:published_time" content="2024-06-11T15:00:00+01:00">
<meta property="article:modified_time" content="2024-06-21T18:25:07+02:00">
<script type=application/ld+json>{"@context":"https://schema.org","@type":"Article","mainEntityOfPage":{"@type":"WebPage","@id":"https://michaelsoprano.com/publication/journal-paper-tsc-2024/"},"headline":"Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Studies On Crowdsourcing Platforms","datePublished":"2024-06-11T15:00:00+01:00","dateModified":"2024-06-21T18:25:07+02:00","author":{"@type":"Person","name":"Michael Soprano"},"publisher":{"@type":"Organization","name":"University of Udine","logo":{"@type":"ImageObject","url":"https://michaelsoprano.com/media/icon_hu4525676fb5a133c3f276d8a3d210c954_1943161_192x192_fill_lanczos_center_3.png"}},"description":""}</script>
<meta property="article:modified_time" content="2024-06-22T16:50:03+02:00">
<script type=application/ld+json>{"@context":"https://schema.org","@type":"Article","mainEntityOfPage":{"@type":"WebPage","@id":"https://michaelsoprano.com/publication/journal-paper-tsc-2024/"},"headline":"Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Studies On Crowdsourcing Platforms","datePublished":"2024-06-11T15:00:00+01:00","dateModified":"2024-06-22T16:50:03+02:00","author":{"@type":"Person","name":"Michael Soprano"},"publisher":{"@type":"Organization","name":"University of Udine","logo":{"@type":"ImageObject","url":"https://michaelsoprano.com/media/icon_hu4525676fb5a133c3f276d8a3d210c954_1943161_192x192_fill_lanczos_center_3.png"}},"description":"Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by requesters and performed only once by crowd workers, others require the same worker to perform the same task or a variant of it more than once, thus participating in a so-called longitudinal study. Despite the prevalence of longitudinal studies in crowdsourcing, there is a limited understanding of factors that influence worker participation in them across different crowdsourcing marketplaces. We present results from a large-scale survey of 300 workers on 3 different micro-task crowdsourcing platforms: Amazon Mechanical Turk, Prolific and Toloka. The aim is to understand how longitudinal studies are performed using crowdsourcing. We collect answers about 547 experiences and we analyze them both quantitatively and qualitatively. We synthesize 17 take-home messages about longitudinal studies together with 8 recommendations for task requesters and 5 best practices for crowdsourcing platforms to adequately conduct and support such kinds of studies. We release the survey and the data at: https://osf.io/h4du9/."}</script>
<script src=https://cdnjs.cloudflare.com/ajax/libs/cookieconsent2/3.1.1/cookieconsent.min.js integrity="sha256-5VhCqFam2Cn+yjw61zbBNrbHVJ6SRydPeKopYlngbiQ=" crossorigin=anonymous></script>
<link rel=stylesheet href=https://cdnjs.cloudflare.com/ajax/libs/cookieconsent2/3.1.1/cookieconsent.min.css integrity="sha256-zQ0LblD/Af8vOppw18+2anxsuaz3pWYyVWi+bTvTH8Q=" crossorigin=anonymous>
<script>window.addEventListener("load",function(){window.cookieconsent.initialise({palette:{popup:{background:"#3f51b5",text:"rgb(255, 255, 255)"},button:{background:"rgb(255, 255, 255)",text:"#3f51b5"}},theme:"classic",content:{message:"This website uses cookies to ensure you get the best experience on our website.",dismiss:"Got it!",link:"Learn more",href:"/privacy/"}})})</script>
@@ -158,6 +158,8 @@ <h1>Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Stu
</div>
</div>
<div class=article-container>
+<h3>Abstract</h3>
+<p class=pub-abstract>Crowdsourcing tasks have been widely used to collect a large number of human labels at scale. While some of these tasks are deployed by requesters and performed only once by crowd workers, others require the same worker to perform the same task or a variant of it more than once, thus participating in a so-called longitudinal study. Despite the prevalence of longitudinal studies in crowdsourcing, there is a limited understanding of factors that influence worker participation in them across different crowdsourcing marketplaces. We present results from a large-scale survey of 300 workers on 3 different micro-task crowdsourcing platforms: Amazon Mechanical Turk, Prolific and Toloka. The aim is to understand how longitudinal studies are performed using crowdsourcing. We collect answers about 547 experiences and we analyze them both quantitatively and qualitatively. We synthesize 17 take-home messages about longitudinal studies together with 8 recommendations for task requesters and 5 best practices for crowdsourcing platforms to adequately conduct and support such kinds of studies. We release the survey and the data at: <a href=https://osf.io/h4du9/>https://osf.io/h4du9/</a>.</p>
<div class=row>
<div class=col-md-1></div>
<div class=col-md-10>
@@ -178,7 +180,7 @@ <h1>Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Stu
<div class=col-md-10>
<div class=row>
<div class="col-12 col-md-3 pub-row-heading">Publication</div>
<div class="col-12 col-md-9"><em>ACM Transactions on Social Computing. Journal Ranks: Journal Citation Reports (JCR) Q2 (2023), Scimago (SJR) Q2 (2023). Under Publication.</em></div>
<div class="col-12 col-md-9"><em>Under Publication in ACM Transactions on Social Computing. Journal Ranks: Journal Citation Reports (JCR) Q2 (2023), Scimago (SJR) Q2 (2023).</em></div>
</div>
</div>
<div class=col-md-1></div>
Diffs for the remaining 9 changed files are not rendered.
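Note on the change pattern above: the identical abstract text lands in four places of the regenerated page (the meta description, og:description, the JSON-LD description, and the new Abstract section), which is consistent with Wowchemy/Hugo expanding a single abstract front-matter field at build time. A minimal sketch of what the presumed source file might look like, assuming standard Wowchemy conventions; the path, field layout, and truncation are assumptions, since this deploy commit contains only the generated HTML:

```yaml
---
# Hypothetical source: content/publication/journal-paper-tsc-2024/index.md
# (assumed path and layout; the deploy diff shows only the generated site)
title: "Longitudinal Loyalty: Understanding The Barriers To Running Longitudinal Studies On Crowdsourcing Platforms"
publication: "*ACM Transactions on Social Computing*"
date: "2024-06-11T15:00:00+01:00"
# Abstract truncated here for brevity; the full text appears in the diff above.
abstract: >-
  Crowdsourcing tasks have been widely used to collect a large number of
  human labels at scale. [...] We release the survey and the data at:
  https://osf.io/h4du9/.
---
```

Filling the abstract field and rebuilding the site with hugo would plausibly regenerate every description tag and abstract block touched by this commit.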
