
A Crowdsourcing Platform Designed for Scientific Research

Ekaterina (Katia) Damer
July 23, 2015

As a junior psychological scientist, I am constantly looking for ways to recruit participants. While I have used online recruitment platforms in the past with reasonable success, I've always mused about some of their obvious drawbacks. Why is it so hard to prescreen? How on earth do they permit such an exploitative reward structure (or rather, lack of one)? Why are some platforms still in beta after ten years?

Let me give one concrete example of the limitations of current crowdsourcing solutions. Imagine you need a very specific sample, say, 20- to 30-year-old women studying engineering. How would you identify them on conventional crowdsourcing sites? You'd basically have two options:
A) explain on your consent form what you're looking for, or
B) build filters into your study without giving away your criteria in advance.

(Approaching a traditional panel company is perhaps a third option, but it's not really feasible for the vast majority of budget-conscious academics, especially postgraduate students.)

Since A) invites dishonesty, the clever researcher will normally go for B). Yet this is where the trouble begins. Because the sample you're looking for (20- to 30-year-old women studying engineering) is very specific, many participants will get screened out of your survey after having already invested time in it. If we zoom out for a second and look at the bigger picture, we quickly realise that this approach is not sustainable, because it is incredibly (and unnecessarily) frustrating for participants or "workers". What is needed is a prescreening system that determines participants' eligibility in advance.
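
To make the idea concrete, here is a minimal sketch of what prescreening amounts to: eligibility is checked against attributes participants have already reported, before anyone is invited, rather than mid-survey. The data structure and function below are purely hypothetical, not Prolific's actual implementation.

```python
# Hypothetical prescreening sketch: participants report their
# attributes once, and eligibility is decided before invitation.
participants = [
    {"id": 1, "age": 24, "sex": "female", "field": "engineering"},
    {"id": 2, "age": 35, "sex": "female", "field": "engineering"},
    {"id": 3, "age": 27, "sex": "male", "field": "biology"},
]

def is_eligible(p):
    """Criteria for this study: women aged 20-30 studying engineering."""
    return (20 <= p["age"] <= 30
            and p["sex"] == "female"
            and p["field"] == "engineering")

# Only eligible participants ever see the study invitation,
# so nobody wastes time on a survey they'll be screened out of.
eligible = [p for p in participants if is_eligible(p)]
print([p["id"] for p in eligible])  # -> [1]
```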

This is one of the key insights that prompted us to build a new crowdsourcing platform tailored to scientific research. We decided to call it, very descriptively, Prolific Academic ("Prolific").

A fair reward structure

We agreed from the very beginning that Prolific must have a reasonable reward system, one that fairly compensates participants for their time. Taking part in scientific online experiments can be fun, and you may feel you're making a meaningful contribution to advancing human knowledge, but that motivation fades over time. You might complete one or two studies hoping to win a voucher in a prize draw, but if you never win anything, you'll eventually give up participating. This is precisely why we set the minimum reward on Prolific at £5.00 (€6.50/$7.50) per hour. To us, this is a sensible, ethical reward structure, and at the same time the key to unlocking participants' sustained motivation to take part in studies.
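
As a quick illustration of how the hourly minimum translates into per-study rewards, the calculation is simple pro-rating; the study durations below are just examples:

```python
MIN_HOURLY_REWARD_GBP = 5.00  # Prolific's minimum reward per hour

def min_reward(duration_minutes):
    """Pro-rate the hourly minimum for a study's estimated duration."""
    return MIN_HOURLY_REWARD_GBP * duration_minutes / 60

print(f"£{min_reward(10):.2f}")  # 10-minute study -> £0.83
print(f"£{min_reward(45):.2f}")  # 45-minute study -> £3.75
```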

In spring 2014, word got out that one of the most heavily used crowdsourcing sites, MTurk, would close its gates to international requesters, right after it had stopped allowing international workers. What a blow to the international research community! And now, taking effect today, MTurk's pricing has quadrupled for certain types of tasks: if you require 10 or more participants for your survey, be ready to pay a 40% commission (and an extra 5% if you use Masters workers). This pricing change affects virtually all scientific researchers, who aim for large sample sizes and high statistical power. As the newly coined hashtag #mturkgate (à la Watergate) reveals, academic requesters are anything but pleased:

[Screenshots: tweets from academic requesters reacting to the #mturkgate pricing change]
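
To put the new pricing in perspective, here is a back-of-the-envelope comparison. It assumes MTurk's previous flat 10% commission (which is what makes the new 40% rate a quadrupling); the sample size and reward are hypothetical:

```python
def total_cost(n_participants, reward, commission, masters=False):
    """Total payout including MTurk's commission on each reward."""
    fee_rate = commission + (0.05 if masters else 0.0)
    return n_participants * reward * (1 + fee_rate)

n, reward = 100, 1.00  # hypothetical: 100 participants at $1.00 each
print(f"${total_cost(n, reward, 0.10):.2f}")        # old 10% fee:  $110.00
print(f"${total_cost(n, reward, 0.40):.2f}")        # new 40% fee:  $140.00
print(f"${total_cost(n, reward, 0.40, True):.2f}")  # + Masters 5%: $145.00
```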

Next-generation crowdsourcing
Where does this leave us? If one thing is clear, it's that there's enormous untapped potential in scientific crowdsourcing. Most universities still stick to their undergraduate participant pools, refusing to share them with each other, perhaps for political or administrative reasons. (Or maybe it's just inertia, who knows.) Whenever the undergraduate pool is stretched to capacity, or more diverse samples are needed, academics venture out to sites like MTurk. There's no doubt that Amazon Mechanical Turk pioneered the DIY-style crowdsourcing format. But we suggest it's time to check out the new alternatives that have entered the playing field, offering cost-effective, nifty solutions.

On Prolific, we now have almost 12,000 international participants who contribute to scientific studies and earn cash rewards, either for themselves or for one of two chosen charities (Save the Children and Cancer Research UK). It takes researchers less than 10 minutes to start data collection, and many studies are completed in under an hour. Here's a cool video created by one of our researchers, Andy Woods, explaining how Prolific works. Thanks, Andy!

If you're looking for an alternative to MTurk and would like to give Prolific a try, you can run your first £10 study for free. Plus, for a limited time, you can choose what commission you pay us; it can be as low as 0%. It's totally up to you!