Why participants get banned
Our mission is to make trustworthy data more accessible to everybody. We strive to empower great research by connecting researchers with participants from around the world.
Our mission statement is at the core of everything we do, for our researchers and for our participants. We want to make sure you trust us, and that you trust each other.
We've made this our mission because we believe the best decisions are backed up with data. For innovation and society to keep moving forward, it's typically a good idea to take the time to reflect, do your research and use it to grow!
In 2020, industries across the globe had to adapt to 'the new normal' and change the way they work. Research is no exception! With an increasing number of researchers moving online, it is more important than ever that action is taken to ensure the highest data quality possible.
This post will explain some of the actions we take to maintain high data quality and why this sometimes results in participants being banned.
To uphold a high standard of data quality on our platform, we run checks on participants when they first sign up. We then run periodic checks on their accounts throughout their time on Prolific.
Our first port of call when it comes to checking data quality is participant verification. We make sure there's a person behind that participant ID and check participants are who, and where, they say they are. We verify a participant's:
A participant is fully verified once they have confirmed the first three things on this list. Only once they're verified will they receive studies. We introduced ID verification in November 2020 and we're thrilled with the level of certainty it gives us and our researchers, and the additional layer of trust it helps build.
If a participant can't verify one of these things, unfortunately they can't participate. We're not doing this to penalise potential good participants, we're doing this to make sure we maintain a safe and trustworthy platform.
We're constantly reviewing our processes to make sure they remain fair and inclusive. We have a fantastic, diverse group of participants and we want to increase this diversity as we grow. We will therefore do what we can to make the site as accessible as possible.
Once verification has been submitted, things get a little more technical. We look at a variety of things including, but not limited to:
Certain IPs and ISPs carry a high risk of VPN and/or proxy usage. We don't allow VPN or proxy usage whilst on Prolific as they can be used to conceal the location of an IP and generally anonymise web browsing. For online research, knowing that the participant is where they say they are is very important.
So, we decided to maintain a very strict list of trusted ISPs for the time being. This decision has allowed us to block a high number of what we call 'bad actors' but unfortunately some good participants using smaller ISPs may be unable to take part right now.
We will be reviewing our ISP allowlist in the near future so good participants can return to the site, but this will be done slowly and strictly so we don't undo all of our good work.
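The allowlist approach described above can be sketched as a simple check. This is purely illustrative: the function name, the placeholder ISP entries, and the exact rule are assumptions, not Prolific's actual implementation.

```python
# Hypothetical sketch of an ISP allowlist check. The list contents and
# names are illustrative placeholders, not Prolific's real trusted ISPs.
TRUSTED_ISPS = {"Example Broadband", "Sample Telecom"}

def is_connection_allowed(isp_name: str, uses_vpn_or_proxy: bool) -> bool:
    """Allow participation only from a trusted ISP with no VPN or proxy."""
    if uses_vpn_or_proxy:
        # VPNs and proxies conceal location, so they're blocked outright
        return False
    return isp_name in TRUSTED_ISPS

print(is_connection_allowed("Example Broadband", False))  # True
print(is_connection_allowed("Example Broadband", True))   # False
print(is_connection_allowed("Unknown ISP", False))        # False
```

The trade-off this models is the one the post describes: a strict allowlist blocks bad actors reliably, at the cost of also blocking some good participants on smaller ISPs.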
Once a trusted participant is verified, they're invited to complete 'My First Study'. We check attention and comprehension and ask them to write a short story about superheroes, while using an internet connection that isn't masked by a VPN or proxy. If this short story is meaningful and intelligible, they are invited to take part in studies!
Participants can then complete studies and submit their responses (their 'submissions') for review. From here, a submission can be approved, returned or rejected. You can read more about what these statuses mean in our Help Centre, starting with: Approved submission status
When a submission gets approved, the participant gets paid by the researcher and each party is happy.
If a submission gets rejected, the participant does not receive the reward. The rejection is recorded on the participant's account, and if a participant receives a certain percentage of rejections, they won't be able to take part in future studies.
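The rejection rule above can be sketched as a ratio check. The post doesn't state the actual threshold, so the 10% figure below is a hypothetical value for illustration only, as are the function and variable names.

```python
# Hypothetical sketch of a rejection-rate check. The real threshold is
# not stated in the post; MAX_REJECTION_RATE is illustrative only.
MAX_REJECTION_RATE = 0.10  # assumed 10% for the sake of the example

def can_take_studies(approved: int, rejected: int) -> bool:
    """Participants above the rejection threshold stop receiving studies."""
    total = approved + rejected
    if total == 0:
        return True  # no submission history yet
    return (rejected / total) <= MAX_REJECTION_RATE

print(can_take_studies(approved=95, rejected=5))   # True  (5% rejected)
print(can_take_studies(approved=80, rejected=20))  # False (20% rejected)
```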
We want to build trust, and part of that means ensuring our participants are diligent and attentive. On the other hand, we also want attention checks to be fair for our participants. We strongly suggest researchers use the guidance in our article: Using attention checks as a measure of data quality
If a researcher finds data from certain participants to be unusable, or suspects duplicate accounts, we encourage them to report the participant(s) to us. We have a report function in-app, which is handy for researchers, and also allows us to keep improving our checks.
We also take feedback via our Researcher Support Request Form. Again, we'll review the account and decide if a ban is suitable.
Participants also have the option to report researchers if they feel they're being treated unfairly. They can give feedback at the end of a study, and tell us via our Participant Support Request Form.
A quick side note: in reports from either researchers or participants, if we see or hear of any bad behaviour (e.g. being rude, threatening, or swearing), we will ban the person responsible. This isn't acceptable in any situation.
We know that sometimes our systems can get things wrong and ban people from our platform incorrectly. If banned participants feel this is the case, they can get in touch with our support team who will review the account and decide whether to uphold or remove the ban.
To give some perspective on this, the number of active users (participants who have been active in the last 90 days) is currently ~140,000 (as of Jan 2021). We calculated that, on average over a 3-month period, our support team removed bans for 133 participants. We think this shows we're pretty accurate with our checks.
Participants, you can use this article as a guide to being a super participant. By giving your full attention and engaging as much as you can, you'll help strengthen our data quality. This will lead to more researchers on the platform which means more top quality studies for you to participate in, and the added bonus of earning extra rewards. Ultimately, being a good participant is good for our researchers, and good for our participants!
Researchers, this means your data quality when recruiting on Prolific is only going to get better (and it's pretty good already!).
One side effect you may encounter is participants being banned between parts of a longitudinal study. If you notice that participants aren't returning for your longitudinal study, we recommend messaging them to check whether they've encountered an issue. If you haven't heard from them in five days, please send the relevant Prolific ID(s) to our researcher support team and they'll be happy to help - Researcher Support Request Form.
We use a whole host of tools and data in our decisions to ban participants. We don't take the decision lightly, and we only ban participants to improve the site for everyone. By removing participants we've identified as 'bad actors' we're improving the experience for participants we see as 'good'.
If you're a participant reading this and you're concerned about being banned, you don't need to be. As long as you're honest and pay attention, you'll continue to have access to studies on Prolific.
If you're a participant who has been banned, it will be due to one or more of the reasons outlined in this post. We're sorry you can't participate in any more studies.
To conclude, we're on the side of honest data quality: we want to empower our researchers and ensure our participants can contribute to research. We're not on the hunt for ill-intentioned participants, but we do want to provide only trustworthy and engaged participants for studies, so we take proactive measures to make this happen.