An annotation task is to be completed by German platform workers and should therefore pay at least the German minimum wage of 9.19 € per hour. First, the client or platform generates a reasonable estimate of how long the task takes an average qualified worker. To do this, they may “pilot” the task in-house to get a rough estimate — but in-house workers may be more qualified than platform workers and may therefore do the task faster. So a second “pilot” with platform workers can be helpful. Depending on the capabilities of the crowdsourcing platform, the qualifications of these workers can be specified and they can be asked to complete the task without interruption.
Suppose the “pilot” is run with 10 in-house employees who do the task without interruption, and they take the following times (in minutes):
8 9 10.5 12 15 19 20 20 20 21
Suppose the online pilot with platform workers is done with 25 workers under the following conditions: (a) they are fluent speakers of the task language, (b) they are experienced with this kind of task, (c) they are told to complete the task without interruption, and (d) the task interface monitors whether they appear to have stopped working. Of the 25 workers, 4 are interrupted and their times are removed from the final data. The remaining 21 workers do the task with the following times:
11 12 18 19 20 21 22 23 24 24 25 25 25 25 26 26 26 27 29 33 34
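As a quick sanity check, the two pilot distributions can be compared directly; this confirms the point above that in-house workers tend to be faster than platform workers, so pricing from the in-house pilot alone would underpay. A minimal sketch using the example times:

```python
from statistics import mean, median

# Pilot completion times in minutes, taken from the example above.
inhouse = [8, 9, 10.5, 12, 15, 19, 20, 20, 20, 21]
platform = [11, 12, 18, 19, 20, 21, 22, 23, 24, 24, 25,
            25, 25, 25, 26, 26, 26, 27, 29, 33, 34]

# The platform medians and means come out noticeably higher.
print(f"in-house: mean {mean(inhouse):.1f} min, median {median(inhouse)} min")
print(f"platform: mean {mean(platform):.1f} min, median {median(platform)} min")
```

Here the in-house median is 17 minutes against a platform median of 25 minutes, a substantial gap that a rate based only on the in-house pilot would miss.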
We can set the initial pay rate by ensuring that the fastest 75% of workers earn at least German minimum wage. With 21 workers, 75% rounds up to the fastest 16, so the pay must cover the slowest of those 16. In this case, the 16th fastest worker took 26 minutes to do the task, so the pay per task should be at least 26 / 60 × 9.19 € ≈ 3.98 €.
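This calculation can be sketched in a few lines; the 75% coverage target and the 9.19 € hourly rate are taken from the example above:

```python
import math

HOURLY_RATE = 9.19   # German minimum wage in €/hour (from the example)
COVERAGE = 0.75      # fraction of workers who must earn at least minimum wage

# Platform pilot completion times in minutes.
times = sorted([11, 12, 18, 19, 20, 21, 22, 23, 24, 24, 25,
                25, 25, 25, 26, 26, 26, 27, 29, 33, 34])

# Number of workers inside the fastest 75% (0.75 * 21 = 15.75, rounded up to 16).
k = math.ceil(COVERAGE * len(times))

# Pay is set by the slowest worker still inside that group.
cutoff_minutes = times[k - 1]           # 16th fastest time: 26 minutes
pay = cutoff_minutes / 60 * HOURLY_RATE
print(f"pay at least {pay:.2f} € per task")  # 26/60 * 9.19 = 3.98 €
```

Anyone finishing in 26 minutes or less then earns at least 9.19 € per hour at this per-task rate.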
Finally, completion times should be monitored as the task is run to ensure that the actual workers completing the tasks are not taking significantly longer for some unexpected reason. The person monitoring the tasks should be qualified to assess the situation and authorized to pay workers more or make changes to the task design or description “on the fly” as needed.
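One way to support that monitoring is to automatically flag completion times that are far above the pilot distribution, so a human can investigate. This is only an illustrative sketch: the 1.5× median cutoff is an arbitrary assumption, not a recommendation, and a flag should prompt review rather than any automatic action.

```python
from statistics import median

# Platform pilot completion times in minutes (from the example above).
pilot_times = [11, 12, 18, 19, 20, 21, 22, 23, 24, 24, 25,
               25, 25, 25, 26, 26, 26, 27, 29, 33, 34]

# Illustrative threshold: 1.5x the pilot median (here 1.5 * 25 = 37.5 min).
threshold = 1.5 * median(pilot_times)

def flag_slow(live_times, threshold):
    """Return completion times that warrant a human look."""
    return [t for t in live_times if t > threshold]

# Hypothetical live times as the task runs.
print(flag_slow([24, 26, 52, 31, 60], threshold))  # -> [52, 60]
```

The flagged times then go to the person monitoring the task, who decides whether to raise pay or revise the task.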
Platform workers, like any other workers, generally need to be managed — especially the first few times a client runs a task. It may eventually be possible to achieve “frictionless” or “fire and forget” crowdsourcing, but this is not the norm the first time a task is run. Task descriptions, technology, and evaluation processes must first be tested and improved!