May 28, 2009
Brown and Lambert Report on Their Crowdsourcing Experiment for Law Firm Work
Toby Brown and Greg Lambert decided to run an experiment to see how well crowdsourcing might work for a firm. "We wanted to show immediate value and cost savings and to demonstrate which types of tasks might be handled this way." Using Mechanical Turk (MTurk), Brown and Lambert offered to compensate MTurk users to supply information about General Counsels and provide bullet-point and 100-word reviews of cited legal articles.
About their two-week experiment, Lambert writes:
I have to admit that I was pretty impressed with the quality of the work. Regardless of whether we paid 25 or 50 cents, the work was very good. I'm also stunned by the seriousness with which the MTurkers seem to treat the quality of their work. Take a look at the last bullet point of the Finnegan Henderson article. An MTurker posted a comment saying that they had some difficulty with the article but hoped that their results were "good enough" for us. That really impressed me.
The more I test the MTurk idea, the more I see potential in crowdsourcing a number of projects that we'd love to do within the law firm setting but generally don't have the staffing to complete. We'll break down some of the other MTurk projects we tested over the past week and show you what we've found to be the pros and cons of crowdsourcing.
Very interesting. [JH]