Designing an Automatic and Immediate Feedback Assessment Feature for the CrowdSurfer Extension

  • Type: Bachelor / Master Thesis
  • Date: immediately
  • Supervisor:

    Saskia Haug

  • Links: CrowdSurfer in the Chrome Web Store

    Problem

    The CrowdSurfer is a Chrome extension for collecting in situ design feedback from crowdworkers. An experimental study showed that although crowdworkers perceived their feedback as more relevant and authentic, its quality was lower than that of feedback collected via a simple survey. Our interviews revealed that this may be because crowdworkers do not know what exactly is expected of them. Moreover, their feedback often captures their first thought about an element, without much reflection.

    Idea

    To help crowdworkers understand the requirements of high-quality design feedback, we want to offer them an automatic and immediate assessment of their feedback using NLP algorithms. The development of these algorithms can build on an existing assessment of 600 feedback comments collected with the CrowdSurfer, plus an additional 2,000 design feedback comments collected in a separate study on design feedback.
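
    A minimal sketch of how the labeled corpus might be represented for training. The category names, rating scale, example comments, and the `average_rating` helper are illustrative assumptions, not the actual dataset schema:

    ```python
    from dataclasses import dataclass

    @dataclass
    class RatedComment:
        """One feedback comment with per-category quality ratings (assumed 1-5 scale)."""
        text: str
        relevance: int
        specificity: int

    # Hypothetical examples standing in for the 2,600 labeled comments.
    corpus = [
        RatedComment("Looks nice.", relevance=2, specificity=1),
        RatedComment("The submit button's grey text is hard to read "
                     "against the white background.", relevance=5, specificity=5),
    ]

    def average_rating(c: RatedComment) -> float:
        """Collapse per-category ratings into a single overall quality score."""
        return (c.relevance + c.specificity) / 2
    ```

    A supervised model would then be fit on pairs of comment text and these ratings, predicting each category separately so the panel can report per-category scores.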

    Goal

    The goal of the thesis is to design a feedback assessment panel, implemented in the CrowdSurfer, that shows crowdworkers how well their feedback performs in specific categories (e.g., relevance or specificity). In a first step, features that correlate with feedback quality need to be extracted. Then, an NLP model needs to be trained to automatically predict the ratings of feedback comments. Finally, this prediction shall be integrated into the existing CrowdSurfer extension.
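
    The first two steps could be prototyped roughly as follows; the surface features and the hand-weighted linear scorer are placeholder assumptions that feature selection on the labeled corpus and a trained NLP model would replace:

    ```python
    import re

    def extract_features(comment: str) -> dict:
        """Hand-crafted surface features that may correlate with feedback quality
        (the feature choice here is an assumption, not the thesis's final set)."""
        words = re.findall(r"[a-z']+", comment.lower())
        vague_words = {"nice", "good", "bad", "ok", "fine", "maybe"}
        return {
            "length": len(words),
            "has_ui_reference": int(bool(
                re.search(r"button|menu|font|color|text|layout", comment.lower()))),
            "vague_word_ratio": sum(w in vague_words for w in words) / max(len(words), 1),
        }

    def score_specificity(features: dict) -> float:
        """Toy linear scorer in [0, 1] standing in for the trained model."""
        raw = (0.05 * features["length"]
               + 0.4 * features["has_ui_reference"]
               - 0.5 * features["vague_word_ratio"])
        return max(0.0, min(1.0, raw))
    ```

    The integration step would call such a scorer from the extension each time a crowdworker submits a comment and render the per-category results in the assessment panel.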