Abstract: Large-scale socially generated metadata, such as user-contributed tags, comments, and ratings, is one of the key features driving the growth and success of the emerging Social Web. While tags and ratings provide succinct metadata about Social Web content (e.g., a tag is often a single keyword), user-contributed comments promise a rich source of contextual information, but in a potentially "messier" form, given the wide variability in the quality, style, and substance of comments produced by a legion of Social Web participants. In this paper, we study how an online community perceives the relative quality of its own user-contributed comments, a question with important implications for the successful self-regulation and growth of the Social Web in the face of increasing spam and a flood of Social Web metadata. Concretely, we propose and evaluate a machine learning-based approach for ranking Social Web comments according to the community's expressed preferences, which can be used to promote high-quality comments and filter out low-quality ones. We study several factors that influence community preference, including the contributor's reputation and community activity level, as well as the complexity and richness of the comment itself. Through experiments over three social news platforms (Digg, Reddit, and the New York Times), we find that the proposed approach yields significant improvements in ranking quality over alternative approaches.
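As a rough illustration of the kind of preference-based ranking the abstract describes, the sketch below trains a perceptron-style pairwise ranker over per-comment features. The feature set (author reputation, author activity, comment length, vocabulary richness), the training pairs, and the learning procedure are all illustrative assumptions, not the method or data from the paper itself.

```python
def rank_score(weights, features):
    """Linear scoring function over a comment's feature vector."""
    return sum(w * f for w, f in zip(weights, features))


def train_pairwise(pairs, n_features, epochs=50, lr=0.1):
    """Perceptron-style pairwise ranker: for each (better, worse) pair of
    comment feature vectors, nudge the weights so that the community's
    preferred comment receives the higher score."""
    weights = [0.0] * n_features
    for _ in range(epochs):
        for better, worse in pairs:
            if rank_score(weights, better) <= rank_score(weights, worse):
                for i in range(n_features):
                    weights[i] += lr * (better[i] - worse[i])
    return weights


# Hypothetical feature vectors: (author_reputation, author_activity,
# comment_length, vocabulary_richness), all scaled to [0, 1].
# In each toy pair, the community prefers the first comment.
pairs = [
    ((0.9, 0.7, 0.6, 0.8), (0.2, 0.1, 0.3, 0.2)),
    ((0.8, 0.5, 0.7, 0.9), (0.3, 0.4, 0.2, 0.1)),
]
w = train_pairwise(pairs, n_features=4)
```

Scoring unseen comments with `rank_score(w, features)` then orders them by predicted community preference, which could be used to promote high-quality comments or filter low-quality ones.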