conciselyverbose (edited)

I think sorting out actual quality reviews is harder than people think. Even something like Steam, where the cumulative user rating is relatively respected, surfaces a lot of junk reviews, because people respond to meme-ing and jokey shitposts more than to actual high quality reviews. Even for a behemoth like Amazon, the signals available to train an AI on just aren't amazing. I know Fakespot looks for outright fraud that Amazon doesn't, but I think part of their success is that they're not the benchmark cheaters are trying to beat. In any case, "genuine" reviews and "quality" reviews aren't the same thing, and the latter is really hard to measure.

I think a more robust set of curation tools would have some value. Flipboard has been mentioned a bit lately for articles, and while I haven't used it, my impression is that you subscribe to curated lists around different interests. Something like that, where reviewers who catch the eye of curators get surfaced, could be interesting for a federated book platform.

My main issue with the article is the premise that "professional" reviewers are objectively any higher quality on average than user reviews. A sizable proportion of them are very detached from what real people care about. I absolutely read non-fiction critically, and I'm somewhat judgy if a certain rigor isn't applied, but for fiction? How is that fun? It's OK for a story to just be cheap fun. It's OK for different authors to have different writing styles, different levels of attention to detail, and different levels of grittiness in their stories. There is absolutely actual bad writing out there, and some of it gets published, but a story not being for you doesn't mean its voice doesn't connect with someone else. A lot of book critics are huge snobs.
