ABSTRACT
This study analyzes the “tax talk” in online client reviews of tax preparers and evaluates how well off-the-shelf (SentiStrength, LIWC2007, and DICTION 6.0) and customized software packages detect sentiment in these reviews. Compared to human-coded sentiment, the three off-the-shelf programs assess client sentiment poorly. We adapted two of the packages, SentiStrength and LIWC2007, to “tax talk” by splitting the sample into learning and holdout segments (n = 50 each). SentiStrength with a customized dictionary and keywords showed high validity. These results suggest that client reviews of tax preparers contain unique taxation language.
Data Availability: Data used in this study are available from the corresponding author upon request.