Export Publication
The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS.
APA
Harb, Y., Moro, S., & Harb, A. (N/A). AI business writing assistant tools in practice: Text mining and statistical analysis of user experience. International Journal of Human–Computer Interaction. N/A
IEEE
Y. Harb et al., "AI business writing assistant tools in practice: Text mining and statistical analysis of user experience", Int. Journal of Human–Computer Interaction, vol. N/A, N/A
BibTeX
@article{harbN/A_1764929727070,
  author  = "Harb, Y. and Moro, S. and Harb, A.",
  title   = "AI business writing assistant tools in practice: Text mining and statistical analysis of user experience",
  journal = "International Journal of Human–Computer Interaction",
  year    = "N/A",
  volume  = "N/A",
  number  = "",
  doi     = "10.1080/10447318.2025.2563743",
  url     = "https://www.tandfonline.com/journals/hihc20"
}
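The BibTeX entry above can also be consumed programmatically. A minimal sketch, assuming the simple `key = "value"` field style shown here (the `parse_bibtex_fields` helper name is ours, and this is not a full BibTeX parser — it ignores braces, `@string` macros, and nested quotes):

```python
import re

def parse_bibtex_fields(entry):
    """Extract key = "value" pairs from a simple BibTeX entry (sketch only)."""
    # Each match yields a (key, value) tuple; dict() collects them.
    return dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))

# Hypothetical entry in the same style as the export above.
entry = '''@article{key_example,
author = "Harb, Y. and Moro, S. and Harb, A.",
doi = "10.1080/10447318.2025.2563743",
year = "N/A"
}'''

fields = parse_bibtex_fields(entry)
print(fields["doi"])  # → 10.1080/10447318.2025.2563743
# BibTeX separates multiple authors with " and ":
authors = fields["author"].split(" and ")
```

For anything beyond this simple quoted-value style, a dedicated BibTeX library is the safer choice.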
RIS
TY - JOUR
TI - AI business writing assistant tools in practice: Text mining and statistical analysis of user experience
T2 - International Journal of Human–Computer Interaction
VL - N/A
AU - Harb, Y.
AU - Moro, S.
AU - Harb, A.
PY - N/A
SN - 1044-7318
DO - 10.1080/10447318.2025.2563743
UR - https://www.tandfonline.com/journals/hihc20
AB - This study employs a multi-method quantitative research design to review various popular AI writing assistant tools using 11562 actual users’ satisfied and dissatisfied reviews. The research design applies Kruskal-Wallis test to differentiate between AI writing assistants based on user rating, functionality, ease of use (EOU), value for money, and customer support attributes; uses the quantitative text analysis to identify satisfaction and dissatisfaction attributes; and employs topic modeling to uncover the specific related attributes that influence user satisfaction. The statistical analysis results indicated that there is a significant difference among AI writing assistant tools according to user rating, functionality, EOU, and customer support attributes. The text analysis revealed the attributes leading to user satisfaction and dissatisfaction. Topic modeling further identified both common and tool-specific attributes. This study provides a relatively comprehensive set of attributes and sub-attributes for evaluating AI writing assistant tools, which can serve as a foundation for future research.
ER -
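RIS is a line-oriented format: each line carries a two-letter tag, a `" - "` separator, and a value, and repeatable tags such as `AU` appear once per author. A minimal sketch of reading such a record into a dictionary (the `parse_ris` helper name is ours; real-world RIS files may use a two-space separator, `TAG  - value`, which this sketch does not handle):

```python
def parse_ris(text):
    """Parse a RIS record into a dict mapping tags to lists of values."""
    record = {}
    for line in text.strip().splitlines():
        tag, sep, value = line.partition(" - ")
        if not sep:  # skip lines without a separator, e.g. the bare "ER -" terminator
            continue
        # Repeatable tags (AU, KW, ...) accumulate into a list.
        record.setdefault(tag.strip(), []).append(value.strip())
    return record

# Hypothetical abbreviated record in the same shape as the export above.
sample = """TY - JOUR
TI - AI business writing assistant tools in practice
AU - Harb, Y.
AU - Moro, S.
DO - 10.1080/10447318.2025.2563743
ER -"""

rec = parse_ris(sample)
print(rec["AU"])  # → ['Harb, Y.', 'Moro, S.']
```

Keeping every value in a list, even for tags that occur once, avoids special-casing the repeatable author and keyword tags.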