A R T I C L E I N F O

Keywords:
Continuance intention
Fintech chatbots
Potential growth model
Social capital
Social response theory

A B S T R A C T

As fintech chatbots become popular in online banking services, most banks have discovered their business potential. To understand why fintech chatbots can be adopted quickly and widely to realize commercial benefits, this research uses social response theory to uncover the continuance intention mechanism behind fintech chatbots. The model includes social capital (as social cues) and attitudes toward fintech chatbots to describe how social cues grounded in social response theory evoke users' social behaviors, which in turn may affect continuance intention. Based on the potential growth model, the growth trends of these variables and the relationships among them were analyzed using longitudinal data from 455 fintech chatbot users in Taiwan, collected in three stages over six months. The results support all hypotheses and can help vendors understand how to enhance users' continuance intention toward fintech chatbots.
1. Introduction

Artificial intelligence robot services have gradually started various technological revolutions and established industry principles (Delgosha & Hajiheydari, 2021; International Federation of Robotics, 2017). However, although these virtual banking services use artificial intelligence to achieve commercial interests, many users still prefer physical banks (Thusyanthy & Senthilnathan, 2017). As more users switch to online banking services, those services appear to lack the social presence of physical interactions. Indeed, in the online banking setting, previous research has examined the driving variables of continuance intention (Montazemi & Qahri-Saremi, 2015) and concluded that the drivers are almost entirely utilitarian. However, bankers are starting to consider how to build human-like characteristics into their online services (e.g., fintech chatbots) to simulate physical interactions. A fintech chatbot is a chat service that automatically responds to users with financial text in a human-like manner, including internet links, structured text, images, or specific command buttons. In particular, the interaction context of a chatbot resembles a conversation between friends or family (Bayerque, 2016; Hill, Ford, & Farreras, 2015; Jang, Jung, & Kim, 2021). For example, if you plan to invest in a fund, you can ask a fintech chatbot for professional suggestions instead of spending time and energy collecting information, and you can then use a call-to-action button to invest in the fund instantly through the chatbot. As the above example suggests, although fintech chatbots are gradually becoming popular in online interactions, it is still unclear which mechanism shapes users' perceptions of them. Although considerable resources have been invested in optimizing the artificial intelligence algorithms and humanized interfaces of fintech chatbots (Wigglesworth, 2016), only recently has research begun to study the psychological impact of these social cues on users. Indeed, past research has confirmed that online services with avatars (human-like characters) can effectively persuade consumers and increase their positive loyalty (e.g., continuance intention) (Teng, 2019). Continuance intention (CI) indicates the likelihood of continuing to use a certain information technology system in the future (Bhattacherjee, 2001).

Past research on CI in the information systems (IS) field has not attracted enough attention (Amoroso & Lim, 2017) because of the insufficiency of research on this issue. Although past research has examined various aspects of CI (e.g., Dai, Teo, & Rappa, 2020; Gan & Li, 2018; Li et al., 2018), this stream is still somewhat immature. Past research on CI can be grouped into three streams. The first employs social network theory to predict CI (e.g., Chang & Zhu, 2012; Zhang, Li, Wu, & Li, 2017). The second employs continuance theory to predict CI (e.g., Bøe, Gulbrandsen, & Sørebø, 2015; Foroughi, Iranmanesh, & Hyun, 2019). The third combines different IS adoption models into new models to explain CI (e.g., Lu, Yu, Liu, & Wei, 2017; Wu & Chen, 2017). This research opens a new stream that predicts CI through social response theory. In academic terms, because CI is not just a trivial

* Corresponding author. Master Program of Financial Technology, School of Financial Technology, Ming Chuan University, No. 130, Jihe Rd., Shihlin District, Taipei City 111, Taiwan.
E-mail addresses: [email protected] (S.Y.B. Huang), [email protected] (C.-J. Lee).
https://doi.org/10.1016/j.chb.2021.107027
Received 17 August 2020; Received in revised form 6 September 2021; Accepted 16 September 2021
Available online 1 January 2022
0747-5632/© 2021 Published by Elsevier Ltd.
S.Y.B. Huang and C.-J. Lee Computers in Human Behavior 129 (2022) 107027
humans to respond to computers through social behaviors. Indeed, previous studies have found that human-computer interaction is similar to human interaction, and that two-way communication is a necessary condition for interaction with information technology systems (McMillan & Hwang, 2002; Liu & Shrum, 2002; Pelau, Dabija, & Ene, 2021). Because the design of fintech chatbots resembles the two-way interaction between people, humans can respond to fintech chatbots socially, as if they were social roles. Social interactivity cues indicate a certain degree of intimacy, frequent communication, and time spent on interactions (Nahapiet & Ghoshal, 1998), and this research takes them as the first type of social cue. Second, previous studies have found that the language displayed on a computer screen can affect individuals' perceptions, such as treating the computer as a living person or a person with a personality (Moon, 2000; Nass, Moon, Fogg, Reeves, & Dryer, 1995). Other studies also support this assumption in the context of interaction between humans and artificial intelligence robots (Park, Jang, Cho, & Choi, 2021; Shumanov & Johnson, 2021). Since the fintech chatbot combines language information in the form of an information system interface, humans should respond to it as a social role. Social sharing signs and language cues represent a certain degree of shared terms, meaningful communication methods, and readability of messages (Nahapiet & Ghoshal, 1998); this research takes them as the second type of social cue. Finally, Boone, Declerck, and Suetens (2008) also pointed out that credence is an important social cue because it enables humans to adopt social behaviors in response to others. Other studies also support this assumption in the context of interaction between humans and artificial intelligence robots (Aoki, 2021; Youn & Jin, 2021). Indeed, since fintech chatbots are designed for interpersonal interaction, people who feel a high degree of social credence may regard fintech chatbots as social roles. Social credence cues represent a personal belief in the interaction of other members, including keeping promises and behaving consistently (Nahapiet & Ghoshal, 1998); this research takes them as the third type of social cue.

The present survey develops interactivity cues, credence cues, and sharing signs and language cues for virtual fintech chatbots. This research adapts suitable existing scales to the domains of social cues rather than seeking perfect scales. In reviewing existing measures, the theory of Tsai and Ghoshal (1998) is adopted to develop the cues, including social interactivity cues (structural dimension), social credence cues (relational dimension), and social sharing signs and language cues (cognitive dimension). Therefore, this research adapts the structural orientation (interactivity), relational orientation (social credence), and cognitive orientation (sharing signs and language) of social capital as the social cues of social response theory.

Taken together, social response theory in a fintech chatbot setting assumes that human interaction with a fintech chatbot that demonstrates social cues (e.g., social interactivity cues, social credence cues, and social sharing signs and language cues) is similar to human-to-human interaction. Indeed, a human receives the social interactivity cues from a fintech chatbot, and the cues then arouse him or her to employ social behaviors in response. This research therefore proposes a theoretical model that explains how social cues elicit the attitude toward the fintech chatbot, which in turn leads to CI.

2.2. Hypotheses development

Based on the theoretical framework (Fig. 1), CI is affected by the attitude toward the fintech chatbot and its antecedents (social cues and emotional arousal), following social response theory. The following paragraphs discuss these hypotheses with theoretical rationales and justifications.

To explain the influence of emotions, Mehrabian and Russell's (1974) theory is adopted to describe the mechanism of social response theory. Mehrabian and Russell (1974) claim that rapid and unconscious stimuli (cues) precede emotional responses, and that stimuli (cues) are the main factors affecting conscious or unconscious mental behavior. Therefore, social interactivity cues, social credence cues, and social sharing signs and language cues can serve as social prompt cues, because they can stimulate conscious or unconscious emotional arousal. Past research has also pointed out that, whether in a physical store (Baker, Grewal, & Levy, 1992) or an online store (Davis et al., 1989), stimuli (cues) can affect human emotional arousal. Berry, Carbone, and Haeckel (2002) likewise believe that the cues generated by interpersonal interaction are important factors affecting emotions, and Reeves and Nass (1996) believe that higher levels of social cues cause higher levels of emotion in the human-computer interaction environment. This research therefore proposes the first to third hypotheses.

Hypothesis 1. Users who perceive more social interactivity cues at Time 1 may develop more emotional arousal.

Hypothesis 2. Users who perceive more social credence cues at Time 1 may develop more emotional arousal.

Hypothesis 3. Users who perceive more social sharing signs and language cues at Time 1 may develop more emotional arousal.

Social response theory points out that when people interact with fintech chatbots, the stimulus of social cues will cause people to show social behaviors in response. This research uses Mehrabian and Russell's (1974) theory to open the black box between social cues and social behaviors (e.g., attitude toward fintech chatbots), and proposes that emotional arousal is the organism's internal emotional mechanism linking stimulus (social cues) and response (social behaviors).

Since attitude is a broad social behavior in human-computer interaction, this research incorporates it into the proposed model. Past research on human-computer interaction behaviors has confirmed that attitude is the most significant and important social behavior in the environment of information technology systems, the multimotive information systems continuance model, and the consumer acceptance of technology model (Ajzen, 1991; Davis, 1989; Fishbein & Ajzen, 1975; Kulviwat, Bruner, Kumar, Nasco, & Clark, 2007; Lowry, Gaskin, & Moody, 2015). This research proposes the fourth hypothesis:

Hypothesis 4. Users who develop more emotional arousal may show more growth in attitude toward the fintech chatbot over time.

People's positive or negative attitudes toward information technology systems are formed by their experience in use and are closely related to future continued needs (Amoroso & Ogawa, 2011; Kim, Galliers, Shin, Ryoo, & Kim, 2012). In other words, attitude is a person's identification with or preference for information technology systems (Shih, 2011; Wixom & Todd, 2005), and it has proven to be a good predictor of continued willingness and loyalty (Liljander, Polsa, & Forsberg, 2007; Wu & Chen, 2017). Moreover, if humans hold a positive attitude toward information technology systems, it will also affect their continued willingness (Chau & Hu, 2011), and past empirical studies using the expectation confirmation model likewise point out that attitude affects continued willingness (Alraimi, Zo, & Ciganek, 2015). Therefore, this research proposes the fifth hypothesis:

Hypothesis 5. Users who develop a more positive attitude toward fintech chatbots may show more growth in continuance intention over time.

3. Methodology

The research framework of this study develops from social cues to CI according to social response theory.
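The core latent-growth idea behind the potential growth model (PGM) used here — reducing each user's three repeated measures to an intercept (initial level) and a slope (growth), then relating growth in one construct to growth in another — can be illustrated with a minimal sketch. This is not the authors' analysis: the scores and the per-user OLS shortcut below are hypothetical stand-ins, and a full PGM would estimate latent intercepts and slopes simultaneously with structural equation modeling on all 455 users.

```python
# Minimal sketch of the growth-model logic (illustrative only):
# summarize each user's three waves by an intercept and a slope,
# then correlate growth in emotional arousal with growth in attitude (H4).
from statistics import mean

TIMES = [0, 1, 2]  # three survey waves over six months

def growth_parameters(scores):
    """OLS intercept and slope for one user's three-wave scores."""
    t_bar, y_bar = mean(TIMES), mean(scores)
    slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(TIMES, scores))
             / sum((t - t_bar) ** 2 for t in TIMES))
    return y_bar - slope * t_bar, slope  # (intercept, slope)

def pearson(xs, ys):
    """Correlation between two equal-length lists of growth slopes."""
    x_bar, y_bar = mean(xs), mean(ys)
    cov = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    var_x = sum((x - x_bar) ** 2 for x in xs)
    var_y = sum((y - y_bar) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical three-wave scores for four users (the real study has N = 455).
emotional_arousal = [[2.0, 2.5, 3.1], [3.0, 3.2, 3.5], [2.2, 2.1, 2.3], [4.0, 4.4, 4.9]]
attitude          = [[2.1, 2.6, 3.0], [3.1, 3.4, 3.6], [2.4, 2.3, 2.4], [3.8, 4.3, 4.8]]

ea_slopes = [growth_parameters(s)[1] for s in emotional_arousal]
at_slopes = [growth_parameters(s)[1] for s in attitude]
print(round(pearson(ea_slopes, at_slopes), 2))  # positive, in the direction H4 predicts
```

In the full model, the same logic is expressed as latent intercept and slope factors loading on the three repeated measures, with growth factors of one construct regressed on those of its antecedents.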
Fig. 2. The potential growth model of this research. Note: SIC = Social interactivity cues; SCC = Social credence cues; SLC = Social sharing signs and language cues; EA = Emotional arousal; AFC = Attitude toward the fintech chatbot; CI = Continuance intention; Yn = Measurement items. *p < 0.05; **p < 0.01.
4.3. Limitations and future research

First, although empirical studies usually use non-probabilistic samples, researchers should take precautions in generalizing the findings. In addition, culture also influences use intention in the online environment (Crotts & Erdmann, 2000), so further research should compare samples from different environments. Second, although this research employs social interactivity cues, social credence cues, and social sharing signs and language cues to describe social response theory, other cues may also cause emotional arousal; future research should compare the impact of different cues in different contexts. Third, this research employs the PGM to capture the development of user perceptions, and it can explain the dynamic and complex interrelationships from social response theory to CI. Fourth, previous empirical studies have typically measured behavioral intentions to represent actual behaviors, and it is still not certain that behavioral intentions predict actual behaviors well. However, information systems/computer technology theories and protection motivation theory (Ajzen, 1991; Davis, Bagozzi, & Warshaw, 1989; Fishbein & Ajzen, 1975; Floyd, Prentice-Dunn, & Rogers, 2000) propose that behavioral intentions can predict actual behaviors well, and these theories have been applied in various disciplines. Although this research did not use an experimental design (Fisher, 1971) to confirm the relationship between behavioral intentions and actual behaviors, it used a follow-up questionnaire to capture actual continued-use behavior (e.g., the frequency of use in the past three months). Further research should use rigorous experimental methods to verify the predictability of the theoretical model. Finally, although the measurement items of this research contain no sensitive social desirability content, further study should confirm the desirability bias of the theoretical model in the present survey.

Credit author statement

Stanley Y.B. Huang: Methodology, Software, Data curation, Writing - Original draft preparation, Investigation. Chih-Jen Lee: Writing - Reviewing and Editing, Literature Collection.

Appendix A. Measurement items

A.1. Social interactivity cues

1. I feel that the fintech chatbot means having close social relationships with me.
2. I feel that the fintech chatbot means spending a lot of time interacting with me.
3. I feel that the fintech chatbot means frequent communication with me.

A.2. Social credence cues

1. I feel that the fintech chatbot can keep its promises.
2. I feel that the fintech chatbot knows we can count on each other.
3. I feel that the fintech chatbot behaves consistently.
4. I feel that the fintech chatbot is truthful in dealing with each other.

A.3. Social sharing signs and language

1. I feel that the fintech chatbot uses common terms or jargon to convey messages.
2. I feel that the fintech chatbot uses an understandable communication pattern during the discussion.
3. I feel that the fintech chatbot uses understandable narrative forms to convey messages.

A.4. Emotional arousal

1. I feel stimulated when I interact with the fintech chatbot.
2. I feel aroused when I interact with the fintech chatbot.
3. I have a frenzy of joy when I interact with the fintech chatbot.
4. I feel excited when I interact with the fintech chatbot.

A.5. Attitude toward the fintech chatbot

1. I like to use the fintech chatbot.
2. I am satisfied with the service provided by the fintech chatbot.
3. I feel comfortable using the fintech chatbot.
4. I feel that using the fintech chatbot can solve my problem well.

A.6. Continuance intention

1. I intend to continue using the fintech chatbot.
2. I want to continue using the fintech chatbot instead of alternative means.
3. If I could, I would like to continue using the fintech chatbot over the next year.
4. It is unlikely for me to stop using the fintech chatbot.

References

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179–211.
Alraimi, K. M., Zo, H. J., & Ciganek, A. P. (2015). Understanding the MOOCs continuance: The role of openness and reputation. Computers & Education, 80, 28–38.
Amoroso, D., & Lim, R. (2017). The mediating effects of habit on continuance intention. International Journal of Information Management, 37(6), 693–702.
Amoroso, D., & Ogawa, M. (2011). Japan's model of mobile ecosystem success: The case of NTT DoCoMo. Journal of Emerging Knowledge on Emerging Markets, 3. Article 27.
Aoki, N. (2021). The importance of the assurance that "humans are still in the decision loop" for public trust in artificial intelligence: Evidence from an online experiment. Computers in Human Behavior, 114, 106572.
Bae, M. (2018). Understanding the effect of the discrepancy between sought and obtained gratification on social networking sites users' satisfaction and continuance intention. Computers in Human Behavior, 79, 137–153.
Baker, J., Grewal, D., & Levy, M. (1992). An experimental approach to making retail store environmental decisions. Journal of Retailing, 68(4), 445–460.
Bayerque, N. (2016). A short history of chatbots and artificial intelligence (accessed August 1, 2020) https://venturebeat.com/2016/08/15/a-short-history-of-chatbots-and-artificial-intelligence/.
Berry, L. L., Carbone, L. P., & Haeckel, S. H. (2002). Managing the total customer experience. Sloan Management Review, 43, 85–89.
Bhattacherjee, A. (2001). Understanding information systems continuance: An expectation-confirmation model. MIS Quarterly, 25(3), 351–370.
Bøe, T., Gulbrandsen, B., & Sørebø, O. (2015). How to stimulate the continued use of ICT in higher education: Integrating Information Systems Continuance Theory and agency theory. Computers in Human Behavior, 50, 375–384.
Boone, C., Declerck, C., & Suetens, S. (2008). Subtle cues, explicit incentives, and cooperation in social dilemmas. Evolution and Human Behavior, 29, 179–188.
Chang, Y. P., & Zhu, D. H. (2012). The role of perceived social capital and flow experience in building users' continuance intention to social networking sites in China. Computers in Human Behavior, 28, 995–1001.
Chau, P., & Hu, P. (2011). Information technology acceptance by individual professionals: A model comparison approach. Decision Sciences, 32(4), 699–719.
Chen, G., & Kanfer, R. (2006). Toward a systems theory of motivated behavior in work teams. In B. M. Staw (Ed.), Research in organizational behavior (pp. 349–381). Greenwich, CT: JAI Press.
Chen, Q., & Wells, W. D. (1999). Attitude toward the site. Journal of Advertising Research, 39(5), 27–37.
Crotts, J. C., & Erdmann, R. (2000). Does national culture influence consumers' evaluation of travel services? A test of Hofstede's model of cross-cultural differences. Managing Service Quality, 10(5), 410–419.
Dai, H. M., Teo, T., & Rappa, N. A. (2020). Understanding continuance intention among MOOC participants: The role of habit and MOOC performance. Computers in Human Behavior, 112, 106455.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003.
Delgosha, M. S., & Hajiheydari, N. (2021). How human users engage with consumer robots? A dual model of psychological ownership and trust to explain post-adoption behaviours. Computers in Human Behavior, 117, 106660.
Donkin, C. (2019). Line, Rakuten poised for Taiwan online bank launch (accessed February 1, 2021) https://www.mobileworldlive.com/money/news-money/line-rakuten-poised-for-taiwan-online-bank-launch.
Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes. Fort Worth, TX: Harcourt Brace Jovanovich.
Edwards, C., Edwards, A., Stoll, B., Lin, X., & Massey, N. (2019). Evaluations of an artificial intelligence instructor's voice: Social identity theory in human-robot interactions. Computers in Human Behavior, 90, 357–362.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention and behavior: An introduction to theory and research. Boston, MA: Addison-Wesley.
Fisher, R. A. (1971). The design of experiments. Macmillan.
Floyd, D. L., Prentice-Dunn, S., & Rogers, R. W. (2000). A meta-analysis of research on protection motivation theory. Journal of Applied Social Psychology, 30(2), 407–429.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.
Gan, C., & Li, H. (2018). Understanding the effects of gratifications on the continuance intention to use WeChat in China: A perspective on uses and gratifications. Computers in Human Behavior, 78, 306–315.
Gillath, O., Ai, T., Branicky, M. S., Keshmiri, S., Davison, R. B., & Spaulding, R. (2021). Attachment and trust in artificial intelligence. Computers in Human Behavior, 115, 106607.
Henseler, J., Ringle, C. M., & Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. Journal of the Academy of Marketing Science, 43(1), 115–135.
Hill, J., Ford, W. R., & Farreras, I. G. (2015). Real conversations with artificial intelligence: A comparison between human–human online conversations and human–chatbot conversations. Computers in Human Behavior, 49, 245–250.
Hong, J. C., Tai, K. H., Hwang, M. Y., Kuo, Y. C., & Chen, J. S. (2017). Internet cognitive failure relevant to users' satisfaction with content and interface design to reflect continuance intention to use a government e-learning system. Computers in Human Behavior, 66, 353–362.
Hsu, H. Y., Liu, F. H., Tsou, H. T., & Chen, L. J. (2019). Openness of technology adoption, top management support and service innovation: A social innovation perspective. Journal of Business & Industrial Marketing, 34, 575–590.
International Federation of Robotics. (2017). Executive summary world robotics 2017 service robots. Available at: https://ifr.org/free-downloads/. (Accessed 1 August 2020).
Jang, M., Jung, Y., & Kim, S. (2021). Investigating managers' understanding of chatbots in the Korean financial industry. Computers in Human Behavior, 120, 106747.
Johnson, S., & Daoud, Z. (2020). Who innovates first? Ranking 135 economies around the world (accessed February 1, 2021) https://venturebeat.com/2016/08/15/a-short-history-of-chatbots-and-artificial-intelligence/.
Kalton, G. (2009). Methods for oversampling rare subpopulations in social surveys. Survey Methodology, 35(2), 125–141.
Kim, C., Galliers, R., Shin, N., Ryoo, J., & Kim, J. (2012). Factors influencing internet shopping value and consumer continuance intention. Electronic Commerce Research and Applications, 11, 374–387.
KPMG. (2019). Fintech100: Leading global fintech innovators. https://home.kpmg/xx/en/home/insights/2019/11/2019-fintech100-leading-global-fintech-innovators-fs.html.
Kuhn, T. (1970). The structure of scientific revolutions. Chicago: University of Chicago Press.
Kulviwat, S., Bruner, G. C., Kumar, A., Nasco, S. A., & Clark, T. (2007). Toward a unified theory of consumer acceptance technology. Psychology and Marketing, 24(12), 1059–1084.
Lance, C. E., Vandenberg, R. J., & Self, R. M. (2000). Latent growth models of individual change: The case of newcomer adjustment. Organizational Behavior and Human Decision Processes, 83, 107–140.
Li, H., Li, L., Gan, C., Liu, Y., Tan, C.-W., & Deng, Z. (2018). Understanding the effects of gratifications on the continuance intention to use WeChat in China: A perspective on uses and gratifications. Computers in Human Behavior, 85, 175–182.
Liljander, V., Polsa, P., & Forsberg, K. (2007). Do mobile CRM services appeal to loyalty program consumers? International Journal of E-Business Research, 3(2), 24–40.
Lin, M. J. J., Hung, S. W., & Chen, C. J. (2009). Fostering the determinants of knowledge sharing in professional virtual communities. Computers in Human Behavior, 25(4), 929–939.
Liu, Y., & Shrum, L. J. (2002). What is interactivity and is it always such a good thing? Implications of definition, person and situation for the influence of interactivity on advertising effectiveness. Journal of Advertising, 31(4), 53–64.
Lowry, P. B., Gaskin, J. E., & Moody, G. D. (2015). Proposing the multimotive information systems continuance model (MISC) to better explain end-user system evaluations and continuance intentions. Journal of the Association for Information Systems, 16(7), 515–579.
Lu, J., Yu, C.-S., Liu, C., & Wei, J. (2017). Comparison of mobile shopping continuance intention between China and USA from an espoused cultural perspective. Computers in Human Behavior, 75(1), 130–146.
Luo, X., Tong, S., Fang, Z., & Qu, Z. (2019). Frontiers: Machines vs. humans: The impact of artificial intelligence chatbot disclosure on customer purchases. Marketing Science, 38(6), 937–947.
Maghsoudi, R., Shapka, J., & Wisniewski, P. (2020). Examining how online risk exposure and online social capital influence adolescent psychological stress. Computers in Human Behavior, 113, 106488.
Malhotra, N. K., Kim, S. S., & Patil, A. (2006). Common method variance in IS research: A comparison of alternative approaches and a reanalysis of past research. Management Science, 52(12), 1865–1883.
McMillan, S. J., & Hwang, J. S. (2002). Measures of perceived interactivity: An exploration of the role of direction of communication, user control, and time in shaping perceptions of interactivity. Journal of Advertising, 31(3), 29–42.
Mehrabian, A., & Russell, J. A. (1974). An approach to environmental psychology. Cambridge, MA: MIT Press.
Montazemi, A. R., & Qahri-Saremi, H. (2015). Factors affecting adoption of online banking: A meta-analytic structural equation modeling study. Information & Management, 52(2), 210–226.
Moon, Y. (2000). Intimate exchanges: Using computers to elicit self-disclosure from consumers. Journal of Consumer Research, 26, 323–339.
Nahapiet, J., & Ghoshal, S. (1998). Social capital, intellectual capital, and the organizational advantage. Academy of Management Review, 23(2), 242–266.
Nass, C., Fogg, B. J., & Moon, Y. (1996). Can computers be teammates? International Journal of Human-Computer Studies, 45(6), 669–678.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.
Nass, C., Moon, Y., & Carney, P. (1999). Are people polite to computers? Responses to computer-based interviewing systems. Journal of Applied Social Psychology, 29(5), 1093–1110.
Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43(2), 223–239.
Park, N., Jang, K., Cho, S., & Choi, J. (2021). Use of offensive language in human-artificial intelligence chatbot interaction: The effects of ethical ideology, social competence, and perceived humanlikeness. Computers in Human Behavior, 121, 106795.
Pelau, C., Dabija, D. C., & Ene, I. (2021). What makes an AI device human-like? The role of interaction quality, empathy and perceived psychological anthropomorphic characteristics on the acceptance of artificial intelligence in the service industry. Computers in Human Behavior. https://doi.org/10.1016/j.chb.2021.106855
Podsakoff, P. M., MacKenzie, S. B., Lee, J., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88, 879–903.
Rahman, A. M., Mamun, A. A., & Islam, A. (2017). Programming challenges of chatbot: Current and future prospective. 2017 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) (pp. 75–78). Dhaka.
Reeves, B., & Nass, C. I. (1996). The media equation. Stanford, CA: CSLI Publications.
Shane-Simpson, C., Manago, A., Gaggi, N., & Gillespie-Lynch, K. (2018). Why do college students prefer Facebook, Twitter, or Instagram? Site affordances, tensions between privacy and self-expression, and implications for social capital. Computers in Human Behavior, 86, 276–288.
Shang, R. A., & Sun, Y. (2020). So little time for so many ties: Fit between the social capital embedded in enterprise social media and individual learning requirements. Computers in Human Behavior, 106615.
Shih, C. (2011). Comparisons of competing models between attitudinal loyalty and behavioral loyalty. International Journal of Business and Information, 8(1), 149–166.
Shumanov, M., & Johnson, L. (2021). Making conversations with chatbots more personalized. Computers in Human Behavior, 117, 106627.
Steuer, J., & Nass, C. (1993). Voices, boxes, and sources of messages: Computers and social actors. Human Communication Research, 19(4), 504–527.
Sundar, S. S., & Nass, C. (2000). Source orientation in human-computer interaction. Communication Research, 27(6), 683–703.
Teng, C.-I. (2019). How avatars create identification and loyalty among online gamers: Contextualization of self-affirmation theory. Internet Research, 29(6), 1443–1468.
Tett, R. P., & Burnett, D. D. (2003). A personality trait-based interactionist model of job performance. Journal of Applied Psychology, 88, 500–517.
Thusyanthy, V., & Senthilnathan, S. (2017). Customer satisfaction in terms of physical evidence and employee interaction. The IUP Journal of Marketing Management, XI(3), 7–24.
Tsai, W., & Ghoshal, S. (1998). Social capital and value creation: An empirical study of intrafirm networks. Academy of Management Journal, 41(4), 464–476.
Valaei, N., & Baroto, M. B. (2017). Modelling continuance intention of citizens in government Facebook page: A complementary PLS approach. Computers in Human Behavior, 73, 224–237.
VanderWeele, T. J., Jackson, J. W., & Li, S. (2016). Causal inference and longitudinal data: A case study of religion and mental health. Social Psychiatry and Psychiatric Epidemiology, 51, 1457–1466.
Voorhees, C. M., Brady, M. K., Calantone, R., & Ramirez, E. (2016). Discriminant validity testing in marketing: An analysis, causes for concern, and proposed remedies. Journal of the Academy of Marketing Science, 44(1), 119–134.
Walley, L., & Smith, M. (1998). Deception in selection. Chichester: Wiley.
Wigglesworth, R. (2016). Fintech: Search for a super-algo. Available at: https://www.ft.com/content/5eb91614-bee5-11e5-846f-79b0e3d20eaf. (Accessed 1 August 2020).
Wixom, B., & Todd, P. (2005). A theoretical integration of user satisfaction and technology acceptance. Information Systems Research, 16(1), 85–102.
Wu, B., & Chen, X. (2017). Continuance intention to use MOOCs: Integrating the technology acceptance model (TAM) and task technology fit (TTF) model. Computers in Human Behavior, 67, 221–232.
Youn, S., & Jin, S. V. (2021). "In A.I. we trust?" The effects of parasocial interaction and technopian versus luddite ideological views on chatbot-based customer relationship management in the emerging "feeling economy". Computers in Human Behavior, 119, 106721.
Zhang, C. B., Li, Y. N., Wu, B., & Li, D. J. (2017). How WeChat can retain users: Roles of network externalities, social interaction ties, and perceived values in building continuance intention. Computers in Human Behavior, 69, 284–293.