
Computers in Human Behavior 129 (2022) 107027


Predicting continuance intention to fintech chatbot


Stanley Y.B. Huang, Chih-Jen Lee *
Master Program of Financial Technology, School of Financial Technology, Ming Chuan University, Taiwan

Keywords: Continuance intention; Fintech chatbots; Potential growth model; Social capital; Social response theory

Abstract

As fintech chatbots become more and more popular in online banking services, most banks have discovered their business potential. To understand why fintech chatbots can be adopted quickly and widely to realize commercial benefits, this research uses social response theory to examine the continuance intention mechanism behind fintech chatbots. The model includes social capital (social cues) and attitudes toward fintech chatbots to describe how social cues, grounded in social response theory, evoke users' social behaviors, which in turn may affect continuance intention. Based on the potential growth model, the growth trends and relationships of these variables were analyzed with longitudinal data from 455 fintech chatbot users in Taiwan, collected in three stages over six months. The results support all hypotheses and can help vendors understand how to enhance users' continuance intention toward fintech chatbots.

1. Introduction

Artificial intelligence robot services have gradually triggered various technological revolutions and established new industry principles (Delgosha & Hajiheydari, 2021; International Federation of Robotics, 2017). However, although these virtual banking services use artificial intelligence to pursue commercial interests, many users still prefer physical banks (Thusyanthy & Senthilnathan, 2017). As more and more users switch to online banking services, it seems that online banking services lack the social presence of physical interactions. Indeed, in the online banking setting, previous research has examined the driving variables of continuance intention (Montazemi & Qahri-Saremi, 2015) and concluded that the driving factors are almost entirely utilitarian in orientation. However, bankers are starting to think about how to include human-like characteristics in their online services (e.g., fintech chatbots) to simulate physical interactions. A fintech chatbot is a chat service that can automatically respond to users with financial text language in a human-like manner, including internet links, structured text, images, or specific command buttons. In particular, the interaction context of a chatbot is similar to a conversation between friends or family (Bayerque, 2016; Hill, Ford, & Farreras, 2015; Jang, Jung, & Kim, 2021). For example, if you plan to invest in a fund, you can ask a fintech chatbot for professional suggestions instead of spending a lot of time and energy collecting information. Besides, you can employ a call-to-action button to instantly invest in the fund through a fintech chatbot. According to the above example, although fintech chatbots are gradually becoming popular in online interactions, it is so far still unclear which mechanism affects users' perceptions of fintech chatbots. Although many resources are used to optimize the artificial intelligence algorithms and humanized interfaces of fintech chatbots (Wigglesworth, 2016), only now has research begun to study the psychological impact of these social cues on users. Indeed, past research has confirmed that online services with avatars (human-like characters) can effectively persuade consumers and increase their positive loyalty (e.g., continuance intention) (Teng, 2019). Continuance intention (CI) refers to the likelihood that a user will keep using a certain information technology system in the future (Bhattacherjee, 2001).

Past research on the CI field of information systems (IS) has not attracted enough attention (Amoroso & Lim, 2017) because of the insufficiency of research on this issue. Although past research has proposed various aspects of CI (e.g., Dai, Teo, & Rappa, 2020; Gan & Li, 2018; Li et al., 2018), this stream is still somewhat immature. Past research on CI can be grouped into three streams. The first stream employs social network theory to predict CI (e.g., Chang & Zhu, 2012; Zhang, Li, Wu, & Li, 2017). The second stream employs continuance theory to predict CI (e.g., Bøe, Gulbrandsen, & Sørebø, 2015; Foroughi, Iranmanesh, & Hyun, 2019). The third stream combines different IS adoption models into a new model to explain CI (e.g., Lu, Yu, Liu, & Wei, 2017; Wu & Chen, 2017). This research opens a new stream that predicts CI through social response theory.

* Corresponding author. Master Program of Financial Technology, School of Financial Technology, Ming Chuan University, No.130, Jihe Rd., Shihlin District,
Taipei City 111, Taiwan.
E-mail addresses: [email protected] (S.Y.B. Huang), [email protected] (C.-J. Lee).

https://doi.org/10.1016/j.chb.2021.107027
Received 17 August 2020; Received in revised form 6 September 2021; Accepted 16 September 2021
Available online 1 January 2022
0747-5632/© 2021 Published by Elsevier Ltd.

In academics, because CI is not just a trivial theoretical concept, this research tailors social response theory to supplement relevant research and predict CI. In practice, CI is also not just a trivial concept, because bankers hope to obtain high page views and user loyalty to achieve commercial interests.

Social response theory explains how human-like characters (social cues) cause people to respond to computers (Moon, 2000; Reeves & Nass, 1996), but it only examines the initial thoughts of the response process in the computer context. To detect the user response mechanism behind the fintech chatbot, this research tailored social cues for the setting of fintech chatbots and studied how these social cues affect individual responses to virtual fintech chatbots. This research incorporates the theory of social capital (Tsai & Ghoshal, 1998) into social cues to discuss how these social cues (social interactivity cues, social credence cues, and social sharing signs and language cues) shape social behaviors (attitude toward the fintech chatbot), which in turn lead to CI. Social interactivity cues indicate the degree of intimacy, interaction time, and frequent communication. Social credence cues indicate the degree of keeping promises, consistent behavior, and honesty. Social sharing signs and language cues denote the degree of commonly used terms, meaningful communication patterns, and the comprehensibility of messages. Emotional arousal indicates the degree to which a user feels stimulated and excited during the interaction (Mehrabian & Russell, 1974). Attitude toward the fintech chatbot indicates the degree of intention to use it in the future, satisfaction with it, and the perceived relative merits of using it (Chen & Wells, 1999).

All in all, this research provides future research directions for fintech chatbots. By incorporating social response theory into the user's response mechanism, this research fills the research gap in human-computer interaction research on how information systems display virtual presentation to affect user responses in a humanlike manner. Besides, a longitudinal data design can better detect the causal relationships between variables (VanderWeele, Jackson, & Li, 2016), so this research uses a potential growth model to analyze three waves of empirical data. That is, this research uses a potential growth model to investigate how the three social cues at time 1 affect the positive changes in emotional arousal development (changes from time 1 to time 3), and how these positive changes lead to follow-up positive changes in the development of attitude toward the fintech chatbot and of CI. This view of change in these variables is important because previous studies used cross-sectional designs instead of examining the development (change) of CI and its antecedents over time (e.g., Bae, 2018; Hong, Tai, Hwang, Kuo, & Chen, 2017; Valaei & Baroto, 2017). Therefore, there is little empirical evidence on whether these variables increase, decrease, or stabilize over time. This research surveyed three waves of 855 fintech chatbot users within six months to address these literature concerns, and the research question is as follows:

RQ. What roles do social cues play in affecting users' social behaviors and CI based on social response theory in a fintech chatbot setting over time?

2. Theory and development of hypotheses

This research proposes that social cues (social interactivity cues, social credence cues, and social sharing signs and language cues) will positively affect emotional arousal and attitude toward the fintech chatbot based on social response theory, which in turn will induce CI (Fig. 1).

Fig. 1. Theoretical model of this research.

2.1. Social response theory

The theory of social response (Moon, 2000) holds that people tend to regard a computer as a human role instead of a media interface, even if they know that the computer has no sense or self (Nass & Moon, 2000). In particular, when an information technology system has a series of human characteristics (e.g., interactive cues), humans will follow social rules and display humanized social behaviors toward the computer (Reeves & Nass, 1996). Interpersonal interaction also produces many social habits, even if these habits have no reasonable meaning (Reeves & Nass, 1996). These habits include reciprocity, interactions between individuals, politeness, and interdependence with others (Nass, Fogg, & Moon, 1996; Nass, Moon, & Carney, 1999). Humans may have developed a relationship with computers, but such a relationship is really a relationship with the other humans behind the computer. Past psychological research pointed out that humans tend to use various heuristics (e.g., experience-based shortcuts) to process large amounts of information in an inert manner (Eagly & Chaiken, 1993).

The reason behind these social responses is humans' unconscious thinking (Nass & Moon, 2000), which is triggered by humans' unconscious response to these contextual cues (social cues); these cues then activate various social scripts and expected behaviors based on past experiences. When computers display social cues, people tend to use their simplified social scripts (e.g., responding to the computer with social behaviors) to automatically respond to the computer. The reason for responding is that humans prefer to pay close attention to the nearest information source (Sundar & Nass, 2000).

The present survey applies the theory of social response to the background of fintech and proposes a specific model with its antecedents (social cues) and outcomes (CI) to explain the behaviors of users and fintech chatbots. In particular, the present research does not advance the theory of social response, but proposes a paradigm shift (Kuhn, 1970) in social response theory from a computer screen environment to a virtual fintech chatbot. In other words, based on social response theory, users interacting with fintech chatbots can adopt social behaviors to respond to social cues (environmental prompts) embedded in the interaction through emotional arousal. This makes sense, because social cues, emotions, and social behaviors have been examined in the context of the interaction between humans and artificial intelligence robots (Edwards, Edwards, Stoll, Lin, & Massey, 2019; Gillath et al., 2021; Graves et al., 2021).

Although various social cues have emerged in interpersonal interactions, certain social cues have not appeared in the environment of fintech chatbots, and they are also not found in the original social response theory (Moon, 2000).


First, Steuer and Nass (1993) proposed that interactivity and language are two important factors that cause humans to respond to computers through social behaviors. Indeed, previous studies have found that human-computer interaction is similar to human interaction, and two-way communication is a necessary condition for interaction with information technology systems (McMillan & Hwang, 2002; Liu & Shrum, 2002; Pelau, Dabija, & Ene, 2021). Because the design of fintech chatbots resembles two-way interaction between people, humans can respond to fintech chatbots socially as if they were social roles. Social interactivity cues indicate a certain degree of intimacy, frequent communications, and time spent on interactions (Nahapiet & Ghoshal, 1998), and this research takes them as the primary social cues. Second, previous studies have found that the language displayed on a computer screen can affect individuals' perceptions, such as treating the computer as a living person or as a person with a personality (Moon, 2000; Nass, Moon, Fogg, Reeves, & Dryer, 1995). Other studies also support this assumption in the context of interaction between humans and artificial intelligence robots (Park, Jang, Cho, & Choi, 2021; Shumanov & Johnson, 2021). Since the fintech chatbot combines language information in the form of an information system interface, humans should respond to it as a social role. Social sharing signs and language cues represent a certain degree of common terms, meaningful communication methods, and readability of messages (Nahapiet & Ghoshal, 1998); this research uses them as the second social cues. Finally, Boone, Declerck, and Suetens (2008) also pointed out that credence is an important social cue because it enables humans to adopt social behaviors in response to others. Other studies also support this assumption in the context of interaction between humans and artificial intelligence robots (Aoki, 2021; Youn & Jin, 2021). Indeed, since fintech chatbots are designed for interpersonal interaction, people who feel a high degree of social credence may regard fintech chatbots as social roles. Social credence cues represent a personal belief about interacting with other members, including keeping promises and consistent behaviors (Nahapiet & Ghoshal, 1998), and this research uses them as the third social cues.

The present survey develops interactivity cues, credence cues, and sharing signs and language cues for virtual fintech chatbots. This research adapts suitable existing scales to the domains of social cues rather than developing new scales. In reviewing existing measures, the theory of Tsai and Ghoshal (1998) is adopted to develop the cues, including social interactivity cues (structural dimension), social credence cues (relational dimension), and social sharing signs and language cues (cognitive dimension). Therefore, this research adapted the structural orientation (interactivity), relational orientation (social credence), and cognitive orientation (sharing signs and language) of social capital as the social cues of social response theory.

Taken together, social response theory in a fintech chatbot setting assumes that human interaction with a fintech chatbot that demonstrates social cues (e.g., social interactivity cues, social credence cues, and social sharing signs and language cues) is similar to human interaction. Indeed, a human receives the social interactivity cues from a fintech chatbot, and these cues arouse him or her to employ social behaviors in response. This research therefore proposes a theoretical model that explains how social cues elicit people's attitudes towards fintech chatbots, which in turn lead to CI.

2.2. Hypothesis development

Based on the theoretical framework (Fig. 1) and social response theory, CI is affected by the attitude toward the fintech chatbot and its antecedents (social cues and emotional arousal). This section discusses these hypotheses with theoretical rationales and justifications.

In order to explain the influence of emotions, Mehrabian and Russell's (1974) theory is adopted to describe the mechanism of social response theory. Mehrabian and Russell (1974) claim that rapid and unconscious stimuli (cues) precede emotional responses, and that stimuli (cues) are the main factors affecting conscious or unconscious mental behavior. Therefore, social interactivity cues, social credence cues, and social sharing signs and language cues can be used as social prompt cues, because these social cues can stimulate human conscious or unconscious emotional arousal. Past research also pointed out that whether in a physical store (Baker, Grewal, & Levy, 1992) or an online store (Davis et al., 1989), stimuli (cues) can affect human emotional arousal. Berry, Carbone, and Haeckel (2002) also believe that the cues generated by interpersonal interaction are important factors affecting emotions. Reeves and Nass (1996) likewise believe that higher levels of social cues will cause higher levels of emotion in the human-computer interaction environment. This research proposes the first to third hypotheses.

Hypothesis 1. Users who perceive more social interactivity cues at Time 1 may show more development of emotional arousal.

Hypothesis 2. Users who perceive more social credence cues at Time 1 may show more development of emotional arousal.

Hypothesis 3. Users who perceive more social sharing signs and language cues at Time 1 may show more development of emotional arousal.

The theory of social response points out that when people interact with fintech chatbots, the stimulus of social cues will cause people to show social behaviors in response. This research uses Mehrabian and Russell's (1974) theory to open the black box between social cues and social behaviors (e.g., attitude towards fintech chatbots), and proposes that emotional arousal is an organism's internal emotional mechanism linking stimulus (social cues) and response (social behaviors).

Since attitude is a broad social behavior in human-computer interaction, this research incorporates it into the proposed model of the present survey. Past research on human-computer interaction behaviors has confirmed that attitude is the most significant and important social behavior in the environment of information technology systems, the multimotive information systems continuance model, and the consumer acceptance of technology model (Ajzen, 1991; Davis, 1989; Fishbein & Ajzen, 1975; Kulviwat, Bruner, Kumar, Nasco, & Clark, 2007; Lowry, Gaskin, & Moody, 2015). This research proposes the fourth hypothesis:

Hypothesis 4. Users who develop more emotional arousal may show more growth in the development of attitude toward the fintech chatbot over time.

People's positive or negative attitudes towards information technology systems are formed by their experience in use and are closely related to future continuance needs (Amoroso & Ogawa, 2011; Kim, Galliers, Shin, Ryoo, & Kim, 2012). In other words, attitude is human identification with or preference for information technology systems (Shih, 2011; Wixom & Todd, 2005), and it has also proven to be a good predictor of continued willingness and loyalty (Liljander, Polsa, & Forsberg, 2007; Wu & Chen, 2017). Moreover, if humans have a positive attitude towards information technology systems, this will also affect their continued willingness (Chau & Hu, 2011). Besides, past empirical studies using the expectation confirmation model also pointed out that attitude is a factor that affects continued willingness (Alraimi, Zo, & Ciganek, 2015). Therefore, this research proposes the fifth hypothesis:

Hypothesis 5. Users who develop more positive attitudes towards fintech chatbots may show more growth in continuance intention over time.

3. Methodology

The research framework of this research is developed from social cues to CI according to the theory of social response.


3.1. Measures

This research used reverse translation to confirm translation consistency between Chinese and English (Reynolds, Diamantopoulos, & Schlegelmilch, 1993).

Social cues. Social interactivity cues indicate a certain degree of intimacy, interaction time, and frequent communications. Social credence cues indicate a certain degree of keeping promises, consistent behavior, and honesty. Social sharing signs and language cues indicate a certain level of common terms, meaningful communication methods, and comprehensibility of messages. These three cues are developed from previous surveys (Nahapiet & Ghoshal, 1998; Tsai & Ghoshal, 1998).
Emotional arousal. Mehrabian and Russell's (1974) scale was adopted to assess emotional arousal.
Attitude toward the fintech chatbot. This research uses the scale of Chen and Wells (1999) to measure future use attitudes, satisfaction, and relative pros and cons of use.
Continuance intention. Continuance intention is based on the scale of Bhattacherjee (2001), which measures whether to stop using, whether to use alternatives, and whether to continue to use it in the future.

Table 1
Means, standard deviations, and correlations (N = 455).

        M      S.D.   SIC    SCC    SLC    EA     AFC
SIC     3.92   .87
SCC     3.63   .86    .17
SLC     3.52   .90    .16    .19
EA      3.66   .86    .41    .39    .38
AFC     3.46   .89    .22    .21    .19    .44
CI      3.51   .91    .21    .17    .18    .24    .46

Note: SIC = Social interactivity cues; SCC = Social credence cues; SLC = Social sharing signs and language cues; EA = Emotional arousal; AFC = Attitude toward the fintech chatbot; CI = Continuance intention.

3.2. Subjects and procedures

This research used a three-stage longitudinal sampling with an interval of three months between each stage. There are two reasons for using a Taiwanese sample. First, Taiwan not only ranks fifth in Bloomberg's 2020 innovation ranking (Johnson & Daoud, 2020) but also has a high degree of openness in technology adoption (Hsu, Liu, Tsou, & Chen, 2019). Therefore, the sample from Taiwan should have more knowledge about the new technology, which can also reduce sampling errors caused by a lack of knowledge of fintech chatbots. Second, the world's number one fintech company is "Ant Financial" (KPMG, 2019), and other fintech companies in Asia are also well-known worldwide. Therefore, examining the configuration of fintech (e.g., fintech chatbots) in Asian countries is crucial. Besides, this research used a three-month time interval between each survey collection point, because changes in attitude should be detectable within this time frame, which is also supported by previous studies (Lance, Vandenberg, & Self, 2000).

We collected customer lists of 500 bank customers and sent an e-mail asking if these customers were willing to participate in the sampling of this research, and 470 customers joined the survey. Before filling out the questionnaire items, this research asked these customers to use the banking services of a fintech chatbot. A month later, this research asked them to start the first phase of the research questionnaire, which included social interactivity cues, social credence cues, social sharing signs and language cues, emotional arousal, attitudes towards fintech chatbots, and CI. Three months after the end of the first phase, this research asked these customers to fill out the second phase of the questionnaire, which involved emotional arousal, attitudes towards fintech chatbots, and CI. Three months after the end of the second phase, this research asked these customers to fill in the third-phase questionnaire about emotional arousal, attitudes towards fintech chatbots, and CI. This research received 455 valid samples in the third phase. Males account for 46% of the sample, the average age is above 25.6 years, 68% of respondents have a university degree, and the average experience with the bank is 5.5 years. In addition, the means, standard deviations, and correlations are shown in Table 1.

To prevent common method bias (Podsakoff, MacKenzie, Lee, & Podsakoff, 2003) and social expectation bias (Walley & Smith, 1998), this research mitigated and detected them with four methods. First, this research used anonymity to reduce doubt or hesitation about filling out the questionnaire. Second, some items unrelated to the study variables were inserted into the questionnaire (e.g., "I can't stop surfing …") to neutralize the bias. Third, this research used a marker construct (not related to the theoretical model of this research) (Malhotra, Kim, & Patil, 2006) to detect common method bias. The marker construct selected in this research is knowledge sharing, and the reason for choosing this variable is that this construct is common behavior on the internet (Lin, Hung, & Chen, 2009); besides, this variable is unrelated to social response theory or the fintech chatbot context. Finally, Harman's single-factor test was used to test general method bias and social expectation bias, indicating that these two biases are not serious.

Besides, the questionnaire items of this research have nothing to do with sensitive social expectations, such as bad behavior, personality, and drug abuse (Walley & Smith, 1998), so social expectation bias should be insignificant. Similarly, this research used data from multiple time intervals, which can also alleviate common method bias.
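The Harman's single-factor test mentioned above can be approximated directly from the item-level data: if one unrotated component explained most of the shared variance across all items, common method bias would be a concern. The sketch below is a minimal illustration of that check, not the authors' analysis script; the DataFrame name items is hypothetical and stands for all questionnaire items.

    import numpy as np
    import pandas as pd

    def harman_single_factor_share(items: pd.DataFrame) -> float:
        """Proportion of total variance captured by the first unrotated
        component of the item correlation matrix (Harman's test)."""
        corr = items.corr().to_numpy()             # item correlation matrix
        eigvals = np.linalg.eigvalsh(corr)         # eigenvalues in ascending order
        return float(eigvals[-1] / eigvals.sum())  # share of the largest eigenvalue

    # Example usage on a hypothetical item DataFrame:
    # share = harman_single_factor_share(items)
    # A share well below 0.50 is conventionally read as "no serious common method bias".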
Finally, to confirm whether the behavioral intention of CI can predict the actual behavior of CI well, this research contacted the 885 valid samples who had joined the previous survey of this research and asked them to fill out a follow-up questionnaire with one item (the frequency of use in the past three months) to measure the actual behavior of CI. Among 485 valid samples, 790 valid samples agreed to fill out the follow-up questionnaire. The correlation between CI and actual CI behavior is 0.89, which supports a high degree of consistency between CI and actual CI behavior.

Confirmatory factor analysis was employed to analyze social interactivity cues (Stage 1), social credence cues (Stage 1), social sharing signs and language cues (Stage 1), emotional arousal (Stages 1, 2, and 3), attitude toward the fintech chatbot (Stages 1, 2, and 3), and CI (Stages 1, 2, and 3). The reliability (composite reliability) and validity (average variance extracted) values are all above 0.5, and all model fit indexes (RMSEA, RMR, GFI, CFI, and NFI) are better than the suggested thresholds (Fornell & Larcker, 1981). In addition, the factor loadings are shown in Table 2. HTMT was adopted by this survey to analyze discriminant validity (Henseler, Ringle, & Sarstedt, 2015; Voorhees, Brady, Calantone, & Ramirez, 2016), and the HTMT ratio values were all lower than 0.85. The above analysis shows that the empirical data fit the model of this research well. Besides, the valid sample of this research is 455, which may cause over-sampling concerns (Kalton, 2009). To address this concern, this research randomly divided the 855 valid samples into two groups (227 and 228) and performed confirmatory factor analysis on each group separately. These results confirm the indexes indicated by Fornell and Larcker (1981), so the concern about oversampling should be negligible.
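For reference, the composite reliability (CR), average variance extracted (AVE), and HTMT values reported above follow standard formulas that can be recomputed from the standardized loadings in Table 2 and an item correlation matrix. The sketch below is a generic illustration of those formulas, not the authors' code; names such as item_corr are hypothetical.

    import numpy as np
    import pandas as pd

    def composite_reliability(loadings):
        """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
        lam = np.asarray(loadings, dtype=float)
        errors = 1.0 - lam ** 2                    # error variance of each standardized item
        return lam.sum() ** 2 / (lam.sum() ** 2 + errors.sum())

    def average_variance_extracted(loadings):
        """AVE = mean of the squared standardized loadings."""
        lam = np.asarray(loadings, dtype=float)
        return float(np.mean(lam ** 2))

    def htmt(item_corr: pd.DataFrame, items_a, items_b):
        """Heterotrait-monotrait ratio for two constructs (Henseler et al., 2015)."""
        r = item_corr.abs()
        hetero = r.loc[items_a, items_b].to_numpy().mean()   # between-construct item correlations
        mono_a = r.loc[items_a, items_a].to_numpy()
        mono_b = r.loc[items_b, items_b].to_numpy()
        mean_a = mono_a[np.triu_indices_from(mono_a, k=1)].mean()  # within-construct means
        mean_b = mono_b[np.triu_indices_from(mono_b, k=1)].mean()
        return hetero / np.sqrt(mean_a * mean_b)

    # Example with the Table 2 loadings for social interactivity cues:
    sic_loadings = [0.82, 0.79, 0.81]
    print(composite_reliability(sic_loadings))        # about 0.85
    print(average_variance_extracted(sic_loadings))   # about 0.65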
3.3. Potential growth modeling

Potential growth modeling (PGM) can effectively capture how the growth of one construct affects the growth of other constructs. For example, this research surveyed three-stage data to analyze the theoretical framework, and PGM can evaluate the linear growth of emotional arousal, attitude toward the fintech chatbot, and CI over time, and how the growth in these constructs affects the growth in their outcome constructs (Bollen & Curran, 2006; Duncan, Duncan, & Strycker, 2006).
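The paper does not write out the growth equations, but a standard linear latent growth specification consistent with the three equally spaced waves described above can be stated as follows; the 0/1/2 time codes are an assumption for equally spaced waves, not taken from the paper:

    y_{it} = \eta_{0i} + \lambda_t \, \eta_{1i} + \varepsilon_{it}, \qquad \lambda_1 = 0,\ \lambda_2 = 1,\ \lambda_3 = 2,

where y_{it} is user i's emotional arousal, attitude toward the fintech chatbot, or CI at wave t, \eta_{0i} is the latent initial state (intercept), and \eta_{1i} is the latent development (slope). The hypotheses then concern regressions of the slope factors on the Time 1 cues and on one another.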


Table 2
Standardized factor loadings.

Construct                                 Indicator   Factor loading
Social interactivity cues                 SIC1        0.82**
                                          SIC2        0.79**
                                          SIC3        0.81**
Social credence cues                      SCC1        0.79**
                                          SCC2        0.84**
                                          SCC3        0.81**
                                          SCC4        0.79**
Social sharing signs and language cues    SLC1        0.80**
                                          SLC2        0.79**
                                          SLC3        0.83**
Emotional arousal                         EA1         0.79**
                                          EA2         0.83**
                                          EA3         0.80**
                                          EA4         0.82**
Attitude toward the fintech chatbot       AFC1        0.79**
                                          AFC2        0.81**
                                          AFC3        0.79**
                                          AFC4        0.80**
Continuance intention                     CI1         0.78**
                                          CI2         0.80**
                                          CI3         0.83**
                                          CI4         0.84**

Note: SIC = Social interactivity cues; SCC = Social credence cues; SLC = Social sharing signs and language cues; EA = Emotional arousal; AFC = Attitude toward the fintech chatbot; CI = Continuance intention.
*p < 0.05; **p < 0.01.
GFI = 0.98; RMSEA = 0.004; NFI = 0.98; RMR = 0.024; CFI = 0.99.

In this research, a second-order structure was used to analyze causality. For example, the perceptions of social interactivity cues, social credence cues, and social sharing signs and language cues at time 1 are related to emotional arousal, and emotional arousal is related to the attitude toward the fintech chatbot. The slope and initial state of the attitude toward the fintech chatbot are related to the initial state and slope of CI.
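A rough way to illustrate this second-order growth structure (not the latent growth SEM the authors actually estimate) is a two-step approximation: fit each user's linear trend across the three waves, then regress those individual slopes on the Time 1 cues and on each other. The sketch below assumes a pandas DataFrame df with hypothetical columns EA_t1 … EA_t3, AFC_t1 … AFC_t3, CI_t1 … CI_t3, SIC, SCC, and SLC holding composite scores.

    import numpy as np
    import pandas as pd

    TIME = np.array([0.0, 1.0, 2.0])                      # three equally spaced waves

    def individual_slopes(df: pd.DataFrame, prefix: str) -> pd.Series:
        """OLS slope of each respondent's scores across the three waves."""
        waves = df[[f"{prefix}_t1", f"{prefix}_t2", f"{prefix}_t3"]].to_numpy()
        slopes = [np.polyfit(TIME, row, deg=1)[0] for row in waves]
        return pd.Series(slopes, index=df.index, name=f"{prefix}_slope")

    def ols_betas(y: pd.Series, X: pd.DataFrame) -> pd.Series:
        """Regression coefficients (with intercept) via ordinary least squares."""
        X1 = np.column_stack([np.ones(len(X)), X.to_numpy()])
        coef, *_ = np.linalg.lstsq(X1, y.to_numpy(), rcond=None)
        return pd.Series(coef[1:], index=X.columns)

    # Example usage on the hypothetical data frame:
    # ea_slope  = individual_slopes(df, "EA")
    # afc_slope = individual_slopes(df, "AFC")
    # ci_slope  = individual_slopes(df, "CI")
    # print(ols_betas(ea_slope, df[["SIC", "SCC", "SLC"]]))   # cf. H1-H3
    # print(ols_betas(afc_slope, ea_slope.to_frame()))        # cf. H4
    # print(ols_betas(ci_slope, afc_slope.to_frame()))        # cf. H5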
3.4. The results of the analysis

PGM can analyze how the perception of the three social cues in the first stage affects the growth in emotional arousal, which then affects the growth in attitudes toward fintech chatbots and consequently leads to growth in CI (see Fig. 2).

According to Table 3, social interactivity cues (β = 0.36, p < .01), social credence cues (β = 0.31, p < .01), and social sharing signs and language cues (β = 0.28, p < .01) at stage 1 significantly affected the positive growth (development) in emotional arousal. Hypotheses 1, 2, and 3, which argued that users who perceive more social cues at stage 1 would show greater increases in emotional arousal development, are all supported. In other words, users who perceive more social cues at stage 1 develop more emotional arousal over time.

Hypothesis 4 proposes that more growth in emotional arousal will cause more growth in attitude toward the fintech chatbot over time. Hypothesis 5 proposes that growth in attitude toward the fintech chatbot will cause more growth in CI over time. Based on Table 3, the growth in emotional arousal significantly affected the growth in attitude toward the fintech chatbot (β = 0.37, p < .01), and more growth in attitude toward the fintech chatbot also significantly affected growth in CI (β = 0.41, p < .01). Hypothesis 4 and Hypothesis 5 are both supported. In other words, users who develop more emotional arousal show more growth in attitude toward the fintech chatbot, which consequently causes more growth in CI over time.

4. Discussion

This research is expected to illustrate how to extend the theory of social response to the field of fintech chatbots to explain CI. The proposed model is an advance on previous information technology behavior models and provides a sufficient explanation for the interaction mode between humans and fintech chatbots.

4.1. Academic implications

These analysis results demonstrate that social interactivity cues, social credence cues, and social sharing signs and language cues are all important predictors of human social behavior. So far, since past research has mainly focused on the optimization of artificial agents (e.g., Rahman, Mamun, & Islam, 2017), the research field of fintech chatbots in social presentation is still relatively limited. Although research on chatbots has gradually emerged, it has mainly addressed the interaction differences between chatbots and real humans (e.g., Hill & Ford, 2015; Luo, Tong, Fang, & Qu, 2019).

That is to say, these studies have not yet explored the underlying psychological mechanism of users. On the other hand, previous empirical researchers tend to add new variables to existing information technology behavior models to predict adoption (Lu et al., 2017; Wu & Chen, 2017), but this makes it impossible to consider the intrinsic nature of human behavior from a social response perspective. This research believes that a "one size fits all" approach to building an information technology behavior model cannot be sufficient to fully explain the connotation of CI, and it also makes it difficult to understand the interaction behavior of humans with information technology. This research found that the theory of social response is unique in the field of fintech chatbots in that it is not used in traditional information technology behavior models, which also highlights the importance of this research model. This also explains why fintech chatbots are so popular in our lives.

The concept of social capital was developed by Tsai and Ghoshal (1998) and has been applied in various fields (e.g., Maghsoudi, Shapka, & Wisniewski, 2020; Shang & Sun, 2020; Shane-Simpson, Manago, Gaggi, & Gillespie-Lynch, 2018). The present research, however, introduces social capital theory into the field of fintech chatbots. According to the theory of social response, emotional arousal is caused by the three social capital cues, which prompt individuals to show social behaviors (attitude toward the fintech chatbot). The social response process is also explained by trait activation theory, which holds that when a person is exposed to social cues, these cues may trigger human activation and human tendencies (Tett & Burnett, 2003; Chen & Kanfer, 2006).

Fintech chatbots have gradually become a mature fintech application, because real-time communication, artificial intelligence, and financial expertise have been integrated into fintech chatbots to enable human beings to conduct financial investment consulting. The theoretical model in this research can provide a reference for customizing information technology adoption models to predict CI in a virtual environment.

Finally, regarding the growth of variables and how this growth leads to follow-up growth in other variables, this research has opened up a new direction for social response theory and CI. That is to say, this research employs the theoretical framework with the PGM methodology to detect the relationships between these variables, and the results support that individuals actively develop more emotional arousal, attitude toward the fintech chatbot, and CI over time because of higher levels of social cues at time 1, which cannot be analyzed through traditional cross-sectional data and statistical methods.

4.2. Practical implications

The research results can provide valuable insights to practitioners. First, this research has opened up a new direction for new product development and can guide the implementation of product design strategies. The theoretical model of this research can effectively predict the willingness to continue using fintech chatbots. New information technology often encounters high risks from complex product functions and interactions. This research shows that fintech chatbots with few social cues are unlikely to affect user attitudes and CI. Vendors should keep this proposition in mind during the design phase of the information technology system.


Fig. 2. The potential growth model of this research. Note: SIC = Social interactivity cues; SCC = Social credence cues; SLC = Social sharing signs and language cues; EA = Emotional arousal; AFC = Attitude toward the fintech chatbot; CI = Continuance intention. Yn = Measurement items. *p < 0.05; **p < 0.01.

Table 3
Test results of the growth model.

                        IEADB    IAFCDB   ICIDB
                        β        β        β
Antecedent variable
  SIC                   .36**
  SCC                   .31**
  SLC                   .28**
ISEADB
  IEADB                          .37**
ISAFCDB
  IAFCDB                                  .41**

Note: SIC = Social interactivity cues; SCC = Social credence cues; SLC = Social sharing signs and language cues; IEADB = Increase in emotional arousal development behavior; IAFCDB = Increase in attitude toward the fintech chatbot development behavior; ICIDB = Increase in continuance intention development behavior.
*p < 0.05; **p < 0.01.

In addition, the theoretical model also enables vendors to realize that not only social cues but also emotional arousal can predict CI well for fintech chatbots. Second, these results have great significance for suppliers that conduct promotional activities aimed at persuading users to continue using fintech chatbots. Advertisers should tailor campaign themes suited to social presentation to improve users' attitudes and CI. Third, based on the significant effects of these three social cues, vendors should learn how to optimize their fintech chatbots. For example, vendors can improve the content of the social capital cues to activate human emotional arousal. Fourth, based on the significant impact of attitudes toward fintech chatbots on CI, vendors should include an information technology interface that can promote positive attitudes towards fintech chatbots to increase CI. When the CI of fintech chatbots flourishes, it may generate a positive cycle: CI means that users continue to participate in the system, so vendors can obtain enough traffic to carry out more business activities, profit from them, and reinvest in the system, which can attract still more users who continue to participate in the system. Fifth, the significant impact of social cues and social behaviors on CI indicates that vendors should tailor their systems to fit the theoretical model of this research, instead of merely opening up novel technical functions.

Finally, with the rise of internet-only banking in Taiwan (Donkin, 2019), internet-only banking is most criticized for its inability to interact with physical service personnel. This research suggests another possible solution to this problem. In other words, internet-only banks can invest resources to improve the social presentation of fintech chatbots and integrate the three key social cues explored in this research into the user interface to improve the user experience, which can be simulated by fintech chatbots.
activities aimed at persuading users to continue using fin\tech chatbots. user interface to improve user experience, which can be simulated by
Advertisers should tailor-made campaign themes suitable for social fintech chatbots. In this way, the interactive effects can increase user


In this way, the interactive effects can increase user satisfaction and willingness to continue using (loyalty).

4.3. Limitations and future research

First, although empirical studies usually use non-probabilistic samples, researchers should take precautions in generalizing the findings. In addition, culture also influences use intention in the online environment (Crotts & Erdmann, 2000), so further research should compare different samples in different environments. Second, while this research employs social interactivity cues, social credence cues, and social sharing signs and language cues to describe the theory of social response, other cues may undoubtedly also cause emotional arousal. Future research should compare the impact of different cues in different contexts. Third, this research employs PGM to capture the development of user perceptions, and it can explain the dynamic and complex interrelationships between the theory of social response and CI. Fourth, previous studies have usually measured behavioral intentions in empirical studies to represent actual behaviors, and it is still not certain that behavioral intentions can predict actual behaviors well. However, based on information systems/computer technology theories and protection motivation theory (Ajzen, 1991; Davis, Bagozzi, & Warshaw, 1989; Fishbein & Ajzen, 1975; Floyd, Prentice-Dunn, & Rogers, 2000), behavioral intentions should predict actual behaviors well, and these theories have been applied in various disciplines. Although this research did not use an experimental design (Fisher, 1971) to confirm the relationship between behavioral intentions and actual behaviors, this research used a follow-up questionnaire to represent true continued use behavior (e.g., the frequency of use in the past three months). Further research should use rigorous experimental methods to verify the predictability of the theoretical model of this research. Finally, although there is no sensitive social desirability in the measurement items of this research, further study should confirm the desirability bias of the theoretical model in the present survey.

Credit author statement

Stanley Y.B. Huang: Methodology, Software, Data curation, Writing - Original draft preparation, Investigation. Chih-Jen Lee: Writing - Reviewing and Editing, Literature Collection.

Appendix A. Measurement items

A.1. Social interactivity cues

1. I feel that the fintech chatbot means having close social relationships with me.
2. I feel that the fintech chatbot means spending a lot of time interacting with me.
3. I feel that the fintech chatbot means frequent communication with me.

A.2. Social credence cues

1. I feel that the fintech chatbot can keep its promises.
2. I feel that the fintech chatbot knows we can count on each other.
3. I feel that the fintech chatbot behaves consistently.
4. I feel that the fintech chatbot is truthful in dealing with each other.

A.3. Social sharing signs and language cues

1. I feel that the fintech chatbot uses common terms or jargon to convey messages.
2. I feel that the fintech chatbot uses an understandable communication pattern during the discussion.
3. I feel that the fintech chatbot uses understandable narrative forms to convey messages.

A.4. Emotional arousal

1. I feel stimulated when I interact with the fintech chatbot.
2. I feel aroused when I interact with the fintech chatbot.
3. I have a frenzy of joy when I interact with the fintech chatbot.
4. I feel excited when I interact with the fintech chatbot.

A.5. Attitude toward the fintech chatbot

1. I like to use the fintech chatbot.
2. I am satisfied with the service provided by the fintech chatbot.
3. I feel comfortable using the fintech chatbot.
4. I feel that using the fintech chatbot can solve my problem well.

A.6. Continuance intention

1. I intend to continue using the fintech chatbot.
2. I want to continue using the fintech chatbot instead of alternative means.
3. If I could, I would like to continue using the fintech chatbot over the next year.
4. It is unlikely for me to stop using the fintech chatbot.

References

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179–211.
Alraimi, K. M., Zo, H. J., & Ciganek, A. P. (2015). Understanding the MOOCs continuance: The role of openness and reputation. Computers & Education, 80, 28–38.
Amoroso, D., & Lim, R. (2017). The mediating effects of habit on continuance intention. International Journal of Information Management, 37(6), 693–702.
Amoroso, D., & Ogawa, M. (2011). Japan's model of mobile ecosystem success: The case of NTT DoCoMo. Journal of Emerging Knowledge on Emerging Markets, 3, Article 27.
Aoki, N. (2021). The importance of the assurance that "humans are still in the decision loop" for public trust in artificial intelligence: Evidence from an online experiment. Computers in Human Behavior, 114, 106572.
Bae, M. (2018). Understanding the effect of the discrepancy between sought and obtained gratification on social networking sites users' satisfaction and continuance intention. Computers in Human Behavior, 79, 137–153.
Baker, J., Grewal, D., & Levy, M. (1992). An experimental approach to making retail store environmental decisions. Journal of Retailing, 68(4), 445–460.
Bayerque, N. (2016). A short history of chatbots and artificial intelligence. https://venturebeat.com/2016/08/15/a-short-history-of-chatbots-and-artificial-intelligence/ (accessed 1 August 2020).
Berry, L. L., Carbone, L. P., & Haeckel, S. H. (2002). Managing the total customer experience. Sloan Management Review, 43, 85–89.
Bhattacherjee, A. (2001). Understanding information systems continuance: An expectation-confirmation model. MIS Quarterly, 25(3), 351–370.
Bøe, T., Gulbrandsen, B., & Sørebø, O. (2015). How to stimulate the continued use of ICT in higher education: Integrating information systems continuance theory and agency theory. Computers in Human Behavior, 50, 375–384.
Boone, C., Declerck, C., & Suetens, S. (2008). Subtle cues, explicit incentives, and cooperation in social dilemmas. Evolution and Human Behavior, 29, 179–188.
Chang, Y. P., & Zhu, D. H. (2012). The role of perceived social capital and flow experience in building users' continuance intention to social networking sites in China. Computers in Human Behavior, 28, 995–1001.
Chau, P., & Hu, P. (2011). Information technology acceptance by individual professionals: A model comparison approach. Decision Sciences, 32(4), 699–719.
Chen, G., & Kanfer, R. (2006). Toward a systems theory of motivated behavior in work teams. In B. M. Staw (Ed.), Research in organizational behavior (pp. 349–381). Greenwich, CT: JAI Press.
Chen, Q., & Wells, W. D. (1999). Attitude toward the site. Journal of Advertising Research, 39(5), 27–37.
Crotts, J. C., & Erdmann, R. (2000). Does national culture influence consumers' evaluation of travel services? A test of Hofstede's model of cross-cultural differences. Managing Service Quality, 10(5), 410–419.
Dai, H. M., Teo, T., & Rappa, N. A. (2020). Understanding continuance intention among MOOC participants: The role of habit and MOOC performance. Computers in Human Behavior, 11, 106455.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003.
Delgosha, M. S., & Hajiheydari, N. (2021). How human users engage with consumer robots? A dual model of psychological ownership and trust to explain post-adoption behaviours. Computers in Human Behavior, 117, 106660.
Donkin, C. (2019). Line, Rakuten poised for Taiwan online bank launch. https://www.mobileworldlive.com/money/news-money/line-rakuten-poised-for-taiwan-online-bank-launch (accessed 1 February 2021).


Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes. Fort Worth, TX: Harcourt Brace Jovanovich.
Edwards, C., Edwards, A., Stoll, B., Lin, X., & Massey, N. (2019). Evaluations of an artificial intelligence instructor's voice: Social identity theory in human-robot interactions. Computers in Human Behavior, 90, 357–362.
Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention and behavior: An introduction to theory and research. Boston, MA: Addison-Wesley.
Fisher, R. A. (1971). The design of experiments. Macmillan.
Floyd, D. L., Prentice-Dunn, S., & Rogers, R. W. (2000). A meta-analysis of research on protection motivation theory. Journal of Applied Social Psychology, 30(2), 407–429.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.
Gan, C., & Li, H. (2018). Understanding the effects of gratifications on the continuance intention to use WeChat in China: A perspective on uses and gratifications. Computers in Human Behavior, 78, 306–315.
Gillath, O., Ai, T., Branicky, M. S., Keshmiri, S., Davison, R. B., & Spaulding, R. (2021). Attachment and trust in artificial intelligence. Computers in Human Behavior, 115, 106607.
Henseler, J., Ringle, C. M., & Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. Journal of the Academy of Marketing Science, 43(1), 115–135.
Hill, J., Ford, W. R., & Farreras, I. G. (2015). Real conversations with artificial intelligence: A comparison between human–human online conversations and human–chatbot conversations. Computers in Human Behavior, 49, 245–250.
Hong, J. C., Tai, K. H., Hwang, M. Y., Kuo, Y. C., & Chen, J. S. (2017). Internet cognitive failure relevant to users' satisfaction with content and interface design to reflect continuance intention to use a government e-learning system. Computers in Human Behavior, 66, 353–362.
Hsu, H. Y., Liu, F. H., Tsou, H. T., & Chen, L. J. (2019). Openness of technology adoption, top management support and service innovation: A social innovation perspective. Journal of Business & Industrial Marketing, 34, 575–590.
International Federation of Robotics. (2017). Executive summary world robotics 2017 service robots. https://ifr.org/free-downloads/ (accessed 1 August 2020).
Jang, M., Jung, Y., & Kim, S. (2021). Investigating managers' understanding of chatbots in the Korean financial industry. Computers in Human Behavior, 120, 106747.
Johnson, S., & Daoud, Z. (2020). Who innovates first? Ranking 135 economies around the world. https://venturebeat.com/2016/08/15/a-short-history-of-chatbots-and-artificial-intelligence/ (accessed 1 February 2021).
Kalton, G. (2009). Methods for oversampling rare subpopulations in social surveys. Survey Methodology, 35(2), 125–141.
Kim, C., Galliers, R., Shin, N., Ryoo, J., & Kim, J. (2012). Factors influencing internet shopping value and consumer continuance intention. Electronic Commerce Research and Applications, 11, 374–387.
KPMG. (2019). Fintech100: Leading global fintech innovators. https://home.kpmg/xx/en/home/insights/2019/11/2019-fintech100-leading-global-fintech-innovators-fs.html.
Kuhn, T. (1970). The structure of scientific revolutions. Chicago: University of Chicago Press.
Kulviwat, S., Bruner, G. C., Kumar, A., Nasco, S. A., & Clark, T. (2007). Toward a unified theory of consumer acceptance technology. Psychology and Marketing, 24(12), 1059–1084.
Lance, C. E., Vandenberg, R. J., & Self, R. M. (2000). Latent growth models of individual change: The case of newcomer adjustment. Organizational Behavior and Human Decision Processes, 83, 107–140.
Li, H., Li, L., Gan, C., Liu, Y., Tane, C.-W., & Deng, Z. (2018). Understanding the effects of gratifications on the continuance intention to use WeChat in China: A perspective on uses and gratifications. Computers in Human Behavior, 85, 175–182.
Liljander, A., Polsa, P., & Forsberg, K. (2007). Do mobile CRM services appeal to loyalty program consumers? International Journal of E-Business Research, 3(2), 24–40.
Lin, M. J. J., Hung, S. W., & Chen, C. J. (2009). Fostering the determinants of knowledge sharing in professional virtual communities. Computers in Human Behavior, 25(4), 929–939.
Liu, Y., & Shrum, L. J. (2002). What is interactivity and is it always such a good thing? Implications of definition, person and situation for the influence of interactivity on advertising effectiveness. Journal of Advertising, 31(4), 53–64.
Lowry, P. B., Gaskin, J. E., & Moody, G. D. (2015). Proposing the multimotive information systems continuance model (MISC) to better explain end-user system evaluations and continuance intentions. Journal of the Association for Information Systems, 16(7), 515–579.
Luo, X., Tong, S., Fang, Z., & Qu, Z. (2019). Frontiers: Machines vs. humans: The impact of artificial intelligence chatbot disclosure on customer purchases. Marketing Science, 38(6), 937–947.
Lu, J., Yu, C.-S., Liu, C., & Wei, J. (2017). Comparison of mobile shopping continuance intention between China and USA from an espoused cultural perspective. Computers in Human Behavior, 75(1), 130–146.
Maghsoudi, R., Shapka, J., & Wisniewski, P. (2020). Examining how online risk exposure and online social capital influence adolescent psychological stress. Computers in Human Behavior, 113, 106488.
Malhotra, N. K., Kim, S. S., & Patil, A. (2006). Common method variance in IS research: A comparison of alternative approaches and a reanalysis of past research. Management Science, 52(12), 1865–1883.
McMillan, S. J., & Hwang, J. S. (2002). Measures of perceived interactivity: An exploration of the role of direction of communication, user control, and time in shaping perceptions of interactivity. Journal of Advertising, 31(3), 29–42.
Mehrabian, A., & Russell, J. A. (1974). An approach to environmental psychology. Cambridge, MA: MIT Press.
Montazemi, A. R., & Qahri-Saremi, H. (2015). Factors affecting adoption of online banking: A meta-analytic structural equation modeling study. Information & Management, 52(2), 210–226.
Moon, Y. (2000). Intimate exchanges: Using computers to elicit self-disclosure from consumers. Journal of Consumer Research, 26, 323–339.
Nahapiet, J., & Ghoshal, S. (1998). Social capital, intellectual capital, and the organizational advantage. Academy of Management Review, 23(2), 242–266.
Nass, C., Fogg, B. J., & Moon, Y. (1996). Can computers be teammates? International Journal of Human-Computer Studies, 45(6), 669–678.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.
Nass, C., Moon, Y., & Carney, P. (1999). Are people polite to computers? Responses to computer-based interviewing systems. Journal of Applied Social Psychology, 29(5), 1093–1110.
Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43(2), 223–239.
Park, N., Jang, K., Cho, S., & Choi, J. (2021). Use of offensive language in human-artificial intelligence chatbot interaction: The effects of ethical ideology, social competence, and perceived humanlikeness. Computers in Human Behavior, 121, 106795.
Pelau, C., Dabija, D. C., & Ene, I. (2021). What makes an AI device human-like? The role of interaction quality, empathy and perceived psychological anthropomorphic characteristics on the acceptance of artificial intelligence in the service industry. Computers in Human Behavior. https://doi.org/10.1016/j.chb.2021.106855
Podsakoff, P. M., MacKenzie, S. B., Lee, J., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88, 879–903.
Rahman, A. M., Mamun, A. A., & Islam, A. (2017). Programming challenges of chatbot: Current and future prospective. 2017 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) (pp. 75–78). Dhaka.
Reeves, B., & Nass, C. I. (1996). The media equation. Stanford, CA: CSLI Publications.
Shane-Simpson, C., Manago, A., Gaggi, N., & Gillespie-Lynch, K. (2018). Why do college students prefer Facebook, Twitter, or Instagram? Site affordances, tensions between privacy and self-expression, and implications for social capital. Computers in Human Behavior, 86, 276–288.
Shang, R. A., & Sun, Y. (2020). So little time for so many ties: Fit between the social capital embedded in enterprise social media and individual learning requirements. Computers in Human Behavior, 106615.
Shih, C. (2011). Comparisons of competing models between attitudinal loyalty and behavioral loyalty. International Journal of Business and Information, 8(1), 149–166.
Shumanov, M., & Johnson, L. (2021). Making conversations with chatbots more personalized. Computers in Human Behavior, 117, 106627.
Steuer, J., & Nass, C. (1993). Voices, boxes, and sources of messages: Computers and social actors. Human Communication Research, 19(4), 504–527.
Sundar, S. S., & Nass, C. (2000). Source-orientation in human-computer interaction. Communication Research, 27(6), 683–703.
Teng, C.-I. (2019). How avatars create identification and loyalty among online gamers: Contextualization of self-affirmation theory. Internet Research, 29(6), 1443–1468.
Tett, R. P., & Burnett, D. D. (2003). A personality trait-based interactionist model of job performance. Journal of Applied Psychology, 88, 500–517.
Thusyanthy, V., & Senthilnathan, S. (2017). Customer satisfaction in terms of physical evidence and employee interaction. The IUP Journal of Marketing Management, XI(3), 7–24.
Tsai, W., & Ghoshal, S. (1998). Social capital and value creation: An empirical study of intrafirm networks. Academy of Management Journal, 41(4), 464–476.
Valaei, N., & Baroto, M. B. (2017). Modelling continuance intention of citizens in government Facebook page: A complementary PLS approach. Computers in Human Behavior, 73, 224–237.
VanderWeele, T. J., Jackson, J. W., & Li, S. (2016). Causal inference and longitudinal data: A case study of religion and mental health. Social Psychiatry and Psychiatric Epidemiology, 51, 1457–1466.
Voorhees, C. M., Brady, M. K., Calantone, R., & Ramirez, E. (2016). Discriminant validity testing in marketing: An analysis, causes for concern, and proposed remedies. Journal of the Academy of Marketing Science, 44(1), 119–134.
Walley, L., & Smith, M. (1998). Deception in selection. Chichester: Wiley.
Wigglesworth, R. (2016). Fintech: Search for a super-algo. https://www.ft.com/content/5eb91614-bee5-11e5-846f-79b0e3d20eaf (accessed 1 August 2020).
Wixom, B., & Todd, P. (2005). A theoretical integration of user satisfaction and technology acceptance. Information Systems Research, 16(1), 85–102.
Wu, B., & Chen, X. (2017). Continuance intention to use MOOCs: Integrating the technology acceptance model (TAM) and task technology fit (TTF) model. Computers in Human Behavior, 67, 221–232.
Youn, S., & Jin, S. V. (2021). "In A.I. we trust?" The effects of parasocial interaction and technopian versus luddite ideological views on chatbot-based customer relationship management in the emerging "feeling economy". Computers in Human Behavior, 119, 106721.
Zhang, C. B., Li, Y. N., Wu, B., & Li, D. J. (2017). How WeChat can retain users: Roles of network externalities, social interaction ties, and perceived values in building continuance intention. Computers in Human Behavior, 69, 284–293.
